Maxwell’s Demon: Planning for Technology Obsolescence in Acquisition Strategy

Imagine a chamber divided into two parts by a removable partition. On one side is a hot sample of gas and on the other side a cold sample of the same gas. The chamber is a closed system with a certain amount of order, because the statistically faster moving molecules of the hot gas on one side of the partition are segregated from statistically slower moving molecules of the cold gas on the other side. Maxwell’s demon guards a trap door in the partition, which is still assumed not to conduct heat. It spots molecules coming from either side and judges their speeds…The perverse demon manipulates the trap door so as to allow passage only to the very slowest molecules of the hot gas and the very fastest molecules of the cold gas. Thus the cold gas receives extremely slow molecules, cooling it further, and the hot gas receives extremely fast molecules, making it even hotter. In apparent defiance of the second law of thermodynamics, the demon has caused heat to flow from the cold gas to the hot one. What is going on?

Because the law applies only to a closed system, we must include the demon in our calculations. Its increase of entropy must be at least as great as the decrease of entropy in the gas-filled halves of the chamber. What is it like for the demon to increase its entropy? –Murray Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex, W. H. Freeman and Company, New York, 1994, pp. 222-223

“Entropy is a figure of speech, then,” sighed Nefastis, “a metaphor. It connects the world of thermodynamics to the world of information flow. The Machine uses both. The Demon makes the metaphor not only verbally graceful, but also objectively true.” –Thomas Pynchon, The Crying of Lot 49, J.B. Lippincott, Philadelphia, 1965

Technology Acquisition: The Basics

I’ve recently been involved in discussions regarding software development and acquisition that cut across several disciplines, and that should be of interest to anyone engaged in project management in general, and in IT project management and acquisition in particular.

The specific question that prompted this line of thought was this: are our technology acquisition systems flexible enough to allow for exploiting new capabilities and to provide pathways for entry by new innovations? A related concern is that when a dominant buyer, especially a public agency in a monopsonistic market, commits to a specific solution in a highly organized industry, it effectively creates a de facto standard, a condition that limits technological flexibility and may restrict competitive suppliers from entering the market.

There are obvious objections and qualifications to the deterministic conclusions of this line of reasoning. The first is: it depends. If your acquisition strategy is segmented and focused on addressing specific needs, as opposed to a complex multi-year development, then such concerns can be addressed by applying the basic economic concept of sunk versus opportunity costs. That is, the fact that a solution solved an issue in the last fiscal year is not, on its own, sufficient justification for sticking with that solution.

The second is the public-good basis of economic theory. That is, in order to incentivize and reward innovation, such objections are unfounded as long as the process is fair: it rests on sufficient market research, is free of the corrupt practices of pre-selection and unfair influence, and is conducted using full and open competition based on the salient characteristics that define the framing assumptions of the requirement, within a defensible assessment of best value. Note that I do not criticize the fact that the selected solution sometimes also meets the criterion of lowest acceptable bidder, because the most innovative solutions, particularly in technology, also tend to be the least costly.

The third is a test that addresses the underlying socioeconomic concerns. If there is more than adequate competition in the general market, even though specific alternatives may be inferior in a niche market, then such concerns are unfounded. One supplier may have a competitive advantage in meeting the organization’s needs due to some set of features or functionality, but it does not follow that this condition fails to serve the social good. Otherwise, public institutions would always be excluded from acquiring new technology.

But the question of agility in acquisition is still a concern, especially in extremely complex project management environments involving multi-year research and development. Here the economic and socioeconomic considerations are more complex. Furthermore, these two aspects of social science do not provide all of the solutions relevant to the issue.

I believe that we find the answers, instead, in the basic physics of information theory and the practical application of the theory as it applies to computing and digitization. These answers then provide us with an outline for practical solutions in project management. On a more prosaic level, our outline can strongly suggest the approach that is advisable in application of acquisition methods, acquisition planning, service life management, and anticipated obsolescence.

What Lies Beneath Our Assumptions

A Little Wonky from a Tech Perspective…but I’ll Lead You Through It

At the heart of information theory is the proposition that there are universal laws tied to the nature of the universe that can be understood and leveraged to our advantage. This proposition was first anticipated by the physicist James Clerk Maxwell, whose 1871 thought experiment suggested that the use of information could apparently contradict the Second Law of Thermodynamics, a law that holds only as a statistical certainty.

The Second Law, given one of its earliest formulations by Lord Kelvin, implies that the universe is expending its usable energy toward what has been called a “heat death,” with time’s arrow always pointing forward, moving toward equilibrium. In order to achieve order of some sort over the course of time’s arrow, energy or heat must be expended; you don’t get anything for free in this universe. Building or organizing something requires that disorder, or entropy, be increased somewhere else. But disorder is not uniform, and there is a tradeoff between order and disorder.

It wasn’t until 1929 that the physicist Leó Szilárd came up with a mathematical proof that reconciled the Second Law with the concept of “Maxwell’s Demon” as described above. Szilárd did this by recognizing that the “demon” must be intelligent enough to measure the speed of the molecules in order to distinguish hot from cold, and that the measurement itself carries a cost: the entropy, the disorder and decay, is transferred not to the system being measured but to the “demon.”

The unit of Szilárd’s measurement later became known as a “bit,” a term most widely associated with the work of Bell Labs’ Claude Shannon, the recognized father of information theory. For those of you who are mathematically minded, the equation that describes the cost of measuring a bit of information is:

S = k ln 2, where k is Boltzmann’s constant and ln is the natural logarithm; the 2 reflects the binary, two-state nature of the decision.
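
As a minimal numerical illustration of this cost (my own sketch, using the SI value of Boltzmann’s constant; the variable names are mine and not drawn from any source), the snippet below evaluates Szilárd’s per-bit entropy:

```python
import math

# Boltzmann's constant in joules per kelvin (SI value).
BOLTZMANN_K = 1.380649e-23

# Szilard's result: acquiring one bit of information costs
# at least k * ln(2) of entropy somewhere in the system.
entropy_per_bit = BOLTZMANN_K * math.log(2)

print(f"Minimum entropy cost per bit: {entropy_per_bit:.3e} J/K")  # ~9.57e-24 J/K
```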

The amount of entropy generated by measurement may exceed this amount, but it can never be less than it without violating the Second Law. Thus, entropy can be diverted to, or absorbed by, some other object. There can be natural “demons” (such as the agency of natural selection and adaptation), or there can be imposed “demons.” In our modern worldview, it is increasingly apparent that the universe is made up of information of one kind or another. The practical application of this equation to computing first extended to cryptography, and is seen today in the widespread use and availability of such things as VPNs, CAPTCHAs, encrypted storage, and zip-type file compression.

The purpose of all of this effort, of course, is to find the lower bound on the amount of energy expended in extracting information from any bit, thereby minimizing or deflecting entropy while maximizing the information content that can be compacted or communicated. The connection Shannon made between the thermodynamic concept of entropy and the informational concept of entropy was further confirmed in 1961 by the work of Rolf Landauer of IBM, who approached it from a different direction; his result has come to be called Landauer’s Principle.
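
To make the informational side of that bound concrete, here is a small illustrative sketch (my own example, not taken from any of the works cited) that computes the Shannon entropy of a message in bits per symbol, which is the theoretical floor on how compactly the message can, on average, be encoded:

```python
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    """Average information content per symbol, in bits: H = -sum(p * log2 p)."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A repetitive message carries little information per symbol and compresses well;
# a varied message carries more and resists compression.
print(shannon_entropy_bits("aaaaaaab"))  # ~0.54 bits per symbol
print(shannon_entropy_bits("abcdefgh"))  # 3.0 bits per symbol
```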

This principle gives the lowest theoretical amount of energy required to erase one bit of information. Once again, it matches Szilárd’s equation pertaining to Maxwell’s Demon, though it approaches the problem from the direction of information deletion. Just as Pynchon used the concept of Maxwell’s Demon in his novel to connect thermodynamics and real-world information, so too do we find that the underlying mathematics demonstrates a connection that reflects a common theory of limitations, both for thermodynamics and for the information contained in every bit of the universe.
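
That lowest theoretical amount is E = kT ln 2, where T is the absolute temperature of the environment. The sketch below (assuming room temperature of 300 K, an illustrative choice on my part) evaluates that floor; real hardware dissipates many orders of magnitude more per bit:

```python
import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann's constant, J/K
ROOM_TEMP_K = 300.0         # assumed operating temperature, kelvin

# Landauer's Principle: minimum energy dissipated to erase one bit.
landauer_limit_joules = BOLTZMANN_K * ROOM_TEMP_K * math.log(2)

# Scale the floor up to erasing one gigabyte (8e9 bits).
bits_in_gigabyte = 8e9
print(f"Per bit:      {landauer_limit_joules:.3e} J")                     # ~2.87e-21 J
print(f"Per gigabyte: {landauer_limit_joules * bits_in_gigabyte:.3e} J")  # ~2.3e-11 J
```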

Putting It All Together

Computing and software, like all human artifacts that attempt to extend our capabilities, are limited by physics and human mortality. We live in a universe in which our limits are predetermined, yet those limits accord us a great deal of flexibility within our sphere: there is vast uncertainty, governed by probability, within the bounds of determinism.

According to Moore’s Law, introduced by Intel cofounder Gordon Moore in 1965, processing power doubles roughly every eighteen months to two years. At the same time, we know that our digitization processes, while never able to break the Second Law, are rapidly approaching the lower bound set by the equation that underlies information theory. I think this process is perpetual; that is, it will continue to approach the lower bound without ever actually touching it.
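
As a rough sketch of the obsolescence clock this implies (the doubling period below is an assumed, illustrative parameter, not a measured one), the snippet projects how quickly a fixed hardware baseline falls behind:

```python
def relative_capability(years_elapsed: float, doubling_period_years: float = 2.0) -> float:
    """Capability of current-generation hardware relative to a baseline acquired
    `years_elapsed` years ago, assuming a fixed doubling period."""
    return 2 ** (years_elapsed / doubling_period_years)

# A platform locked in at the start of a program falls behind quickly.
for year in (2, 4, 6, 8):
    print(f"Year {year}: new hardware is ~{relative_capability(year):.0f}x the baseline")
# Year 2: ~2x, Year 4: ~4x, Year 6: ~8x, Year 8: ~16x
```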

By this measure, then, all of our software applications and attempts at artificial intelligence have a built-in obsolescence of roughly two years, unless those tools are updated to take advantage of the machine-level and operating-system-level languages that can harness such power. This is not so much planned obsolescence as unplanned obsolescence.

Furthermore, as we apply this power to ever larger datasets, the entropy that must be expended increases accordingly, with Landauer’s Principle as its floor. In Landauer’s time this entropy was expelled as heat. But in our own time, given the nature of some of the information processed in the realm of what we now call big data, entropy can also be expelled as uncertainty.

This application of Landauer’s Principle is why, in previous writing about big data, I have distinguished between data that requires a high final state of validity, and therefore normalization and rationalization, and big data that can withstand some uncertainty, its validity reduced to probability.

Information Economics Steps In

Thus, combining an understanding of the physical limitations of information theory with information economics will give us a better idea of how to approach the concerns that opened this article.

According to Dr. Brad DeLong and Michael Froomkin in their seminal paper “The Next Economy?,” there are three principles of information economics that differentiate it from neoclassical economics. These are:

  1. Exclusion is not a natural property of information, meaning it is extremely hard to exclude others from enjoying the information.
  2. Information is non-rivalrous, which means that information systems involve technologies where the willingness to pay for a bit of information is greater than the marginal cost of producing another copy of that bit. Thus, price is determined by the perceived value applied to the information.
  3. The information market does not exhibit high degrees of transparency; to understand the value of the information, it must be known. The relationship, therefore, between the supplier of the information and the consumer of the information is asymmetrical. To understand the value and applicability of a bit of software, you have to learn to use it. Oftentimes, when piloting or evaluating the product, the information is simultaneously consumed or utilized, involving cost to both supplier and consumer.

Information Asymmetry Informs Our Strategy and Core Principles

DeLong and Froomkin wrote their paper during the heady days of the 1990s tech bubble, which was soon to burst due to the very economic characteristics of information that they identified. Both the non-excludable and non-rivalrous features of information essentially make it a public good, changing the locus of who owns it. The key distinction missing here is between data and information, a distinction that goes back to theories of cognition and intelligence-gathering contemporaneous with information theory.

What this means is that while data has taken the form of a public good, with little marginal cost attached to making copies of it, the processing of that data still possesses the neoclassical characteristics of exclusion and rivalry, while also retaining the unique characteristic of being opaque. The marginal cost of implementation, upgrades, and maintenance is non-trivial.

Joseph Stiglitz, George Akerlof, and Michael Spence, who shared the Nobel Prize in economics in 2001 for their work, identified the effects of this asymmetry in information. The asymmetry compounds the opaqueness of digital products, which increases both risk and uncertainty in technology acquisition.

Thus, understanding both the physical and economic characteristics of data and information gives us a basis for establishing an acquisition strategy that exploits and influences the characteristics of the market. Strategies that recognize data as a public good establish ownership of the data in the consumer. It is therefore imperative that open data be a cornerstone principle of any such strategy; otherwise, the deleterious effects of exclusivity and asymmetry undermine both its intrinsic and business value to the consumer, and restrict flexibility in both operational capabilities and acquisition strategy.

What this also reveals is that the paradigm that has reigned since the introduction of applications for PCs, namely that the relative advantage and assessed value of software rest on the power and features of its data processing, analysis, and visualization capabilities, is now largely irrelevant, particularly since such features are included within the operating environments with which software interacts. For example, .NET visual components provide virtually unlimited features in visualization and integration that can be deployed almost instantly by the user, given that the data is transparent and non-exclusive. Thus, the key for government agencies and other consumers is to eliminate or mitigate asymmetry in data and, hence, in information.

Understanding that underlying hardware and software technologies offer greater storage and processing power (essentially a new software generation) every two years, our acquisition strategy must include a decision point at which the costs associated with old technologies exceed the comparative value of newly introduced technologies. That decision should rest on piloting or proof of concept, on an evaluation of whether the platforms can adapt to the introduction of a new generation of processing every two years, and on whether the platforms support open architectures, data transparency, and data neutrality.
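
A minimal sketch of such a decision point follows. Every number and name in it is hypothetical and purely illustrative (real sustainment costs, switching costs, and the value assigned to new capability would come from the program’s own pilots and market research), but it shows the shape of the comparison the strategy should force at each technology generation:

```python
def refresh_decision(legacy_annual_cost: float,
                     new_annual_cost: float,
                     switching_cost: float,
                     new_capability_value_per_year: float,
                     horizon_years: int = 4) -> bool:
    """Return True if refreshing to the new generation is justified over the horizon.

    Compares the cost of staying on the legacy platform against the cost of
    switching, net of the value the new capability is expected to deliver.
    All inputs are hypothetical program estimates, not prescribed values.
    """
    stay_cost = legacy_annual_cost * horizon_years
    switch_cost = switching_cost + new_annual_cost * horizon_years
    switch_benefit = new_capability_value_per_year * horizon_years
    return (switch_cost - switch_benefit) < stay_cost

# Illustrative figures only: legacy sustainment $1.2M/yr, new platform $0.8M/yr,
# $1.5M to switch, and $0.5M/yr of additional mission value from new capability.
print(refresh_decision(1.2e6, 0.8e6, 1.5e6, 0.5e6))  # True under these assumptions
```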