Yesterday the New York Times published an article about Moore’s Law. It is interesting in that John Markoff, the Times science writer, speculates that in about five years the computing industry will be “manipulating material as small as atoms” and therefore may hit a wall in what has become a back-of-the-envelope calculation of the multiplicative growth of computing complexity and power in the silicon age.
This article prompted a follow-on from Brian Feldman at NY Mag, noting that the Institute of Electrical and Electronics Engineers (IEEE) has anticipated a broader definition of the phenomenon of the accelerating rate of computing power, one that takes quantum computing into account. Note here that the definition used in this context is the literal one: the doubling, over time, of the number of transistors that can be placed on a microchip. That is a correct summation of what Gordon Moore said, but it is not how Moore’s Law is viewed or applied within the tech industry.
Moore’s Law (which is really a rule of thumb or guideline rather than an ironclad law) has been used, instead, as an analogue to describe the geometric acceleration that has been seen in computing power over the last 50 years. As Moore originally described the phenomenon, the doubling of transistors occurred every two years. It was later revised to about every 18 months, and now it is down to 12 months or less. Furthermore, aside from increasing transistor counts, engineers have applied many other parallel strategies to increase speed and performance. When we combine the observation of Moore’s Law with other principles tied to the physical world, such as Landauer’s Principle and Information Theory, we begin to find a coherence in our observations that is truly tied to physics. Thus, rather than being a break from Moore’s Law (and from the other principles and theory noted above), the quantum computing to which the articles refer sits on a continuum with these concepts.
Bottom line: computing, memory, and storage systems are becoming more powerful, faster, and expandable.
Thus, Moore’s Law in terms of computing power looks like this over time:
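A minimal sketch of that geometric growth, assuming Moore’s original two-year doubling period and (as an illustrative baseline, not from the articles above) the roughly 2,300 transistors of the 1971 Intel 4004:

```python
# Moore's Law as geometric growth: transistor count doubles
# every fixed period. Baseline and period are assumptions
# chosen for illustration, not hard figures.

BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300      # approx. Intel 4004
DOUBLING_PERIOD_YEARS = 2         # Moore's original observation

def transistors(year: int) -> float:
    """Projected transistors per chip in a given year."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in range(1971, 2021, 10):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Shorten the doubling period to 18 or 12 months and the curve steepens dramatically, which is the revision the industry actually lived through.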
Furthermore, when we calculate the cost associated with erasing a bit of memory, we begin to approach identifying the Demon* in defying the Second Law of Thermodynamics.
Note, however, that the Second Law is not really being defied; it is just that we are constantly approaching zero, though never actually achieving it. The principle here is that the marginal cost associated with each additional bit of information becomes vanishingly small, to the point of not passing the “so what” test, at least in everyday life. Of course, when we get to neural networks and strong AI such differences are very large indeed–akin to mathematics being somewhat accurate when we want to travel from, say, San Francisco to London, but requiring more rigor and fidelity when traveling from Kennedy Space Center to Gale Crater on Mars.
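For concreteness, Landauer’s Principle puts the minimum energy needed to erase one bit at kT ln 2. A quick calculation with the Boltzmann constant at roughly room temperature shows just how vanishingly small that marginal cost is:

```python
import math

# Landauer's limit: the minimum energy to erase one bit of
# information is k * T * ln(2).
BOLTZMANN_K = 1.380649e-23  # joules per kelvin (exact SI value)
ROOM_TEMP_K = 300           # kelvin, roughly room temperature

energy_per_bit = BOLTZMANN_K * ROOM_TEMP_K * math.log(2)
print(f"Minimum energy to erase one bit: {energy_per_bit:.2e} J")

# Even erasing a full gigabyte (8e9 bits) at this theoretical
# floor costs a negligible amount of energy.
energy_per_gigabyte = energy_per_bit * 8e9
print(f"Minimum energy to erase one gigabyte: {energy_per_gigabyte:.2e} J")
```

At about 3 × 10⁻²¹ joules per bit, the limit is far below what real hardware dissipates today, which is exactly the sense in which we keep approaching zero without reaching it.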
The challenge in computing, then, is to effectively harness such power. Our current programming languages and operating environments are only scratching the surface of how to do this, and the joke in the industry is that the speed of software is inversely proportional to the advance in computing power provided by Moore’s Law. The issue is that our brains, and thus the languages we harness to utilize computational power, are based in an analog understanding of the universe, while the machines we are harnessing are digital. For now this mismatch can only produce bad software and robots, but given our drive into the brave new world of heuristics, it may lead us to Skynet and the AI apocalypse if we are not careful–making science fiction, once again, science fact.
Back to the present, however: what this means is that for at least the next decade we will see an acceleration of the ability to use more and larger sets of data. The risk, which we seem to have to relearn as a new generation of techies lacking a well-rounded liberal arts education enters the market, is that the basic statistical and scientific rules governing the conversion, interpretation, and application of intelligence and information can still be roundly abused and violated. Bad management, bad decision making, bad leadership, bad mathematics, bad statisticians, specious logic, and plain old common human failings are just made worse, with greater impact on more people, by the misuse of that intelligence and information.
The watchman against these abuses, then, must be incorporated into the solutions that use this intelligence and information. This is especially critical given the accelerated pace of computing power, and the greater interdependence of human and complex systems that this acceleration creates.
Note: I’ve defaulted to the Wikipedia definitions of both Landauer’s Principle and Information Theory for the sake of simplicity. I’ve referenced more detailed work on these concepts in previous posts and invite readers to seek those out in the archives of this blog.