"Posted by Peter Glaskowsky
Moore's Law is familiar to most people who work with computing systems. It basically states that the number of transistors on a chip will double every two years. Most people understand this as holding not only the chip size constant, but the cost as well. Keeping the cost constant, though, was not part of the original law; to be sure, as transistor count goes up, cost per transistor does tend to fall. Using the Intel family of processors, the graphic below shows how remarkably accurate Moore's Law has been at predicting chip evolution.
My friend Jerry Pournelle calls Unix the full-employment act for computer wizards (presumably a reference to the Humphrey-Hawkins Full Employment Act of 1978).
Similarly, I regard Moore's Law as the full-employment act for computer pundits. I've written about it several times myself (e.g. here and here); the phrase gets 930,000 hits on Google today.
One of the duties of any publication in the computer industry is to cast periodic doubt on the future reliability of Moore's Law, thus keeping the phrase prominent in the public perception. EDN Magazine discharged its duty for this year with great aplomb by publishing this piece last week.
You'll note this article says that this is 'the first time' there's been such doubt. Never mind; they always say that.
The first time I heard that the sky was falling--excuse me, that Moore's Law was being threatened--was as the industry began to consider how to make chips with line widths below one micron (a millionth of a meter). That milestone was passed easily, and a while later we sailed past the quarter-micron mark, and now we're making chips with line widths a quarter of that--65 nm (nanometers, a billionth of a meter).
So now, right on schedule, doubt is being cast on our ability to reach the 15nm generation, which represents another four-fold linear reduction."
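The doubling rule quoted above is easy to sketch in a few lines of code. This is a toy projection, not Intel's actual roadmap; the 1971 starting figure of 2,300 transistors is roughly the Intel 4004.

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project transistor count for a given year under Moore's Law:
    the count doubles once every `doubling_period` years."""
    periods = (year - start_year) / doubling_period
    return start_count * 2 ** periods

# A chip with 2,300 transistors in 1971, projected 30 years out:
print(round(transistors(2300, 1971, 2001)))  # about 75 million
```

Fifteen doublings in thirty years, a factor of 32,768 -- which is why the law attracts so much punditry.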
This image is from the Wikipedia article on Moore's Law, and has been released into the public domain.
Gordon Moore is one of the co-founders of Intel (and not surprisingly, Intel has its own page on Moore's Law).
Here's the rub: current processor chips are already built at a 65-nanometer scale, and for Moore's Law to hold, the lines etched in the silicon will eventually have to shrink to 15 nanometers in width (about 30 atoms wide). We are approaching some theoretical limits here...
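The "about 30 atoms wide" figure follows from the rough per-atom width it implies. The ~0.5 nm atom width below is an approximation back-derived from that claim; the actual spacing of atoms in a silicon lattice differs somewhat.

```python
# Feature-size arithmetic: how many atoms fit across a 15 nm line,
# assuming roughly 0.5 nm per atom (an approximation, not a measured
# silicon lattice constant).
LINE_WIDTH_NM = 15
ATOM_WIDTH_NM = 0.5

atoms_wide = LINE_WIDTH_NM / ATOM_WIDTH_NM
print(f"A {LINE_WIDTH_NM} nm line is roughly {atoms_wide:.0f} atoms wide")
```

At that scale there are simply very few atoms left to work with, which is what makes the theoretical limits feel close.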
On a slight change of subject, the story of Admiral Grace Hopper is fascinating. She used a short length of wire to illustrate the problems to be solved as the need for faster computers became more acute. She called her piece of wire "a nanosecond": its length was the distance light travels in one nanosecond (1 billionth of a second). I'll leave it as an exercise to the reader to compute the length of that wire -- it's easier using the metric system -- and no fair reading the Wikipedia article for the answer. The speed of light in a vacuum is 299,792,458 meters/second.
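For readers who have done the exercise and want to check their answer (spoiler ahead, so stop here if you haven't), the arithmetic is a one-liner:

```python
# Length of Admiral Hopper's "nanosecond" of wire: the distance
# light travels in a vacuum in one billionth of a second.
C = 299_792_458    # speed of light in a vacuum, meters/second
NANOSECOND = 1e-9  # seconds

length_m = C * NANOSECOND
print(f"{length_m:.4f} m, or about {length_m * 100:.0f} cm")
```

Just under 30 centimeters -- short enough to hold up in front of an audience, which was exactly Hopper's point.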
As with Moore's Law, Hopper's illustration is instructive. You might say Moore's Law not only made it possible to fit more powerful computers into the same amount of space; it also allowed that space to become smaller, shortening the distance between components. To perhaps oversimplify, this is what made faster computers a reality.