Forty years and one day ago, the PC revolution started when Micro Instrumentation and Telemetry Systems (MITS) released its Altair 8800 personal computer.
In 1994, this guy, Bill Gates, said a few words about his and Paul Allen's decision to write BASIC for that machine (which was released in early 1975). Note that BASIC had been invented in 1964 at Dartmouth College in New Hampshire as a language meant to be accessible to everyone.
I was just one year and two weeks old when the model was introduced. But even for those of you who are older, it must feel like some mysterious prehistory of the PC because almost no one bought it. It used the Intel 8080 microprocessor whose architecture is, up to relatively minor variations, still around in Windows PCs. That microprocessor was introduced in 1974, two years after Intel's first 8-bit microprocessor, the Intel 8008 (see its restricted instruction set).
I find the modesty and innocence of those early computers touching and remarkable, knowing that pretty much all of the gadgets we own today grew up as variations of those early attempts. It's fun to watch a few videos about the Altair 8800. For example, I enjoyed this demonstration in which a rather primitive text game is loaded and started.
You had to write the programs in machine code by flipping the switches on the front of the ugly box. Gates and Allen decided that a more modern way of entering the code was needed, and they delivered it rather quickly. Once Altair BASIC from the small company Micro-Soft became a part of the computer, writing code got much easier; I suppose the machine was later sold with Altair BASIC and with the kind of TV output and keyboard that we knew from the epoch of the Sinclairs and Commodores.
You had something like 4 kB or 8 kB of memory for the software and the data, for everything, and one could still do nontrivial things. The 64 kB on the Commodore 64 (of which only about 38 kB was actually free for BASIC programs) seemed "more than enough" for many graphics-non-intensive projects.
Compare it with the technology in 2014.
The cheapest cell phones still have at least half a gigabyte of memory – 100,000 times more than that 1974 personal computer had. Today, even the smallest "tiny programs" you download from the Play Store, the App Store, or the Windows Phone Store are slightly below 1 MB, and you routinely download programs that are closer to 1 GB. The internal flash memory is even bigger. The increase of the memory is comparable to a factor of one million. With that increase, there is of course a lot of wasted memory. It's likely that most programs we use today could be shortened to 1/10 or a smaller fraction without crippling their essential functionality. But people are no longer forced to save memory.
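If you want to see where these round figures come from, here is a tiny back-of-the-envelope sketch in Python; the 8 GB of internal flash is just my illustrative assumption for a cheap 2014 phone, not a number taken from any particular model.

```python
# A back-of-the-envelope check of the memory ratios above.  The 8 GB of
# internal flash is an illustrative assumption for a cheap 2014 phone.
altair_ram  = 4 * 1024          # bytes: a 4 kB Altair 8800 configuration
phone_ram   = 512 * 1024**2     # bytes: half a gigabyte of RAM
phone_flash = 8 * 1024**3       # bytes: internal flash storage (assumed)

print(phone_ram // altair_ram)    # 131072  -> the "100,000 times more"
print(phone_flash // altair_ram)  # 2097152 -> "comparable to a factor of one million"
```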
The frequency of the 1972 Intel 8008 was 0.5 MHz. These days, we have multi-gigahertz microprocessors, i.e. many thousands of times faster. You may still see that the increase of the memory was more substantial – this increase is comparable to the "square" of the relative increase of the frequency. That's roughly compatible with the idea that the bits are still being encoded in a two-dimensional array (therefore the second power) because the length scale of the microprocessors has shrunk (almost) 1,000-fold, much like the frequency, from the 10 microns of the Intel 8008 to the 10 nanometers of the state-of-the-art Intel and other microprocessor technologies.
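Here is the same scaling argument as a small sketch – the ~3 GHz clock of a 2014 CPU is an illustrative assumption; the point is only that the frequency grew roughly like the inverse of the feature size while a two-dimensional memory grows roughly like its square.

```python
# Scaling relations sketched above: features shrank ~1,000-fold, the clock
# frequency grew by a comparable factor, and a 2D array of bits grows as
# the square of the shrink factor.  The 3 GHz clock is an assumption.
length_1972, length_2014 = 10e-6, 10e-9   # metres: 10 microns -> 10 nanometers
freq_1972,   freq_2014   = 0.5e6, 3e9     # hertz:  0.5 MHz    -> ~3 GHz

shrink      = length_1972 / length_2014   # ~1,000x shorter features
freq_gain   = freq_2014 / freq_1972       # ~6,000x faster clock, same ballpark
memory_gain = shrink ** 2                 # ~1,000,000x if bits fill a 2D array

print(shrink, freq_gain, memory_gain)     # roughly 1000, 6000, 1e6
```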
If you use only the two data points, 1972 and 2014, to estimate the rate of the Moore's-law exponential increases, you see that 42 years, or about 500 months, is needed to shrink the distances 1,000 times – let's say 1,024 times. But 1,024 is 2 to the tenth power, so every 50 months (4 years or so), the frequency doubles, the length scale gets halved, and the memory quadruples (i.e. the memory doubles every 2 years or so). And despite these improvements, we seem to pay a slightly decreasing price for a not-really-the-same thing! That's somewhat slower progress than what we often hear about in the context of Moore's law, but it may be the more realistic rate in recent years.
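A minimal sketch of that two-point fit, if you want to reproduce the ~50-month and ~25-month periods:

```python
# Doubling/halving periods implied by fitting just the two data points.
import math

years  = 42
months = years * 12                 # ~500 months between 1972 and 2014
halvings = math.log2(1024)          # the ~1,000-fold shrink taken as 2**10

months_per_halving = months / halvings                 # ~50 months: length halves, frequency doubles
months_per_memory_doubling = months_per_halving / 2    # ~25 months: memory quadruples per halving

print(months_per_halving, months_per_memory_doubling)  # 50.4 and 25.2
```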
By the way, the non-increasing price makes it funny to think about the "economics" of those computers. We pay a part of the price of PCs and cell phones for the RAM, the hard disk, and so on. Just imagine how much some extra 8 kB would cost today: a thousandth of a penny. And it was comparable to tens or hundreds of (more valuable) dollars in the past. Clearly, "1 kB of memory" isn't a good entry for a basket used to quantify inflation. The deflation rate that you would get from this memory-and-disk basket would be remarkable, something like minus 30% per year – the price per byte halving roughly every two years.
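A quick estimate of that implied deflation rate, assuming the price per byte fell roughly a million times over those 42 years (my rough reading of the figures above):

```python
# Annual deflation rate implied by a byte of memory getting roughly a
# million times cheaper over ~42 years (an assumed overall factor).
factor, years = 1e6, 42

annual_cheapening = factor ** (1 / years)   # ~1.39x cheaper every year
deflation = 1 - 1 / annual_cheapening       # fraction of the price shaved off per year

print(round(100 * deflation, 1))            # ~28.0 (percent per year)
```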
The same, if slightly less extreme, comments apply to code. Imagine how important Altair BASIC was for Bill Gates – and it was just a few kilobytes of machine code. How much money does an average programmer get for a few kilobytes of code today?
Can Moore's law continue?
Well, using the same technologies, we surely can't add another 40 years of the same exponential progress – which means, as the Age of Stupid "movie" correctly calculates LOL, that the doom must arrive by 2055. ;-) The microprocessors would have to go from 10 nanometers to 10 picometers, which is already smaller than atoms. ;-) What we have today, 10 nanometers, is already just 100 atomic radii or so. It means that we only have about "20 years" of progress in shrinking dimensions, extrapolated from the past, before we hit the atomic scale.
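A small sketch of that extrapolation – the exact atomic "floor" (I take ~0.3 nm, a few atomic radii) is my assumption, but the order of magnitude is what matters:

```python
# How many years of the extrapolated shrinking remain before feature sizes
# hit the atomic scale; the ~0.3 nm "floor" is an illustrative assumption.
import math

current_size = 10e-9        # metres: today's ~10 nm features
atomic_floor = 0.3e-9       # metres: a few atomic radii (assumed)

years_per_halving = 50 / 12                              # ~4.2 years, from the fit above
halvings_left = math.log2(current_size / atomic_floor)   # ~5 more halvings
years_left = halvings_left * years_per_halving           # ~21 years -> the "20 years" above

print(round(years_left))    # ~21
```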
I would actually be surprised if the 14 nm (2014) or 10 nm (2016) technologies were ever succeeded by processes below 1 nm.
Along with that, the memory in our gadgets may increase by less than a factor of 10,000 if it remains stuck in two-dimensional arrays – and my guess is that it won't jump more than 100-fold in devices of the same size. I still recommend that everyone try to switch to three dimensions: microprocessors and memories should have lots of (interconnected?) layers. Even though the shrinking of the feature dimensions will essentially stop in the coming decades, there will still be some room for shrinking the overall size of the microprocessors – and therefore increasing the frequency – if the three-dimensional construction is used somewhat effectively. And there will be some extra increase of the memory. The physical nature of the microscopic components may keep on changing (it hasn't changed much for microprocessors so far).
But it's hard to see how we could go beyond that. In 40 years, the electronics we use today will probably be everywhere. Every bottle of beer will be intelligent, remembering whom it belongs to, whether it's fresh, asking to be saved from an overly hot environment, and many other things. But it's plausible that electronics of a size comparable to today's won't exceed 1,000 times – and I would even say 100 times – the memory and CPU power that we have today.
Quantum computers may be constructed – I think it is guaranteed that they are physically possible – but I don't really believe that they will change the world of the devices we have today, because they only offer a speedup for some rather specific computational tasks and they represent no progress at all when it comes to the "mundane" activities that dominate cell phones etc. today – video editing is among the most complex and CPU+memory-demanding tasks that we need today.
It's also fun to think about Moore's law in the context of biology. Many of the principles of evolution may be much more analogous to the world of electronics than most people think.