In it, he noted [PDF] that in three years, the optimal cost per component on a chip had dropped by a factor of 10, while the optimal number of components had increased by the same factor, from 10 to 100. Based on little more than these few data points and his knowledge of silicon chip development – he was head of R&D at Fairchild Semiconductor, the company that was to seed Silicon Valley – he predicted that for the next decade, component counts per unit area could double every year. By 1975, as far ahead as he was willing to look, up to 65,000 components such as transistors could fit on a single chip costing no more than the 100-component chips of the day.
He was right. Furthermore, as transistors shrank they used less power and switched faster, leading to stupendous sustained cost/performance improvements. In 1975, eight years after leaving Fairchild to co-found Intel, Moore revised his "law" – actually just an observation – to a doubling every two years. By then, the other predictions in his original paper, of revolutions in computing, communications and general electronics, had taken hold. The chip industry had the perfect metric to aim for: a rolling, virtuous milestone like no other.
Since then, according to Professor Erica Fuchs of Carnegie Mellon University, "half of economic growth in the US and worldwide has also been attributed to this trend and the innovations it enabled throughout the economy." Virtually all of industry, science, medicine, and every aspect of daily life now depends on computers that are ever faster, cheaper, and more widely spread.