Exponential View by Azeem Azhar

2023-03-18 12:30:07

We’ve got so used to computing getting ever cheaper that it has been an article of faith that it will keep getting cheaper. Rodolfo Rosini, CEO of a stealth startup and long-time EV member, asks us to think differently. What if we are reaching the fundamental physical limits of classical models of computation, just as our economies depend on cheap computation for their effective functioning?

We are sending this out on Saturday so you can have a chance to relax over this challenging idea. Tomorrow’s newsletter goes deep into AI analysis, so keep a cup of coffee handy for that as well.

TL;DR: The Great Stagnation 1 puts forward the contrarian argument that the United States (and, by extension, allied Western economies) have reached a plateau due to a lack of technological innovation. A strong argument can be made that computing has entered such an era, and this can be validated by observing the cost of computation and its impact on productivity. Moreover, the situation is due to deteriorate even further over the next decade.

In many industries, Wright’s Law 2 holds: each doubling of cumulative production brings a roughly constant percentage fall in unit cost, typically around 20%. In the technology sector, Wright’s Law manifests as part of Moore’s Law, first described in the 1960s by Intel co-founder Gordon Moore, who noticed that the number of transistors in integrated circuits appeared to double almost year-on-year. Since then, this observation has become the foundation of a covenant between marketing and engineering forces: a drive to build products across the computing stack that exploit this excess computing capacity and reduction in size 3. The promise is ever faster and cheaper microprocessors, with an exponential improvement in computing capability over time.
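To make the arithmetic behind Wright’s Law concrete, here is a minimal sketch in Python. The 20% learning rate and the function name are illustrative assumptions, not figures from the newsletter: the point is simply that cost falls by a fixed fraction with every doubling of cumulative output.

```python
import math

def wrights_law_cost(initial_cost: float, cumulative_units: float,
                     learning_rate: float = 0.20) -> float:
    """Unit cost after producing `cumulative_units`, per Wright's Law.

    Assumes a hypothetical learning rate of 20%: each doubling of
    cumulative production multiplies unit cost by (1 - learning_rate).
    """
    # cost(x) = initial_cost * x ** (-b), where b is chosen so that
    # doubling x multiplies cost by (1 - learning_rate)
    b = -math.log2(1 - learning_rate)
    return initial_cost * cumulative_units ** (-b)

# Each doubling of cumulative output cuts unit cost by 20%:
print(wrights_law_cost(100.0, 1))   # ≈ 100.0
print(wrights_law_cost(100.0, 2))   # ≈ 80.0
print(wrights_law_cost(100.0, 4))   # ≈ 64.0
```

Under these assumptions, a thousand-fold increase in cumulative production (about ten doublings) would cut unit cost by roughly 90%, which is the kind of compounding decline the computing industry has come to take for granted.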
