When I talk about the inefficiency of popular blockchains, or mention that we seem to be hurtling towards a “web3” so centralized it rivals big tech’s firm grasp on today’s web, or point out that no one seems to have found a positive use for blockchains that wouldn’t be better served by one of the many more performant databases available to us these days, I often hear “it’s the early days”. “Give it a chance”. “People are still figuring all this blockchain stuff out, ironing out the kinks”.
Bitcoin, currently one of the best-known and most-used blockchains, went live in 2009. Ethereum, another well-known and popular blockchain, launched in 2015. In the grand scheme of things, 2009 and 2015 were not that long ago. But in the technology world, they were a lifetime ago.
In 2009, smartphones without physical keyboards were just starting to become popular. We still aggregated our favorite blogs to read on Google Reader, though people had started posting their thoughts on this weird new website called “Twitter”. VentureBeat had just published an article urging people not to “believe the hype” around fully-electric cars, writing that Tesla was “floundering” after bringing in Elon Musk as CEO the year before. Uber was founded, and people began talking more widely about this “gig economy” idea. Intel’s Core i-series processors were brand new, beginning with the i7; the i5 followed later that year. Consumer-grade desktop computers typically shipped with 2 or 4GB of RAM. In the software world more specifically, Go was publicly announced, though not yet popular. MongoDB and Redis were brand-new players in the database world. jQuery was taking off, on its way to the near-ubiquity it would reach over the coming years. Node.js saw its first release. Windows 7 was the hot new thing, after the horror that was Windows Vista.