For decades corporations have been doing anything in their power to make computers worse. Software used to be much faster and much leaner than it is now. Hardware performance is progressing by leaps and bounds, yet somehow the software keeps getting slower. Yes, the programs we run (cough, web applications, cough) are more complex, but the increase in complexity is smaller than the improvements in compute.
Software also used to be substantially better written. When shipping patches to clients was a risk in itself, releasing barely working software was akin to shooting yourself in the foot. Now bugs are expected. Again - I am comparing simpler software to what we have now, but the increase in complexity is smaller than the increase in budgets and team sizes.
I am often accused of romanticizing the past, but I am far from that. Software of the golden age had plenty of problems; it was far from perfect. I am critical of the path we took - it was aimed not at better software, but at better return on investment. We could have had much better working computers, but here we are. Everything is barely working, but the dollars pour in. Some call it enshittification, others call it disruption. All I know is that this is not going to stop. It is never enough.
This brings me to ChatGPT, LLMs, and all that crap. I have to admit that the way it works is extremely cool. Some vector math is able to fool us into thinking we are interacting with a human being? How cool! I'd love to be amazed by it, but if we look at it critically, it is a terrible piece of software. It is slow, expensive, and non-deterministic. If I released code to production that had some chance of working or not, I would need to fix it. If the same software yielded different results by design, I'd need to rewrite it.
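To be concrete about "different results by design": a toy sketch of temperature sampling, the decoding step where the non-determinism lives. The token names and logit values here are made up for illustration; this is not any real model's code, just the general technique.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Turn raw scores into probabilities; temperature flattens or sharpens them."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature, rng):
    """Pick the next token by weighted random choice -- a different draw each run."""
    probs = softmax(logits, temperature)
    return rng.choices(tokens, weights=probs, k=1)[0]

# Hypothetical vocabulary and scores, purely for demonstration.
tokens = ["cat", "dog", "fish"]
logits = [2.0, 1.5, 0.5]

# Greedy decoding (always take the argmax) is deterministic...
greedy = tokens[logits.index(max(logits))]

# ...but sampling with temperature > 0 yields different answers on different runs.
samples = {sample_token(tokens, logits, temperature=1.0, rng=random.Random(seed))
           for seed in range(100)}
```

Run it twice without fixing the seed and `sample_token` will happily disagree with itself: that variability is not a bug in the implementation, it is the specification.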