Despite the provocative title, I don't want to strawman this argument, so let's pull the definition straight from every keyboard warrior's best friend, Wikipedia:
The technological singularity — or simply the singularity — is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.
There's an entire subreddit dedicated to this word, with over 3 million members, awaiting this fated singularity moment, kinda like a religion. It's a bit weird and creepy, to be honest.
After reading a few posts on this illustrious (read: sarcasm) singularity subreddit, I'm convinced the vast majority of its posters and critical thinkers (read: clueless gamers) assume the singularity will be some sort of instantaneous moment after which god knows what happens, and we'll never know, because we'll all be meat bags floating in some primordial soup (or maybe we won't exist at all, because, you know, the AI will become self-aware and start launching nukes… wait a second, I've seen this movie before). I'd argue this notion is like the start of a dream: you somehow "know" how you got there, but if you really think about it, you can't actually explain how you got there at all! (Yes, tip of the hat to Inception.)
A “singularity” of this nature would require some sort of magic-level development in multiple areas of science: physics, biology, chemistry, and probably even a bit of math, too. I also mentioned this in my other recent post, Why LLMs will never be AGI. But seeing as there seems to be no shortage of overhyped AI hucksters, I figured I'd write another post to keep drilling the message home. (Not to mention that my last post was quite popular 😉)