
Everything Is Bullshit


There’s a popular story being told by effective altruist / longtermist / rationalist / “grey tribe” / Bay Area / nerdy types. The story goes something like this:

If you look at the history of life on Earth, humans are a new species. There’s a good chance we could be around for another hundred million years. Whoa. Think of what we could accomplish with that time—brain uploads, Dyson spheres, quantum supercomputers, immortality… We could fill the entire cosmos with our digital consciousnesses, enjoying simulated ecstasy beyond our wildest dreams!

What if we create an artificial superintelligence that breaks free of human control and murders everyone on the planet? Or what if some diabolical bioweapon escapes the lab and destroys the human race? Or what if we accidentally blow up the world in a nuclear Armageddon? These three threats—superintelligent AI, bioweapons, and nuclear war (but mostly just AI, if we’re being honest)—are our biggest existential risks, or threats to humanity’s long-term potential. What we do now, with these dangerous technologies, could determine the fate of our species for millions of years.

Now, the effective altruist / longtermist / rationalist / “grey tribe” / Bay Area nerdy types have a point. They’re on to something with this idea of “threats to humanity’s long-term potential” or “existential risks” to our awesome future. All I would suggest is that they add one more to their list: mediocrity.
