Long-Termism's Irony

As defined by Will MacAskill on the EA Forum, longtermism is “the view that the most important determinant of the value of our actions today is how those actions affect the very long-run future.”

The irony is that this only holds true in the abstract. According to an Open Philanthropy estimate and surveys of AI experts, there’s a 50% chance of transformative artificial intelligence emerging by around 2050. If that happens, basically nothing else we do in the meantime will matter, at least not with regard to total expected utility. Ensuring that the AI is safe, human-aligned, benevolent, etc., becomes of primary and nearly sole importance.
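
To see why the AI term swamps everything else, here’s a minimal expected-value sketch. All of the magnitudes below are made-up placeholders (only the 50% probability comes from the estimates above); the point is just that if the aligned outcome is astronomically valuable, even tiny shifts in alignment odds dominate everything else combined.

```python
# Toy expected-value comparison. Only p_tai reflects the cited estimates;
# every value below is an assumed placeholder for illustration.

p_tai = 0.5        # ~50% chance of transformative AI by ~2050
v_aligned = 1e15   # assumed astronomical value if the AI is aligned
v_misaligned = 0.0 # assumed near-total loss if it is not
v_other = 1e6      # assumed value of all non-AI-safety work combined

def expected_value(p_aligned_given_tai: float) -> float:
    """Total expected utility as a function of alignment odds."""
    ev_tai = p_tai * (p_aligned_given_tai * v_aligned
                      + (1 - p_aligned_given_tai) * v_misaligned)
    return ev_tai + v_other

# Nudging alignment odds from 50% to 51% shifts expected value by ~5e12,
# dwarfing the entire v_other term (1e6).
print(expected_value(0.51) - expected_value(0.50))
print(v_other)
```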

This has practical implications. For example: should you expend energy cultivating the next generation of scientists, or just focus on your own research output? If we really only have about 30 years, the latter becomes much more compelling: a student you start mentoring today may barely reach peak productivity before the deadline.

Similarly, if you’re serious about longtermism, the altruistic case for having children becomes much weaker. Particularly precocious offspring might be able to do productive longtermism-relevant work in their mid-twenties, but that’s not enough time to recoup the cost of raising them. [1][2]
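
The back-of-envelope arithmetic behind this, using round assumed numbers (a 2021 birth year and productivity starting at 25 are my placeholders; only the ~2050 median comes from the estimates above):

```python
# Illustrative timeline arithmetic; birth_year and productive_age are assumptions.

median_tai_year = 2050  # ~50% transformative-AI estimate cited above
birth_year = 2021
productive_age = 25     # "mid-twenties" start of longtermism-relevant work

child_working_years = max(0, median_tai_year - (birth_year + productive_age))
parent_working_years = median_tai_year - birth_year

print(child_working_years)   # 4: relevant working years the child gets
print(parent_working_years)  # 29: direct working years the parent gets
```

On these numbers, a child contributes roughly 4 relevant years against two decades of child-rearing costs, while the would-be parent could contribute about 29 years directly.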
