AI and the paperclip problem

Submitted by Style Pass, 2023-01-24 05:00:09

Philosophers have speculated that an AI assigned a goal such as creating paperclips might cause an apocalypse by learning to divert ever-increasing resources to the task, and then learning how to resist our attempts to turn it off. But this column argues that, to acquire such resources, the paperclip-making AI would need to create another AI that could gain power both over humans and over itself, and so it would self-regulate to prevent this outcome. Humans who create AIs with the explicit goal of acquiring power may be a greater existential threat.

Professor of Strategic Management and Jeffrey S. Skoll Chair of Technical Innovation and Entrepreneurship, Rotman School of Management, University of Toronto
