Elon Musk, Andrew Yang, and Steve Wozniak Propose an A.I. 'Pause.' It's a Bad Idea and Won't Work Anyway.


"AI systems with human-competitive intelligence can pose profound risks to society and humanity," asserts an open letter signed by Twitter's Elon Musk, universal basic income advocate Andrew Yang, Apple co-founder Steve Wozniak, DeepMind researcher Victoria Krakovna, Machine Intelligence Research Institute co-founder Brian Atkins, and hundreds of other tech luminaries. The letter calls "on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4." If "all key actors" will not voluntarily go along with a "public and verifiable" pause, the letter's signatories argue that "governments should step in and institute a moratorium."

The signatories further demand that "powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable." This amounts to a requirement for nearly perfect foresight before allowing the development of artificial intelligence (A.I.) systems to go forward.

Human beings are really, really terrible at foresight—especially apocalyptic foresight. Hundreds of millions of people did not die from famine in the 1970s; 75 percent of all living animal species did not go extinct before the year 2000; and the "war, starvation, economic recession, possibly even the extinction of homo sapiens" predicted to follow a peak in global petroleum production in 2006 never came to pass, because that peak never happened.
