We take for granted that an event in one part of the world cannot instantly affect what happens far away. This principle, which physicists call locality, was long regarded as a bedrock assumption about the laws of physics. So when Albert Einstein and two colleagues showed in 1935 that quantum mechanics permits “spooky action at a distance,” as Einstein put it, this feature of the theory seemed highly suspect. Physicists wondered whether quantum mechanics was missing something.
Then in 1964, with the stroke of a pen, the Northern Irish physicist John Stewart Bell demoted locality from a cherished principle to a testable hypothesis. Bell proved that quantum mechanics predicted stronger statistical correlations in the outcomes of certain far-apart measurements than any local theory possibly could. In the years since, experiments have vindicated quantum mechanics again and again.
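The statistical gap Bell identified is easiest to see in the CHSH form of his inequality: any local theory keeps a particular combination of four correlations, S, between −2 and +2, while quantum mechanics, applied to an entangled pair of particles, pushes it to 2√2 ≈ 2.83. A minimal numerical sketch, using the standard textbook singlet-state correlation E(a, b) = −cos(a − b) and the usual optimal measurement angles (both are assumptions for illustration, not details from this article):

```python
import math

def correlation(a, b):
    # Quantum prediction for the correlation between two spin
    # measurements at angles a and b on a singlet (entangled) pair.
    return -math.cos(a - b)

# Standard angle choices that maximize the CHSH combination.
a1, a2 = 0.0, math.pi / 2          # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

S = (correlation(a1, b1) - correlation(a1, b2)
     + correlation(a2, b1) + correlation(a2, b2))

print(abs(S))       # ~2.828, i.e. 2*sqrt(2)
print(abs(S) > 2)   # exceeds the bound any local theory can reach
```

Every local theory is mathematically confined to |S| ≤ 2, so measuring a value near 2.83, as experiments repeatedly have, rules out the entire class of local explanations at once.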
Bell’s theorem upended one of our most deeply held intuitions about physics, and prompted physicists to explore how quantum mechanics might enable tasks unimaginable in a classical world. “The quantum revolution that’s happening now, and all these quantum technologies — that’s 100% thanks to Bell’s theorem,” says Krister Shalm, a quantum physicist at the National Institute of Standards and Technology.