Back in 1996, at age 10, I played a computer game at a friend’s house called Spycraft: The Great Game. In the game, you play as a CIA operative investigating an assassination plot; to mislead a suspect during an interrogation, you have the option to doctor a photograph. The process blew my 10-year-old mind — so much so that I still remember, all these years later, how powerful that minigame felt. Although it was blurry and pixelated, the photo editor that appeared in Spycraft was a bit like what Adobe Photoshop would one day become. In 1996, it felt like the stuff of high-tech espionage and trickery. In 2023, it’s utterly mundane. Altering a photograph is no longer difficult or expensive. Anyone can do it, and as a result, we have all come to accept that we cannot trust any image we see.
Deepfake technology has already proven that we can’t trust video or audio recordings, either. And the prevalence of generative artificial intelligence has only made creating such deepfakes easier. We all need to get used to this new reality — and fast.