Last Saturday, a Telegram message popped up on Heejin’s phone from an anonymous sender. “Your pictures and personal information have been leaked. Let’s discuss.”
As the university student entered the chatroom to read the message, she received a photo of herself taken a few years ago while she was still at school. It was followed by a second image made from the same photo, only this one was sexually explicit, and fake.
Terrified, Heejin, not her real name, did not respond, but the images kept coming. In all of them, her face had been attached to a body engaged in a sex act, using sophisticated deepfake technology.
Deepfakes, the majority of which combine a real person’s face with a fake, sexually explicit body, are increasingly being generated using artificial intelligence.
Two days earlier, South Korean journalist Ko Narin had published what would turn into the biggest scoop of her career. It had recently emerged that police were investigating deepfake porn rings at two of the country's major universities, and Ms Ko was convinced there must be more.