
The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for deepfakes

2024-02-11 04:00:03

The White House is increasingly aware that the American public needs a way to tell that statements from President Joe Biden and related information are real in the new age of easy-to-use generative AI.

People in the White House have been looking into AI and generative AI since Biden took office in 2021, but in the last year the use of generative AI has exploded following the release of OpenAI's ChatGPT. Big Tech players such as Meta, Google, and Microsoft, along with a range of startups, have raced to release consumer-friendly AI tools, leading to a new wave of deepfakes. Last month, an AI-generated robocall that mimicked Biden's voice attempted to discourage voting ahead of the 2024 presidential election.

On Thursday, the Federal Communications Commission declared that such calls are illegal. Yet there is no end in sight to increasingly sophisticated generative-AI tools that make it easy for people with little to no technical know-how to create fake images, videos, and calls that seem authentic.

That's a problem for any government looking to be a trusted source of information. Ben Buchanan, Biden's Special Advisor for Artificial Intelligence, told Business Insider that the White House is working on a way to verify all of its official communications due to the rise in fake generative-AI content.
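The article doesn't describe the mechanism, but "cryptographically verifying" a communication generally means publishing it alongside a digital signature that anyone can check against a known public key. The sketch below is only an illustration under that assumption, using Ed25519 signatures from Python's `cryptography` package; the function names and the detached-signature design are hypothetical, and the actual White House approach (for example, C2PA-style content credentials embedded in a file's metadata) may differ.

```python
# Minimal sketch of signature-based verification of an official video.
# Assumptions (not from the article): the publisher holds an Ed25519 keypair,
# and a detached signature is distributed alongside the video file.
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)
from cryptography.exceptions import InvalidSignature
import hashlib


def sign_video(private_key: Ed25519PrivateKey, video_bytes: bytes) -> bytes:
    """Publisher side: sign a hash of the video and release the signature."""
    digest = hashlib.sha256(video_bytes).digest()
    return private_key.sign(digest)


def verify_video(public_key: Ed25519PublicKey, video_bytes: bytes, signature: bytes) -> bool:
    """Viewer side: check the signature against the publisher's public key."""
    digest = hashlib.sha256(video_bytes).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # Demo with a throwaway keypair; in practice the public key would be
    # published once (e.g. on an official website) and pinned by verifiers.
    key = Ed25519PrivateKey.generate()
    video = b"...raw bytes of an official video..."
    sig = sign_video(key, video)
    print(verify_video(key.public_key(), video, sig))                 # True
    print(verify_video(key.public_key(), video + b"tampered", sig))   # False
```

The point of such a scheme is that any alteration of the video bytes after signing makes verification fail, which is what would let viewers distinguish genuine footage from a deepfake edit, provided they trust the published public key.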
