Saturday, November 2, 2024

Chinese Deepfakes A Threat To US National Security


Chinese AI Models Create Deepfakes and Misinformation, Posing a Threat to US National Security

Deepfakes are realistic but fabricated videos created by AI algorithms trained on large amounts of online footage. They surface on social media, blurring fact and fiction, for example in the polarized world of U.S. politics.

New generative AI tools such as Midjourney make it cheap and easy to create convincing deepfakes. Synthetic media of this kind has existed for several years, but the technology has advanced rapidly over the past year.

In a report released this March, researchers said that AI-powered image creation tools from companies including OpenAI and Microsoft can be used to produce material promoting election- or voting-related disinformation, despite each company's policies against creating misleading content.

Some disinformation campaigns simply harness the ability of AI to mimic real news articles as a means of disseminating false information.

Major social media platforms like Facebook, Twitter, and YouTube have made efforts to prohibit and remove deepfakes, but their effectiveness at policing such content varies.
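
Automated screening for synthetic media generally relies on classifiers trained to distinguish real footage from AI-generated imagery. The sketch below is a minimal illustration of that idea in Python, assuming a Hugging Face image-classification model; the model name is a hypothetical placeholder, not a detector actually deployed by any of these platforms.

```python
# Minimal sketch of automated synthetic-media screening (illustrative only).
# Assumes a Hugging Face image-classification model fine-tuned to label frames
# as "real" or "fake"; the model name below is a hypothetical placeholder.
from transformers import pipeline
from PIL import Image

detector = pipeline(
    "image-classification",
    model="example-org/deepfake-frame-detector",  # hypothetical placeholder
)

def flag_frame(path: str, threshold: float = 0.9) -> bool:
    """Return True if the frame's 'fake' score exceeds the review threshold."""
    scores = detector(Image.open(path))  # list of {"label": ..., "score": ...}
    fake_score = max(
        (s["score"] for s in scores if s["label"].lower() == "fake"),
        default=0.0,
    )
    return fake_score >= threshold

if __name__ == "__main__":
    if flag_frame("suspect_frame.jpg"):
        print("Frame flagged as likely AI-generated; route to human review.")
    else:
        print("Frame passed automated screening.")
```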

For example, the Department of Homeland Security (DHS) said in its 2024 Homeland Threat Assessment, released last year, that a Chinese government-controlled news site used a generative AI platform to push a false claim that the United States was running a lab in Kazakhstan to create biological weapons for use against China.

On Wednesday, National Security Advisor Jake Sullivan, speaking at an AI event in Washington, said the problem has no easy solutions because it combines the capabilities of AI with “the intent of state, non-state actors, to use disinformation at scale, to disrupt democracies, to advance propaganda, to shape perception in the world.”

“Right now the offense is beating the defense big time,” he added.
