Earlier this summer, photos circulated on social media showing former US President Donald Trump hugging and kissing Dr. Anthony Fauci. The images, of course, were not real, but they were not the work of some prankster. Created with the help of artificial intelligence-powered “deepfake” technology, they were shared online by Florida Governor Ron DeSantis’ rapid response team.
It was part of a campaign effort to criticize Trump for not firing Fauci, the former US infectious disease official who pushed for Covid-19 restrictions at the height of the pandemic.
The use of deepfakes in the 2024 election is already seen as a major concern, and last month the Federal Election Commission began a process that could lead to the regulation of AI-generated content in political ads ahead of the vote. Advocates say such rules are necessary to protect voters from election misinformation.
The real AI threat
For years, there have been warnings about the dangers of AI, with most critics suggesting that machines could take over in a scenario similar to science fiction movies such as The Terminator or The Matrix, where they literally rise up and enslave humanity.
Still, the clear and present threat could actually be the use of AI to trick voters heading into the next primary season.
“Deepfakes are almost certain to influence the 2024 elections,” warned Dr. Craig Albert, professor of political science and graduate director of the Master of Arts in Intelligence and Security Studies program at Augusta University.
“In fact, the US intelligence community had anticipated these types of social media influence operations during the last major election cycle in 2022, but they had no tangible impact,” Albert noted.
However, the international community has already seen sophisticated deepfakes deployed in the Russia-Ukraine war. Although the most sophisticated of these have come from Ukraine, it is certain that the Russian government has taken notice and plans to use them in the near future, Albert suggested.
“Based on their history of social media information warfare and how they have generally influenced US elections over the last decade, it is almost certain that the US can expect to see this in the 2024 election cycle,” he added.
Too much trust in social media
The threat of AI-generated content is heightened by the fact that many Americans now rely on social media as their primary news source. Videos from accounts that have paid to be “verified” on platforms like X (formerly Twitter) and Facebook can quickly go viral, and even if some users question the validity of content from otherwise unknown sources, many others will believe it to be real.
The problem is made worse by the general lack of trust in politicians today.
“The risk for individuals is that this practice can do a lot of damage to the image and credibility of the person being attacked, and eventually laws will be passed that penalize the practice more effectively,” suggested Rob Enderle, technology industry analyst with the Enderle Group.
“Identity theft laws could be enforced now, once lawyers start figuring out how to use them to curtail this behavior,” Enderle added. “It’s one thing to accuse an opponent of doing something they didn’t do, but creating false evidence to convince others should be illegal, and laws need to be reformed to deal with this bad behavior more effectively.”
Fighting deepfakes
Political candidates at all levels should not wait for the FEC to act. To protect election integrity, anyone seeking office should be urged not to use deepfakes or other manipulated videos and photos as campaign tools.
“Beyond a doubt, all US officials must agree not to engage in any social media or cyber-enabled influence campaigns, including deepfakes, whether for domestic or foreign use,” Albert said. “Candidates should not endorse influence campaigns within the US intended to sway voting behavior or policymaking at all. Engaging in the creation or construction of deepfakes would fall into this category and should be strictly prohibited for candidates and politicians on ethical and national security grounds.”
However, even as candidates make such pledges, there will be domestic and foreign actors using the technology. All political campaigns will be on the lookout for such attacks, but voters need to be vigilant as well, and much of that vigilance is actually pretty straightforward.
“Never trust unverified, unofficial sources for video and sound bites,” Albert added. “All of this is easy to fake, manipulate and distort, and for candidate pages, it’s easy to create cyber-personas that aren’t genuine. If a video, sound bite or social media post appears in the public domain and seems designed to evoke some kind of emotional reaction, that is a sign to be slow to judge the material until it has been verified as authentic.”