China uses new techniques to sow misinformation about the Maui wildfires


When wildfires raged in Maui last month, China’s increasingly resourceful information warriors pounced.

The disaster was not natural, they said in a flurry of false posts spreading across the Internet, but the result of a secret “weather weapon” tested by the United States. To reinforce plausibility, the posts included photographs generated by artificial intelligence programs, making this among the first known uses of such tools to bolster the apparent veracity of a disinformation campaign.

For China – which largely stood on the sidelines of the 2016 and 2020 US presidential elections while Russia ran hacking operations and disinformation campaigns – the attempt to paint the wildfires as a deliberate act by US intelligence agencies and the military marked a swift change of strategy.

Until now, China’s influence campaigns have focused on amplifying propaganda defending its policies on Taiwan and other issues. The most recent efforts, uncovered by researchers at Microsoft and several other organizations, suggest that Beijing is making more direct attempts to sow discord in the United States.

The move comes as the Biden administration and Congress grapple with how to push back against China without plunging the country into open conflict and how to mitigate the threat of AI being used to amplify disinformation.

The impact of the Chinese campaign — identified by researchers at Microsoft, Recorded Future, the Rand Corporation, NewsGuard and the University of Maryland — is difficult to measure, although early indications suggest that some social media users are engaging with the wild conspiracy theories.

Brad Smith, vice chair and president of Microsoft, whose researchers analyzed the covert campaign, harshly criticized China for exploiting a natural disaster for political gain.

“I just don’t think that’s worthy of any country, much less any country that aspires to be a great country,” Mr. Smith said in an interview Monday.

China was not the only country to make political use of Maui’s fires. Russia did the same, spreading posts emphasizing how much money the United States was spending on the war in Ukraine and suggesting the cash would be better spent at home on disaster relief.

Researchers suggest that China is building a network of accounts that could be used in future information operations, including the next US presidential election. That echoes the pattern Russia set in the year or so before the 2016 election.

“It’s going in a new direction, which is promoting conspiracy theories that are not directly related to some of their interests, like Taiwan,” said Brian Liston, a researcher at Recorded Future, a cybersecurity company in Massachusetts.

If China engages in an influence operation for next year’s election, US intelligence officials have assessed in recent months, it would likely seek to diminish President Biden and raise the profile of former President Donald J. Trump. That may seem counterintuitive to Americans who remember Mr. Trump’s attempts to blame Beijing for what he called the “China virus,” but intelligence officials have concluded that the Chinese leader prefers Mr. Trump. Mr. Trump has called for pulling Americans out of Japan, South Korea and other parts of Asia, while Mr. Biden has cut off China’s access to sophisticated chips and the equipment made to manufacture them.

China’s promotion of a conspiracy theory about the fires comes after Mr. Biden confronted Chinese President Xi Jinping in Bali last fall over Beijing’s role in spreading such misinformation. Mr. Biden angrily criticized Mr. Xi for spreading false allegations that the United States operated a biological weapons laboratory in Ukraine, according to administration officials.

Researchers and administration officials say there is no indication that Russia and China are working together on information operations, but they often echo each other’s messages, especially when criticizing U.S. policies. Their combined efforts indicate that a new phase of the disinformation war is about to begin, enhanced by the use of AI tools.

“We don’t have direct evidence of coordination between China and Russia in these campaigns, but we’re definitely finding alignment and a sort of synchronization,” said William Marcellino, a RAND researcher and author of a new report that says artificial intelligence will enable a “significant leap” in global influence operations.

The wildfires in Hawaii—like many natural disasters these days—gave rise to countless rumors, false reports, and conspiracy theories almost from the start.

Caroline Amy Orr Bueno, a researcher at the University of Maryland’s Applied Research Lab for Intelligence and Security, reported that a coordinated Russian campaign began on Twitter, the social media platform now known as X, on August 9, a day after the fire began.

The phrase “Hawaii, not Ukraine” spread from an obscure account with only a few followers through a series of conservative or right-wing accounts, such as Breitbart, and ultimately to Russian state media, reaching tens of thousands of users with a message aimed at reducing US military aid to Ukraine.

China’s state media apparatus often echoes Russian themes, particularly hostility toward the United States. But in this case, it also mounted a distinct disinformation campaign of its own.

Recorded Future first reported that the Chinese government had waged a covert campaign to blame “weather weapons” for the fires, identifying numerous posts in mid-August falsely claiming that MI6, the British foreign intelligence service, had uncovered “the astonishing truth behind the wildfires.” Posts with identical language appeared on social media sites across the Internet, including Pinterest, Tumblr, Medium, and Pixiv, a Japanese site used by artists.

Other inauthentic accounts spread similar content, often accompanied by mislabeled videos, including one from the popular TikTok account The Paranormal Chic that actually showed a transformer exploding in Chile. According to Recorded Future, the Chinese content often echoed — and amplified — conspiracy theorists and extremists in the United States, including white supremacists.

The Chinese campaign ran on several major social media platforms — and in multiple languages, indicating that it aimed to reach a global audience. Microsoft’s Threat Analysis Center identified inauthentic posts in 31 languages, including French, German and Italian, but also less widely spoken ones such as Igbo, Odia and Guarani.

Artificially generated images of the wildfires identified by Microsoft researchers appeared on several platforms, including in a Reddit post in Dutch. “These specific AI-generated images appear to be exclusively used by Chinese accounts in this campaign,” Microsoft said in a report. “They do not appear to be present elsewhere online.”

Clint Watts, general manager of Microsoft’s Threat Analysis Center, said China has adopted Russia’s playbook for influence operations, laying the foundation for influencing politics in the United States and other countries.

“This would be Russia in 2015,” he said, referring to the bots and inauthentic accounts Russia created before its extensive online influence operation during the 2016 election. “If you look at how other actors have done this, they are building capacity. Now they are creating covert accounts.”

Natural disasters are often the focus of disinformation campaigns, allowing bad actors to exploit emotions and accuse governments of errors, either in preparation or in response. The goal may be to undermine trust in specific policies, such as US support for Ukraine, or more generally to sow internal dissent. By suggesting that the United States is secretly testing or using weapons against its own citizens, China’s effort was intended to portray the country as a reckless, militaristic power.

“We have always been able to come together in the wake of humanitarian disasters and provide relief in the wake of earthquakes or hurricanes or fires,” said Mr. Smith, who is presenting some of Microsoft’s findings to Congress on Tuesday. “And to instead see this kind of activity is, I think, deeply troubling, and the global community should draw a red line and set a limit.”
