Meta reckons China’s troll farms could learn proper OpSec from Russia’s fake news crews
Russia appears to be “better” than China at running online trolling campaigns aimed at pushing its political narratives, according to Meta’s latest Adversarial Threat Report.
The report [PDF], published Tuesday, features Meta’s claims that it has made the world a little bit safer by blocking two of the largest political influence operations it’s ever detected on its platforms – one linked to China and the other likely driven by Russia.
Both used spammy links and fake news in attempts to discredit Western governments, or to weaken support for Ukraine.
The China-based campaign involved 7,704 Facebook accounts, 954 Facebook pages, 15 groups on the social network and 15 Instagram accounts. The operation also spilled onto more than 50 platforms beyond Meta’s properties, with activity spotted on X (formerly Twitter), YouTube, TikTok, Reddit, Pinterest, Medium, Blogspot, LiveJournal, Vimeo, Russian social media service VKontakte, and dozens of smaller online forums.
The campaign targeted Taiwan, the United States, Australia, the UK, Japan, and global Chinese-speaking audiences. Its favorite topics included pro-China commentary, and negative content about the US and Western foreign policies. Critics of the Chinese government also came in for unkind treatment.
Meta tied this campaign to a group known as Spamouflage, aka Dragonbridge, that’s been linked to Chinese law enforcement agencies.
“On our platform, this network was run by geographically dispersed operators across China who appear to have been centrally provisioned with internet access and content directions,” the report states, noting that many of the accounts the group employed were automatically detected and disabled by Meta’s systems.
Meta’s threat-trackers believe the company’s actions drove the gang to smaller, lesser-known social media sites to amplify its messages and keep its campaign alive after it was blocked on Facebook and the ‘Gram.
“We have not found evidence of this network getting any substantial engagement among authentic communities on our services,” the report added, echoing its own earlier reports about the spammy crew.
Spamouflage spams a lot
Meta is not the first to rate Spamouflage a menace. A January report from Google’s Threat Analysis Group (TAG) dubbed the group the most prolific information-operations group it tracked at that time.
In the past, the Spamouflage crew has reportedly attempted to spread misinformation ahead of the 2022 US midterm elections and trolled rare-earth mining companies. Of late, the group has produced video segments featuring AI-generated news anchors spouting pro-China messaging.
TAG concurs with Meta’s message that the group’s prolific output did not translate into a substantial following.
According to Meta, Spamouflage spent just $3,500 on ads related to its Facebook campaign – paid for mostly in Chinese yuan, Hong Kong dollars and US dollars.
From Russia, with love
The report finds similarities between Spamouflage and an actor linked to Russia known as Secondary Infektion.
Secondary Infektion is perhaps best known for spreading misinformation about Ukrainian president Volodymyr Zelenskyy before and after Russia’s illegal invasion. This included fake news claiming that Zelenskyy, who is Jewish, is a Nazi. The group has also alleged that the Ukrainian president died by suicide in a Kyiv military bunker – an assertion hard to reconcile with his numerous in-person appearances alongside world leaders, in Ukraine and elsewhere.
“As we reviewed our findings on tactics, techniques and procedures (TTPs) used by Spamouflage over the years, we noted some distinct similarities with the Russian network we first exposed in 2019,” Meta’s report states. It suggests that operators of coordinated inauthentic behaviour networks “learn from one another, including as a result of public reporting about covert influence operations by our industry and security researchers.”
But in Meta’s opinion, Beijing’s Spamouflage has a lot to learn from Moscow’s Secondary Infektion.
“Secondary Infektion was much more careful in its operational security (OpSec) and avoided re-using the same accounts,” the report authors wrote.
Real news? Or Doppelganger?
Meanwhile, Meta also blocked “thousands” of malicious website domains, fake accounts, and pages on its various sites connected to a Russian operation dubbed Doppelganger.
“We assess this campaign to be the largest and most aggressively persistent covert influence operation from Russia that we’ve seen since 2017,” according to the report.
The social media giant says it first disrupted this campaign a year ago. Over that time, the operation has expanded the targets of its pro-Russia fake news blitz to the US and Israel, after initially focusing on France, Germany and Ukraine.
Doppelganger spoofs real news organizations, with some of its efforts “particularly elaborate,” according to the report. Meta highlighted a fake Washington Post article based on a phony Russian-language video purporting to show Zelenskyy admitting he was a puppet of the CIA.
The fake news story used the same byline and timestamp as a real Post interview published the same day. Doppelganger then tried to share the spoofed article on social media as “evidence” of American interference in Ukraine.
“It received no engagement on our platform,” Meta states. But Doppelganger, like Spamouflage and Secondary Infektion, did manage to register many accounts on Meta services and operate undetected for a considerable amount of time. ®