Information Operations Spotlighted at Black Hat as Election Worries Rise
From Russia’s “best-in-class” efforts at widening social divides in Western democracies to China’s blunt attacks on dissidents, information operations are becoming a greater threat, says a Stanford researcher.
While the Russian government spends a fraction of what the People’s Republic of China spends on overt state-sponsored media properties, its covert activities targeting Western democracies and other rivals are “best-in-class,” Renée DiResta, a research manager at the Stanford Internet Observatory, told attendees during an Aug. 6 keynote on information operations at virtual Black Hat USA.
Information operations — including campaigns of disinformation and propaganda — have become increasingly sophisticated, especially those emanating from the former Soviet Union, although China is learning quickly, she said. The campaigns often have a number of goals: distracting attention from a nation’s actions, persuading people toward a certain point of view, dividing citizens into opposing groups, and entrenching them in hardline opinions.
“Russia is, at the moment, kind of best-in-class for information operations,” she told attendees. “They have demonstrated not only a commitment to full-spectrum propaganda, but to far more sophisticated activities [from acting as] agents of influence to media manipulation and then ultimately to network infiltration as well.”
As the 2020 US presidential election approaches, a variety of groups are worried about the impact information operations are having on democracies across the globe — from Russian disinformation campaigns leading up to the 2016 election, to China’s crackdown on information about the protests in Hong Kong, to Saudi Arabia’s attempts to control the narrative following the assassination of journalist Jamal Khashoggi.
By far, Russia has the most sophisticated infrastructure for using information as a weapon, DiResta said. Russia has often blended information operations with more traditional hacking, and then used the information it obtains as fodder for future operations, she said. China has traditionally focused its information operations inward, but as part of the economic giant’s global push, the government has increasingly used outward-facing information operations, especially overt state-funded publications.
“China has decades of experience in both overt and covert ‘narrative management’ — the government has prioritized it for decades, particularly inward-looking [operations],” she said. “What we are seeing now is how they are taking those inward-facing capabilities and beginning to expand outward, beginning to deploy those same tactics in areas and regions where they want to have influence outside of their borders.”
Information operations surged as social media grew in popularity. Today, spreading information is cheap, editorial gatekeepers have largely been eliminated, and companies rely on algorithms to determine which information is interesting enough to spread.
“We wind up at this point with a glut of material,” she said. “Algorithms surface and recommend content, rank it, and then they help amplify and disseminate it.”
For information operations, however, the most important point is that those algorithms can be gamed to spread memes and disinformation. The largest social media companies — from Facebook to Twitter to YouTube — struggle to define adequate rules and standards for acceptable content on their platforms.
On Aug. 5, for example, Google announced that it had shuttered almost 2,600 YouTube channels in the second quarter as part of its investigations into coordinated information operations linked to China. In late July, Democrats in the United States called on the Federal Bureau of Investigation to brief lawmakers on disinformation targeting members of Congress.
Information operations tend to follow a particular pattern, DiResta said. State actors create personas, post content, initially seed that content to social networks, and then coordinate with other personas to amplify it. Success comes when real people start to share the memes and disinformation, especially if the story gets picked up by the mainstream media, she said.
What makes Russian operations so effective is that they profile people based on what they are interested in and what content they share with others. The Internet Research Agency, a group funded by the Russian government to carry out disinformation campaigns, ultimately aims to turn people into part of its disinformation infrastructure, she said.
“When you follow an Internet Research Agency page, you are giving them an indication that you are sympathetic to that cause,” she said. “What we saw over and over and over again were these attempts to recruit. How can they turn their audience into active participants — unwitting participants — in the operation?”
DiResta expects to see Russia-backed efforts to hack for sensitive political information that can be released, continued attempts to get information on voting machines and voting security efforts, and the infiltration of special-interest groups. Ultimately, the goal will be to undermine confidence in the election.
The United States should prioritize the fight against disinformation because the impact will last long after the election, she said, pointing to one real person’s social media page, nearly half of whose content consisted of memes that originated with the Internet Research Agency.
“False stories are internalized by real people; they continue to spread long after active operations have ceased,” she said, adding, “we can see how people are reacting to this stuff … but it is extremely difficult to understand the impact of these efforts.”