
Report details social media takedown of pro-Western influence campaign, a first

The Twitter logo is displayed on a mobile device. (Photo by Bethany Clarke/Getty Images)

A new report from the Stanford Internet Observatory and Graphika shines a light on what is believed to be the first major social media platform takedown of a U.S.-centric influence campaign.

The campaign, which included coordinated accounts pushing pro-American and pro-Western sentiments across eight social media platforms, was initially discovered by researchers at Facebook and Twitter. According to the Stanford report, the efforts targeted audiences in the Middle East and Central Asia. While the topics and content were varied, they “consistently advanced narratives promoting the interests of the United States and its allies while opposing countries including Russia, China, and Iran.”

Datasets provided by Twitter and Facebook reveal only a partial scope of the operation. The Twitter dataset includes more than 400,000 tweets by 170 accounts posted between 2012 and 2022, while Facebook’s team reported activity by 39 profiles, 16 pages, two groups and 26 Instagram accounts that were active between 2017 and 2022. Both Twitter and Meta, Facebook’s parent company, pegged the United States as the likely origin of the operation, with Twitter also naming Great Britain. Neither company attributed the campaign to a specific government or entity.

Some of the posts castigated Russia for its invasion of Ukraine and included claims of wartime atrocities. A large proportion of the posts on Twitter came from clusters of political accounts that expressed both pro- and anti-Iranian government sentiments.

“We believe this activity represents the most extensive case of covert pro-Western [information operations] on social media to be reviewed and analyzed by open-source researchers to date,” the Stanford report states. “With few exceptions, the study of modern IO has overwhelmingly focused on activity linked to authoritarian regimes in countries such as Russia, China, and Iran, with recent growth in research on the integral role played by private entities. This report illustrates the wider range of actors engaged in active operations to influence online audiences.”

While the pro-U.S. nature of the influence campaign makes this takedown unique, it does not appear to have been successful at generating meaningful engagement.

Fewer than one in five of the accounts had more than 1,000 followers, and Stanford researchers noted that “the vast majority of posts and tweets we reviewed received no more than a handful of likes or retweets.” Further research and mapping by Stanford revealed that at least 60,000 active Twitter accounts engaged with at least one account that was later banned as part of a covert campaign.

The data for other platforms is limited, and Alex Stamos, director of the Stanford Internet Observatory, indicated that other companies were less willing to share their own datasets. Even Twitter and Meta’s datasets did not include technical details of the investigation, something the Stanford researchers said limited their analysis.

The campaign also relied on many tactics that are now well known in the information operations space, including GAN-generated photos of fake people used to build account personas, operatives masquerading as members of the media, hashtag campaigns and online petitions. One of those petitions pushed for countries such as Kyrgyzstan to crack down on China’s influence over their politics, while two others called for Kazakhstan to ban Russian TV channels.

As in other influence campaigns, many accounts focused on general entertainment, lifestyle or cultural issues in an attempt to build an audience that could be receptive to more targeted political messaging.

Others took a more direct path, deliberately engaging with high-profile pro-Ukraine or pro-Russia accounts in order to gain attention and followers.

Derek B. Johnson

Derek is a senior editor and reporter at SC Media, where he has spent the past three years providing award-winning coverage of cybersecurity news across the public and private sectors. Prior to that, he was a senior reporter covering cybersecurity policy at Federal Computer Week. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.
