Application security, Threat Management, Threat Intelligence

Facebook takes down pages suspected in election influence campaign

The 2018 midterm elections are bearing an unfortunate resemblance to the 2016 presidential race: a number of suspicious accounts, some apparently meant to influence the outcome by engaging in "coordinated inauthentic behavior," have been found on Facebook. Only this time, the social media company has banned the accounts before the election.

“We're still in the very early stages of our investigation and don't have all the facts — including who may be behind this,” Facebook said of the 32 banned accounts and pages. “But we are sharing what we know today given the connection between these bad actors and protests that are planned in Washington next week.”

Two weeks ago Facebook first identified eight Facebook pages, 17 profiles and seven Instagram accounts, created between March 2017 and May 2018, that violated the company's policies against coordinated inauthentic behavior, Facebook's head of cybersecurity policy, Nathaniel Gleicher, wrote. After completing an investigation, Facebook removed those pages and accounts and reported its findings, which showed that more than 290,000 accounts followed one or more of the pages, to U.S. law enforcement, Congress, other tech companies and the Atlantic Council's Digital Forensic Research Lab.

The company offered examples of the pages, including “Aztlan Warriors,” “Black Elevation,” “Mindful Being,” and “Resisters.”

Gleicher believes that the company's efforts to curb influence campaigns and boost transparency on the platform have compelled the bad actors to be "more careful to cover their tracks" than the Russian Internet Research Agency (IRA) was during the 2016 campaign, using VPNs and internet phone services to obscure their activities and paying third parties to run ads, spending that Facebook said amounted to about $11,000.

While some activity surrounding the accounts and pages “is consistent with what [was seen] from the IRA before and after the 2016 elections” and Facebook has “found evidence of some connections between these accounts and IRA accounts” disabled last year, Gleicher said the company still doesn't “have firm evidence to say with certainty who's behind this effort.”

Facebook continues to attempt to determine attribution, something that CSO Alex Stamos said remains a difficult task. "The relationship between malicious actors and real-world sponsors can be difficult to determine in practice, especially for activity sponsored by nation-states," Stamos wrote, noting that the company tries to "link suspicious activity to the individual or group with primary operational responsibility for the malicious action" and tie the actor to a real-world sponsor.

The company applied a framework based on a spectrum developed by Jason Healey, nonresident senior fellow with the Atlantic Council's Cyber Statecraft Initiative, "to measure the degree of state responsibility for cyberattacks," and assessed four methods of attribution: political motivations, coordination, technical forensics, and tools, techniques and procedures (TTPs), Stamos said.

While Facebook didn't offer an assessment of the political motivations of those behind the pages and accounts, Stamos said that, in addition to finding evidence of links between the accounts and IRA accounts, the company discovered that some of the TTPs are consistent with the IRA's 2016 and 2017 activity.

“Our technical forensics are insufficient to provide high confidence attribution at this time,” he said.

"Given how influential Facebook is, it is highly likely that bad actors will attempt to propagate fake news once again to influence voters towards their agenda, though the lasting feeling that Russia is the only nation with an interest should be dispelled because no-one knows that for a fact,” said Lee Munson, security researcher at Comparitech.com. “Whether Facebook can genuinely stamp out any shenanigans this time remains to be seen – artificial intelligence is only as good as its programming and I suspect the social media giant does not have anywhere near enough human reviewers on its payroll.”

While she applauded Facebook for "taking the threat of foreign-led disinformation campaigns seriously," Fortalice Solutions CEO Theresa Payton, former White House CIO under President George W. Bush, said, "social media giants like Facebook and Twitter taking action is only a small piece of the puzzle."

Payton urged the Trump administration to "confront election meddling head on."
