Facebook moves to protect elections by flagging content from state-run media

Facebook may have decided to leave it to the public to judge whether political ads are truthful, but the social media giant said it is diligently trying to identify and flag content from media outlets run by nation-states to avoid a repeat of the 2016 election, when Russia and other countries leveraged its platform to influence voters.

“Next month, we’ll begin labeling media outlets that are wholly or partially under the editorial control of their government as state-controlled media,” Facebook said of its efforts to protect election integrity by boosting transparency, in a blog post penned by Guy Rosen, vice president of integrity; Katie Harbath, public policy director, global elections; Nathaniel Gleicher, head of cybersecurity policy; and Rob Leathern, director of product management. “We will hold these Pages to a higher standard of transparency because they combine the opinion-making influence of a media organization with the strategic backing of a state.”

The social media firm has also boosted its initiatives to prevent the spread of viral misinformation, noting that it already reduces the distribution of misinformation and intends to “more prominently” label content on Facebook and Instagram that a third-party fact-checker has deemed false or partially false. “The labels below will be shown on top of false and partly false photos and videos, including on top of Stories content on Instagram, and will link out to the assessment from the fact-checker,” Facebook said.

“In addition to clearer labels, we’re also working to take faster action to prevent misinformation from going viral, especially given that quality reporting and fact-checking takes time,” the executives wrote.

As part of the push to protect U.S. elections, Facebook said it is taking aim at preventing foreign interference. On Monday, the firm removed 50 Instagram accounts and a single Facebook account, many of which it said "showed some links" to the St. Petersburg, Russia-based Internet Research Agency, whose operatives, a Senate Intelligence Committee report recently confirmed, “used social media to conduct an information warfare campaign designed to spread disinformation and societal division” in the U.S.

“We took down these networks based on their behavior, not the content they posted. In each case, the people behind this activity coordinated with one another and used fake accounts to misrepresent themselves, and that was the basis for our action,” Facebook explained.
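Facebook did not detail how it spots such coordination. As a purely hypothetical illustration of what behavior-based (rather than content-based) detection can look like, the sketch below flags pairs of accounts that post identical content within a few minutes of one another; every name, data shape and threshold in it is an assumption, not Facebook's actual methodology.

# Illustrative-only sketch of "behavior-based" detection: group accounts that
# post the same content within a short window of each other, regardless of
# what that content says. Not Facebook's real approach; names are invented.
from collections import defaultdict
from itertools import combinations


def coordinated_pairs(posts, window_seconds=300):
    """posts: list of (account_id, timestamp_seconds, content) tuples.
    Returns account pairs that shared identical content within the window."""
    by_content = defaultdict(list)
    for account, ts, content in posts:
        by_content[content].append((account, ts))

    pairs = set()
    for shares in by_content.values():
        for (a1, t1), (a2, t2) in combinations(shares, 2):
            if a1 != a2 and abs(t1 - t2) <= window_seconds:
                pairs.add(tuple(sorted((a1, a2))))
    return pairs


if __name__ == "__main__":
    sample = [
        ("acct_a", 0, "divisive meme #1"),
        ("acct_b", 60, "divisive meme #1"),
        ("acct_c", 4000, "divisive meme #1"),  # posted too late to count
    ]
    print(coordinated_pairs(sample))  # {('acct_a', 'acct_b')}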

To safeguard candidates, elected officials and staff, Facebook has rolled out Facebook Protect, a program in which company admins can enroll employees.

“Participants will be required to turn on two-factor authentication, and their accounts will be monitored for hacking, such as login attempts from unusual locations or unverified devices,” said Facebook. “And, if we discover an attack against one account, we can review and protect other accounts affiliated with that same organization that are enrolled in our program.”
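Facebook has not published how this monitoring works, but the checks it describes, spotting logins from unfamiliar locations or unverified devices, can be pictured with a minimal sketch. The profile structure, field names and example values below are assumptions made for illustration only.

# Hypothetical sketch of the kind of login monitoring described above: flag
# sign-in attempts from locations or devices not previously seen for an
# account. Data model and flag names are illustrative, not Facebook's logic.
from dataclasses import dataclass, field


@dataclass
class AccountProfile:
    """Locations and devices an account has previously signed in from."""
    known_countries: set = field(default_factory=set)
    verified_devices: set = field(default_factory=set)


def assess_login(profile: AccountProfile, country: str, device_id: str) -> list:
    """Return a list of risk flags for a single login attempt."""
    flags = []
    if country not in profile.known_countries:
        flags.append("unusual_location")
    if device_id not in profile.verified_devices:
        flags.append("unverified_device")
    return flags


if __name__ == "__main__":
    profile = AccountProfile(known_countries={"US"}, verified_devices={"laptop-01"})
    print(assess_login(profile, "US", "laptop-01"))  # [] -> nothing suspicious
    print(assess_login(profile, "RU", "phone-99"))   # both flags raised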
