Russian disinformation was circulating on Facebook before the 2016 presidential election. An estimated 126 million Americans saw it, according to prepared testimony from Facebook General Counsel Colin Stretch.
The reach of these messages is far greater than what Facebook originally reported. The social media giant previously said Russian disinformation mills bought $100,000 in ads that reached as many as 10 million people.
Earlier this month, Facebook turned over more than 3,000 such ads to the Senate and House Intelligence Committees.
But the new estimate is more than 10 times that 10-million-person figure, equal to about a third of the U.S. population.
In addition to the 126 million people reached via Facebook, Russian disinformation creators uploaded more than 1,000 videos to Google’s YouTube platform, and published 131,000 messages on Twitter targeted at U.S. voters.
‘An Insidious Attempt to Drive People Apart’
Facebook, via Stretch, downplayed the significance of these numbers. “This equals about four-thousandths of one percent (0.004%) of content in News Feed,” Stretch said, “or approximately 1 out of 23,000 pieces of content.”
Nevertheless, Stretch called the targeted Russian disinformation “an insidious attempt to drive people apart,” and said that Facebook is “determined to prevent it from happening again.”
Facebook, along with Google and Twitter, is facing two days of rigorous hearings before United States lawmakers.
With their corporate images at stake, the companies will help Washington lawmakers determine the future of content and advertising laws for tech companies.
What Was in the Russian Disinformation?
The insidious content came from the Internet Research Agency, a Kremlin-linked Russian disinformation group based in St. Petersburg.
It focused on inflammatory content about wedge topics, taking perspectives from multiple sides of the political spectrum. The Internet Research Agency created pro-Second Amendment groups and LGBT unity groups. Using dummy accounts, it posted content about race, policing, and religion. All of it was Russian disinformation intended to sow divisiveness among American voters.
The Russian government denies trying to influence the election.
Reevaluating Laws, But Also User Guidelines
The content didn’t break any of the platforms’ user guidelines. It looked like the kind of content a zealous partisan voter would upload and spread; if a private citizen had made it, there would be no problem.
But because it was posted by Russian disinformation actors intent on inflaming political discord and influencing a democratic election, it poses a major problem. Facebook, Google, and Twitter will likely have to take more serious steps to curb it.
They’ll now have to further define the rules governing their information-sharing platforms. But categorically stifling content from foreign agencies would create its own wave of problems: the dilemma raises questions of censorship and the free exchange of ideas in a nominally open democracy.
Facebook CEO Mark Zuckerberg said last year that the notion that fake news and Russian propaganda on Facebook had influenced the election was a “pretty crazy idea.” He later withdrew that remark, calling it “dismissive” and expressing regret.
The next two days look difficult for Facebook, Twitter and Google as they testify at the hearings.