After months of questions about what role Facebook played in Russia’s interference in the 2016 U.S. presidential election, the social media company said today it sold about $100,000 worth of ads connected to inauthentic accounts and pages that “likely operated out of Russia.”
While only a small portion of the ads, purchased between June 2015 and May 2017, directly addressed the U.S. presidential election, many focused on issues that were divisive during the election and afterward, such as gun rights and racial and LGBT issues, Facebook said.
Separately, the Washington Post cited unnamed sources as saying that Facebook told congressional investigators about the ads, believed to be from a shadowy Russian company seeking to target U.S. voters. While the blog post Facebook published about the ad buys appeared to downplay their significance, the revelations could draw the company into investigations by Congress and special counsel Robert Mueller.
The $100,000 in political-ad buys that Facebook disclosed today covered about 3,000 ads connected to 470 accounts and pages, which Facebook has since shut down after deeming them inauthentic and in violation of its terms of service. Facebook didn’t specify in its blog post whether it kept the revenue it made from the ads, and the company didn’t reply to a request for comment.
Facebook also conducted a “broad search” of accounts using U.S. IP addresses but with the language setting set to Russian. That search yielded another $50,000 spent on 2,200 “potentially politically related” ads. Facebook said that those ads didn’t violate any policy or law.
Facebook CEO Mark Zuckerberg sparked a controversy in the weeks after the election by scoffing at the idea that the platform spread disinformation. In February, Zuckerberg walked back that assessment, acknowledging a problem and musing in a 6,000-word essay about how Facebook was evolving in response.
In April, Facebook released a white paper outlining the problem of false news on its social network and its plans to address it, including building new products to curb its spread, undermining the financial incentives of accounts that create or amplify misinformation, and sharing information with others. Later, however, the company refused to open up some of its political-ad data to researchers who sought to study how political ads target voters.