White House Dispute Exposes Facebook Blind Spot on Misinformation

https://www.nytimes.com/2021/07/19/technology/facebook-misinformation-blind-spot.html

SAN FRANCISCO — At the start of the pandemic, a group of data scientists at Facebook held a meeting with executives to ask for resources to help measure the prevalence of misinformation about Covid-19 on the social network.

The data scientists said figuring out how many Facebook users saw false or misleading information would be complex, perhaps taking a year or more, according to two people who participated in the meeting. But they added that by putting some new hires on the project and reassigning some existing employees to it, the company could better understand how false claims about the virus spread on the platform.

The executives never approved the resources, and the team was never told why, according to the people, who requested anonymity because they were not authorized to speak to reporters.

Now, more than a year later, Facebook has been caught in a firestorm about the very type of information that the data scientists were hoping to track.

The White House and other federal agencies have pressed the company to hand over data about how anti-vaccine narratives spread online, and have accused Facebook of withholding key information. President Biden on Friday accused the company of “killing people” by allowing false information to circulate widely. On Monday, he walked that back slightly, instead directing blame at people who originate falsehoods.

“Anyone listening to it is getting hurt by it,” Mr. Biden said. He said he hoped that instead of “taking it personally,” Facebook would “do something about the misinformation.”

The company has responded with statistics on how many posts containing misinformation it has removed, as well as how many Americans it has directed to factual information about the government’s pandemic response. In a blog post on Saturday, Facebook asked the Biden administration to stop “finger-pointing” and blaming the company for the administration’s failure to meet its goal of vaccinating 70 percent of American adults by July 4.

“Facebook is not the reason this goal was missed,” Guy Rosen, Facebook’s vice president of integrity, said in the post.

But the pointed back-and-forth struck an uncomfortable chord for the company: It doesn’t actually know many specifics about how misinformation about the coronavirus, and the vaccines to combat it, has spread. That blind spot has reinforced concerns among misinformation researchers over Facebook’s selective release of data, and how aggressively — or not — the company has studied misinformation on its platform.

“The suggestion we haven’t put resources toward combating Covid misinformation and supporting the vaccine rollout is just not supported by the facts,” said Dani Lever, a Facebook spokeswoman. “With no standard definition for vaccine misinformation, and with both false and even true content (often shared by mainstream media outlets) potentially discouraging vaccine acceptance, we focus on the outcomes — measuring whether people who use Facebook are accepting of Covid-19 vaccines.”

Executives at Facebook, including its chief executive, Mark Zuckerberg, have said the company has been committed to removing Covid-19 misinformation since the start of the pandemic, and that it has taken down more than 18 million pieces of such content in that time.

Experts who study disinformation said the number of pieces that Facebook removed was not as informative as how many were uploaded to the site, or in which groups and pages people were seeing the spread of misinformation.

“They need to open up the black box that is their content ranking and content amplification architecture. Take that black box and open it up for audit by independent researchers and government,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a nonprofit that aims to combat disinformation. “We don’t know how many Americans have been infected with misinformation.”

Mr. Ahmed’s group, using publicly available data from CrowdTangle, a Facebook-owned program, found that 12 people were responsible for 65 percent of the Covid-19 misinformation on Facebook. The White House, including Mr. Biden, has repeated that figure in the past week. Facebook says it disagrees with the characterization of the “disinformation dozen,” adding that some of their pages and accounts were removed, while others no longer post content that violates Facebook’s rules.

Renée DiResta, a disinformation researcher at Stanford’s Internet Observatory, called on Facebook to release more granular data, which would allow experts to understand how false claims about the vaccine were affecting specific communities within the country. The information, which is known as “prevalence data,” essentially looks at how widespread a narrative is, such as what percentage of people in a community on the service see it.

“The reason more granular prevalence data is needed is that false claims don’t spread among all audiences equally,” Ms. DiResta said. “In order to effectively counter specific false claims that communities are seeing, civil society organizations and researchers need a better sense of what is happening within those groups.”
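For illustration, here is a rough sketch of how such a per-community prevalence metric could be computed. The exposure log, narrative labels and community membership in it are invented for the example; Facebook has not described how, or whether, it calculates this internally.

```python
# Hypothetical sketch: "prevalence" of a narrative as the percentage of a
# community's members who saw at least one post labeled with that narrative.
# All data below is invented for illustration.

# Invented exposure log: (user_id, post_id) pairs meaning the user saw the post.
exposures = [
    ("u1", "p1"), ("u2", "p1"), ("u2", "p3"), ("u3", "p2"), ("u4", "p3"),
]

# Invented labels, as they might come from human raters or a classifier.
post_labels = {"p1": {"vaccine-misinfo"}, "p2": set(), "p3": {"vaccine-misinfo"}}

# Invented community membership.
communities = {"group-a": {"u1", "u2", "u3"}, "group-b": {"u4", "u5"}}

def prevalence(narrative):
    """Percentage of each community's members exposed to the narrative."""
    exposed = {user for user, post in exposures
               if narrative in post_labels.get(post, set())}
    return {name: 100.0 * len(members & exposed) / len(members)
            for name, members in communities.items()}

print(prevalence("vaccine-misinfo"))
# -> roughly {'group-a': 66.7, 'group-b': 50.0}
```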

Many employees within Facebook have made the same argument. Brian Boland, a former Facebook vice president in charge of partnerships strategy, told CNN on Sunday that he had argued while at the company that it should publicly share as much information as possible. When asked about the dispute with the White House over Covid misinformation, he said, “Facebook has that data.”

“They look at it,” Mr. Boland said. But he added: “Do they look at it the right way? Are they investing in the teams as fully as they should?”

Mr. Boland’s comments were widely repeated as evidence that Facebook has the requested data but is not sharing it. He did not respond to a request for comment from The New York Times, but one of the data scientists who pushed inside Facebook for deeper study of coronavirus misinformation said the problem was more about whether and how the company studied the data.

Technically, the person said, the company has data on all content that moves through its platforms. But measuring and tracking Covid misinformation first requires defining and labeling what qualifies as misinformation, something the person said the company had not dedicated resources toward.

Some at Facebook have suggested that the government, or health officials, should be the ones to define misinformation. Only once that key baseline is set can data scientists begin to build out systems known as classifiers, which measure the spread of certain types of information.

Given the billions of individual pieces of content posted to Facebook daily, the undertaking of measuring, tracking and ultimately calculating the prevalence of misinformation would be a huge task, the person said.
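As a sketch of the pipeline the person described: labeled examples first pin down what counts as misinformation, and a classifier trained on them can then flag new posts so prevalence can be estimated at scale. The training data and model below are illustrative assumptions, not Facebook’s actual system.

```python
# Minimal sketch, not Facebook's system: once misinformation has been defined
# via labeled examples, a text classifier can flag new posts at scale.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented labeled examples (the "defining and labeling" step).
posts = [
    "the vaccine alters your DNA",             # misinformation
    "vaccines are free at local pharmacies",   # not misinformation
    "microchips are hidden in the shots",      # misinformation
    "clinical trials showed strong efficacy",  # not misinformation
]
labels = [1, 0, 1, 0]  # 1 = misinformation

# Train a simple bag-of-words classifier on the labeled examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Flag a sample of new posts; the flagged share is a crude prevalence estimate.
new_posts = ["the shots change your DNA", "where can I get a free vaccine?"]
flags = model.predict(new_posts)
print(f"flagged {100 * flags.mean():.0f}% of the sampled posts")
```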

The meeting held at the start of the pandemic was not the only time Facebook had internal discussions about how to track misinformation.

Members of Facebook’s communications team raised the question of prevalence as well, telling executives last summer and fall that it would be useful for rebutting journalists who used CrowdTangle data to write articles about the spread of anti-vaccine misinformation, according to a Facebook employee involved in those discussions.

After the 2016 presidential election, Mr. Zuckerberg sought a similar statistic on how much “fake news” Americans had seen leading up to it, a member of Facebook’s communications team said. One week after the vote, Mr. Zuckerberg published a blog post saying such false news had amounted to “less than 1 percent,” but the company did not clarify that estimate or give more details despite being pressed by reporters.

Months later, Adam Mosseri, a Facebook executive who was then the head of News Feed, said part of the problem was that “fake news means different things to different people.”

Davey Alba and Zolan Kanno-Youngs contributed reporting.