Misinformation researchers who relied on data from Facebook may have lost months or even years of work. The social network provided them with incorrect and incomplete information about how users interact with posts and links on its platform, according to The New York Times.
In recent years, Facebook has given researchers access to its data so they could track the spread of false information on the platform. The company promised transparency and access to all user interactions, but the information it provided covered only about half of its users in the United States. Moreover, most of the users whose interactions were included were those politically engaged enough to make their leanings clear.
In an email to researchers obtained by The Times, Facebook apologized for the “possible damage.” The company also said it would fix the problem, but that doing so could take weeks because of the volume of information to process. Facebook did tell researchers, however, that the data on users outside the United States was not affected by the inaccuracy.
Facebook spokesperson Mavis Jones blamed the inaccuracy of the data on a “technical problem” that the company is “seemingly trying to fix quickly.” As the Times notes, Fabio Giglietto, an assistant professor at the University of Urbino, was the first to notice the inaccuracy. Giglietto compared the data provided to researchers with the widely viewed content report Facebook published in August and found the results inconsistent.
Other researchers expressed concerns after the report was released. Alice Marwick, a researcher at the University of North Carolina, told Engadget they couldn’t confirm those results because they didn’t have access to the data Facebook used. The company reportedly called researchers on Friday to apologize. Megan Squire, one of those researchers, told The Times: “From a human standpoint, there were 47 people today, and every one of these projects is in jeopardy, and some of them will be completely destroyed.”
Some researchers used their own tools to collect data for their research, but in at least one case Facebook cut them off. In August, the company deactivated accounts linked to the NYU Ad Observatory project. The team used a browser extension to collect information about political ads, but the social network described the practice as an “unauthorized hack.” At the time, the project’s lead researcher, Laura Edelson, told Engadget that Facebook was silencing the team because its “work often draws attention to platform issues.” Edelson added, “If this episode shows anything, it’s that Facebook shouldn’t have a veto over who can investigate them.”