Facebook has allegedly provided misinformation researchers with false and incomplete information about how users interact with posts and links on the website.
The report also revealed that, while Facebook promised transparency to researchers, the data it provided reportedly covered interactions of only about half of its users in the United States, and skewed toward users politically engaged enough to make their leanings clear.
According to the Times, Fabio Giglietto, an associate professor at Urbino University, first discovered the inaccuracy after finding that the data given to the researchers did not match the “Widely Viewed Content Report” that Facebook published in August.
Facebook spokesperson Mavis Jones attributed the inaccuracy to a “technical error,” and the company apologized to researchers via email for the “inconvenience it may have caused.” It also said it is already working to fix the problem, although the fix could take weeks because of the huge amount of data it has to process.
For more information, read the original story in Engadget.