Meta’s Election Research Opens More Questions Than It Answers
Meta denies its platforms cause political polarization
In the lead-up to the 2020 presidential election, Meta set out to conduct a series of ambitious studies on the effects its platforms—Facebook and Instagram—have on the political beliefs of U.S.-based users. Independent researchers from several universities were given unprecedented access to Meta's data, and the power to change the feeds of tens of thousands of people in order to observe their behavior.
The researchers weren't paid by Meta, but the company seemed pleased with the results, which were released today in four papers in Nature and Science. Nick Clegg, Meta's president of global affairs, said in a statement that "the experimental findings add to a growing body of research showing there is little evidence that key features of Meta's platforms alone cause harmful 'affective' polarization or have 'meaningful effects on' political views and behavior."
It's a sweeping conclusion. But the studies themselves are much narrower. Even though researchers were given more insight into Meta's platforms than ever before—for years, Meta considered such data too sensitive to make public—the studies released today leave open as many questions as they answer.
Read the complete article from WIRED.