
03/14/2019

The Disinformation Problem Starts at Home

We must do more to safeguard against these threats

In the face of growing political pressure and measles outbreaks in the U.S. and abroad, YouTube recently pulled advertising from videos spreading anti-vaccine propaganda. Facebook, meanwhile, has announced that groups and pages that push misinformation about vaccines will get lower rankings and won’t be recommended to users. These overdue moves illustrate the companies’ ability to identify and police false content, and they undercut a notion widely embraced in the social media industry that Facebook, Twitter and YouTube shouldn’t be “arbiters of the truth.”

In fact, the major social media companies already play the arbiter role, just not in a systematic way. A new report from the New York University Stern Center for Business and Human Rights urges these companies to take a more active stance in preventing disinformation from spreading online.

We’ve known for some time that Russian operatives use social media platforms to interfere in U.S. elections and exacerbate political polarization. But the problem actually starts here at home. The NYU report focuses on domestically generated disinformation, noting that far more false and divisive online content originates inside the country than comes from abroad. We bring this problem on ourselves, and the platforms need to do more to address it.

Read the complete article at WIRED.
