Fake news has been a problem on Facebook and other social media sites for a long time, but the proliferation of bogus news stories has really ramped up in the past year. While Facebook's CEO has been dismissive of the epidemic, it seems the company has received enough pressure from outside forces to take some first steps to curb the problem. A majority of US adults get their news from social media, so getting this right is a huge deal.
What's worrisome is that this first step could be seen as doing more harm than good. Crowdsourcing the flagging of stories for review by third-party fact-checkers has some obvious drawbacks. How long until trolls start flagging legitimate news sources as a way to further muddy the line between quality news content and bogus stories?
Stories that are deemed suspect will be flagged and given a disputed tag, but users will still be able to share them. Facebook has good reason not to rush into censoring content on the site; the entire enterprise hinges on the free flow of information between its users.
These actions by the social media giant feel like a half step, but they are something. Seeing stories tagged as disputed will hopefully highlight the problem of fake news, and perhaps motivate people to seek out better platforms for news coverage.