Facebook has been criticized as being one of the main distribution points for so-called fake news, which many think influenced the 2016 U.S. presidential election.
Rather than acting as the “arbiter of truth,” the company wants to be a suggester of perspective. It’s rolling out “Related Articles” that appear below links to news stories that many people are posting about on Facebook, or that are suspected of being false news and have been fact-checked by Facebook’s external partners.
Appearing before someone reads an article, Related Articles will surface links to additional reporting on the same topic to provide different viewpoints, as well as to truthiness reports from the fact checkers.
If users see drastically different angles when they compare a story to its Related Articles, they might deem it suspicious and skip it, be less likely to believe or share it, or could click through the Related Articles and make up their own mind. That could reduce the spread and impact of false news without Facebook itself having to be the honesty police.
Related Articles could also balance out some of the radical invective that can subtly polarize the populace.
The company said in a statement on its website that it will start using updated machine learning to detect possible hoaxes and send them to fact checkers, potentially showing the fact-checking results below the original article.
Meanwhile, Facebook’s machine learning algorithm has improved in accuracy and speed, so the social network will now have it send more potential hoaxes to fact checkers. Lyons explains that speed is important because “The sooner we can get potential false news stories to fact checkers, the sooner that they can review them, and the more we reduce the number of people who are actually exposed to them.”