December 22, 2016 | Engagement | by Alexian Chiavegato

Facebook’s Fight Against Fake News Rages On

As digital publishing continues to evolve, publishers are faced with more choices on where to distribute the content they work tirelessly to develop. In recent years, social media, along with native mobile and desktop sites, has become a go-to way for publications of all sizes to attract readers from across the web. In fact, one survey showed that 27 percent of marketers viewed social media as indispensable, compared to 28 percent for email and 26 percent for websites.

The giant in that industry is Facebook, of course, which verges on being a media company in its own right. Readers go to the social media platform to look for news, and generally trust that the content they access there is factual. Facebook also provides a new revenue stream for publishers aiming to monetize their blogs and digital publications. Now that trust is starting to come into question, and it’s affecting the publishers who depend on clicks coming from Facebook.

In 2016, fake news stories on Facebook have become more common, and more users are aware of the phenomenon. This presents a problem for Facebook, which has to balance its role as a nonpartisan space for sharing content from innumerable outlets, from everyday users to multinational media companies, with its growing ability to shape news stories, not just share them. Almost 50 percent of adult Americans say they get their news from Facebook, and approximately 25 percent say they have knowingly or unknowingly shared a fake news story. This is starting to have an effect on users like Lisel Laslie, who told USA Today, “It’s like I am an investigative reporter, and I have to check eight sources before sharing anything.”

To its credit, Facebook has been introducing measures to reduce the spread of fake news since last summer, when the platform announced that stories straight from publishers would rank lower in the news feed than posts from friends. While this might seem like a blow to publishers banking on the ad dollars brought in from Facebook, it doesn’t apply to stories shared by friends, which means that engaging, important content could still be spread across the site. How this affects publishers in the long run is yet to be seen.

Now Facebook is introducing some new, more publisher-friendly ways to fight the scourge of fake news on the news feed.

A New Flagging Process: In the past, Facebook users have been able to flag posts they find offensive in their news feeds. Now Facebook is making it easier to call out stories that may be hoaxes as well. Users can report fake news simply by clicking a drop-down menu at the top of a post, and can then either message or block the users who spread it.

Third-Party Fact Checkers: If a post receives enough reports from users saying it is fake, a team of third-party fact checkers will examine it. These fact checkers come from organizations that have agreed to follow Poynter’s International Fact-Checking Code of Principles, including such major organizations as the Associated Press and ABC News. If they deem the story to indeed be false, they will mark it as disputed, link it to a story explaining why, and possibly even move it lower in your news feed. These organizations are doing the work for free, with James Goldston, president of ABC News, saying that they “regard this as a big part of our core mission.”

Using Sharing As A Signal: According to Adam Mosseri, the vice president in charge of Facebook’s News Feed, Facebook has “found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way.” Facebook will begin using this drop-off in sharing as a signal that a story may be misleading, and will move such stories down users’ news feeds, where people are less likely to see them.

While these might be considered imperfect solutions, they are signs that Facebook values its users, as well as the publishers who put out factual, engaging content on the site. “Facebook was inevitably going to have to curate the platform much more carefully, and this seems like a reasonably transparent method of intervention,” says Emily Bell, director of the Tow Center for Digital Journalism at Columbia University.

Facebook, along with other content sharing platforms, now sees this as a real problem, and one worth solving.