Facebook, Inc. said Thursday it was stepping up efforts against fake news and hoaxes by testing several ways to make it easier for users to report a hoax when they see one on the social media network.
Adam Mosseri, Facebook's vice president in charge of News Feed, said the Menlo Park, California-based social networking service would "learn from these tests, and iterate and extend them over time."
"We've started a program to work with third-party fact checking organizations that are signatories of Poynter's International Fact Checking Code of Principles," Mosseri wrote on the company's blog. "We'll use the reports from our community, along with other signals, to send stories to these organizations."
Posting several screenshots of a mobile device to illustrate how the new efforts would work, he noted that "if the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed."
It will still be possible to share these flagged stories, but a "warning that the story has been disputed" will pop up on the screen, and a flagged story "can't be made into an ad and promoted."
Facebook has been under pressure amid criticism that fake news stories were abundant on its network during this U.S. election year, misleading voters in ways that may have influenced the result of the presidential election.
Deflecting the political implication of criticism, Mosseri wrote that "we've found that a lot of fake news is financially motivated. Spammers make money by masquerading as well-known news organizations, and posting hoaxes that get people to visit their sites, which are often mostly ads. So we're doing several things to reduce the financial incentives."
Meanwhile, "on the buying side we've eliminated the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications. On the publisher side, we are analyzing publisher sites to detect where policy enforcement actions might be necessary," he said.
"It's important to us that the stories you see on Facebook are authentic and meaningful," said the senior executive, adding that "we know there's more to be done. We're going to keep working on this problem for as long as it takes to get it right."
Last week, Facebook said it was working with Microsoft, Twitter and YouTube to help curb the spread of terrorist content online.