Facebook has long insisted that it is a technology company and not a publisher, and rejects the idea that it should be held responsible for the content that its users circulate on the platform. Just after the election, Zuckerberg said the notion that fake or misleading news on Facebook had helped swing the election to Donald Trump was a “crazy idea.”
Zuckerberg then said last Saturday that more than 99 percent of what people see on Facebook is authentic, and that "only a very small amount" is fake news and hoaxes.
But in his Friday posting Zuckerberg struck a decidedly different tone. He said Facebook has been working on the issue of misinformation for a long time, calling the problem complex both technically and philosophically.
“While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap,” Zuckerberg said.
He outlined a series of steps that were already underway, including greater use of automation to “detect what people will flag as false before they do it themselves.”
He also said Facebook would make it easier to report false content, work with third-party verification organizations and journalists on fact-checking efforts, and explore posting warning labels on content that has been flagged as false. The company will also try to prevent fake-news providers from making money through its advertising system, as it had previously announced.
Zuckerberg said Facebook must be careful not to discourage the sharing of opinions or mistakenly restrict accurate content. "We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties," he said.
Facebook historically has relied on users to report links as false and share links to myth-busting sites, including Snopes, to determine if it can confidently classify stories as misinformation, Zuckerberg said. The service has extensive “community standards” on what kinds of content are acceptable.
Facebook faced international outcry earlier this year after it removed an iconic Vietnam War photo due to nudity, a decision that was later reversed. The thorniest content issues are decided by a group of top executives at Facebook, and there have been extensive internal conversations at the company in recent months over content controversies, people familiar with the discussions say.
Among the fake news reports that circulated ahead of the U.S. election were stories erroneously alleging that Pope Francis had endorsed Trump and that a federal agent who had been investigating Democratic candidate Hillary Clinton was found dead.
(Reporting by David Bailey in Minneapolis; Editing by Jonathan Weber and Mary Milliken)