LONDON — Meta chief executive Mark Zuckerberg has ended the company's US fact-checking programme, which was meant to label viral, misleading content.
Meta will review posts only in response to user reports, Zuckerberg said recently.
Automated systems will still flag “high-severity violations” involving terrorism, child exploitation, scams and drugs. The changes apply only to Meta’s US operations, not to other regions.
Fact-checking organisations said the move could encourage hate speech online and fuel violence offline.
“Mr Zuckerberg doesn’t want to be in the business of arbitrating truth, but he is,” said Sarah Shugars, assistant professor of communication at Rutgers University, in New Jersey.
“The removal of fact-checkers and loosening of policies will only serve to discourage free speech and exacerbate bias on the platform. Claiming otherwise would be laughable if the repercussions were not so serious,” Shugars added, according to Reuters.
Here’s what you need to know about the new rules and their impact.
What are Meta’s new rules?
Instead of relying on trusted media organisations to fact-check posts, Meta will use “community notes” similar to those on X, formerly Twitter.
On X, users apply to contribute community notes. When enough users from “different perspectives” rate a note as “helpful”, it is publicly shown on a post.
X works out a user’s perspective based on the topics they choose to fact-check.
Meta will not write the community notes itself. A published note will require the backing of users “with a range of perspectives to help prevent biased ratings.”
Under the changes, Meta’s algorithm will no longer reduce the visibility of content rated poorly by fact-checkers (now users rather than professionals), and the company will make the labels on fact-checked content “less obtrusive.”
Who are Meta’s fact-checking partners?
Meta’s decision will have ramifications for global media and fact-checking websites.
In the United States, Meta partners on fact-checking with various news organisations including Agence France Presse, USA Today and Thomson Reuters’s Reuters Fact Check unit, among others. The Thomson Reuters Foundation, the charitable arm of Thomson Reuters, runs the Context media platform.
Around the world, Meta works with 90 fact-checking organisations, covering more than 60 languages.
Fact-checking organisations rely heavily on Meta’s funding for their revenue, according to a survey by the International Fact-Checking Network (IFCN).