Facebook and Instagram owner Meta said on January 7 that it was scrapping its third-party fact-checking program and replacing it with user-written Community Notes, a model similar to the dubious one used by Elon Musk’s social media platform X.
Starting in the U.S., Meta will end its fact-checking program with independent third parties. The company claimed that it decided to end the program because expert fact-checkers had “their own biases and too much content ended up being fact-checked.”
Instead, it will pivot to a Community Notes model that relies on crowdsourced fact-checking contributions from users, an approach that has come under fire for being easily manipulated.
“We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context,” Meta’s Chief Global Affairs Officer Joel Kaplan said in a blog post.
Kaplan said the new system will be phased in over the next couple of months, and the company will work to improve it over the course of the year. As part of the transition, Meta will replace the warnings overlaid on posts, which users are forced to click through, with labels.
The Associated Press participated in Meta’s fact-checking program before ending its involvement a year ago.
The social media company also said it plans to allow “more speech” by lifting restrictions on some topics that are part of mainstream discussion, such as immigration and gender, in order to focus on illegal and “high severity violations” like terrorism, child sexual exploitation and drugs.
Meta said that its approach of building complex systems to manage content on its platforms has “gone too far” and has made “too many mistakes” by censoring too much content.
CEO Mark Zuckerberg acknowledged that the changes are in part sparked by political events including Donald Trump’s presidential election victory.
“The recent elections also feel like a cultural tipping point towards once again prioritizing speech,” Zuckerberg said in an online video.
Meta’s quasi-independent Oversight Board, which was set up to act as a referee on controversial content decisions, said it welcomed the changes and looked forward to working with the company “to understand the changes in greater detail, ensuring its new approach can be as effective and speech-friendly as possible.”
Critics see Meta’s latest move as cover for abandoning its pledge to moderate the violent and extremist right-wing rhetoric that has poisoned so many social media platforms since Trump and the MAGA movement came to dominate American politics.
Brendan Nyhan, a political scientist at Dartmouth College, called the Meta changes part of “a pattern of powerful people and institutions kowtowing to the president in a way that suggests they’re fearful of being targeted.”
Nyhan said that was a grave risk to the country.
“We have in many ways an economy that’s the envy of the world and people come here to start businesses because they don’t have to be aligned with the governing regime like they do in the rest of the world,” Nyhan said. “That’s being called into question.”
Aside from YouTube, Meta’s Facebook is by far the most widely used social media platform in the U.S. According to the Pew Research Center, about 68% of American adults use Facebook, a figure that has held largely steady since 2016. Teenagers, however, have fled the platform over the past decade, with just 32% reporting in a 2024 survey that they used it.
Meta began its fact-checking program in December 2016, after Trump was elected to his first term, in response to criticism that “fake news” was spreading on its platforms. For years, the tech giant boasted that it was working with more than 100 organizations in over 60 languages to combat misinformation.
Media experts and those who study social media were aghast at Meta’s policy shift.
“Mark Zuckerberg’s decision to end Meta’s fact-checking program not only removes a valuable resource for users, but it also provides an air of legitimacy to a popular disinformation narrative: That fact-checking is politically biased. Fact-checkers provide a valuable service by adding important context to the viral claims that mislead and misinform millions of users on Meta,” said Dan Evon, lead writer for RumorGuard, the News Literacy Project’s digital tool that curates fact checks and teaches people to spot viral misinformation.
The announcement came the same day Meta appointed three new members to its board of directors, including Dana White, president and CEO of Ultimate Fighting Championship.
White is a familiar figure in Trump’s orbit: he had speaking roles at the 2016, 2020 and 2024 Republican conventions and appeared on stage at Trump’s election victory party in November, even speaking briefly to the crowd.
Tapping White for the board is seen as Zuckerberg’s latest maneuver to improve ties with Trump, who was once banned from Facebook for inciting the January 6 insurrection, in which his supporters attacked the Capitol to prevent Congress from certifying President Joe Biden’s 2020 election victory.
After Trump’s election victory in November, Zuckerberg dined at the president-elect’s Mar-a-Lago club in Florida, and Meta donated $1 million to Trump’s inauguration fund.