Meta has made a notable choice in the United States. The platform is discontinuing its use of independent fact-checkers: professionals and organizations that investigate whether news and other information is accurate and attach warnings to false or misleading posts. Instead, Meta is introducing the Community Notes system, which lets users add context to posts themselves. The idea comes from X (formerly Twitter). But is this a smart move or a risk?
Meta opts for Community Notes
Meta is discontinuing independent fact-checkers on Facebook and Instagram in the US. Under the new Community Notes system, users collaborate to add explanations and context to posts. According to CEO Mark Zuckerberg, this aligns with the core value of free speech. He described the old fact-checking program as "too politically biased" and said the change was necessary to regain users' trust.
Political pressure plays a role
The timing is striking: this change comes just before Donald Trump is sworn in as president again. Trump and his supporters previously referred to the old fact-checking policy as censorship of right-wing voices. Trump welcomed the change and called it a step in the right direction. Critics, however, see this as a politically motivated decision to accommodate the new administration.
Concerns about hate and disinformation
Organizations like Global Witness are concerned about this decision. They fear that the absence of professional fact-checkers will lead to more hate and misinformation. Fact-checking organizations, such as Full Fact, are also critical. They believe that independent oversight remains essential and describe this step by Meta as a risk in the fight against misinformation worldwide.
What does this mean for moderation?
Meta states that Community Notes is a fairer system, in which users collaboratively add context to posts. This can lead to greater nuance, but it also raises questions. Some sensitive topics, such as posts about self-harm or eating disorders, remain under strict supervision. The new system is currently only being used in the US; in Europe, the old fact-checking program remains active.
This change indicates that Meta is taking a new approach to content moderation. Whether this will result in a better balance between free speech and the fight against misinformation remains to be seen.

