In a move that reflects Silicon Valley’s eagerness to mend ties with President-elect Donald Trump, Meta CEO Mark Zuckerberg announced on Tuesday that Facebook will shut down its “fact-checking” program later this month. The decision ends a practice that has long been controversial, especially among conservatives who felt silenced during the pandemic. Zuckerberg, 40, shared the news in a company-released video, while Joel Kaplan, Meta’s chief global affairs officer, appeared on “Fox & Friends” to provide further details. In the video obtained by Fox News, Zuckerberg said, “We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms.” He added that fact-checkers will be replaced in the U.S. with a Community Notes system similar to X’s.
Kaplan emphasized on “Fox & Friends” that the change is aimed at returning Facebook to its core mission of promoting free speech, echoing Zuckerberg’s sentiment that it is an opportunity to reset the balance in favor of free expression. Facebook’s fact-checking program grew largely out of concerns over Russian interference in the 2016 election. An FBI investigation revealed that Russian actors used fake accounts, or “bots,” on Facebook to spread disinformation and incite discord before Election Day. Following those revelations, Zuckerberg apologized before Congress for the company’s inadequate control over disinformation and for data scandals such as the Cambridge Analytica incident.
This one sentence from Mark Zuckerberg proves he's serious about ending the censorship on his platforms.
"We're going to move our trust and safety and content moderation teams out of California. And our US-based content review is going to be based in Texas."
This means Silicon… pic.twitter.com/xXsej9E8qD
— George (@BehizyTweets) January 7, 2025
At the time, Facebook faced significant political pressure to overhaul its oversight processes. As Kaplan explained on Fox News, the company turned to independent third-party fact-checkers but soon concluded there was too much political bias in their choices of what content to fact-check. As a result, Meta is ending the process entirely and introducing a community-driven notes system similar to X’s approach.
Instead of relying on so-called experts who may have their own biases, the new system will depend on platform users providing commentary on content they have read. If a note garners broad support from users, it can be attached to the post for others to see. Kaplan argued this approach is superior because it moves away from reliance on experts and toward community involvement.
Content moderation policies will also change, since current practices are deemed “too restrictive” and limit discussion of sensitive topics such as immigration and gender issues. Kaplan told Fox News Digital that the company wants discourse to happen without fear of censorship and is revising not only the written rules but also how they are enforced.
Facebook’s automated systems have often erred by removing content that did not violate its standards; even so, certain topics, such as terrorism and illegal activity, will still be moderated carefully.