A major social media outlet is pumping the brakes on censorship and doing away with “fact-checking” ahead of President-elect Donald Trump’s return to office. Mark Zuckerberg, founder of Facebook and CEO of its parent company Meta, announced on Tuesday that the company, which also runs Instagram and WhatsApp, among other platforms, will “get back to our roots around free expression” by “replacing fact checkers with Community Notes, simplifying our policies and focusing on reducing mistakes.”
Zuckerberg explained, “Governments and legacy media have pushed to censor more and more. A lot of this is clearly political.” He went on to say that there is “legitimately bad” content on the internet, including child exploitation, requiring the construction of “complex systems” to monitor and moderate content on social media. However, he added, those “complex systems” can “sometimes make mistakes.” He said, “Even if they accidentally censor just 1% of posts, that’s millions of people, and we’ve reached a point where it’s just too many mistakes and too much censorship.” Zuckerberg noted, “The recent elections also feel like a cultural tipping point towards once again prioritizing speech.”
In a separate press release that links to Zuckerberg’s video, Meta chief global affairs officer Joel Kaplan said that free speech “can be messy. On platforms where billions of people can have a voice, all the good, bad and ugly is on display. But that’s free expression.” He admitted, “In recent years we’ve developed increasingly complex systems to manage content across our platforms, partly in response to societal and political pressure to moderate content. This approach has gone too far.”
One change Meta will make is ending its third-party “fact-checking” program and introducing a “Community Notes” system, similar to the one used on X (formerly Twitter). The Community Notes system allows users to flag incorrect, deceptive, or misleading content and relies on a cadre of users to add notes correcting, clarifying, or providing context to the flagged posts. In 2016, Facebook began working with third-party “fact-checkers” to review content flagged as potentially incorrect, deceptive, or misleading. Kaplan wrote, “Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact check and how. Over time we ended up with too much content being fact checked that people would understand to be legitimate political speech and debate.” He added, “Our system then attached real consequences in the form of intrusive labels and reduced distribution. A program intended to inform too often became a tool to censor.”
Over the next several months, Meta will be phasing out its “fact-checking” program and phasing in the “Community Notes” system, Kaplan shared. He explained that Meta will not write Community Notes and will not determine which posts are tagged with notes. “Community Notes will require agreement between people with a range of perspectives to help prevent biased ratings,” he emphasized. He added, “We intend to be transparent about how different viewpoints inform the Notes displayed in our apps, and are working on the right way to share this information.”
In addition to replacing “fact-checking” with “Community Notes,” Kaplan noted that posts that receive Community Notes will no longer be demoted, suppressed, or censored, nor covered by full-screen “misinformation” warnings. Instead, Meta “will use a much less obtrusive label indicating that there is additional information for those who want to see it.”
Another change Meta will make is removing restrictions it currently has in place on certain topics — namely, immigration and gender identity. “Over time, we have developed complex systems to manage content on our platforms, which are increasingly complicated for us to enforce. As a result, we have been over-enforcing our rules, limiting legitimate political debate and censoring too much trivial content and subjecting too many people to frustrating enforcement actions,” Kaplan explained. He estimated that anywhere from 10% to 20% of the “millions” of posts that Meta censored in December alone were “mistakes” and should not have been censored. “We want to undo the mission creep that has made our rules too restrictive and too prone to over-enforcement,” Kaplan said.
To that end, Meta will not only lift restrictions currently in place on “sensitive” topics like immigration and gender but is also revamping the system it uses to check for problematic content. Kaplan acknowledged that Meta currently uses automated systems to scan for content the company deems problematic, but those automated systems will now be restricted to looking for illegal content, such as child exploitation material, terrorist indoctrination, and the like. “For less severe policy violations, we’re going to rely on someone reporting an issue before we take any action,” Kaplan explained. He continued, “We also demote too much content that our systems predict might violate our standards. We are in the process of getting rid of most of these demotions and requiring greater confidence that the content violates for the rest.” A key part of this change, he said, will be moving Meta’s trust and safety teams out of California and relocating them to Texas.
Kaplan explained that Meta will also be adjusting its approach to political content. Up until now, he said, Meta has allowed users to mute political content or request to see less of it, and has even suppressed or demoted such content without being asked. Going forward, the company will “personalize” political content and “start treating civic content from people and Pages you follow on Facebook more like any other content in your feed…” Meta will use indicators such as whether a user likes a post, opens a post or video, or simply views a post for a longer period of time to curate, recommend, and promote political content for that user.
In addition to the changes Meta is making to its policies and programs, the company has also added a Trump ally to its board of directors. Dana White, CEO of the Ultimate Fighting Championship (UFC) and a friend of the president-elect, announced Monday that he is joining Meta’s board. “I’ve never been interested in joining a board of directors until I got the offer to join Meta’s board,” White said. He added, “I am very excited to join this incredible team and to learn more about this business from the inside.” White was among those who spoke at Trump’s election victory party in November and is credited with convincing podcaster Joe Rogan to invite Trump on his show in the days leading up to the election.
Facebook has garnered a reputation for censoring conservative political posts and speech, especially during the COVID-19 pandemic and the 2020 presidential election. Last year, Zuckerberg wrote a letter to Congress, explaining that Facebook had been pressured by the FBI, members of the Democratic Party, and President Joe Biden’s administration to censor and suppress conservative positions and posts, including an exclusive report from the New York Post detailing the contents of Hunter Biden’s laptop. While Zuckerberg admitted that “it was our decision whether or not to take content down, and we own our decisions,” he did note, “I believe the government pressure was wrong, and I regret that we were not more outspoken about it. I also think we made some choices that, with the benefit of hindsight and new information, we wouldn’t make today.”
S.A. McCarthy serves as a news writer at The Washington Stand.