Why Zuckerberg’s Pivot on Meta Content Moderation Is a Win for Free Thinkers

Earlier this week, Meta CEO Mark Zuckerberg announced sweeping changes to the social media conglomerate’s content moderation policies. Admitting that the censorship on platforms like Facebook and Instagram has “gone too far,” Zuckerberg promised to “get rid of fact-checkers and replace them with community notes similar to X starting in the US.” 

That’s good news for free speech. If the past several years have taught us anything, it’s that bottom-up regulation works far better than top-down command-and-control policies when it comes to the marketplace of ideas.

Why is top-down regulation a bad idea? For one thing, it can lead to politically motivated censorship. In October 2020, Stanford epidemiologist Dr. Jay Bhattacharya, together with Drs. Martin Kulldorff (of Harvard) and Sunetra Gupta (of Oxford), published the Great Barrington Declaration. As Bhattacharya described it, the Declaration “called for an end to economic lockdowns, school shutdowns, and similar restrictive policies on the grounds that they disproportionately harm the young and economically disadvantaged while conferring limited benefits to society as a whole.” Instead of engaging the luminaries behind the Declaration in a debate about COVID policy, the Biden regime censored them. The CDC and the Biden White House pressured social media companies to shadow-ban the authors. Search engines and social media companies (including Facebook) censored even mentions of the Declaration. YouTube took down public policy roundtable videos that featured these scientists.

A key lesson from the COVID era is that censors are rarely noble truth-seekers. More often, they’re as partisan as the rest of us. Faced with a choice between debating their opponents in the open or censoring them, many public officials took the latter (and easier) route. The consequences for our nation’s COVID policy, which would have benefited immensely from an open debate between proponents and opponents of lockdowns, were disastrous.

It’s also true that in today’s climate, most of the fact-checkers Facebook partnered with to decide which information ought to be allowed on the platform are partisan. AllSides, an organization that rates the political bias of different news organizations, notes that FactCheck.org, PolitiFact, Snopes, and TruthOrFiction.com all lean left. Maybe that’s why, as early as 2019, Pew reported that 70 percent of Republicans said that fact-checking efforts tended to favor one side, compared to just 29 percent of Democrats.

Two specific areas in which Meta is loosening the restrictions on what can and cannot be said are immigration and gender. This is a crucial change. Many leftists assume that the more important a debate is to people’s lives, the more we need censors to crack down on misinformation. In reality, the opposite is true. The higher the stakes of any given debate, the more essential it is that all sides be allowed to speak so that we as a society can better approach the truth. On topics where the impact is measured in lives, a bottom-up approach like Community Notes, in which the counter to false speech is recognized to be more speech, is essential.

The insight that a bottom-up approach works better in the marketplace of ideas than top-down censorship owes its origins to Austrian economics. Austrian economist Friedrich Hayek won the Nobel Prize in Economics in part for his deep exploration of how markets function. In his seminal paper “The Use of Knowledge in Society,” Hayek explained why markets work so well. Knowledge about the world, he said, isn’t concentrated in a few public policy experts. Instead, it is distributed. Each of us holds a tiny piece of the puzzle; thanks to everything from our upbringing to our brains to our current work and location in the world, each of us has a small insight that no other human has. Because knowledge is distributed, we get the best outcomes when we build systems that encourage everyone to share their piece with everyone else, so that together we can assemble the gigantic jigsaw puzzle that is knowledge of the world. Markets are just such a system. When it comes to the marketplace of ideas, X’s Community Notes feature does the same thing.

Of course, the system isn’t perfect. At Tangle, journalist Isaac Saul says that Community Notes “often needs 24-48 hours before a post gets a note under it warning users that it is overtly and obviously false.” “By then,” he warns, “it usually has millions of views, and the truth never gets the chance to catch up.” That’s a real problem, and one that both Facebook and X should work to address. But 24-48 hours to correct errors is still a lot faster than the several years it took the social media companies that censored Bhattacharya to see their own mistake and allow him back on their platforms.

There’s another benefit to bottom-up regulation such as that offered by Community Notes: it empowers ordinary citizens. Saul explains that in ultimate frisbee, which he spent many years playing and coaching at a high level, some games have a bottom-up system of rules enforcement in which “Players call their own fouls and violations, following a rulebook with instructions for what to do in situations where they disagree on what happened.” Other games have formal referees. The former system, he says, “demands a level of accountability, honesty, and honor among participants that referees does [sic] not.” In refereed games, by contrast, players have “an incentive to see how much you can get away with without getting caught.”

The spread of misinformation and disinformation is a real problem in our society, but perhaps the best way to address that problem is to train ordinary citizens to spot faulty logic, to call out each other’s false statements, and to research claims before sharing them. Community Notes seems likely to help do that. Top-down content moderation, by contrast, runs the risk of outsourcing our civic responsibility to third-party censors and so atrophying the muscles that each of us needs to develop in order to be a good citizen of a democratic republic.

That said, we should be careful not to over-praise Zuckerberg. Commentators on the left and the right agree that the timing of Meta’s pivot feels tailored to take advantage of a shift in the political winds. As Saul points out, Zuckerberg hired friend-of-the-new-administration Joel Kaplan as Meta’s new Chief Global Affairs Officer, pledged to work with the Trump administration, and had Kaplan announce Meta’s shift in policy on Fox News. These actions do not paint the picture of a Meta CEO who is following his internal compass so much as one who is trying to curry favor with the new regime.

But what we sorely need is more men and women of courage. In a 2019 speech, Zuckerberg said that free expression was a paramount virtue in a democracy. If he truly believes that, then he needs to have the courage of his convictions. That courage would mean being more open about why his companies so thoroughly abandoned free expression as a virtue just a few years ago. It would also mean making a commitment: that in the future, his companies will continue to live out this foundational virtue no matter which way the political winds blow.