Facebook is standing down from its effort to use fact-checking to suppress “misinformation,” dropping its partnerships with third-party fact-checking organizations and moving to a user-driven “community notes” model similar to the one on X. This was inevitable: a top-down infrastructure for stopping false ideas from spreading proved ineffective on several fronts. Content moderation is a human project, and the fact-checkers on whom the moderators relied to decide what’s true invariably brought their own preferences and biases to the process; those biases have overwhelmingly run leftward. Instead of helping a lot of people see the light (or whatever), this led much of the population to view moderation efforts with appropriate hostility. Of course, it didn’t help that Facebook was also suppressing a wide variety of ideological views and unpleasant opinions, a practice it will also wind down.
As Reed Albergotti writes for Semafor, Facebook’s approach to moderation was a “failed experiment,” and now it’s over.