Meta’s move away from fact-checking in its content moderation practices could allow more hate speech, misinformation and disinformation to spread, a Northeastern University social media expert says. Meta is adopting a model similar to the one used by X, called Community Notes.

John Wihbey, an associate professor of media innovation and technology at Northeastern University, sees the move as the company repositioning itself ahead of President-elect Donald Trump’s inauguration. But third-party fact-checking, while difficult to scale on a platform with billions of users, “is an important symbol of commitment to trust and safety and information integrity,” Wihbey says. It is “dangerous,” he says, to break from those norms at a moment when “the winds of authoritarian populism are blowing across the globe.”
In a video message, Meta founder and CEO Mark Zuckerberg described the shift as part of an effort to “get back to our roots around free expression,” noting, among other things, that the company’s fact-checking system has resulted in “too many mistakes and too much censorship.”