One of Meta’s top executives has blamed “so-called experts” at the company’s former “fact-checkers” for having “destroyed a lot of trust and credibility” in Facebook through “partisan political bias” and “censorship.”
The comments were made by Meta’s Global Affairs Officer Joel Kaplan while announcing major new changes to content moderation on its platforms: Facebook, Instagram, Threads, and WhatsApp.
During an interview with Fox News, Kaplan detailed the changes that come into effect next week.
Meta is launching its new Community Notes program next week to replace its biased third-party “fact-checking” program.
The company credited Elon Musk’s X with successfully pioneering the idea.
Rather than “fact-checking” posts, the system allows other users to add “notes,” either supporting or countering certain statements.
Under Facebook’s old system, left-wing “fact checkers” could limit posts’ distribution and block them from being seen by other users, even leading to people being banned for their opinions.
Kaplan revealed that content with community notes applied will not be limited in distribution to users nor have penalties imposed.
Meta announced in January that it had ended its “fact-checking” program and lifted restrictions on speech to “restore free expression” across Facebook, Instagram, and its other platforms.
The company said its content moderation practices had “gone too far.”
Kaplan revealed that President Donald Trump’s re-election has given Meta an “opportunity” to “go back to its roots of free expression.”
“We had a third-party, fact-checking program, which was well-intentioned at the beginning but proved to be really prone to partisan political bias and destroyed a lot of trust and credibility in the system,” Kaplan said.
“We decided to replace that system, starting in the United States with a crowdsourced, community-based approach, which we announced in January.”
Next week, Meta is opening the new Community Notes program for users to write and rate notes on content across Facebook, Instagram, and Threads.
“We’ve developed a waitlist that actually has a couple of hundred thousand people on it, a broad cross-section of Americans who use Facebook and Instagram who want to be able to add context to the content that they are seeing when they think it is misleading,” Kaplan said.
“And the great thing about community notes is that, first of all, instead of a handful of so-called experts like the third-party fact-checkers, it’s our community, which is broad based, ideologically diverse people from across the political spectrum.”
Meta insists that it will not decide what gets rated or what gets written.
Instead, it will be determined by the contributors from the Facebook, Instagram, and Threads communities.
Kaplan noted that Meta is borrowing the note-rating algorithm used by X, which Musk’s company has open-sourced.
“The algorithm only applies a community note when people who normally disagree agree that something is misleading,” Kaplan said.
“And that’s the way that you ensure that the bias that crept into the third-party fact-checking system isn’t a part of this system.”
“Another thing that it won’t do that the third-party fact-checking program did is it doesn’t apply any penalty,” Kaplan explained.
“The third-party fact-checking program, in addition to the bias, had penalties attached to it, where if something was rated false, we would dramatically reduce its distribution,” Kaplan continued.
“And that turned a program that was intended to be about providing additional information into one that was essentially a censorship tool.”
Meta’s third-party fact-checking program was put in place after the 2016 election.
It had been used to “manage content” and “misinformation” on its platforms.
However, executives admitted the system had “gone too far,” largely due to “political pressure.”
“The community notes program is just about providing additional information and context so people can make their own decisions, but it doesn’t apply any distribution penalties or limit the flow of information through the algorithm,” Kaplan said.
Kaplan said Meta believes users “should see both the posts and then also the additional information to give them context about the post.”
“We want to make sure that the full range of information is provided,” he said.
The Community Notes will be limited to 500 characters and will be written by contributors in Meta’s program.
“Individual members of the community will write and submit notes, and then other members of the community will get to say, ‘Yeah, that looks right to me,’ within the system,” he said.
“And once the algorithm determines that it received a critical mass of support from people who usually disagree, that is the check on the bias.”
“All the changes we made in January were in the service of returning to our roots of free expression, and the third-party fact-checking program has become an impediment to that,” Kaplan said.
“A community-based system that empowers our users to just provide additional information that people find helpful, I think, is a really big improvement on voice and expression on the platform.”
As for who can contribute community notes, Meta told Fox News Digital that contributors must be over 18 years old and have an account that is more than six months old and in good standing.
The user must also have either a verified phone number or be enrolled in two-factor authentication.
The community notes feature will initially be available in six languages commonly used in the United States: English, Spanish, Chinese, Vietnamese, French, and Portuguese.
Meta will expand to other languages down the line.