Mark Zuckerberg, Meta’s chief executive, blamed the company’s fact-checking partners for some of Facebook’s moderation issues, saying in a video that “fact-checkers have been too politically biased” and have “destroyed more trust than they created.”

Fact-checking groups that worked with Meta have taken issue with that characterization, saying they had no role in deciding what the company did with the content that was fact-checked.

“I don’t believe we were doing anything, in any form, with bias,” said Neil Brown, the president of the Poynter Institute, a global nonprofit that runs PolitiFact, one of Meta’s fact-checking partners. “There’s a mountain of what could be checked, and we were grabbing what we could.”

Mr. Brown said the group used Meta’s tools to submit fact-checks and followed Meta’s rules that prevented the group from fact-checking politicians. Meta ultimately decided how to respond to the fact-checks, adding warning labels, limiting the reach of some content or even removing the posts.

“We did not, and could not, remove content,” wrote Lori Robertson, the managing editor of FactCheck.org, which has partnered with Meta since 2016, in a blog post. “Any decisions to do that were Meta’s.”

Meta is shifting instead to a program it calls Community Notes, under which it will rely on its own users, rather than third-party organizations, to write fact-checks. Researchers have found that such programs can be effective when paired with other moderation strategies.
