Facebook has a secret internal system that exempts high-profile celebrities, athletes, politicians, and journalists from its content moderation standards, according to a report from the Wall Street Journal.
The program is called “cross check” or “XCheck,” and it was initially developed as “a quality-control measure for actions taken against high-profile accounts,” the Journal reported. It is intended to prevent “PR fires,” or the bad press that comes from an erroneous enforcement action taken against people considered VIPs.
The effect of this program is that there are “invisible elite tiers” within Facebook that determine who must follow the rules and who gets a special exemption that allows them to break the rules without fear of consequences. Approximately 5.8 million high-profile Facebook users were protected from enforcement action as of 2020, and only 10% of the posts protected by XCheck are actually reviewed, according to documents obtained by the Journal.
“At times, the documents show, XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users,” the Journal reported. “In 2019, it allowed international soccer star Neymar to show nude photos of a woman, who had accused him of rape, to tens of millions of his fans before the content was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook’s fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up ‘pedophile rings,’ and that then-President Donald Trump had called all refugees seeking asylum ‘animals.’”
“We are not actually doing what we say we do publicly,” an internal company review obtained by the paper said. The review said the program is “a breach of trust” and reinforced the point that “unlike the rest of our community, these people can violate our standards without any consequences.”
A spokesman for Facebook told the Journal that criticism of XCheck is fair, but that the system “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”
He said that Facebook is reviewing its policies to address criticisms that the company lacks the will or ability to enforce its content moderation standards fairly.
“A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross check and has been working to address them,” the spokesman said.
But according to the Journal, when it comes to the XCheck program, Facebook has misled the reviewers it put in place for oversight and accountability.
After the independent Oversight Board, which Facebook established to review its content moderation decisions, upheld the suspension of former President Donald Trump, the board requested that Facebook “report on the relative error rates and thematic consistency of determinations made through the cross check process compared with ordinary enforcement procedures.”
Facebook declined to do so, telling the board “it’s not feasible to track this information” and referring the board to a 2018 blog post that stated, “We remove content from Facebook, no matter who posts it, when it breaks our standards.”
But the XCheck documents reported by the Journal indicate that claim is not true.
The Oversight Board told the Journal in a statement that it “has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts.”