Meta’s ‘Cross-check’ Program Prioritizes Business Over Human Rights, Says Oversight Board

The Oversight Board has issued its full policy advisory opinion on Meta's cross-check program, more than a year after Meta asked the Board for feedback on the matter.

According to a report by Engadget, the Board concluded that the program's special content-moderation process for high-profile users puts the company's business interests above users' rights.

Board’s Conclusion

“In our review, we found several shortcomings in Meta’s cross-check program,” the Board states. 

While Meta has assured the Board that cross-check is meant to further the company's human rights commitments, the Board found that the program appears more directly geared toward serving business concerns.

The criticism echoes claims made last year by whistleblower Frances Haugen, who exposed internal details of cross-check and accused Meta of "choosing profits over safety."

Cross-check Program

Cross-check, or "xcheck," is an internal Facebook and Instagram system that shields prominent users from the platforms' automated content moderation. 

Meta has described it as a "second layer of review" that helps prevent posts from being mistakenly removed. 

Haugen's revelations showed that the program covers millions of accounts and has allowed billions of views of posts that would otherwise have been deleted. 

Meta has been accused of not being “fully forthcoming” about the system, which has been at the center of the Oversight Board’s handling of the suspension of former President Donald Trump.

The Oversight Board's policy advisory opinion (PAO) on the program provides the most in-depth analysis of Meta's evolving cross-check rules to date. 

The Board extensively discusses Meta's two distinct cross-check processes: Early Response Secondary Review (ERSR), for select high-profile users chosen by Meta, and General Secondary Review (GSR), a newer system that uses a classifier to automatically flag certain types of posts across the platform for additional review. 

GSR, which can apply to content from any Facebook or Instagram user, was launched in 2021 "in reaction to criticism" following Haugen's disclosures in the Facebook Papers.

Review of Cases

Both cross-check processes have problems, the Oversight Board says. Each has a "consistent backlog of cases," which prolongs the time rule-breaking content stays online. 

Meta informed the Board that cross-check decisions can take more than five days. As a result, content that violates Meta's rules remains on Facebook and Instagram while it is most viral and potentially most harmful.

The Oversight Board criticizes various “business” elements in Meta’s cross-check regulations. 

For example, it argues that Meta is skewed toward under-enforcement of cross-checked content because of concerns about the appearance of censorship and the impact that appearance could have on the company. 

The Board reads this to suggest that mitigating the "perception of censorship" may take precedence over Meta's other human rights responsibilities in content moderation.

Recommendations for Meta

The Board makes several recommendations on cross-check. It says Meta should use specialized teams, insulated from political or commercial influence, to decide which accounts receive cross-check protections. 

It also proposes a "transparent strike mechanism" to revoke cross-check status from accounts that abuse it.

The Board further advises that Meta notify all cross-checked accounts and publicly label the pages and accounts of entities receiving list-based protection. 

It also wants Meta to track cross-check accuracy metrics and reduce its case backlogs.

ⓒ 2022 All rights reserved. Do not reproduce without permission.
