Facebook and Instagram put business over human rights by giving special treatment to rule-breaking posts from politicians, celebrities and other high-profile users, Meta’s independent oversight body said on Tuesday.

A year-long investigation by the Oversight Board called for an overhaul of a system known as “cross-check” that effectively shields high-profile users from Facebook’s content rules.

“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the programme appears more directly structured to satisfy business concerns,” the panel said in a report.

“By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm”.

The Oversight Board, proposed by Meta CEO Mark Zuckerberg in 2018, said it learned about cross-check in 2021 while reviewing Facebook’s decision to suspend former US president Donald Trump.

The report comes after The Wall Street Journal revealed that Facebook operated a two-tiered content moderation system: regular users were subject to Facebook’s rules, while VIP users were secretly enrolled in the cross-check programme.

Trump was on the list, along with Zuckerberg himself and Brazilian football star Neymar.

Following the Wall Street Journal report, Meta asked the Oversight Board in September 2021 to review the system and suggest ways to fix it.

“It is disappointing in terms of the system that Meta has been depicting, that it complements its commitments to human rights or treating their users equally. It was not the case. It is a privilege-based system,” Nighat Dad, a Pakistani lawyer and Internet activist who sits on the Oversight Board, told Euronews Next.

“And the people who need more protections are rather left behind, for instance journalists, human rights defenders, people who are working in conflict zones,” she added.

The Board made 32 recommendations on how to fix the programme, saying that content which violates Meta’s rules with “high severity” in a first assessment “should be removed or hidden while further review is taking place”.

Dad said she is hopeful that Meta will implement most of the recommendations that the Board has given.

“I think the company really should set the precedent, because at least we now know that there are quality systems in place,” she said.

“It’s not only the job of the company to implement those recommendations, but also to set a precedent for other platforms, because at least people now know about the cross-check system and its methodologies, which have problems,” she added.

“But there are many companies and platforms which have these faulty systems, and people don’t even know about that”.
