Meta’s rules for checking VIP posts have caused ‘real harm’ and need ‘overhaul’, review says
On Facebook and Instagram there are rules about what can and can’t be posted.
They can change from time to time, as can the way they are enforced by human and robot moderators. But in theory the rules are the same for every one of the sites’ nearly five billion users.
Unless, that is, you happen to be a politician, celebrity or a business partner of Facebook and Instagram’s parent company Meta.
Their posts, and those of around 5.8 million other influential users, are passed through a special VIP channel known as cross-check, which gives them extra leeway to break Meta’s rules.
The exemptions can be significant. If a normal user’s post is flagged by the automated moderation system, it will be taken down immediately.
If a VIP’s post is flagged, it will stay up while human moderators take a second (or even a third, fourth or fifth) look at it.
In 2019, for instance, the Brazilian footballer Neymar posted intimate imagery of another person on his Facebook and Instagram accounts, reportedly without the permission of the person involved.
The video was a clear breach of Meta’s content policies, which forbid even relatively mild forms of nudity. Yet, according to the Wall Street Journal, it was left online for over a day and received 56 million views before it was taken down.
The reason for the delay? Neymar, who later announced a business deal with Meta to promote Facebook Gaming, was on the list for cross-check, which was struggling to deal with a backlog at the time.
This kind of delay, which on average lasts five days, rising to 12 in the United States and 17 in Syria, is one of several aspects of cross-check sharply criticised by Meta’s Oversight Board, the semi-independent internal “court” set up by Mark Zuckerberg to advise on difficult issues around moderation.
The board has been reviewing the programme since last year, when whistleblower Frances Haugen revealed the scale of the system by leaking internal company documents to the Wall Street Journal.
In a report published on Tuesday, the board calls on Meta to overhaul the programme, arguing that it “prioritises users of commercial value” over its “human rights responsibilities and company values”.
The system has caused “real harm”, Oversight Board director Thomas Hughes told Sky News. Yet he stopped short of calling for the system to be disbanded, saying “you do need to have some kind of secondary review process”.
The board called on Meta to overhaul cross-check by making the process quicker and more transparent, and by refocusing it on human rights-related issues, such as the accidental removal of journalistic material.
It says Meta should set out clear criteria for inclusion in cross-check and publicly mark accounts that are covered by the system, particularly those of state actors and business partners. At present, even people who are subject to cross-check don’t know they are listed.
The report says that Meta prefers under-enforcing its rules to avoid creating a “perception of censorship” or stirring up “public controversy” for commercial partners, especially those who can create trouble for senior Meta executives.
To prevent harmful posts staying up during lengthy reviews, the board suggests that content flagged as “high severity” on first review should be taken down while it is reassessed.
Meta does not have to follow the board’s suggestions and has declined to do so on several notable occasions, although Mr Hughes said the company tended to implement most recommendations. In this case, there are 32.
“They won’t implement them all, but given the implementation rate to date, I think they will implement the majority,” said Mr Hughes. “The board thinks these recommendations are achievable.”
Yet despite calling for Meta to “radically increase transparency around cross-check”, the board struggled to generate full transparency itself, and many crucial details are missing from its report.
The board did not find out who exactly is on the cross-check list, despite “repeatedly asking”. It was not able to confirm the exact number of people on the list, nor obtain detailed examples of posts that had been cross-checked.
“This limited disclosure impairs the board’s ability to carry out its mandated oversight responsibilities,” the board complained in its report.
The board previously said that Meta had “not been fully forthcoming” about cross-check, failing to mention the programme during the board’s review of the suspension of President Trump, and then saying it was small when in fact it involved millions of users.
Yet although whistleblower Ms Haugen accused Meta of “repeatedly lying” about the scheme, Mr Hughes disagreed, saying he believed the information the board had been given was “accurate” and “fulsome”, and that the board had “flexed its muscles” to investigate the programme.
Critics argue that Meta’s underlying problems are too big for the Oversight Board to fix, because implementing its most substantial suggestions would require the company to employ tens of thousands more human moderators, especially in countries outside the US and Canada.
The board found that these two countries account for 42% of cross-checked content despite having only 9% of monthly active users.
“The Haugen documents show a picture of systemic inequality in which the US, for all its moderation problems, gets the lion’s share of the moderation resource and nearly everywhere else gets basically nothing,” says Cori Crider, director of Foxglove, which is suing Meta on behalf of former Facebook content moderator Daniel Motaung.
“Until that imbalance is redressed, I can’t see how the Oversight Board’s opinions make much difference.”