The Oversight Board Delivers Its Verdict on Meta's Cross-Check Program
Meanwhile, Musk's antics at Twitter are making Meta look smart by comparison.
The topic at hand was Meta's controversial cross-check program, which gave special treatment to posts from certain powerful users: celebrities, journalists, government officials, and the like. For years the program operated in secret, and Meta even misled the board about its scope. Leaked details showed that millions of people received special treatment, meaning their posts were less likely to be taken down if reported for breaking rules against hate speech. The idea was to avoid mistakes in cases where errors would have more impact, or embarrass Meta, because of the prominence of the speaker. According to internal documents, even Meta's own researchers had reservations about the project. Only after that exposure did Meta ask the board to examine the program and recommend what the company should do with it.
The meeting I witnessed was part of that reckoning, and the tone of the discussion led me to wonder whether the board would suggest that Meta shut down the program altogether, in the name of fairness. “The policies should be for everyone!” one board member cried out.
That didn’t happen. This week the social media world took a pause from the content-moderation train wreck that Musk is conducting at Twitter, as the Oversight Board delivered its long-delayed cross-check report. (Meta never did provide the board with a list identifying who got special permission to stave off a takedown, at least until someone took a closer look at the post.) The conclusions were scathing. Meta claimed that the program’s purpose was to improve the quality of its moderation decisions, but the board determined that it did more to protect the company’s business interests. Meta never set up processes to monitor the program and assess whether it was fulfilling its mission. The lack of transparency to the outside world was appalling. And when cross-check did flag rule-breaking posts from its VIPs, Meta failed to deliver the quick, personalized review it had promised: there were too many cases for the team to handle, and violating posts frequently remained up for days before being given secondary consideration.
The cross-check program came under scrutiny after reports that certain users, including politicians, journalists, and advertisers, were shielded from Meta’s normal content moderation processes.
Facebook parent Meta announced a revamp of the cross-check system on Friday, after facing criticism that it gave VIPs special treatment by applying a different review process to their posts than to those from regular users. The company said it would implement, in part or in full, many of the recommendations the Oversight Board made for improving the program, though it stopped short of adopting them all, including a call to publicize which high-profile accounts qualify. The Oversight Board said Friday that the changes could render Meta’s approach to mistake prevention more fair, credible, and legitimate.
In its Transparency Center, Meta will try to distinguish between accounts included in the enhanced program for business reasons and those included for human rights reasons. Meta will also refine its process for temporarily removing or hiding potentially harmful content while it is pending additional review. And the company said it would work to ensure that cross-check content reviewers have the appropriate language and regional expertise “whenever possible.”
However, the company declined to publicly mark the pages of state actors and politicians included in the cross-check program, saying such public identifiers could make those accounts “potential targets for bad actors.”
Meta said in a policy statement that it is “committed to maintaining transparency with the board and the public as we continue to execute on the commitments we are making” regarding the cross-check program.