Elon Musk’s social media antics are making Meta look smart
Meta’s Oversight Board Reports on Cross Check: How Facebook Failed to Come Clean About Its VIP Moderation Program
It was the first day of April 2022, and I was sitting in a law firm’s midtown Manhattan conference room at a meeting of Meta’s Oversight Board, the independent body that scrutinizes the company’s content decisions. It seemed that despair had set in.
Cross Check, the program that gives certain high-profile users an extra layer of review before enforcement actions are taken, was called out by the Oversight Board in a December policy recommendation for being set up to appease business concerns and for being potentially harmful to users. The board, an entity financed by Meta but which says it operates independently, urged the company to “radically increase transparency” about the cross-check system and how it works.
That didn’t happen. This week the social media world took a pause from lookie-looing the operatic content-moderation train wreck that Elon Musk is conducting at Twitter, as the Oversight Board finally delivered its Cross Check report, delayed by Meta’s foot-dragging in providing information. (The company never did give the board a list of who received the program’s special treatment.) The conclusions were scathing. Meta claimed that the program existed to improve the quality of its content decisions, but the board concluded it was more about protecting the company’s business interests. Meta never set up processes to monitor the program and assess whether it was fulfilling its mission. The lack of transparency to the outside world was appalling. And when posts flagged for cross-check were spared quick takedowns pending the promised personalized review, Meta often failed to deliver that review: there were simply too many cases for its team to handle, and violating posts stayed up for days before being given the attention they deserved.
The prime example, featured in the original Wall Street Journal report that exposed the program, was a post from Brazilian soccer star Neymar, who shared a sexual image of a woman without her consent in September 2019. Because of the special treatment he got as a member of the cross-check elite, the image, a flagrant policy violation, garnered over 56 million views before it was finally removed. The program meant to reduce the impact of content-moderation mistakes wound up boosting the impact of horrible content.
As of 2020, the program had ballooned to include 5.8 million users, the Journal reported. In its report, the Oversight Board said that Facebook had failed to give it crucial details about the system. Meta responded that the criticism was fair but that cross-check was created to improve the accuracy of moderation on content that requires more context to evaluate.
On Friday, Meta said it would revamp the moderation system that had drawn criticism for giving preferential treatment to certain users, committing to implement, in part or in full, many of the more than two dozen recommendations the Oversight Board made for improving the program. But the company did not adopt all of the recommended changes, including a suggestion to publicly identify which high-profile accounts are eligible for the program.
Among the changes it has committed to make, Meta says it will aim to distinguish between accounts included in the enhanced review program for business reasons and those included for human rights reasons, and to detail those distinctions to the board and in the company’s Transparency Center. Meta will also refine its process for temporarily removing or hiding potentially harmful content while it is pending additional review. And the company will work to ensure that cross-check content reviewers have the appropriate language and regional expertise whenever possible.
In declining to publicly mark the pages of state actors, political candidates, and other figures included in the cross-check program, the company said that such public identifiers could make those accounts “potential targets for bad actors.”
Meta said in its policy statement that it was committed to maintaining transparency with the board and the public.