The Oversight Board says Meta was wrong to remove videos of the Israel-Hamas war

The Oversight Board Criticizes Meta's Automated Moderation of Videos About Hostages and Injured Civilians in the Israel-Hamas War

Meta's Oversight Board has criticized the company's automated moderation tools for being too aggressive after two videos about hostages and injured civilians in the Israel-Hamas war were removed from its platforms. The review panel found that the posts should have remained live, and that removing them came at a high cost to freedom of expression and access to information. A warning to readers: the descriptions of the content below may be disturbing.

One of the videos, removed from Facebook, depicts a woman pleading with her abductors not to kill her. The other video appears to show the aftermath of an Israeli strike on or near al-Shifa Hospital, with footage of killed or injured Palestinians.

The board says that, in the case of the latter video, both the removal and the rejection of the user's appeal to restore the footage were carried out by Meta's automated moderation tools, without any human review. After the board took up the case, the videos were restored with a warning screen.

Meta also demoted the two reviewed posts, excluding them from being recommended to other Facebook and Instagram users, even though the company acknowledged that the posts were intended to raise awareness. Meta has since responded to the board's decision to overturn the removals, saying that because the panel issued no recommendations, there will be no further updates to the case.

“We as the board have recommended certain steps, including creating a crisis protocol center, in past decisions,” Michael McConnell, a cochair of the Oversight Board, told WIRED. “It is going to remain. But my hope would be to provide human intervention strategically at the points where mistakes are most often made by the automated systems, and [that] are of particular importance due to the heightened public interest and information surrounding the conflicts.”

Meta wrote in a company blog post that it welcomed the Oversight Board's decision: “Expression and safety are important to us and the people who use our services.”
