The Oversight Board of Meta said on Thursday that it had overturned two decisions by Facebook to remove content from its platform. The independent body’s findings reveal serious weaknesses in Facebook’s content moderation in two key areas: the platform’s use of automated systems to remove content and the removal of newsworthy content by human moderators.
The first case concerns a Facebook user in Colombia who, in September 2020, shared a cartoon depicting police violence by Colombia’s National Police. Sixteen months later, Facebook removed the user’s post after its automated systems matched the cartoon to an image stored in a Media Matching Service bank.
According to the Oversight Board, the image did not violate Facebook’s policies and should never have been added to the Media Matching Service bank; Facebook therefore erred in removing the user’s post.
The Oversight Board also found that other users were affected. Posts containing the image were removed, and 215 users appealed; 98 percent of those appeals to Meta were successful. Yet the cartoon remained in the bank, continuing to trigger automated detections and removals. Only when the Oversight Board decided to take up this case did Meta remove the image from the Media Matching Service bank.
In the second case, the Oversight Board found that Meta wrongly removed a news report about the Taliban. In January 2022, an Indian newspaper posted a link on its website to an article about the Taliban’s announcement that it would reopen schools for women and girls. Meta interpreted the post as “praise” for the Taliban and ruled that it violated the company’s Dangerous Individuals and Organizations policy.
As a result, Meta took down the post and restricted the newspaper’s access to certain Facebook features, including livestreaming. The newspaper’s appeal failed because of a shortage of Urdu-speaking reviewers at the company.
As in the first case, Meta reversed course, restoring the content and lifting the Facebook Page restrictions only after the Oversight Board took up the case. The board found that merely reporting on newsworthy events does not violate Facebook’s policies.
The Oversight Board, first proposed in 2018, was created to serve as a sort of Supreme Court for Meta’s content moderation decisions, and it announced the results of its first cases in January 2021. One of those early decisions, which required the reinstatement of a removed post that Muslim advocacy groups had called hate speech, drew harsh criticism. The board’s most famous judgment to date, however, is arguably its decision to uphold Meta’s ban of Donald Trump from Facebook. The former president was removed from the platform after the deadly January 6 riot at the Capitol.
The Oversight Board’s ruling did, however, compel Meta to set a time limit on Trump’s suspension. Shortly after that 2021 decision, Meta said it would consider allowing Trump back onto its platforms in January 2023. What may have seemed far off in June 2021 is now only a few months away. Don’t be shocked to see Trump’s name on one, two, twenty, or more Oversight Board cases if and when he returns to Facebook next year.