ABUJA, Nigeria — Meta’s oversight board said Tuesday it was deeply concerned by the company’s failure to remove a disturbing video that circulated widely, showing two men who appeared to have been beaten for allegedly being gay.
Posted in Nigeria, the video reflects the climate created by laws against homosexuality in more than 30 African countries, where public sentiment often supports such bans despite constitutional guarantees of freedom. These laws are frequently wielded to persecute and unjustly detain people suspected of belonging to the LGBTQIA+ community, fueling human rights abuses that often go unaddressed.
In its decision, the board described the harm caused by the video, which accumulated more than 3.6 million views between December 2022 and February 2023, as “immediate and impossible to undo.”
The board said the video “shared and mocked violence and discrimination,” noting that although it was reported multiple times and reviewed by three human moderators, it remained on Facebook for nearly five months despite violating four separate community standards.
The panel said the video’s prolonged visibility markedly increased the chances of the victims being identified and risked emboldening further violence against the LGBTQIA+ community in Nigeria. “Even after the video’s removal, research by the Board indicates that fragments of the same content remained on Facebook,” it stated.
In the graphic footage, a crowd can be seen questioning two bleeding men about their identities and sexual orientation.
Meta was not immediately available for comment, but the company acknowledged two critical errors related to the video. The oversight panel said Meta’s automated systems misidentified the language as English when it was in fact Igbo, a language of southeastern Nigeria that Meta does not support for at-scale content moderation. Human reviewers, meanwhile, mistakenly identified the language as Swahili.
“This situation raises important questions about how content in languages that lack significant support is managed, the languages that the company has chosen to support for extensive reviews, and whether translation accuracy is sufficient for reviewers who deal with multiple languages,” the panel stated.
In its recommendations, the board urged Meta to revise its Coordinating Harm and Promoting Crime Community Standard to include explicit examples of the risks of outing marginalized groups. It also called on Meta to evaluate how accurately it enforces protections against exposing the identities or locations of people believed to belong to such groups, to improve language detection for unsupported languages, and to provide accurate translations to reviewers during the review process.