Facebook fails to flag violent hate speech in ads
Facebook has once again failed to detect violent hate speech in advertisements submitted to the platform by the nonprofit groups Global Witness and Foxglove. Global Witness ran a similar test in March, which the social media giant also failed.
Whistleblower Frances Haugen said in her 2021 congressional testimony that Facebook's moderation methods are ineffective and that the platform is "literally fanning ethnic violence".
Global Witness created 12 text-based ads using dehumanising hate speech calling for genocidal actions against three main ethnic groups in Ethiopia: the Amhara, the Oromo, and the Tigrayans. Facebook approved the ads for publication and wider circulation. The non-profit did not actually post the ads on Facebook after they were approved.
The non-profit organisation said campaigners picked the worst cases of hate speech they could think of, the kind Facebook should be able to detect easily. The content in the ads was not coded language or dog whistles: the text explicitly said that a certain type of person is not human and should be starved to death.
A similar test in March was based on the situation in Myanmar.
Global Witness informed Meta about the undetected violations. The parent company said the hateful content should not have been approved and pointed to the work it has done to detect such content. A week later, the non-profit repeated the exercise with two more ads. Once again, Facebook approved them.
Rosa Curling, director of Foxglove, said the only possible explanation for the approval of the hateful content is that no one is moderating it, AP reported. The blatantly violent hate speech in the ads was written in Amharic, the most widely used language in Ethiopia.
In both cases, Global Witness received identical emailed statements from Meta. The Zuckerberg-led firm said it has invested heavily in safety measures in Ethiopia, adding more staff with local expertise and building its capacity to catch hateful and inflammatory content in the country's most widely spoken languages, including Amharic.
Meta has repeatedly refused to reveal how many content moderators it employs in non-English speaking nations, including Ethiopia and Myanmar.