Facebook apologises for mistakes in removing hate speech

San Francisco: Facebook has apologised after an investigation exposed inconsistencies by moderators in removing offensive posts reported by the social network's users.

The investigation reported by ProPublica this week showed that in one case Facebook censors, called content reviewers, approved a picture of a corpse with the statement "the only good muslim is a f...... dead one" while another post stating "death to the Muslims!!!" was removed.

In an analysis of 900 posts, the US-based non-profit investigative newsroom found that content reviewers at Facebook often make different calls on items with similar content, and do not always abide by the company's guidelines.

The posts were submitted to ProPublica as part of a crowd-sourced investigation into how Facebook implements its hate-speech rules.

ProPublica asked Facebook to explain its decisions on a sample of 49 items.

People who submitted these items maintained that Facebook censors had erred, mostly by failing to remove hate speech, and in some cases by deleting legitimate expression.

Facebook admitted that its reviewers had made a mistake in 22 cases, but the social network defended its rulings in 19 instances.

In six cases, Facebook said that the users had not flagged the content correctly, or the author had deleted it. In the remaining two cases, Facebook said it did not have enough information to respond.

"We're sorry for the mistakes we have made... They do not reflect the community we want to help build," Facebook Vice President Justin Osofsky was quoted as saying by ProPublica.

"We must do better," he added.

Facebook, according to Osofsky, will double the size of its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018, in an effort to enforce its rules better, the report said on Thursday.
