
Facebook apologises for mistakes in removing hate speech

San Francisco: Facebook has apologised after an investigation exposed inconsistencies by moderators in removing offensive posts reported by the social network's users.

The investigation, reported by ProPublica this week, showed that in one case Facebook censors, called content reviewers, approved a picture of a corpse with the statement "the only good muslim is a f...... dead one", while another post stating "death to the Muslims!!!" was removed.

In an analysis of 900 posts, the US-based non-profit investigative newsroom found that content reviewers at Facebook often make different calls on items with similar content, and do not always abide by the company's guidelines.

The posts were submitted to ProPublica as part of a crowd-sourced investigation into how Facebook implements its hate-speech rules.

ProPublica asked Facebook to explain its decisions on a sample of 49 items.

People who submitted these items maintained that Facebook censors had erred, mostly by failing to remove hate speech, and in some cases by deleting legitimate expression.

Facebook admitted that its reviewers had made a mistake in 22 cases, but the social network defended its rulings in 19 instances.

In six cases, Facebook said that the users had not flagged the content correctly, or the author had deleted it. In the remaining two cases, Facebook said it did not have enough information to respond.

"We're sorry for the mistakes we have made... They do not reflect the community we want to help build," Facebook Vice President Justin Osofsky was quoted as saying by ProPublica.

"We must do better," he added.

According to Osofsky, Facebook will double the size of its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018 in an effort to better enforce its rules, the report said on Thursday.
