Independent body to moderate content at Facebook

San Francisco: Facebook will establish an independent body next year to oversee user appeals of content decisions, thereby acknowledging that it should not rule on matters of free expression and safety on its own, CEO Mark Zuckerberg has said.

In a 5,500-word article late on Thursday, Zuckerberg said the purpose of this body would be to uphold the principle of giving people a voice while also recognising the reality of keeping people safe.

"In the next year, we're planning to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding," he said.

The independent body will "prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight.

"Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons," he noted.

Over time, said the social media platform CEO, the body would play an important role in its overall governance.

"Just as our board of directors is accountable to our shareholders, this body would be focused only on our community," he added.

The post came in the wake of a report in The New York Times earlier this week suggesting that Facebook was aware of a Russian campaign designed to influence the 2016 US presidential election as early as the spring of that year.

Facebook later denied the claims made in the report.

According to the CEO, the team involved in its policy process and enforcement of those policies was made up of around 30,000 people, including content reviewers.

"In total, they review more than two million pieces of content every day," he said, adding that the company currently has a team of more than 200 people working on counter-terrorism specifically.

"In my note about our efforts towards 'Preparing for Elections', I discussed our work fighting misinformation. This includes proactively identifying fake accounts, which are the source of much of the spam, misinformation, and coordinated information campaigns," he wrote.

"In the last two quarters, we have removed more than 1.5 billion fake accounts," he said.
