Facebook takes tough action against Groups that spread harmful content and hate speech
San Francisco: Facebook is taking tougher action against Groups that spread harmful content, hate speech and misinformation. On Thursday, the company said that admins and moderators of Groups taken down for policy violations will be barred from creating any new Group for a period of time.
"For members who have any Community Standards violations in a Group, their posts in that Group will now require approval for the next 30 days'" said Facebook in a blog post.
"This stops their post from being seen by others until an admin or moderator approves it," said Tom Alison, VP of Engineering at Facebook.
Sometimes admins may step down or leave their Groups.
"Our proactive detection continues to operate in these Groups, but we know that active admins can help maintain the community and promote more productive conversations.
Facebook will suggest admin roles to members who may be interested. A number of factors go into these suggestions, including whether people have a history of Community Standards violations.
In the coming weeks, the social network will begin archiving Groups that have been without an admin for some time.
"Moving forward, when a single remaining admin chooses to step down, they can invite members to become admins. If no invited members accept, we will suggest admin roles to members who may be interested. If no one accepts, we'll archive the group," Facebook said.
Facebook has removed about 1.5 million pieces of content in Groups for violating its policies on organised hate.
It also removed about 12 million pieces of content in groups for violating its policies on hate speech.
"When it comes to groups themselves, we will take an entire group down if it repeatedly breaks our rules or if it was set up with the intent to violate our standards. Over the last year, we took down more than 1 million groups for violating these policies," Alison informed.
Currently, Facebook removes a Group if its admins or moderators repeatedly approve posts that violate Community Standards.
Groups that repeatedly share content rated false by fact-checkers are not recommended to other people on Facebook.
"We rank all content from these groups lower in News Feed and limit notifications so fewer members see their posts," Alison noted.
In another bid to help people get health information from authoritative sources, Facebook said it will no longer show health groups in recommendations.
(With inputs from IANS)