New Delhi: The Union Ministry of Electronics and Information Technology is preparing a report on findings regarding India in the internal documents of Facebook leaked by whistleblower Frances Haugen, The Indian Express (TIE) reported.
The findings include alleged discrepancies in algorithmic recommendations that lead new users of the platform in the country to "misinformation and hate speech", TIE says.
Sources told TIE that, if needed, the ministry would call Facebook's executives to explain how its algorithms work and what actions the company had taken so far to counter misinformation and hate speech. For the moment, however, the ministry needs to study the whistleblower's revelations.
The report is expected to be finalised this week and would detail how Facebook failed to check the spread of misinformation and hate speech in India, mainly because it lacks adequate tools to filter or flag content in Hindi and Bengali.
Sources added that the ministry's report would also likely include the findings of a Kerala-based researcher who created a user account and, through the platform's algorithmic recommendations, encountered several instances of hate speech and misinformation.
Frances Haugen, in her complaint to the United States Securities and Exchange Commission (SEC), said that despite being aware that RSS users, Groups and Pages promote fear-mongering, anti-Muslim narratives, Facebook could not take action or filter such content because it lacks the tools to moderate content in Hindi and Bengali.
Citing one of the leaked documents, titled "Adversarial Harmful Networks-India Case study", the complaint noted that "we have to put forth a nomination for designation of this group (RSS) given political sensitivities," TIE reported.
The New York Times (NYT) had reported that the company's own employees were troubled by the platform's effects on users in India, especially in the run-up to the 2019 Lok Sabha elections.
TIE says that in response to queries it sent following the NYT report, Facebook replied that the algorithmic recommendations served to the test account it had created prompted a "deeper, more rigorous analysis" of its recommendation systems in India, including the removal of marginal content and civic and political Groups from those systems.
NYT had reported that the researcher's report was one of several studies and memos written by Facebook employees, which lend weight to the criticism that the platform moved into a country without fully understanding its potential effects on local culture and politics, and then failed to deploy the resources to act on issues once they arose.