A Delhi Assembly panel on Thursday asked Facebook India to submit records of users’ complaints on the content posted on the social media platform a month before and two months after violence broke out in northeast Delhi in February 2020, reported PTI.

Fifty-three people, mostly Muslims, had died after clashes broke out between supporters of the Citizenship Amendment Act and those opposing it. The Act had triggered protests across the country as it introduced religious criteria for Indian citizenship for the first time.

After the clashes, allegations had emerged that Facebook’s laxity in implementing hate speech rules and policies on the social media platform had contributed to the violence.

On Thursday, Chairperson of the Delhi Assembly’s Peace and Harmony Committee Raghav Chadha sought the records after Facebook’s Public Policy Director Shivnath Thukral submitted a representation on the matter.

Chadha also questioned Thukral about the organisational structure, the mechanism of addressing complaints, the platform’s community standards and Facebook’s definition of hate speech.

During the proceedings, the Facebook official told the panel that while the platform was not a law enforcement agency, it had a system to cooperate with such agencies when required.

2020 Delhi riots and hate speech

Chadha asked the Facebook official to detail the steps taken to remove incendiary content from the platform, especially posts shared during the violence in Delhi last year. Thukral, however, refrained from commenting on the posts shared during the Delhi riots.

The Facebook official went on to clarify that the platform tackles the larger issue of problematic content through machine learning tools and algorithms.

“We feel our enforcement or action is actually leading to the drop in the prevalence of problematic content on our platform,” Thukral said. “The latest data shows that problematic or hate speech content is down to 0.03%...In 10,000 pieces of content that you will come across in your feed, only three will be problematic.”

When asked if Facebook India had defined hate speech in the Indian context, Thukral said that it falls into one of the “categories of our community standards” and that there are “definitions around it”.

“A platform like ours has to work on complex issues...we have to balance both free speech and privacy, safety at the same time so it is very difficult to say that the definition is always going to be perfect...it is ever-evolving,” the Facebook public policy director said.

Facebook has a structured mechanism to work with law enforcement agencies of a country to tackle any issue that pertains to real-world violence, he said.

Facebook’s functioning

Thukral said 40,000 Facebook employees work on content management on the platform, and 15,000 of them deal with content moderation. He added that content found to violate the platform’s community standards is removed immediately.

“I think most reasonable people would acknowledge that social media is being held responsible for many issues that run much deeper in society from polarised harmful content to organised crime,” Thukral said in his address. “The fact is we [Facebook] tackle the issues of hate speech and misinformation head on.”

The Facebook official said that the platform ensures transparency through its enforcement reports and self-regulates through its Oversight Board.