India’s temporary ban on the Chinese app TikTok last month has prompted other video-sharing apps to tread with caution.
The issues that led to the ban – inciting hate and abuse, and distributing sexually explicit content – exist all over the internet. Singapore’s BIGO Technology, for one, is proactively working to plug such vulnerabilities on its own platforms. The company owns the live-streaming app BIGO LIVE and the video-sharing app LIKE, both of which are quite popular in India.
LIKE has “established comprehensive 24/7 artificial intelligence and manual monitoring to filter content,” Aaron Wei, Vice President at BIGO Technology, told Quartz. The company also has a team to “regularly monitor the quality of AI and manual monitoring results to consistently improve [its] accuracy,” he added. In addition, over three-quarters of LIKE’s 200 Indian employees are dedicated to comment and content moderation.
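Neither BIGO nor ByteDance has published technical details of these systems, but the setup Wei describes – an automated classifier flagging content, human moderators handling the uncertain cases, and a review team auditing the results – typically looks something like the hypothetical sketch below. The classifier, thresholds, and queue here are illustrative assumptions, not BIGO’s actual pipeline.

```python
# Hypothetical sketch of a hybrid AI + human moderation flow, loosely
# modelled on the setup Wei describes. The scoring function, thresholds,
# and review queue are illustrative assumptions, not BIGO's system.
import random

def ai_toxicity_score(post: str) -> float:
    """Stand-in for a trained content classifier (returns 0.0-1.0)."""
    return random.random()

BLOCK_THRESHOLD = 0.9    # confident violations are removed automatically
REVIEW_THRESHOLD = 0.5   # uncertain cases go to human moderators

human_review_queue = []
audit_log = []           # a quality team would sample this to measure accuracy

def moderate(post: str) -> str:
    score = ai_toxicity_score(post)
    if score >= BLOCK_THRESHOLD:
        decision = "blocked"
    elif score >= REVIEW_THRESHOLD:
        human_review_queue.append(post)
        decision = "sent to human review"
    else:
        decision = "published"
    audit_log.append((post, score, decision))
    return decision

for post in ["dance clip", "abusive comment", "explicit upload"]:
    print(post, "->", moderate(post))
```

In a real deployment, the audit team Wei mentions would re-label a sample of these decisions and retune the model and thresholds – the feedback loop meant to “consistently improve” accuracy.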
Meanwhile, the Chinese video-based social network Helo purged 160,000 accounts from its app shortly after TikTok was banned, even though it had not faced direct criticism over its content. Helo and TikTok are both owned by ByteDance.
But are these steps enough to plug the problem?
Big reach, big problem
As low-cost smartphones flood India and mobile data prices in the country plumb new lows, quirky video apps continue to find many takers.
“The youth enjoy showcasing their inclination towards music, dance, fashion, and entertainment, [and] the LIKE App supports such youngsters by giving them a platform,” BIGO’s Wei said. Even Bollywood celebrities such as Ranveer Singh, Sonakshi Sinha, and Disha Patani have taken to LIKE.
Within a short time, these apps have penetrated India’s internet user base in a big way. Helo, which was launched in India in June 2018, claims to have 40 million users in the country. LIKE, which made its India debut in August 2017, reportedly garnered 100 million users by November 2018.
And TikTok has a whopping 300 million users in India. Though the ban temporarily dented the app’s popularity, it is far from slowing down.
During the week it was taken down from the Apple and Google app stores in India, TikTok lost out on more than 15 million first-time users, market intelligence firm Sensor Tower estimated based on past performance and projections. But it has made a solid comeback and is now the top free app on both the Google and Apple app stores in India.
Given this popularity, experts say, these companies need to be far more accountable than they are now.
It’s on you
The kind of measures BIGO has taken for LIKE, however, have not been entirely effective in the past.
After all, TikTok had also set up a team of content moderators in India, which has grown 400% over the last year. These moderators speak a number of regional Indian languages, but inappropriate content still slips through the cracks.
And LIKE’s concern has not come out of thin air either. Although the company has not yet faced any complaints over illicit content in India – and Wei claims “TikTok and LIKE are two very different apps” – BIGO has come under fire for its BIGO LIVE app. On the streaming app, women in India reportedly dance suggestively or even offer to strip in exchange for monetary rewards.
“It’s not proper to blame the people who are creating the content,” Apar Gupta, director at the Internet Freedom Foundation, told The Quint. “If a company is itself incentivising its content production and is heavily involved in the curation and selection of the creators, I think the principal blame does rest on the company and the product rather than the people who are the creators.”
In addition, there is the problem of companies “acting like publishers” when trying to sort content, Mike Laughton, policy analyst at Access Partnership, a global public policy consultancy for the tech sector, told Quartz.
“…the ‘cure’ can be worse than the ‘disease.’ Platforms that try to filter or moderate pornographic content often find themselves filtering religious iconography, perpetuating gender biases, or undermining public sexual health campaigns by blocking access to educational resources,” he said.
One bet is for platforms to ensure safety practices are in place while handing the filtering process itself over to specialised third parties.
“Online toxicity won’t take care of itself and regulators are right to demand safeguards, particularly for minors. That’s why it’s fair to expect apps to maintain safety,” said Zohar Levkovitz, co-founder and CEO of AntiToxin Technologies, which works with apps to detect harmful digital interactions such as bullying, abusive behaviour, self-harm, and explicit content.
This article first appeared on Quartz.