The absence of strong Indian laws and the sheer volume of videos mean sexually explicit content is easily available to young users.
TikTok has plenty of semi-porn. Seriously.
It is hard for the more than 300 million Indian users of the popular short-video app to miss its sexually suggestive content. Worryingly, several of these videos are made by adolescents and teenagers and could potentially feed a paedophile's fantasy.
In several countries, TikTok is acting on these concerns after facing heat from lawmakers. In the US, for example, TikTok has landed in a soup for its failure to take down videos made by children under the age of 13. The Dutch Data Protection Authority in May joined other European agencies in enquiring about how TikTok handles children's data.
The company has said it is introducing new measures to protect young people on the app from content that is crass, explicit, disturbing and voyeuristic. But in India, its largest market, these measures are yet to take root. Sexually explicit content is easily accessible by simply browsing the app.
Though there are no studies centred on India, digital safety app maker Qustodio said in a recent report that children aged four to 15 in the US, UK and Spain now spend almost as much time on TikTok as they do on YouTube — an average of 80 minutes a day against 85 minutes. TikTok also drove growth in kids' social app use by 100 percent in 2019 and 200 percent in 2020, the report found.
TikTok did not comment for this article.
TikTok is hardly the only platform to host such disturbing content, but it is particularly popular among children and teenagers, who produce and consume such content habitually.
Nitish Chandan, who works with Cyber Peace Foundation, a cybersecurity think tank, said that during his interactions with school children, it was obvious that two platforms were popular — TikTok and PUBG. Some of these kids were as young as 10.
“It is a trend that happens in every generation. So during my time, it was Orkut, then Facebook. For these kids, it is TikTok and PUBG,” he said.
This should not come as a surprise, given the easy-to-use features of TikTok, a Chinese app owned by internet giant ByteDance.
Users can easily create short videos set to music and can deliver dialogues, lip-sync, dance or act using in-built visual effects. A live-streaming feature allows users to send virtual gifts, purchased with real money, to the video makers they follow.
Users need not log in to watch these videos, which are largely curated by region and updated according to the user's watch history, likes and viewing preferences.
Janice Verghese, an advocate who also works with Cyber Peace Foundation, said TikTok is easier to use than YouTube or Facebook. “Also, once you sign up there is no discrimination between kids and adults in terms of content you see.”
Booming Growth In India
Little wonder that TikTok has grown at a rocket-like pace in a short span. The app has been downloaded 323 million times in India — about 40 percent of the global aggregate — and it has nearly 120 million daily active users.
Verghese said that when the audience on a platform increases, the tendency to cross boundaries increases as well. “These kids in rural parts of India have never had a chance to be popular and TikTok has given them that opportunity. Soon they are exposed to more threats and prone to being influenced and inclined to engage in creating content that may be inappropriate,” she said.
Adolescents require parental consent to upload the videos they produce. But going by the easy availability of such videos, that rule appears to be rarely enforced.
What do India’s laws say?
“At the end of the day, it all comes down to how soon the particular content gets taken down,” said Verghese.
That is easier said than done because of the nature of Indian laws.
It is not easy to remove content, however unsettling, if it does not fall under textbook prohibited content. The Supreme Court struck down Section 66A of the IT Act, which placed restrictions on online speech.
This means that users can flag inappropriate content and it is up to TikTok to remove it based on its community guidelines. But even when flagged videos are not removed, TikTok is not liable for punitive action. The onus falls on government agencies or courts (district or high courts) to decide whether particular content violates the law and should be removed from public view, and to pass an order to that effect.
India also does not have a separate law to protect children against online exploitation, like the Children's Online Privacy Protection Act (COPPA) in the US. Unlike the European Union, India also lacks a privacy law that can protect children's data.
Probir Roy Chowdhury, a partner at J Sagar Associates, said, “This would mean that there is no mechanism other than to file a report directly with the social media platform, flagging the content, or to file a police report, in which case such platforms would have to extend cooperation to catch the perpetrator.”
With no data protection and privacy law in place, the app can collect children’s data and use it for targeted advertising without any restriction, according to him.
Currently, there are two provisions available — under the Protection of Children from Sexual Offences (POCSO) Act and Section 67B of the IT Act — under which offenders can be imprisoned for up to five years and fined up to Rs 10 lakh.
“Unfortunately, the conviction rate under this section is abysmal,” said Pavan Duggal, senior advocate specialising in cyber law.
Intermediaries such as TikTok are held liable only when child sexual abuse material (CSAM) is not taken down after being flagged. In addition, Section 79(1) of the IT Act grants them conditional immunity: they are only required to act (to delete content) upon receiving a court order or a notification from the appropriate government agency, as per the Supreme Court verdict in the Shreya Singhal case.
This makes intermediaries mute spectators. “So there is a need to revise the legal stance on intermediaries, for these (laws) have outlived their utility and intermediaries should be made to comply with the law,” said Duggal.
TikTok cannot be held liable for hosting such content because Section 79(1) of the IT Act lacks bite.
Siddharth Pillai, who works with Aarambh, a non-governmental organisation focused on children's privacy and protection online, said, “There are a lot of grey areas.” While nudity and textbook pornographic images are removed faster, other content falls into a grey area where, more often than not, no action is taken.
TikTok has separate community guidelines on nudity and the sexual exploitation of minors. Under these, content that shows private parts or depicts sexual exploitation or activity is not allowed. The rules also prohibit content that depicts erotic dances or contains sexual or erotic language involving minors.
However, privacy experts pointed out that plenty of content slips past the controls because lakhs of videos are uploaded to TikTok every day. The artificial intelligence and machine learning-based filters, one expert pointed out, look at searches and, to some extent, skin exposure in images. But other forms of content, particularly pornographic, can still be easily accessed.
TikTok recently launched Family Pairing, a feature that allows parents to control the content a teenager consumes. But the feature is unfamiliar to many parents, especially in rural areas.
Verghese of Cyber Peace Foundation explained that since a majority of users are from rural India, there is a lack of awareness about the consequences such content can have. This also makes exercising parental controls a challenge in India.
A 2013 judgement by the Delhi High Court barred children below 13 years from opening accounts on social media sites such as Facebook. TikTok too does not allow kids below 13 to sign up.
But again, it is impossible to physically verify the age of users, which explains the abundant presence of adolescents on the platform. India has no dedicated law that restricts the internet freedom of children below 13 years.