Tik Tok poses new risks for Net users, regulation needed

The Asian Age | Pradeep S Mehta and Udai Mehta




The ministry of electronics and information technology (Meity) recently confirmed that it would issue the notification on the Intermediary Guidelines (Amendment) Rules 2018 by January 15, 2020. The rules aim to bring accountability to digital technology companies in regulating harmful content published and transmitted through their platforms. Though all social media companies (among other stakeholders) would be anxious about it, the clock’s tick tock during the wait will probably be more resounding for Tik Tok. Despite gaining a strong foothold of 200 million users for its app in India, 120 million of whom are monthly active users, the Chinese company (ByteDance) has grabbed eyeballs in the country for all the wrong reasons.

Best known for its 15-second short-video sharing platform, Tik Tok has become a popular social network amongst netizens eager to broadcast their personal lives and share their opinions on various issues. Notably, being a video-only app with no text posts, it has become especially popular amongst the youth in peri-urban areas, among those with limited English skills and among those from the lower strata of society. In an ideal world, such uptake and usage might be seen as promoting digital inclusion. In the real world, however, it has had several adverse impacts on Indian users, from data protection concerns to moral bankruptcy.

First comes the pertinent issue of data processing: who owns the videos uploaded by users? Tik Tok has claimed exclusivity in certain instances, despite conveniently declaring itself to be a mere intermediary. Its effect on the ongoing efforts of other social media players to promote data portability may need to be analysed in greater detail. Furthermore, although Tik Tok’s personal data processing practices may not be very different from those of other social media platforms, legitimate privacy concerns and geopolitical ramifications may arise from the app’s Chinese origin. Adding fuel to such fears was recent news in which Tik Tok was accused of using the app to advance Chinese foreign policy goals beyond its borders.

Second, the app has been accused of “encouraging inappropriate or sexually explicit content” on its platform. This concern is aggravated by its popularity amongst the country’s younger population. Coupled with the lower education levels of many of its users, exposure to such videos risks breeding moral bankruptcy among the country’s youth and exposing children to sexual predators. The Madras high court, in its April 3 order banning downloads of Tik Tok, had also asked whether the government would enact a statute, like the Children’s Online Privacy Protection Act in the United States, to prevent children from becoming cyber/online victims. The proposed data protection law drafted by the Justice B.N. Srikrishna committee would prohibit technology companies from profiling, tracking, behavioural monitoring or advertising directed at children. The draft also proposed requiring parental consent for children to sign up for apps.

Third, and the gravest harm caused by Tik Tok, is the “accidental deaths” of its users. In the pursuit of fame and popularity on the app, many users are constantly, and often subconsciously, pushed to create thrilling videos, even at the cost of risking their lives. With India already topping the charts in selfie deaths, it comes as no surprise that the desire to become an influencer on Tik Tok has claimed the lives of many users.

These concerns led to the app being banned in India for a brief period. Since then, Tik Tok has taken a few steps, such as adding comment filters, framing anti-bullying guidelines and stepping up content vigilance, to curb the harms arising from its app. However, whether such self-regulation is sufficient remains questionable, given the extreme risks it aims to counter.

Keeping the above in mind, it may be safe to say that although digital tools and technologies have brought various social and economic benefits to consumers at large, there will always be some outliers for which those benefits are outweighed by the costs they impose.

With the intermediary guidelines soon to be finalised, the Meity must use this opportunity to frame stringent regulations to thwart the ill effects of apps like Tik Tok. However, it is imperative that the Meity not adopt a one-size-fits-all approach. Since the harm emanates from a select few platforms, only such apps may require stricter control and accountability, rather than the rest.

Despite citizen concerns over digital freedom, governments worldwide are moving towards greater regulatory control over content platforms. On April 3, Australia passed a law that makes social media companies liable for fines of up to 10 per cent of profits, and their executives for arrest and jail terms of up to three years, if they fail to remove “abhorrent violent material” from their platforms. Singapore has drafted a law under which Internet and social media companies could pay fines of up to $1 million, and officials who do not comply in removing fake news from their platforms could face jail of up to six years. It also mandates that those who spread fake news file a correction online.

With this, here is wishing the Meity the very best in devising an optimal, light-touch framework for regulating online content, one that keeps in mind the interests of all stakeholders while finalising the intermediary guidelines and retains sufficient scope for actionable recourse against outliers.
