Vulgar Videos Made on Chinese Social Media Apps Make Their Way to WhatsApp

These may be exciting times for Chinese short video-sharing apps that have captivated smartphone users. However, the steep rise in the popularity of apps like TikTok, Likee, Vigo Video and others has left the government as well as citizens worried for one simple reason: an unabated rise in explicit, vulgar and inappropriate videos.

These Videos Make Their Way to WhatsApp

Regrettably, the titillating videos made on these apps have now found a bigger mobile messaging medium through which to corrupt young minds: Facebook-owned WhatsApp.

WhatsApp, with more than 300 million users in the country, has become a one-stop shop for the circulation of videos showing scantily-clad girls dancing to vulgar tunes, adult jokes and explicit "entertaining" clips, many of them shot in the narrow, shabby by-lanes of small towns and uploaded to such Chinese apps.

Although tech firms claim to have smart algorithms and Artificial Intelligence (AI)-based systems, along with human moderation teams, in place to check objectionable content, such content is spreading fast.

Both WhatsApp and TikTok did not respond to questions sent to them. TikTok directed us to an earlier statement that "we are committed to continuously enhancing our safety features as a testament to our ongoing commitment to our users".

According to Pavan Duggal, the country's top cyber law expert and a senior Supreme Court advocate, the only way to stop the massive flow of vulgar videos on mobile applications is to address the issue of intermediary liability.

"Section 67 of the Information Technology Act, 2000 makes the transmission or publication in electronic form of any material which is lascivious, or which appeals to the prurient interest, or whose effect is such as to tend to deprave and corrupt those who are likely to see, read or hear the matter contained or embodied in it, an offence," said Duggal.

However, it is only a bailable offence and carries no real deterrent effect.

"The absence of any successful prosecution under Section 67 has led people to believe that they can circulate vulgar videos without any repercussions. Hence, the obligation should be placed on the service providers: the moment they are notified about any such offensive or vulgar video on their platforms, they are duty-bound to remove it," Duggal told IANS.

In the Shreya Singhal case in 2015, the Supreme Court struck down Section 66A of the Information Technology Act, 2000, which provided for the arrest of those who posted allegedly offensive content on the Internet, thereby upholding freedom of expression.

According to Duggal, who is also Chairman of the International Commission on Cyber Security Law, the restrictions imposed by the Supreme Court need to be revisited, as service providers are misinterpreting the provisions of that judgment. There have also been calls to ban TikTok, on the grounds that it spoils the future of youths and the minds of children.

For its part, TikTok says it has stopped allowing users below 13 years of age to log in and create an account on the platform. "With the help of machine learning algorithms, videos can be screened as they are posted, in some instances with objectionable content removed even before a user reports it.

"As a testament to our zero-tolerance policy on objectionable content, to date we have removed more than 6 million videos that violated our Terms of Use and Community Guidelines," it adds. Nonetheless, given the rate at which such inappropriate videos are being produced, these efforts are not enough.

"The government must amend the Information Technology Act, 2000 to explicitly make tech companies liable for corrupting the minds of young people who are influenced by such explicit videos and may go on to commit crimes," Duggal emphasised.

Failure to comply with the norms should attract severe punishment of five to seven years in prison and a fine of Rs 20-30 lakh for tech companies, so as to create an appropriate deterrent effect, the Supreme Court advocate noted.