Thursday August 16, 2018

Government Pressure: WhatsApp to Limit Message Forwarding in India


Hauled up a second time by the government over its failure to check the spread of fake and provocative content on its platform amid growing lynching episodes, WhatsApp on Friday said it is launching a test to limit message forwarding to five chats for its users in India.

In its second notice on Thursday, the Ministry of Electronics and IT (MeitY) took a tough stand, asking WhatsApp to come out with more effective solutions that can bring in accountability and facilitate enforcement of law in addition to their efforts towards labelling forwards and identifying fake news.

“It has been conveyed to them in unmistakable terms that it is a very serious issue which deserves a more sensitive response,” MeitY said in the notice.

WhatsApp reacted: “In India, where people forward more messages, photos, and videos than any other country in the world, we’ll also test a lower limit of 5 chats at once.

“We will also remove the quick forward button next to media messages,” WhatsApp said in a statement.

The test, once implemented, will curtail WhatsApp’s services for over 200 million users in India. Globally, the company allows users to forward messages to up to 20 chats (either individuals or groups).

WhatsApp said the new changes, which it will continue to evaluate, “will help keep WhatsApp the way it was designed to be: a private messaging app”.

“We are deeply committed to your safety and privacy which is why WhatsApp is end-to-end encrypted, and we’ll continue to improve our app with features like this one,” it added.


In its first reply to the IT Ministry, WhatsApp said the company is “horrified” by terrible acts of violence.

The IT Ministry had asked WhatsApp to ensure that the platform is not used for mala fide activities, amid growing instances of lynching of innocent people triggered by irresponsible, rumour-filled messages circulated on its platform.

The mobile messaging service listed several measures — including labelling forwarded messages — in its first reply to control the spread of misinformation and abuse on its platform, but failed to meet the IT Ministry’s requirements.

Several people have lost their lives to lynch mobs in the past year after rumours of child lifting spread via messages on WhatsApp.

Union Home Minister Rajnath Singh for the first time admitted in the Lok Sabha on Thursday that fake news on social media has resulted in many mob lynching incidents in the country, saying the government has asked service providers to put a check on rumour mongering on social media.


Expressing concern over the misuse of social media, Rajya Sabha Chairman M. Venkaiah Naidu also asked the government to evolve a national policy to combat the menace after discussions with all stakeholders, including political parties.

The Supreme Court also issued 22 guidelines this week for the central and state governments to put an end to “horrendous acts” of vigilantism, lynching and mobocracy and directed them to work in tandem to take “preventive, remedial and punitive measures”. (IANS)

Copyright 2018 NewsGram


Facebook Won’t Remove Content for Being False


Even as the world takes painful note of the dangers of fake news spreading on social media platforms, Facebook has said that it does not remove content simply for being false.

While the social network has rules in place against hate speech and takes personal attacks seriously, false content does not face censorship on its platform, Monika Bickert, Facebook’s Head of Global Policy Management, said on Thursday while participating in “Hard Questions”, a series that explores the most challenging issues Facebook confronts.

“We don’t allow hate speech on Facebook because it creates an environment where people feel personally attacked, where they won’t feel comfortable coming and sharing themselves,” Bickert said.

“The one thing that we don’t remove is where someone simply asserts something false,” she said, adding that Facebook tries to counter the virality of such content or tries to promote or make visible other views.


“Even if it is a horrible assertion of falsity, whether it’s about the Holocaust or any other world event, we don’t remove content simply for being false,” Bickert added.

The statement is significant at a time when rumours on social media platforms have been linked to real-world violence in several countries, including India.


Facebook, Bickert said, also considers local regulations while blocking content on the platform.

“And we also block the speech where countries have told us, ‘this is illegal in our country’, then we will remove that speech in that country alone,” she said. (IANS)