
Indian Artists Demand Internet Giants to Change Stance on Nudity

Its policy team, along with artists, art educators, museum curators, activists as well as Facebook employees, has decided to examine how to better serve artists, including considering a new approach to nudity guidelines

FILE - Attendees walk past a Facebook logo during Facebook Inc's F8 developers conference in San Jose, California, United States. VOA

By Radhika Parashar

As artists around the world take to the streets against social networking platforms’ policies on art-based nudity, which they consider unfair, India’s artist community has come out in their support, detailing their own experiences and demanding that the digital platforms draw a clear distinction between vulgarity and art.

In a protest this month, nearly 100 people stripped naked in front of Facebook’s New York headquarters, holding pictures of nipples and demanding the right to showcase artistic nudity on the popular apps. The campaign was outlandishly titled #WeTheNipple.

Another protest, in June, saw international porn artists gather outside Instagram’s Silicon Valley headquarters, describing the nudity-censorship rules of Facebook and its family of apps as “vague, inconsistent and threatening to their livelihood”.

The move has been hailed by photographers, painters, models and screen artists from around the world, including in India.

“I obviously understand that Facebook and Instagram want to avoid ‘vulgar’ content on their platforms, but scrapping art-based nudity affects artists very seriously,” fashion photographer Soumya Iyer told IANS.

“Just like everyone else, we also want to showcase our work and build our own brand on Facebook and Instagram because of their global reach and popularity, but it’s sad that fine art is neither accepted nor respected,” said Rohan Tulpule, a fine-art photographer who has repeatedly faced the consequences of what he calls ‘unfair’ censorship of his work.

Recalling her own experience, Iyer said: “My series called ‘Gender of Beauty’ was taken down because of the display of nipples. Instagram is such a huge platform and artists can really make use of its power of engagement! I don’t understand why the display of a woman’s nipple has become a matter of shame.”

Known for his bold photo-series like “Life Through Holes” and “The Plus Size Of Life,” Tulpule added: “Hashtags help us increase our reach but if we use hashtags like #fineart and #nudephotograph, our post comes into notice and gets deleted under ‘policy violation’. We get restricted from all activities on the platforms. The platforms have to understand that all nudity is not vulgarity”.

FILE – The WhatsApp app logo is seen on a smartphone in this picture illustration. VOA

According to conceptual performance artist Inder Salim, “nothing is more scary when imaginary uniform sets of rules are imposed on all to suppress all those atavistic tendencies in us.”

As part of its community guidelines, Facebook-owned Instagram says that “for a variety of reasons, we don’t allow nudity” on the platform — including “photos, videos, and some digitally-created content that show sexual intercourse, genitals and close-ups of fully-nude buttocks. It also includes some photos of female nipples.”

However, the platform does not spell out exactly which “some photos” of female nipples fall foul of the rule.

Last week, Facebook was slammed for banning the cover of Grammy-nominated British rock band Led Zeppelin’s 1973 album “Houses of the Holy”, which features nude children. Facebook later restored the image, admitting that it was “culturally significant”.


Actor-model Milind Soman, who stirred major controversy after stripping naked for a photo shoot back in 1995, agreed that social media policies that fail to distinguish between artistic and vulgar nudity are unfavourable for artists in the digital era.

“It’s their platform, their policies and their call. What is artistic and what is vulgar on their platform is up to them to decide because after all, it is their business to be profitable (first),” Soman told IANS.

The global demonstrations have convinced Facebook to re-think its stance on artistic nudity.

Its policy team, along with artists, art educators, museum curators, activists as well as Facebook employees, has decided to examine how to better serve artists, including considering a new approach to nudity guidelines. (IANS)


Facebook Shares Data on Child Nudity, Terrorism, Drug Sales on Instagram

On spread of hate speech on its platforms, Facebook said it can detect such harmful content before people report it and, sometimes, before anyone sees it

The social media application Facebook is displayed on Apple's App Store, July 30, 2019. VOA

Facebook has shared for the first time data on how it takes action against child nudity and child sexual exploitation, terrorist propaganda, illicit firearm and drug sales, and suicide and self-injury on its photo-sharing app Instagram.

In Q2 2019, Facebook removed about 512,000 pieces of content related to child nudity and child sexual exploitation on Instagram.

“In Q3 (July-September period), we saw greater progress and removed 754,000 pieces of content, of which 94.6 per cent we detected proactively,” Guy Rosen, VP Integrity, said in a statement on Wednesday.

Ironically, Instagram, like Facebook, has also become a platform for such content.

“For child nudity and sexual exploitation of children, we made improvements to our processes for adding violations to our internal database in order to detect and remove additional instances of the same content shared on both Facebook and Instagram,” Rosen explained.

In its “Community Standards Enforcement Report, November 2019,” the social networking platform said it continues to detect and remove content associated with Al Qaeda, ISIS and their affiliates on Facebook at a rate above 99 per cent.

“The rate at which we proactively detect content affiliated with any terrorist organisation on Facebook is 98.5 per cent and on Instagram is 92.2 per cent,” the company said.

FILE – The Instagram icon is displayed on a mobile screen in Los Angeles. VOA

In the area of suicide and self-injury, Facebook took action on about 2 million pieces of content in Q2 2019.

“We saw further progress in Q3 when we removed 2.5 million pieces of content, of which 97.3 per cent we detected proactively.

“On Instagram, we saw similar progress and removed about 835,000 pieces of content in Q2 2019, of which 77.8 per cent we detected proactively, and we removed about 845,000 pieces of content in Q3 2019, of which 79.1 per cent we detected proactively,” said Rosen.

In Q3 2019, Facebook removed about 4.4 million pieces of drug sale content. It removed about 2.3 million pieces of firearm sales content in the same period.


On Instagram, the company removed about 1.5 million pieces of drug sale content and 58,600 pieces of firearm sales content.

On spread of hate speech on its platforms, Facebook said it can detect such harmful content before people report it and, sometimes, before anyone sees it.

“With these evolutions in our detection systems, our proactive rate has climbed to 80 per cent, from 68 per cent in our last report, and we’ve increased the volume of content we find and remove for violating our hate speech policy,” said Rosen. (IANS)