
YouTube Disabling Comments on Videos Featuring Minors

Grooming children for sex trafficking consists of convincing a child to send a risky picture and then using it to extort them

YouTube. Pixabay

In the face of allegations that YouTube is being used by paedophiles to target minors, the Google-owned video streaming platform said it is now disabling comments on videos featuring minors that could attract predatory behaviour.

Amid the ongoing controversy over paedophilic content on YouTube, several international companies, including Disney, Nestle and Fortnite maker Epic Games, pulled their advertisements from the platform this week.

Many users also alleged that YouTube had been careless in monitoring content on its platform.

YouTube, however, claimed that it has been terminating accounts and channels over the past few weeks that have violated the platform’s policies.

“No form of content that endangers minors is acceptable on YouTube, which is why we have terminated certain channels that attempt to endanger children in any way,” YouTube said in a blog post on Thursday.

“Over the past week, we disabled comments from tens of millions of videos that could be subject to predatory behaviour,” YouTube said, adding that over the next few months, it will be broadening this action to suspend comments on videos featuring young minors.

But the platform is not introducing a blanket ban on comments for all videos featuring minors.

“A small number of creators will be able to keep comments enabled on these types of videos. These channels will be required to actively moderate their comments, beyond just using our moderation tools, and demonstrate a low risk of predatory behaviour. We will work with them directly,” the company wrote.

The YouTube Music app is displayed on a mobile phone in Los Angeles. VOA

YouTube also said it has a new comments classifier in place that is more sweeping in scope and will detect and remove twice as many individual comments.

A recent study by the Human Trafficking and Social Justice Institute at the University of Toledo in Ohio, US, showed that human traffickers are exploiting social media platforms such as Facebook, Instagram and Snapchat, as well as dating apps such as Tinder, Blendr and Yellow, to hunt for potential underage victims.

Traffickers educate themselves about potential victims by studying what they post on these sites, and use that knowledge to build trust.

The study, which was requested by the Ohio Attorney General’s Human Trafficking Commission, revealed how traffickers quickly target and connect with vulnerable children on the Internet through social media.


Grooming children for sex trafficking consists of convincing a child to send a risky picture and then using it to extort them.

The traffickers use fear of repercussions to compel the youth to move from a monitored page to a less monitored one, saying, “You don’t want your parents to find out what we’re talking about”, the study suggested.

With a global user base of over 1.3 billion people, eight out of every ten people aged 18-49 watch YouTube. With the growing penetration of the Internet and smartphones, users below 18 years of age are also heavily exposed to the platform today. (IANS)


Content Moderators on Facebook and YouTube Asked to Sign PTSD Forms

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign PTSD forms

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign forms detailing that the job may cause post-traumatic stress disorder. Pixabay

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign forms detailing that the job may cause post-traumatic stress disorder (PTSD).

According to The Financial Times and The Verge, global professional services firm Accenture, which provides content moderators for big tech firms, has asked them to sign a form explicitly acknowledging that their job could cause post-traumatic stress disorder.

Accenture runs at least three content moderation sites for Facebook in Europe, including in Warsaw, Lisbon and Dublin. A similar document was also provided by Accenture to workers at a YouTube content moderation facility in Austin, Texas. Accenture said the wellbeing of workers was a “top priority”.

Accenture runs at least three content moderation sites for Facebook in Europe, including in Warsaw, Lisbon and Dublin. Pixabay

“We regularly update the information we give our people to ensure that they have a clear understanding of the work they do,” the company said in a statement.

“According to an employee who signed one of these acknowledgment forms, every moderator at the facility was emailed a link and asked to sign immediately,” the report said.

The Accenture form says workers might review “disturbing” videos and that moderating “such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD)”. Both Facebook and Google said they did not review Accenture’s new form.

The Verge’s probe into Accenture’s Austin site last month described hundreds of low-paid immigrant workers toiling to remove videos flagged for extreme violence and terrorist content.


“The moment they quit Accenture or get fired, they lose access to all mental health services. One former moderator for Google said she was still experiencing symptoms of PTSD two years after leaving,” the report claimed.

Last year, The Verge published a report on Facebook moderators, one of whom said he “sleeps with a gun by his side” after doing the job. (IANS)