Sunday April 21, 2019

Facebook Says Fixing Mistakes, After Report Exposes Content Moderation Flaws

Facebook said it does have a process to allow for a second look at certain Pages, Profiles, or pieces of content to make sure it has correctly applied its policies

Facebook testing 'LOL' app to woo kids, experts wary. Pixabay

Facing ire over reports that it is protecting far-right activists and under-age accounts, Facebook on Wednesday said it takes these mistakes incredibly seriously and is working to prevent them from happening again.

Channel 4 Dispatches — a documentary series that sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor — showed that Facebook moderators prevented Pages belonging to far-right activists from being deleted even after they violated the rules.

In a blog post, Monika Bickert, Vice President of Global Policy Management at Facebook, said the TV report on Channel 4 in the UK has raised important questions about our policies and processes, including guidance given during training sessions in Dublin.

“It’s clear that some of what is in the programme does not reflect Facebook’s policies or values and falls short of the high standards we expect.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again,” Bickert wrote.

The documentary also showed that Facebook moderators have turned a blind eye to under-age accounts.

“Moderators are told they can only take action to close down the account of a child who clearly looks 10 years old if the child actually admits in posts that they are under-aged,” reports said, citing the documentary.

Facebook said it has immediately required all trainers in Dublin to do a re-training session — and is preparing to do the same globally.

The documentary also showed that Facebook moderators have turned a blind eye to under-age accounts. Pixabay

“We also reviewed the policy questions and enforcement actions that the reporter raised and fixed the mistakes we found,” the Facebook executive said.

In a separate letter written to Nicole Kleeman, Executive Producer at Glasgow-based Firecrest Films who raised the issues with Facebook, Bickert said a review is going on regarding training practices across Facebook contractor teams, including the Dublin-based CPL Resources, the largest moderation centre for UK content.

“In addition, in relation to the content where mistakes were clearly made, we’ve gone back and taken the correct action,” she said.

Facebook had earlier promised to double the number of people working on its safety and security teams this year to 20,000. This includes over 7,500 content reviewers.

The company said it does not allow people under 13 to have a Facebook account.

“If a Facebook user is reported to us as being under 13, a reviewer will look at the content on their profile (text and photos) to try to ascertain their age,” Bickert explained.


“If they believe the person is under 13, the account will be put on hold. This means they cannot use Facebook until they provide proof of their age. We are investigating why any reviewers or trainers at CPL would have suggested otherwise,” Bickert said.

Facebook said it does have a process to allow for a second look at certain Pages, Profiles, or pieces of content to make sure it has correctly applied its policies.

“While this process was previously referred to as ‘shield’, or shielded review, we changed the name to ‘Cross Check’ in May to more accurately reflect the process,” the company said. (IANS)


Facebook Still Hosting NZ Shooting Footage: Report

Facing flak, the social media giant is now exploring restrictions on who can use its “Facebook Live” feature

A television photographer shoots the sign outside of Facebook headquarters in Menlo Park, Calif. VOA

Despite Facebook’s claim that the livestreaming video of the March 15 Christchurch shooting that killed 50 people was removed from its platforms, sections of the raw footage are still available for users to watch, the media reported.

According to a report in Motherboard on Friday, certain videos on Facebook and Instagram show sections of the raw attack footage.

“The world’s biggest and most well-resourced social media network is still hosting copies of the violent attack video on its own platform as well as Instagram,” the report claimed.

Some of the videos are slices of the original 17-minute clip — trimmed down to a minute or so — and are open for anyone to view.

In one instance, instead of removing the video, which shows the terrorist shooting and murdering innocent civilians from a first-person perspective, Facebook has simply marked the clip as potentially containing “violent or graphic content”.

One of the clips shows the terrorist walking up to the first mosque he targeted, and opening fire. The video does not show the full attack, and stops at the 01:15 mark.

Facebook App on a smartphone device. (VOA)

A Facebook spokesperson, however, said “the video did violate our policies and has been removed”.

The Facebook livestreaming of the New Zealand terror attack sparked global outrage. The video was viewed over 4,000 times before it was removed.

The video was later shared millions of times on other social media platforms, including Twitter and YouTube.


Facing flak, the social media giant is now exploring restrictions on who can use its “Facebook Live” feature.

Earlier this month, New Zealand’s privacy commissioner John Edwards labelled Facebook “morally bankrupt pathological liars” after the social media platform’s CEO Mark Zuckerberg tried to play down the Facebook livestreaming of the Christchurch shooting. (IANS)