
Facebook Says Fixing Mistakes, After Report Exposes Content Moderation Flaws

Facebook said it does have a process to allow for a second look at certain Pages, Profiles, or pieces of content to make sure it has correctly applied its policies

Facebook likely to launch camera-equipped hardware for TVs. Pixabay

Facing ire over reports that its moderators protect far-right activists and ignore under-age accounts, Facebook on Wednesday said it takes these mistakes incredibly seriously and is working to prevent such issues from happening again.

Channel 4 Dispatches, a documentary series that sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor, showed that moderators are preventing Pages run by far-right activists from being deleted even after they violate the platform's rules.

In a blog post, Monika Bickert, Vice President of Global Policy Management at Facebook, said the TV report on Channel 4 in the UK had raised important questions about the company's policies and processes, including the guidance given during training sessions in Dublin.

“It’s clear that some of what is in the programme does not reflect Facebook’s policies or values and falls short of the high standards we expect.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again,” Bickert wrote.

The documentary also showed that Facebook moderators have turned a blind eye to under-age accounts.

“Moderators are told they can only take action to close down the account of a child who clearly looks 10 years old if the child actually admits in posts that they are under-age,” reports said, citing the documentary.

Facebook said it has immediately required all trainers in Dublin to do a re-training session and is preparing to do the same globally.

The documentary also showed that Facebook moderators have turned a blind eye to under-age accounts. Pixabay

“We also reviewed the policy questions and enforcement actions that the reporter raised and fixed the mistakes we found,” the Facebook executive said.

In a separate letter to Nicole Kleeman, Executive Producer at Glasgow-based Firecrest Films, who raised the issues with Facebook, Bickert said a review of training practices is underway across Facebook's contractor teams, including Dublin-based CPL Resources, the largest moderation centre for UK content.

“In addition, in relation to the content where mistakes were clearly made, we’ve gone back and taken the correct action,” she said.

Facebook had earlier promised to double the number of people working on its safety and security teams this year to 20,000. This includes over 7,500 content reviewers.

The company said it does not allow people under 13 to have a Facebook account.

Facebook said that if a user is reported as being under 13, a reviewer will look at the content on their profile (text and photos) to try to ascertain their age.


“If they believe the person is under 13, the account will be put on hold. This means they cannot use Facebook until they provide proof of their age. We are investigating why any reviewers or trainers at CPL would have suggested otherwise,” Bickert said.

Facebook said it does have a process to allow for a second look at certain Pages, Profiles, or pieces of content to make sure it has correctly applied its policies.

“While this process was previously referred to as ‘shield’, or shielded review, we changed the name to ‘Cross Check’ in May to more accurately reflect the process,” the company said. (IANS)



Facebook Set up a War Room to Fight Election Interference

With the new ad architecture in place, people would be able to see who paid for a particular political ad

Facebook now has a War Room to fight election interference. Pixabay

In line with its efforts to prevent misuse of its platform during elections, Facebook has set up a War Room to reduce the spread of potentially harmful content.

Facebook faced flak for not doing enough to prevent the spread of misinformation by Russia-linked accounts during the 2016 US presidential election. Since then, the social networking giant has rolled out several initiatives to fight fake news and bring more transparency and accountability to its advertising.

The launch of the first War Room at its headquarters in Menlo Park, California, is part of the social network’s new initiatives to fight election interference on its platform.

Although Facebook opened the doors of the War Room ahead of the general elections in Brazil and mid-term elections in the US, it revealed the details only this week.

The goal behind setting up the War Room was to get the right subject-matter experts from across the company in one place so they can address potential problems identified by its technology in real time and respond quickly.


“The War Room has over two dozen experts from across the company – including from our threat intelligence, data science, software engineering, research, community operations and legal teams,” Samidh Chakrabarti, Facebook’s Director of Product Management, Civic Engagement, said in a statement on Thursday.

“These employees represent and are supported by the more than 20,000 people working on safety and security across Facebook,” Chakrabarti added.

Facebook said its dashboards offer real-time monitoring of key election issues, such as efforts to prevent people from voting, increases in spam, potential foreign interference, or reports of content that violates its policies.

The War Room team also monitors news coverage and election-related activity across other social networks and traditional media in order to identify what type of content may go viral.

These preparations helped a lot during the first round of Brazil’s presidential elections, Facebook claimed.

The social networking giant said its technology detected a false post claiming that Brazil’s Election Day had been moved from October 7 to October 8 due to national protests.

While untrue, that message began to go viral. But the team quickly detected the problem, determined that the post violated Facebook’s policies, and removed it in under an hour.

“And within two hours, we’d removed other versions of the same fake news post,” Chakrabarti said.


The team in the War Room, Facebook said, also helped quickly remove hate speech posts that were designed to whip up violence against people from northeast Brazil after the first round of election results were called.

“The work we are doing in the War Room builds on almost two years of hard work and significant investments, in both people and technology, to improve security on Facebook, including during elections,” Chakrabarti said.

Earlier this month Facebook said that it was planning to set up a task force comprising “hundreds of people” ahead of the 2019 general elections in India.


“With the 2019 elections coming, we are pulling together a group of specialists to work together with political parties,” Richard Allan, Facebook’s Vice President for Global Policy Solutions, told the media in New Delhi.

Facebook has also set a goal of bringing a transparency feature for political ads — now available in the US and Brazil — to India by March next year, Allan said.

With the new ad architecture in place, people would be able to see who paid for a particular political ad. (IANS)