Monday, November 19, 2018

‘White Genocide Conspiracy Theory’ a Mistake: Facebook

The Intercept report revealed that Facebook still has work to do to prevent extremist groups from spreading their hate-filled messages.

This photo shows the logo for Facebook on screens at the Nasdaq MarketSite, in New York's Times Square. VOA

Social networking giant Facebook has apologised after letting an ad campaign target its users interested in “White genocide conspiracy theory”.

News site The Intercept had no trouble launching the campaign just days after the conspiracy theory, which holds that external forces are trying to exterminate the White race, purportedly inspired the man who killed 11 Jewish worshippers at a Pittsburgh synagogue last week.

Earlier this week, The Intercept was able to select “white genocide conspiracy theory” as a pre-defined “detailed targeting” criterion on the social network to promote two articles to an interest group.

The interest group, according to Facebook, comprised 168,000 users “who have expressed an interest or like pages related to White genocide conspiracy theory”.

Facebook CEO Mark Zuckerberg testifies before a House Energy and Commerce hearing on Capitol Hill in Washington about the use of Facebook data to target American voters in the 2016 election and data privacy. VOA

The ad, provocatively labelled “White Supremacy – Test”, was manually approved by a member of Facebook’s advertising wing, the report said.

After the news site contacted Facebook for comment, company spokesperson Joe Osborne told The Intercept that the “White genocide conspiracy theory” category had been “generated through a mix of automated and human reviews, but any newly added interests are ultimately approved by people”.

“This targeting option has been removed, and we’ve taken down these ads. It’s against our advertising principles and never should have been in our system to begin with. We deeply apologize for this error,” the Facebook spokesperson said.

This is not the first time Facebook has come under the scanner for its role in promoting hate speech through ad campaigns.

A man is silhouetted against a video screen with a Facebook logo in this photo illustration. VOA

Last year, the investigative news outlet ProPublica reported that “the world’s largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of ‘Jew hater’, ‘How to burn jews’, or, ‘History of why jews ruin the world'”.



At the time, Facebook promised to explore ways to fix the problem and assured the public that it was building new guardrails into its product and review processes to filter out such ad campaigns.

The Intercept report revealed that Facebook still has work to do to prevent extremist groups from spreading their hate-filled messages. (IANS)

Copyright 2018 NewsGram

Next Story

How To Deal With Online Hate Speech: A Detailed Guide By Facebook

Critics of the company, however, said Zuckerberg hasn't gone far enough to address the inherent problems of Facebook, which has 2 billion users.

A television photographer shoots the sign outside of Facebook headquarters in Menlo Park, Calif. VOA

Facebook says it is getting better at proactively removing hate speech and changing the incentives that result in the most sensational and provocative content becoming the most popular on the site.

The company has done so, it says, by ramping up its operations so that computers can review and make quick decisions on large amounts of content, with thousands of human reviewers making more nuanced decisions.

In the future, if a person disagrees with Facebook’s decision, he or she will be able to appeal to an independent review board.

Facebook “shouldn’t be making so many important decisions about free expression and safety on our own,” Facebook CEO Mark Zuckerberg said in a call with reporters Thursday.

Facebook CEO Mark Zuckerberg delivers the keynote address at a Facebook developers conference in San Jose, California. VOA

But as Zuckerberg detailed what the company has accomplished in recent months to crack down on spam, hate speech and violent content, he also acknowledged that Facebook has far to go.

“There are issues you never fix,” he said. “There’s going to be ongoing content issues.”

Company’s actions

In the call, Zuckerberg addressed a recent story in The New York Times that detailed how the company fought back during some of its biggest controversies over the past two years, such as the revelation of how the network was used by Russian operatives in the 2016 U.S. presidential election.

The Times story suggested that company executives first dismissed early concerns about foreign operatives, then tried to deflect public attention away from Facebook once the news came out.

A Facebook panel is seen during the Cannes Lions International Festival of Creativity, in Cannes, France. VOA

Zuckerberg said the firm made mistakes and was slow to understand the enormity of the issues it faced. “But to suggest that we didn’t want to know is simply untrue,” he said.

Zuckerberg also said he didn’t know the firm had hired Definers Public Affairs, a Washington, D.C., consulting firm that spread negative information about Facebook competitors as the social networking firm was in the midst of one scandal after another. Facebook severed its relationship with the firm.

“It may be normal in Washington, but it’s not the kind of thing I want Facebook associated with, which is why we won’t be doing it,” Zuckerberg said.

Facebook posted a rebuttal to the Times story.

Content removed

Facebook said it is getting better at proactively finding and removing content such as spam, violent posts and hate speech. The company said it removed or took other action on 15.4 million pieces of violent content between June and September of this year, about double what it removed in the prior three months.

This photo shows a Facebook app icon on a smartphone in New York. VOA

But Zuckerberg and other executives said Facebook still has more work to do in places such as Myanmar. In the third quarter, the firm said it proactively identified 63 percent of the hate speech it removed, up from 13 percent in the last quarter of 2017. At least 100 Burmese language experts are reviewing content, the firm said.

One issue that continues to dog Facebook is that some of the most popular content is also the most sensational and provocative. Facebook said it now penalizes what it calls “borderline content” so it gets less distribution and engagement.

“By fixing this incentive problem in our services, we believe it’ll create a virtuous cycle: by reducing sensationalism of all forms, we’ll create a healthier, less-polarized discourse where more people feel safe participating,” Zuckerberg wrote in a post.


Critics of the company, however, said Zuckerberg hasn’t gone far enough to address the inherent problems of Facebook, which has 2 billion users.

“We have a man-made, for-profit, simultaneous communication space, marketplace and battle space and that it is, as a result, designed not to reward veracity or morality but virality,” said Peter W. Singer, strategist and senior fellow at New America, a nonpartisan think tank, at an event Thursday in Washington, D.C. (VOA)