
Social Media Giant Facebook Bans Canadian White Nationalism Accounts

Facebook users searching for terms associated with white supremacy are being directed to Life After Hate, an organisation set up by former violent extremists, which provides crisis intervention, education, support groups and outreach

A television photographer shoots the sign outside of Facebook headquarters in Menlo Park, Calif. VOA

In its first crackdown on white nationalism and separatism, Facebook has banned far-right political commentator Faith Goldy and several Canadian white nationalist groups from its platforms, including Instagram.

Facebook last month abandoned a long-standing policy of allowing white nationalist and separatist content on its platforms, after criticism that it had promoted hate propaganda in the wake of the New Zealand terror attacks.

BuzzFeed first reported on Goldy’s ban from the social network.

“Facebook will ban Faith Goldy, Soldiers of Odin, the Canadian Nationalist Front, and other hate groups from across its platforms,” the report said.

“Individuals and organizations who spread hate, attack, or call for the exclusion of others on the basis of who they are have no place on Facebook,” a Facebook spokesperson was quoted as saying.

“BANNED FROM @FACEBOOK & @INSTAGRAM. Somehow state media had enough advance warning to get a piece out before even I found out! Our enemies are weak & terrified. They forget most revolutions were waged before social media!” tweeted Goldy, who ran for mayor of Toronto last year.

The social networking giant came under pressure after a white man livestreamed a terror attack on two mosques in New Zealand on Facebook Live.

FILE – The Facebook app icon is shown on an iPhone in New York. VOA

The Facebook Live video of the terror attack in which 50 people were killed was viewed over 4,000 times before it was removed.

Besides streaming the 17-minute attack on the first mosque on Facebook, the attacker, Australian national Brenton Tarrant, had also posted a 70-page manifesto detailing his extreme right-wing ideology and hatred for Muslims.

Facebook said last week that it had allowed expressions of white nationalism and white separatism on its platforms because “we were thinking about broader concepts of nationalism and separatism — things like American pride and Basque separatism, which are an important part of people’s identity”.


But over the past three months, said Facebook, its conversations with members of civil society and academics have confirmed that white nationalism and white separatism cannot be meaningfully separated from white supremacy and organised hate groups.

“Going forward, while people will still be able to demonstrate pride in their ethnic heritage, we will not tolerate praise or support for white nationalism and white separatism,” Facebook said.

Facebook users searching for terms associated with white supremacy are being directed to Life After Hate, an organisation set up by former violent extremists, which provides crisis intervention, education, support groups and outreach. (IANS)

Next Story

Content Moderators on Facebook and YouTube Asked to Sign PTSD Forms

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign PTSD forms

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign forms detailing that the job may cause post-traumatic stress disorder. Pixabay

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign forms detailing that the job may cause post-traumatic stress disorder (PTSD).

According to The Financial Times and The Verge, global professional services firm Accenture, which provides content moderators for big tech firms, has asked them to sign a form explicitly acknowledging that their job could cause post-traumatic stress disorder.

Accenture runs at least three content moderation sites for Facebook in Europe, including in Warsaw, Lisbon and Dublin. A similar document was also provided by Accenture to workers at a YouTube content moderation facility in Austin, Texas. Accenture said the wellbeing of workers was a “top priority”.

facebook
Accenture runs at least three content moderation sites for Facebook in Europe, including in Warsaw, Lisbon and Dublin. Pixabay

“We regularly update the information we give our people to ensure that they have a clear understanding of the work they do,” the company said in a statement.

“According to an employee who signed one of these acknowledgment forms, every moderator at the facility was emailed a link and asked to sign immediately,” the report said.

The Accenture form says workers might review “disturbing” videos and that moderating “such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD)”. Both Facebook and Google said they did not review Accenture’s new form.

The Verge’s investigation last month into Accenture’s Austin site described hundreds of low-paid immigrant workers toiling at the facility, removing videos flagged for extreme violence and terrorist content.


“The moment they quit Accenture or get fired, they lose access to all mental health services. One former moderator for Google said she was still experiencing symptoms of PTSD two years after leaving,” the report claimed.

Last year, The Verge published a report on Facebook moderators, one of whom said he “sleeps with a gun by his side” after doing the job. (IANS)