Thursday April 25, 2019

Facebook Says It Should Do More to Promote Human Rights in Myanmar

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned that the company still faces several unresolved challenges in the country.

A cellphone user looks at a Facebook page at a shop in Latha street, Yangon, Myanmar. VOA

Facebook on Monday said a human rights report it commissioned on its presence in Myanmar showed it had not done enough to prevent its social network from being used to incite violence.

The report by San Francisco-based nonprofit Business for Social Responsibility (BSR) recommended that Facebook more strictly enforce its content policies, increase engagement with both Myanmar officials and civil society groups and regularly release additional data about its progress in the country.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” Alex Warofka, a Facebook product policy manager, said in a blog post.

A protester wearing a mask with the face of Facebook founder Mark Zuckerberg, flanked by men wearing angry-face emoji masks, is seen during a demonstration against Facebook outside Portcullis House in London. VOA

BSR also warned that Facebook must be prepared to handle a likely onslaught of misinformation during Myanmar’s 2020 elections, as well as new problems as use of its WhatsApp messaging service grows in the country.

A Reuters special report in August found that Facebook failed to promptly heed numerous warnings from organizations in Myanmar about social media posts fueling attacks on minority groups such as the Rohingya.

In August 2017 the military led a crackdown in Myanmar’s Rakhine State in response to attacks by Rohingya insurgents, pushing more than 700,000 Muslims to neighboring Bangladesh, according to U.N. agencies.

A man from the Rohingya community fills out an identification form provided by local police inside his shop at a camp in New Delhi. VOA


The company in August removed several Myanmar military officials from the platform to prevent the spread of “hate and misinformation,” the first time it had banned a country’s military or political leaders.

It also removed dozens of accounts for engaging in a campaign that “used seemingly independent news and opinion pages to covertly push the messages of the Myanmar military.”

The move came hours after United Nations investigators said the army carried out mass killings and gang rapes of Muslim Rohingya with “genocidal intent.”

Facebook said it has begun correcting shortcomings.

A deforested section of the Chakmakul camp for Rohingya refugees clings to a hillside in southern Bangladesh, Feb. 13, 2018. VOA

Facebook said that it now has 99 Myanmar language specialists reviewing potentially questionable content. In addition, it has expanded use of automated tools to reduce distribution of violent and dehumanizing posts while they undergo review.


In the third quarter, the company said it “took action” on about 64,000 pieces of content that violated its hate speech policies. About 63 percent were identified by automated software, up from 52 percent in the prior quarter.

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned that the company faces several unresolved challenges there.

BSR said, for example, that locating staff in Myanmar could improve Facebook’s understanding of how its services are used locally, but warned that those workers could be targeted by the country’s military, which the U.N. has accused of ethnic cleansing of the Rohingya. (VOA)


New Zealand, France Plan Effort to Stop Promotion of Terrorism, Violent Extremism on Social Media

A lone gunman killed 50 people at two mosques in Christchurch on March 15, while livestreaming the massacre on Facebook

FILE - The Facebook logo is seen on a shop window in Malaga, Spain, June 4, 2018. (VOA)

In the wake of the Christchurch attack, New Zealand said on Wednesday that it would work with France in an effort to stop social media from being used to promote terrorism and violent extremism.

Prime Minister Jacinda Ardern said in a statement that she will co-chair a meeting with French President Emmanuel Macron in Paris on May 15 that will seek to have world leaders and CEOs of tech companies agree to a pledge, called the Christchurch Call, to eliminate terrorist and violent extremist content online.

A lone gunman killed 50 people at two mosques in Christchurch on March 15, while livestreaming the massacre on Facebook.

Brenton Tarrant, 28, a suspected white supremacist, has been charged with 50 counts of murder for the mass shooting.

Students light candles as they gather for a vigil to commemorate victims of Friday’s shooting, outside the Al Noor mosque in Christchurch, New Zealand, March 18, 2019. (VOA)

“It’s critical that technology platforms like Facebook are not perverted as a tool for terrorism, and instead become part of a global solution to countering extremism,” Ardern said in the statement.

“This meeting presents an opportunity for an act of unity between governments and the tech companies,” she added.

The meeting will be held alongside the Tech for Humanity meeting of G7 digital ministers, of which France is the chair, and France’s separate Tech for Good summit, both on May 15, the statement said.

Ardern said at a press conference later on Wednesday that she had spoken with executives from a number of tech firms, including Facebook, Twitter, Microsoft and Google.

“The response I’ve received has been positive. No tech company, just like no government, would like to see violent extremism and terrorism online,” Ardern said at the media briefing, adding that she had also spoken with Facebook’s Mark Zuckerberg directly on the topic.


A Facebook spokesman said the company looks forward to collaborating with government, industry and safety experts on a clear framework of rules.

“We’re evaluating how we can best support this effort and who among top Facebook executives will attend,” the spokesman said in a statement sent by email. Facebook, the world’s largest social network with 2.7 billion users, has faced criticism since the Christchurch attack that it failed to tackle extremism.


One of the main groups representing Muslims in France has said it was suing Facebook and YouTube, a unit of Alphabet’s Google, accusing them of inciting violence by allowing the streaming of the Christchurch massacre on their platforms.

Facebook Chief Operating Officer Sheryl Sandberg said last month that the company was looking to place restrictions on who can go live on its platform based on certain criteria. (VOA)