Monday, March 18, 2019

Facebook Says That It Has Changed Over the Years

"We're establishing an independent body which people can use to appeal Facebook decisions involving potentially offensive content," said Facebook

A television photographer shoots the sign outside of Facebook headquarters in Menlo Park, Calif. VOA

Stressing that it is determined to keep people safe across its services, Facebook said it has identified key areas where it must do more to keep its platforms sanitised.

“We still face legitimate scrutiny, but we’re not the same company we were even a year ago,” Facebook said in a blog post on Monday.

When it comes to political interference on its platform, the social media giant said it is committed to bringing greater transparency to the ads people see on Facebook.

“This is particularly true with ads related to politics. All political ads on Facebook and Instagram in the US must now be labelled – including a ‘paid for by’ disclosure from the advertiser.

This photo shows a Facebook app icon on a smartphone in New York. VOA

“We also launched a searchable archive for political content that houses these ads for up to seven years. We’ve since expanded this feature to Brazil and the UK, and will soon do so in India,” said the company.

Beyond political and issue ads, people can now see every ad a Page is running — even if the person wasn’t targeted. People can also filter ads by country and can report an ad to Facebook.

“We have introduced new policies requiring advertisers to specify the origin of their audience’s information when they bring a customer list to us,” Facebook said.

“When something is rated ‘false’ by a fact-checker, we’re able to reduce future impressions of that content by an average of 80 per cent.”


The company said it is now detecting 99 per cent of terrorist-related content before it’s reported, 97 per cent of violence and graphic content, and 96 per cent of nudity.

On users’ privacy, Facebook said: “We know we didn’t do a good enough job securing our platform in the past.

“We now have over 30,000 people working on safety and security — about half of whom are content reviewers working out of 20 offices around the world.”


On regulation, the company said it agrees with the demand from various governments to regulate the Internet.

“We’re working with governments to improve the safety of our platform, including a recent initiative with French regulators to reduce hate speech.

“We’re establishing an independent body which people can use to appeal Facebook decisions involving potentially offensive content,” said Facebook. (IANS)


Mass Shooting in New Zealand: Facebook Still Working to Remove All Videos

The attack came during Friday prayers when the Al Noor Mosque and the nearby Linwood Mosque were filled with hundreds of worshippers. The victims of Friday's shooting included immigrants from Jordan, Saudi Arabia, Turkey, Indonesia and Malaysia.

The logo for Facebook appears on screens at the Nasdaq MarketSite in New York's Times Square in this March 29, 2018, photo. VOA

Facebook is continuing to work to remove all videos of the mass shooting in New Zealand, which the perpetrator livestreamed Friday, the company said Sunday.

“We will continue working directly with New Zealand Police as their response and investigation continues,” Mia Garlick of Facebook New Zealand said in a statement Sunday.

Garlick said that the company is currently working to remove even edited versions of the original video that do not contain graphic content, “out of respect for the people affected by this tragedy and the concerns of local authorities.”


In the 24 hours following the mass shooting, which left 50 people dead, Facebook removed 1.5 million videos of the attack, of which 1.2 million were blocked at upload, the company said.

Facebook’s most recent comments follow criticism of the platform after the shooter not only livestreamed the 17 graphic minutes of his rampage, using a camera mounted on his helmet, but also had posted a 74-page white supremacist manifesto on Facebook.

Earlier Sunday, New Zealand’s Prime Minister Jacinda Ardern told a news conference that there were “further questions to be answered” by Facebook and other social media platforms.

New Zealand’s Prime Minister Jacinda Ardern speaks on live television following fatal shootings at two mosques in central Christchurch, New Zealand, March 15, 2019. VOA

“We did as much as we could to remove or seek to have removed some of the footage that was being circulated in the aftermath of this terrorist attack. Ultimately, though, it has been up to those platforms to facilitate their removal and support their removal,” she said.

The attack came during Friday prayers when the Al Noor Mosque and the nearby Linwood Mosque were filled with hundreds of worshippers. The victims of Friday’s shooting included immigrants from Jordan, Saudi Arabia, Turkey, Indonesia and Malaysia. (VOA)