
Didn’t do enough to prevent Facebook from being used to harm others: Zuckerberg

He said he was committed to getting this right, adding that the company was getting to the bottom of exactly what Cambridge Analytica did and telling everyone affected.

The social media app has been in the news for all the wrong reasons lately. VOA

Testifying before Congress for the first time, Facebook CEO Mark Zuckerberg acknowledged that the company didn’t do enough to prevent the platform from being used to harm others.

In his opening remarks, Zuckerberg said, “Facebook is an idealistic and optimistic company. For most of our existence, we focused on all of the good that connecting people can do. But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well.”

“That goes for fake news, for interference in elections, and we didn’t take a broad enough view of our responsibility, and that was a big mistake and it was my mistake and I’m sorry,” the 33-year-old executive said.

Facebook is one of the most popular apps in the US. Pixabay
Facebook needs to fix itself, Jack Ma said earlier this week. Pixabay

His apology came as Facebook remains embroiled in a widening scandal over revelations that a British data firm, Cambridge Analytica, improperly gathered detailed Facebook information on 87 million users, up from a previous estimate of more than 50 million.

“Now we have to go through our — all of our relationship with people and make sure we’re taking a broad enough view of our responsibility. It’s not enough to just connect people. We have to make sure those connections are positive. It’s not enough to give people a voice. We have to make sure people aren’t using it to harm people or spread disinformation,” he added.

He said that he was committed to getting this right. He added that the company was getting to the bottom of exactly what Cambridge Analytica did and telling everyone affected. “What we know now is that Cambridge Analytica improperly accessed information by buying it. When we first contacted Cambridge Analytica, they told us they had deleted the data,” Zuckerberg clarified.

Zuckerberg appeared before the US Congress for the hearing on Facebook’s data breach.
Zuckerberg says he takes full responsibility.

He said that the company made big changes to the platform in 2014 that would prevent this specific situation with Cambridge Analytica from occurring again today.

“But there’s more to do. My top priority has always been our social mission of connecting people, building community and bringing the world closer together,” he said. IANS



Facebook Takes Action on Terror-Related Content

Facebook took action on 1.9 million pieces of terror-related content

Facebook. Pixabay

Facebook took action on 1.9 million pieces of content related to the Islamic State (IS) and Al Qaeda in the first quarter of 2018, twice as much as the last quarter of 2017.

The key part is that Facebook found the vast majority of this content on its own.

“In Q1 2018, 99 per cent of the IS and Al Qaeda content we took action on was not user reported,” Monika Bickert, Vice President of Global Policy Management at Facebook, said in a blog post late on Monday.

“Taking action” means that Facebook removed the vast majority of this content and added a warning to a small portion that was shared for informational or counterspeech purposes.

Facebook. Pixabay

“This number likely understates the total volume, because when we remove a profile, Page or Group for violating our policies, all of the corresponding content becomes inaccessible. But we don’t go back through to classify and label every individual piece of content that supported terrorism,” explained Brian Fishman, Global Head of Counterterrorism Policy at Facebook.

Facebook now has a counter-terrorism team of 200 people, up from 150 in June 2017.


“We have built specialised techniques to surface and remove older content. Of the terrorism-related content we removed in Q1 2018, more than 600,000 pieces were identified through these mechanisms,” the blog post said.

“We’re under no illusion that the job is done or that the progress we have made is enough,” said Facebook.

“Terrorist groups are always trying to circumvent our systems, so we must constantly improve,” the company added. IANS
