Monday May 20, 2019

Know How U.S. is Keeping A Check on Social Media’s Spread of Fake News

As Notre Dame Cathedral burned, a posting on Facebook circulated – a grainy video of what appeared to be a man in traditional Muslim garb up in the cathedral.

Fact-checkers worldwide jumped into action, pointing out that the video and postings were fake, and the posts never went viral.

But this week, the Sri Lanka government temporarily shut down Facebook and other sites to stop the spread of misinformation in the wake of the Easter Sunday bombings in the country that killed more than 250 people. Last year, misinformation on Facebook was blamed for contributing to riots in the country.

Facebook, Twitter, YouTube and others are increasingly being held responsible for the content on their sites as the world grapples with events as they unfold. From lawmakers to the public, there has been a rising cry for the sites to do more to combat misinformation, particularly when it targets certain groups.

Shift in sense of responsibility

For years, some critics of social media companies, such as Twitter, YouTube and Facebook, have accused them of having done the minimum to monitor and stamp out misinformation on their platforms. After all, the internet platforms are generally not legally responsible for the content there, thanks to a 1996 U.S. federal law, Section 230 of the Communications Decency Act, which says they are not the publishers of what their users post. This law has been held up as a key protection for free expression online.

And that legal protection has been key to the internet firms’ explosive growth. But there is a growing consensus that companies are ethically responsible for misleading content, particularly if the content has an audience and is being used to target certain groups.

An Indian man browses through the Twitter account of Alt News, a fact-checking website, in New Delhi, India, April 2, 2019. VOA

Tuning into dog whistles

At a recent House Judiciary Committee hearing on white supremacy and hate crimes, Congresswoman Sylvia Garcia, a Texas Democrat, questioned representatives from Facebook and Google about their policies.

“What have you done to ensure that all your folks out there globally know the dog whistles, know the keywords, the phrasing, the things that people respond to, so we can be more responsive and be proactive in blocking some of this language?” Garcia asked.

Each company takes a different approach.

Facebook, which perhaps has had the most public reckoning over fake news, won’t say it’s a media company. But it has taken partial responsibility for the content on its site, said Daniel Funke, a reporter at the International Fact-Checking Network at the Poynter Institute.

The social networking giant uses a combination of technology and humans to address false posts and messages that appear to target groups. It is collaborating with outside fact-checkers to weed out objectionable content, and has hired thousands to grapple with content issues on its site.

Swamp of misinformation

Twitter has targeted bots, automated accounts that spread falsehoods. But fake news often is born on Twitter and jumps to Facebook.

“They’ve done literally nothing to fight misinformation,” Funke said.

YouTube, owned by Google, has altered its algorithms to make problematic videos harder to find and to push relevant factual content higher in search results. YouTube is “such a swamp of misinformation just because there is so much there, and it lives on beyond the moment,” Funke said.

Other platforms of concern are Instagram and WhatsApp, both owned by Facebook.

Some say what the internet companies have done so far is not enough.

“To use a metaphor that’s often used in boxing, truth is against the ropes. It is getting pummeled,” said Sam Wineburg, an education professor at Stanford University.

What’s needed, he said, is for the companies to take full responsibility: “This is a mess we’ve created and we are going to devote resources that will lower the profits to shareholders, because it will require a deeper investment in our own company.”

FILE - An activist wearing a Facebook CEO Mark Zuckerberg mask stands outside Portcullis House in Westminster as an international committee of parliamentarians met for a hearing on the impact of disinformation on democracy in London. VOA

Fact-checking and artificial intelligence

One of the fact-checking organizations that Facebook works with is FactCheck.org. It receives misinformation posts from Facebook and others. Its reporters check out the stories, then report on their own site whether the information is true or false. That information goes back to Facebook as well.

Facebook is “then able to create a database now of bad actors, and they can start taking action against them,” said Eugene Kiely, director of FactCheck.org. Facebook has said it will make it harder to find posts by people or groups that continually post misinformation.

The groups will see fewer financial incentives, Kiely points out. “They’ll get less clicks and less advertising.”

Funke predicts companies will use technology to semi-automate fact-checking, making it better, faster and able to match the scale of misinformation.

That will cost money, of course.

It also could slow the internet companies’ growth.

Does being more responsible mean making less money? Social media companies are likely to find out. (VOA)

Social Media Giant’s CEO Mark Zuckerberg Rejects The Claim ‘Time To Break Up Facebook’

Facebook CEO Mark Zuckerberg has rejected the call for breaking up his company, saying the size of Facebook was actually a benefit to its users and for the security of the democratic process.

In an interview with French broadcaster France 2, Zuckerberg dismissed the claim made by his longtime friend and Facebook co-founder Chris Hughes that it is time to break up Facebook because Zuckerberg wields “unchecked power and influence” far beyond that of anyone else in the private sector or in the government.

“When I read what he wrote, my main reaction was that what he’s proposing that we do isn’t going to do anything to help solve those issues.

“So I think that if what you care about is democracy and elections, then you want a company like us to be able to invest billions of dollars per year like we are in building up really advanced tools to fight election interference,” Zuckerberg told France 2 while in Paris to meet with French President Emmanuel Macron.

In an opinion piece in The New York Times on Thursday, Hughes said the government must hold Mark (Zuckerberg) accountable.

“Mark’s personal reputation and the reputation of Facebook have taken a nose-dive,” wrote Hughes, who during his freshman year at Harvard University in 2002 was recruited by Zuckerberg for Facebook.

Zuckerberg said that Facebook’s budget for safety this year is bigger than the whole revenue of the company when it went public earlier this decade.

“A lot of that is because we’ve been able to build a successful business that can now support that. You know, we invest more in safety than anyone in social media,” reported TechCrunch, quoting Zuckerberg.

Hughes wrote that Zuckerberg has surrounded himself with a team that reinforces his beliefs instead of challenging them.

“Mark is a good, kind person. But I’m angry that his focus on growth led him to sacrifice security and civility for clicks,” he wrote.

In a separate opinion piece in the NYT on Sunday, Nick Clegg, Facebook’s vice president for global affairs and communications, said that success should not be penalised.

“Facebook shouldn’t be broken up but it does need to be held to account,” Clegg wrote.

“Hughes maintains that lawmakers merely marvel at Facebook’s explosive growth and have overlooked their own responsibility to protect the public through more competition.

“This argument holds dangerous implications for the American technology sector, the strongest pillar of the economy. And it reveals misunderstandings of Facebook and the central purpose of antitrust law,” Clegg argued.

Embroiled in users’ data scandals, Facebook is set to create new privacy positions within the company that would include a committee, an external evaluator and a chief compliance officer.

Facebook has already set aside $3 billion in anticipation of a record fine from the US Federal Trade Commission (FTC) over the Cambridge Analytica data scandal, which involved the data of 87 million users.

The Facebook case is being looked at as a measure of the Donald Trump administration’s willingness to regulate US tech companies. (IANS)