
Elderly And Conservatives Shared More Fake News On Facebook In 2016

The Massachusetts Institute of Technology's Deb Roy, a former Twitter chief media scientist, said the problem is that the American news diet is "full of balkanized narratives"


People over 65 and ultraconservatives shared about seven times more fake information masquerading as news on Facebook than younger adults, moderates and super liberals during the 2016 election season, a new study found.

The first major study to look at who is sharing links from debunked sites found that not many people were doing it. On average, only 8.5 percent of those studied — about 1 person out of 12 — shared false information during the 2016 campaign, according to the study in Wednesday’s issue of the journal Science Advances. But those doing it tended to be older and more conservative.

“For something to be viral, you’ve got to know who shares it,” said study co-author Jonathan Nagler, a politics professor and co-director of the Social Media and Political Participation Lab at New York University. “Wow, old people are much more likely than young people to do this.”

Battling back

Facebook and other social media companies were caught off guard in 2016 when Russian agents exploited their platforms to meddle with the U.S. presidential election by spreading fake news, impersonating Americans and running targeted advertisements to try to sway votes. Since then, the companies have thrown millions of dollars and thousands of people into fighting false information.

This Nov. 1, 2017, photo shows prints of some of the Facebook ads linked to a Russian effort to disrupt the American political process and stir up tensions around divisive social issues, released by members of the U.S. House Intelligence Committee, in Washington. According to a study published Jan. 9, 2019, in Science Advances, people over 65 and conservatives shared far more false information in 2016 on Facebook than others. VOA

Researchers at Princeton University and NYU in 2016 interviewed 2,711 people who used Facebook. Of those, nearly half agreed to share all their postings with the professors.

The researchers used three different lists of false information sites — one compiled by BuzzFeed and two others from academic research teams — and counted how often people shared from those sites. Then to double check, they looked at 897 specific articles that had been found false by fact checkers and saw how often those were spread.

All those lists showed similar trends.

When other demographic factors and overall posting tendencies are factored in, the average person older than 65 shared seven times more false information than those between 18 and 29. The seniors shared more than twice as many fake stories as people between 45 and 64 and more than three times that of people in the 30-to-44-year-old range, said lead study author Andrew Guess, a politics professor at Princeton.

The simplest theory for why older people share more false information is a lack of “digital literacy,” said study co-author Joshua Tucker, also co-director of the NYU social media political lab. Senior citizens may not tell truth from lies on social networks as easily as others, the researchers said.

A user gets ready to launch Facebook on an iPhone, in North Andover, Mass., June 19, 2017. Facebook has made changes to fight false information, including de-emphasizing proven false stories in people’s feeds so others are less likely to see them. VOA

Signaling identity

Harvard public policy and communication professor Matthew Baum, who was not part of the study but praised it, said he thought sharing false information was “less about beliefs in the facts of a story than about signaling one’s partisan identity.” That’s why efforts to correct fakery don’t really change attitudes and one reason why few people share false information, he said.

When other demographics and posting practices are factored in, people who called themselves very conservative shared the most false information, a bit more than those who identified themselves as conservative. The very conservatives shared misinformation 6.8 times more often than the very liberals and 6.7 times more than moderates. People who called themselves liberals essentially shared no fake stories, Guess said.

Nagler said he was not surprised that conservatives in 2016 shared more fake information, but he and his colleagues said that did not necessarily mean that conservatives are by nature more gullible when it comes to false stories. It could simply reflect that there was so much more pro-Donald Trump and anti-Hillary Clinton false information in circulation in 2016 that it drove the numbers for sharing, they said.

However, Baum said in an email that conservatives post more false information because they tend to be more extreme, with less ideological variation than their liberal counterparts, and they take their lead from Trump, who “advocates, supports, shares and produces fake news/misinformation on a regular basis.”

Silhouettes of laptop users are seen next to a screen projection of Facebook logo in this illustration. VOA

The researchers looked at differences in gender, race and income but could not find any statistically significant differences in sharing of false information.

Improvements

After much criticism, Facebook made changes to fight false information, including de-emphasizing proven false stories in people’s feeds so others were less likely to see them. It seems to be working, Guess said. Facebook officials declined to comment.


“I think if we were to run this study again, we might not get the same results,” Guess said.

The Massachusetts Institute of Technology’s Deb Roy, a former Twitter chief media scientist, said the problem is that the American news diet is “full of balkanized narratives” with people seeking information that they agree with and calling true news that they don’t agree with fake.

“What a mess,” Roy said. (VOA)


Facebook Offers Help To India On Fake News Traceability On WhatsApp

With India pressing for traceability of WhatsApp messages to check the spread of fake news, Nick Clegg, Facebook Vice President, Global Affairs and Communications, has offered alternative ways to help the country

Over 300 million of the 550 million smartphone and broadband users in the country are low on literacy and digital literacy. Pixabay

With India pressing for traceability of WhatsApp messages to check the spread of fake news, Nick Clegg, Facebook Vice President, Global Affairs and Communications, has offered alternative ways to help the country, without making any reference to tracing the origin of WhatsApp messages.

WhatsApp had categorically said in the past that the government’s demand to trace the origin of messages on its platform could not be met, as doing so “undermines the privacy of the people”.

Clegg, who served as the UK’s Deputy Prime Minister before joining Facebook, visited India last week and met several senior government officials, including IT Minister Ravi Shankar Prasad, and offered to assist law enforcement agencies in all possible ways, such as Artificial Intelligence-driven data analytics and access to “meta-data”.

“Facebook cares deeply about the safety of people in India and Nick’s meetings this week provided opportunities to discuss our commitment to supporting privacy and security in every app we provide and how we can continue to work productively with the government of India towards these shared goals,” a company spokesperson said in a statement.

When a message is sent from WhatsApp, the identity of the originator can also be revealed along with the message. Pixabay

Last December, the Ministry of Electronics and Information Technology (MeitY) proposed changes to Section 79 of the Information Technology (IT) Act, 2000.

The proposed regulations require a company to “enable tracing out of originators of information on its platform as required by legally authorised government agencies”.

The end-to-end encryption feature in WhatsApp makes it difficult for law enforcement authorities to find out the culprit behind a misinformation campaign.

The mobile messaging platform with over 400 million users has already called the proposed changes “overbroad”.

“Attributing messages on WhatsApp would undermine the end-to-end encryption, and its private nature, leading to possibilities of being misused,” a company spokesperson had earlier said.

WhatsApp’s parent company Facebook has over 300 million users in India.

WhatsApp stressed in February that some of the proposed government regulations for social media companies operating in India threaten the very existence of the app in its current form.

“Of the proposed regulations, the one which concerns us the most is the emphasis on traceability of messages,” Carl Woog, WhatsApp’s Head of Communications, had told IANS.

The Facebook mobile app on an Android smartphone. Wikimedia Commons

Meanwhile, Facebook has filed a petition to transfer the case on enforcing traceability of WhatsApp messages, currently sub judice in the Madras High Court, to the Supreme Court.

Tamil Nadu, however, is aiming to get Facebook’s transfer petition dismissed by the Supreme Court.

A professor at the Indian Institute of Technology (IIT)-Madras recently stressed that the issue can be easily resolved without diluting end-to-end encryption and affecting the privacy of users.

“If WhatsApp says it is not technically possible to show the originator of the message, I can show that it is possible,” said V. Kamakoti.


Under his proposal, when a message is sent on WhatsApp, the identity of the originator travels along with the message, so the message and the identity of its creator can be seen only by the recipient.

“When that recipient forwards the message, his/her identity can be revealed to the next recipient,” he said, adding that as per the court ruling, those who forward a harmful message can also be held responsible in certain cases. (IANS)
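
The scheme Kamakoti describes can be pictured with a short sketch. The Python snippet below is a toy illustration only, using hypothetical names (Envelope, send, forward) and plain fields in place of the end-to-end encryption that would protect both the message body and the sender field in practice; it is not WhatsApp’s actual protocol.

from dataclasses import dataclass

@dataclass(frozen=True)
class Envelope:
    """One delivered copy of a message (hypothetical structure)."""
    body: str          # message text (end-to-end encrypted in reality)
    sender_id: str     # who sent THIS copy; visible only to its recipient
    recipient_id: str  # who this copy was delivered to

def send(body: str, originator_id: str, recipient_id: str) -> Envelope:
    # Originate a message: the first recipient learns the originator's identity.
    return Envelope(body=body, sender_id=originator_id, recipient_id=recipient_id)

def forward(envelope: Envelope, forwarder_id: str, next_recipient_id: str) -> Envelope:
    # Forward a message: the next recipient learns who forwarded it,
    # so a harmful message can be traced back one hop at a time.
    return Envelope(body=envelope.body, sender_id=forwarder_id,
                    recipient_id=next_recipient_id)

if __name__ == "__main__":
    first_copy = send("example rumour", originator_id="alice", recipient_id="bob")
    second_copy = forward(first_copy, forwarder_id="bob", next_recipient_id="carol")
    print(first_copy.sender_id)   # bob sees "alice" as the originator
    print(second_copy.sender_id)  # carol sees "bob" as the forwarder

The point of the sketch is only that each recipient can see who sent the copy they received, so a forwarding chain could be walked back one step at a time without anyone other than the direct recipient learning a sender’s identity.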