
Facebook ‘exploited’ Australian kids for advertisers

Facebook monitored the posts of Australian children and used algorithms to identify and exploit them, allowing advertisers to target them during their "most vulnerable moments", media reported, drawing criticism of the social media giant.

Facebook. Pixabay

May 02, 2017: Facebook monitored the posts of Australian children and used algorithms to identify and exploit them, allowing advertisers to target them during their “most vulnerable moments”, media reported, drawing criticism of the social media giant. A confidential 23-page Facebook document, prepared by two of the company’s top Australian executives, outlines in pinpoint detail how the social network can target “moments when young people need a confidence boost”, The Australian reported on Sunday.

Facebook collected information on users’ moods, including when they felt “worthless”, “overwhelmed” and “nervous”, and passed it on to advertisers, who used it to target them. Facebook admitted it was wrong to target the children and apologised. “We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate,” a Facebook spokeswoman told The Australian.


“While the data on which this research is based was aggregated and presented consistent with applicable privacy and legal protections, including the removal of any personally identifiable information, our internal process sets a standard higher than required by law,” she added. The tactic violates the guidelines of the Australian Code for Advertising and Marketing Communications to Children.

The revelation also points to how Facebook can be used for the kind of covert surveillance that most social networking sites claim to be fighting against. There had been rumours about Facebook’s advertising sales methods, but until now there was no proof to corroborate them. “The document is an insight on how Facebook gathers psychological insights on 6.4 million ‘high schoolers’, ‘tertiary students’ and ‘young Australians, New Zealanders… in the workforce’ to sell targeted advertising,” the report noted. (IANS)

 



Facebook Accused of Protecting Far-Right Activists Who Broke the Site's Rules

Moderators at Facebook are protecting far-right activists, preventing their Pages from being deleted even after they violate the rules set up by the social media giant, the media reported.

Moderators at Facebook are protecting far-right activists, preventing their Pages from being deleted even after they violate the community rules. Pixabay


The process, called “shielded review”, was uncovered by Channel 4's Dispatches, a documentary series that sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor.

“In the documentary, a moderator tells the ‘Dispatches’ reporter that Britain First’s pages were left up, even though they repeatedly broke Facebook’s rules, because ‘they have a lot of followers so they’re generating a lot of revenue for Facebook’,” the Guardian reported on Tuesday.

Similarly, other popular pages, including those of activists like Tommy Robinson, are shielded from Facebook's rules.

Robinson is currently in jail, serving a 13-month sentence for contempt of court.

Richard Allan, Facebook's Head of Public Policy, was quoted as saying in the documentary that the company's decisions are not based on revenue.

“If the content is indeed violating it will go,” Allan said.

Facebook, however, said it would remove Robinson's page if he repeatedly violated the site's community standards. Britain First's Facebook page was eventually banned in March 2018.

“It’s clear that some of what is shown in the programme does not reflect Facebook’s policies or values, and falls short of the high standards we expect.

Facebook, social media. Pixabay

“We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention,” Allan said.

The documentary also showed that Facebook moderators have turned a blind eye to underage accounts.

“Moderators are told they can only take action to close down the account of a child who clearly looks 10-years-old if the child actually admits in posts they are under-aged,” The Telegraph reported, citing the documentary.

“We have to have an admission that the person is under-age. If not, we just pretend that we are blind and we don’t know what underage looks like,” a trainer told the undercover reporter.

Facebook is also facing flak for launching Messenger Kids, an app that encourages children under 13 to join social media.

British Health Secretary Jeremy Hunt in December warned the social media giant to stay away from his children.

Also Read: Facebook Joins Skill India Mission to Train, Empower Youth

Earlier this year, more than 100 child health experts urged Facebook to withdraw the app.

Despite the experts' call for withdrawal, Facebook has decided to expand the reach of Messenger Kids, introducing the video calling and messaging app designed for children under 13 to families in Canada and Peru.

Facebook said it will also introduce Spanish and French versions of the app. (IANS)