
Making AI Unbiased Has Become Essential For Human Freedom


From recommending what you need to buy on a shopping site to who you should be friends with on a social network, Artificial Intelligence (AI) has established itself as a guiding force in our lives.

The growing importance of this technology, however, has also made people aware of the biases that have become part of it, increasing the pressure on technology companies to make amends.

Google, for example, has now established an external advisory council to help the tech giant develop AI technology in an ethical and responsible way.

E-commerce giant Amazon this month announced that it was working with the National Science Foundation (NSF), with each committing up to $10 million in research grants over the next three years focused on fairness in AI.


“We believe we must work closely with academic researchers to develop innovative solutions that address issues of fairness, transparency, and accountability and to ensure that biases in data don’t get embedded in the systems we create,” Prem Natarajan, Vice President of Natural Understanding in the Alexa AI group at Amazon, wrote in a blog post.

Biased automation tools could push disadvantaged communities further to the margins. Imagine, for example, an AI recruitment tool that considers women to be less intelligent. If a job portal employs such a tool, it is more likely to recommend males to an organisation planning to hire new people.

What about the AI assistants that we have on our devices? If Google Assistant, Siri, or for that matter Alexa, talks to us in a female voice, it could make our kids believe that women, not men, are supposed to be assistants.

 

According to a report in the investigative journalism website ProPublica, one risk assessment tool commonly used in US courtrooms was found to recommend lighter punishment for white people than for black people.


Making AI unbiased has therefore become essential for human freedom, for ensuring equal opportunities for all and for fighting discrimination. But why do AI tools show bias and reflect the prejudices that already exist in our society?

This is partly because the community that builds AI does not adequately reflect the diversity in the world. According to a 2018 World Economic Forum Report, only 22 per cent of AI professionals globally are female.


“If AI systems are built only by one representative group such as all male, all Asian or all Caucasian; then they are more likely to create biased results,” Mythreyee Ganapathy, Director, Program Management, Cloud and Enterprise, Microsoft, told IANS.

“Data sets that will be used to train AI models need to be assembled by a diverse group of data engineers. A simple example is data sets that are used to train speech AI models which focus primarily on adult speech samples unintentionally exclude children and hence the models are unable to recognise children’s voices,” she pointed out. (IANS)
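As a rough, hypothetical illustration of the skew Ganapathy describes, a team could audit how much of a speech corpus actually comes from children before training on it. The sketch below is only an assumption-laden example, not any real training pipeline: the SpeechSample fields, the sample corpus and the 20 per cent threshold are all made up for illustration.

```typescript
// Hypothetical audit of a speech corpus's age balance, illustrating the kind of
// skew described above. Field names and the 20% threshold are assumptions.
interface SpeechSample {
  speakerAgeGroup: "child" | "adult";
  durationSeconds: number;
}

// Share of total audio duration that comes from child speakers.
function childSpeechShare(samples: SpeechSample[]): number {
  const childSeconds = samples
    .filter((s) => s.speakerAgeGroup === "child")
    .reduce((sum, s) => sum + s.durationSeconds, 0);
  const totalSeconds = samples.reduce((sum, s) => sum + s.durationSeconds, 0);
  return totalSeconds === 0 ? 0 : childSeconds / totalSeconds;
}

// Usage: flag a corpus in which children's voices are badly under-represented.
const corpus: SpeechSample[] = [
  { speakerAgeGroup: "adult", durationSeconds: 540 },
  { speakerAgeGroup: "adult", durationSeconds: 420 },
  { speakerAgeGroup: "child", durationSeconds: 60 },
];

const share = childSpeechShare(corpus);
if (share < 0.2) {
  console.log(
    `Only ${(share * 100).toFixed(1)}% of audio is child speech; the trained model may struggle with children's voices.`
  );
}
```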


Tech Giant Google to Fix Loophole That Lets Sites Track Porn-viewing Habits of People


A man walks past a Google sign outside with a span of the Bay Bridge at rear in San Francisco, May 1, 2019. VOA

After facing criticism for letting third-party organisations access users’ viewing habits even while browsing in ‘Incognito’ mode, Google has said Chrome will fix a loophole that has allowed sites to detect people who are browsing the web privately.

This confirms that a loophole does indeed exist in “Incognito” mode, allowing site owners and publishers, including pornography sites, to detect when people are browsing privately.

“People choose to browse the web privately for many reasons. Some wish to protect their privacy on shared or borrowed devices, or to exclude certain activities from their browsing histories,” Barb Palser, a Partner Development Manager at Google, said in a blog post.

Chrome will remedy a loophole that has allowed sites to detect people who are browsing in ‘Incognito’ mode.

“This will affect some publishers who have used the loophole to deter metered paywall circumvention,” Palser added.

In Chrome’s ‘Incognito’ mode, the FileSystem API is disabled to avoid leaving traces of activity on someone’s device. Third-party vendors have exploited this: a site can check whether the API is available and, if the call fails, infer that the visitor is browsing privately.
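As a minimal sketch of how that detection worked before Chrome 76, a page script could simply request the non-standard webkitRequestFileSystem entry point and treat failure as a sign of a private session. The quota size, function name detectIncognito and the console message below are illustrative assumptions, not Google’s or any publisher’s actual code.

```typescript
// Minimal sketch of the pre-Chrome 76 Incognito detection trick.
// webkitRequestFileSystem is the non-standard Chrome API mentioned above;
// the 100-byte quota and the console message are illustrative assumptions.
function detectIncognito(): Promise<boolean> {
  return new Promise((resolve) => {
    const requestFs = (window as any).webkitRequestFileSystem;
    if (typeof requestFs !== "function") {
      resolve(false); // API not exposed at all: no signal either way
      return;
    }
    requestFs(
      (window as any).TEMPORARY, // temporary storage type
      100,                       // bytes of quota requested; any small value works
      () => resolve(false),      // request succeeded: likely a normal session
      () => resolve(true)        // request failed: FileSystem API disabled, likely Incognito
    );
  });
}

// Usage: a publisher could gate content or force a login for private sessions.
detectIncognito().then((isPrivate) => {
  if (isPrivate) {
    console.log("Private session detected: prompt for login or normal mode");
  }
});
```

Chrome 76’s fix simply removes this signal, so the call behaves the same way in both modes.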

FILE – The Google logo is seen at a start-up campus in Paris, France, Feb. 15, 2018. VOA

“With the release of Chrome 76 scheduled for July 30, the behaviour of the FileSystem API will be modified to remedy this method of Incognito Mode detection. Chrome will likewise work to remedy any other current or future means of Incognito Mode detection,” Google informed.

Google’s acknowledgment came after a new joint study from Microsoft, Carnegie Mellon University and the University of Pennsylvania that investigated 22,484 pornography websites using a tool called “webXray” revealed that 93 per cent of the pages track and leak users’ data to third-party organisations, even in “Incognito” mode.

Among non-pornography-specific services, Google tracks 74 per cent of the sites, Oracle 24 per cent and Facebook 10 per cent.


According to Google, the change will affect sites that use the “FileSystem API” to intercept “Incognito” mode sessions and require people to log in or switch to normal browsing mode, on the assumption that these individuals are attempting to circumvent metered paywalls.

“We suggest publishers monitor the effect of the ‘FileSystem API’ change before taking reactive measures since any impact on user behaviour may be different than expected and any change in meter strategy will impact all users, not just those using ‘Incognito’ mode,” Google explained. (IANS)