
Making AI Unbiased Has Become Essential For Human Freedom



From recommending what you need to buy on a shopping site to who you should be friends with on a social network, Artificial Intelligence (AI) has established itself as a guiding force in our lives.

The growing importance of this technology, however, has also made people aware of the biases that have become part of it, increasing the pressure on technology companies to make amends.

Google, for example, has now established an external advisory council to help the tech giant develop AI technology in an ethical and responsible way.

E-commerce giant Amazon this month announced that it was working with the National Science Foundation (NSF), with each organisation committing up to $10 million in research grants over the next three years focused on fairness in AI.


“We believe we must work closely with academic researchers to develop innovative solutions that address issues of fairness, transparency, and accountability and to ensure that biases in data don’t get embedded in the systems we create,” Prem Natarajan, Vice President of Natural Understanding in the Alexa AI group at Amazon, wrote in a blog post.

Biased automation tools could push disadvantaged communities further to the margins. Imagine, for example, an AI recruitment tool that considers women to be less intelligent. If a job portal employs such a tool, it is more likely to recommend males to an organisation planning to hire new people.
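
As a rough illustration of how such a skew can be detected, the sketch below computes a simple demographic-parity gap over an invented set of model recommendations; the data, labels and metric choice are hypothetical and not drawn from any real hiring system.

```python
# Hypothetical sketch: measuring demographic parity in a recruitment model's
# output. All data below is invented for illustration only.

def selection_rate(recommended, group, target):
    """Fraction of candidates in `target` group that the model recommends."""
    decisions = [r for r, g in zip(recommended, group) if g == target]
    return sum(decisions) / len(decisions) if decisions else 0.0

# 1 = model recommends the candidate, 0 = it does not (toy data)
recommended = [1, 1, 0, 1, 0, 0, 1, 0]
gender      = ["M", "M", "F", "M", "F", "F", "M", "F"]

rate_m = selection_rate(recommended, gender, "M")
rate_f = selection_rate(recommended, gender, "F")

# A parity gap near 0 means both groups are recommended at similar rates;
# a large gap is a red flag worth investigating before deployment.
print(f"male selection rate:   {rate_m:.2f}")
print(f"female selection rate: {rate_f:.2f}")
print(f"parity gap:            {abs(rate_m - rate_f):.2f}")
```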

What about the AI assistants that we have on our devices? If Google Assistant, Siri or, for that matter, Alexa talks to us in a female voice, it could make our kids believe that women, not men, are supposed to be assistants.

According to a report by the investigative journalism website ProPublica, one risk assessment tool commonly used in US courtrooms was found to recommend lighter punishments for white people than for black people.


Making AI unbiased has therefore become essential for human freedom, for ensuring equal opportunities for all and for fighting discrimination. But why do AI tools show bias and reflect the prejudices that already exist in our society?

This is partly because the community that builds AI does not adequately reflect the diversity in the world. According to a 2018 World Economic Forum Report, only 22 per cent of AI professionals globally are female.


“If AI systems are built only by one representative group such as all male, all Asian or all Caucasian; then they are more likely to create biased results,” Mythreyee Ganapathy, Director, Program Management, Cloud and Enterprise, Microsoft, told IANS.

“Data sets that will be used to train AI models need to be assembled by a diverse group of data engineers. A simple example is data sets that are used to train speech AI models which focus primarily on adult speech samples unintentionally exclude children and hence the models are unable to recognise children’s voices,” she pointed out. (IANS)
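
As a rough illustration of the kind of audit she describes, a team could check the age mix of a speech corpus before training; the sketch below uses invented metadata records and an assumed 10 per cent threshold, and is not Microsoft's method.

```python
# Hypothetical sketch: auditing the age mix of a speech corpus before training,
# so that under-represented groups such as children are flagged early.
# Field names and the 10 per cent threshold are assumptions for illustration.
from collections import Counter

def audit_age_groups(samples, min_share=0.10):
    """Print each age group's share and flag any group below `min_share`."""
    counts = Counter(s["age_group"] for s in samples)
    total = sum(counts.values())
    for group, n in sorted(counts.items()):
        share = n / total
        flag = "  <-- under-represented" if share < min_share else ""
        print(f"{group:>6}: {share:.1%}{flag}")

# Toy metadata records standing in for a real speech data set
samples = [{"age_group": "adult"}] * 950 + [{"age_group": "child"}] * 50
audit_age_groups(samples)
```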


Researchers Develop AI Bots that Can Beat Humans in Online Multiplayer Games



Researchers have developed an artificial intelligence-enabled bot that can beat human players in a tricky online multiplayer game in which player roles and motives are kept secret, says a study.

It was presented at the International Conference on Information Systems.

The machine, called “DeepRole”, is the first gaming bot that can win online multiplayer games in which the participants’ team allegiances are initially unclear, according to the study from the Massachusetts Institute of Technology (MIT), US.

The bot is designed with novel “deductive reasoning” added into an AI algorithm commonly used for playing poker.

This helps it reason about partially observable actions, to determine the probability that a given player is a teammate or opponent. In doing so, it quickly learns whom to ally with and which actions to take to ensure its team’s victory.
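
As a rough illustration of that reasoning step, the sketch below applies a simple Bayesian update to the probability that a player is an ally after each observed action; the action likelihoods are invented, and DeepRole's actual approach combines game-tree search with deductive constraints over hidden roles.

```python
# Hypothetical sketch of the belief-update idea: keep a probability that a
# given player is an ally and revise it after every observed action.
# The likelihoods are invented; DeepRole's real method (game-tree search with
# deductive constraints over hidden roles) is considerably more involved.

def update_belief(p_ally, action, likelihoods):
    """Bayes rule: P(ally | action) from P(action | ally) and P(action | spy)."""
    p_if_ally = likelihoods[action]["ally"] * p_ally
    p_if_spy = likelihoods[action]["spy"] * (1.0 - p_ally)
    return p_if_ally / (p_if_ally + p_if_spy)

# Assumed probabilities of each observable action under each hidden role
likelihoods = {
    "approve_team": {"ally": 0.7, "spy": 0.4},
    "fail_mission": {"ally": 0.01, "spy": 0.6},
}

belief = 0.5  # start undecided about the player
for action in ["approve_team", "fail_mission"]:
    belief = update_belief(belief, action, likelihoods)
    print(f"after {action:>13}: P(ally) = {belief:.2f}")
```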

“If you replace a human teammate with a bot, you can expect a higher win rate for your team. Bots are better partners,” said study first author Jack Serrino from MIT.

The researchers pitted DeepRole against human players in more than 4,000 rounds of the online game “The Resistance: Avalon.” In this game, players try to deduce their peers’ secret roles as the game progresses, while simultaneously hiding their own roles.


As both a teammate and an opponent, DeepRole consistently outperformed human players.

“Humans learn from and cooperate with others, and that enables us to achieve together things that none of us can achieve alone,” said study co-author Max Kleiman-Weiner.

“Games like ‘Avalon’ better mimic the dynamic social settings humans experience in everyday life. You have to figure out who’s on your team and will work with you, whether it’s your first day of kindergarten or another day in your office,” Kleiman-Weiner said.


The bot is trained by playing against itself as both resistance and spy. When playing an online game, it uses its game tree to estimate what each player is going to do. (IANS)