Wednesday, September 23, 2020

Google, IBM, Microsoft AI Datasets Show Gender Bias

Google's AI models identified most women wearing masks as if their mouths were covered with duct tape

Revealing another dark side of trained Artificial Intelligence (AI) models, new research has found that Google's AI models identified most women wearing masks as if their mouths were covered with duct tape. And not just Google: when put to work, IBM's AI-powered Watson virtual assistant was not far behind on gender bias.

In 23 per cent of cases, Watson saw a woman wearing a gag, while in another 23 per cent it was sure the woman was “wearing a restraint or chains”.


To reach this conclusion, Ilinca Barsan, Director of Data Science at Wunderman Thompson Data, used 265 images of men in masks and 265 images of women in masks, of varying picture quality, mask style and context — from outdoor pictures to office snapshots, from stock images to iPhone selfies, from DIY cotton masks to N95 respirators.

Google, IBM, Microsoft AI models fail to curb gender bias

The results showed that AI algorithms are, indeed, written by “men”.

Out of the 265 images of men in masks, Google correctly identified 36 per cent as containing personal protective equipment (PPE). It also mistook 27 per cent of the images as depicting facial hair.

“While inaccurate, this makes sense, as the model was likely trained on thousands and thousands of images of bearded men.

“Despite not explicitly receiving the label man, the AI seemed to make the association that something covering the lower half of a man’s face was likely to be facial hair,” said Barsan, who deciphers data at Wunderman Thompson, a New York-based global marketing communications agency.

Beyond that, 15 per cent of images were misclassified as duct tape.

“This suggested that it may be an issue for both men and women. We needed to learn if the misidentification was more likely to happen to women,” she said in a statement.

Most interestingly (and most worryingly), the tool mistakenly labeled 28 per cent of the images of women as depicting duct tape.

At almost twice the rate for men, it was the single most common “bad guess” when labeling masks.
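Rate comparisons like these can be tallied with a simple count over each model's returned labels. The sketch below is illustrative only: the label lists are hypothetical stand-ins for real computer-vision API responses, and `label_rate` is a helper invented for this example, not part of the study.

```python
def label_rate(results, label):
    """Share of images whose returned label list contains `label`."""
    return sum(1 for labels in results if label in labels) / len(results)

# Hypothetical label outputs, one list of labels per image,
# standing in for real API responses.
men_labels = [
    ["personal protective equipment"],
    ["facial hair"],
    ["duct tape"],
    ["facial hair", "mask"],
]
women_labels = [
    ["duct tape"],
    ["duct tape"],
    ["fashion accessory"],
    ["mask"],
]

print(label_rate(men_labels, "duct tape"))    # 0.25
print(label_rate(women_labels, "duct tape"))  # 0.5
```

Comparing such per-group rates (here, duct tape appearing twice as often for the women's set) is the kind of disparity the study reports.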

When Microsoft’s Computer Vision looked at the image sets, it suggested that 40 per cent of the women were wearing a fashion accessory, and 14 per cent lipstick, instead of spotting the face masks.

“Even as a data scientist, who spends big chunks of her time scrubbing and prepping datasets, the idea of potentially harmful AI bias can feel a little abstract; like something that happens to other people’s models, and accidentally gets embedded into other people’s data products,” Barsan elaborated.

IBM Watson correctly identified 12 per cent of the men as wearing masks, while it was right only 5 per cent of the time for women.

Overall, for 40 per cent of images of women, Microsoft Azure Cognitive Services identified the mask as a fashion accessory compared to only 13 per cent of images of men.


“Going one step further, the computer vision model suggested that 14 per cent of images of masked women featured lipstick, while in 12 per cent of images of men it mistook the mask for a beard,” Barsan said.


These labels seem harmless in comparison, she added, but they are still a sign of underlying bias and of the model’s expectations about what it will and won’t see when fed the image of a woman.

“I was baffled by the duct-tape label because I’m a woman and, therefore, more likely to receive a duct-tape label back from Google in the first place. But gender is not even close to being the only dimension we must consider here,” she lamented.

The researchers wrote the machines were looking for inspiration in “a darker corner of the web where women are perceived as victims of violence or silenced.” (IANS)


