AI-powered services from Google and IBM Watson both showed gender bias when labelling images of masked faces. Pixabay

Revealing another dark side of trained Artificial Intelligence (AI) models, new research claims that Google's vision AI labelled many images of women wearing masks as if their mouths were covered with duct tape. And not just Google: when put to the same test, the AI-powered IBM Watson virtual assistant was not far behind on gender bias.

In 23 per cent of cases, Watson saw a woman wearing a gag while in another 23 per cent, it was sure the woman was “wearing a restraint or chains”.



To reach this conclusion, Ilinca Barsan, Director of Data Science at Wunderman Thompson Data, ran 265 images of men in masks and 265 images of women in masks through the models. The pictures varied in quality, mask style and context: outdoor pictures and office snapshots, stock images and iPhone selfies, DIY cotton masks and N95 respirators.



The results showed that AI algorithms are, indeed, written by “men”.

Out of the 265 images of men in masks, Google correctly identified 36 per cent as containing personal protective equipment (PPE). It also mislabelled 27 per cent of the images as depicting facial hair.

“While inaccurate, this makes sense, as the model was likely trained on thousands and thousands of images of bearded men.

“Despite not explicitly receiving the label man, the AI seemed to make the association that something covering the lower half of a man’s face was likely to be facial hair,” said Barsan, who deciphers data at Wunderman Thompson, a New York-based global marketing communications agency.

Beyond that, 15 per cent of images were misclassified as duct tape.

“This suggested that it may be an issue for both men and women. We needed to learn if the misidentification was more likely to happen to women,” she said in a statement.

Most interesting (and worrisome), the tool mistakenly labelled 28 per cent of the images of women as depicting duct tape.

At almost twice the rate for men, duct tape was the single most common “bad guess” when labelling masks.
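Checking for this kind of disparity amounts to comparing how often a given label appears across the two image sets. A minimal sketch of that tally in Python, using made-up label sets rather than the study's actual API responses:

```python
def label_rate(results, label):
    """Fraction of images whose returned label set contains `label`."""
    hits = sum(1 for labels in results if label in labels)
    return hits / len(results)

# Hypothetical label sets a vision API might return, one set per image
# (illustrative only; not the study's data).
women_results = [
    {"duct tape", "face"},
    {"mask", "personal protective equipment"},
    {"duct tape", "skin"},
    {"fashion accessory"},
]
men_results = [
    {"facial hair", "beard"},
    {"mask", "personal protective equipment"},
    {"duct tape"},
    {"face", "skin"},
]

women_rate = label_rate(women_results, "duct tape")  # 0.5
men_rate = label_rate(men_results, "duct tape")      # 0.25
```

The same comparison, run over hundreds of real API responses per group, is what surfaces a gap like 28 per cent versus 15 per cent.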

When Microsoft’s Computer Vision looked at the image sets, it suggested that 40 per cent of the women were wearing a fashion accessory, while 14 per cent were wearing lipstick, instead of spotting the face masks.

“Even as a data scientist, who spends big chunks of her time scrubbing and prepping datasets, the idea of potentially harmful AI bias can feel a little abstract; like something that happens to other people’s models, and accidentally gets embedded into other people’s data products,” Barsan elaborated.

IBM Watson correctly identified 12 per cent of the men as wearing masks, but was right only 5 per cent of the time for women.

Overall, for 40 per cent of images of women, Microsoft Azure Cognitive Services identified the mask as a fashion accessory compared to only 13 per cent of images of men.



“Going one step further, the computer vision model suggested that 14 per cent of images of masked women featured lipstick, while in 12 per cent of images of men, it mistook the mask for a beard,” Barsan said.


These labels seem harmless in comparison, she added, but they are still a sign of underlying bias, and of the model’s expectations about what it will and won’t see when fed the image of a woman.

“I was baffled by the duct-tape label because I’m a woman and, therefore, more likely to receive a duct-tape label back from Google in the first place. But gender is not even close to the only dimension we must consider here,” she lamented.

The researchers wrote that the machines seemed to draw their inspiration from “a darker corner of the web where women are perceived as victims of violence or silenced.” (IANS)

