Thursday December 13, 2018

Language: Mind your engendered step!


By Akash Shukla

Language, identity, gender and politics are interwoven in a relationship that amazingly twists the very fabric of sustainable development in education. Even though we use language constantly, we don’t normally pay a great deal of attention to it.

Be it ‘barren’, ‘bimbo’, ‘blonde’, ‘ball breaker’ or any other such word, the Indian media has long been in the line of fire for failing to escape the bias that English and Hindi carry with them. An average dictionary holds some 220 words describing women of ill repute against only 20 for men, and three male-coded words for every female one. ‘Fast’ need not refer only to speed: it also describes persons who get what they want quickly, especially in a sexual relationship, and ‘fast’ or ‘loose’ are routinely applied to women who are merely friendly with men.

Six things women must learn from men

Man teaches: Logic Lessons

“Don’t hate me for pointing this out, but life will be far easier if women understand that everything in life has logic behind it. Men follow their innate logic as they take decisions, while women are absolutely unaware about the mere existence of logic. And this makes it really hard for men to deal with women,” says TV actor Mihir Mishra for TOI.

The extract above comes from a lifestyle feature, and it harps on the logical predominance of men over women. The bullet point ‘Man teaches: Logic lessons’ reinforces the headline’s suggestion of male supremacy. When television celebrities voice such views and their opinions gain momentum in print, they tend to shape ideologies. What follows is an endless hue and cry from activists and feminists over lax and loose reporting.

‘Think global and act local’ (GLOCAL) is the way to approach and tackle such reporting. Here follows a list of headlines that should have been unprintable, in a GLOCAL context:

Picture Credit: rediff.com
  1. Virginity, a must for a happy marriage? (TOI  Life & Style)
  2. Sex on Demand. Get What You Want Everytime! (Men’s Health. Cover)
  3. Ultimate Orgasms. MAKE ‘EM STRONGER and LONGER (Women’s Health. Cover)
  4. BE A LUCKY BITCH! (Cosmopolitan. Cover)
  5. …SEX UP THE BEDROOM (Femina India. Cover)
  6. The mistress of KONKAN SPICES (Indian Express. COASTAL SOIREE)

The list is endless. These and many more top the charts and rule the roost in mainstream media and lifestyle journalism. Words such as ‘virginity’, ‘orgasms’, ‘bitch’ and ‘mistress’ are derogatory to women. Whatever the refutation, the wordplay is foul and amounts to hollow sensationalism. Every headline above projects women sexually; women are snubbed and sidelined as arrogant, as shopaholics, as sex objects. Sensationalist headlines do arrest attention, but engendered language distorts the ideology of the common reader.

Mass communication students and aspiring journalists should be taught to read between the lines and draw out the covert meaning, which often differs from its overt counterpart. The world reads a story from the publication’s perspective. Since there are no absolute facts, the version of truth tabled by the reporter must not tarnish the image of the publication or its policy.

Lastly, the media is the watchdog of society, and the only leash appropriate for it is judicious self-restraint at all times: careful, responsible reportage. Teachers must follow suit and do what is best for their students in an ever-changing teaching scenario.

 

Copyright 2015 NewsGram


Google’s New AI Tool Blocks Gendered Pronouns

Microsoft's LinkedIn said it avoids gendered pronouns in its year-old predictive messaging tool.

The Google name is displayed outside the company’s office in London, Britain. (VOA)

Alphabet Inc’s Google in May introduced a slick feature for Gmail that automatically completes sentences for users as they type. Tap out “I love” and Gmail might propose “you” or “it.” But users are out of luck if the object of their affection is “him” or “her.”

Google’s technology will not suggest gender-based pronouns because the risk is too high that its “Smart Compose” technology might predict someone’s sex or gender identity incorrectly and offend users, product leaders revealed to Reuters in interviews.

Gmail product manager Paul Lambert said a company research scientist discovered the problem in January when he typed “I am meeting an investor next week,” and Smart Compose suggested a possible follow-up question: “Do you want to meet him?” instead of “her.”

AI experts have called on the companies to display a disclaimer and multiple possible translations. (Pixabay)

Consumers have become accustomed to embarrassing gaffes from autocorrect on smartphones. But Google refused to take chances at a time when gender issues are reshaping politics and society, and critics are scrutinizing potential biases in artificial intelligence like never before.

“Not all ‘screw ups’ are equal,” Lambert said. Gender is “a big, big thing” to get wrong.

Getting Smart Compose right could be good for business. Demonstrating that Google understands the nuances of AI better than competitors is part of the company’s strategy to build affinity for its brand and attract customers to its AI-powered cloud computing tools, advertising services and hardware.

Gmail has 1.5 billion users, and Lambert said Smart Compose assists on 11 percent of messages worldwide sent from Gmail.com, where the feature first launched.

A Google logo is seen at the company’s headquarters in Mountain View, California. (VOA)

Smart Compose is an example of what AI developers call natural language generation (NLG), in which computers learn to write sentences by studying patterns and relationships between words in literature, emails and web pages.

A system shown billions of human sentences becomes adept at completing common phrases but is limited by generalities. Men have long dominated fields such as finance and science, for example, so the technology would conclude from the data that an investor or engineer is “he” or “him.” The issue trips up nearly every major tech company.
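The skew described above can be illustrated with a toy next-word predictor. This is only a sketch of how frequency-based completion inherits bias from its training data, not Google's actual system; the miniature corpus below is invented for illustration:

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus, skewed the way real-world text often is:
# "investor" co-occurs mostly with male pronouns.
corpus = [
    "the investor said he would invest",
    "the investor said he liked the pitch",
    "the investor said he wanted a meeting",
    "the investor said she would pass",
]

# Tally bigrams: for each word, count which words follow it.
bigrams = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the corpus."""
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("said"))  # "he" — three of the four sentences point that way
```

A model trained on such data is not malicious; it is simply reporting the majority pattern, which is exactly why "an investor" autocompletes to "him."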

Lambert said the Smart Compose team of about 15 engineers and designers tried several workarounds, but none proved bias-free or worthwhile. They decided the best solution was the strictest one: Limit coverage. The gendered pronoun ban affects fewer than 1 percent of cases where Smart Compose would propose something, Lambert said.
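The "limit coverage" remedy Lambert describes amounts to filtering any candidate suggestion that contains a gendered pronoun. A minimal sketch of that idea follows; the blocklist and function names are hypothetical, not Google's code:

```python
# Hypothetical blocklist of English gendered pronouns (illustrative only).
GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def filter_suggestions(candidates):
    """Drop any candidate completion that contains a gendered pronoun."""
    def is_safe(text):
        # Strip basic punctuation so "him?" still matches "him".
        tokens = {w.strip(".,!?").lower() for w in text.split()}
        return not (tokens & GENDERED_PRONOUNS)
    return [c for c in candidates if is_safe(c)]

suggestions = ["Do you want to meet him?", "Do you want to meet?"]
print(filter_suggestions(suggestions))  # ['Do you want to meet?']
```

The conservatism is the point: rather than trying to guess the right pronoun, the system simply declines to suggest one.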

“The only reliable technique we have is to be conservative,” said Prabhakar Raghavan, who oversaw engineering of Gmail and other services until a recent promotion.

Google is not the only tech company wrestling with the gender-based pronoun problem. (VOA)

New policy

Google’s decision to play it safe on gender follows some high-profile embarrassments for the company’s predictive technologies.

The company apologized in 2015 when the image recognition feature of its photo service labeled a black couple as gorillas. In 2016, Google altered its search engine’s autocomplete function after it suggested the anti-Semitic query “are jews evil” when users sought information about Jews.

Google has banned expletives and racial slurs from its predictive technologies, as well as mentions of its business rivals or tragic events.

The company’s new policy banning gendered pronouns also affected the list of possible responses in Google’s Smart Reply. That service allows users to respond instantly to text messages and emails with short phrases such as “sounds good.”

Google uses tests developed by its AI ethics team to uncover new biases. A spam and abuse team pokes at systems, trying to find “juicy” gaffes by thinking as hackers or journalists might, Lambert said.

Google CEO Sundar Pichai speaks during a news conference in New Delhi. VOA

Workers outside the United States look for local cultural issues. Smart Compose will soon work in four other languages: Spanish, Portuguese, Italian and French.

“You need a lot of human oversight,” said engineering leader Raghavan, because “in each language, the net of inappropriateness has to cover something different.”

Widespread challenge

Google is not the only tech company wrestling with the gender-based pronoun problem. Agolo, a New York startup that has received investment from Thomson Reuters, uses AI to summarize business documents.

Its technology cannot reliably determine in some documents which pronoun goes with which name. So the summary pulls several sentences to give users more context, said Mohamed AlTantawy, Agolo’s chief technology officer.

He said longer copy is better than missing details. “The smallest mistakes will make people lose confidence,” AlTantawy said. “People want 100 percent correct.”

Gmail product manager Paul Lambert said a company research scientist discovered the problem in January. (VOA)

Yet, imperfections remain. Predictive keyboard tools developed by Google and Apple Inc propose the gendered “policeman” to complete “police” and “salesman” for “sales.”

Type the neutral Turkish phrase “one is a soldier” into Google Translate and it spits out “he’s a soldier” in English. So do translation tools from Alibaba and Microsoft Corp. Amazon.com Inc opts for “she” for the same phrase on its translation service for cloud computing customers.

AI experts have called on the companies to display a disclaimer and multiple possible translations.

Microsoft’s LinkedIn said it avoids gendered pronouns in its year-old predictive messaging tool, Smart Replies, to ward off potential blunders.


Alibaba and Amazon did not respond to requests to comment. Warnings and limitations like those in Smart Compose remain the most-used countermeasures in complex systems, said John Hegele, integration engineer at Durham, North Carolina-based Automated Insights Inc, which generates news articles from statistics.

“The end goal is a fully machine-generated system where it magically knows what to write,” Hegele said. “There’s been a ton of advances made but we’re not there yet.” (VOA)