
Adding Human-Like Features to Apple Siri, Amazon Alexa May Actually Disappoint Users

During the study, Sundar found that chat bots that had human features, such as a human avatar, but lacked interactivity disappointed the people who used them


A team led by an Indian-American researcher has found that giving a human touch to chat bots like Apple Siri or Amazon Alexa may actually disappoint users.

Just giving a chat bot a human name or adding human-like features to its avatar might not be enough to win over a user if the device fails to maintain a conversational back-and-forth with that person, according to S. Shyam Sundar, co-director of the Media Effects Research Laboratory at Pennsylvania State University.

“People are pleasantly surprised when a chat bot with fewer human cues has higher interactivity,” said Sundar.

“But when there are high human cues, it may set up your expectations for high interactivity – and when the chat bot doesn’t deliver that – it may leave you disappointed,” he added. In fact, human-like features might create a backlash against less responsive human-like chat bots.


During the study, Sundar found that chat bots that had human features, such as a human avatar, but lacked interactivity disappointed the people who used them. However, people responded better to a less-interactive chat bot that did not have human-like cues.

High interactivity is marked by swift responses that match a user’s queries and feature a threaded exchange that can be followed easily.

According to Sundar, even small changes in the dialogue, like acknowledging what the user said before providing a response, can make the chat bot seem more interactive.
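The kind of acknowledgement Sundar describes can be sketched in a few lines of code. The snippet below is a minimal illustration only, not drawn from the study or from any real assistant; the acknowledge and respond helpers are hypothetical names.

# Minimal sketch of acknowledging what the user said before answering.
# Illustrative only; not taken from the study or any production chat bot.

def acknowledge(user_message: str) -> str:
    """Echo a short summary of the user's request back to them."""
    summary = user_message.strip().rstrip("?.!")
    return f'You asked about "{summary}".'

def respond(user_message: str, answer: str) -> str:
    """Prepend the acknowledgement so the exchange reads as a threaded back-and-forth."""
    return f"{acknowledge(user_message)} {answer}"

if __name__ == "__main__":
    # Low-interactivity reply: the answer alone.
    print("Store hours are 9 a.m. to 5 p.m.")
    # Higher-interactivity reply: acknowledgement first, then the answer.
    print(respond("What are your store hours?", "Store hours are 9 a.m. to 5 p.m."))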

Because there is an expectation that people may be leery of interacting with a machine, developers typically add human names to their chat bots — for example, Apple’s Siri — or programme a human-like avatar to appear when the chat bot responds to a user.


The researchers, who published their findings in the journal Computers in Human Behavior, also found that simply mentioning whether a human or a machine is involved (an identity cue) guides how people perceive the interaction.


For the study, the researchers recruited 141 participants through Amazon Mechanical Turk, a crowdsourcing site that pays people to take part in studies. Sundar said the findings could help developers improve user acceptance of chat technology. “There’s a big push in the industry for chat bots,” said Sundar.

“They’re low-cost and easy-to-use, which makes the technology attractive to companies for use in customer service, online tutoring and even cognitive therapy — but we also know that chat bots have limitations,” he added. (IANS)


Tech Giant Apple Tweaks Siri to Protect Users’ Privacy

“As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologize,” Apple added


Apple has said it will no longer retain audio recordings of interactions with its digital assistant Siri and will instead use computer-generated transcripts to help Siri improve.

“The users will be able to opt in to help Siri improve by learning from the audio samples of their requests,” Apple said in a statement late on Wednesday.

Those who choose to participate will be able to opt out at any time.

“When customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri,” said the company.

Earlier, Apple reportedly laid off 300 contractors in Cork, Ireland, who had been listening to over 1,000 Siri recordings per shift, including recordings of people having sex.

According to a report in Engadget, after suspending the Siri “grading” programme last month, the Cupertino-based iPhone maker has now terminated it.


More contractors throughout Europe may have been let go, said the report.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading.

“We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies,” Apple responded.


According to Apple, Siri uses as little data as possible to deliver an accurate result.

“Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests — less than 0.2 per cent — and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability,” said the company.

“As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologize,” Apple added. (IANS)