Even as interaction between humans and technology grows in scale, new research shows that people tend to take more risks when prodded by a robot. The study, published in the journal Cyberpsychology, Behavior, and Social Networking, found that robots can encourage people to take greater risks in a simulated gambling scenario than they would take with nothing influencing their behavior.
The researchers now believe that further studies are needed to see whether similar results would emerge from human interaction with other artificial intelligence (AI) systems, such as digital assistants or on-screen avatars.
“On the one hand, our results might raise alarms about the prospect of robots causing harm by increasing risky behavior,” said Yaniv Hanoch, Associate Professor in Risk Management at the University of Southampton in Britain.
“On the other hand, our data points to the possibility of using robots and AI in preventive programs, such as anti-smoking campaigns in schools, and with hard-to-reach populations, such as addicts.”
The research involved 180 undergraduate students taking the Balloon Analogue Risk Task (BART), a computerized assessment in which participants press the spacebar on a keyboard to inflate a balloon displayed on the screen.
With each press of the spacebar, the balloon inflates slightly and 1 penny is added to the player’s “temporary money bank”. A balloon can explode at random, in which case the player loses any money banked for that balloon; alternatively, the player can “cash in” at any point and move on to the next balloon.
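The incentive structure described above can be sketched as a small simulation. This is an illustrative model only: the article does not specify the explosion schedule, so the uniform hidden explosion point, the 128-pump ceiling, and the fixed pump-count strategy are all assumptions.

```python
import random

def play_balloon(planned_pumps, max_pumps=128, rng=random):
    """Simulate one BART balloon.

    Each pump adds 1 penny to a temporary bank, but the balloon has a
    hidden explosion point drawn uniformly from 1..max_pumps (an
    illustrative assumption; the real task's schedule may differ).
    The player pumps `planned_pumps` times, then cashes in.
    Returns pennies earned: 0 if the balloon popped first.
    """
    pop_at = rng.randint(1, max_pumps)
    if planned_pumps >= pop_at:
        return 0             # balloon exploded; temporary bank is lost
    return planned_pumps     # cashed in safely

def session(planned_pumps, balloons=30, rng=random):
    """Total pennies earned over a session with a fixed pumping strategy."""
    return sum(play_balloon(planned_pumps, rng=rng) for _ in range(balloons))
```

Under this uniform model, the expected payoff per balloon is `planned_pumps * (max_pumps - planned_pumps) / max_pumps`, which peaks at half the maximum pump count; overly cautious players leave money on the table, which is why robot-encouraged participants could earn more overall.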
One-third of the participants took the test in a room on their own (the control group), one-third took the test alongside a robot that provided only the instructions and was silent the rest of the time, and the final third, the experimental group, took the test with a robot that provided the instructions and also spoke encouraging statements such as “Why did you stop pumping?”
The results showed that the group encouraged by the robot took more risks, inflating their balloons significantly more than participants in the other groups did. They also earned more money overall. There was no significant difference in the behavior of the students accompanied by the silent robot and those with no robot.
“We saw participants in the control condition scale back their risk-taking behavior following a balloon explosion, whereas those in the experimental condition continued to take as much risk as before,” Hanoch said. “So, receiving direct encouragement from a risk-promoting robot seemed to override participants’ direct experiences and instincts.” (IANS)