Thursday July 18, 2019

Social Robots for Hospitalized Children

Social robots are the best gift Artificial Intelligence could ever give us


Social robots are the best gift Artificial Intelligence could ever give us. These robots are increasingly used in medical and educational contexts. Over the past 20 years, social robotics has grown into an emerging field with much still to be worked out, drawing on mechanics, control, artificial intelligence, psychology, design, ethics, and more.

A study by MIT researchers investigated different scenarios in therapy and education where social robots could be a useful tool for children. The study demonstrated that social robots, when used in support sessions in pediatric hospital units, elicit positive emotions in sick children. They can help reduce a sick child's anxiety, pain, and other distress in a hospital setting.

To conduct their study, the researchers deployed a robotic teddy bear named Huggable across different pediatric units at Boston Children's Hospital. Huggable, a plush teddy bear first developed in 2006, has a screen depicting animated eyes, and its facial expressions and body language can be controlled by a specialist. The specialists could also talk to children through a speaker, with their voice automatically shifted to a higher pitch so that it sounds more childlike.
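The article does not describe how Huggable's pitch shifting is implemented, but the basic idea behind raising a voice's pitch can be sketched with a simple resampling trick: playing a compressed signal back at the original rate raises its frequency content. This is an illustration only, not the Huggable system's actual audio pipeline; a production voice changer would pair it with time-stretching (e.g., a phase vocoder) so the speech rate stays unchanged.

```python
import numpy as np

def pitch_up(signal: np.ndarray, factor: float) -> np.ndarray:
    """Raise pitch by a given factor via resampling.

    Note this also shortens duration by the same factor; real voice
    changers combine it with time-stretching to preserve speech rate.
    """
    n_out = int(len(signal) / factor)
    old_idx = np.arange(len(signal))
    new_idx = np.linspace(0, len(signal) - 1, n_out)
    return np.interp(new_idx, old_idx, signal)

# A 440 Hz tone sampled at 8 kHz, shifted up by a factor of 1.5
# sounds at roughly 660 Hz when played back at the same rate.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
shifted = pitch_up(tone, 1.5)
```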


For the research, the team divided over 50 hospitalized children into three groups. The first group had Huggable, the second had a tablet-based virtual Huggable, and the third had a traditional plush teddy bear.

The results showed that a much higher percentage of children enjoyed playing with Huggable than with the virtual or traditional teddy bear. The children were seen to get out of bed and move around more, and also to connect emotionally with the robot, asking it personal questions and inviting it to come back later to meet their families. Such improved emotional, physical, and verbal outcomes are all positive factors that could contribute to better and faster recovery in hospitalized children.

Although the study was small, it was the very first to explore social robotics in a real-world inpatient pediatric setting with ill children. Earlier studies had been conducted in labs with very few children, or in public settings without any patient identification.

An important thing to keep in mind here is that Huggable is designed only to assist health care specialists and not replace them. “It’s a companion,” says co-author Cynthia Breazeal, an associate professor of media arts and sciences and founding director of the Personal Robots group. “Our group designs technologies with the mindset that they’re teammates. We don’t just look at the child-robot interaction. It’s about helping specialists and parents because we want technology to support everyone who has invested in the quality care of a child.”


The study also generated valuable insights to develop a fully autonomous Huggable robot, which is the researchers’ ultimate goal. The researchers were able to determine which physical gestures are used most often, and which features specialists may want for future iterations. For instance, Huggable could introduce doctors before they enter a child’s room or learn about their interests and share that information with specialists. The researchers may also equip the robot with computer vision, so that it can detect certain objects in a room and talk about those with children to make them feel secure.


In the future, an automated robot could be used to improve continuity of care. A child would take the social robot home after a hospital visit to further support engagement, adherence to care regimens, and monitoring of well-being.

Next, the researchers hope to zero in on which specific patient populations may benefit the most from the Huggable interventions. "We want to find the sweet spot for the children who need this type of extra support," said Logan, a pediatric psychologist at Boston Children's Hospital.


Artificial Intelligence to Play a Critical Role in Diagnosing Breast Cancer Quickly

"We had about 80 per cent accuracy rate. We will continue to refine the algorithm by using more real-world images as inputs,” Oberai said


Breast ultrasound elastography is an emerging imaging technique that provides information about a potential breast lesion, and researchers have identified the critical role AI can play in making this technique more efficient and accurate.

Using more precise information about the characteristics of cancerous versus non-cancerous breast lesions, this AI-based methodology has demonstrated greater accuracy than traditional modes of imaging.

In the study published in the journal Computer Methods in Applied Mechanics and Engineering, Indian-origin researchers Dhruv Patel and Assad Oberai from the University of Southern California showed that it is possible to train a machine to interpret real-world images using synthetic data and streamline the steps to diagnosis.

In the case of breast ultrasound elastography, once an image of the affected area is taken, it is analysed to determine displacements inside the tissue. Using this data and the physical laws of mechanics, the spatial distribution of mechanical properties, such as stiffness, is determined.
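The inverse step described above (recovering stiffness from measured displacements) can be illustrated with a hypothetical one-dimensional toy model; this is not the study's actual reconstruction method, just a sketch of the principle. For a rod under uniform stress, stiff regions strain less, so local stiffness follows from the displacement gradient via Hooke's law, E(x) = sigma / strain(x).

```python
import numpy as np

# Hypothetical 1D illustration: a rod under uniform applied stress.
sigma = 1.0                                           # stress (arbitrary units)
x = np.linspace(0, 1, 101)
E_true = np.where((x > 0.4) & (x < 0.6), 5.0, 1.0)    # stiff "lesion" in the middle

# Forward model: equilibrium gives strain = sigma / E everywhere,
# and displacement is the integral of strain along the rod.
strain = sigma / E_true
u = np.concatenate([[0.0], np.cumsum(strain[:-1] * np.diff(x))])

# Inverse step: differentiate the "measured" displacement field,
# then recover stiffness from Hooke's law.
strain_measured = np.gradient(u, x)
E_recovered = sigma / strain_measured
```

Away from the lesion boundary, `E_recovered` matches `E_true`; the stiff inclusion shows up directly as a region of reduced strain, which is the physical contrast elastography images.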

In the study, researchers sought to determine if they could skip the most complicated steps of this workflow.


For this, the researchers used about 12,000 synthetic images to train their Machine Learning algorithm. The process is similar to how photo-identification software works, i.e., learning through repeated inputs to recognize a particular person in an image, or how our brain learns to classify a cat versus a dog.

Through enough examples, the algorithm was able to glean different features inherent to a benign tumour versus a malignant tumour and make the correct determination.
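The train-on-synthetic-data, evaluate-by-accuracy workflow can be sketched in miniature. The following is a deliberately simplified stand-in for the paper's pipeline: each "image" is reduced to two hypothetical features (say, mean stiffness and stiffness heterogeneity), benign and malignant lesions are drawn from different synthetic distributions, and a logistic-regression classifier is trained by gradient descent and then scored on a held-out synthetic set. The feature names, distributions, and model are all illustrative assumptions, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_lesions(n, malignant):
    """Draw n synthetic 2-feature lesions from a class-dependent Gaussian."""
    centre = (3.0, 2.0) if malignant else (1.0, 1.0)
    return rng.normal(centre, 0.5, size=(n, 2))

# Synthetic training set: 500 benign (label 0), 500 malignant (label 1)
X = np.vstack([make_lesions(500, False), make_lesions(500, True)])
y = np.concatenate([np.zeros(500), np.ones(500)])

# Logistic regression trained by full-batch gradient descent
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # predicted probabilities
    w -= 0.1 * (X.T @ (p - y)) / len(y)        # gradient of the log-loss
    b -= 0.1 * np.mean(p - y)

# Evaluate on a fresh synthetic "test set", as the study did before
# moving on to biopsy-confirmed real-world images.
X_test = np.vstack([make_lesions(100, False), make_lesions(100, True)])
y_test = np.concatenate([np.zeros(100), np.ones(100)])
pred = (1.0 / (1.0 + np.exp(-(X_test @ w + b)))) > 0.5
accuracy = np.mean(pred == y_test)
```

With well-separated synthetic classes the accuracy is near perfect, mirroring the paper's near-100-per-cent result on synthetic images; the harder question, addressed next in the article, is how well such a model transfers to real-world images.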


The researchers achieved nearly 100 per cent classification accuracy on synthetic images. Once the algorithm was trained, they tested it on real-world images to determine how accurate it could be in providing a diagnosis, measuring these results against biopsy-confirmed diagnoses associated with these images.

“We had about 80 per cent accuracy rate. We will continue to refine the algorithm by using more real-world images as inputs,” Oberai said. (IANS)