Tuesday March 31, 2020

Here’s Why Artificial Intelligence May Help in Treating Sleep Disorders

AI tools hold great promise for medicine in general, but there has also been a great deal of hype, exaggerated claims and misinformation


Beyond overnight sleep tests, artificial intelligence (AI) has the potential to improve efficiency and precision in sleep medicine, resulting in more patient-centered care and better outcomes, researchers have found.

The electrophysiological data collected during polysomnography — the most comprehensive type of sleep study — is well-positioned for enhanced analysis through AI and machine-assisted learning, according to a new position statement from the American Academy of Sleep Medicine.

“When we typically think of AI in sleep medicine, the obvious use case is for the scoring of sleep and associated events,” said Cathy Goldstein, associate professor of sleep medicine and neurology at the University of Michigan.

“This would streamline the processes of sleep laboratories and free up sleep technologist time for direct patient care.”

Because of the vast amounts of data collected by sleep centres, AI and machine learning could advance sleep care, resulting in a more accurate diagnosis, prediction of disease and treatment prognosis.

“AI could allow us to derive more meaningful information from sleep studies, given that our current summary metrics, for example, the apnea-hypopnea index, aren’t predictive of the health and quality of life outcomes that are important to patients,” elaborated Goldstein.
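For context, the apnea-hypopnea index Goldstein mentions is conventionally defined as the number of respiratory events per hour of sleep. A minimal sketch of that arithmetic (an illustration of the standard clinical definition, not code from the study):

```python
def apnea_hypopnea_index(apneas: int, hypopneas: int, sleep_hours: float) -> float:
    """Apnea-hypopnea index (AHI): respiratory events per hour of sleep."""
    return (apneas + hypopneas) / sleep_hours

# e.g. 30 apneas and 15 hypopneas over 6 hours of sleep give an AHI of 7.5,
# which falls in the range conventionally labelled mild sleep apnea (5-15).
print(apnea_hypopnea_index(30, 15, 6))
```

A single events-per-hour average like this discards the timing, duration, and physiology of individual events, which is why the authors argue richer AI-derived measures could be more predictive of patient outcomes.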

“Additionally, AI might help us understand mechanisms underlying obstructive sleep apnea, so we can select the right treatment for the right patient at the right time, as opposed to one-size-fits-all or trial and error approaches,” she added in a paper published in the Journal of Clinical Sleep Medicine.


Important considerations for the integration of AI into the sleep medicine practice include transparency and disclosure, testing on novel data, and laboratory integration.

AI tools hold great promise for medicine in general, but there has also been a great deal of hype, exaggerated claims and misinformation.


“We want to interface with industry in a way that will foster safe and efficacious use of AI software to benefit our patients. These tools can only benefit patients if used with careful oversight,” the authors wrote. (IANS)


Where Do Employees Actually Gaze During Video Calls?

For the study, published in the journal Attention, Perception & Psychophysics, the team compared fixation behaviour in 173 participants under two conditions


As more and more people use video conferencing tools to stay connected while social distancing, neuroscientists from Florida Atlantic University have found that a person’s gaze is altered during telecommunication if they think that the person on the other end of the conversation can see them.

The phenomenon known as “gaze cueing,” a powerful signal for orienting attention, is a mechanism that likely plays a role in the developmentally and socially important wonder of “shared” or “joint” attention where a number of people attend to the same object or location.

“Because gaze direction conveys so much socially relevant information, one’s own gaze behaviour is likely to be affected by whether one’s eyes are visible to a speaker,” said Elan Barenholtz, associate professor of psychology. For example, people may intend to signal that they are paying more attention to a speaker by fixating their face or eyes during a conversation.


“Conversely, extended eye contact also can be perceived as aggressive and therefore noticing one’s eyes could lead to reduced direct fixation of another’s face or eyes. Indeed, people engage in avoidant eye movements by periodically breaking and reforming eye contact during conversations,” explained Barenholtz.

People are very sensitive to the gaze direction of others and even two-day-old infants prefer faces where the eyes are looking directly back at them. Social distancing across the globe due to coronavirus (COVID-19) has created the need to conduct business “virtually” using Skype, web conferencing, FaceTime and any other means available.

For the study, published in the journal Attention, Perception & Psychophysics, the team compared fixation behaviour in 173 participants under two conditions: one in which the participants believed they were engaging in a real-time interaction and one in which they knew they were watching a pre-recorded video.

The researchers wanted to know if face fixation would increase in the real-time condition based on the social expectation of facing one’s speaker in order to get attention or if it would lead to greater face avoidance, based on social norms as well as the cognitive demands of encoding the conversation.


To simulate a live interaction, the researchers convinced participants that they were engaging in a real-time, two-way video interaction, when it was actually pre-recorded. Results showed that participants fixated on the whole face in the real-time condition and significantly less in the pre-recorded condition. In the pre-recorded condition, time spent fixating on the mouth was significantly greater compared to the real-time condition. There were no significant differences in time spent fixating on the eyes between the real-time and the pre-recorded conditions.


When the face was fixated, attention was directed toward the mouth for the greater percentage of time in the pre-recorded condition versus the real-time condition. “Given that encoding and memory have been found to be optimized by fixating the mouth, which was reduced overall in the real-time condition, this suggests that people do not fully optimize for speech encoding in a live interaction,” the authors wrote. (IANS)