FB Develops Wrist-Based Wearable That Translates Neural Signals Into Action

Facebook has developed a prototype of a wrist-based wearable for augmented reality (AR) that won't force users to choose between interacting with devices and the world around them. Facebook Reality Labs (FRL) is developing natural, intuitive ways to interact with always-available AR glasses because it believes this will transform the way we connect with people near and far.

"We are taking a closer look at a version that may be possible much sooner: wrist-based input combined with usable but limited contextualized AI, which dynamically adapts to you and your environment," Facebook said in a statement late on Thursday.

The company said that it will address some groundbreaking work in soft robotics to build comfortable, all-day wearable devices and give an update on its haptic glove research later this year. A wrist-based wearable has the additional benefit of easily serving as a platform for computing, battery, and antennas while supporting a broad array of sensors.

"The missing piece was finding a clear path to rich input, and a potentially ideal solution was EMG (electromyography), which uses sensors to translate electrical motor nerve signals that travel through the wrist to the hand into digital commands that you can use to control the functions of a device," the FRL team explained.

These signals let you communicate crisp one-bit commands to your device, a degree of control that's highly personalizable and adaptable to many situations. The signals through the wrist are so clear that EMG can understand finger motion of just a millimeter. That means input can be effortless. Ultimately, it may even be possible to sense just the intention to move a finger.
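To make the idea of a "crisp one-bit command" concrete, here is a minimal illustrative sketch of how a stream of EMG samples could be reduced to a single click event by envelope thresholding. This is not Facebook's actual signal pipeline; the function name, window size, and threshold are all hypothetical, chosen only to show the principle of turning a continuous muscle signal into a discrete command.

```python
# Purely illustrative sketch (not FRL's pipeline): reduce a stream of EMG
# samples to one-bit "click" commands. A click fires once on each rising
# edge of a simple moving-average envelope crossing a threshold.
# All names and constants here are hypothetical.

def detect_clicks(samples, threshold=0.5, window=4):
    """Return the sample indices where a one-bit click command fires."""
    clicks = []
    above = False  # are we currently above the threshold?
    for i in range(len(samples)):
        # Moving-average envelope over the last `window` samples.
        start = max(0, i - window + 1)
        envelope = sum(abs(s) for s in samples[start:i + 1]) / (i + 1 - start)
        if envelope > threshold and not above:
            clicks.append(i)  # rising edge: emit exactly one command
            above = True
        elif envelope <= threshold:
            above = False  # re-arm once the signal falls back down
    return clicks

# Two bursts of muscle activity produce exactly two click commands.
signal = [0.0] * 5 + [1.0] * 5 + [0.0] * 5 + [1.0] * 5
print(detect_clicks(signal))  # → [7, 17]
```

The rising-edge latch (`above`) is what makes the output one-bit rather than continuous: a sustained contraction yields a single command, mirroring how a decoded intent, however long the muscle stays active, maps to one discrete action.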

"What we're trying to do with neural interfaces is to let you control the machine directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate your hand and finger muscles," said Thomas Reardon, FRL Director of Neuromotor Interfaces.

EMG will eventually progress to richer controls. In AR, you'll be able to actually touch and move virtual user interfaces (UIs) and objects, as you can see in this demo video. "You'll also be able to control virtual objects at a distance. It's sort of like having a superpower like the Force," the company said. (IANS/SP)
