Tuesday March 31, 2020

Intel to Unveil its Latest and Most Powerful Neuromorphic Research System With Capacity of 100 Million Neurons

Intel's neuromorphic systems, such as Pohoiki Springs, are still in the research phase and are not intended to replace conventional computing systems


Chipmaker Intel has announced the readiness of Pohoiki Springs, its latest and most powerful neuromorphic research system providing the computational capacity of 100 million neurons.

The cloud-based system will be made available to members of the Intel Neuromorphic Research Community (INRC), extending their neuromorphic work to solve larger, more complex problems, the company said on Wednesday.

Intel researchers believe the extreme parallelism and asynchronous signalling of neuromorphic systems may provide significant performance gains at dramatically reduced power levels compared with the most advanced conventional computers available today.

“Pohoiki Springs scales up our Loihi neuromorphic research chip by more than 750 times, while operating at a power level of under 500 watts,” Mike Davies, Director of Intel’s Neuromorphic Computing Lab, said in a statement. “The system enables our research partners to explore ways to accelerate workloads that run slowly today on conventional architectures, including high-performance computing (HPC) systems,” Davies added.

Pohoiki Springs is a data centre rack-mounted system and is Intel’s largest neuromorphic computing system developed to date. It integrates 768 Loihi neuromorphic research chips inside a chassis the size of five standard servers. Loihi processors take inspiration from the human brain. Like the brain, Loihi can process certain demanding workloads up to 1,000 times faster and 10,000 times more efficiently than conventional processors.
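The headline "100 million neurons" figure follows from the chip count given above. As a rough sanity check, assuming the commonly cited capacity of 131,072 neurons per Loihi chip (128 neuromorphic cores of 1,024 neurons each — a figure not stated in this article):

```python
# Sanity check on the article's numbers, assuming 131,072 neurons per
# Loihi chip (128 cores x 1,024 neurons) -- this per-chip figure is an
# assumption not given in the article itself.
NEURONS_PER_CHIP = 128 * 1024  # 131,072
chips = 768                    # Loihi chips in Pohoiki Springs

total_neurons = chips * NEURONS_PER_CHIP
print(total_neurons)  # 100,663,296 -- roughly the "100 million" headline figure

# The 768 chips also match the quoted "more than 750 times" scale-up
# over a single Loihi research chip.
print(chips > 750)  # True
```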

Pohoiki Springs is the next step in scaling this architecture to assess its potential to solve not just artificial intelligence (AI) problems, but a wide range of computationally difficult problems. In the natural world even some of the smallest living organisms can solve remarkably hard computational problems. Many insects, for example, can visually track objects and navigate and avoid obstacles in real time, despite having brains with well under one million neurons.


With 100 million neurons, Pohoiki Springs increases Loihi’s neural capacity to the size of a small mammal brain, a major step on the path to supporting much larger and more sophisticated neuromorphic workloads. The system lays the foundation for an autonomous, connected future, which will require new approaches to real-time, dynamic data processing.

Intel’s neuromorphic systems, such as Pohoiki Springs, are still in the research phase and are not intended to replace conventional computing systems.


Instead, they provide a tool for researchers to develop and characterize new neuro-inspired algorithms for real-time processing, problem solving, adaptation and learning. (IANS)

 


Where Do Employees Actually Look During Video Calls?

For the study, published in the journal Attention, Perception & Psychophysics, the team compared fixation behaviour in 173 participants under two conditions


As more and more people use video conferencing tools to stay connected in social distancing times, neuroscientists from Florida Atlantic University have found that a person’s gaze is altered during telecommunication if they think that the person on the other end of the conversation can see them.

The phenomenon known as “gaze cueing,” a powerful signal for orienting attention, is a mechanism that likely plays a role in the developmentally and socially important wonder of “shared” or “joint” attention where a number of people attend to the same object or location.

“Because gaze direction conveys so much socially relevant information, one’s own gaze behaviour is likely to be affected by whether one’s eyes are visible to a speaker,” said Elan Barenholtz, associate professor of psychology. For example, people may intend to signal that they are paying more attention to a speaker by fixating their face or eyes during a conversation.


“Conversely, extended eye contact also can be perceived as aggressive and therefore noticing one’s eyes could lead to reduced direct fixation of another’s face or eyes. Indeed, people engage in avoidant eye movements by periodically breaking and reforming eye contact during conversations,” explained Barenholtz.

People are very sensitive to the gaze direction of others and even two-day-old infants prefer faces where the eyes are looking directly back at them. Social distancing across the globe due to coronavirus (COVID-19) has created the need to conduct business “virtually” using Skype, web conferencing, FaceTime and any other means available.

For the study, published in the journal Attention, Perception & Psychophysics, the team compared fixation behaviour in 173 participants under two conditions: one in which the participants believed they were engaging in a real-time interaction and one in which they knew they were watching a pre-recorded video.

The researchers wanted to know if face fixation would increase in the real-time condition based on the social expectation of facing one’s speaker in order to get attention or if it would lead to greater face avoidance, based on social norms as well as the cognitive demands of encoding the conversation.


Results showed that participants fixated on the whole face in the real-time condition and significantly less in the pre-recorded condition. In the pre-recorded condition, time spent fixating on the mouth was significantly greater compared to the real-time condition. There were no significant differences in time spent fixating on the eyes between the real-time and the pre-recorded conditions. To simulate a live interaction, the researchers convinced participants that they were engaging in a real-time, two-way video interaction (it was actually pre-recorded).


When the face was fixated, attention was directed toward the mouth for the greater percentage of time in the pre-recorded condition versus the real-time condition. “Given that encoding and memory have been found to be optimized by fixating the mouth, which was reduced overall in the real-time condition, this suggests that people do not fully optimize for speech encoding in a live interaction,” the authors wrote. (IANS)