Tuesday June 18, 2019

AI Helps Find Source Of Radio Bursts 3 Billion Light Years Away From Earth

The researchers developed the new, powerful machine-learning algorithm and reanalysed the 2017 data, finding an additional 72 bursts not detected originally.

'AI helps track down mysterious cosmic signals', Pixabay

Scientists say they have used artificial intelligence (AI) to discover 72 new fast radio bursts from a mysterious source about three billion light years away from Earth.

The initiative may advance the search to find signs of intelligent life in the universe, said researchers from the University of California, Berkeley in the US.

Fast radio bursts are bright pulses of radio emission mere milliseconds in duration, thought to originate from distant galaxies.

However, the source of these emissions is still unclear, according to the research published in The Astrophysical Journal.

Theories range from highly magnetised neutron stars blasted by gas streams from a nearby supermassive black hole, to suggestions that the burst properties are consistent with signatures of technology developed by an advanced civilization.

 

While most fast radio bursts are one-offs, the source here, FRB 121102, is unique in emitting repeated bursts. Wikimedia Commons

 

“This work is exciting not just because it helps us understand the dynamic behaviour of fast radio bursts in more detail, but also because of the promise it shows for using machine learning to detect signals missed by classical algorithms,” said Andrew Siemion from the University of California, Berkeley.

 

Researchers are also applying the successful machine-learning algorithm to find new kinds of signals that could be coming from extraterrestrial civilisations.

While most fast radio bursts are one-offs, the source here, FRB 121102, is unique in emitting repeated bursts.

This behaviour has drawn the attention of many astronomers hoping to pin down the cause and the extreme physics involved in fast radio bursts.

The AI algorithms dredged up the radio signals from data recorded over a five-hour period in 2017 by the Green Bank Telescope in West Virginia in the US.


An earlier analysis of the 400 terabytes of data employed standard computer algorithms to identify 21 bursts during that period.

“All were seen within one hour, suggesting that the source alternates between periods of quiescence and frenzied activity,” said Berkeley postdoctoral researcher Vishal Gajjar.


The researchers developed the new, powerful machine-learning algorithm and reanalysed the 2017 data, finding an additional 72 bursts not detected originally.

This brings the total number of detected bursts from FRB 121102 to around 300 since it was discovered in 2012, researchers said. (IANS)


Researchers Teaching Artificial Intelligence to Connect Senses Like Vision and Touch

The new AI-based system can create realistic tactile signals from visual inputs


A team of researchers at the Massachusetts Institute of Technology (MIT) has come up with a predictive Artificial Intelligence (AI) that can learn to see by touching and to feel by seeing.

While our sense of touch gives us capabilities to feel the physical world, our eyes help us understand the full picture of these tactile signals.

Robots that have been programmed to see or feel, however, can’t use these signals quite as interchangeably.

The new AI-based system can create realistic tactile signals from visual inputs, and predict which object and what part is being touched directly from those tactile inputs.


In the future, this could help robots combine vision and touch more effectively, especially for object recognition, grasping, better scene understanding, and seamless human-robot integration in assistive or manufacturing settings.

“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said Yunzhu Li, PhD student and lead author from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

“By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings,” Li added.

The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.


Using a simple web camera, the team recorded nearly 200 objects, such as tools, household products, fabrics, and more, being touched more than 12,000 times.

Breaking those 12,000 video clips down into static frames, the team compiled “VisGel,” a dataset of more than three million visual/tactile-paired images.
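The article describes the dataset construction only at a high level: synchronised camera and GelSight recordings are broken into frames, and corresponding frames become visual/tactile training pairs. A minimal sketch of that pairing step follows; the function and field names are invented for illustration and are not from the VisGel release.

```python
def clips_to_pairs(clips):
    """Flatten synchronised (camera_frames, tactile_frames) clips into
    one flat list of visual/tactile training pairs."""
    pairs = []
    for clip_id, (cam, gel) in enumerate(clips):
        # Assumes the two streams are already aligned frame-for-frame.
        for t, (v, g) in enumerate(zip(cam, gel)):
            pairs.append({"clip": clip_id, "frame": t,
                          "visual": v, "tactile": g})
    return pairs

# Two tiny fake clips: 3 frames and 2 frames of placeholder data.
clips = [(["v0", "v1", "v2"], ["g0", "g1", "g2"]),
         (["v3", "v4"], ["g3", "g4"])]
dataset = clips_to_pairs(clips)  # 5 visual/tactile pairs
```

Applied to roughly 12,000 real clips instead of two toy ones, the same flattening is what yields a multi-million-pair dataset like VisGel.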

“Bringing these two senses (vision and touch) together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects,” said Li.

The current dataset only has examples of interactions in a controlled environment.


The team hopes to improve this by collecting data in more unstructured settings, or by using a new MIT-designed tactile glove, to increase the size and diversity of the dataset.

“This is the first method that can convincingly translate between visual and touch signals,” said Andrew Owens, a post-doc at the University of California at Berkeley.


The team is set to present the findings next week at the “Conference on Computer Vision and Pattern Recognition” in Long Beach, California. (IANS)