Thursday July 18, 2019

Deep CEE: AI Learning Model Helping Astronomers Identify Galaxy Clusters Quickly

"Data mining techniques such as deep learning will help us to analyse the enormous outputs of modern telescopes," said John Stott from Lancaster University

[Image: Galaxy clusters represent the most extreme environments that galaxies can live in. Wikimedia Commons]

Researchers have developed an Artificial Intelligence (AI)-powered tool that has been trained to “look” at colour images and identify galaxy clusters quickly.

The “Deep-CEE” – Deep Learning for Galaxy Cluster Extraction and Evaluation – model is based on neural networks, which are designed to mimic the way a human brain learns to recognise objects by activating specific neurons when visualising distinctive patterns and colours.

Matthew Chan, a PhD student at Lancaster University in Britain, trained the AI by repeatedly showing it examples of known, labelled objects in images until the algorithm learned to associate objects on its own. The researchers then ran a pilot study to test the algorithm’s ability to identify and classify galaxy clusters in images containing many other astronomical objects.
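Deep-CEE itself is a deep neural network, but the supervised-learning idea described here can be sketched with a deliberately simple toy: show the machine labelled examples, let it learn a summary of each class, then ask it to classify something new. Everything below (the colours, classes, and nearest-prototype rule) is illustrative and not drawn from the actual study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for labelled training images: each "image" is reduced to a
# mean colour (R, G, B). Class 0 mimics red-dominated sources, class 1 blue.
def make_examples(n, red_dominated):
    base = np.array([0.7, 0.3, 0.2]) if red_dominated else np.array([0.2, 0.3, 0.7])
    return base + 0.05 * rng.standard_normal((n, 3))

X = np.vstack([make_examples(50, True), make_examples(50, False)])
y = np.array([0] * 50 + [1] * 50)

# "Training": learn a prototype (mean colour) per class from the labelled examples.
prototypes = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(image_colour):
    # Classify a new "image" by its nearest learned prototype.
    dists = np.linalg.norm(prototypes - image_colour, axis=1)
    return int(dists.argmin())
```

A real network replaces the hand-picked colour summary with features it learns itself, but the workflow (labelled examples in, a learned association out) is the same.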

[Image: Chandra Six Galaxy Cluster (X-rays). Wikimedia Commons]

“Data mining techniques such as deep learning will help us to analyse the enormous outputs of modern telescopes,” said John Stott from Lancaster University. “We expect our method to find thousands of clusters never seen before by science,” Stott said.

Galaxy clusters represent the most extreme environments that galaxies can live in, and studying them can help us better understand dark matter and dark energy. New state-of-the-art telescopes have enabled astronomers to observe wider and deeper than ever before, making it possible to study the large-scale structure of the universe and map its vast undiscovered content.

By automating the discovery process, scientists can quickly scan sets of images and return precise predictions with minimal human interaction.

This will be essential for analysing data in the future. The upcoming Large Synoptic Survey Telescope (LSST) sky survey, due to come online in 2021, will image the skies of the entire southern hemisphere, generating an estimated 15 TB of data every night. “We have successfully applied Deep-CEE to the Sloan Digital Sky Survey,” said Chan.


“Ultimately, we will run our model on revolutionary surveys such as the LSST that will probe wider and deeper into regions of the Universe never before explored,” Chan added. The study was presented at the Royal Astronomical Society’s National Astronomy meeting at Lancaster University. (IANS)


Artificial Intelligence to Play a Critical Role in Diagnosing Breast Cancer Quickly

"We had about 80 per cent accuracy rate. We will continue to refine the algorithm by using more real-world images as inputs,” Oberai said

[Image: The treatments kill healthy cells as well as cancerous ones, and the side effects are legendary. Pixabay]

Breast ultrasound elastography is an emerging imaging technique that provides information about a potential breast lesion, and researchers have identified the critical role AI can play in making the technique more efficient and accurate.

By drawing on more precise information about what distinguishes a cancerous from a non-cancerous breast lesion, this Artificial Intelligence (AI) methodology has demonstrated greater accuracy than traditional modes of imaging.

In the study published in the journal Computer Methods in Applied Mechanics and Engineering, Indian-origin researchers Dhruv Patel and Assad Oberai from the University of Southern California showed that it is possible to train a machine to interpret real-world images using synthetic data and streamline the steps to diagnosis.

In the case of breast ultrasound elastography, once an image of the affected area is taken, it is analysed to determine displacements inside the tissue. Using this data and the physical laws of mechanics, the spatial distribution of mechanical properties, such as stiffness, is determined.
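The displacements-to-stiffness step is an inverse problem, and its flavour can be shown in one dimension. The sketch below is a minimal illustration under strong simplifying assumptions (a 1-D bar, uniaxial tension, Hooke's law), not the researchers' actual formulation: a known applied stress plus measured displacements yields strain per segment, and stress divided by strain recovers each segment's stiffness.

```python
import numpy as np

# 1-D illustration of elastography's inverse step (not the study's method):
# a bar of 5 segments under a known axial stress F. Measured nodal
# displacements u give each segment's strain, and Hooke's law
# (stress = stiffness * strain) then recovers the stiffness per segment.
F = 2.0                                        # known applied stress
x = np.linspace(0.0, 1.0, 6)                   # node positions along the bar
E_true = np.array([4.0, 4.0, 1.0, 1.0, 4.0])   # stiff / soft inclusion / stiff

# Forward model: generate the displacements an instrument would "measure".
strain = F / E_true
u = np.concatenate([[0.0], np.cumsum(strain * np.diff(x))])

# Inverse step: strain from displacement gradients, stiffness from stress/strain.
strain_est = np.diff(u) / np.diff(x)
E_est = F / strain_est
```

The soft middle segments (the stand-in for a lesion) show up as low recovered stiffness; in 2-D or 3-D tissue the same idea requires solving the equations of elasticity, which is the costly step the study's machine-learning approach aims to bypass.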

In the study, researchers sought to determine if they could skip the most complicated steps of this workflow.

[Image: Cancer ribbon. Pixabay]

For this, the researchers used about 12,000 synthetic images to train their Machine Learning algorithm. This process is similar to how photo-identification software works, i.e. learning through repeated inputs how to recognise a particular person in an image, or how our brain learns to classify a cat versus a dog.

Through enough examples, the algorithm was able to glean different features inherent to a benign tumour versus a malignant tumour and make the correct determination.


The researchers achieved nearly 100 per cent classification accuracy on synthetic images. Once the algorithm was trained, they tested it on real-world images to determine how accurate it could be in providing a diagnosis, measuring these results against biopsy-confirmed diagnoses associated with these images.

“We had about 80 per cent accuracy rate. We will continue to refine the algorithm by using more real-world images as inputs,” Oberai said. (IANS)