If you are struggling with data to make charts and PowerPoint slides, trying to make use of a long layover at the airport, or simply worried about maintaining your car, then relax: good news is on its way. Artificial Intelligence (AI)-driven apps will soon take care of all of these, leaving you free to enjoy other things in life.
At the Adobe Summit 2019, hosted by the popular and witty Mindy Kaling, attendees got a "sneak peek" into the future of digital solutions. 'Data unbound', an AI-based document-handling feature, won hands down in an open vote and is one of the most awaited features.
Presented by Sana Malik of Adobe Research, 'data unbound' analyses data, summarises it, and converts it into charts and graphs for web pages and PowerPoint slides using Adobe Sensei and data analytics.
Another futuristic innovation presented was an airline mobile app with an augmented reality (AR) feature: an interactive terminal map that lets customers browse stores and pick up products as 3D AR objects.
Adobe's automobile app will track statistics about your car and keep you informed about its health, so you can take corrective measures in time and avoid breakdowns.
In a ray of hope for those who have to undergo breast cancer screening, and even for healthy women who get false alarms from digital mammography, an Artificial Intelligence (AI)-based Google model has outperformed radiologists at spotting breast cancer simply by scanning the X-ray results.
Reading mammograms is a difficult task, even for experts, and can often result in both false positives and false negatives.
In turn, these inaccuracies can lead to delays in detection and treatment, unnecessary stress for patients and a higher workload for radiologists who are already in short supply, Google said in a blog post on Wednesday.
Google’s AI model spotted breast cancer in de-identified screening mammograms (where identifiable information has been removed) with greater accuracy, fewer false positives and fewer false negatives than experts.
“This sets the stage for future applications where the model could potentially support radiologists performing breast cancer screenings,” said Shravya Shetty, Technical Lead, Google Health.
Digital mammography, or X-ray imaging of the breast, is the most common method to screen for breast cancer, with over 42 million exams performed each year in the US and the UK combined.
“But despite the wide usage of digital mammography, spotting and diagnosing breast cancer early remains a challenge,” said Daniel Tse, Product Manager, Google Health.
Together with colleagues at DeepMind, Cancer Research UK Imperial Centre, Northwestern University and Royal Surrey County Hospital, Google set out to see if AI could support radiologists to spot the signs of breast cancer more accurately.
The findings, published in the journal Nature, showed that AI could improve the detection of breast cancer.
Google's AI model was trained and tuned on a representative data set comprising de-identified mammograms from more than 76,000 women in the UK and more than 15,000 women in the US, to see if it could learn to spot signs of breast cancer in the scans.
The model was then evaluated on a separate de-identified data set of more than 25,000 women in the UK and over 3,000 women in the US.
"In this evaluation, our system produced a 5.7 per cent reduction of false positives in the US, and a 1.2 per cent reduction in the UK. It produced a 9.4 per cent reduction in false negatives in the US, and a 2.7 per cent reduction in the UK," Google said.
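The reductions quoted above are relative changes in the false-positive and false-negative rates compared with human readers. A minimal sketch of how such rates and reductions are computed, using hypothetical confusion-matrix counts for illustration only (not figures from the study):

```python
# Hypothetical screening counts for illustration only; not the study's data.
def rates(tp, fp, tn, fn):
    """Return (false_positive_rate, false_negative_rate)."""
    fpr = fp / (fp + tn)   # fraction of healthy cases wrongly flagged
    fnr = fn / (fn + tp)   # fraction of cancers missed
    return fpr, fnr

# Human readers vs. the model on the same hypothetical screening set.
reader_fpr, reader_fnr = rates(tp=80, fp=100, tn=900, fn=20)
model_fpr, model_fnr = rates(tp=82, fp=94, tn=906, fn=18)

# Relative (percentage) reduction achieved by the model.
fpr_reduction = 100 * (reader_fpr - model_fpr) / reader_fpr
fnr_reduction = 100 * (reader_fnr - model_fnr) / reader_fnr
print(f"FP reduction: {fpr_reduction:.1f}%  FN reduction: {fnr_reduction:.1f}%")
```

With these made-up counts the model cuts false positives by 6.0 per cent and false negatives by 10.0 per cent; the study reports its own reductions on real, de-identified data.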
The researchers then trained the AI model only on the data from the women in the UK and then evaluated it on the data set from women in the US.
In this separate experiment, there was a 3.5 per cent reduction in false positives and an 8.1 per cent reduction in false negatives, “showing the model’s potential to generalize to new clinical settings while still performing at a higher level than experts”.
Notably, when making its decisions, the model received less information than human experts did.
The human experts (in line with routine practice) had access to patient histories and prior mammograms, while the model only processed the most recent anonymized mammogram with no extra information.