
Can Doctors Become Better With The Help Of Artificial Intelligence?

The research now is on breast cancer, but doctors predict artificial intelligence will eventually make a difference in all forms of cancer and beyond.

A high-magnification image from a 2012 glioblastoma case, released as an example by the College of American Pathologists in Northfield. VOA

Teacher Rishi Rawat has one student who is not human, but a machine.

Lessons take place at a lab inside the University of Southern California’s (USC) Clinical Science Center in Los Angeles, where Rawat teaches artificial intelligence, or AI.

To help the machine learn, Rawat feeds the computer samples of cancer cells.

“They’re like a computer brain, and you can put the data into them and they will learn the patterns and the pattern recognition that’s important to making decisions,” he explained.
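
For readers curious what “putting the data into them” looks like in code, the sketch below is a minimal, generic training loop in Python with PyTorch: a small convolutional network learns to separate two classes of image patches. The synthetic tensors, network shape, and two-class setup are illustrative assumptions, not the USC team’s actual model.

```python
# A minimal sketch of the kind of training loop described above: a small
# convolutional network learns to classify image patches of cells.
# All names, shapes, and the synthetic data are illustrative assumptions.
import torch
import torch.nn as nn

# Stand-ins for stained-tissue patches: 64x64 RGB images, two classes
# (e.g., stain-positive vs. stain-negative).
images = torch.randn(32, 3, 64, 64)   # batch of fake patches
labels = torch.randint(0, 2, (32,))   # fake ground-truth labels

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),                 # two-class output
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# "Feeding the computer samples": each pass nudges the weights toward
# patterns that separate the two classes.
for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```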

AI may soon be a useful tool in health care and allow doctors to understand biology and diagnose disease in ways that were never humanly possible.

Artificial intelligence through machine learning can detect complex patterns in cell arrangement that would be difficult for humans to recognize. VOA

Doctors not going away

“Machines are not going to take the place of doctors. Computers will not treat patients, but they will help make certain decisions and look for patterns that the human brain can’t recognize by itself,” said David Agus, USC professor of medicine and biomedical engineering, director of the Lawrence J. Ellison Institute for Transformative Medicine, and director of the university’s Center for Applied Molecular Medicine.

Rawat is part of an interdisciplinary team of scientists at USC who are researching how artificial intelligence and machine learning can detect complex patterns in cells and more accurately identify specific types of breast cancer tumors.

Once a confirmed cancerous tumor is removed, doctors still have to treat the patient to reduce the risk of recurrence. The type of treatment depends on the type of cancer and whether the tumor is driven by estrogen. Currently, pathologists take a thin piece of tissue, put it on a slide, and stain it with color to better see the cells.

“What the pathologist has to do is to count what percentage of the cells are brown and what percentage are not,” said Dan Ruderman, a physicist who is also assistant professor of research medicine at USC.
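
As a rough illustration of that counting task, the fraction of stain-positive (“brown”) pixels in a slide image can be estimated with simple color thresholding. The sketch below is a toy example: the stained_fraction function and its thresholds are invented for illustration and are not a validated pathology method.

```python
# A hedged sketch of the counting task a pathologist does by eye:
# estimate what fraction of a stained slide image falls in the "brown"
# (stain-positive) color range. Thresholds are illustrative guesses.
import numpy as np

def stained_fraction(rgb: np.ndarray) -> float:
    """rgb: H x W x 3 uint8 image of a stained tissue slide."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # Crude "brown" test: red dominates blue, with moderate green.
    brown = (r > b + 20) & (g > b) & (r > 80)
    return float(brown.mean())

# Fake slide: random pixels stand in for real tissue.
slide = np.random.randint(0, 256, size=(512, 512, 3), dtype=np.uint8)
print(f"Estimated stain-positive fraction: {stained_fraction(slide):.1%}")
```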


The process could take days or even longer. Scientists say artificial intelligence can do more than just count cells. Through machine learning, it can recognize complicated patterns in how the cells are arranged, with the hope of delivering, in the near future, a quicker and more reliable diagnosis that is free of human error.

“Are they disordered? Are they in a regular spacing? What’s going on exactly with the arrangement of the cells in the tissue?” said Ruderman, describing the types of patterns a machine can detect.
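
One simple way such spacing questions can be made quantitative is a nearest-neighbor distance statistic over cell centers. The sketch below assumes cell positions have already been extracted from the image; the random points are stand-ins for real data, and the statistic is an illustrative choice, not necessarily the one the USC team uses.

```python
# A minimal sketch of one arrangement statistic: the regularity of
# spacing between cell centers, measured via nearest-neighbor distances.
import numpy as np
from scipy.spatial import cKDTree

cells = np.random.rand(500, 2) * 1000  # fake (x, y) cell centers in microns

tree = cKDTree(cells)
# k=2 because each point's nearest neighbor at k=1 is itself.
dists, _ = tree.query(cells, k=2)
nn = dists[:, 1]

# Low variance relative to the mean suggests regular spacing;
# high variance suggests a disordered arrangement.
print(f"mean spacing: {nn.mean():.1f}, "
      f"coefficient of variation: {nn.std() / nn.mean():.2f}")
```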

“We could do this instantaneously for almost no cost in the developing world,” Agus said.

Computing power improves

Scientists say the time is ripe for the marriage between computer science and cancer research.

“All of a sudden, we have the computing power to really do it in real time. We have the ability of scanning a slide to high enough resolution so that the computer can see every little feature of the cancer. So it’s a convergence of technology. We couldn’t have done this, we didn’t have the computing power to do this several years ago,” Agus said.

High-resolution slide scanners and greater computing power make it possible for AI to help doctors more accurately identify the subtype of breast cancer a patient has. VOA

Data is key to having a machine effectively do its job in medicine.

“Once you start to pool together tens and hundreds of thousands of patients and that data, you can actually [have] remarkable new insight, and so AI and machine learning is allowing that. It’s enabling us to go to the next level in medicine and really take that art to new heights,” Agus said.


Back at the lab, Rawat is not only feeding the computer more cell samples, he is also designing and writing code to ensure that the algorithm can learn features unique to cancer cells.

The research now is on breast cancer, but doctors predict artificial intelligence will eventually make a difference in all forms of cancer and beyond. (VOA)


Researchers Teaching Artificial Intelligence to Connect Senses Like Vision and Touch

The new AI-based system can create realistic tactile signals from visual inputs


A team of researchers at the Massachusetts Institute of Technology (MIT) has come up with a predictive Artificial Intelligence (AI) that can learn to see by touching and to feel by seeing.

While our sense of touch gives us capabilities to feel the physical world, our eyes help us understand the full picture of these tactile signals.

However, robots that have been programmed to see or feel can’t use these signals quite as interchangeably.

The new AI-based system can create realistic tactile signals from visual inputs, and predict which object and what part is being touched directly from those tactile inputs.


In the future, this could help build a more harmonious relationship between vision and touch in robotics, especially for object recognition, grasping, better scene understanding, and seamless human-robot integration in assistive or manufacturing settings.

“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said Yunzhu Li, a PhD student and lead author from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

“By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings,” Li added.

The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.


Using a simple web camera, the team recorded nearly 200 objects, such as tools, household products, fabrics, and more, being touched more than 12,000 times.

Breaking those 12,000 video clips down into static frames, the team compiled “VisGel,” a dataset of more than three million visual/tactile-paired images.
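
One way a paired dataset like that can be assembled is to split each clip into frames and match every camera frame with the tactile frame captured at the same instant. The sketch below illustrates the idea; the directory layout, file names, and build_pairs helper are assumptions for this example, not the actual VisGel pipeline.

```python
# A hedged sketch of assembling a visual/tactile-paired dataset:
# pair each camera frame with the GelSight frame recorded at the
# same moment. File layout and names are assumptions for illustration.
from pathlib import Path

def build_pairs(clip_dir: Path) -> list[tuple[Path, Path]]:
    """Pair camera frames with tactile frames by shared frame index."""
    visual = sorted((clip_dir / "camera").glob("*.png"))
    tactile = sorted((clip_dir / "gelsight").glob("*.png"))
    # zip truncates to the shorter stream if a sensor dropped frames.
    return list(zip(visual, tactile))

dataset: list[tuple[Path, Path]] = []
for clip in sorted(Path("visgel_raw").glob("clip_*")):
    dataset.extend(build_pairs(clip))
print(f"{len(dataset)} visual/tactile pairs")
```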

“Bringing these two senses (vision and touch) together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects,” said Li.

The current dataset only has examples of interactions in a controlled environment.


The team hopes to improve this by collecting data in more unstructured areas, or by using a new MIT-designed tactile glove, to increase the size and diversity of the dataset.

“This is the first method that can convincingly translate between visual and touch signals,” said Andrew Owens, a postdoctoral researcher at the University of California, Berkeley.


The team is set to present the findings next week at the “Conference on Computer Vision and Pattern Recognition” in Long Beach, California. (IANS)