Monday June 24, 2019

Novel stroke treatment repairs damaged brain tissue

Researchers have developed a new stem-cell-based treatment for stroke that reduces brain damage and accelerates the brain's natural healing tendencies.

The new research can reduce the threat of permanent brain damage considerably.


The treatment, called AB126, was developed using extracellular vesicles (EVs) — fluid-filled structures known as exosomes — generated from human neural stem cells.

“This is truly exciting evidence because exosomes provide a stealth-like characteristic, invisible even to the body’s own defences. When packaged with therapeutics, these treatments can actually change cell progression and improve functional recovery,” said Steven Stice, a professor at the University of Georgia in the US who led the research team.

Fully able to cloak itself within the bloodstream, this type of regenerative EV therapy appears to be the most promising for overcoming the limitations of many cell therapies, given the ability of exosomes to carry and deliver multiple doses, as well as the ease of storing and administering the treatment, the researchers said.


Small in size, the tubular shape of an exosome allows EV therapy to cross barriers that cells cannot, said the study published in the journal Translational Stroke Research.

Following the administration of AB126, the researchers used MRI scans to measure brain atrophy rates in preclinical, age-matched stroke models, which showed an approximately 35 percent decrease in the size of injury and a 50 percent reduction in brain tissue loss.


“Until now, we had very little evidence specific to neural exosome treatment and the ability to improve motor function. Just days after stroke, we saw better mobility, improved balance and measurable behavioural benefits in treated animal models,” Stice said.

Human clinical trials for the treatment could begin as early as next year, the researchers added. (IANS)


Researchers Teaching Artificial Intelligence to Connect Senses Like Vision and Touch

The new AI-based system can create realistic tactile signals from visual inputs


A team of researchers at the Massachusetts Institute of Technology (MIT) has developed a predictive Artificial Intelligence (AI) system that can learn to see by touching and to feel by seeing.

While our sense of touch gives us capabilities to feel the physical world, our eyes help us understand the full picture of these tactile signals.

However, robots that have been programmed to see or feel cannot use these signals quite as interchangeably.

The new AI-based system can create realistic tactile signals from visual inputs, and predict which object and what part is being touched directly from those tactile inputs.


In the future, this could enable a more harmonious pairing of vision and touch in robotics, especially for object recognition, grasping and scene understanding, and could help with seamless human-robot interaction in assistive or manufacturing settings.

“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said Yunzhu Li, a PhD student and lead author from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

“By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings,” Li added.

The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.


Using a simple web camera, the team recorded nearly 200 objects, such as tools, household products, fabrics, and more, being touched more than 12,000 times.

Breaking those 12,000 video clips down into static frames, the team compiled “VisGel,” a dataset of more than three million visual/tactile-paired images.

“Bringing these two senses (vision and touch) together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects,” said Li.

The current dataset only has examples of interactions in a controlled environment.


The team hopes to improve this by collecting data in more unstructured environments, or by using a new MIT-designed tactile glove, to increase the size and diversity of the dataset.

“This is the first method that can convincingly translate between visual and touch signals,” said Andrew Owens, a postdoctoral researcher at the University of California, Berkeley.


The team is set to present the findings next week at the “Conference on Computer Vision and Pattern Recognition” in Long Beach, California. (IANS)