Researchers have identified a molecule that is effective against tuberculosis, according to a new study in mice.
In the study, the group describes how the molecule performed when tested in vitro and in a mouse model.
Tuberculosis is a bacterial infection caused by Mycobacterium tuberculosis (Mtb). This airborne pathogen tends to infect the lungs and is passed from person to person.
Back in the 1950s, researchers developed drugs to treat the disease. Since that time, however, the bacterium has become increasingly resistant, and today almost one-third of all new cases are caused by antimicrobial-resistant strains.
In the new effort, published in the journal Science, researchers from the University of Cape Town in South Africa found that introducing the molecule, known as 8918, to Mtb in a petri dish killed the bacteria.
They also found that administering the molecule to Mtb-infected mice killed some of the bacteria the animals carried without harming the mice themselves.
However, before 8918 can be considered a candidate for clinical trials, researchers must overcome one obstacle: rapid microsomal metabolism gives the molecule a short half-life, meaning it does not remain in the body long enough to kill much of the Mtb before being flushed away, the study noted. (IANS)
A team of researchers at the Massachusetts Institute of Technology (MIT) has developed a predictive Artificial Intelligence (AI) system that can learn to see by touching and to feel by seeing.
While our sense of touch lets us feel the physical world, our eyes help us understand the full picture of these tactile signals.
Robots that have been programmed to see or feel, however, cannot use these signals as interchangeably.
The new AI-based system can create realistic tactile signals from visual inputs, and predict which object and what part is being touched directly from those tactile inputs.
In the future, this could help with a more harmonious relationship between vision and robotics, especially for object recognition, grasping, better scene understanding and helping with seamless human-robot integration in an assistive or manufacturing setting.
“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said Yunzhu Li, PhD student and lead author from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).
“By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings,” Li added.
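In broad strokes, the system pairs two learned mappings: one from a visual patch to a predicted tactile reading, and one from a tactile reading back to a prediction of which object is being touched. The sketch below is a minimal illustration of that cross-modal idea, not the MIT team's actual model; the layer sizes, input shapes, and class count are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the authors' code) of the two
# cross-modal mappings described above, using PyTorch.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Predict a tactile map (e.g., a GelSight-style reading) from a visual crop."""
    def __init__(self):
        super().__init__()
        # Encode the visual patch, then decode it into a same-sized tactile map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, image):
        return self.decoder(self.encoder(image))

class TouchToObject(nn.Module):
    """Predict which object is being touched from a tactile map."""
    def __init__(self, num_classes=10):  # class count is an assumption
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, tactile):
        return self.net(tactile)

# Dummy usage: a 64x64 RGB visual crop -> predicted tactile map -> object logits.
visual_crop = torch.rand(1, 3, 64, 64)
predicted_touch = VisionToTouch()(visual_crop)    # shape: (1, 3, 64, 64)
object_logits = TouchToObject()(predicted_touch)  # shape: (1, 10)
```

In practice such models would be trained on paired visual and tactile recordings collected while a robot touches objects; the dummy tensors above only show how the two mappings chain together.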
The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.