New Delhi: The Union Cabinet was informed on Wednesday of an agreement between India and Russia to provide competitive research grants to researchers.
The agreement provides grants to researchers of both nations for the joint implementation of research projects in areas of basic and exploratory sciences.
The agreement, signed in May 2015, is valid for a period of six years and could be extended by mutual consent between the Department of Science and Technology (DST) and the Russian Science Foundation (RSF), an official statement said.
“This competition would be conducted in the areas of mathematics, computer and system science, physics and space science, chemistry and material science, biology and life science, basic research for medicine, agricultural science, earth science and engineering science,” it said.
The Cabinet meeting was chaired by Prime Minister Narendra Modi.
The decision to identify research projects for funding would be taken jointly by the DST and the RSF, it added. Russia has been a key strategic partner of India since Soviet times; with this agreement, the two countries take their bilateral relations in the area of science and technology further.
Russia is one of the biggest arms exporters to India, and the agreement is also a boost for the Prime Minister's Make in India campaign. (IANS)
A team of researchers at the Massachusetts Institute of Technology (MIT) has come up with a predictive Artificial Intelligence (AI) system that can learn to see by touching and to feel by seeing.
While our sense of touch lets us feel the physical world, our eyes help us understand the full picture of these tactile signals. Robots that have been programmed to see or feel, however, cannot use these signals quite as interchangeably.
The new AI-based system can create realistic tactile signals from visual inputs, and predict which object and what part is being touched directly from those tactile inputs.
In the future, this could help build a more harmonious relationship between vision and robotics, especially for object recognition, grasping and better scene understanding, and could aid seamless human-robot integration in assistive or manufacturing settings.
“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” said Yunzhu Li, a PhD student and lead author from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).
“By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings,” Li added.
The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.
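The underlying idea is cross-modal prediction: one model maps a visual observation to an expected tactile reading, while another maps a tactile reading back to the object or part being touched. The short Python sketch below is purely illustrative; the layer sizes, tensor shapes and names (VisionToTouch, TouchToObject) are assumptions made for this article and do not reproduce the CSAIL team's actual model or the GelSight data format.

# Illustrative sketch only: a toy "feel by seeing" predictor and a toy
# "see by touching" classifier. All shapes and class counts are assumptions.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Predict a (hypothetical) 32x32 tactile map from a 64x64 RGB image."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),  # single-channel tactile map
        )

    def forward(self, image):
        return self.decoder(self.encoder(image))

class TouchToObject(nn.Module):
    """Guess which of N (hypothetical) objects/parts a tactile map corresponds to."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * 16 * 16, num_classes),
        )

    def forward(self, touch):
        return self.net(touch)

if __name__ == "__main__":
    image = torch.randn(1, 3, 64, 64)       # dummy camera frame
    touch_pred = VisionToTouch()(image)     # "feel by seeing"
    logits = TouchToObject()(touch_pred)    # "see by touching"
    print(touch_pred.shape, logits.shape)   # (1, 1, 32, 32) and (1, 10)

In practice such models would be trained on paired image and tactile recordings, such as those collected with a camera and a GelSight sensor, rather than the random tensors used in this toy example.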