• Robots equipped with high-resolution sensors and controlled by a new algorithm can be guided with simple tactile gestures, much as we use smartphones.
• Another algorithm developed by the University of Hertfordshire enables robots to independently adapt to their environments and to make decisions without requiring specific training.
The drive to build robots that are better adapted to their environments and capable of developing intuitive reflexes faces a number of challenges. “There is still a lot of work to be done on perception sensors to make it possible to cover an entire robot with an electronic skin that provides it with a sense of touch,” explains Maged Iskandar, a researcher at the German Aerospace Centre’s Institute for Robotics and Mechatronics. “Electronic skins are fragile, and vulnerable to impacts and high levels of mechanical force. They also require a network of wires throughout the robot.” However, there is a real need to overcome such challenges, because improved robotic sensitivity will facilitate more natural and intuitive human-machine interaction “that will enable humans to instruct robots simply by touching them.”
We can guide the robot by touching it with a finger in much the same way that we interact with a smartphone or tablet
More intuitive human-machine guidance
Maged Iskandar and his team have developed an algorithm which, when deployed in tandem with high-resolution sensors, enables a robotic arm to sense tactile pressure on its surface. “The sensors locate points of contact and estimate the force applied, which the algorithm then interprets, making it easier to train the robot. We can thus guide the robot by touching it with a finger in much the same way that we interact with a smartphone or tablet.” Building on this research, Iskandar aims to extend tactile sensitivity across the entire surface of robots in order to improve human-machine collaboration in factories: “If you want the robot to perform a specific task, you can simply tell it by touching it rather than by programming it, which is a lot more intuitive.”
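To make the mechanism concrete, here is a minimal, hypothetical sketch of how a localized touch might be turned into a guidance command. It is not DLR’s actual algorithm: the names, gains, and the simple admittance-style mapping (the arm yields in the direction it is pushed) are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch: turn an estimated contact force on the arm's surface
# into a Cartesian velocity command, in the spirit of admittance control.
# All constants below are invented for the illustration.

GAIN = 0.02            # m/s per newton: how compliantly the arm yields
FORCE_THRESHOLD = 0.5  # N: ignore sensor noise below this level
MAX_SPEED = 0.1        # m/s: safety cap on the commanded velocity

def guidance_command(contact_force: np.ndarray) -> np.ndarray:
    """Map an estimated contact force vector (N) to an end-effector
    velocity command (m/s): the arm 'gives way' along the push direction."""
    magnitude = np.linalg.norm(contact_force)
    if magnitude < FORCE_THRESHOLD:
        return np.zeros(3)           # treat light touches as noise
    velocity = GAIN * contact_force  # yield proportionally to the push
    speed = np.linalg.norm(velocity)
    if speed > MAX_SPEED:
        velocity *= MAX_SPEED / speed  # clamp for safety
    return velocity

# Example: a 4 N push along +x nudges the arm in that direction.
print(guidance_command(np.array([4.0, 0.0, 0.0])))  # -> [0.08 0.   0.  ]
```

The threshold and the speed cap reflect two practical concerns the researchers allude to: filtering out spurious sensor readings, and keeping a human-guided robot safe to touch.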
Autonomous robots that can independently adapt to their environments
At the University of Hertfordshire in the United Kingdom, artificial intelligence professor Daniel Polani has developed a chaos-theory-inspired algorithm that enables robots to function in a more intuitive manner and make decisions based on the dynamics of their environments. “We talk about intuition, but the technical term is autonomisation,” explains the researcher. “It implies that robots are aware that their actions have an impact on their environments, and that they are able to change in response.” When they decide on their own, robots can, for example, right themselves after falling over. “The idea is that they should be able to adapt to their environments and determine how they should be empowered without any training, because we don’t want to build robots with hundreds of thousands of programmes to deal with every possible situation.” To develop an algorithm that allows robots to recover independently after such incidents, the researchers studied robotic motivation models that mimic decision-making processes in humans and animals and are not driven by specific reward signals.
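Polani’s group is known for the notion of empowerment, an intrinsic, reward-free drive to keep future options open. The article does not spell out the algorithm, so the following is only a toy illustration of that general idea under invented assumptions: an agent in a made-up grid world picks the action from which the most distinct future states remain reachable, with no external reward signal.

```python
# Toy, reward-free action selection in the spirit of empowerment:
# prefer the action that keeps the most future options open.
# The grid world, horizon, and action set are invented for this sketch.

ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # up, down, right, left
GRID = 7                                       # 7x7 grid bounded by walls

def step(state, action):
    """Deterministic transition: move unless blocked by the border."""
    x, y = state[0] + action[0], state[1] + action[1]
    return (min(max(x, 0), GRID - 1), min(max(y, 0), GRID - 1))

def reachable_states(state, horizon):
    """All distinct states reachable within `horizon` steps."""
    frontier = {state}
    for _ in range(horizon):
        frontier |= {step(s, a) for s in frontier for a in ACTIONS}
    return frontier

def choose_action(state, horizon=3):
    """Pick the action keeping the most future options open (no reward)."""
    return max(ACTIONS,
               key=lambda a: len(reachable_states(step(state, a), horizon)))

# From a corner, the agent moves toward the open interior of the grid,
# because that is where its future options are richest.
print(choose_action((0, 0)))  # -> (0, 1): away from the corner, not the wall
```

In the actual research line, this option-keeping measure is information-theoretic and defined over the robot’s sensorimotor channels, but the toy version conveys why no task-specific reward signal or pre-programmed situation handling is needed.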
Researchers will need to overcome many challenges to enable robots to understand their environments
For Daniel Polani, “the algorithm can be further improved, given that it only works in smooth, differentiable systems,” that is to say, in soft or smooth environments. “We are currently at the simulation stage. What we need to do next is to implement the algorithm in robots. Robots have an impact on the environments in which they operate, and they need to take the effect of this impact into account. It is an approach to robotics that is still in its infancy.”
The researcher also believes that a highly evolved understanding of language will play a key role in the development of robots. “The issue of deep understanding, which goes beyond the abilities of large language models (LLMs) that merely appear to understand what they are told, is a major challenge for the improvement of robotic systems. These machines need to understand the context of real environments.”