Robotics

Machine learning for intuitive robots that are aware of their environment

• Researchers are working to improve the tactile sensitivity of robots to facilitate more intuitive human-machine interaction. It may soon be possible to instruct robots simply by touching them.
• Robots equipped with high-resolution sensors and controlled by a new algorithm can be guided like smartphones with simple tactile gestures.
• Another algorithm developed by the University of Hertfordshire enables robots to independently adapt to their environments and to make decisions without requiring specific training.
Researchers have developed a kirigami-inspired mechanical computer with no electronic components.

IoT and soft robotics: is mechanical computing making a comeback?

Soft Robotics Lab – ETH Zürich (lab head: Prof. Robert Katzschmann, not in the picture). From left to right: Jose Greminger (Master's student), Pablo Paniagua (Master's student), Jakob Schreiner (visiting PhD student), Aiste Balciunaite (PhD student), Miriam Filippi (Established researcher), and Asia Badolato (PhD student).

“Biohybrid robotics needs an ethical compass”


When will we see living robots? The challenges facing biohybrid robotics


Artificial pollination: robotic solutions that aim to supplement the work of bees


Biomimetics: can robots outperform animals?


Autonomous cars: the five levels of autonomy
