H2020 Project CENTAURO
CENTAURO – Robust Mobility and Dexterous Manipulation in Disaster Response by Fullbody Telepresence in a Centaur-like Robot
Summary
Disaster scenarios such as the Fukushima nuclear accident clearly showed that the capabilities of existing disaster response robots were insufficient to provide the support rescue workers needed.
The CENTAURO project aimed to develop a human-robot symbiotic system in which a human operator is telepresent with their whole body in a centaur-like robot that is capable of robust locomotion and dexterous manipulation in the rough terrain and austere conditions characteristic of disaster sites. The Centauro robot consists of a four-legged base and an anthropomorphic upper body and is driven by lightweight, compliant actuators. It is able to navigate affected man-made environments, including building interiors and staircases that are cluttered with debris and partially collapsed.
The CENTAURO system is capable of using unmodified human tools to solve complex bimanual manipulation tasks, such as connecting a hose or opening a valve, in order to mitigate the situation. A human operator controls the robot intuitively through a full-body telepresence suit that provides visual, auditory, and upper-body haptic feedback. A rich sensor suite provides the necessary situational awareness. Robot percepts and suggested actions are displayed to the operator using augmented reality techniques.
For routine manipulation and navigation tasks, autonomous robot skills have been developed. These allow the operator to be taken partially out of the control loop, which is necessary to cope with communication latencies and bandwidth limitations and to reduce operator workload. A series of increasingly complex tests with corresponding evaluation criteria was devised from end-user requirements to systematically benchmark the capabilities of the developed disaster response system.
Centauro robot at the Evaluation Camp @ Kerntechnische Hilfsdienst GmbH, Nov. 2017
Acknowledgement
This project has received funding from the European Union's Horizon 2020 Programme under Grant Agreement 644839 (ICT-23-2014 Robotics).