A-DREAMS advanced robotics ground control station

ESA’s work on a ground control station for space robotics operations has recently been extended – telemanipulation and visual perception abilities have been added to the existing teleoperation capabilities.
Because of the high cost of sending humans into space, the use of robotic systems is considered an attractive alternative to, or complement of, human activity in future space missions. Over the last 15 years, ESA has devoted considerable effort to space robotics. Concepts, methods and tools have been developed to cover all areas related to high-level robot programming and interaction with space robotics activities. The Distributed Robotics & Automation Environment for Advanced Missions Specification and Supervision (DREAMS) system integrates these results into a ground control station that allows a human operator to teleoperate robots at a remote site – that is, to request the on-board execution of operations while supervising their evolution.
While the high-level robot programming and supervision concept is well suited to performing space robotic activities in a well-structured environment, some situations require a lower level of interaction with the robot, for example when:
  • an anomaly requires rapid intervention to avoid having to return to the preparation phase
  • a repair activity is required on a complex system that is not necessarily well structured or known in advance – as, for example, in the case of Eurobot working on the ISS or during robotic geostationary satellite servicing operations
Therefore, the DREAMS system has been extended to support a telemanipulation operations mode, resulting in the Advanced DREAMS (A-DREAMS) system.

In the telemanipulation mode, the operator extends his manipulation and sensing capability to a remote location using a master device that remotely controls a slave robot located at the operations site.
Telemanipulation subsystem
The telemanipulation subsystem implements bilateral telemanipulation control. It consists of:

  • a desktop haptic device, with its associated API, that tracks the movements of the user and conveys tactile sensations to the user through force feedback
  • the telemanipulation component that implements the following functions:
    • the master and slave robot control function, which computes the set-points sent to the slave and master robot controllers to implement bilateral telemanipulation control – user-defined gesture aids and virtual constraints are also taken into account
    • the master and slave robot interface functions, which connect the robot control functions to the targeted master and slave robot controllers for the exchange of the necessary telecommands and telemetry
    • the gesture-aid computation, whose output is fed as an additional input to the slave robot control function to guide the operator in applying specific movements to the slave robot
  • the configuration and display function, which provides the operator with the means to
    • set telemanipulation configuration parameters – some of them updated on-line
    • visualise, on-line, critical telemanipulation-dedicated telemetry
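The core of the control function above can be pictured as one loop iteration that couples the two channels of bilateral control: master motion drives the slave, and slave contact force is reflected back to the operator. The sketch below is a deliberately minimal illustration of that idea, including a simple virtual constraint; the function names, gains and scalar signals are assumptions for illustration, not the actual A-DREAMS control law.

```python
# Toy sketch of one bilateral telemanipulation control step (illustrative
# only; the real A-DREAMS algorithm and interfaces are not reproduced here).

def apply_virtual_constraint(setpoint, lo, hi):
    """Example virtual constraint: clamp the slave set-point to a corridor."""
    return max(lo, min(hi, setpoint))

def bilateral_step(master_pos, slave_force, scale=1.0, force_gain=0.5,
                   corridor=(-10.0, 10.0)):
    """Compute one pair of set-points from the latest telemetry.

    master_pos  - measured master device position (telemetry)
    slave_force - contact force measured at the slave side (telemetry)
    Returns (slave_setpoint, master_force_cmd) to send as telecommands.
    """
    # Position-forward channel: scaled master motion becomes the slave set-point.
    slave_setpoint = apply_virtual_constraint(scale * master_pos, *corridor)
    # Force-feedback channel: reflected contact force is felt by the operator.
    master_force_cmd = -force_gain * slave_force
    return slave_setpoint, master_force_cmd
```

In a real system this step would run inside the real-time executive at a fixed rate, with the interface functions translating the returned values into robot-specific telecommands.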

The telemanipulation subsystem has a modular design, allowing it to be enhanced through the integration of other types of haptic device and control algorithm. This is achieved by developing and deploying the real-time executive separately from the telemanipulation tasks and actions. In this way, integrating a new master and/or slave robot, or a new algorithm, is limited to the design of a new action.
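The "new robot or algorithm = new action" idea can be sketched as a plug-in interface: the executive depends only on an abstract action, so extensions never touch the executive itself. The class and method names below are hypothetical, chosen only to illustrate the separation described above.

```python
# Hypothetical sketch of the executive/action split; not the A-DREAMS API.
from abc import ABC, abstractmethod

class TelemanipulationAction(ABC):
    """Interface the real-time executive expects from every action."""

    @abstractmethod
    def step(self, telemetry: dict) -> dict:
        """Consume one telemetry sample, return a dict of telecommands."""

class PassThroughAction(TelemanipulationAction):
    """Trivial example action: forward the master position unchanged."""

    def step(self, telemetry: dict) -> dict:
        return {"slave_setpoint": telemetry["master_pos"]}

def executive_cycle(action: TelemanipulationAction, telemetry: dict) -> dict:
    # The executive calls only the abstract interface, so a new robot or
    # control algorithm is integrated by writing one new action subclass.
    return action.step(telemetry)
```

A new haptic device or control scheme would then be one more `TelemanipulationAction` subclass, deployed without rebuilding the executive.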
Perception subsystem
When operating in a structured environment that is only partially known and that, in addition, evolves either independently or as a result of the robot's actions, the ability to reconstruct the real-world environment in three dimensions is mandatory for the safe and successful completion of robotic operations. The perception subsystem consists of an imaging head (a pair of cameras) and dedicated software that constructs a 3D elevation map from a pair of images.
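The principle behind such stereo-based mapping can be outlined in two steps: matched pixels between the two camera images yield depth via the pinhole stereo relation Z = f·B/d, and the resulting 3D points are binned into a grid that keeps one height per cell. The following is a toy illustration of those two steps under that standard model, not the actual A-DREAMS perception software.

```python
# Schematic stereo-to-elevation-map pipeline (illustrative toy model).

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo model: depth Z = f * B / d for one matched pixel pair.

    disparity_px - horizontal pixel offset between the two images
    focal_px     - camera focal length, in pixels
    baseline_m   - distance between the two cameras of the imaging head
    """
    return focal_px * baseline_m / disparity_px

def elevation_map(points, cell=0.1):
    """Grid the (x, y, z) point cloud, keeping the highest z in each cell."""
    grid = {}
    for x, y, z in points:
        key = (round(x / cell), round(y / cell))
        if key not in grid or z > grid[key]:
            grid[key] = z
    return grid
```

Real implementations must additionally handle camera calibration, dense matching and outlier rejection, which this sketch omits.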
Last update: 3 February 2011

Related article

 •  The EUROBOT Testbed (http://www.esa.int/TEC/Robotics/SEMUC68LURE_0.html)