Overview
 
Mechatronic devices for Human-Optimized Haptic Interaction
 
EXARM Exoskeleton
The EXARM exoskeleton is a master device with an ergonomic kinematic structure and 16 degrees of freedom. Each joint is instrumented with an angular sensor, so the system can follow the motion of the right human arm when worn. The EXARM allows nearly unconstrained natural motion because its joints do not need to be aligned with the human arm joints when it is put on. The EXARM was the first ergonomic exoskeleton prototype and is still used for research on kinematics and mapping strategies for the control of a variety of robotic systems. It is simple to use, since it contains no motors for force feedback.
SAM Exoskeleton
The SAM (Sensory Arm Master) exoskeleton master device is our benchmark design prototype, featuring 7 joints that require alignment with the human arm joints when worn. The SAM is a fully actuated force-feedback haptic device, featuring position and torque sensors on each joint. The actuators are brushed DC motors with planetary gears and capstan reducers. The device is part of a bilateral control setup that consists of the SAM, its bilateral controller and a virtual-world simulation with collision detection and contact dynamics. The SAM has been co-developed with an industrial consortium consisting of the University of Brussels, SAS and Micromega.
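Contact dynamics in such a virtual-world simulation are often computed with a penalty-based model. The following is a minimal sketch of that idea for a single contact point; the stiffness and damping values are purely illustrative assumptions, not the parameters of the actual SAM simulation.

```python
def contact_force(penetration_depth, penetration_rate,
                  stiffness=2000.0, damping=5.0):
    """Penalty-based contact force for one contact point (sketch).

    penetration_depth  [m]    how far the virtual proxy has entered the surface
    penetration_rate   [m/s]  rate of change of the penetration depth
    stiffness          [N/m]  virtual wall stiffness (illustrative value)
    damping            [Ns/m] virtual wall damping (illustrative value)
    """
    if penetration_depth <= 0.0:
        return 0.0                      # no contact, no force
    force = stiffness * penetration_depth + damping * penetration_rate
    return max(force, 0.0)              # never pull the proxy into the surface

# Example: 2 mm penetration, approaching the surface at 1 cm/s
print(contact_force(0.002, 0.01))       # -> 4.05 N
```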
X-Arm-2 Exoskeleton
The X-Arm-2 exoskeleton master device is an ergonomic, fully actuated haptic master device for force reflection to the operator's right arm. The haptic exoskeleton features 14 joints equipped with joint position sensors. Of the 14 joints, 8 are additionally equipped with custom-built high-resolution torque sensors, motor-gear units and input/output position sensors. The actuators feature innovative combinations of brushed DC motors with planetary gears and capstan reducers, and for some joints include Bowden cable systems. All actuator drive-trains, including those with Bowden cables, have been designed and customized to allow highly transparent force reflection. All joints of the exoskeleton feature a high dynamic range and allow crisp force feedback through a 5 kHz joint-torque control rate and low sensor noise. The X-Arm-2 is used for our research on under-actuated haptic devices and is part of a bilateral control work cell consisting of the exoskeleton, its controller and a 7-degrees-of-freedom KUKA lightweight manipulator (LWR).
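The 5 kHz joint-torque control rate refers to an inner loop that closes the measured joint torque against a commanded torque. The sketch below shows a generic PI torque loop of this kind; the gains are illustrative, and read_joint_torque / write_motor_current are hypothetical placeholders for the (non-public) sensor and driver interfaces.

```python
KP, KI = 5.0, 50.0          # illustrative torque-loop gains
DT = 1.0 / 5000.0           # 5 kHz control period

def torque_control_step(tau_desired, tau_measured, integ):
    """One cycle of a PI joint-torque controller (sketch)."""
    error = tau_desired - tau_measured
    integ += error * DT
    motor_cmd = KP * error + KI * integ
    return motor_cmd, integ

# On the real-time target the loop below would run every 200 microseconds:
# integ = 0.0
# while True:
#     tau = read_joint_torque()                       # hypothetical sensor read
#     cmd, integ = torque_control_step(tau_ref, tau, integ)
#     write_motor_current(cmd)                        # hypothetical driver write
#     sleep_until_next_period(DT)
```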
Cybergrasp™ Unleashed
The Cybergrasp™ Unleashed is a customized version of the commercially available Cyberglove™/Cybergrasp™ system provided by Cyberglove Systems Inc. (http://www.cyberglovesystems.com). Our customized version features a higher control bandwidth of up to 2 kHz, improved computing capabilities (including seamless integration with Matlab/Simulink) and a fast network interface. Moreover, the system is integrated into one small and compact case that includes a PC, a data glove interface, motor drivers and all interface electronics. As such, the Cybergrasp™ Unleashed can be used flexibly in a variety of bilateral control setups. Currently we use the system to control a DLR HIT robotic hand with force reflection, but through its simple interface it can also be operated easily with other grippers or with virtual scenes featuring collision detection and contact dynamics.
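A fast network interface of this kind typically exchanges small fixed-size datagrams at the control rate. The sketch below illustrates the idea only; the packet layout, number of channels, address and port are assumptions made for illustration, not the actual interface definition.

```python
import socket
import struct

# Hypothetical packet layout: 20 float32 joint angles per glove sample.
TARGET = ("192.168.1.10", 5005)     # assumed address/port
PACKET_FMT = "<20f"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_glove_sample(joint_angles):
    """Send one glove sample as a single UDP datagram (small enough for 2 kHz)."""
    assert len(joint_angles) == 20
    sock.sendto(struct.pack(PACKET_FMT, *joint_angles), TARGET)
```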
Generic Hand Grasp Master
Currently we are building a novel haptic interface for the human hand and fingers that is targeted at allowing simple and intuitive control of a variety of robotic hands with force feedback. Details of this device will be provided once a patent filing process is finalized.
 
 
Human-optimized human-robot interaction
 
Underactuation in Haptics
We perform research on how haptic devices can benefit from underactuation concepts to increase device power density and specific power. Since an exoskeleton forms parallel kinematic loops with the human arm when worn, the loop properties can be exploited to reduce the mass of the device by removing certain actuators. Research on underactuation in haptics entails simulation of the dynamic interaction between wearable haptic devices and the human body segments.
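In its simplest form, such an interaction simulation couples one device joint to one limb segment through a compliant attachment (the cuff). The sketch below shows that structure; all inertia, damping and stiffness values are illustrative assumptions.

```python
import numpy as np

J_dev, J_arm = 0.02, 0.15        # inertias [kg m^2] (illustrative)
b_dev, b_arm = 0.05, 0.30        # viscous damping [Nms/rad]
k_cuff, b_cuff = 300.0, 2.0      # attachment stiffness/damping

def coupled_dynamics(state, tau_actuator, tau_human):
    """State derivative of the coupled device/limb system.

    state = [q_dev, qd_dev, q_arm, qd_arm]
    """
    q_d, qd_d, q_a, qd_a = state
    tau_cuff = k_cuff * (q_a - q_d) + b_cuff * (qd_a - qd_d)
    qdd_d = (tau_actuator + tau_cuff - b_dev * qd_d) / J_dev
    qdd_a = (tau_human - tau_cuff - b_arm * qd_a) / J_arm
    return np.array([qd_d, qdd_d, qd_a, qdd_a])

# Forward-Euler integration of one second (no actuator torque, constant
# human effort), just to illustrate the simulation structure.
dt, state = 1e-4, np.zeros(4)
for _ in range(10000):
    state = state + dt * coupled_dynamics(state, 0.0, 1.0)
print(state)
```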
Human Modelling & Human Centric Design
Human limb and segment modelling is important to better understand the requirements for haptic device design. This is especially applicable to the design of human-centric haptic devices, which aim to provide good workspace coverage, smooth interaction, and robustness to variability between operators. Moreover, modelling of the human motion apparatus allows novel 'ergonomic' haptic device concepts to be verified in simulation, thereby lowering development cost. Modelling also deepens the understanding of the fundamentals of human movement, human biomechanics and human/device interaction. Modelling activities concentrate on the upper limb and the human hand.
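As a small example of how such models feed device design, the sketch below samples a planar two-link upper-limb model (shoulder and elbow) to obtain the reachable hand workspace, which could then be compared with the workspace of a candidate device. Segment lengths and joint ranges are illustrative assumptions, not measured anthropometric data.

```python
import numpy as np

L_UPPER, L_FORE = 0.30, 0.25                 # segment lengths [m] (assumed)
SHOULDER_RANGE = np.deg2rad([-40.0, 130.0])  # flexion range (assumed)
ELBOW_RANGE    = np.deg2rad([0.0, 145.0])    # flexion range (assumed)

def hand_position(q_shoulder, q_elbow):
    """Planar forward kinematics of the two-link arm."""
    x = L_UPPER * np.cos(q_shoulder) + L_FORE * np.cos(q_shoulder + q_elbow)
    y = L_UPPER * np.sin(q_shoulder) + L_FORE * np.sin(q_shoulder + q_elbow)
    return np.array([x, y])

# Sample the joint ranges to get a point cloud of reachable hand positions.
qs = np.linspace(*SHOULDER_RANGE, 50)
qe = np.linspace(*ELBOW_RANGE, 50)
workspace = np.array([hand_position(a, b) for a in qs for b in qe])
print(workspace.shape)    # (2500, 2)
```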
Teleoperation 3D Vision
In order to maximise the benefit of 3D remote vision, a geometrically correct image needs to be provided to the human operator from the remote work site. Both the camera system and the display system need to be optimized for low depth distortion and scaling. In this way, intuitive depth perception enables good situational awareness and accurate interpretation of the situation at the remote work site. Research focuses on techniques for realistic display in a variety of environments (screens, displays, HMDs) and on the acceptance and performance analysis of human operators while performing remote operations.
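The geometry behind "low depth distortion and scaling" can be illustrated with the basic stereo relations below: depth reconstructed from disparity, and the first-order depth scaling that appears when the viewing baseline does not match the capture baseline. The focal length and baselines are illustrative values, and the scaling formula is a simplification for illustration.

```python
f_px     = 800.0      # camera focal length in pixels (illustrative)
baseline = 0.065      # camera separation [m], roughly the human interocular distance

def depth_from_disparity(disparity_px):
    """Ideal depth of a point seen with a given horizontal disparity."""
    return f_px * baseline / disparity_px

def perceived_depth(true_depth, capture_baseline, viewer_baseline):
    """First-order depth scaling when the display is viewed with a different
    effective baseline than the one the scene was captured with."""
    return true_depth * viewer_baseline / capture_baseline

z = depth_from_disparity(20.0)                 # point at 2.6 m
print(z, perceived_depth(z, 0.065, 0.060))     # the mismatch compresses depth
```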
Configuration Space Control & Kinematics
We enable intuitive remote operations in 6D space with exoskeletons and redundant robot manipulators. In this way, the advantages of redundant manipulators can be paired with the high intuitiveness of exoskeleton control interfaces. Exoskeletons enable robust real-time control of manipulator pose and configuration. We research tracking algorithms and the automatic optimization of workcell configurations for smooth real-time tracking of human operators, avoidance of singularities and joint limits, and continuous matching of the robot configuration to the human arm configuration. In this way, seamless trajectories can be tracked in real time by a redundant manipulator, without interruptions. We furthermore investigate which aspects of remote control are important for successful task execution in a variety of workcell setups.
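One standard way to combine task-space tracking with matching the human arm configuration on a redundant manipulator is a damped least-squares inverse with a null-space term, which also keeps the solution well-behaved near singularities. The sketch below shows that general scheme; the gains, damping value and the random Jacobian in the usage example are placeholders, not our actual controller.

```python
import numpy as np

def redundant_ik_step(J, twist_error, q, q_human, damping=0.05, k_null=1.0):
    """One velocity-level IK step for a redundant manipulator (sketch).

    J           6 x n task Jacobian at the current configuration
    twist_error desired 6D task-space velocity (pose error)
    q, q_human  current robot joint angles and matched human arm angles
    """
    n = J.shape[1]
    JT = J.T
    # Damped least-squares inverse: bounded joint velocities near singularities.
    J_pinv = JT @ np.linalg.inv(J @ JT + (damping ** 2) * np.eye(6))
    primary = J_pinv @ twist_error
    # Null-space projection: pull the redundant degree(s) of freedom toward the
    # human arm configuration without disturbing the task motion.
    null_proj = np.eye(n) - J_pinv @ J
    secondary = null_proj @ (k_null * (q_human - q))
    return primary + secondary

# Usage with a 7-DOF arm (Jacobian values here are random placeholders):
J = np.random.randn(6, 7)
dq = redundant_ik_step(J, np.zeros(6), np.zeros(7), np.full(7, 0.1))
print(dq.shape)   # (7,)
```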

 
 
Space project demonstrators
 
METERON - NASA R2A Telemanipulation
In order to work towards standardizing communications interfaces between a variety of robotic systems in Europe and the U.S., and in order to collaborate on the topic of telerobotics, we have started a collaboration with a NASA team consisting of researchers from NASA JSC, ARC and JPL. By means of a customized communications middleware (RAPID) we have implemented the remote control of Robonaut from the ESA exoskeleton. First experiments between ESTEC (Noordwijk) and NASA JSC (Houston) demonstrated unilateral control. With the new communications middleware, various robotic systems of ESA and NASA can be integrated easily.
METERON - Low Bandwidth Teleoperation Vision
If good situational awareness of a remote site is required through a narrow-bandwidth link, dedicated technologies are required to transmit real-time video streams in good quality. In METERON, only 96 kbps are available for real-time video streaming from ground to the International Space Station. This requires the use of augmented video technology, in which overlay images generated locally on the receiving side are displayed on top of highly compressed video. These overlays allow the human remote operator to better perceive the geometry and situation at the remote site. Our tools and viewers support such overlay images for known environments. Research focuses on how to optimally generate virtual model information from unknown environments.
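A quick budget calculation makes the 96 kbps constraint concrete; the frame rates below are illustrative, not the METERON operating points.

```python
LINK_KBPS = 96
for fps in (5, 10, 15):
    bits_per_frame = LINK_KBPS * 1000 / fps
    print(f"{fps:2d} fps -> {bits_per_frame / 8 / 1024:.1f} KiB per frame")
# Even at 5 fps each frame must fit in roughly 2.3 KiB, which is why heavy
# compression is combined with locally rendered overlay graphics rather than
# relying on transmitted image quality alone.
```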
METERON – 3D Augmented Reality
In this demonstration setup, the X-Arm-2 exoskeleton controls a KUKA lightweight robot interacting with a task board. 3D augmented reality is used to augment the remote operation setup with critical system information. For instance, contact force and torque information from a robot interacting with its environment can be displayed over the real work-cell to increase operator awareness of the magnitudes of the forces being exerted. This information feedback can improve the robustness and safety of haptic teleoperation. 3D augmented reality is a tool to enhance operator performance in remote task execution.
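The overlay idea can be sketched as mapping a measured contact force into an arrow anchored at the contact point in the scene. The scaling factor below is an assumption for illustration; a real implementation would project the arrow through the calibrated camera model of the work-cell view.

```python
import numpy as np

NEWTON_TO_METRE = 0.01          # draw 1 N as 1 cm in the scene (assumed scaling)

def force_arrow(contact_point_3d, force_vector_N):
    """Return start/end points (scene coordinates) of a force overlay arrow."""
    start = np.asarray(contact_point_3d, dtype=float)
    end = start + NEWTON_TO_METRE * np.asarray(force_vector_N, dtype=float)
    return start, end

# 25 N pressing down at a point on the task board:
print(force_arrow([0.5, 0.0, 0.2], [0.0, 0.0, -25.0]))
```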

 
 
Tools & mechatronic haptic devices
 
SPAN Toolchain
The SPAN (Space Portable Applications Network) Toolchain is a collection of tools for fast and efficient development of robotics applications. It consists of a viewer tool (SPANviewer), a communications library (SPANcomm) and a set of mathematics functions (SPANtoolbox); research is ongoing to complement the SPAN with an extensive and fast set of dynamics libraries (SPANdyn). The ESA SPAN Toolkit also contains several real-time targets designed to allow agile code development with a range of real-time operating systems, including Xenomai Linux, RT-PREEMPT Linux and RTEMS. The ESA SPAN Targets are all integrated into a high-level code generation framework to accelerate the development process for robotics applications. The targets include support for several communications interfaces, such as DDS, EtherCAT and DTN.
RoboWebUI
The RoboWebUI (Robotics Web User Interface) is a cross-platform web toolkit for monitoring data from, and sending data and commands to, a robotic device running on a real-time target. The RoboWebUI makes use of current mobile application development frameworks (PhoneGap) to allow quick and efficient development of a variety of web-based user interfaces for robotics using HTML, CSS and JavaScript. On the real-time target side, the RoboWebUI relies on automatic code generation to produce the interfaces to the web server. Both the RoboWebUI web host and the real-time thread are flexible and modular and can therefore run on a variety of target platforms, such as Linux or Windows.
1 DOF Setup
The 1 DOF (Degree of Freedom) Setup is a mechatronic test setup for analysing various controllers for bilateral control applications. The setup consists of an embedded computing platform and two single-joint units (master and slave) that are each equipped with a brushless DC motor, encoder, potentiometer and load-side torque sensor. The 1 DOF setup is used by students and researchers for research on control algorithms for bilateral control. The setup can be adapted to host a range of motors and motor controllers and to perform sensor and other hardware identification experiments.
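As an example of the kind of controller studied on this setup, the sketch below shows a basic position-force bilateral law: the slave tracks the master position, and the load-side torque measured at the slave is reflected back to the master. The gains are illustrative assumptions, not tuned values from the setup.

```python
KP_SLAVE, KD_SLAVE = 50.0, 0.5    # slave position-tracking gains (illustrative)
K_REFLECT = 1.0                   # force-reflection scaling

def bilateral_step(q_master, qd_master, q_slave, qd_slave, tau_slave_sensor):
    """Compute master and slave motor torques for one control cycle (sketch)."""
    # Slave follows the master position with a PD law.
    tau_slave = KP_SLAVE * (q_master - q_slave) + KD_SLAVE * (qd_master - qd_slave)
    # Master reflects the torque measured at the slave's load side.
    tau_master = -K_REFLECT * tau_slave_sensor
    return tau_master, tau_slave
```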

 
 
Last update: 12 October 2012