Neuromorphic analog electronics can process visual cues the way neuronal systems do.
The study was performed by Tobias Seidl, Christos Ampatzis and Kevin de Groote (Advanced Concepts Team) in cooperation with Vasco Medici, Giacomo Indiveri and Stefan Ammann (University of Zürich and ETH Zürich, Switzerland), Garrick Orchard (Johns Hopkins University, US) and Steven N. Fry (Rhine-Waal University of Applied Sciences, Germany).
The goals of the study were to design and simulate a neuromorphic analog VLSI optic flow sensor and use its output signals to land a space vehicle by means of insect-inspired biomimetic controllers.
We designed a neuromorphic vision sensor and simulated the operation of its circuits at three levels, from (low) transistor level to (high) behavioural level, in realistic planetary landing scenarios. At the lowest level, we carried out SPICE simulations to characterise the response properties of the sensor's individual circuit blocks. At the intermediate level, we used transistor equations to derive the behaviour of each circuit block analytically and carried out full-chip simulations driven by realistic inputs from planetary landing scenarios. At the third level, we implemented a Simulink model of the sensor that reproduces its working principles.
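The text above does not detail the sensor's motion-detection circuit, but insect-inspired neuromorphic optic flow sensors are commonly built around Hassenstein-Reichardt elementary motion detectors (EMDs), which correlate each photoreceptor signal with a delayed copy of its neighbour's. A minimal discrete-time sketch of this principle (all names and constants are illustrative assumptions, not the study's design):

```python
import math

def emd_response(signal_a, signal_b, dt=1e-3, tau=0.02):
    """Discrete-time Hassenstein-Reichardt elementary motion detector.

    signal_a, signal_b: brightness samples from two neighbouring
    photoreceptors; tau is the time constant of the delay filter.
    The mean opponent output is positive for motion from a toward b.
    """
    alpha = dt / (tau + dt)           # first-order low-pass coefficient
    lp_a = lp_b = 0.0                 # delayed (low-pass filtered) copies
    total = 0.0
    for a, b in zip(signal_a, signal_b):
        total += lp_a * b - lp_b * a  # correlate-and-subtract (opponency)
        lp_a += alpha * (a - lp_a)
        lp_b += alpha * (b - lp_b)
    return total / len(signal_a)

# A 5 Hz grating sampled at two points a quarter-period apart;
# signal b lags signal a, i.e. the pattern moves from a toward b.
t = [i * 1e-3 for i in range(2000)]
a = [math.sin(2 * math.pi * 5 * ti) for ti in t]
b = [math.sin(2 * math.pi * 5 * ti - math.pi / 2) for ti in t]

print(emd_response(a, b) > 0)  # True: preferred direction
print(emd_response(b, a) < 0)  # True: anti-preferred direction
```

The sign of the output encodes direction and its magnitude grows with image speed, which is why arrays of such detectors yield an optic flow estimate without explicit speed or distance measurement.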
The encouraging results we obtained pushed us toward testing the sensing and control strategies on a robotic platform. We are now designing a hybrid reality/simulation framework in collaboration with the D'Andrea lab at the Institute for Dynamic Systems and Control, ETH Zürich. A real quadrotor (X3D, Ascending Technologies, Germany) will be flown and its position tracked. The visual environment and the sensory feedback will be simulated and used to steer the robot in real time.
Publications: Orchard, G., Bartolozzi, C. and Indiveri, G., “Applying neuromorphic vision sensors to planetary landing tasks”, IEEE Biomedical Circuits and Systems Conference, BioCAS, 2009.
See the Final report online
The study was performed by Tobias Seidl, Christos Ampatzis and Kevin de Groote (Advanced Concepts Team) in cooperation with Florent Valette, Franck Ruffier, Stéphane Viollet (Institut des sciences du mouvement, Marseille, France).
Flying insects master visual motion sensing, coping with highly parallel data at low energetic cost. Results from neurophysiological, behavioural, and biorobotic studies of insect flight control were used to attempt safe landing of a spacecraft on the Moon in a realistic simulation environment.
Autonomous landing on unknown extraterrestrial bodies requires fast but noise-tolerant motion processing to elicit correct steering commands. Flying insects master visual motion sensing, coping with highly parallel data at low energetic cost by employing dedicated motion processing circuits. Biomimetic optic flow-based control strategies for automatic landing were simulated in a realistic lunar environment. Visual information was provided by the PANGU software and used to regulate the optic flow (OF) generated during the landing phase of a two-degree-of-freedom spacecraft.
It was shown that automatic lunar landing is feasible using neuromorphic OF measurement, without any need to measure speed or altitude. In addition, the suggested control system is:
We have shown that the OF regulator holds the perceived OF close to a previously chosen set point by acting on the mean thruster force of the lander. As a result, the lander's ground speed and vertical speed decrease automatically throughout the landing phase thanks to the optic flow regulation. The biomimetic autopilot presented here thus allows the lander to reach the low gate at low forward and descent speeds.
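The regulation principle can be illustrated with a toy simulation. The sketch below (not the study's actual controller; mass, pitch, gain and set point are all assumed values) regulates the ventral optic flow omega = vx/h of a two-degree-of-freedom lander with a simple proportional law acting on the mean thrust; a fixed backward pitch makes the same thrust also brake the forward motion:

```python
import math

# Illustrative constants (assumptions, not the study's values):
g = 1.62          # lunar gravity, m/s^2
m = 300.0         # lander mass, kg
pitch = 0.05      # fixed backward pitch, rad: thrust also brakes forward motion
omega_set = 0.3   # optic-flow set point, rad/s
kp = 2000.0       # proportional gain, N per rad/s

h, vx, vz = 500.0, 50.0, -10.0   # altitude (m), ground speed, vertical speed (m/s)
dt = 0.05
for _ in range(int(120 / dt)):
    omega = vx / h                                  # ventral optic flow, rad/s
    F = max(0.0, m * g + kp * (omega - omega_set))  # mean thrust from OF error
    vz += (F * math.cos(pitch) / m - g) * dt
    vx = max(0.0, vx - F * math.sin(pitch) / m * dt)
    h += vz * dt
    if h <= 10.0:                                   # "low gate" reached
        break

print(f"h={h:.1f} m  vx={vx:.2f} m/s  vz={vz:.2f} m/s")
```

Because omega = vx/h diverges as altitude shrinks unless the speeds shrink with it, holding omega at the set point automatically couples the descent rate to the forward speed, which is the qualitative behaviour reported above.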
Publications: Valette, F., Ruffier, F., Viollet, S. and Seidl, T., submitted.
See the Final report online