Biomimetics
5 Jan 2020

Event-based vision in space

Moon-like landscape observed by a conventional and event camera

The human visual system is extremely successful at light perception and image processing. It allows the brain to react with millisecond latency to visual stimulation across a broad range of scene illumination intensities, while requiring a fraction of the power that a conventional camera would need. Close study of the eye's retina revealed that it plays a crucial role in achieving this efficiency, which prompted attempts to construct an artificial “silicon retina”. That work ultimately led to the development of event-based cameras.

Event-based cameras challenge the long-established paradigm in image sensors of obtaining visual information by recording frames at fixed intervals. Instead, images are recorded asynchronously: each pixel independently reports a change to its illumination. The number of events therefore depends not on a fixed clock frequency but on the changes happening in the viewed scene. This reduces energy consumption, performs real-time compression of the image data, and vastly improves the temporal resolution in the areas of the image that contain the most information.
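The per-pixel change detection described above can be emulated in software: each pixel keeps a reference log-intensity and emits a timestamped event whenever the accumulated change crosses a contrast threshold. Below is a minimal Python sketch of this idea; the function name and the threshold value are illustrative and do not reproduce the project's actual converter.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2):
    """Emulate an event camera from a sequence of intensity frames.

    Each pixel emits an event (t, x, y, polarity) whenever its
    log-intensity has changed by more than `threshold` since the last
    event fired at that pixel. Simplified sketch: events are stamped
    with the frame time rather than interpolated timestamps.
    """
    eps = 1e-3  # avoid log(0) on dark pixels
    ref = np.log(frames[0].astype(np.float64) + eps)  # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + eps)
        diff = log_i - ref
        # pixels whose accumulated change crossed the contrast threshold fire
        fired = np.abs(diff) >= threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((t, int(x), int(y), polarity))
        # reset the reference only at pixels that fired
        ref[fired] = log_i[fired]
    return events
```

Note that a static scene produces no events at all, which is the source of the data compression mentioned above: bandwidth is spent only where the scene changes.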

Project overview

There are many potential benefits to employing computer vision techniques for guidance during landing. Compared to many existing odometry instruments, silicon cameras are lightweight and versatile. These factors matter most when operating smaller spacecraft, where mass constraints are tightest. The problem with image processing is that it is often a power-hungry, computationally intensive process, which poses a challenge in resource-scarce contexts such as space exploration. The properties of event cameras offer a way of mitigating all of these constraints, provided that reliable processing pipelines are developed.

This project explored how event cameras could be integrated into the control system of a landing probe. Event data was synthetically generated from images rendered in PANGU (Planet and Asteroid Natural scene Generation Utility [2]), a specialized software package for rendering objects in space; the rendered images were then converted to events using a conversion algorithm developed for this project. (The converter itself is publicly available as an open-source package [3].) We demonstrated how the stream of events produced by such devices can be used to reconstruct the divergence of the optic flow field using sparse feature tracking, which in turn allows the estimation of time-to-contact, a quantity previously shown to be sufficient for guidance control during landing [4]. The project concluded with closed-loop simulations of vertical landings on lunar-like surfaces, which showed the feasibility of our approach.
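One simple way to recover the flow-field divergence from sparse feature tracks is to fit the isotropic scale factor that best maps the tracked feature positions at one instant onto their positions at the next; the relative expansion rate then approximates the divergence, and its reciprocal gives a time-to-contact estimate for a constant-speed vertical descent. The sketch below uses a least-squares scale fit and is an illustration of the principle, not necessarily the estimator used in the project.

```python
import numpy as np

def estimate_divergence(pts_prev, pts_curr, dt):
    """Estimate flow-field divergence from sparse feature tracks.

    Fits the isotropic scale factor s that best maps the previous
    feature positions onto the current ones (about their centroids,
    in a least-squares sense); the relative expansion rate
    (s - 1) / dt approximates the divergence of the flow field.
    """
    p = pts_prev - pts_prev.mean(axis=0)  # center both point sets
    q = pts_curr - pts_curr.mean(axis=0)
    # least-squares solution of q ≈ s * p:  s = Σ p·q / Σ p·p
    s = float(np.sum(p * q) / np.sum(p * p))
    return (s - 1.0) / dt

def time_to_contact(divergence):
    """For a constant-speed descent, TTC is the reciprocal of divergence."""
    return 1.0 / divergence
```

For example, features expanding by 10% over 0.1 s yield a divergence of about 1 s⁻¹, i.e. a time-to-contact of roughly one second. In practice the fit would be wrapped in an outlier-robust scheme, since individual event-based feature tracks are noisy.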

This proof-of-concept work aims to draw the attention of the space science community to the potential of event cameras and to open research questions that would help this technology mature and become integrated into future space systems.

References

  1. Gallego, Guillermo, Tobi Delbrück, Garrick Orchard, Chiara Bartolozzi, Brian Taba, Andrea Censi, Stefan Leutenegger, et al. 2019. “Event-Based Vision: A Survey.” CoRR abs/1904.08405.

  2. PANGU: https://pangu.software/

  3. event-based-vision: https://gitlab.com/EuropeanSpaceAgency/event-based-vision

  4. Izzo, D., and de Croon, G. 2012. “Landing with Time-to-Contact and Ventral Optic Flow Estimates.” Journal of Guidance, Control, and Dynamics 35 (4).

Outcome

Conference paper
Sikorski, O., Izzo, D., and Meoni, G. 2021. “Event-Based Spacecraft Landing Using Time-To-Contact.” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops: 1941-1950.
Conference paper
McLeod, S., Meoni, G., Izzo, D., Mergy, A., Liu, D., Latif, Y., Reid, I., and Chin, T.-J. 2022. “Globally Optimal Event-Based Divergence Estimation for Ventral Landing.” European Conference on Computer Vision (ECCV) Workshops, AI4Space.
Advanced Concepts Team