Event-based vision in space
The human visual system is remarkably successful at light perception and image processing. It allows the brain to react with millisecond latency to visual stimulation across a broad range of scene illumination intensities, while requiring a fraction of the power a conventional camera would need. Close study of the mammalian retina revealed that it plays a crucial role in achieving this efficiency; hence attempts have been made to construct an artificial “silicon retina”, which have recently led to the development of event-based cameras.
Event-based cameras challenge the long-established paradigm in image sensors of obtaining visual information by recording frames at fixed intervals. Instead, images are recorded asynchronously: each pixel independently reports changes to its own illumination. The number of events depends not on a fixed clock frequency but on the changes happening in the viewed scene. This reduces energy consumption, performs real-time compression of the image data, and vastly improves the temporal resolution in the areas of the image that contain the most information.
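The per-pixel change detection described above can be sketched in code. The snippet below is a minimal, illustrative simulation (the function name and threshold value are assumptions, not a real event-camera API): following the common convention in the event-vision literature, a pixel fires an event with polarity +1 or -1 whenever its log-intensity has changed by more than a contrast threshold since the pixel's last event.

```python
import numpy as np

def events_from_frames(frames, timestamps, threshold=0.2):
    """Illustrative sketch: approximate the events an event camera would
    emit between consecutive intensity frames. Each event is a tuple
    (t, x, y, polarity); polarity is +1 for a brightness increase and
    -1 for a decrease, fired when the log-intensity at a pixel changes
    by at least `threshold` since that pixel's last event."""
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)  # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_i - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)  # pixels that crossed the threshold
        for x, y in zip(xs, ys):
            pol = 1 if diff[y, x] > 0 else -1
            events.append((t, int(x), int(y), pol))
            log_ref[y, x] = log_i[y, x]  # reset the reference only where events fired
    return events

# A static scene produces no events; only the changed pixel fires.
f0 = np.full((4, 4), 0.5)
f1 = f0.copy()
f1[2, 1] = 1.0  # brighten a single pixel
evts = events_from_frames([f0, f1], [0.0, 0.001])
```

Note how the output size is driven entirely by scene change: the unchanged pixels contribute nothing, which is the source of the compression and power savings mentioned above.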
This project investigates possible applications of event-based cameras and related algorithms in a space context, such as spacecraft guidance and debris monitoring.