Artificial Intelligence
Earth System Sciences
28 Jan 2021

Spiking Neural Networks for onboard Artificial Intelligence

Figure a) shows an image from the UC Merced Land Use dataset [7] used to stimulate the convolutional Spiking Neural Network described in [9]. Figure b) shows the neurons of the first convolutional layer that have been activated in the first 7 timesteps. Figure c) shows the neurons of the first convolutional layer that have been activated in the first 11 timesteps.

In recent years, the capabilities demonstrated by Deep Neural Networks (DNNs) in many tasks have prompted researchers to investigate their applicability on board satellites. Potential benefits of onboard DNNs include prefiltering payload data to mitigate downlink bandwidth requirements, early detection of potential disasters, and content-based download of data [1].

However, the complexity and the number of DNN models exploitable onboard are restricted by the reduced energy budgets of satellites [2]. For this reason, research into energy-efficient algorithms is essential to guarantee acceptable trade-offs between model performance and computational intensity [1,3].

In this regard, a solution of potential interest is represented by Spiking Neural Networks (SNNs), the third generation of neural networks. Each neuron of an SNN layer accumulates its input current spikes over successive timesteps, increasing or decreasing its membrane voltage. When the latter passes a fixed threshold, the membrane voltage is reset and the neuron emits a spike. Since different neurons can fire at different timesteps, SNNs usually benefit from sparse computation, which can be exploited to reduce energy consumption by running these models on event-based neuromorphic processors. In this way, SNNs might offer better energy efficiency than their artificial counterparts, depending on their input data, information coding, neuron models, and hardware implementation [4,5,6].
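To make the neuron dynamics above concrete, the following minimal Python sketch simulates a single non-leaky integrate-and-fire neuron over discrete timesteps. The function name, the reset-to-zero rule, and the non-leaky model are illustrative assumptions, not the exact neuron models considered in this project.

    import numpy as np

    def simulate_if_neuron(input_spikes, weights, threshold=1.0):
        # input_spikes: (T, N) binary array of presynaptic spikes over T timesteps.
        # weights: (N,) synaptic weights (negative weights decrease the voltage).
        # Returns the (T,) output spike train of the neuron.
        num_timesteps = input_spikes.shape[0]
        voltage = 0.0                           # membrane voltage
        out_spikes = np.zeros(num_timesteps)
        for t in range(num_timesteps):
            # Accumulate the weighted input current of this timestep.
            voltage += np.dot(weights, input_spikes[t])
            if voltage >= threshold:            # threshold crossed:
                out_spikes[t] = 1.0             # emit a spike...
                voltage = 0.0                   # ...and reset the membrane voltage
        return out_spikes

Because only threshold crossings produce output events, downstream layers only need to process the timesteps in which spikes actually occur, which is the source of the sparsity mentioned above.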


Project overview

Despite the potential benefits in terms of energy efficiency, because of the reduced maturity of SNN training algorithms and the lower scalability of SNN models, research in the SNN field has mostly focused on identifying the most promising training methods and model architectures, remaining limited to simple benchmark datasets such as MNIST [4,5]. In view of that, the applicability of SNNs to space classification datasets, such as UC Merced Land Use [7] or EuroSAT [8], has yet to be demonstrated.

As part of this project, we are investigating the use of SNNs for scene classification problems. In particular, we would like to assess the possible drop in accuracy and the potential advantages in terms of energy efficiency and processing latency compared to standard Artificial Neural Networks. In our investigation, we are considering various SNN algorithms, featuring different neuron models, training methods, and information coding, to evaluate the different trade-offs.
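As an example of one information-coding choice, a static satellite image can be converted into spike trains by rate coding, where each pixel emits a spike at every timestep with probability proportional to its intensity. The sketch below is a hypothetical illustration of that idea; the function name and parameters are assumptions, and it is not necessarily the coding scheme used in [9] or adopted in this project.

    import numpy as np

    def rate_code_image(image, num_timesteps=20, seed=0):
        # image: 2-D array of pixel intensities normalised to [0, 1].
        # Returns a (num_timesteps, H, W) binary array of input spikes in which
        # brighter pixels spike more often across the simulation timesteps.
        rng = np.random.default_rng(seed)
        probs = np.clip(image, 0.0, 1.0)
        spikes = rng.random((num_timesteps,) + image.shape) < probs
        return spikes.astype(np.float32)

Under such a coding, the number of neurons that have fired grows with the number of timesteps considered, which is what Figures b) and c) illustrate for the first 7 and 11 timesteps, respectively.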


References:

  1. Furano, Gianluca, et al. "Towards the Use of Artificial Intelligence on the Edge in Space Systems: Challenges and Opportunities." IEEE Aerospace and Electronic Systems Magazine 35.12 (2020): 44-56.

  2. Pitsis, George, et al. "Efficient convolutional neural network weight compression for space data classification on multi-fpga platforms." ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019.

  3. Giuffrida, Gianluca, et al. "CloudScout: A deep neural network for on-board cloud detection on hyperspectral images." Remote Sensing 12.14 (2020): 2205.

  4. Kheradpisheh, Saeed Reza, and Timothée Masquelier. "S4NN: temporal backpropagation for spiking neural networks with one spike per neuron." arXiv preprint arXiv:1910.09495 (2019).

  5. Hunsberger, Eric, and Chris Eliasmith. "Training spiking deep networks for neuromorphic hardware." arXiv preprint arXiv:1611.05141 (2016).

  6. Bouvier, Maxence, et al. "Spiking neural networks hardware implementations and challenges: A survey." ACM Journal on Emerging Technologies in Computing Systems (JETC) 15.2 (2019): 1-35.

  7. Yang, Yi, and Shawn Newsam. "Bag-of-Visual-Words and Spatial Extensions for Land-Use Classification." ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems (ACM GIS), 2010.

  8. Helber, Patrick, et al. "Eurosat: A novel dataset and deep learning benchmark for land use and land cover classification." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 12.7 (2019): 2217-2226.

  9. Kheradpisheh, Saeed Reza, et al. "STDP-based spiking deep convolutional neural networks for object recognition." Neural Networks 99 (2018): 56-67.

Outcome

Artificial Intelligence Conference paper
Investigating Spiking Neural Networks for Energy-Efficient On-Board AI Applications. A Case Study in Land Cover and Land Use Classification
Kucik, Andrzej S. and Meoni, Gabriele
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops: 2020-2030
(2021)