Biomimetics
Artificial Intelligence
1 Apr 2022

Temporal dynamics in computational reservoirs

Large neural networks with sparse, random connectivity have proven to produce outputs that can be trained for a wide range of computational tasks. These so-called reservoirs form the core of artificial neural network architectures such as Echo State Networks (ESNs) and Liquid State Machines (LSMs) [1-2].
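As a rough illustration of what such a reservoir looks like, the sketch below builds a sparse, randomly connected ESN-style reservoir with NumPy. The size, sparsity, spectral radius and leak rate are placeholder values for illustration, not parameters used in this project.

```python
# Minimal sketch (illustrative, not the project's code): a sparse,
# random ESN-style reservoir with a leaky-integrator update.
import numpy as np

rng = np.random.default_rng(0)
n_reservoir = 500          # number of reservoir units (placeholder)
sparsity = 0.05            # fraction of non-zero recurrent weights
spectral_radius = 0.9      # typical scaling for the echo state property

# Sparse random recurrent weight matrix
W = rng.uniform(-1.0, 1.0, size=(n_reservoir, n_reservoir))
W *= rng.random((n_reservoir, n_reservoir)) < sparsity

# Rescale so the largest eigenvalue magnitude equals spectral_radius
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

# Random input weights for a 1-dimensional input signal
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, 1))

def step(x, u, leak=0.3):
    """One leaky-integrator reservoir update for input sample u."""
    return (1 - leak) * x + leak * np.tanh(W @ x + W_in @ np.array([u]))
```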

Most often these reservoirs are treated as black boxes: only their response to a given input is considered and used for training and analysis. This is also what makes these networks computationally very efficient, since the reservoir itself stays fixed and only a linear readout needs to be trained. Fast and energy-efficient retraining is therefore a potential benefit of such networks, e.g. for payload processing on spacecraft before data are transmitted to the ground.
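A minimal sketch of this "train only the readout" idea, assuming the reservoir update from the sketch above: reservoir states driven by an input signal are collected, and a linear readout is then fitted by ridge regression. The names `input_signal`, `targets` and `washout` are hypothetical placeholders, as are the hyperparameter values.

```python
# Sketch: fit a linear readout on collected reservoir states
# by ridge regression; the reservoir weights are never changed.
import numpy as np

def train_readout(states, targets, ridge=1e-6):
    """Least-squares readout: W_out = Y X^T (X X^T + ridge * I)^-1."""
    X, Y = states.T, targets.T                    # columns = time steps
    return Y @ X.T @ np.linalg.inv(X @ X.T + ridge * np.eye(X.shape[0]))

# Usage (hypothetical, reusing `step` and `n_reservoir` from above):
# x, collected = np.zeros(n_reservoir), []
# for u in input_signal:
#     x = step(x, u)
#     collected.append(x)
# W_out = train_readout(np.array(collected)[washout:], targets[washout:])
```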

Looking at the recurrent interactions that arise when these large reservoirs are randomly initialised reveals a broad range of possible dynamical states the reservoir can occupy. Small recurrent neural networks have previously been studied analytically with tools from dynamical systems theory [3-5]. In these small networks, Hopf bifurcations appear when inhibition and excitation are balanced in a particular way, causing the system to exhibit self-sustained oscillatory behaviour.
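The kind of system analysed in [3-5] can be sketched as a 2-neuron continuous-time recurrent neural network (CTRNN). The toy example below is an assumption for illustration rather than the exact model studied here: with one excitatory and one inhibitory cross-connection the system can pass through a Hopf bifurcation and settle on a limit cycle, though the parameter values shown may need tuning to land in the oscillatory regime.

```python
# Illustrative 2-neuron CTRNN: tau * dy/dt = -y + W @ sigma(y + theta)
import numpy as np

def sigma(y):
    return 1.0 / (1.0 + np.exp(-y))

W = np.array([[4.5, 1.0],
              [-1.0, 4.5]])          # self-excitation, mixed cross-coupling
theta = np.array([-2.75, -1.75])     # biases (illustrative values)
tau = np.array([1.0, 1.0])           # time constants

def dydt(y):
    return (-y + W @ sigma(y + theta)) / tau

# Simple Euler integration; on a limit cycle the trajectory keeps oscillating
y, dt, trajectory = np.array([0.1, 0.1]), 0.01, []
for _ in range(20000):
    y = y + dt * dydt(y)
    trajectory.append(y.copy())
```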

The importance of such recurrent interactions between simple agents in large networks has been studied in several theoretical and biological systems. For instance, the Kuramoto model of coupled oscillators has been shown to exhibit flocking, swarming and synchronisation effects when the agents are coupled in a non-reciprocal way [6-7]. Furthermore, the role of oscillatory dynamics has been studied in both natural and artificial neural networks [8-9].
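For reference, the standard (reciprocally coupled) Kuramoto model can be simulated in a few lines; note that this is the classic version, not the non-reciprocal variants of [6-7]. Network size, coupling strength and frequency distribution below are illustrative choices.

```python
# Minimal Kuramoto sketch: N all-to-all coupled phase oscillators.
# Above a critical coupling K the phases partially synchronise,
# visible in the order parameter r = |mean(exp(i*theta))|.
import numpy as np

rng = np.random.default_rng(1)
N, K, dt, steps = 100, 2.0, 0.01, 5000
omega = rng.normal(0.0, 1.0, N)          # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)     # initial phases

for _ in range(steps):
    coupling = (K / N) * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta += dt * (omega + coupling)

r = np.abs(np.mean(np.exp(1j * theta)))  # order parameter in [0, 1]
print(f"order parameter r = {r:.2f}")
```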

Project overview

Figure: Bifurcation analysis of oscillatory dynamics observed in a 2-neuron network.

This project investigates how the dynamics inside a reservoir network affect its efficiency and accuracy on time series prediction tasks. To do this, we aim to transfer the current understanding of the dynamics observed in small recurrent neural networks to larger reservoirs such as ESNs and LSMs.

Furthermore, we are investigating how simple, biologically inspired constraints can be added when constructing the reservoirs in order to shape their performance on computational tasks. Sources of inspiration are biological sensory systems, such as the olfactory, visual and auditory systems, as well as the brain itself. Despite their noisy nature and high complexity, these systems have proven to operate in an extremely energy-efficient and robust way. By using some of the same underlying principles in the design of computational reservoirs, we are investigating whether the same qualities can be achieved in these artificial neural networks.

References

[1] Lukoševičius, M. (2012). A Practical Guide to Applying Echo State Networks. In G. Montavon, G. B. Orr, & K.-R. Müller (Eds.), Neural Networks: Tricks of the Trade (Vol. 7700, pp. 659–686). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-35289-8_36

[2] Maass, W. (2011). Liquid State Machines: Motivation, Theory, and Applications. In S. B. Cooper & A. Sorbi (Eds.), Computability in Context (pp. 275–296). Imperial College Press. https://doi.org/10.1142/9781848162778_0008

[3] Beer, R. D. (1995). On the Dynamics of Small Continuous-Time Recurrent Neural Networks. Adaptive Behavior, 3(4), 469–509. https://doi.org/10.1177/105971239500300405

[4] Beer, R. D. (2006). Parameter Space Structure of Continuous-Time Recurrent Neural Networks. Neural Computation, 18(12), 3009–3051. https://doi.org/10.1162/neco.2006.18.12.3009

[5] Fasoli, D., Cattani, A., & Panzeri, S. (2016). The Complexity of Dynamics in Small Neural Circuits. PLOS Computational Biology, 12(8), e1004992. https://doi.org/10.1371/journal.pcbi.1004992

[6] O’Keeffe, K. P., Hong, H., & Strogatz, S. H. (2017). Oscillators that sync and swarm. Nature Communications, 8(1), 1504. https://doi.org/10.1038/s41467-017-01190-3

[7] Fruchart, M., Hanai, R., Littlewood, P. B., & Vitelli, V. (2021). Non-reciprocal phase transitions. Nature, 592(7854), 363–369. https://doi.org/10.1038/s41586-021-03375-9

[8] Hahn, G., Bujan, A. F., Frégnac, Y., Aertsen, A., & Kumar, A. (2014). Communication through Resonance in Spiking Neuronal Networks. PLoS Computational Biology, 10(8), e1003811. https://doi.org/10.1371/journal.pcbi.1003811

[9] Yan, H., Zhao, L., Hu, L., Wang, X., Wang, E., & Wang, J. (2013). Nonequilibrium landscape theory of neural networks. Proceedings of the National Academy of Sciences, 110(45), E4185–E4194. https://doi.org/10.1073/pnas.1310692110
