I just published a (slightly modified) chapter of my thesis on arXiv, in which we show how to efficiently approximate the shapes of the steady-state firing rate "bumps" in continuous attractor networks (spiking and rate-based). Turns out, you can also use the theory to tune network parameters, and do sweet parameter sweeps!
Continuous “bump” attractors are an established model of cortical working memory for continuous variables and can be implemented using various neuron and network models. Here, we develop a generalizable approach for the approximation of bump states of continuous attractor networks implemented in networks of both rate-based and spiking neurons. The method relies on a low-dimensional parametrization of the spatial shape of firing rates, which allows us to apply efficient numerical optimization methods. Using our theory, we can establish a mapping between network structure and attractor properties that allows the prediction of the effects of network parameters on the steady-state firing rate profile and the existence of bumps, and, vice versa, the fine-tuning of a network to produce bumps of a given shape.
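To give a flavor of the idea, here is a minimal sketch (not the paper's actual method) of fitting a low-dimensional bump ansatz to the fixed point of a rate network. Everything here is an illustrative assumption: a ring network with cosine connectivity, a threshold-linear transfer function, a tuned external input, and a two-parameter (amplitude, width) bump profile.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative ring network (parameters are assumptions, not from the paper):
# N neurons on a ring, uniform inhibition J0 plus tuned excitation J1.
N = 128
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
W = (-2.0 + 5.0 * np.cos(x[:, None] - x[None, :])) / N
I_ext = 1.0 + 2.0 * np.cos(x)            # tuned external drive
phi = lambda u: np.maximum(u, 0.0)       # threshold-linear transfer function

def profile(params):
    """Low-dimensional bump ansatz: amplitude a, width w (von-Mises-like shape)."""
    a, w = params
    return a * np.exp((np.cos(x) - 1.0) / max(w, 1e-6))

def residual(params):
    """Squared fixed-point error ||r - phi(W r + I)|| for the parametrized shape."""
    r = profile(params)
    return np.sum((r - phi(W @ r + I_ext)) ** 2)

# Optimize over just 2 parameters instead of N firing rates.
res = minimize(residual, x0=[1.0, 0.5], method="Nelder-Mead")
a_opt, w_opt = res.x
print(f"fitted amplitude={a_opt:.3f}, width={w_opt:.3f}, residual={res.fun:.4f}")
```

The point of the parametrization is in the last step: the fixed-point search runs over two numbers rather than all N firing rates, which is what makes dense parameter sweeps cheap.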
Seeholzer A, Deger M, Gerstner W (2017). Efficient low-dimensional approximation of continuous attractor networks. arXiv:1711.08032 [q-bio.NC]