Continuous parameter working memory in a balanced chaotic neural network


  Nimrod Shaham [1], Yoram Burak [1,2]
[1] Racah Institute of Physics, The Hebrew University of Jerusalem
[2] Edmond and Lily Safra Center for Brain Sciences, The Hebrew University of Jerusalem

Neurons, the basic computational units in the brain, are characterized by intrinsic persistence times ranging from ~1 to ~100 ms. Yet the brain can maintain information in short-term memory over several seconds. It is widely believed that short-term memory does not involve structural modifications in the brain; the memory is therefore maintained by the joint dynamics of many coupled neurons. An important model for short-term memory of continuous parameters (such as the orientation of a bar, the color of an object, or the frequency of a tone) posits that such parameters are stored in persistent activity of neural networks that exhibit continuous attractor dynamics: different values of a stimulus are represented by different locations along a continuum of semi-stable steady states.

It has been unclear whether this theoretical idea is compatible with another important model for the architecture and dynamics of cortical neural networks: the balanced network (C. van Vreeswijk and H. Sompolinsky, Science, 1996). In this model, many excitatory inputs to each neuron are balanced on average by many inhibitory inputs, and each neuron is activated by fluctuations of the total input around the mean. These fluctuations generate overall chaotic behavior of the system and irregular, asynchronous activity of the single units, in agreement with the empirical dynamics of cortical neurons.

In this work we study a network with random connectivity that generates a chaotic balanced state. Using mutual inhibition between two balanced populations, we find an architecture in which the network sustains slow dynamics along a certain direction in mean activity space.
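To make the balanced-network ingredients concrete, the following is a minimal sketch (not the authors' code) of a single excitatory-inhibitory balanced network of binary neurons in the spirit of van Vreeswijk and Sompolinsky: sparse random connectivity with ~K inputs per neuron, synaptic strengths of order 1/sqrt(K), external drive of order sqrt(K), and asynchronous threshold updates. All sizes and weights here are illustrative, and the mutual-inhibition architecture between two such populations described in the abstract is not implemented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the abstract's simulations use up to ~10^5 neurons.
N = 500          # neurons per population (excitatory and inhibitory)
K = 50           # mean number of inputs per neuron, 1 << K << N

# Sparse random connectivity: each synapse present with probability K/N,
# strengths of order 1/sqrt(K), as in balanced-state theory.
def sparse_conn(n_post, n_pre, J):
    mask = rng.random((n_post, n_pre)) < K / n_pre
    return (J / np.sqrt(K)) * mask

J_EE = sparse_conn(N, N, 1.0)    # illustrative weight values
J_EI = sparse_conn(N, N, -2.0)
J_IE = sparse_conn(N, N, 1.0)
J_II = sparse_conn(N, N, -1.8)

# External drive of order sqrt(K), so that excitation and inhibition cancel
# at leading order and O(1) fluctuations around the mean drive the neurons.
h_E = 0.2 * 1.0 * np.sqrt(K)
h_I = 0.2 * 0.8 * np.sqrt(K)
theta = 1.0      # firing threshold

s_E = (rng.random(N) < 0.2).astype(float)
s_I = (rng.random(N) < 0.2).astype(float)

# Asynchronous updates: pick a random neuron and threshold its total input.
rates = []
for t in range(100 * 2 * N):
    i = rng.integers(N)
    if rng.random() < 0.5:
        s_E[i] = float(J_EE[i] @ s_E + J_EI[i] @ s_I + h_E > theta)
    else:
        s_I[i] = float(J_IE[i] @ s_E + J_II[i] @ s_I + h_I > theta)
    if t % (2 * N) == 0:
        rates.append(s_E.mean())

print("mean E activity:", np.mean(rates[20:]))
```

With these parameters the leading-order balance condition predicts a moderate mean activity, with the single units switching irregularly; the attractor architecture of the abstract couples two such populations through mutual inhibition.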
Due to the slow dynamics, this balanced network is a suitable candidate for the storage of working memory. In the limit 1 << K << N (where K is the average number of inputs per neuron and N is the population size) we show analytically, using mean field techniques, the existence of a continuum of balanced steady states. For finite K and infinite population size the network has only one fixed point. However, by analyzing the mean field equations we show that the network still exhibits slow dynamics along a line of approximate steady states.

In the realistic case of finite N we perform numerical simulations of up to ~10^5 binary neurons. We find that the chaotic dynamics drive diffusive motion along the approximate attractor, similar to the dynamics of networks in which noise arises from intrinsic neural or synaptic mechanisms (Burak and Fiete, PNAS 2012). Moreover, we can relate the diffusion to the statistics of single-neuron activity in a simple balanced network (without mutual inhibition). In addition to the diffusive motion, the network can exhibit systematic motion due to mistuning of the network parameters, and the overall dynamics along the attractor follow statistics similar to those of an Ornstein-Uhlenbeck process.

Finally, using analytical and numerical techniques, we show that the diffusion coefficient along the attractor is inversely proportional to the network size. Thus, the persistence of the network can be improved by increasing the number of neurons. In practice, ~10^5 neurons are sufficient in our model (with proper tuning of the synaptic weights) to achieve persistence times of several seconds, two to three orders of magnitude longer than the single-neuron time scale.
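The abstract describes the motion of the stored variable as diffusion plus systematic drift, with overall statistics resembling an Ornstein-Uhlenbeck process. As a hedged illustration (the parameter values below are hypothetical, not fitted to the model), the effective relaxation time tau and diffusion coefficient D can be recovered from a single trajectory of the memory variable by regressing increments on the state; in the model, D scales as 1/N, so larger networks give longer persistence times.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical effective parameters of the memory variable x(t):
# relaxation time tau (from mistuning) and diffusion coefficient D
# (which the abstract finds to scale as 1/N).
tau, D, x0 = 5.0, 0.01, 0.0
dt, T = 1e-3, 200.0
n = int(T / dt)

# Euler-Maruyama simulation of the Ornstein-Uhlenbeck process.
x = np.empty(n)
x[0] = 1.0
noise = rng.standard_normal(n - 1)
for t in range(n - 1):
    x[t + 1] = x[t] - (x[t] - x0) * dt / tau + np.sqrt(2 * D * dt) * noise[t]

# Recover the OU parameters by regressing increments on the state: dx ~ a*x + b.
dx = np.diff(x)
a, b = np.polyfit(x[:-1], dx, 1)
tau_hat = -dt / a                                   # relaxation time estimate
D_hat = np.var(dx - (a * x[:-1] + b)) / (2 * dt)    # diffusion estimate

print("tau_hat:", tau_hat, "D_hat:", D_hat)
```

The same increment regression could in principle be applied to the stored-variable trajectory extracted from the network simulation, giving the effective drift and diffusion that determine the persistence time.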