Balanced input allows optimal encoding in a stochastic binary neural network model: An analytical study

Gustavo Deco, Etienne Hugues

Research output: Contribution to journal › Article › Research › peer-review

4 Citations (Scopus)

Abstract

Recent neurophysiological experiments have demonstrated a remarkable effect of attention on the underlying neural activity that suggests for the first time that information encoding is indeed actively influenced by attention. Single cell recordings show that attention reduces both the neural variability and correlations in the attended condition with respect to the non-attended one. This reduction of variability and redundancy enhances the information associated with the detection and further processing of the attended stimulus. Beyond the attentional paradigm, the local activity in a neural circuit can be modulated in a number of ways, leading to the general question of understanding how the activity of such circuits is sensitive to these relatively small modulations. Here, using an analytically tractable neural network model, we demonstrate how this enhancement of information emerges when excitatory and inhibitory synaptic currents are balanced. In particular, we show that the network encoding sensitivity, as measured by the Fisher information, is maximized at the exact balance. Furthermore, we find a similar result for a more realistic spiking neural network model. As the regime of balanced inputs has been experimentally observed, these results suggest that this regime is functionally important from an information encoding standpoint.
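The core claim, that encoding sensitivity (Fisher information) peaks when excitatory and inhibitory currents exactly cancel, can be illustrated with a toy model that is much simpler than the paper's network: a single binary neuron that fires whenever its net input `g_exc - g_inh + s` plus Gaussian noise crosses a threshold. The spike is then a Bernoulli observation of the weak stimulus `s`, and its Fisher information `(dp/ds)^2 / (p(1-p))` is largest when the mean input sits at threshold, i.e. at balance. All names and parameter values below are illustrative assumptions, not the authors' model:

```python
import math

def phi(x):
    # standard normal probability density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    # standard normal cumulative distribution
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def fisher_info(g_exc, g_inh, sigma=1.0, theta=0.0):
    """Fisher information about a weak stimulus s (evaluated at s = 0)
    carried by one binary spike: the neuron fires iff
    g_exc - g_inh + s + noise > theta, with noise ~ N(0, sigma^2)."""
    x = (g_exc - g_inh - theta) / sigma
    p = Phi(x)                 # firing probability
    dp_ds = phi(x) / sigma     # sensitivity of p to the stimulus
    return dp_ds ** 2 / (p * (1 - p))

# Sweep the inhibitory current past the (fixed) excitatory one.
g_exc = 2.0
g_inh_vals = [0.5 + 0.05 * k for k in range(61)]   # 0.5 .. 3.5
fi = [fisher_info(g_exc, g) for g in g_inh_vals]
best = g_inh_vals[fi.index(max(fi))]
print(f"Fisher information peaks at g_inh = {best:.2f} (g_exc = {g_exc})")
```

In this sketch the maximum lands at `g_inh = g_exc`: away from balance the firing probability saturates near 0 or 1 and a spike carries little information about `s`, mirroring (in a much reduced setting) the balance optimum reported in the abstract.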

Original language: English
Article number: e30723
Journal: PLoS ONE
Volume: 7
Issue number: 2
DOIs
Publication status: Published - 16 Feb 2012
Externally published: Yes
