A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks

PLoS Computational Biology
Alireza Alemi … R Zecchina

Abstract

Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model's simplicity and the locality of its synaptic update rules come at the cost of a poor storage capacity compared with the capacity achieved by perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution of the neurons' synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local field with respect to three thresholds. Above the highest threshold and below the lowest threshold, no plasticity occurs; in between, potentiation or depression occurs depending on whether the local field lies above or below the intermediate threshold.
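
The update rule described in the abstract lends itself to a compact sketch. The following Python/NumPy snippet is a minimal illustration of a single plasticity step for one neuron, under assumed names and placeholder values (theta_low, theta_mid, theta_high, the step size eta, and the non-negativity clip on excitatory weights are illustrative choices, not parameters taken from the paper); it captures only the three-threshold logic, not the full network simulation or the inhibitory feedback.

import numpy as np

def three_threshold_update(w, x, h, theta_low, theta_mid, theta_high, eta=0.01):
    """One plasticity step for a single neuron's incoming excitatory weights.

    w        : incoming synaptic weights (1D array)
    x        : binary activity (0/1) of the presynaptic inputs (1D array)
    h        : the neuron's local field for the currently presented pattern
    theta_*  : the three thresholds, with theta_low < theta_mid < theta_high
    eta      : size of a single potentiation/depression step
    """
    # Field already above the highest or below the lowest threshold: no plasticity.
    if h >= theta_high or h <= theta_low:
        return w
    # Field in the intermediate zone: the sign of the change is set by the middle threshold.
    dw = eta if h > theta_mid else -eta
    # Only synapses with active presynaptic inputs are modified; the clip keeps the
    # excitatory weights non-negative (an illustrative constraint, not from the paper).
    return np.clip(w + dw * x, 0.0, None)

# Illustrative usage with arbitrary numbers (not values from the paper):
rng = np.random.default_rng(0)
w = 0.1 * rng.random(100)                      # initial excitatory weights
x = (rng.random(100) < 0.5).astype(float)      # one binary input pattern
h = w @ x + 5.0                                # local field including a strong external drive
w = three_threshold_update(w, x, h, theta_low=2.0, theta_mid=4.0, theta_high=6.0)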

References

Jan 1, 1992·Proceedings of the National Academy of Sciences of the United States of America·S Bröcher … W Singer
Feb 1, 1989·Journal of Neurophysiology·S Funahashi … P S Goldman-Rakic
Aug 13, 1971·Science·J M Fuster, G E Alexander
Jun 1, 1969·The Journal of Physiology·D Marr
Apr 1, 1982·Proceedings of the National Academy of Sciences of the United States of America·J J Hopfield
Dec 23, 1998·Network: Computation in Neural Systems·N Brunel … S Fusi
Sep 1, 1986·Physical Review A: General Physics·H Sompolinsky
Mar 1, 1989·Physical Review A: General Physics·J Buhmann … K Schulten
Sep 30, 1985·Physical Review Letters·D J Amit … H Sompolinsky
Oct 9, 1999·Journal of Neurophysiology·H Wang, J J Wagner
Jun 2, 2001·Neural Computation·Y Amit, M Mascaro
Oct 9, 2002·Nature Neuroscience·Ranulfo Romo … Carlos D Brody
Jan 5, 2005·Proceedings of the National Academy of Sciences of the United States of America·Nir Kalisman … Henry Markram
Feb 22, 2005·Neuron·Stefano Fusi … L F Abbott
Mar 2, 2005·PLoS Biology·Sen Song … Dmitri B Chklovskii
Mar 21, 2006·Nature Neuroscience·Yun Wang … Patricia S Goldman-Rakic
Sep 22, 2007·Neural Computation·Joseph M Brader … Stefano Fusi
Oct 31, 2007·Biological Cybernetics·Nicolas Brunel, Mark C W van Rossum
Nov 7, 2007·Trends in Neurosciences·Boris Barbour … Jean-Pierre Nadal
Mar 15, 2008·Science·Gianluigi Mongillo … Misha Tsodyks
Apr 9, 2010·The European Journal of Neuroscience·Masato Inoue, Akichika Mikami
May 10, 2012·PLoS Computational Biology·Claudia Clopath … Nicolas Brunel
Dec 6, 2012·Proceedings of the National Academy of Sciences of the United States of America·Julio Chapeton … Armen Stepanyants
Feb 26, 2013·PLoS Computational Biology·Claudia Clopath, Nicolas Brunel
Apr 9, 2014·Current Opinion in Neurobiology·Omri Barak, Misha Tsodyks

Citations

Apr 12, 2016·Nature Neuroscience·Nicolas Brunel
Nov 1, 2016·Nature Communications·Thomas Miconi … Gerald M Edelman
Nov 18, 2018·Interface Focus·Luca Saglietti … Riccardo Zecchina
Jan 18, 2018·Journal of Mathematical Neuroscience·Christopher J Hillar, Ngoc M Tran
Mar 12, 2019·International Journal of Neural Systems·Ruihan Hu … Sheng Chang
Feb 18, 2017·Physical Review E·Sheng-Jun Wang, Zhou Yang
May 12, 2020·Frontiers in Computational Neuroscience·Toviah Moldwin, Idan Segev
Aug 28, 2021·Entropy·Evaldo M F Curado … Fernando D Nobre

Software Mentioned

3TLR
