Original Article

Information storage in sparsely coded memory nets

Pages 61-74 | Received 07 Jun 1989, Published online: 04 Aug 2009
 

Abstract

We study simple feedforward neural networks for pattern storage and retrieval, using information-theoretic criteria. Two Hebbian learning rules are considered, with emphasis on sparsely coded patterns. We address the question: under which conditions is the optimal information storage reached in the error-full regime?
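For concreteness, one way to make such a criterion explicit (a minimal sketch; the exact measure used in the paper may differ) is to count the mutual information between a stored output bit and the corresponding retrieved bit, modelled as a binary asymmetric channel with error rates p01 (silent bit retrieved as active) and p10 (active bit retrieved as silent), summed over patterns and output units and divided by the number of synapses. The function name and parameters below are illustrative, not taken from the paper.

    import numpy as np

    def binary_entropy(p):
        """Entropy (in bits) of a Bernoulli(p) variable."""
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    def info_per_synapse(f_out, p01, p10, n_patterns, n_in, n_out):
        """Bits of stored information per synapse of a feedforward net.

        f_out : coding rate of the output patterns (fraction of active bits)
        p01   : probability a silent output bit is retrieved as active
        p10   : probability an active output bit is retrieved as silent
        """
        # Distribution of the retrieved bit, then I = H(Y) - H(Y|X).
        q = f_out * (1 - p10) + (1 - f_out) * p01
        bits_per_output_bit = binary_entropy(q) - (
            f_out * binary_entropy(p10) + (1 - f_out) * binary_entropy(p01))
        # n_in * n_out synapses in a fully connected feedforward layer.
        return n_patterns * n_out * bits_per_output_bit / (n_in * n_out)

    # Illustrative numbers only, not results from the paper:
    print(info_per_synapse(f_out=0.02, p01=0.001, p10=0.05,
                           n_patterns=1000, n_in=512, n_out=512))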

For the model introduced some time ago by Willshaw, Buneman and Longuet-Higgins, the information stored goes through a maximum, which may lie in either the error-less or the error-full regime, depending on the value of the coding rate. It eventually vanishes, however, as learning goes on and more patterns are stored.
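A minimal sketch of the Willshaw-Buneman-Longuet-Higgins rule, assuming the standard formulation with binary, clipped synapses and a retrieval threshold equal to the number of active cue bits (the sizes and coding rate below are arbitrary illustrative choices):

    import numpy as np

    rng = np.random.default_rng(0)
    N_in, N_out = 256, 256
    f = 0.05                       # coding rate: fraction of active units
    P = 200                        # number of stored pattern pairs

    inputs  = (rng.random((P, N_in))  < f).astype(np.uint8)
    outputs = (rng.random((P, N_out)) < f).astype(np.uint8)

    # Willshaw rule: a synapse is switched on (clipped at 1) whenever its
    # input and output units are simultaneously active in any stored pattern.
    W = (outputs.T @ inputs > 0).astype(np.uint8)

    # Retrieval: an output unit fires iff it receives input from *all*
    # active units of the cue (threshold = number of active cue bits).
    cue = inputs[0]
    retrieved = (W @ cue >= cue.sum()).astype(np.uint8)

    # Errors are only of the silent-to-active kind; they grow as more
    # patterns are stored and the weight matrix fills up.
    spurious = int(((retrieved == 1) & (outputs[0] == 0)).sum())
    print("spurious active bits:", spurious)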

For the original Hebb learning rule, where reinforcement occurs whenever both input and output neurons are active, the information stored reaches a stationary value, 1/(π ln 2), when the net is overloaded beyond its threshold for errors. If the coding rate f′ of the output pattern is small enough, the information storage goes through a maximum, which saturates the Gardner bound, 1/(2 ln 2). An interpolation between dense and sparse coding limits is also discussed.
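The asymptotic values quoted above are, in bits per synapse, 1/(π ln 2) ≈ 0.459 and the Gardner bound 1/(2 ln 2) ≈ 0.721. A minimal sketch of the original Hebb rule as described here (a synapse is reinforced whenever both its input and output units are active, with no clipping); the retrieval threshold and all sizes are my own illustrative choices, not the paper's:

    import numpy as np

    rng = np.random.default_rng(1)
    N_in, N_out = 512, 512
    f, f_out = 0.5, 0.02           # dense input, sparsely coded output
    P = 1000

    inputs  = (rng.random((P, N_in))  < f).astype(float)
    outputs = (rng.random((P, N_out)) < f_out).astype(float)

    # Original Hebb rule: increment W_ij by 1 each time input j and
    # output i are active together (additive, unlike the clipped Willshaw rule).
    W = outputs.T @ inputs

    # Retrieval of pattern 0 with a simple global threshold (illustrative only):
    h = W @ inputs[0]
    theta = h.mean() + 2.0 * h.std()
    retrieved = (h > theta).astype(float)
    print("bit errors:", int((retrieved != outputs[0]).sum()))

    # Reference constants from the abstract, in bits per synapse:
    print("1/(pi ln 2) =", 1 / (np.pi * np.log(2)))   # ~ 0.459
    print("1/(2 ln 2)  =", 1 / (2 * np.log(2)))       # ~ 0.721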
