Original Article

Information capacity in recurrent McCulloch–Pitts networks with sparsely coded memory states

Pages 177-186 | Received 28 Mar 1991, Published online: 09 Jul 2009

Abstract

A new approach to the asymptotic analysis of autoassociation properties in recurrent McCulloch–Pitts networks in the low-activity regime is proposed. Using information theory, this method examines the static structure of the stable states imprinted by a Hebbian storing process. In addition to the critical pattern capacity usually considered in the analysis of the Hopfield model, the authors introduce the notion of information capacity, which guarantees content addressability and provides a stricter upper bound on the information actually accessible in an autoassociation process. They calculate these two capacities for two local learning rules that are particularly effective for sparsely coded patterns: the Hebb rule and the clipped Hebb rule. It turns out that for both rules the information capacity is exactly half the pattern capacity.
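The two storage rules compared in the abstract can be illustrated with a small numerical sketch. Note that all parameter values below (network size, number of patterns, coding ratio, firing threshold) are illustrative choices, not those analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # number of McCulloch-Pitts neurons (illustrative)
P = 20    # number of stored memory patterns
K = 10    # active units per pattern, i.e. coding ratio a = K/N = 0.05 (sparse)

# Sparse binary memory patterns: each has exactly K active units.
patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(N, size=K, replace=False)] = 1

# Hebb rule: sum of outer products of the 0/1 patterns, no self-coupling.
W_hebb = patterns.T @ patterns
np.fill_diagonal(W_hebb, 0)

# Clipped Hebb rule: only record whether two units were ever co-active.
W_clip = (W_hebb > 0).astype(int)

def retrieve(W, cue, theta, steps=10):
    """Synchronous McCulloch-Pitts dynamics: s <- Heaviside(W s - theta)."""
    s = cue.copy()
    for _ in range(steps):
        s = (W @ s > theta).astype(int)
    return s

# Content-addressability test: cue is pattern 0 with a third of its
# active units deleted.
cue = patterns[0].copy()
on = np.flatnonzero(cue)
cue[on[: len(on) // 3]] = 0

theta = K // 2  # firing threshold, roughly half the per-pattern activity
recalled_hebb = retrieve(W_hebb, cue, theta)
recalled_clip = retrieve(W_clip, cue, theta)
```

For sparse patterns the crosstalk between memories is small, so both rules typically restore the deleted units of pattern 0 from the partial cue; the clipped rule discards the magnitude of co-activation counts yet, as the paper's capacity analysis shows, remains effective in this low-activity regime.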
