Abstract
The informational properties of a neural-network model of autoassociative memory based on binary Hebbian synapses are investigated. The model is a modification of the Willshaw network with a floating threshold that keeps the number of active neurons (winners) approximately constant at each time step. In the asymptotic limit of a large number of neurons, the informational characteristics are calculated analytically for single-step correction. Comparison with simulations shows that the maximal correction efficiency attains its asymptotic value for networks with a surprisingly small number of neurons. Simulation results for multistep correction show a considerable improvement over the single-step case.
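The mechanism summarized above can be sketched in a few lines: clipped (binary) Hebbian storage of sparse patterns, followed by single-step retrieval in which a floating threshold admits roughly a fixed number of winners. This is an illustrative sketch only, not the authors' code; the network size, pattern sparsity, and number of stored patterns below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy parameters: n neurons, k winners per pattern, M stored patterns.
n, k, M = 256, 8, 20

# Sparse binary patterns, each with exactly k active neurons.
patterns = np.zeros((M, n), dtype=np.uint8)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Binary (clipped) Hebbian synapses: W[i, j] = 1 if neurons i and j
# were ever co-active in a stored pattern (Willshaw storage rule).
W = (patterns.T @ patterns > 0).astype(np.uint8)
np.fill_diagonal(W, 0)

def retrieve(cue, k):
    """Single-step correction with a floating threshold.

    The threshold is set to the k-th largest dendritic sum, so the
    number of winners stays approximately constant (ties may admit
    slightly more than k neurons).
    """
    s = W @ cue                 # dendritic sums
    thresh = np.sort(s)[-k]    # floating threshold
    return (s >= thresh).astype(np.uint8)

# Corrupt a stored pattern by deleting half of its active bits,
# then apply one correction step.
cue = patterns[0].copy()
cue[np.flatnonzero(cue)[: k // 2]] = 0
out = retrieve(cue, k)
overlap = int(out @ patterns[0])  # active bits of the stored pattern recovered
```

At this low memory load the single step restores all k active bits of the stored pattern; multistep correction would simply feed `out` back into `retrieve`.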