Abstract
This paper shows how techniques from statistical mechanics may be used to suggest the structure of connection weights capable of storing N patterns in a Hopfield network of N spins. Guided by this analysis, simulation results are presented which confirm that N random patterns, both biased and unbiased, may indeed be stored in a Hopfield network of N spins using a set of weights proportional to the inverse of the pattern correlation matrix. Furthermore, an unsupervised learning rule is introduced that is capable of enlarging the basins of attraction, up to some maximum size, for a selected subset of the stored patterns. The merits of this method for achieving associative memory in a Hopfield net are discussed, and a comparison with the traditional Gardner algorithm is made.
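The weight prescription summarised above, with couplings proportional to the inverse of the pattern correlation matrix, can be sketched as follows. This is a minimal illustration, not the paper's simulation code: the network size N, pattern count p, and the use of unbiased random patterns are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 64, 16  # hypothetical sizes; the paper considers p up to N

# p random unbiased +/-1 patterns, one pattern per row
xi = rng.choice([-1.0, 1.0], size=(p, N))

# pattern correlation (overlap) matrix C_{mu nu} = (1/N) sum_i xi_i^mu xi_i^nu
C = (xi @ xi.T) / N

# weights proportional to the inverse correlation matrix:
# J_{ij} = (1/N) sum_{mu,nu} xi_i^mu (C^{-1})_{mu nu} xi_j^nu
J = (xi.T @ np.linalg.inv(C) @ xi) / N

# J projects onto the span of the patterns, so every stored pattern
# is a fixed point of the dynamics S_i -> sign(sum_j J_ij S_j)
for mu in range(p):
    assert np.array_equal(np.sign(J @ xi[mu]), xi[mu])
```

For linearly independent patterns this construction makes each pattern an exact fixed point, which is why storage of up to N patterns in N spins is possible, unlike the standard Hebb rule, whose capacity is limited to roughly 0.14N for random patterns.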