Abstract
An unsupervised learning algorithm for a stochastic recurrent neural network based on the Boltzmann Machine architecture is formulated in this paper. The maximization of the mutual information between the stochastic output neurons and the clamped inputs is used as an unsupervised criterion for training the network. The resulting learning rule contains two terms corresponding to Hebbian and anti-Hebbian learning. Interestingly, these two terms are weighted by the amount of information transmitted through the learning synapse, giving an information-theoretic interpretation of the proportionality constant of Hebb's biological rule. The anti-Hebbian term, which can be interpreted as a forgetting function, supports optimal coding. In this way, optimal nonlinear and recurrent implementations of data compression of Boolean patterns are obtained. As an example, the encoder problem is simulated and trained in an unsupervised way in a one-layer network; compression of non-uniformly distributed binary data is also included. Unsupervised classification, even for continuous inputs, is demonstrated for four overlapping Gaussian spots and for a real-world example of thyroid diagnosis. In comparison with other techniques, the present model requires an exponentially smaller number of weights for the classification problem.
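
As a rough illustration of the criterion named above (not the paper's own derivation or learning rule), the sketch below estimates the mutual information I(Y; X) = H(Y) - H(Y | X) between a single stochastic output unit and a set of clamped binary input patterns. The sigmoid output model, the function names, and the example patterns are assumptions introduced here for illustration only.

    # Hypothetical sketch: mutual information between a stochastic binary
    # output unit Y and clamped binary input patterns X, the quantity the
    # abstract names as the unsupervised training criterion. The sigmoid
    # output model and all names below are illustrative assumptions, not
    # the paper's formulation.
    import numpy as np

    def binary_entropy(p):
        """Entropy (in bits) of a Bernoulli variable with firing probability p."""
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    def mutual_information(inputs, weights, bias, pattern_probs):
        """I(Y; X) for one sigmoid output unit over clamped input patterns.

        inputs        : (P, N) array of binary input patterns
        weights, bias : parameters of the output unit (assumed sigmoid model)
        pattern_probs : (P,) prior probabilities of the input patterns
        """
        # Conditional firing probability p(y=1 | x) for each clamped pattern.
        p_y_given_x = 1.0 / (1.0 + np.exp(-(inputs @ weights + bias)))
        # Marginal firing probability p(y=1) under the input distribution.
        p_y = np.dot(pattern_probs, p_y_given_x)
        # I(Y; X) = H(Y) - H(Y | X).
        h_y = binary_entropy(p_y)
        h_y_given_x = np.dot(pattern_probs, binary_entropy(p_y_given_x))
        return h_y - h_y_given_x

    # Example: four one-hot input patterns (as in a classic encoder task),
    # uniform prior, random output weights.
    X = np.eye(4)
    probs = np.full(4, 0.25)
    rng = np.random.default_rng(0)
    w, b = rng.normal(size=4), 0.0
    print(mutual_information(X, w, b, probs))

Maximizing such a quantity with respect to the weights (by gradient ascent or a stochastic approximation of it) is one generic way to realize an information-maximization criterion; the paper's actual rule, with its Hebbian and anti-Hebbian terms, is derived from the Boltzmann Machine formulation and is not reproduced here.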