Abstract
We describe new algorithms for self-organizing feature maps based on a generalized information theory. In our model, maximizing the local information transfer leads to a topologically ordered map, whereas maximizing the global information does not. The adaptation of the synaptic weights depends solely on internal variables, which constitute the representation of the signal space. Applications to vector quantization are discussed.