References
- Amari S, Cichocki A, Yang H H. A new learning algorithm for blind signal separation. Advances in Neural Information Processing Systems 8, D S Touretzky, et al. MIT Press, Cambridge, MA 1996; 757–63
- Atick J J, Li Z, Redlich A N. Understanding retinal color coding from first principles. Neural Comput. 1992; 4: 559–72
- Barlow H B, Kaushal T P, Mitchison G J. Finding minimum entropy codes. Neural Comput. 1989; 1: 412–23
- Barrow H G. Learning receptive fields. Proc. 1st IEEE Ann. Conf. on Neural Networks, vol IV. IEEE, Piscataway, NJ 1987; 115–21
- Bell A J, Sejnowski T J. An information-maximization approach to blind separation and blind deconvolution. Neural Comput. 1995; 7: 1129–59
- Cardoso J-F, Souloumiac A. Blind beamforming for non-Gaussian signals. IEE Proc. F 1993; 140: 362–70
- Comon P. Independent component analysis, a new concept? Signal Process. 1994; 36: 287–314
- Deco G, Parra L. Non-linear feature extraction by redundancy reduction in an unsupervised stochastic neural network. Neural Networks 1997; 10: 683–91
- Field D J. What is the goal of sensory coding? Neural Comput. 1994; 6: 559–601
- Földiák P. Forming sparse representations by local anti-Hebbian learning. Biol. Cybern. 1990; 64: 165–70
- Hochreiter S, Schmidhuber J. Feature extraction through LOCOCODE. Neural Comput. 1999; 11: 679–714
- Jutten C, Herault J. Blind separation of sources, part I: an adaptive algorithm based on neuromimetic architecture. Signal Process. 1991; 24: 1–10
- Lindstädt S. Comparison of two unsupervised neural network models for redundancy reduction. Proc. 1993 Connectionist Models Summer School, M C Mozer, et al. Erlbaum, Hillsdale, NJ 1993; 308–15
- Linsker R. Self-organization in a perceptual network. IEEE Comput. 1988; 21: 105–17
- Miller K D. A model for the development of simple cell receptive fields and the ordered arrangement of orientation columns through activity-dependent competition between on- and off-centre inputs. J. Neurosci. 1994; 14: 409–41
- Molgedey L, Schuster H G. Separation of independent signals using time-delayed correlations. Phys. Rev. Lett. 1994; 72: 3634–7
- Nadal J-P, Parga N. Redundancy reduction and independent component analysis: conditions on cumulants and adaptive approaches. Neural Comput. 1997; 9: 1421–56
- Parra L, Deco G, Miesbach S. Statistical independence and novelty detection with information preserving nonlinear maps. Neural Comput. 1996; 8: 260–9
- Rubner J, Schulten K. Development of feature detectors by self-organization: a network model. Biol. Cybern. 1990; 62: 193–9
- Rubner J, Tavan P. A self-organization network for principal-component analysis. Europhys. Lett. 1989; 10: 693–8
- Schmidhuber J H. Learning factorial codes by predictability minimization. Neural Comput. 1992; 4: 863–79
- Schmidhuber J H. Netzwerkarchitekturen, Zielfunktionen und Kettenregel [Network architectures, objective functions and chain rule]. Habilitation thesis. Institut für Informatik, Technische Universität München 1993
- Schmidhuber J H. Neural predictors for detecting and removing redundant information. Adaptive Behavior and Learning, H Cruse, J Dean, H Ritter. Kluwer, Dordrecht 1999, in press
- Schmidhuber J H, Prelinger D. Discovering predictable classifications. Neural Comput. 1993; 5: 625–35
- Schmidhuber J H, Eldracher M, Foltin B. Semilinear predictability minimization produces well-known feature detectors. Neural Comput. 1996; 8: 773–86
- Schraudolph N N. Optimization of entropy with neural networks. PhD Thesis. University of California, San Diego 1995
- Schraudolph N N. Local gain adaptation in stochastic gradient descent. Proc. 9th Int. Conf. on Artificial Neural Networks. IEE, London 1999, in press
- Schraudolph N N, Sejnowski T J. Unsupervised discrimination of clustered data via optimization of binary information gain. Advances in Neural Information Processing Systems 5, S J Hanson, et al. Morgan Kaufmann, San Mateo, CA 1993; 499–506