Original Article

Entropy optimization by the PFANN network: application to blind source separation

Pages 171-186 | Received 11 Jan 1999, Published online: 09 Jul 2009

References

  • Amari S-I, Chen T-P, Cichocki A. Stability analysis of learning algorithms for blind source separation. Neural Networks 1997; 10: 1345–51
  • Bell A J, Sejnowski T J. An information maximization approach to blind separation and blind deconvolution. Neural Comput. 1995; 7: 1129–59
  • Bellini S. Bussgang techniques for blind equalization. IEEE Global Telecommun. Conf. Rec. December, 1986; 1634–40
  • Comon P. Independent component analysis, a new concept?. Signal Process. 1994; 36: 287–314
  • Dehaene J, Twum-Danso N. Local adaptive algorithms for information maximization in neural networks. ICASSP 97: Proc. Int. Conf. on Acoustics, Speech and Signal Processing (1997). 1997; 59–62
  • Fiori S, Piazza F. A study on functional-link neural units with maximum entropy response. ICANN 98: Proc 8th Int. Conf. on Artificial Neural Networks (1998). Springer, Berlin 1998; II: 493–8
  • Fiori S, Bucciarelli P, Piazza F. Blind signal flatting using warping neural modules. IJCNN 98: Proc. Int. Joint Conf. on Neural Networks (1998). IEEE, Piscataway, NJ 1998; 2: 2312–7
  • Fiori S. Blind source separation by new M-WARP algorithm. Electron. Lett. 1999; 35: 269–70
  • Gustafsson M. Gaussian mixture and kernel based approach to blind source separation using neural networks. ICANN 98: Proc 8th Int. Conf. on Artificial Neural Networks (1998). Springer, Berlin 1998; II: 869–74
  • Hyvärinen A, Oja E. Independent component analysis by general non-linear Hebbian-like rules. Signal Process. 1998; 64: 301–13
  • Laughlin S. A simple coding procedure enhances a neuron's information capacity. Z. Naturf. 1981; 36: 910–2
  • Linsker R. An application of the principle of maximum information preservation to linear systems. NIPS 88: Advances in Neural Information Processing Systems 1 (1988). Morgan-Kaufmann, San Mateo, CA 1989; 186–94
  • Linsker R. Local synaptic rules suffice to maximize mutual information in a linear network. Neural Comput. 1992; 4: 691–702
  • Moerland P. Mixtures of experts estimate a posteriori probabilities. ICANN 97: Proc 7th Int. Conf. on Artificial Neural Networks (1997). Springer, Berlin 1997; 499–505
  • Nadal J P, Brunel N, Parga N. Nonlinear feedforward networks with stochastic inputs: infomax implies redundancy reduction. Network: Comput. Neural Syst. 1998; 9: 207–17
  • Obradovic D, Deco G. Unsupervised learning for blind source separation: an information-theoretic approach. ICASSP 97: Proc. Int. Conf. on Acoustics, Speech and Signal Processing (1997). 1997; 127–30
  • Plumbley M D. Efficient information transfer and anti-Hebbian neural networks. Neural Networks 1993; 6: 823–33
  • Plumbley M D. Approximating optimal information transmission using a local Hebbian algorithm in a double feedback loop. ICANN 93: Proc. 3rd Int. Conf. on Artificial Neural Networks (1993). Springer, Berlin 1993; 435–40
  • Pao Y-H. Adaptive Pattern Recognition and Neural Networks. Addison-Wesley, Reading, MA 1989; ch 8
  • Roth Z, Baram Y. Multidimensional density shaping by sigmoids. IEEE Trans. Neural Networks 1996; 7: 1291–8
  • Sudjianto A, Hassoun M H. Nonlinear Hebbian rule: a statistical interpretation. ICNN 94: Proc. Int. Conf. on Neural Networks (1994). 1994; 2: 1247–52
  • Taleb A, Jutten C. Entropy optimization-application to source separation. ICANN 97: Proc 7th Int. Conf. on Artificial Neural Networks (1997). Springer, Berlin 1997; 529–34
  • Torkkola K. Blind deconvolution, information maximization and recursive filters. ICASSP 97: Proc. Int. Conf. on Acoustics, Speech and Signal Processing (1997). 1997; 3301–4
  • Vapnik V. The support vector method. ICANN 97: Proc 7th Int. Conf. on Artificial Neural Networks (1997). Springer, Berlin 1997; 263–71
  • Xu L, Cheung C C, Yang H H, Amari S-I. Independent component analysis by the information-theoretic approach with mixture of densities. IJCNN 98: Proc. Int. Joint Conf. on Neural Networks (1998). 1998; 1821–6
  • Xu L, Cheung C C, Ruan J, Amari S-I. Nonlinearity and separation capability: further justifications for the ICA algorithm with a learned mixture of parametric densities. ESANN 97: Proc. Eur. Symp. on Artificial Neural Networks (1997). 1997; 291–6
  • Xu L, Cheung C C, Amari S-I. Nonlinearity, separation capability and learned parametric mixture ICA algorithms. Int. J. Neural Syst. special issue 1998, in press
  • Yang H H, Amari S-I. Adaptive online learning algorithms for blind separation: maximum entropy and minimal mutual information. Neural Comput. 1997; 9: 1457–82
  • Yang Y, Barron A R. An asymptotic property of model selection criteria. IEEE Trans. Information Theory 1998; 44: 95–116
  • Zurada J M. Introduction to Artificial Neural Systems. West, St Paul, MN 1992
