Original Article

Redundancy reduction with information-preserving nonlinear maps

Pages 61-72 | Published online: 09 Jul 2009
 

Abstract

The basic idea of linear principal component analysis (PCA) is to decorrelate the coordinates by an orthogonal linear transformation. In this paper we generalize this idea to the nonlinear case and, at the same time, drop the usual restriction to Gaussian distributions. The linearity and orthogonality conditions of linear PCA are replaced by the condition of volume conservation, which prevents the nonlinear transformation from generating spurious information. This leads to another very general class of nonlinear transformations, the symplectic maps. Instead of minimizing correlations, we then minimize the redundancy measured at the output coordinates; this generalizes second-order statistics, which are valid only for Gaussian output distributions, to higher-order statistics. The proposed paradigm implements Barlow's redundancy-reduction principle for unsupervised feature extraction. The resulting factorial representation of the joint probability distribution presumably facilitates density estimation and is applied in particular to novelty detection.
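
As a rough illustration of the volume-conservation idea described in the abstract (not the authors' architecture), the following Python sketch applies a triangular coupling map, whose Jacobian determinant is 1, to correlated Gaussian data and compares a simple redundancy proxy, the sum of histogram-estimated marginal entropies, before and after the map. The shift function, its parameter value, and the entropy estimator are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def coupling_forward(x, shift_fn):
    # Triangular "coupling" map: y1 = x1, y2 = x2 + shift_fn(x1).
    # Its Jacobian is lower triangular with unit diagonal, so det = 1
    # and the map conserves volume (no spurious information is created).
    x1, x2 = x[:, :1], x[:, 1:]
    return np.concatenate([x1, x2 + shift_fn(x1)], axis=1)

def marginal_entropy(samples, bins=60):
    # Plug-in histogram estimate of the differential entropy of a 1-D sample.
    density, edges = np.histogram(samples, bins=bins, density=True)
    mass = density * np.diff(edges)
    nz = mass > 0
    return -np.sum(mass[nz] * np.log(density[nz]))

def redundancy_proxy(x):
    # Sum of marginal entropies. For a volume-conserving map the joint
    # entropy is unchanged, so lowering this sum lowers the redundancy.
    return sum(marginal_entropy(x[:, i]) for i in range(x.shape[1]))

# Strongly correlated (hence redundant) 2-D Gaussian data.
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=20000)

# Hand-picked shear-like shift for illustration; in the paper the shift
# would instead be a trainable nonlinear map whose parameters are chosen
# to minimize the redundancy of the output coordinates.
y = coupling_forward(x, shift_fn=lambda x1: -0.9 * x1)

print("redundancy proxy at the input :", round(redundancy_proxy(x), 3))
print("redundancy proxy at the output:", round(redundancy_proxy(y), 3))

Because the shear removes the conditional mean of the second coordinate given the first, the output marginal entropies sum to less than those of the input while the joint entropy is unchanged, so the redundancy proxy decreases; a trained nonlinear, higher-order version of this idea is what the paper develops.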
