Affiliation: Image Processing Laboratory (IPL). Universitat de València, Spain
(Joint work with Jesús Malo and Valero Laparra.) We generalize a class of projection pursuit methods to transform arbitrary multidimensional data into multivariate Gaussian data, thus attaining statistical independence of the transformed components. The resulting factorization of the original probability density function (PDF) is very useful for tackling density estimation and unsupervised learning problems.
The proposed analysis enables a number of novel ways to solve practical problems in high-dimensional scenarios, such as those encountered in image processing, speech recognition, array processing, or bioinformatics.
When data come from a linear transformation of independent non-Gaussian sources, independent component analysis (ICA) methods can efficiently solve the factorization problem. However, when the transformation is non-linear, ICA methods are no longer useful.
The general framework consists of the sequential application of a two-step processing unit: a univariate marginal Gaussianization of each component, followed by an orthogonal transform.
This iterative scheme generalizes previous ICA- and PCA-based projection pursuit methods to include even random rotations.
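The two-step iteration can be sketched in a few lines of NumPy/SciPy. This is a minimal illustration, not the authors' implementation: it assumes a rank-based empirical-CDF estimate for the marginal Gaussianization and uses random orthogonal matrices (one of the rotation choices mentioned above); all function names are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def marginal_gaussianization(X):
    """Map each column to a standard Gaussian via its empirical CDF (rank-based sketch)."""
    n = X.shape[0]
    # ranks -> empirical CDF values strictly inside (0, 1)
    U = (np.argsort(np.argsort(X, axis=0), axis=0) + 0.5) / n
    return norm.ppf(U)

def random_rotation(d, rng):
    """Draw a random orthogonal matrix via QR decomposition of a Gaussian matrix."""
    Q, R = np.linalg.qr(rng.standard_normal((d, d)))
    return Q * np.sign(np.diag(R))  # sign fix for a uniform distribution over rotations

def iterative_gaussianization(X, n_iters=50, seed=0):
    """Alternate marginal Gaussianization and rotation until data look jointly Gaussian."""
    rng = np.random.default_rng(seed)
    for _ in range(n_iters):
        X = marginal_gaussianization(X)           # step 1: univariate Gaussianization
        X = X @ random_rotation(X.shape[1], rng)  # step 2: orthogonal transform
    return X

# toy example: strongly non-Gaussian 2-D data
rng = np.random.default_rng(1)
X = rng.uniform(size=(2000, 2)) ** 3
Z = iterative_gaussianization(X)
```

Replacing `random_rotation` with the PCA or ICA transform of the current data recovers the earlier projection pursuit variants that this scheme generalizes.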
The relation to other methods, such as deep neural networks, is pointed out. The considered class of methods is shown to be invertible and differentiable for any rotation, while its convergence properties do depend on the selected rotation.
The performance is successfully illustrated in a number of multidimensional data processing problems such as image synthesis, classification, and denoising.