Fraunhofer-Gesellschaft
2002
Conference Paper
Title

Nonlinear PCA: A new hierarchical approach

Abstract
Traditionally, nonlinear principal component analysis (NLPCA) is seen as a nonlinear generalization of standard (linear) principal component analysis (PCA). So far, most of these generalizations rely on a symmetric type of learning. Here we propose an algorithm that extends PCA into NLPCA through a hierarchical type of learning. The hierarchical algorithm (h-NLPCA), like many versions of the symmetric one (s-NLPCA), is based on a multi-layer perceptron with an auto-associative topology, whose learning rule has been upgraded to accommodate the desired discrimination between components. With h-NLPCA we seek not only the nonlinear subspace spanned by the optimal set of components, ideal for data compression, but also pay particular attention to the order in which these components appear. Owing to its hierarchical nature, the algorithm proves very efficient at detecting meaningful nonlinear features in real-world data, as well as at providing a nonlinear whitening. Furthermore, in a quantitative analysis, h-NLPCA achieves better classification accuracies with fewer components than most traditional approaches.
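The hierarchical learning described above can be illustrated with a minimal numpy sketch: the error is the sum, over k, of the reconstruction error obtained when only the first k bottleneck components are kept, which forces an ordering of the components. The function name and the linear stand-in for the auto-associative network are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hierarchical_error(X, encode, decode, n_components):
    """Sum of reconstruction errors E_1 + ... + E_d, where E_k keeps
    only the first k bottleneck components. Minimizing this (rather
    than the single symmetric error E_d) orders the components, as
    in h-NLPCA."""
    Z = encode(X)                      # (n_samples, n_components)
    total = 0.0
    for k in range(1, n_components + 1):
        Zk = Z.copy()
        Zk[:, k:] = 0.0                # suppress components k+1..d
        total += np.mean((X - decode(Zk)) ** 2)
    return total

# Toy illustration with a linear encoder/decoder standing in for
# the multi-layer perceptron of the paper:
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
W = rng.normal(size=(5, 2))            # 2 bottleneck components
encode = lambda X: X @ W
decode = lambda Z: Z @ W.T
err = hierarchical_error(X, encode, decode, 2)
```

Because the symmetric error is just the last term of the sum, the hierarchical error is always at least as large; the extra terms are what penalize information carried by later components.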
Author(s)
Scholz, M.
Vigario, R.
Mainwork
10th European Symposium on Artificial Neural Networks, ESANN 2002. Proceedings  
Conference
European Symposium on Artificial Neural Networks (ESANN) 2002  
Language
English
Keyword(s)
  • nonlinear principal component analysis
  • generalization
  • principal component analysis
  • learning
  • data compression