Fraunhofer-Gesellschaft
Year
1990
Type
Journal Article
Title

Limitations of multi-layer perceptrons - steps towards genetic neural networks

Abstract
In this paper we investigate multi-layer perceptron networks in the task domain of Boolean functions. We demystify the multi-layer perceptron network by showing that it just divides the input space into regions constrained by hyperplanes. We use this information to construct minimal training sets. Despite using minimal training sets, the learning time of multi-layer perceptron networks with backpropagation scales exponentially for complex Boolean functions. But modular neural networks which consist of independently trained subnetworks scale very well. We conjecture that the next generation of neural networks will be genetic neural networks which evolve their structure. We confirm Minsky and Papert: "The future of neural networks is tied not to the search for some single, universal scheme to solve all problems at once, but to the evolution of a many-faceted technology of network design."
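The hyperplane view described in the abstract can be made concrete with a small hand-built example (not taken from the paper; the weights below are illustrative): a two-layer perceptron computes XOR by carving the input square with two hyperplanes and combining the resulting half-spaces.

```python
def step(z):
    """Heaviside threshold: 1 if the input lies on the positive side of the hyperplane."""
    return 1 if z > 0 else 0

def mlp_xor(x1, x2):
    """Two-layer perceptron computing the Boolean function XOR.

    Each hidden unit corresponds to one hyperplane cutting the input space;
    the output unit combines the two half-spaces into the XOR region.
    Weights are hand-chosen for illustration.
    """
    h1 = step(x1 + x2 - 0.5)    # hyperplane 1: x1 + x2 > 0.5  (the OR region)
    h2 = step(x1 + x2 - 1.5)    # hyperplane 2: x1 + x2 > 1.5  (the AND region)
    return step(h1 - h2 - 0.5)  # inside OR but outside AND => XOR

# The four corners of the Boolean input space map to XOR's truth table:
# (0,0) -> 0, (0,1) -> 1, (1,0) -> 1, (1,1) -> 0
```

Enumerating all input corners this way is exactly how a minimal training set for a Boolean function can be constructed: each region delimited by the hyperplanes needs at least one representative point.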
Author(s)
Mühlenbein, H.
Journal
Parallel Computing
DOI
10.1016/0167-8191(90)90079-O
Language
English
Institute
GMD