Publica
A neural network based on first principles
 Institute of Electrical and Electronics Engineers (IEEE); IEEE Computer Society; IEEE Signal Processing Society: IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2020. Proceedings: May 4-8, 2020, Barcelona, Spain. Piscataway, NJ: IEEE, 2020. ISBN: 9781509066315; ISBN: 9781509066322. pp. 4002-4006
 International Conference on Acoustics, Speech and Signal Processing (ICASSP) <45, 2020, Barcelona> 

 English 
 Conference paper 
 Fraunhofer FKIE 
Abstract
In this paper, a neural network is derived from first principles, assuming only that each layer begins with a linear dimension-reducing transformation. The approach appeals to the principle of Maximum Entropy (MaxEnt) to find the posterior distribution of the input data of each layer, conditioned on the layer output variables. This posterior has a well-defined mean, the conditional mean estimator, that is calculated using a type of neural network with theoretically derived activation functions similar to sigmoid, softplus, and ReLU. This implicitly provides a theoretical justification for their use. A theorem that finds the conditional distribution and conditional mean estimator under the MaxEnt prior is proposed, unifying results for special cases. Combining layers results in an autoencoder with a conventional feed-forward analysis network and a type of linear Bayesian belief network in the reconstruction path.
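The structure the abstract describes can be illustrated with a minimal sketch: one layer applies a linear dimension-reducing transform, an activation of the softplus family (one of the theoretically derived shapes the abstract names) plays the role of a conditional-mean-style estimate, and a linear map provides the reconstruction path. This is a hypothetical illustration of the layer structure only, not the paper's actual derivation; the dimensions, the use of the transpose for reconstruction, and the choice of softplus are all assumptions made here for concreteness.

```python
import numpy as np

# Hypothetical sketch (NOT the paper's exact construction): one autoencoder
# layer with a linear dimension-reducing analysis transform, a softplus
# activation standing in for a conditional-mean-style estimator, and a
# linear reconstruction path.

rng = np.random.default_rng(0)

def softplus(z):
    # Numerically stable softplus: log(1 + exp(z)).
    # Softplus is one of the activation shapes the abstract mentions,
    # alongside sigmoid and ReLU.
    return np.log1p(np.exp(-np.abs(z))) + np.maximum(z, 0.0)

# Linear dimension-reducing transform: R^8 -> R^3 (dimensions arbitrary).
W = rng.standard_normal((3, 8))

x = rng.standard_normal(8)   # layer input
y = W @ x                    # analysis (feed-forward) path
h = softplus(y)              # activation applied to the reduced variables
x_hat = W.T @ h              # linear reconstruction path (sketch only)

print(x_hat.shape)           # reconstruction lives in the input space: (8,)
```

The point of the sketch is only the data flow: a feed-forward analysis path (`W`, `softplus`) paired with a linear reconstruction path, mirroring the autoencoder structure the abstract describes.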