
Architecture-aware Bayesian Optimization for Neural Network Tuning

Author(s): Sjöberg, Anders; Önnheim, Magnus; Gustavsson, Emil; Jirstrand, Mats

In:

Tetko, I.V., et al. (Eds.):
Artificial Neural Networks and Machine Learning - ICANN 2019. Deep Learning : 28th International Conference on Artificial Neural Networks, Munich, Germany, September 17-19, 2019, Proceedings, Part II
Cham: Springer International Publishing, 2019 (Lecture Notes in Computer Science 11728)
ISBN: 978-3-030-30483-6 (Print)
ISBN: 978-3-030-30484-3 (Online)
pp. 220-231
International Conference on Artificial Neural Networks (ICANN) <28, 2019, Munich>
English
Conference paper
Fraunhofer ITWM

Abstract
Hyperparameter optimization of a neural network is a non-trivial task: evaluating a hyperparameter setting is time-consuming, no analytical expression for the impact of the hyperparameters is available, and the evaluations are noisy in the sense that the result depends on the training process and the weight initialization. Bayesian optimization is a powerful tool for handling these problems. However, hyperparameter optimization of neural networks poses additional challenges, since the hyperparameters can be integer-valued, categorical, and/or conditional, whereas Bayesian optimization often assumes the variables to be real-valued. In this paper we present an architecture-aware transformation of neural networks, applied in the kernel of a Gaussian process, to boost the performance of hyperparameter optimization.
The empirical experiment in this paper demonstrates that introducing an architecture-aware transformation in the kernel gives the Bayesian optimizer a clear improvement over a naïve implementation, and that the results are comparable to those of other state-of-the-art methods.
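
The abstract does not spell out the transformation itself, but the general idea can be illustrated with a minimal sketch: before the kernel compares two hyperparameter settings, each setting is mapped to a canonical form, for example by rounding integer-valued dimensions (number of layers, layer widths) and masking the widths of layers the architecture does not use, so that two settings describing the same network are treated as identical by the Gaussian process. The variable layout, the transform, and the function names below are illustrative assumptions, not the paper's actual construction.

    import numpy as np

    def architecture_transform(x, n_layers_dim=0, width_dims=(1, 2, 3)):
        # Hypothetical architecture-aware transformation: round the
        # integer-valued hyperparameters (number of layers, layer widths)
        # and zero out the widths of layers the architecture does not use,
        # so that two settings describing the same network map to the
        # same point in kernel space.
        z = np.asarray(x, dtype=float).copy()
        z[n_layers_dim] = np.round(z[n_layers_dim])
        n_layers = int(z[n_layers_dim])
        for i, d in enumerate(width_dims):
            z[d] = np.round(z[d]) if i < n_layers else 0.0
        return z

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0, transform=None):
        # Squared-exponential kernel; optionally applies the
        # architecture-aware transformation to the inputs first.
        if transform is not None:
            X1 = np.array([transform(x) for x in X1])
            X2 = np.array([transform(x) for x in X2])
        sq = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
              - 2.0 * X1 @ X2.T)
        return variance * np.exp(-0.5 * sq / lengthscale**2)

    # Two settings that encode the same one-layer network (they differ
    # only in the width of the unused second layer) now have kernel
    # value 1, i.e. the Gaussian process treats them as identical.
    a = np.array([1.2, 64.0, 128.0, 0.0])
    b = np.array([0.8, 64.0, 32.0, 0.0])
    K = rbf_kernel(np.stack([a, b]), np.stack([a, b]),
                   transform=architecture_transform)
    print(K)

Without the transform, the kernel would assign these two equivalent settings a correlation well below 1 and the optimizer would waste evaluations distinguishing them; collapsing them inside the kernel is what the "naïve implementation" in the abstract lacks.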

URL: http://publica.fraunhofer.de/dokumente/N-593467.html