Authors: Graß, Alexander; Döhmen, Till; Beecks, Christian
Date: 2023-01-24
Year: 2022
Handle: https://publica.fraunhofer.de/handle/publica/434336
DOI: 10.1109/ICDEW55742.2022.00017
Scopus ID: 2-s2.0-85134877092

Abstract: Time series are prominent in a broad variety of application domains. Given a time series, how can its inherent structure be derived automatically? While Gaussian process models can describe structural characteristics through their individual covariance functions, their inference remains a computationally complex task. State-of-the-art methods therefore aim to efficiently infer an interpretable model by searching for appropriate kernel compositions within a high-dimensional hyperparameter space. In this work, we propose a new alternative approach that learns the structural components of a time series directly, without inference. To this end, we train a deep neural network on kernel-induced samples in order to obtain a generalized model for the estimation of kernel compositions. Our investigations show that the proposed approach effectively classifies kernel compositions of random time series data and estimates their hyperparameters efficiently and with high accuracy.

Language: en
Keywords: deep learning; Gaussian process; kernel; neural network; representation learning; sampling; structure discovery
Title: Sample-based Kernel Structure Learning with Deep Neural Networks for Automated Structure Discovery
Type: conference paper
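
The following is a minimal sketch of the general idea described in the abstract, not the authors' implementation: time series are drawn from Gaussian process priors with different kernel compositions, each draw is labeled with the composition that generated it, and a neural-network classifier is trained to recover the composition directly from the samples. The specific kernels, grid size, sample counts, and MLP architecture below are illustrative assumptions.

```python
# Hypothetical sketch of sample-based kernel structure classification.
# Not the paper's code; kernel set and classifier are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64)[:, None]           # evaluation grid for each series

def rbf(x, y, ls=0.1):                       # squared-exponential kernel
    return np.exp(-0.5 * (x - y.T) ** 2 / ls ** 2)

def periodic(x, y, p=0.25, ls=0.5):          # standard periodic kernel
    return np.exp(-2.0 * np.sin(np.pi * np.abs(x - y.T) / p) ** 2 / ls ** 2)

def linear(x, y):                            # linear (dot-product) kernel
    return x @ y.T

# Candidate compositions: base kernels plus simple sums/products.
compositions = {
    "RBF": lambda: rbf(t, t, ls=rng.uniform(0.05, 0.3)),
    "PER": lambda: periodic(t, t, p=rng.uniform(0.1, 0.4)),
    "LIN": lambda: linear(t, t),
    "RBF+PER": lambda: rbf(t, t) + periodic(t, t, p=rng.uniform(0.1, 0.4)),
    "LIN*PER": lambda: linear(t, t) * periodic(t, t, p=rng.uniform(0.1, 0.4)),
}

def sample_series(K, noise=1e-4):
    """Draw one series from a zero-mean GP prior with covariance K."""
    return rng.multivariate_normal(np.zeros(len(t)), K + noise * np.eye(len(t)))

# Build a labeled dataset of kernel-induced samples.
X, y = [], []
for label, make_K in compositions.items():
    for _ in range(300):
        X.append(sample_series(make_K()))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

# Train a small neural network to classify the generating composition.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out composition accuracy:", clf.score(X_te, y_te))
```

In this setup the classifier replaces kernel search at prediction time: once trained on synthetic draws, it assigns a composition to a new series in a single forward pass, which mirrors the paper's motivation of avoiding repeated GP inference over a high-dimensional hyperparameter space.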