Boos, Eugen; Zimmermann, Jan; Wiemer, Hajo; Ihlenfeldt, Steffen (2024). Investigation of alternative attention modules in transformer models for remaining useful life predictions: Addressing challenges in high-frequency time-series data. Conference paper.

Date available: 2026-03-14
Handle: https://publica.fraunhofer.de/handle/publica/510506
DOI: 10.1016/j.procir.2024.01.012
Scopus ID: 2-s2.0-85193507531
Language: en
Keywords: Attention; Complexity; Deep learning; Remaining useful life; Transformer

Abstract: Predicting the Remaining Useful Life of machine components or systems is a pivotal technology in condition-based maintenance and essential for ensuring reliability and safety in various production-engineering applications. The influx of extensive industrial data has notably enhanced the efficacy of data-driven Remaining Useful Life prediction models, especially deep learning models. One promising deep learning architecture is the Transformer-based model with the Self-Attention mechanism at its core. However, inherent limitations arise when applying Self-Attention to high-frequency time-series data with large window sizes. Due to its high computational complexity, hardware limitations hinder the practical implementation of Transformers with Self-Attention in production-engineering applications. This study investigates the use of alternative Attention modules with reduced complexity, making the approach more applicable to high-frequency time-series data. To allow comparability, this study uses the well-known C-MAPSS dataset for benchmarking Remaining Useful Life approaches. Although this dataset does not consist of high-frequency data, it demonstrates the usefulness of alternative Attention modules without noteworthy losses in model accuracy.
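For context on the complexity argument in the abstract: standard Self-Attention materializes an n-by-n score matrix, so cost grows quadratically with the input window length n, which becomes prohibitive for long high-frequency windows. The sketch below (PyTorch) contrasts this with a generic kernelized linear-attention variant whose cost is linear in n; it is an illustrative assumption and not necessarily the specific alternative Attention module evaluated in the paper.

```python
# Minimal sketch: quadratic softmax attention vs. a kernelized linear-attention
# variant (feature map phi(x) = elu(x) + 1). Illustrates the complexity
# difference only; the modules studied in the paper may differ.
import torch
import torch.nn.functional as F


def softmax_attention(q, k, v):
    """Standard scaled dot-product attention: O(n^2) in window length n."""
    scale = q.shape[-1] ** -0.5
    scores = torch.einsum("bnd,bmd->bnm", q, k) * scale   # (batch, n, n) score matrix
    return torch.einsum("bnm,bmd->bnd", scores.softmax(dim=-1), v)


def linear_attention(q, k, v, eps=1e-6):
    """Kernelized attention: O(n) in n, no n-by-n matrix is ever formed."""
    q, k = F.elu(q) + 1, F.elu(k) + 1
    kv = torch.einsum("bmd,bme->bde", k, v)                # (batch, d, d) summary
    z = 1.0 / (torch.einsum("bnd,bd->bn", q, k.sum(dim=1)) + eps)
    return torch.einsum("bnd,bde,bn->bne", q, kv, z)


if __name__ == "__main__":
    b, n, d = 2, 1024, 64            # long sensor window, small head dimension
    q, k, v = (torch.randn(b, n, d) for _ in range(3))
    print(softmax_attention(q, k, v).shape)   # torch.Size([2, 1024, 64])
    print(linear_attention(q, k, v).shape)    # torch.Size([2, 1024, 64])
```

For a window of length n and head dimension d, the softmax path stores n*n scores per head, while the linear path only keeps a d*d summary, which is why such variants remain tractable on the hardware typically available in production-engineering settings.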