Fraunhofer-Gesellschaft
2024
Conference Paper
Title

Investigation of alternative attention modules in transformer models for remaining useful life predictions: Addressing challenges in high-frequency time-series data

Abstract
Predicting the Remaining Useful Life of machine components or systems is a pivotal technology in condition-based maintenance and essential for ensuring reliability and safety in various production-engineering applications. The influx of extensive industrial data has notably enhanced the efficacy of data-driven Remaining Useful Life prediction models, especially deep learning models. One of the promising deep learning architectures is the Transformer-based model with the Self-Attention mechanism at its core. However, inherent limitations arise when applying Self-Attention to high-frequency time-series data with large window sizes. Due to its high computational complexity, hardware limitations hinder the practical implementation of Transformers with Self-Attention in production-engineering applications. This study investigates alternative Attention modules with reduced complexity, making them more applicable to high-frequency time-series data. To allow comparability, this study uses the well-known C-MAPSS dataset for benchmarking Remaining Useful Life approaches. Although this dataset does not consist of high-frequency data, it demonstrates the usefulness of alternative Attention modules without noteworthy losses in model accuracy.
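The complexity issue the abstract describes can be illustrated with a minimal sketch. The paper itself does not publish its implementation here, so the following is only an illustrative comparison, assuming a single attention head and using the kernel feature map phi(x) = elu(x) + 1 popularized by linear-attention work: standard Self-Attention materializes an (n, n) score matrix, costing O(n² d) in time and O(n²) in memory for window size n, while a kernelized alternative reorders the computation to O(n d²), linear in the window size.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard Self-Attention: the (n, n) score matrix makes this O(n^2 * d),
    # which is what becomes prohibitive for large high-frequency windows.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, eps=1e-6):
    # Illustrative reduced-complexity alternative: with a positive feature
    # map phi, attention factorizes as phi(Q) @ (phi(K)^T V). Computing the
    # (d, d) matrix phi(K)^T V first makes the cost O(n * d^2), linear in n.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                   # (d, d): independent of window size n
    Z = Qp @ Kp.sum(axis=0) + eps   # (n,) row normalizer
    return (Qp @ KV) / Z[:, None]
```

Both functions map (n, d) queries, keys, and values to an (n, d) output; only the order of matrix products differs, which is why the linear variant trades exact softmax weighting for linear scaling in the window size.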
Author(s)
Boos, Eugen
Technische Universität Dresden
Zimmermann, Jan
Technische Universität Dresden
Wiemer, Hajo
Technische Universität Dresden
Ihlenfeldt, Steffen  
Fraunhofer-Institut für Werkzeugmaschinen und Umformtechnik IWU  
Mainwork
Procedia CIRP
Funder
Deutsche Forschungsgemeinschaft  
Conference
31st CIRP Conference on Life Cycle Engineering, LCE 2024
Open Access
DOI
10.1016/j.procir.2024.01.012
Additional link
Full text
Language
English
Keyword(s)
  • Attention
  • Complexity
  • Deep learning
  • Remaining useful life
  • Transformer