Date
September 2022
Type
Presentation
Title
Selected Challenges in ML Safety for Railway
Title Supplement
Presentation held at IKS Online Seminar "The Role of AI in Railway", September 15, 2022, Online
Abstract
Neural networks (NNs) have been introduced in safety-critical applications ranging from autonomous driving to train inspection. I argue that to close the demo-to-product gap, we need scientifically rooted engineering methods that can efficiently improve the quality of NNs. In particular, I consider a structural approach (via GSN) to argue the quality of neural networks with NN-specific dependability metrics. A systematic analysis covering the quality of data collection, training, testing, and operation allows us to identify many unsolved research questions: (1) solve the denominator/edge-case problem with synthetic data, backed by quantifiable argumentation; (2) reach the performance target by combining classical and data-based methods in vision; (3) decide the threshold (for OoD detection or otherwise) based on the risk appetite (societally accepted risk).
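The abstract's third challenge, fixing a detection threshold from an accepted risk level, can be sketched as a quantile computation over in-distribution detector scores: the accepted risk becomes a false-alarm budget, and the threshold is the score below which that fraction of in-distribution inputs stays. This is an illustrative sketch, not the method from the talk; `ood_threshold`, the Gaussian scores, and the 1% budget are all assumptions made for the example.

```python
import random

def ood_threshold(id_scores, accepted_risk):
    # Sort in-distribution OoD scores and take the (1 - accepted_risk)
    # empirical quantile: at most roughly accepted_risk of in-distribution
    # inputs will score above the threshold and be flagged as OoD.
    s = sorted(id_scores)
    k = min(len(s) - 1, int((1.0 - accepted_risk) * len(s)))
    return s[k]

# Hypothetical detector scores for in-distribution data.
random.seed(0)
id_scores = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Accept a 1% false-alarm rate on in-distribution inputs.
tau = ood_threshold(id_scores, accepted_risk=0.01)
flagged = sum(1 for x in id_scores if x > tau) / len(id_scores)
```

The societal-risk question then reduces to choosing `accepted_risk`; the quantile step itself is mechanical once that number is agreed on.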
Author(s)
Cheng, Chih-Hong  
Fraunhofer-Institut für Kognitive Systeme IKS  
Project(s)
IKS-Ausbauprojekt
Funder
Bayerisches Staatsministerium für Wirtschaft, Landesentwicklung und Energie
Conference
Online Seminar "The Role of AI in Railway" 2022  
Request publication:
bibliothek@iks.fraunhofer.de
Language
English
Institute(s)
Fraunhofer-Institut für Kognitive Systeme IKS
Fraunhofer Group
Fraunhofer-Verbund IUK-Technologie  
Keyword(s)
  • safety
  • train
  • railway
  • artificial intelligence
  • AI
  • machine learning
  • ML
  • neural networks
  • NN
  • safety-critical
