Publication Year
2021
Document Type
Poster
Title

Domain Shifts in Reinforcement Learning: Identifying Disturbances in Environments

Title Supplement
Poster presented at the Workshop on Artificial Intelligence Safety (AISafety 2021), co-located with the 30th International Joint Conference on Artificial Intelligence (IJCAI 2021), Virtual, August 2021
Abstract
End-to-End Deep Reinforcement Learning (RL) systems return an action no matter what situation they are confronted with, even for situations that differ entirely from those an agent has been trained for. In this work, we propose to formalize the changes in the environment in terms of the Markov Decision Process (MDP), resulting in a more formal framework when dealing with such problems.
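As a brief illustration of the MDP-based view described in the abstract (a minimal sketch under standard MDP assumptions; the exact formalization in the poster may differ), the training environment and a disturbed deployment environment can be written as two MDPs that share states, actions, and rewards but differ in their transition dynamics:

$$M = (\mathcal{S}, \mathcal{A}, P, R, \gamma), \qquad M' = (\mathcal{S}, \mathcal{A}, P', R, \gamma), \quad P' \neq P$$

The domain shift is then the discrepancy between $P$ and $P'$, which the agent only encounters after training and for which it was never optimized.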
Author(s)
Haider, Tom  
Fraunhofer-Institut für Kognitive Systeme IKS  
Schmoeller Roza, Felippe
Fraunhofer-Institut für Kognitive Systeme IKS  
Eilers, Dirk  
Fraunhofer-Institut für Kognitive Systeme IKS  
Roscher, Karsten  
Fraunhofer-Institut für Kognitive Systeme IKS  
Günnemann, Stephan
Technische Univ. München, München
Funder
Bayerisches Staatsministerium für Wirtschaft, Landesentwicklung und Energie StMWi  
Conference
Workshop on Artificial Intelligence Safety (AISafety) 2021
International Joint Conference on Artificial Intelligence (IJCAI) 2021  
Rights
Use according to copyright law
DOI
10.24406/publica-fhg-412093
Language
English
Institute
Fraunhofer-Institut für Kognitive Systeme IKS
Keyword(s)
  • reinforcement learning
  • RL
  • safety critical
  • Markov Decision Process
  • MDP
  • safety
  • Safe Intelligence
  • robustness
  • domain shift
  • out of distribution