Fraunhofer-Gesellschaft
2023
Conference Paper
Title
An Empirical Evaluation of the Rashomon Effect in Explainable Machine Learning
Abstract
The Rashomon Effect describes the following phenomenon: for a given dataset there may exist many models with equally good performance but with different solution strategies. The Rashomon Effect has implications for Explainable Machine Learning, especially for the comparability of explanations. We provide a unified view on three different comparison scenarios and conduct a quantitative evaluation across different datasets, models, attribution methods, and metrics. We find that hyperparameter tuning plays a role and that metric selection matters. Our results provide empirical support for previously anecdotal evidence and exhibit challenges for both scientists and practitioners.
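The phenomenon stated in the abstract can be sketched in a few lines of plain Python. The toy data, models, and threshold classifier below are all hypothetical illustrations, not from the paper: two linear models make identical predictions on a dataset with a duplicated feature, yet a weight-based attribution assigns their importance to entirely different features.

```python
# Toy illustration of the Rashomon Effect (all data and models hypothetical):
# two models with identical performance but disjoint feature attributions.

def predict(weights, x):
    """Linear threshold model: predict class 1 if the weighted sum is positive."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else 0

# Feature 0 and feature 1 are exact duplicates, so either one alone
# suffices to separate the two classes.
X = [(1.0, 1.0, 0.2), (0.9, 0.9, -0.1), (-1.0, -1.0, 0.3), (-0.8, -0.8, 0.0)]
y = [1, 1, 0, 0]

model_a = (1.0, 0.0, 0.0)   # relies entirely on feature 0
model_b = (0.0, 1.0, 0.0)   # relies entirely on feature 1

acc_a = sum(predict(model_a, x) == t for x, t in zip(X, y)) / len(y)
acc_b = sum(predict(model_b, x) == t for x, t in zip(X, y)) / len(y)

print(acc_a, acc_b)          # equally good performance: 1.0 1.0
print(model_a, model_b)      # yet the attributions disagree completely
```

In this construction the weight vectors play the role of attributions; any explanation method that faithfully reflects the model must disagree between the two, which is exactly why the comparability of explanations becomes a problem.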
Author(s)
Müller, Sebastian
Universität Bonn
Toborek, Vanessa
Universität Bonn
Beckh, Katharina
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  
Jakobs, Matthias
Technische Universität Dortmund
Bauckhage, Christian  
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  
Welke, Pascal
Technische Universität Wien
Mainwork
Machine Learning and Knowledge Discovery in Databases: Research Track. European Conference, ECML PKDD 2023. Proceedings, Part III
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases 2023  
DOI
10.1007/978-3-031-43418-1_28
Language
English
Keyword(s)
  • Attribution Methods
  • Disagreement Problem
  • Explainable ML
  • Interpretable ML
  • Rashomon Effect
