Year
2010
Publication Type
Journal Article
Title

How to explain individual classification decisions

Abstract
After building a classifier with modern machine-learning tools, we typically have a black box at hand that predicts well on unseen data. Thus we get an answer to the question of what the most likely label of a given unseen data point is. However, most methods provide no answer to why the model predicted a particular label for a single instance, or to which features were most influential for that particular instance. The only method currently able to provide such explanations is the decision tree. This paper proposes a procedure which (based on a set of assumptions) allows the decisions of any classification method to be explained.
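The procedure the abstract refers to builds local explanations from the gradient of the predicted class probability at the instance of interest: features with large gradient magnitude are the ones whose small changes would most affect that particular prediction. The following is a minimal Python sketch of that idea only; the finite-difference gradient is an illustration and not the paper's exact estimator, and the dataset and classifier choice are assumptions.

# Local "explanation vector" for one test point: the gradient of the
# predicted class probability with respect to the input features.
# Finite differences are used here purely for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=4, random_state=0)
clf = SVC(probability=True, random_state=0).fit(X, y)  # black-box classifier

def explanation_vector(model, x, label=1, eps=1e-4):
    """Central-difference gradient of P(label | x); large-magnitude
    entries mark the features most influential for this prediction."""
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        p_plus = model.predict_proba([x + step])[0, label]
        p_minus = model.predict_proba([x - step])[0, label]
        grad[j] = (p_plus - p_minus) / (2 * eps)
    return grad

x0 = X[0]
print("prediction:", clf.predict([x0])[0])
print("explanation vector:", explanation_vector(clf, x0))

Because the sketch only queries predict_proba, it treats the classifier as a black box, which is the point of the paper: the same recipe applies to any classification method that yields (or can be smoothed into) a probability estimate.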
Author(s)
Baehrens, D.
Schroeter, T.
Harmeling, S.
Kawanabe, M.
Hansen, K.
Müller, K.-R.
Journal
Journal of Machine Learning Research  
Language
English
Institute
FIRST