Fraunhofer-Gesellschaft
2020
Conference Paper
Title

Treating dialogue quality evaluation as an anomaly detection problem

Abstract
Dialogue systems for interaction with humans have been enjoying increased popularity in research and industry. To this day, the best way to estimate their success is by means of human evaluation rather than automated approaches, despite the abundance of work done in the field. In this paper, we investigate the effectiveness of framing dialogue evaluation as an anomaly detection task. The paper examines four dialogue modeling approaches and how their objective functions correlate with human annotation scores. A high-level perspective yields negative results; however, a more in-depth look shows limited potential for using anomaly detection to evaluate dialogues.
Author(s)
Nedelchev, Rostislav
Lehmann, Jens  
Usbeck, Ricardo  
Mainwork
12th Language Resources and Evaluation Conference, LREC 2020. Proceedings. Online resource  
Conference
Language Resources and Evaluation Conference (LREC) 2020  
Language
English
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  