Fraunhofer-Gesellschaft
2020
Conference Paper
Title

Identifying and Classifying User Requirements in Online Feedback via Crowdsourcing

Abstract
[Context and motivation] App stores and social media channels such as Twitter enable users to share feedback regarding software. Due to its high volume, it is hard to effectively and systematically process such feedback to obtain a good understanding of users' opinions about a software product. [Question/problem] Tools based on natural language processing and machine learning have been proposed as an inexpensive mechanism for classifying user feedback. Unfortunately, the accuracy of these tools is imperfect, which jeopardizes the reliability of the analysis results. We investigate whether assigning micro-tasks to crowd workers could be an alternative technique for identifying and classifying requirements in user feedback. [Principal ideas/results] We present a crowdsourcing method for filtering out irrelevant app store reviews and for identifying features and qualities. A validation study has shown positive results in terms of feasibility, accuracy, and cost. [Contribution] We provide evidence that crowd workers can be an inexpensive yet accurate resource for classifying user reviews. Our findings contribute to the debate on the roles of and synergies between humans and AI techniques.
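The crowdsourcing approach described in the abstract assigns the same micro-task (e.g. "is this review a feature request, a quality concern, or irrelevant?") to several workers and aggregates their answers. As a purely illustrative sketch — not the authors' implementation; all review IDs, labels, and function names here are hypothetical — a minimal majority-vote aggregation over worker labels could look like this:

```python
from collections import Counter

def majority_vote(labels):
    """Aggregate crowd-worker labels for one review: return the label
    with a strict majority, or "no-agreement" when the top count is tied."""
    counts = Counter(labels)
    (top, count), = counts.most_common(1)
    # A plurality tie means the crowd did not agree on a single label.
    if sum(1 for c in counts.values() if c == count) > 1:
        return "no-agreement"
    return top

# Hypothetical annotations from three workers per app store review.
annotations = {
    "review_1": ["feature", "feature", "irrelevant"],
    "review_2": ["quality", "quality", "quality"],
    "review_3": ["irrelevant", "feature", "quality"],
}

results = {rid: majority_vote(ls) for rid, ls in annotations.items()}
```

In practice, crowdsourcing platforms also weight workers by reliability or add more workers when agreement is low; this sketch only shows the simplest aggregation step.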
Author(s)
Vliet, Martijn van
Groen, Eduard C.
Fraunhofer-Institut für Experimentelles Software Engineering IESE  
Dalpiaz, Fabiano
Brinkkemper, Sjaak
Mainwork
26th International Working Conference on Requirements Engineering: Foundation for Software Quality, REFSQ 2020. Proceedings  
Conference
International Conference on Requirements Engineering - Foundation for Software Quality (REFSQ) 2020  
Open Access
DOI
10.1007/978-3-030-44429-7_11
Language
English
Keyword(s)
  • Crowd-based requirements engineering
  • Crowdsourcing
  • Online user reviews
  • Quality requirements
  • User feedback analysis