Fraunhofer-Gesellschaft

Publica


On the Road to Enriching the App Improvement Process with Emotions

 
Authors: Scherr, Simon André; Mennig, Patrick; Kammler, Christian; Elberzhager, Frank

Published in:
Institute of Electrical and Electronics Engineers (IEEE):
IEEE 27th International Requirements Engineering Conference Workshops, REW 2019. Proceedings: 23-27 September 2019, Jeju Island, South Korea
Piscataway, NJ: IEEE, 2019
ISBN: 978-1-7281-5165-6
ISBN: 978-1-7281-5166-3
pp.84-91
International Requirements Engineering Conference (RE) <27, 2019, Jeju Island/South Korea>
Bundesministerium für Bildung und Forschung BMBF (Deutschland)
02K14A182; Opti4Apps
Bundesministerium für Wirtschaft und Energie BMWi (Deutschland)
03SBE112D; EnStadt: Pfaff
Bundesministerium für Wirtschaft und Energie BMWi (Deutschland)
03SBE112G; EnStadt: Pfaff
Bundesministerium für Bildung und Forschung BMBF (Deutschland)
03SBE112D; EnStadt: Pfaff
Bundesministerium für Bildung und Forschung BMBF (Deutschland)
03SBE112G; EnStadt: Pfaff
Language: English
Document type: Conference Paper
Institute: Fraunhofer IESE
Keywords: Emotions; Apps; TrueDepth Camera; Study; User Experience

Abstract
The success of an app depends on its acceptance by the users. This is closely related to a good user experience and the fulfillment of the users' requirements. Recent research emphasizes the use of user feedback as part of the requirements engineering process. Analyzing emotions can play an important role in determining how users perceive a product. Therefore, we propose the idea of emotion tracking on the users' devices. We aim at a method for the verification and validation of software requirements. To get closer to this goal, we performed a study using the iPhone's TrueDepth Camera. The aim was to link muscular positions to emotions. In this work, we present a study with actors: we evaluated facial muscle recordings of 23 actors playing scenes from a popular TV show. Our analysis shows that deriving emotions with the help of the TrueDepth Camera is a promising approach. We were able to generate an initial model for detecting basic emotions relative to a person's neutral face. This provides a first step towards automatically identifying the emotions of users while they are using an app. Such feedback could then be used as an additional source of information for subsequent app development activities.
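The abstract describes reading facial muscle activations from the TrueDepth Camera and interpreting them relative to a person's neutral face. The following is a minimal sketch of that idea, assuming ARKit's ARFaceAnchor blend shape coefficients as the muscle signal; the EmotionSketch class, the selected blend shapes, and the threshold values are illustrative assumptions, not the model derived in the paper.

```swift
import ARKit

// Illustrative sketch only (not the authors' implementation): reads ARKit's
// TrueDepth blend shape coefficients, stores a neutral-face baseline, and
// labels a few basic emotions from the deltas using naive, hand-picked thresholds.
final class EmotionSketch: NSObject, ARSessionDelegate {
    private var neutralBaseline: [ARFaceAnchor.BlendShapeLocation: Float] = [:]

    // Call once while the user holds a neutral expression.
    func captureBaseline(from anchor: ARFaceAnchor) {
        neutralBaseline = anchor.blendShapes.mapValues { $0.floatValue }
    }

    // Very rough heuristic mapping of blend-shape deltas to an emotion label.
    func estimateEmotion(from anchor: ARFaceAnchor) -> String {
        func delta(_ key: ARFaceAnchor.BlendShapeLocation) -> Float {
            let current = anchor.blendShapes[key]?.floatValue ?? 0
            return current - (neutralBaseline[key] ?? 0)
        }
        let smile    = (delta(.mouthSmileLeft) + delta(.mouthSmileRight)) / 2
        let frown    = (delta(.mouthFrownLeft) + delta(.mouthFrownRight)) / 2
        let browDown = (delta(.browDownLeft) + delta(.browDownRight)) / 2
        let jawOpen  = delta(.jawOpen)

        if smile > 0.4 { return "joy" }
        if browDown > 0.3 && frown > 0.2 { return "anger" }
        if jawOpen > 0.5 && delta(.browInnerUp) > 0.3 { return "surprise" }
        if frown > 0.3 { return "sadness" }
        return "neutral"
    }

    // ARSessionDelegate: pick up updated face anchors from the TrueDepth camera.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            print(estimateEmotion(from: faceAnchor))
        }
    }
}
```

In a real setup, this delegate would be attached to an ARSession running an ARFaceTrackingConfiguration, and the neutral baseline would be captured once before feedback collection starts; the paper's actual model is derived from the actor study rather than fixed thresholds.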

URL: http://publica.fraunhofer.de/documents/N-575274.html