  • Publication
    How Pedestrians Perceive Autonomous Buses: Evaluating Visual Signals
    (2021); Kozachek, Diana; Konkol, Kathrin; Woelfel, Christiane; Stark, Rainer
    With the deployment of autonomous buses, sophisticated technological systems are entering our daily lives, and their signals are becoming a crucial factor in human-machine interaction. The successful implementation of visual signals requires well-researched, human-centred design as a key component of the new transportation system. The autonomous vehicle we investigated in this study uses a variety of such signals: icons, LED panels, and text. We conducted a user study with 45 participants in a virtual reality environment in which four recurring communication scenarios between an autonomous bus and its potential passengers had to be correctly interpreted. For each of the four scenarios, the efficiency and comprehension of each visual signal combination were measured to evaluate performance across different types of visual information. The results show that novel visualization concepts such as LED panels lead to highly variable efficiency and comprehension, while text and icons were well accepted. In summary, the authors of this paper present the most efficient combinations of visual signals for four real-world scenarios.
  • Publication
    Natural Virtual Reality User Interface to Define Assembly Sequences for Digital Human Models
    Digital human models (DHMs) are virtual representations of human beings. They are used to conduct, among other things, ergonomic assessments in factory layout planning. DHM software tools are challenging to use and thus require a high amount of training for engineers. In this paper, we present a virtual reality (VR) application that enables engineers to work with DHMs easily. Since VR systems with head-mounted displays (HMDs) are less expensive than CAVE systems, HMDs can be integrated more extensively into the product development process. Our application provides a reality-based interface and allows users to conduct an assembly task in VR and thus to manipulate the virtual scene with their real hands. These manipulations are used as input for the DHM to simulate human ergonomics on that basis. To this end, we introduce a software and hardware architecture, the VATS (virtual action tracking system). This paper furthermore presents the results of a user study in which the VATS was compared to the existing WIMP (Windows, Icons, Menus and Pointer) interface. The results show that the VATS enables users to conduct tasks significantly faster.