Fraunhofer-Gesellschaft
January 22, 2026
Journal Article
Title

DaRA Dataset: Combining Wearable Sensors, Location Tracking, and Process Knowledge for Enhanced Human Activity and Human Context Recognition in Warehousing

Abstract
Understanding human movement in industrial environments requires more than simple step counts: it demands contextual information to interpret activities and enhance workflows. Key factors such as location and process context are essential. However, research on context-sensitive human activity recognition is limited by the lack of publicly available datasets that include both human movement and contextual labels. Our work introduces the DaRA dataset to address this research gap. DaRA comprises over 109 h of video footage, including 32 h from wearable first-person cameras and 77 h from fixed third-person cameras. In a laboratory environment replicating a realistic warehouse, scenarios such as order picking, packaging, unpacking, and storage were captured. The movements of 18 subjects were recorded using inertial measurement units, Bluetooth devices for indoor localization, wearable first-person cameras, and fixed third-person cameras. DaRA offers detailed annotations with 12 class categories and 207 class labels covering human movements and contextual information such as process steps and locations. A total of 15 annotators and 8 revisers contributed over 1572 h of annotation and 361 h of revision. High label quality is reflected in Light's Kappa values ranging from 78.27% to 99.88%. Therefore, DaRA provides a robust, multimodal foundation for human activity and context recognition in industrial settings.
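The Light's Kappa values cited for label quality are the mean of Cohen's Kappa over all pairs of annotators. A minimal sketch of that computation, with hypothetical activity labels (this is illustrative only, not the dataset's official annotation tooling):

```python
from collections import Counter
from itertools import combinations

def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two label sequences."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # Expected agreement under independent labelling with each annotator's marginals
    expected = sum(ca[lab] * cb[lab] for lab in ca.keys() | cb.keys()) / (n * n)
    return (observed - expected) / (1 - expected)

def lights_kappa(annotations):
    """Light's kappa: mean of Cohen's kappa over all annotator pairs."""
    pairs = list(combinations(annotations, 2))
    return sum(cohens_kappa(a, b) for a, b in pairs) / len(pairs)

# Three hypothetical annotators labelling the same five frames
ann = [
    ["walk", "pick", "walk", "pack", "walk"],
    ["walk", "pick", "walk", "pack", "walk"],
    ["walk", "pick", "pack", "pack", "walk"],
]
print(f"Light's kappa: {lights_kappa(ann):.4f}")  # prints Light's kappa: 0.7917
```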
Author(s)
Niemann, Friedrich
Rueda, Fernando Moya
Al Kfari, Moh’d Khier
Nair, Nilah Ravi
Schauten, Dustin
Kretschmer, Veronika  
Fraunhofer-Institut für Materialfluss und Logistik IML  
Lüdtke, Stefan
Kirchheim, Alice  
Fraunhofer-Institut für Materialfluss und Logistik IML  
Journal
Sensors (online journal)
Open Access
File(s)
Download (16.56 MB)
Rights
CC BY 4.0: Creative Commons Attribution
DOI
10.3390/s26020739
10.24406/publica-7633
Additional link
Full text
Language
English
Keyword(s)
  • dataset
  • logistics
  • wearable
  • inertial measurement unit
  • Bluetooth
  • video
  • third-person view
  • first-person view
  • human activity recognition
  • human context recognition
