  • Publication
    Data augmentation for inertial sensor based human action recognition using deep learning
    (2024); Filaretov, Hristo
    Human Activity Recognition (HAR) approaches are predominantly based on supervised deep learning and benefit from large amounts of labeled data, an expensive resource. Data augmentation enriches labeled datasets by adding synthetic data, which is substantially cheaper, and often improves model performance, yet it is very rarely used for sensor data. This work explores data augmentation for inertial-sensor-based HAR by transforming the data through physically interpretable operations. The main studies were conducted on the Opportunity and the Overhead Car Assembly (OCA) datasets. For these experiments, only 20% of the available training data were used, in an 8-fold cross-validation procedure over different subsets of the training data. The results show that simple geometric augmentations can be beneficial in many cases. Timewarping proved to be the most reliable single augmentation, improving the average F1 score on Opportunity from 0.570 to 0.597 and on OCA Mixed from 0.884 to 0.906. Combining augmentations improved the accuracy in almost all scenarios, but only to a degree comparable to timewarping alone. Applying augmentations to all the available training data improved the F1 score compared to the base case with no augmentations, although this effect is more pronounced for datasets with more similar training and test data: for the OCA Mixed variant, the average F1 score improved from 0.917 to 0.933, while for the OCA Leave-One-Out (LOT) variant, the average F1 score did not change significantly. For Opportunity, which, similarly to OCA LOT, uses a participant-based training-test split, the F1 score improved from 0.684 to 0.697.
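
    Conceptually, the timewarping augmentation highlighted above stretches and compresses the time axis of a sensor window with a smooth random curve and then resamples the signal onto the original uniform grid, so the augmented window keeps its label. The sketch below illustrates one such operation for windows shaped (time, channels); the function name, parameter values, and piecewise-linear warp are illustrative assumptions, not the paper's exact implementation.

    ```python
    import numpy as np

    def time_warp(window, sigma=0.2, n_knots=4, rng=None):
        """Randomly warp the time axis of one IMU window (shape: [time, channels]).

        A smooth random speed curve locally stretches/compresses time, and the
        signal is resampled back onto the original uniform grid. Parameter
        values are illustrative, not taken from the paper.
        """
        rng = np.random.default_rng() if rng is None else rng
        t_len, n_ch = window.shape
        # Random warp factors (~1.0 on average) at a few knot positions.
        knot_pos = np.linspace(0, t_len - 1, n_knots + 2)
        knot_val = rng.normal(loc=1.0, scale=sigma, size=n_knots + 2)
        # Piecewise-linear per-step speed, clipped so time never runs backwards.
        speed = np.clip(np.interp(np.arange(t_len), knot_pos, knot_val), 0.1, None)
        # Cumulative warped time stamps, rescaled to span the original window.
        warped_t = np.cumsum(speed)
        warped_t = (warped_t - warped_t[0]) / (warped_t[-1] - warped_t[0]) * (t_len - 1)
        # Resample every channel onto the uniform time grid.
        out = np.empty_like(window)
        for c in range(n_ch):
            out[:, c] = np.interp(np.arange(t_len), warped_t, window[:, c])
        return out
    ```

    A window produced this way can simply be appended to the labeled training set or generated on the fly in the data loader.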
  • Publication
    Inertial Measurement Unit based Human Action Recognition Dataset for Cyclic Overhead Car Assembly and Disassembly
    (2022); Filaretov, Hristo
    Motion datasets in industrial environments are essential for research on human-robot interaction and new exoskeleton control. Currently, many Activities of Daily Living (ADL) datasets are available to researchers, but only a few target an industrial context. This paper presents a dataset for a semi-industrial Overhead Car Assembly (OCA) task consisting of synchronized video and 9-Degrees of Freedom (DOF) Inertial Measurement Unit (IMU) data. The dataset was recorded with a soft-robotic exoskeleton equipped with 4 IMUs covering the upper body. It has a minimum sampling rate of 20 Hz, lasts approximately 360 minutes, and comprises 282 cycles of a realistic industrial assembly task. The annotations consist of 6 mid-level actions and an additional Null class. Five different test subjects performed the task without specific instructions on how to assemble the car shielding used. In this paper, we describe the dataset, set guidelines for using the data in supervised learning approaches, and analyze the labeling error introduced into the dataset by the labeler. We also compare different state-of-the-art neural networks to establish a first benchmark, achieving a weighted F1 score of 0.717.
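
    For readers who want to try the supervised-learning setup the abstract outlines, the sketch below shows one plausible way to cut continuous 20 Hz IMU streams into labeled windows and to score a classifier with the weighted F1 metric reported for the benchmark. Window length, step size, and the majority-vote labeling are illustrative assumptions, not the dataset's official guidelines.

    ```python
    import numpy as np
    from sklearn.metrics import f1_score

    # 6 mid-level actions plus a Null class, as in the OCA annotations;
    # labels are assumed to be integer class ids with 0 = Null.
    N_CLASSES = 7

    def sliding_windows(signal, labels, win_len=40, step=20):
        """Cut a continuous recording (shape: [time, channels]) into fixed-length
        windows with majority-vote labels. win_len=40 at 20 Hz is a 2 s window;
        both values are illustrative."""
        X, y = [], []
        for start in range(0, len(signal) - win_len + 1, step):
            X.append(signal[start:start + win_len])
            seg = labels[start:start + win_len]
            y.append(np.bincount(seg, minlength=N_CLASSES).argmax())
        return np.stack(X), np.array(y)

    def benchmark_score(y_true, y_pred):
        """Weighted F1 over all classes, the metric reported for the benchmark."""
        return f1_score(y_true, y_pred, average="weighted")
    ```

    With per-window labels in place, any of the compared network architectures can be trained on the windows and evaluated with the weighted F1 score above.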