2025
Journal Article
Title
Virtual reality interactions via a user-generic ultrasound human-machine interface for wrist and hand tracking
Abstract
As computers move from desktop screens into our glasses, traditional controllers such as keyboards and mice have proven impractical. A control interface for immersive experiences needs to seamlessly transport intention from the real to the virtual world while remaining portable, accurate, and robust. Here, we present an easy-to-wear, dry-contact, and portable ultrasound armband that can decode morphological information and act as a virtual reality controller by predicting hand and wrist kinematics. Using our armband, we collected a large dataset of paired ultrasound and hand kinematics and used it to train supervised deep-learning models capable of predicting hand kinematics from ultrasound. We explored how diverse intra-session, cross-session, and cross-participant data shifts affect model performance. Further, we proposed methods for data conditioning, augmentation, and a referencing strategy to mitigate the influence of confounding factors and to achieve accurate prediction of hand kinematics on unseen users without fine-tuning. Finally, we demonstrated the feasibility of our interface in a real-time virtual reality control framework. Using the developed ultrasound interface, participants completed challenging interaction tasks with simulated contact physics. This work demonstrates the potential of ultrasound-based technologies as a virtual reality interface, showcasing strong performance, robustness, and generalization potential.
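The core learning setup described in the abstract is supervised regression from ultrasound signals to hand kinematics. The sketch below is not the authors' model: it stands in for their deep networks with a plain least-squares regressor on synthetic paired data, and all array sizes (frames, feature dimension, joint count) are hypothetical, chosen only to illustrate the input/output structure of such a pipeline.

```python
import numpy as np

# Hedged sketch, not the paper's method: map ultrasound-derived feature
# vectors to hand joint angles with ordinary least squares, standing in
# for the supervised deep-learning models trained on paired data.
rng = np.random.default_rng(0)

n_frames, n_features, n_joints = 500, 64, 20  # hypothetical sizes
true_w = rng.normal(size=(n_features, n_joints))

# Paired dataset: ultrasound features X and hand kinematics Y
# (in the paper, real armband recordings paired with tracked kinematics)
X = rng.normal(size=(n_frames, n_features))
Y = X @ true_w + 0.01 * rng.normal(size=(n_frames, n_joints))

# Supervised fit: least squares plays the role of the deep network here
w_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict kinematics for unseen frames
X_test = rng.normal(size=(10, n_features))
Y_pred = X_test @ w_hat
print(Y_pred.shape)  # (10, 20)
```

In the paper this mapping is learned by deep networks and must additionally survive intra-session, cross-session, and cross-participant shifts, which is where the proposed conditioning, augmentation, and referencing strategies come in; the sketch only shows the paired-data regression structure.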
Author(s)
Open Access
Rights
CC BY 4.0: Creative Commons Attribution
Language
English