2025
Conference Paper
Title
Comparison of Google Pixel Watch 1, 2, and 3 in the Context of Real-Time Activity Recognition in Nursing Care
Abstract
This work investigates the performance of the Google Pixel Watch models 1, 2, and 3 in machine-learning-based activity recognition. The study focuses on analyzing sensor differences across the models and assessing the applicability of their sensor data for accurate detection of everyday activities such as eating, drinking, and writing. An experimental design was developed to collect data from the accelerometer, gyroscope, gravity sensor, attitude (orientation) sensor, heart rate sensor, and microphone at a sampling rate of 20 Hz. The analysis includes three experiments: evaluating sensor data for each model in isolation, testing cross-device data compatibility, and comparing battery performance under continuous data collection. Results indicate significant performance differences, with the Google Pixel Watch 3 showing superior classification results, while the Pixel Watch 2 performed below expectations. Long Short-Term Memory (LSTM) networks were found to outperform traditional machine-learning algorithms. The study highlights the importance of high-quality sensor data for accurate activity recognition and for its applicability across devices. Furthermore, the findings emphasize that advancements in sensor technology can significantly improve classification accuracy. Future research should address cross-device data usage and battery performance optimization.
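To illustrate the data-collection setup described in the abstract, the following Wear OS sketch registers the listed sensors at roughly 20 Hz. This is a minimal, hypothetical example, not the authors' implementation: the class name SensorSampler and the listener structure are assumptions, the 50,000 µs sampling period simply encodes the stated 20 Hz rate, and microphone capture (which uses a separate audio API) is omitted.

```kotlin
// Minimal sketch (not from the paper): sampling wearable sensors at ~20 Hz on Wear OS.
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class SensorSampler(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    // Sensor types named in the abstract that map to Android sensor constants.
    private val sensorTypes = listOf(
        Sensor.TYPE_ACCELEROMETER,
        Sensor.TYPE_GYROSCOPE,
        Sensor.TYPE_GRAVITY,
        Sensor.TYPE_ROTATION_VECTOR,   // attitude / orientation
        Sensor.TYPE_HEART_RATE         // requires the BODY_SENSORS permission
    )

    // 20 Hz corresponds to one sample every 50,000 microseconds.
    private val samplingPeriodUs = 50_000

    fun start() {
        sensorTypes
            .mapNotNull { sensorManager.getDefaultSensor(it) }
            .forEach { sensor ->
                sensorManager.registerListener(this, sensor, samplingPeriodUs)
            }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // In a full pipeline, timestamped values would be buffered here and
        // passed to an on-device or offline activity classifier (e.g. an LSTM).
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```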
Author(s)