The SmartWatch Gestures Dataset has been collected to evaluate several gesture recognition algorithms for interacting with mobile applications using arm gestures.
Eight different users performed twenty repetitions of twenty different gestures, for a total of 3200 sequences. Each sequence contains acceleration data from the 3-axis accelerometer of a first generation Sony SmartWatch™, as well as timestamps from the different clock sources available on an Android device. The smartwatch was worn on the user's right wrist. Each user manually segmented their own gestures by tapping the smartwatch screen at the beginning and at the end of every repetition.
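The description above says each sequence is a series of 3-axis accelerometer samples with timestamps. As a minimal sketch of working with such data, the snippet below parses one hypothetical gesture repetition and computes the per-sample acceleration magnitude, a common first feature for gesture recognition. The column names and file layout here are assumptions for illustration, not the dataset's actual on-disk format.

```python
import csv
import io
import math

# Hypothetical excerpt of one gesture repetition. The actual dataset layout
# may differ -- assumed columns: (timestamp_ms, ax, ay, az) in m/s^2.
SAMPLE = """timestamp_ms,ax,ay,az
0,0.12,9.77,0.05
20,0.15,9.80,0.02
40,0.30,9.60,-0.10
"""

def load_sequence(text):
    """Parse a gesture repetition into a list of (t, ax, ay, az) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [(int(row["timestamp_ms"]),
             float(row["ax"]), float(row["ay"]), float(row["az"]))
            for row in reader]

def magnitudes(seq):
    """Acceleration magnitude per sample (orientation-independent feature)."""
    return [math.sqrt(ax * ax + ay * ay + az * az) for _, ax, ay, az in seq]

seq = load_sequence(SAMPLE)
print(len(seq))                      # number of samples in the repetition
print(round(magnitudes(seq)[0], 2))  # resting magnitude is close to gravity
```

Because the watch was worn on the wrist, the magnitude of a stationary sample is close to 9.81 m/s² (gravity), which is often subtracted or filtered out before classification.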
Disclaimer: The SmartWatch Gestures Dataset is provided for research and academic purposes only. Publications and works that use this dataset should cite one of the papers referenced below.
G. Costante, L. Porzi, O. Lanz, P. Valigi, E. Ricci: Personalizing a Smartwatch-based Gesture Interface with Transfer Learning. 22nd European Signal Processing Conference (EUSIPCO), 2014
L. Porzi, S. Messelodi, C. M. Modena, E. Ricci: A Smart Watch-based Gesture Recognition System for Assisting People with Visual Impairments. ACM International Workshop on Interactive Multimedia on Mobile and Portable Devices (IMMPD), Barcelona, Spain, 2013