Towards sensor position-invariant hand gesture recognition using a mechanomyographic interface

https://doi.org/10.23919/SPA.2017.8166837

This paper presents a study on the feasibility of an interface based on the mechanomyographic (MMG) signal. Existing state-of-the-art studies attempt to use the MMG signal for gesture recognition with sensor locations strictly defined with respect to muscle position. Here, a test setup consisting of 5 IMU sensors arranged in a band was used. A classifier for 5 gestures (fist, pronation, supination, flexion, extension) and an idle state was implemented with a feed-forward neural network with a softmax output. The feature vector consists of 18 features: 5 representing muscle activity (RMS) and 13 parameters describing relative sensor orientation, which serve as indicators of local skin-surface deformation evoked by muscle shortening. Taking relative sensor orientation into account proved to be a crucial factor in improving classification performance and position invariance. The interface was tested on three subjects, in three distinct orientations; the average performance, expressed as an F1 score, was 94±6%.
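The paper itself does not include an implementation, but the classifier described above can be sketched roughly as follows. This is a minimal sketch, assuming PyTorch; the hidden-layer size, the window length, and the `rms` helper are illustrative assumptions not stated in the abstract, and the inputs are placeholders rather than real MMG data.

```python
import torch
import torch.nn as nn

# Per the abstract: 18 features = 5 RMS muscle-activity values + 13 relative
# sensor-orientation parameters; 6 classes = 5 gestures + idle state.
N_FEATURES = 18
N_CLASSES = 6        # fist, pronation, supination, flexion, extension, idle
HIDDEN_UNITS = 32    # assumption: hidden-layer size is not reported in the abstract


def rms(window: torch.Tensor) -> torch.Tensor:
    """Root-mean-square of a window of MMG samples (one value per sensor channel)."""
    return torch.sqrt(torch.mean(window ** 2, dim=-1))


class GestureMLP(nn.Module):
    """Feed-forward classifier with a softmax output, as described in the abstract."""

    def __init__(self) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FEATURES, HIDDEN_UNITS),
            nn.ReLU(),
            nn.Linear(HIDDEN_UNITS, N_CLASSES),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax gives class probabilities; for training, nn.CrossEntropyLoss
        # would typically be applied to the raw logits instead.
        return torch.softmax(self.net(x), dim=-1)


if __name__ == "__main__":
    model = GestureMLP()
    mmg_window = torch.randn(5, 256)    # 5 sensor channels x 256 samples (placeholder)
    rms_features = rms(mmg_window)      # 5 muscle-activity features
    orientation = torch.randn(13)       # 13 relative-orientation parameters (placeholder)
    features = torch.cat([rms_features, orientation]).unsqueeze(0)
    probs = model(features)
    print(probs, probs.argmax(dim=-1).item())
```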