Applying Hand Gesture Recognition with Time-of-Flight Camera for 3D Medical Data Analysis
Abstract: This paper describes a human-computer interface based on hand gesture recognition, intended for the analysis of 3D medical data. The gestures are designed to minimize the muscle tension required to use the system. Gesture recognition is based on a 3D sensor: depth maps are acquired with a time-of-flight camera designed specifically for hand gesture recognition. The depth images are denoised and segmented into the right and left hands. The contours of the hands are extracted, and a modified Shape Context descriptor is computed for each hand, providing a set of features used to train and test several classifiers. Naive Bayes, Random Forest, and Support Vector Machine (SVM) classifiers are evaluated, with optimal parameters selected by cross-validation. The best accuracy (95%) is achieved with the SVM classifier. The gestures are mapped to the controls of a 3D medical visualization module. Two visualization methods are employed: isosurface and cut-planes. The left hand switches between control modes, while the right-hand gestures control the properties within each mode. The system is convenient to use and runs in real time on a typical PC.
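The classifier-selection step described in the abstract, training an SVM on hand-shape features with parameters tuned by cross-validation, can be sketched as follows. This is a minimal illustration using scikit-learn with synthetic data in place of the paper's Shape Context descriptors; the gesture count, feature dimensionality, and hyper-parameter grid are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for Shape Context features: 6 gesture classes,
# 60-dimensional descriptors (both numbers are illustrative assumptions).
rng = np.random.default_rng(0)
n_samples, n_classes, n_features = 300, 6, 60
y = rng.integers(0, n_classes, size=n_samples)
X = rng.normal(size=(n_samples, n_features))
X += y[:, None] * 0.5  # shift per class so the problem is learnable

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Cross-validated grid search over SVM hyper-parameters, mirroring the
# paper's "search of optimal parameters using cross-validation".
param_grid = {"C": [1, 10, 100], "gamma": ["scale", 0.01]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X_tr, y_tr)

accuracy = search.score(X_te, y_te)
print(f"best params: {search.best_params_}, test accuracy: {accuracy:.2f}")
```

The same `GridSearchCV` wrapper can be applied to the Naive Bayes and Random Forest baselines mentioned in the abstract, so that all three classifiers are compared under identical cross-validation splits.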
Keywords: human-computer interaction, hand gesture recognition, time-of-flight camera
Area: Electronics, Telecommunications and Informatics
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.