Abstract
Unlike sign language, which usually involves large-scale movements to form a gesture, finger language, suitable for handicapped aphasiacs, consists of relatively small-scale hand gestures produced merely by changing how a patient bends his or her fingers. A recognition system must therefore accommodate the specific characteristics of each individual patient. We propose a system that fulfills this requirement by employing a programmable data glove to capture subtle finger movements, an optical-signal-parameterized function to compute finger bending degrees, and an automatic regression module to extract the most suitable finger features for a specific patient. The selected features are fed into a neural network, which learns a finger language recognition model for that patient; the system is then ready for use by that patient. At the time of this writing, the system had achieved an average recognition rate of 100% in unbiased field experiments.
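The processing pipeline summarized above can be sketched as follows. The linear calibration map, the correlation-based ranking used in place of the paper's automatic regression module, the sensor values, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bending_degrees(optical, v_straight, v_bent):
    """Map raw optical-sensor values to bending degrees in [0, 90].

    v_straight / v_bent are hypothetical per-sensor calibration readings
    taken with the finger fully straight and fully bent.
    """
    frac = (optical - v_straight) / (v_bent - v_straight)
    return np.clip(frac, 0.0, 1.0) * 90.0

def select_features(X, y, k):
    """Keep the k features whose |correlation| with the gesture labels is
    highest -- a crude stand-in for the automatic regression module."""
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return sorted(np.argsort(scores)[::-1][:k].tolist())

# Toy data: 6 gesture samples x 5 finger sensors (fabricated values);
# the labels depend only on sensors 0 and 2.
raw = np.array([
    [1.0, 1.2, 1.1, 2.9, 1.3],
    [1.1, 2.8, 1.2, 1.0, 2.7],
    [3.0, 1.1, 2.9, 2.8, 1.2],
    [2.9, 2.7, 3.0, 1.1, 2.6],
    [1.2, 1.3, 1.0, 2.7, 2.9],
    [2.8, 2.9, 2.7, 1.2, 1.1],
])
X = bending_degrees(raw, v_straight=1.0, v_bent=3.0)
y = (X[:, 0] + X[:, 2] > 90).astype(float)

sel = select_features(X, y, k=2)
print(sel)  # → [0, 2]
```

The selected feature columns would then form the input vector of the per-patient neural network; its architecture and training procedure are not specified in this summary and are omitted here.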
The authors gratefully acknowledge the financial support provided by the National Science Council, Taiwan, ROC under grants NSC 93-2622-E-163-001-CC3 and NSC 94-2213-E-163-003. We would also like to thank Dr. Lih-Horng Shyu, Department of Electro-Optics Engineering, National Formosa University, for his generous assistance with the construction of the data glove.