Abstract
Vision-based sign language recognition has been an open research problem for decades. Many existing sign-recognition methods work well under restricted laboratory conditions but fail in real-time scenarios, because extracting manual and non-manual movements from the constantly changing shapes of signs is a tedious problem in machine vision and machine learning. To overcome these shortcomings, this paper presents an interactive, real-time sign-recognition method based on class-level gesture similarity using an artificial neural network. The method takes sign images as input and begins by enhancing image quality, which is done by equalizing the luminance and contrast histograms. Hand features are then extracted as subunits from the quality-improved image using template-matching techniques. The extracted features are used to construct a neural network, which is trained on different classes of signs. Classification is performed by computing a class-level gesture similarity measure against each class of signs and images; based on the estimated measure, the method classifies the image and its sign. The result presented to the user is iterated according to the user's feedback, and the method continues refining the recognition until the user is satisfied. The method achieves higher accuracy in sign recognition and reduces the false ratio.
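The quality-enhancement step described above can be illustrated with standard luminance histogram equalization. The paper's exact procedure is not given here, so the following is only a minimal sketch of the general technique, assuming an 8-bit grayscale (luminance) input:

```python
import numpy as np

def equalize_histogram(luma: np.ndarray) -> np.ndarray:
    """Equalize the histogram of an 8-bit luminance channel.

    Standard cumulative-distribution remapping; an illustrative
    sketch only, not the paper's exact enhancement procedure.
    """
    # Intensity histogram and its cumulative distribution.
    hist = np.bincount(luma.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()  # first non-zero CDF value
    total = luma.size
    # Remap each intensity so the output histogram is roughly uniform.
    lut = np.round((cdf - cdf_min) / (total - cdf_min) * 255).astype(np.uint8)
    return lut[luma]

# Toy low-contrast 2x2 "image": values are spread across [0, 255].
img = np.array([[50, 50], [51, 52]], dtype=np.uint8)
print(equalize_histogram(img))
```

The same lookup-table remapping could be applied separately to a contrast-related channel, matching the abstract's mention of equalizing both luminance and contrast histograms.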
