Abstract
People with physical limitations such as speech and hearing impairments are often unable to convey their message properly, which leads to their exclusion from many aspects of life. To help these people express themselves more easily, we have developed a sign language detection application. The application is a translator that takes hand gestures as input and gives the equivalent alphabet letter as output, which will help these people communicate. A convolutional neural network was used for image recognition and classification in order to distinguish the hand from other objects on the screen and classify the sign represented by the hand gesture at any given time, thus enabling us to translate the signs into English alphabet letters. This application can therefore be used by specially abled people to communicate with others in a more efficient and hassle-free way. The tools used were Anaconda, Python, OpenCV, TensorFlow, Matplotlib, NumPy, and convolutional neural networks.
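The classification pipeline described above can be sketched as follows. This is a minimal illustrative example only, not the paper's actual architecture: it assumes 64x64 grayscale hand images and 26 output classes (one per English letter); the layer sizes and names are hypothetical choices for demonstration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26   # one class per English alphabet letter (assumption)
IMG_SIZE = 64      # assumed input resolution of the cropped hand image

def build_sign_classifier():
    """A small CNN that maps a hand-gesture image to letter probabilities."""
    model = models.Sequential([
        layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        layers.Conv2D(32, 3, activation="relu"),   # learn low-level edge features
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),   # learn higher-level shape features
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),  # probability per letter
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_sign_classifier()

# Run a dummy (all-zero) frame through the untrained network to show the
# shape of the output and how a predicted letter would be read off.
frame = np.zeros((1, IMG_SIZE, IMG_SIZE, 1), dtype="float32")
probs = model.predict(frame, verbose=0)
letter = chr(ord("A") + int(np.argmax(probs)))
```

In the full application, `frame` would instead come from an OpenCV webcam capture, cropped to the detected hand region and normalized before prediction.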
Subject Classification: