User trainable sign language to speech glove using KNN classifier

V. Shwetha, Vijayalaxmi, Dhanin Anoop Asarpota, Himanshu Verma

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


A sizable population around the world has some form of hearing or speech disability, which creates a communication barrier between them and the rest of the world. Sign language was introduced to bridge this gap. The objective of this work is to design a glove that translates sign language to text and speech and that can be retrained by the user if required. To achieve this, a glove was designed using five flex sensors, three contact sensors, and an accelerometer. Flex sensors were chosen because they are resistive devices whose resistance changes when they are bent; owing to their compactness, they can also easily be mounted on a glove along with the contact sensors and the accelerometer. The data from these sensors is read and processed by an Arduino before being sent to MATLAB via Bluetooth. A gesture-detection algorithm was then designed to improve accuracy: the data from the Arduino is first used to train a KNN model for classification, and the trained model is then used to classify the gestures. A GUI with control signals allows the user to form a word from these gestures, and the word is then converted to speech. The glove accurately provides data points that can be used to classify various gestures of American Sign Language, and the interactive GUI developed in MATLAB lets a user easily create or edit a word with the glove and then play it through a speaker.
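The classification step described in the abstract — training a KNN model on the glove's sensor readings and then classifying new gestures by a nearest-neighbour vote — can be sketched as below. This is an illustrative Python sketch, not the authors' MATLAB implementation; the sensor values, gesture labels, and feature layout (five flex readings followed by three contact readings) are invented for the example.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify one sensor reading by majority vote among its k nearest
    training samples under Euclidean distance.
    `train` is a list of (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda sample: math.dist(sample[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy training set: each vector is (flex1..flex5, contact1..contact3).
# The numbers are made up to mimic raw ADC readings, not taken from the paper.
train = [
    ((800, 790, 810, 805, 795, 0, 0, 0), "A"),
    ((400, 805, 800, 810, 790, 1, 0, 0), "B"),
    ((810, 795, 805, 800, 805, 0, 0, 0), "A"),
    ((395, 800, 810, 805, 795, 1, 0, 0), "B"),
]

# A new glove reading close to the "B" samples:
print(knn_classify(train, (405, 798, 802, 807, 793, 1, 0, 0)))  # → B
```

A user-trainable system falls out of this design naturally: retraining is just appending newly recorded (reading, label) pairs to `train`, with no model refitting step, which is one practical reason KNN suits a glove the user calibrates themselves.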

Original language: English
Pages (from-to): 3053-3058
Number of pages: 6
Issue number: 2
Publication status: Published - 01-01-2019

All Science Journal Classification (ASJC) codes

  • Computer Science(all)


