TY - GEN
T1 - Video-based Hand Gesture Recognition using Random Forest for Sign Language Interpretation
AU - Himasree, J.
AU - Jeevitha, P. L.
AU - Deekshitha, K.
AU - Kolisetty, Aashrita
AU - Naveen, Soumyalatha
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Effective communication is essential for all individuals, yet it can be particularly challenging for the deaf community, as sign language is their primary means of communication. Recognizing the importance of this problem, we investigate how to build a reliable and effective system that can interpret sign language motions correctly without human assistance. In this research work we construct our own dataset rather than depending on pre-existing ones available online, because authenticity and relevance are essential. This approach gives us greater control over data quality and ensures that the dataset closely matches our study goals. In addition, our system can be readily expanded and recognizes a wide range of hand motions and signs, including the alphabet. This versatility supports a broad variety of sign language expressions, improving inclusivity and usability. Random Forest is used to encode temporal variations among motions in sign language, enabling the system to detect minute variations in hand gestures and movements and increasing its overall accuracy and reliability. Using the Keras and OpenCV libraries in conjunction with the Python programming language, we build a strong foundation for sign language recognition. In addition to streamlining the development process, these tools open the system to a larger developer and research community. The proposed work achieved an accuracy of 97%, with overall precision, recall, and F1 score of 99%.
UR - https://www.scopus.com/pages/publications/85205545327
UR - https://www.scopus.com/pages/publications/85205545327#tab=citedBy
U2 - 10.1109/APCIT62007.2024.10673591
DO - 10.1109/APCIT62007.2024.10673591
M3 - Conference contribution
AN - SCOPUS:85205545327
T3 - 2024 Asia Pacific Conference on Innovation in Technology, APCIT 2024
BT - 2024 Asia Pacific Conference on Innovation in Technology, APCIT 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 Asia Pacific Conference on Innovation in Technology, APCIT 2024
Y2 - 26 July 2024 through 27 July 2024
ER -