Empowering Communication: A Review of Sign Language Translation Systems Powered by Machine Learning

Open Access

Year: 2024 | Volume: 11 | Issue: 01 | Page: 25-31


By


Rohan Kubde, Ayusha Malwe, Sharvari Mate, Simran Pardeshi, Sushma Nikumbh


  1. Lecturer and Students, Department of Electronics and Telecommunication, Sinhgad Institute of Technology and Science (SITS), Narhe, Pune, Maharashtra, India


Abstract

This research study offers a fresh solution to the communication gap between the hearing population and the deaf and hard-of-hearing community: the creation of a machine-learning-based sign language translator. By utilizing the K-Nearest Neighbour (K-NN) algorithm, the system effectively converts sign language motions into text and vice versa, facilitating smooth communication between signers and non-signers. The basis of the project is thorough data collection and careful preprocessing, which produces a robust and varied dataset that can efficiently handle variances in sign language. The study's conclusions show how accurately the system can translate sign language gestures, outperforming earlier benchmarks. In addition, the model can translate text in real time, which increases accessibility and inclusivity for people with hearing loss. For millions of people worldwide, sign language is an essential means of communication, yet a divide remains between signers and non-signers. The use of machine learning (ML) in sign language translation systems has shown promise in closing this gap. This article provides an extensive overview of recent developments in machine-learning-powered sign language translation systems. We explore several machine learning approaches, obstacles, and potential paths forward in this field, emphasising how they might improve communication for the community of people who are deaf or hard of hearing.
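The gesture-to-text step the abstract describes rests on K-NN classification of gesture feature vectors. A minimal sketch of that idea, assuming flattened hand-landmark coordinates as features (the sample data, labels, and `knn_predict` helper are illustrative assumptions, not the authors' implementation):

```python
import math
from collections import Counter

# Hypothetical training set: each sample pairs a flattened vector of
# hand-landmark coordinates with the sign label it represents.
train = [
    ([0.10, 0.20, 0.90, 0.80], "hello"),
    ([0.15, 0.25, 0.85, 0.75], "hello"),
    ([0.80, 0.90, 0.10, 0.20], "thanks"),
    ([0.75, 0.85, 0.15, 0.25], "thanks"),
]

def knn_predict(query, train, k=3):
    """Classify a gesture vector by majority vote of its k nearest neighbours."""
    # Sort training samples by Euclidean distance to the query vector.
    nearest = sorted(train, key=lambda sample: math.dist(query, sample[0]))[:k]
    # Majority vote over the labels of the k closest samples.
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict([0.12, 0.22, 0.88, 0.78], train))  # → "hello"
```

In practice the feature vectors would come from a hand-tracking front end and the training set from the collected sign dataset; the voting logic stays the same.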


Keywords: Communication gap, K Nearest Neighbour (K-NN), Data collection, Sign Languages and Gestures, Translation

[This article belongs to Recent Trends in Electronics Communication Systems (rtecs)]

How to cite this article: Rohan Kubde, Ayusha Malwe, Sharvari Mate, Simran Pardeshi, Sushma Nikumbh. Empowering Communication: A Review of Sign Language Translation Systems Powered by Machine Learning. Recent Trends in Electronics Communication Systems. June 5, 2024; 11(01):25-31.


How to cite this URL: Rohan Kubde, Ayusha Malwe, Sharvari Mate, Simran Pardeshi, Sushma Nikumbh. Empowering Communication: A Review of Sign Language Translation Systems Powered by Machine Learning. Recent Trends in Electronics Communication Systems. June 5, 2024; 11(01):25-31. Available from: https://journals.stmjournals.com/rtecs/article=June 5, 2024/view=0


References


  1. Simei G. Wysoski, Marcus V. Lamar, Susumu Kuroyanagi, Akira Iwata. (2002). "A Rotation Invariant Approach on Static-Gesture Recognition Using Boundary Histograms and Neural Networks." International Journal of Artificial Intelligence Applications (IJAIA), Vol. 3, No. 4, July 2012.
  2. B. Bauer, H. Hienz. (2002). "Relevant Features for Video-Based Continuous Sign Language Recognition." IEEE International Conference on Automatic Face and Gesture Recognition.
  3. E. Stergiopoulou, N. Papamarkos. (2009). "Hand Gesture Recognition Using a Neural Network Shape Fitting Technique." Elsevier Engineering Applications of Artificial Intelligence, Vol. 22(8), pp. 1141–1158, doi: 10.1016/j.engappai.2009.03.008.
  4. S. Kulkarni, S. D. Lokhande. (2010). "Appearance Based Recognition of American Sign Language Using Gesture Segmentation." International Journal on Computer Science and Engineering (IJCSE), Vol. 2(3), pp. 560–565.
  5. Mokhtar M. Hasan, Pramoud K. Misra. (2011). "Brightness Factor Matching for Gesture Recognition System Using Scaled Normalization." International Journal of Computer Science Information Technology (IJCSIT), Vol. 3(2).
  6. L. Pigou, S. Dieleman, P.-J. Kindermans, B. Schrauwen. (2014). "Sign Language Recognition Using Convolutional Neural Networks."
  7. Kumud Tripathi, Neha Baranwal, G. C. Nandi. (2015). "Continuous Indian Sign Language Gesture Recognition and Sentence Formation." Eleventh International Multi-Conference on Information Processing (IMCIP-2015), Procedia Computer Science, Vol. 54, pp. 523–531.
  8. Noor Tubaiz, Tamer Shanableh, Khaled Assaleh. (2015). "Glove-Based Continuous Arabic Sign Language Recognition in User-Dependent Mode." IEEE Transactions on Human-Machine Systems, Vol. 45(4), August 2015.
  9. Geethu G. Nath, C. S. Arun. (2017). "Real Time Sign Language Interpreter." 2017 International Conference on Electrical, Instrumentation, and Communication Engineering (ICEICE 2017).
  10. Jing-Hao Sun, Ting-Ting Ji, Shu-Bin Zhang, Jia-Kui Yang, Guang-Rong Ji. (2019). "Research on the Hand Gesture Recognition Based on Deep Learning." 07 February 2019.


Regular Issue | Subscription | Review Article


Recent Trends in Electronics Communication Systems


ISSN: 2393-8757


Volume 11
Issue 01
Received May 3, 2024
Accepted May 29, 2024
Published June 5, 2024
