Sign Language-enabled Offline IP-based Video Call Intercom System for the Deaf


Year : 2025 | Volume : 03 | Issue : 01 | Page : 53-66
    By

  • Prakrati Bajpai,

  • Soni M.,

  • Nikita Singh,

  • Ravi Shankar Kumar,

  • Sanjay B.R.

  1. Student, Department of Electrical and Electronics Engineering, Dayananda Sagar College of Engineering, Bangalore, Karnataka, India
  2. Assistant Professor, Department of Electrical and Electronics Engineering, Dayananda Sagar College of Engineering, Bangalore, Karnataka, India
  3. Student, Department of Electrical and Electronics Engineering, Dayananda Sagar College of Engineering, Bangalore, Karnataka, India
  4. Student, Department of Electrical and Electronics Engineering, Dayananda Sagar College of Engineering, Bangalore, Karnataka, India
  5. Student, Department of Electrical and Electronics Engineering, Dayananda Sagar College of Engineering, Bangalore, Karnataka, India

Abstract


Individuals who are deaf or hard of hearing frequently encounter major communication challenges on digital platforms because sign language support is often insufficient. This study presents a real-time, internet-independent sign language-to-text translation system embedded in an offline IP-based video intercom, designed to facilitate seamless communication. Leveraging TensorFlow’s Convolutional Neural Network (CNN) for classification and MediaPipe for hand tracking, the system achieved a training accuracy of 94.24% and a validation accuracy of 94.01% within 20 epochs. In contrast to existing solutions that depend on internet connectivity or specialized hardware, this system operates offline, making it scalable, cost-effective, and adaptable for diverse environments. Furthermore, it integrates vibration sensors to provide tactile alerts, enhancing emergency notifications and usability in noisy or visually inaccessible settings. By combining real-time sign language translation with an offline IP-based video intercom, this system addresses the limitations of conventional video calling platforms. It offers an inclusive and practical solution for communication, making it particularly suitable for workplaces, public spaces, and other environments requiring accessible communication tools for the deaf and hard-of-hearing community.
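The abstract describes a pipeline in which MediaPipe supplies hand landmarks that a TensorFlow CNN then classifies. As a minimal illustrative sketch (not the authors' code), the snippet below shows one preprocessing step commonly used in such landmark-based recognizers: normalizing the 21 hand landmarks a MediaPipe-style tracker emits, so the classifier sees features that are invariant to hand position and scale. The `(x, y)` tuple input format and the wrist-at-index-0 convention are assumptions for illustration.

```python
import math

def normalize_landmarks(landmarks):
    """Make landmark features translation- and scale-invariant.

    `landmarks` is a list of 21 (x, y) tuples (a MediaPipe-style hand,
    with the wrist assumed at index 0). The wrist is shifted to the
    origin, then all points are scaled so the farthest landmark lies
    at unit distance.
    """
    wx, wy = landmarks[0]
    # Translate so the wrist sits at the origin.
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    # Scale by the largest wrist-to-landmark distance (guard against 0).
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

# Toy 21-point "hand" with all points along a diagonal.
hand = [(i * 0.1 + 0.5, i * 0.1 + 0.2) for i in range(21)]
norm = normalize_landmarks(hand)
print(norm[0])  # → (0.0, 0.0): wrist moved to the origin
print(max(math.hypot(x, y) for x, y in norm))  # farthest point at ~unit distance
```

The resulting flat feature vector (42 values per frame) would then be fed to the CNN classifier; the normalization keeps the classifier from having to learn where in the camera frame the hand happens to be.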

Keywords: Sign language recognition, CNN, video call intercom, offline LAN IP system, sensor, real-time text conversion, WebRTC, tactile feedback, emergency alerts

[This article belongs to International Journal of Image Processing and Pattern Recognition (ijippr)]

How to cite this article:
Prakrati Bajpai, Soni M., Nikita Singh, Ravi Shankar Kumar, Sanjay B.R. Sign Language-enabled Offline IP-based Video Call Intercom System for the Deaf. International Journal of Image Processing and Pattern Recognition. 2025; 03(01):53-66.
How to cite this URL:
Prakrati Bajpai, Soni M., Nikita Singh, Ravi Shankar Kumar, Sanjay B.R. Sign Language-enabled Offline IP-based Video Call Intercom System for the Deaf. International Journal of Image Processing and Pattern Recognition. 2025; 03(01):53-66. Available from: https://journals.stmjournals.com/ijippr/article=2025/view=0



References

  1. Meng Y, Jiang H, Duan N, Wen H. Real-Time Hand Gesture Monitoring Model Based on MediaPipe’s Registerable System. Sensors. 2024 Sep 27; 24(19): 6262.
  2. Moryossef A, Tsochantaridis I, Aharoni R, Ebling S, Narayanan S. Real-time sign language detection using human pose estimation. In Computer Vision – ECCV 2020 Workshops: Glasgow, UK, August 23–28, 2020, Proceedings, Part II 16. Springer International Publishing; 2020; 237–248.
  3. Cao Z, Simon T, Wei SE, Sheikh Y. Realtime multi-person 2d pose estimation using part affinity fields. In Proceedings of the IEEE conference on computer vision and pattern recognition. 2017; 7291–7299.
  4. Amutha S, Shanmukh N, Naidu AP, Kumar PV, Narayana GS. Real-Time Sign Language Recognition using a Multimodal Deep Learning Approach. In 2023 IEEE International Conference on Advances in Computing, Communication and Applied Informatics (ACCAI). 2023 May 25; 1–8.
  5. Borg M, Camilleri KP. Sign language detection “in the wild” with recurrent neural networks. In ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2019 May 12; 1637–1641.
  6. Huang J, Chouvatut V. Video-Based Sign Language Recognition via ResNet and LSTM Network. J Imaging. 2024 Jun; 10(6): 149.
  7. Camgoz NC, Hadfield S, Koller O, Ney H, Bowden R. Neural sign language translation. In Proceedings of the IEEE conference on computer vision and pattern recognition. 2018; 7784–7793.
  8. Adaloglou N, Chatzis T, Papastratis I, Stergioulas A, Papadopoulos GT, Zacharopoulou V, Xydopoulos GJ, Atzakas K, Papazachariou D, Daras P. A comprehensive study on deep learning-based methods for sign language recognition. IEEE Trans Multimedia. 2021 Apr 1; 24: 1750–62.
  9. Miah AS, Hasan MA, Nishimura S, Shin J. Sign language recognition using graph and general deep neural network based on large scale dataset. IEEE Access. 2024 Mar 1; 12: 34553–34569.
  10. Aggarwal D, Ahirwar S, Srivastava S, Verma S, Goel Y. Sign Language Prediction using Machine Learning Techniques: A Review. In 2023 IEEE 2nd International Conference on Electronics and Renewable Systems (ICEARS). 2023 Mar 2; 1296–1300.
  11. Al Abdullah B, Amoudi G, Alghamdi H. Advancements in Sign Language Recognition: A Comprehensive Review and Future Prospects. IEEE Access. 2024 Sep 10; 12: 128871–128895.
  12. Simon T, Joo H, Matthews I, Sheikh Y. Hand keypoint detection in single images using multiview bootstrapping. In Proceedings of the IEEE conference on Computer Vision and Pattern Recognition. 2017; 1145–1153.
  13. Soni M, Bhat A, Aralikatti S, Pasha A, Niranjan L. An Efficient Digital Cluster Scheme to Improve the Lifetime Ratio of Wireless Sensor Networks. In 2023 IEEE International Conference on Smart Systems for applications in Electrical Sciences (ICSSES). 2023 Jul 7; 1–5.
  14. Smilkov D, Thorat N, Assogba Y, Nicholson C, Kreeger N, Yu P, Cai S, Nielsen E, Soegel D, Bileschi S, Terry M. Tensorflow.js: Machine learning for the web and beyond. Proceedings of Machine Learning and Systems. 2019 Apr 15; 1: 309–21.
  15. Fareed AI, Ramanathan M, Yeswanth R, Devi S. Translation Tool for Alternative Communicators using Natural Language Processing. In 2024 IEEE 5th International Conference on Electronics and Sustainable Communication Systems (ICESC). 2024 Aug 7; 842–848.
  16. Miah AS, Hasan MA, Shin J, Okuyama Y, Tomioka Y. Multistage spatial attention-based neural network for hand gesture recognition. Computers. 2023 Jan 5; 12(1): 13.
  17. Abdallah MS, Samaan GH, Wadie AR, Makhmudov F, Cho YI. Light-weight deep learning techniques with advanced processing for real-time hand gesture recognition. Sensors. 2022 Dec 20; 23(1): 2.
  18. Koller O, Zargaran S, Ney H, Bowden R. Deep Sign: Hybrid CNN-HMM for Continuous Sign Language Recognition. In Proceedings of the British Machine Vision Conference (BMVC). 2016 Sep 19; 136.1–136.12.
  19. Premsai I, Thiyagu TM. IoT based Wireless Alert System for Individuals with Impaired Hearing. In 2024 3rd International Conference on Sentiment Analysis and Deep Learning (ICSADL). 2024 Mar 13; 662–666.
  20. Monteiro CD, Mathew CM, Gutierrez-Osuna R, Shipman F. Detecting and identifying sign languages through visual features. In 2016 IEEE International Symposium on Multimedia (ISM). 2016 Dec 11; 287–290.
  21. Joshi A, Agrawal S, Modi A. ISLTranslate: Dataset for Translating Indian Sign Language. arXiv preprint arXiv:2307.05440. 2023 Jul 11.

Regular Issue Subscription Original Research
Volume 03
Issue 01
Received 15/01/2025
Accepted 18/01/2025
Published 07/02/2025
Publication Time 23 Days
