Smart Glasses Using Ultrasonic Sensor and AI for Blind Person

Year: 2024 | Volume: 11 | Issue: 02 | Page: 1-9
By

Saurabh Tiwari, Yuvraj Singh Bhadauria, Shivam Mishra, Shubham Singh, Nilufar Yasmin
  1. Student, Department of Electronics & Communication Engineering, Ajay Kumar Garg Engineering College, Ghaziabad, Uttar Pradesh, India
  2. Student, Department of Electronics & Communication Engineering, Ajay Kumar Garg Engineering College, Ghaziabad, Uttar Pradesh, India
  3. Student, Department of Electronics & Communication Engineering, Ajay Kumar Garg Engineering College, Ghaziabad, Uttar Pradesh, India
  4. Student, Department of Electronics & Communication Engineering, Ajay Kumar Garg Engineering College, Ghaziabad, Uttar Pradesh, India
  5. Student, Department of Electronics & Communication Engineering, Ajay Kumar Garg Engineering College, Ghaziabad, Uttar Pradesh, India

Abstract

Smart glasses have received considerable attention recently from people around the world. This research paper introduces ‘Smart Glasses Using AI and Ultrasonic Sensor,’ a project aimed at advancing assistive technology for visually impaired individuals. The system integrates hardware, including a Raspberry Pi and a Node MCU, with an array of sensors and machine learning techniques, notably the YOLOv5 object-detection model. In this design, the fusion of artificial intelligence and ultrasonic sensing gives users enhanced spatial awareness and real-time assistance. The smart glasses offer object recognition, temperature sensing, a ‘find my glasses’ function, and distance measurement. Built around a user-centric design, cost-effective components, and community-informed development, they promise to significantly enhance accessibility, convenience, and independence for visually impaired individuals.
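The distance-measurement feature described in the abstract typically relies on timing an ultrasonic echo and converting the round-trip time into a one-way distance. A minimal sketch of that conversion is shown below; the sensor behaviour, constant, and function name are assumptions for illustration and are not taken from the paper itself.

```python
# Sketch: turning an ultrasonic echo time (e.g. from an HC-SR04-style
# sensor on a Raspberry Pi) into an obstacle distance estimate.
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in dry air at ~20 deg C

def echo_to_distance_cm(echo_seconds: float) -> float:
    """Convert a round-trip echo time to a one-way distance in centimetres.

    The echo travels to the obstacle and back, so the one-way distance
    is half of speed * time; multiply by 100 to express it in cm.
    """
    return (SPEED_OF_SOUND_M_S * echo_seconds / 2.0) * 100.0

# A 2 ms round trip corresponds to roughly 34 cm to the obstacle.
print(round(echo_to_distance_cm(0.002), 1))  # → 34.3
```

On real hardware the echo time would come from timing the sensor's echo pin; this sketch isolates only the arithmetic the glasses would perform before announcing the distance to the user.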

Keywords: Smart glasses, ultrasonic sensor, Node MCU, Raspberry Pi, Pi Camera, temperature sensor, IR sensor

[This article belongs to Recent Trends in Sensor Research & Technology (rtsrt)]

How to cite this article:
Saurabh Tiwari, Yuvraj Singh Bhadauria, Shivam Mishra, Shubham Singh, Nilufar Yasmin. Smart Glasses Using Ultrasonic Sensor and AI for Blind Person. Recent Trends in Sensor Research & Technology. 2024; 11(02):1-9. Available from: https://journals.stmjournals.com/rtsrt/article=2024/view=0


Regular Issue | Subscription | Original Research
Volume 11
Issue 02
Received 06/07/2024
Accepted 15/07/2024
Published 25/07/2024
