GESTURE AND VOICE DRIVEN REMOTE CONTROL SYSTEM

Open Access

Year: 2024 | Volume: 14 | Issue: 01 | Pages: 1-9


By


Tanmay Lonare, Devyani Mahajan, Anish Kulkarni, S.V. Tathe

  1. Student, Department of Electronics and Telecommunication, Sinhgad College of Engineering, Savitribai Phule Pune University, Pune, Maharashtra, India
  2. Student, Department of Electronics and Telecommunication, Sinhgad College of Engineering, Savitribai Phule Pune University, Pune, Maharashtra, India
  3. Student, Department of Electronics and Telecommunication, Sinhgad College of Engineering, Savitribai Phule Pune University, Pune, Maharashtra, India
  4. Professor, Department of Electronics and Telecommunication, Sinhgad College of Engineering, Savitribai Phule Pune University, Pune, Maharashtra, India
Abstract

The Gesture and Voice Driven Remote Control System is an innovative technology that empowers users to interact with and control electronic devices through hand movements, voice commands, and sensor integration, while also enabling remote access and operation. By interpreting hand gestures with recognition algorithms and spoken instructions with speech recognition, the system eliminates the need for physical remotes and significantly improves accessibility, convenience, and the overall user experience of device interaction. It thereby provides an alternative, more accessible means of interaction and offers a futuristic, user-friendly interface with a wide range of applications, transforming human-computer interaction and redefining the way we control and access electronic devices.


Keywords: Automatic device, Raspberry Pi, NodeMCU, HC-05, Gesture control, Face recognition.
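
To make the interaction flow described above more concrete, the short sketch below shows one way voice phrases could be mapped to one-byte control codes and forwarded to an HC-05 Bluetooth module. It is a minimal illustration rather than the authors' implementation: the libraries (SpeechRecognition, pyserial), the command table, the 9600 baud rate, and the port name /dev/rfcomm0 are assumptions, since the software stack is not detailed here.

```python
# Minimal sketch, assuming a Raspberry Pi with a microphone and an HC-05 module
# bound to /dev/rfcomm0. The libraries, port name, baud rate, and command table
# are illustrative assumptions, not details taken from the paper.

import serial                    # pyserial: serial link to the HC-05 module
import speech_recognition as sr  # SpeechRecognition: off-the-shelf speech-to-text

# Hypothetical mapping from spoken phrases to one-byte appliance control codes
COMMANDS = {
    "light on": b"1",
    "light off": b"0",
    "fan on": b"2",
    "fan off": b"3",
}

def listen_and_dispatch(port="/dev/rfcomm0", baud=9600):
    """Capture one voice command and forward the matching code over Bluetooth."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as mic:
        recognizer.adjust_for_ambient_noise(mic, duration=0.5)
        audio = recognizer.listen(mic)

    try:
        phrase = recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        print("Command not understood")
        return

    code = COMMANDS.get(phrase)
    if code is None:
        print(f"No action mapped to: {phrase}")
        return

    # The receiving board (e.g. a NodeMCU or an Arduino paired with the HC-05)
    # would read this byte and switch the corresponding appliance.
    with serial.Serial(port, baudrate=baud, timeout=1) as link:
        link.write(code)
        print(f"Sent {code!r} for '{phrase}'")

if __name__ == "__main__":
    listen_and_dispatch()
```

The same dispatch pattern could be fed by the other triggers named in the keywords, for example accelerometer- or camera-based gesture events, or a face-recognition check performed before commands are accepted.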

This article belongs to Current Trends in Signal Processing (CTSP).


How to cite this article: Tanmay Lonare, Devyani Mahajan, Anish Kulkarni, S.V. Tathe. GESTURE AND VOICE DRIVEN REMOTE CONTROL SYSTEM. Current Trends in Signal Processing. June 14, 2024; 14(01):1-9.


How to cite this URL: Tanmay Lonare, Devyani Mahajan, Anish Kulkarni, S.V. Tathe. GESTURE AND VOICE DRIVEN REMOTE CONTROL SYSTEM. Current Trends in Signal Processing. June 14, 2024; 14(01):1-9. Available from: https://journals.stmjournals.com/ctsp/article=June 14, 2024/view=0




Regular Issue | Subscription | Original Research


Current Trends in Signal Processing


ISSN: 2277-6176


Volume 14
Issue 01
Received April 23, 2024
Accepted May 31, 2024
Published June 14, 2024
