Open Access
Siraj Shaikh, Bhakti Walimbe, Abhijeet Shinde, Sanket Sase
Students, Department of Electronics and Telecommunication Engineering, SKNCOE, SPPU, Pune, Maharashtra, India
Abstract
In an era when human-computer interaction (HCI) is evolving rapidly, designing user-friendly interfaces is essential. This paper presents a new approach to HCI: virtual mouse and keyboard systems based on gesture recognition. The system replaces conventional input devices by allowing users to control their computers with hand gestures captured by a standard webcam and interpreted with computer vision techniques. Through image processing, different gestures are translated into corresponding mouse movements, mouse clicks, and keyboard inputs. The implementation is written in Python on the PyCharm platform and requires only a webcam as hardware. The system functions effectively as a virtual mouse and keyboard, offering smooth control without cables or additional external hardware. By letting users enter text, navigate interfaces, and perform left and right clicks with natural hand gestures, it improves both the efficiency and the user experience of computer interaction. This work opens opportunities for more natural and intuitive paradigms in HCI.
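The paper does not give the pipeline's implementation details, but the core step it describes, turning a fingertip position detected in a webcam frame into a cursor position and a click, can be sketched as below. This is a minimal illustration, assuming MediaPipe-style normalized hand landmarks in [0, 1]; the function names, the pinch threshold of 0.05, and the thumb-index pinch as the click gesture are assumptions for illustration, not the authors' exact method.

```python
import math

def landmark_to_screen(x_norm, y_norm, screen_w, screen_h):
    """Map a normalized (0..1) fingertip landmark to screen pixel coordinates.

    Coordinates outside [0, 1] (the hand partially leaving the frame)
    are clamped so the cursor stays on screen.
    """
    x = min(max(x_norm, 0.0), 1.0)
    y = min(max(y_norm, 0.0), 1.0)
    return int(x * (screen_w - 1)), int(y * (screen_h - 1))

def is_pinch(thumb_tip, index_tip, threshold=0.05):
    """Treat a thumb-index fingertip distance below the threshold as a click.

    Both arguments are (x, y) tuples in normalized image coordinates;
    the 0.05 threshold is an assumed value, tuned per camera setup.
    """
    return math.dist(thumb_tip, index_tip) < threshold
```

In a full system, a per-frame loop would read the webcam with OpenCV, extract landmarks with a hand-tracking model, and feed the mapped coordinates to an OS-level input library; smoothing (e.g. an exponential moving average over recent positions) is typically needed to keep the cursor from jittering.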
Keywords: Gesture Recognition, Virtual Mouse, Virtual Keyboard, Computer Vision, Human-Computer Interaction
This article belongs to the Journal of Microelectronics and Solid State Devices (jomsd).
Journal of Microelectronics and Solid State Devices
| Volume    |               |
| Issue     |               |
| Received  | July 1, 2024  |
| Accepted  | July 15, 2024 |
| Published | July 28, 2024 |