3D Hand Interaction in Virtual Space


Notice

This is an unedited manuscript accepted for publication and provided as an Article in Press for early access at the author’s request. The article will undergo copyediting, typesetting, and galley proof review before final publication. Please be aware that errors may be identified during production that could affect the content. All legal disclaimers of the journal apply.

Year : 2025 | Volume : 03 | Issue : 01 | Page : –
By

Vedant Gawner, Smita Badarkhe, Sanket Fulpagare, Aditya Ghode

1–4. Student, Department of Electronics & Telecommunication Engineering, Smt. Kashibai Navale College of Engineering, Vadgaon, Savitribai Phule Pune University, Pune, Maharashtra, India

Abstract


In augmented reality (AR) and virtual reality (VR), natural interaction between the user and the digital world is essential. The proposed system tracks the user’s hand movements in three dimensions within virtual environments, creating the impression that the user is physically present. It uses a camera for input, the Python library OpenCV for real-time video streaming, and the MediaPipe library to detect the user’s hand movements. The Unity game engine is used to build the virtual environment with which the tracked hand interacts. The system improves tracking accuracy and reduces noise by applying smoothing filters alongside deep learning techniques for hand-motion recognition. Unity’s robust toolset and its support for popular hand-tracking devices, such as the Leap Motion Controller and Oculus Quest, facilitate the development of such systems. Its flexible architecture and scripting capabilities allow hand interactions to be customized, tailoring the user experience to specific applications, from gaming to virtual simulations and educational tools. Unity’s extensive Asset Store and its integration with popular VR SDKs (software development kits) enable rapid prototyping and deployment of hand-tracking features, making the approach accessible to both independent developers and large studios. We also investigate the use of cloud computing to offload complex tasks while keeping the system fast and responsive. The system’s functionality was evaluated in a variety of virtual environments; the results show markedly more sensitive and accurate hand tracking, which is crucial for interactive VR activities such as gaming, remote collaboration, and virtual training.
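The pipeline described above — webcam capture with OpenCV, hand-landmark detection with MediaPipe, noise filtering, and streaming coordinates into a Unity scene — can be sketched in Python as follows. This is a minimal illustration, not the authors’ implementation: the exponential-moving-average smoothing, the UDP transport, and the port number 5052 are assumptions chosen for the sketch (a Unity `UdpClient` script on the receiving side would unpack the same 63-float payload).

```python
import socket
import struct

def smooth_landmarks(prev, curr, alpha=0.5):
    """Exponential moving average over 21 (x, y, z) hand landmarks.

    Higher alpha trusts the new frame more; lower alpha suppresses jitter.
    """
    if prev is None:
        return curr
    return [tuple(alpha * c + (1 - alpha) * p for p, c in zip(pl, cl))
            for pl, cl in zip(prev, curr)]

def pack_landmarks(landmarks):
    """Flatten 21 landmarks into a little-endian binary payload of 63 floats."""
    flat = [v for point in landmarks for v in point]
    return struct.pack(f"<{len(flat)}f", *flat)

def send_to_unity(payload, host="127.0.0.1", port=5052):
    """Fire-and-forget UDP datagram to the Unity listener (port is an assumption)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

if __name__ == "__main__":
    # Live capture loop: requires `pip install opencv-python mediapipe`
    # and a webcam; omitted from the reusable functions above.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1)
    cap = cv2.VideoCapture(0)
    prev = None
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            curr = [(p.x, p.y, p.z)
                    for p in result.multi_hand_landmarks[0].landmark]
            prev = smooth_landmarks(prev, curr)
            send_to_unity(pack_landmarks(prev))
    cap.release()
```

MediaPipe reports each landmark in normalized image coordinates, so Unity would rescale the received values into world space before driving the virtual hand.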

Keywords: Python; C++; Unity engine; machine learning

[This article belongs to International Journal of Optical Innovations & Research ]

How to cite this article:
Vedant Gawner, Smita Badarkhe, Sanket Fulpagare, Aditya Ghode. 3D Hand Interaction in Virtual Space. International Journal of Optical Innovations & Research. 2025; 03(01):-.
How to cite this URL:
Vedant Gawner, Smita Badarkhe, Sanket Fulpagare, Aditya Ghode. 3D Hand Interaction in Virtual Space. International Journal of Optical Innovations & Research. 2025; 03(01):-. Available from: https://journals.stmjournals.com/ijoir/article=2025/view=0




Regular Issue Subscription Original Research
Volume 03
Issue 01
Received 17/07/2024
Accepted 27/03/2025
Published 09/04/2025
Publication Time 266 Days

