This is an unedited manuscript accepted for publication and provided as an Article in Press for early access at the author’s request. The article will undergo copyediting, typesetting, and galley proof review before final publication. Please be aware that errors may be identified during production that could affect the content. All legal disclaimers of the journal apply.

Balkrishna Rasiklal Yadav,
- Researcher, Institute of Electrical and Electronics Engineers, New Jersey, United States
Abstract
By combining sophisticated robotics with visual awareness, computer-vision-operated robotic arms have revolutionized technology. These devices are having a profound effect on several industries, from manufacturing to healthcare. With their ability to see, comprehend, and interact with their surroundings, computer-vision-controlled robotic arms are changing the game. In this study, we develop and implement the software and hardware needed to improve a robotic arm's freedom of movement so that its motion resembles human hand gestures. Robotic arms are becoming essential in the modern world, with uses across sectors including the military, defense, healthcare, and industrial automation. These devices can replicate human movements and hand gestures, which makes them useful in a wide variety of scenarios. Robotic systems are among the most innovative automation technologies available today; robots were first employed on manufacturing floors in the 1960s and early 1970s. The precision of the system was evaluated using methods such as sorting, kinematic modelling, and centre recognition, which refine the movements of the robotic arm. Further innovation toward freer movement of robotic arms could enhance their applications at the defense and industrial level.
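Two of the precision-evaluation methods named above, centre recognition and kinematic modelling, can be sketched minimally. The function names, link lengths, and the pure-Python image-moment computation below are illustrative assumptions, not the paper's actual implementation:

```python
import math

def centroid(mask):
    """Centre recognition: centroid of a binary mask via image moments.

    mask is a 2D list of 0/1 values; returns (cx, cy) in pixel coordinates.
    """
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            m00 += v          # zeroth moment: total mass
            m10 += x * v      # first moment about x
            m01 += y * v      # first moment about y
    return m10 / m00, m01 / m00

def fk_2link(theta1, theta2, l1=1.0, l2=1.0):
    """Kinematic model: end-effector position of a planar two-link arm.

    theta1, theta2 are joint angles in radians; l1, l2 are link lengths.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Example: locate an object's centre, then check the arm pose.
mask = [[0, 1, 1],
        [0, 1, 1]]
cx, cy = centroid(mask)          # → (1.5, 0.5)
ex, ey = fk_2link(0.0, 0.0)      # fully extended arm → (2.0, 0.0)
```

In a full pipeline, the detected centroid would be converted to workspace coordinates and fed to an inverse-kinematics solver that chooses the joint angles; the forward model above then serves to verify the commanded pose.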
Keywords: robotics, defense, kinematic modelling, hand gestures, point-to-point trajectory
[This article belongs to International Journal of Advanced Control and System Engineering (ijacse)]
Balkrishna Rasiklal Yadav. Eyes for Machines: A Computer Vision Approach to Enhance Robotic Arm Dexterity and Autonomy. International Journal of Advanced Control and System Engineering. 2024; 02(02):1-10. Available from: https://journals.stmjournals.com/ijacse/article=2024/view=0
References
- Rai N., Rai B., Rai P. Computer vision approach for controlling educational robotic arm based on object properties; Proceedings of the 2014 2nd International Conference on Emerging Technology Trends in Electronics, Communication and Networking; Surat, 26–27 December 2014
- Hsu Y.H., Hsu H.-Y., Lin J.-S. Control design and implementation of intelligent vehicles with robot arm and computer vision; Proceedings of the 2015 International Conference on Advanced Robotics and Intelligent Systems (ARIS); Taipei, 29–31 May 2015
- Chen X., Huang X., Wang Y., Gao X. Combination of augmented reality-based brain-computer interface and computer vision for high-level control of a robotic arm. IEEE Trans. Neural Syst. Rehabil. Eng. 2020; 28:3140–3147. doi: 10.1109/TNSRE.2020.3038209
- Fisher Yu, Wenqi Xian, Yingying Chen, Fangchen Liu, Mike Liao, Vashisht Madhavan, and Trevor Darrell. BDD100K: A diverse driving video database with scalable annotation tooling. arXiv preprint arXiv:1805.04687, 2018. Published at IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2020.
- Lerrel Pinto and Abhinav Gupta. 2016. Supersizing self-supervision: Learning to grasp from 50K tries and 700 robot hours. In 2016 IEEE International Conference on Robotics and Automation (ICRA). IEEE Press, 3406-3413. https://doi.org/10.1109/ICRA.2016.7487517
- Wenhan Luo, Peng Sun, Fangwei Zhong, Wei Liu, Tong Zhang, and Yizhou Wang. End-to-end active object tracking via reinforcement learning. Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3286-3295, 2018.
- J. L. Raheja, R. Shyam, U. Kumar and P. B. Prasad, “Real-Time Robotic Hand Control Using Hand Gestures,” 2010 Second International Conference on Machine Learning and Computing, Bangalore, India, 2010, pp. 12-16, doi: 10.1109/ICMLC.2010.12.
- Andreas Geiger, Philip Lenz, Christoph Stiller, and Raquel Urtasun. Vision meets robotics: The KITTI dataset. The International Journal of Robotics Research, 32(11):1231–1237, 2013. https://www.cvlibs.net/publications/Geiger2013IJRR.pdf
- Sergey Levine, Peter Pastor, Alex Krizhevsky, Julian Ibarz, and Deirdre Quillen. Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection. The International Journal of Robotics Research, 37(4-5):421–436, 2018.
- T. P. Cabré, M. T. Cairol, D. F. Calafell, M. T. Ribes and J. P. Roca, “Project-Based Learning Example: Controlling an Educational Robotic Arm With Computer Vision,” in IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, vol. 8, no. 3, pp. 135-142, Aug. 2013, doi: 10.1109/RITA.2013.2273114.
- C. K. Gomathy, G. Niteesh, K. Sai Krishna. The Gesture Controlled Robot. International Research Journal of Engineering and Technology (IRJET). 2021; 8(4): 1721-1725.
- S. S. Dheeban, D. V. Harish, A. Harivignesh, M. Prasanna, N. Senthil Kumar. Gesture Controlled Robotic Arm. The International Journal of Science & Technoledge. March 2016; 4(3): 101-112.
- Chinmay Patil et al. Design and Implementation of Gesture Controlled Robot with a Robotic Arm. International Research Journal of Engineering and Technology (IRJET). 2019; 6(9). https://www.irjet.net/archives/V6/i9/IRJET-V6I9205
| Volume | 02 |
| Issue | 02 |
| Received | 21/10/2024 |
| Accepted | 24/10/2024 |
| Published | 29/10/2024 |
