This is an unedited manuscript accepted for publication and provided as an Article in Press for early access at the author’s request. The article will undergo copyediting, typesetting, and galley proof review before final publication. Please be aware that errors may be identified during production that could affect the content. All legal disclaimers of the journal apply.
Bhargav Rajyagor,
Prof. Paresh Vora,
- Assistant Professor, Department of Computer Application, Noble University, Bamangam, Gujarat, India
- Assistant Professor, Department of Computer Application, Noble University, Bamangam, Gujarat, India
Abstract
Weeds present a major challenge to crop productivity by competing with crops for vital resources, including water, sunlight, and nutrients, often resulting in significant yield reductions. On a global scale, weeds are responsible for approximately 13.2% of annual crop losses, a quantity sufficient to feed nearly one billion people. These invasive plants disrupt agricultural systems and adversely impact crop yields. Given their uneven distribution in fields, ground or aerial robots are commonly used for targeted herbicide application, utilizing computer vision algorithms to detect weeds before treatment. In cotton cultivation, prevalent weeds such as horse purslane (Trianthema portulacastrum L.) and purple nutsedge (Cyperus rotundus L.) pose significant threats. Despite increasing interest in deep learning-based weed detection, progress remains limited due to the scarcity of extensive datasets.
To address this challenge, we introduce CottonWeeds, a curated dataset featuring 7,000 images of horse purslane and purple nutsedge captured under varied conditions in Indian cotton fields. This dataset is designed to facilitate the development of real-time weed recognition models. Weed control strategies typically involve physical, mechanical, biological, and chemical methods. However, recent advancements in Unmanned Aerial Vehicles (UAVs) and deep learning technologies, particularly Convolutional Neural Networks (CNNs), have enabled precise weed detection, reducing costs and promoting sustainable agricultural practices. This study explores the integration of UAVs and CNNs to improve weed detection accuracy, address herbicide-resistant species, and advance sustainable farming solutions. Future research aims to enhance these approaches by incorporating advanced image processing and deep learning algorithms for automated feature extraction, thus tackling complex challenges in cotton weed management. Model performance will be evaluated using metrics such as precision, recall, and F1 score to ensure a comprehensive assessment and minimize false positives and false negatives.
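The precision, recall, and F1 metrics mentioned above follow their standard definitions from detection counts. A minimal sketch in Python (illustrative only; the counts shown are hypothetical and not results from this study):

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Compute precision, recall, and F1 from raw detection counts.

    tp: weeds correctly detected (true positives)
    fp: non-weed regions flagged as weeds (false positives)
    fn: weeds the model missed (false negatives)
    """
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Hypothetical example counts:
p, r, f = precision_recall_f1(tp=90, fp=10, fn=30)
# precision = 0.90, recall = 0.75, F1 ≈ 0.818
```

Minimizing false positives raises precision (fewer crop plants sprayed by mistake), while minimizing false negatives raises recall (fewer weeds escape treatment); F1 balances the two.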
Keywords: Weeds, Computer Vision, Deep learning, UAVs (Unmanned Aerial Vehicles)
[This article belongs to Journal of Aerospace Engineering & Technology (joaet)]
Bhargav Rajyagor, Prof. Paresh Vora. Sustainable Cotton Crop Productivity through Precision Weed Detection: A Deep Learning-Based Approach with UAV Integration. Journal of Aerospace Engineering & Technology. 2025; 15(01):-. Available from: https://journals.stmjournals.com/joaet/article=2025/view=0
References
- Coleman, G. R. Y., Kutugata, M., Walsh, M. J., & Bagavathiannan, M. V. (2024). Multi-growth stage plant recognition: A case study of Palmer amaranth (Amaranthus palmeri) in cotton (Gossypium hirsutum). Computers and Electronics in Agriculture, 217, 108622. https://doi.org/10.1016/j.compag.2024.108622
- Karim, Md. J., Nahiduzzaman, Md., Ahsan, M., & Haider, J. (2024). Development of an early detection and automatic targeting system for cotton weeds using an improved lightweight YOLOv8 architecture on an edge device. Knowledge-Based Systems, 300, 112204. https://doi.org/10.1016/j.knosys.2024.112204
- Luo, T., Zhao, J., Gu, Y., Zhang, S., Qiao, X., Tian, W., & Han, Y. (2023). Classification of weed seeds based on visual images and deep learning. Information Processing in Agriculture, 10(1), 40–51. https://doi.org/10.1016/j.inpa.2021.10.002
- Mensah, B., Rai, N., Betitame, K., & Sun, X. (2024). Advances in weed identification using hyperspectral imaging: A comprehensive review of platform sensors and deep learning techniques. Journal of Agriculture and Food Research, 18, 101388. https://doi.org/10.1016/j.jafr.2024.101388
- Rai, N., Mahecha, M. V., Christensen, A., Quanbeck, J., Zhang, Y., Howatt, K., Ostlie, M., & Sun, X. (2023). Multi-format open source weed image dataset for real-time weed identification in precision agriculture. Data in Brief, 51, 109691. https://doi.org/10.1016/j.dib.2023.109691
- Rajyagor, B., Rakholia, R., & Agrawal, S. S. (2021). Tri-level handwritten text segmentation techniques for Gujarati language. Indian Journal of Science and Technology, 14(7), 618–627. https://doi.org/10.17485/IJST/v14i7.2146
- Wang, P., Tang, Y., Luo, F., Wang, L., Li, C., Niu, Q., & Li, H. (2022). Weed25: A deep learning dataset for weed identification. Frontiers in Plant Science, 13, 1053329. https://doi.org/10.3389/fpls.2022.1053329

Journal of Aerospace Engineering & Technology
| Volume | 15 |
| Issue | 01 |
| Received | 20/01/2025 |
| Accepted | 25/01/2025 |
| Published | 08/02/2025 |