Multi-functional UAV for Disaster Response and Management

Published: July 22, 2024 | Volume: 14 | Issue: 01 | Pages: 1-6

By

Abhishek S., Anirudh K.K, Gautham G. Lal, Faris Muhammed

Student, Department of Electrical and Electronics Engineering, TKM College of Engineering, Kerala, India

Abstract

Unmanned Aerial Vehicles (UAVs), commonly known as drones, have become integral across diverse fields such as agriculture, surveillance, and defense, with expanding roles in critical operations like search and rescue and post-disaster management. Despite their versatility, current UAVs face challenges in disaster response due to limitations in flight time, cost, and accuracy, particularly in dynamic weather conditions. This work presents a multi-functional UAV equipped with the features essential for disaster-site surveillance, human detection, communication with isolated individuals, and the delivery of vital supplies such as medical aid or food packages. The hardware foundation comprises a standard F450 drone frame and an APM flight controller. For human detection, a cost-effective solution is implemented using a pre-trained machine learning model, YOLOv8, in conjunction with an FPV camera; additional sensors can be seamlessly integrated to enhance detection capability. Computational tasks are offloaded to a ground control station, ensuring efficient execution of the detection algorithms. Communication with affected individuals is facilitated by a SIM800L GSM/GPRS module together with dedicated microphone and speaker components. Future developments include adding a modular dropping mechanism to the drone for flexible deployment options. The prototype aims for semi-autonomous control, detecting humans at altitudes of 10-25 meters and facilitating communication or dropping packages at specified locations. As future scope, a hybrid UAV with rover capabilities could further enhance operational efficiency. Successful implementation of this drone prototype promises significant advancements in disaster response capabilities, leading to improved situational awareness, efficient communication, and timely delivery of essential supplies to affected areas.

Keywords: YOLO, UAV, Mission Planner, Machine learning, Human detection
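Since the keywords reference Mission Planner and the abstract mentions an APM flight controller and dropping packages at specified locations, the sketch below shows one hypothetical way a ground-station script could trigger a drop-mechanism servo over MAVLink; the serial port, servo channel, and PWM values are illustrative assumptions, not values given by the authors.

```python
# Illustrative sketch only: commanding a hypothetical drop-mechanism servo
# over MAVLink, as a ground control station such as Mission Planner could.
# Assumes the pymavlink package; port, channel, and PWM values are examples.
from pymavlink import mavutil

# Connect to the flight controller over a telemetry radio (port/baud are examples).
master = mavutil.mavlink_connection("/dev/ttyUSB0", baud=57600)
master.wait_heartbeat()  # block until the autopilot is heard

SERVO_CHANNEL = 9   # hypothetical AUX output wired to the release servo
RELEASE_PWM = 1900  # hypothetical PWM value (microseconds) that opens the release

# MAV_CMD_DO_SET_SERVO: param1 = servo output number, param2 = PWM value.
master.mav.command_long_send(
    master.target_system,
    master.target_component,
    mavutil.mavlink.MAV_CMD_DO_SET_SERVO,
    0,  # confirmation
    SERVO_CHANNEL,
    RELEASE_PWM,
    0, 0, 0, 0, 0,
)
print("Drop command sent")
```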

[This article belongs to Journal of Aerospace Engineering & Technology (joaet)]

How to cite this article: Abhishek S., Anirudh K.K, Gautham G. Lal, Faris Muhammed. Multi-functional UAV for Disaster Response and Management. Journal of Aerospace Engineering & Technology. July 22, 2024; 14(01):1-6.

How to cite this URL: Abhishek S., Anirudh K.K, Gautham G. Lal, Faris Muhammed. Multi-functional UAV for Disaster Response and Management. Journal of Aerospace Engineering & Technology. July 22, 2024; 14(01):1-6. Available from: https://journals.stmjournals.com/joaet/article=July 22, 2024/view=0

Regular Issue | Subscription | Original Research

Journal of Aerospace Engineering & Technology

ISSN: 2231-038X

Volume 14
Issue 01
Received May 20, 2024
Accepted June 4, 2024
Published July 22, 2024
