Helipad Detection for UAV based on YOLOv4 Transfer Learning Model
    1. Dynamics and Control Laboratory, Department of Industrial and Mechanical Engineering, Institute of Technology of Cambodia, Russian Federation Blvd., P.O. Box 86, Phnom Penh, Cambodia.

Received: July 19, 2021 / Revised: Accepted: November 19, 2021 / Published: December 30, 2021


 Humans have a fast and accurate visual system that allows them to perform complex tasks, such as driving, with little conscious effort. Without a visual system, a UAV cannot perform comparably complex tasks. When UAVs are equipped with a visual system driven by a fast and accurate model, they can carry out even more complex tasks such as autonomous landing. Computer vision is a technique well suited to a UAV visual system. In this paper, we consider a computer vision technique that uses a deep learning model to recognize the landing site (Helipad). We conducted an experiment in which a deep learning model was trained to recognize the Helipad. To land on the desired site safely, we propose a detection method based on a YOLOv4-tiny transfer learning model that detects the Helipad in real time. Digital images were used as training data so that the model could learn high-level features for recognizing objects in an image. Data collection was limited to images gathered from the internet and video snapshots. An annotation tool was used to draw ground-truth boxes for 184 training samples and 57 testing samples, with 1 class. The YOLOv4-tiny model was trained on the darknet framework using the YOLOv4-tiny pre-trained weights and the described input data. After training was completed with GPU acceleration, the best weights were saved for use in OpenCV's Deep Neural Network (DNN) module. The model was first validated on the testing images, then tested on videos, and finally on a real-time streaming video to investigate its performance. We used Intersection over Union (IoU), precision, recall, miss rate, and mean Average Precision (mAP) as evaluation metrics, together with loss-function visualization, to analyze the model's performance. For the real-time streaming video, we also investigated frames per second (FPS) and inference time.
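The IoU, precision, recall, and miss-rate metrics named above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the `(x1, y1, x2, y2)` corner format for boxes is an assumption, since the abstract does not specify how boxes are represented.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes.
    Boxes are assumed to be (x1, y1, x2, y2) corner tuples."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Overlap area is zero when the boxes do not intersect.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def precision(tp, fp):
    """Fraction of predicted detections that are correct."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Fraction of ground-truth objects that are detected."""
    return tp / (tp + fn)

def miss_rate(tp, fn):
    """Fraction of ground-truth objects missed (1 - recall)."""
    return fn / (tp + fn)
```

A prediction is typically counted as a true positive when its IoU with a ground-truth box exceeds a threshold (commonly 0.5), which is how these counts feed into mAP.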
Finally, the experimental results show that the proposed method can accurately detect the Helipad in real-time video.
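The FPS and inference-time measurements reported for the real-time streaming video can be sketched as below. This is a hedged illustration only: `infer` is a hypothetical stand-in for one forward pass of the detector, not the paper's code.

```python
import time

def measure_timing(infer, frames):
    """Run `infer` once per frame and return
    (average inference time in seconds, frames per second)."""
    start = time.perf_counter()
    for frame in frames:
        infer(frame)
    elapsed = time.perf_counter() - start
    avg_time = elapsed / len(frames)
    return avg_time, 1.0 / avg_time
```

In a real pipeline, `frames` would come from a video capture loop and `infer` would wrap the OpenCV DNN forward pass; averaging over many frames smooths out per-frame timing jitter.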