TreeKangaroo/yolo-drone

An Autonomous Drone/Heterogeneous Swarm with Object Detection and Tracking Capabilities


This repository contains the code from my autonomous object-tracking drone project (2021-2022) and drone swarm project (2022-2023). The master branch contains code for the individual drone project, and the swarm_release branch contains code for the swarm project (switch to the swarm_release branch for its documentation). The single-drone project is described below:


This project develops an autonomous drone that can independently detect objects of interest and then fly closer to take detailed photos. The hardware consists of a Holybro X500 drone development kit, an Nvidia Jetson Xavier NX GPU, and an Intel RealSense camera; 3D-printed fixtures attach the GPU and camera to the drone frame.

On the software side, the Robot Operating System (ROS) was used, and three ROS nodes were developed in Python: ICODDA, pose estimation, and navigation control. The ICODDA node performs image capture, object detection, and distance estimation, and integrates the YOLOv4 object detection network. A non-zero block average (NZBA) method was developed to estimate object distance in the presence of invalid and noisy depth data points. The pose estimation node implements a convolutional neural network that estimates the object's pose from depth data; it achieves 92% accuracy when validated against a dataset collected in the project. Finally, the navigation control node uses three PID (proportional-integral-derivative) controllers to continuously adjust the drone's speeds in the x, y, and z directions, and implements a novel method to adjust the yaw speed based on recent pose estimation results.

The drone's autonomous operation was successfully demonstrated in test flights with different objects of interest, achieving 24.4 frames per second of image processing throughput and 0.034 seconds of control latency. The developed technologies can be applied to autonomous drones in surveying, inspection, and search and rescue applications.
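The NZBA implementation itself lives in the ICODDA node; as an illustration only, here is a minimal sketch of the idea as described above, assuming the depth region of interest is a NumPy array in which 0 marks an invalid pixel (the block size and the use of a median over block averages are assumptions, not the project's exact parameters):

```python
import numpy as np

def nzba_distance(depth_roi, block=8):
    """Sketch of a non-zero block average (NZBA) distance estimate.

    Splits the bounding-box depth region into block x block tiles,
    averages only the non-zero (valid) depths in each tile, and takes
    the median of the valid tile averages to suppress noisy outliers.
    """
    h, w = depth_roi.shape
    tile_means = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tile = depth_roi[y:y + block, x:x + block]
            valid = tile[tile > 0]        # drop invalid (zero) depth points
            if valid.size:
                tile_means.append(valid.mean())
    return float(np.median(tile_means)) if tile_means else 0.0
```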

Hardware Build



The drone is based on the Holybro X500 developer frame. It carries a Jetson Xavier NX (running Ubuntu 18.04) and an Intel RealSense D435i camera; the Jetson and camera fixtures are 3D-printed. The drone is powered by a 4-cell, 5200 mAh LiPo battery.

Software Organization


  • yolo_ros handles all YOLO-related operations. You will need to download the pre-trained YOLO model online. Instructions can be found here
  • navigation handles all drone movement and communication with the flight controller via MAVROS.
  • pose contains the software needed to run a neural network that estimates bicycle pose from depth data within the bicycle's bounding box.
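The navigation package's PID loop is not reproduced here; as a rough illustration, a textbook PID controller of the kind used per axis (x, y, z) might look like the following sketch, where the gains, output limit, and class name are placeholders rather than the project's actual values:

```python
class PID:
    """Minimal PID controller sketch; one instance per velocity axis."""

    def __init__(self, kp, ki, kd, limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit        # clamp output speed to +/- limit (m/s)
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        # No derivative term on the first sample (no previous error yet).
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.limit, min(self.limit, out))
```

Each control cycle, the navigation node would feed the position error on each axis through a controller like this and send the resulting speeds to the flight controller via MAVROS.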

Notes on software


Important: In the current version, all the code files expect the package name yolov4_trt_ros. Before building a clone of this repo, you will likely need to rename the folder from yolo_drone to yolov4_trt_ros.
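One way to handle the rename, sketched for a clone sitting in your catkin workspace's src/ directory (adjust the path to your setup):

```shell
# Rename the cloned folder so it matches the package name the code
# expects (yolov4_trt_ros) instead of the repo's default (yolo_drone).
if [ -d yolo_drone ]; then
    mv yolo_drone yolov4_trt_ros
fi
```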

  • Many files contain extra code used for testing different components of the stack.
  • There are also quite a few backup files from previous versions, which may or may not work with the current configuration.

Software dependencies


References and Acknowledgements

