As artificial intelligence and robotics researchers, we are interested in developing algorithms for autonomous flight vehicles using deep learning and machine learning techniques, aiming to overcome the drawbacks of conventional robotics methods such as feature-based localization or filtering algorithms for sensor fusion. For development and research, we chose the Crazyflie as our platform because of its accessibility and the well-organized community maintained by Bitcraze.
Our goal was to build a selfie drone: one that can record video while flying around an end-user without being constrained by its environment. After many internal discussions and research on this topic, we arrived at a question: "What if we could derive a coordinate system around the end-user, i.e., a human?" To answer it, we used OpenPose, an open-source library for human pose detection. From the body keypoints detected by OpenPose, we compute the drone's position relative to the user. We also use a face-detection algorithm to initiate take-off, and face landmarks to compute the x, y, and z position needed to hover around a human face (also useful for narrow-FOV camera modules).
With the rapid growth of deep learning, we sincerely believe that autonomous drone tracking systems built on these techniques will be widely designed and implemented in the robotics community in the near future. We hope our project can be seen as a first step toward robotics applied with AI.
Here are some video clips of our demos; please enjoy them.
Demo with Crazyflie
Demo with Petrone (on-going)
3g FPV Camera with transmitter, 5.8GHz 25mW 48CH:
FPV Camera receiver 5.8GHz 40ch RX for Computers:
1g LiPo battery pack, 1s, 100mAh (for FPV camera)
Embedded Ubuntu 14.04
ROS Indigo
Crazyflie ROS repo:
Petrone ROS repo (our version):
Openpose ROS repo (our version):
*** Please refer to the README for installation ***
*** For more details about Openpose, please refer to the link below ***
ROS video recorder (our version):
Descriptions of Key Features
Face Detection
Using a deep-learning model pre-trained for face detection, the drone detects a human face and computes the 3D position vector in meters.
To trigger the initial flight, the drone must detect a human face continuously for 3 seconds.
Be careful if you are wearing a necklace with a photo of a face: the detector may pick up the photo and start the motors.
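The 3-second trigger above can be sketched as a small state holder that arms take-off only after an unbroken streak of face detections. This is an illustrative helper, not the project's actual code; in the real system the per-frame boolean would come from the face detector and the return value would fire the ROS take-off command.

```python
class TakeoffTrigger:
    """Fires once a face has been seen continuously for `hold_sec` seconds."""

    def __init__(self, hold_sec=3.0):
        self.hold_sec = hold_sec
        self.first_seen = None  # timestamp when the current streak began

    def update(self, face_detected, now):
        """Feed one detection result per frame; returns True when armed."""
        if not face_detected:
            self.first_seen = None  # streak broken, start over
            return False
        if self.first_seen is None:
            self.first_seen = now
        return (now - self.first_seen) >= self.hold_sec
```

Resetting the streak on any missed detection is what makes a photo on a necklace dangerous: as long as the detector keeps firing on it, the trigger cannot tell it from a real face.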
Human Body Detection
Using the OpenPose library from CMU (see the link above), the drone can detect human body keypoints such as the right/left shoulders, wrists, ankles, elbows, neck, etc.
Based on these keypoints, the drone recognizes the human body and behaves according to the defined flight motions.
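OpenPose's COCO model reports 18 keypoints per person as a flat (x, y, confidence) array. A minimal sketch of turning that output into named joints might look like the following; the confidence threshold is an assumption, and the part ordering follows OpenPose's documented 18-point COCO layout.

```python
# 18-part COCO keypoint ordering used by OpenPose's COCO model.
COCO_PARTS = [
    "nose", "neck", "r_shoulder", "r_elbow", "r_wrist",
    "l_shoulder", "l_elbow", "l_wrist", "r_hip", "r_knee",
    "r_ankle", "l_hip", "l_knee", "l_ankle", "r_eye",
    "l_eye", "r_ear", "l_ear",
]

def parse_keypoints(flat, conf_thresh=0.3):
    """flat: [x0, y0, c0, x1, y1, c1, ...] for one person (18 parts).
    Returns {part_name: (x, y)} for parts above the confidence threshold."""
    joints = {}
    for i, name in enumerate(COCO_PARTS):
        x, y, c = flat[3 * i: 3 * i + 3]
        if c >= conf_thresh:
            joints[name] = (x, y)
    return joints
```

Dropping low-confidence parts matters in flight: a half-occluded limb should simply be absent rather than feed a bogus coordinate into the motion logic.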
Special Flight Movements
When the drone detects certain body-pose patterns, it performs one of the special flight motions listed below:
Fade-away: when a person waves either the left or right hand
Landing / Flip Motion: when a person raises both hands up
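A much-simplified static classifier for these gestures is sketched below. Real wave detection needs wrist motion tracked over several frames; here a single raised wrist stands in for a wave, and both wrists above the neck trigger the landing/flip. The joint names and the dict-of-coordinates input are our illustrative convention, and image y grows downward, so "above" means a smaller y value.

```python
def classify_gesture(joints):
    """joints: {part_name: (x, y)} in image coordinates (y grows downward)."""
    neck = joints.get("neck")
    lw, rw = joints.get("l_wrist"), joints.get("r_wrist")
    if neck is None:
        return "none"  # no reliable body reference this frame
    l_up = lw is not None and lw[1] < neck[1]
    r_up = rw is not None and rw[1] < neck[1]
    if l_up and r_up:
        return "land_or_flip"   # both hands raised above the neck
    if l_up or r_up:
        return "fade_away"      # one hand raised (stand-in for a wave)
    return "none"
```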
Coordinate Estimation (Localization)
Instead of conventional localization methods such as landmark-based approaches, our team has developed a new way of estimating a coordinate system by using the human body detection described above.
This new approach has been thoroughly investigated on an aerial platform (Crazyflie) and in the simulation tool rviz.
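The core idea can be illustrated with a pinhole-camera sketch: a body segment of roughly known real-world size (here, shoulder-to-shoulder width) gives depth from its apparent pixel size, and the segment's image offset from the principal point gives the lateral and vertical position. The focal length, principal point, and the 0.4 m shoulder width below are illustrative assumptions, not the project's actual calibration.

```python
def relative_position(l_shoulder, r_shoulder,
                      fx=500.0, cx=320.0, cy=240.0,
                      shoulder_width_m=0.4):
    """Shoulders: (u, v) pixel coordinates of the two shoulder keypoints.
    Returns (X, Y, Z) in meters, Z being the forward camera-to-person distance."""
    (u1, v1), (u2, v2) = l_shoulder, r_shoulder
    pixel_width = abs(u1 - u2)
    Z = fx * shoulder_width_m / pixel_width          # depth from apparent size
    u_mid, v_mid = (u1 + u2) / 2.0, (v1 + v2) / 2.0  # body center in the image
    X = (u_mid - cx) * Z / fx                        # lateral offset
    Y = (v_mid - cy) * Z / fx                        # vertical offset
    return X, Y, Z
```

This is the same relation that makes the face-landmark variant work with a narrow-FOV camera: any detected feature pair of known physical size can anchor the metric scale.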