In-situ Target Detection and Tracking for Micro UAVs

Unmanned aerial vehicles (UAVs) are aircraft without a human pilot, operated remotely with varying levels of automation. They were first introduced during World War I: in January 1918, the U.S. Army awarded the first production contract for an unmanned aircraft to the Dayton-Wright Airplane Company for 25 Liberty Eagles. With the rapidly increasing research and development in this field, UAVs are now deployed in urban and military settings for a wide variety of field robotics applications such as environmental disaster detection and monitoring, search and rescue, traffic monitoring, and target designation and tracking.

The UbiComp group works on algorithms to enable autonomous Micro Unmanned Aerial Vehicle (MAV) operations, in which video captured by UAVs is processed with state-of-the-art algorithms for target detection, recognition, and tracking to support different MAV applications. Processing aerial UAV imagery is challenging: images are usually acquired with low-resolution cameras from high altitudes, so targets appear as small objects for which discriminative models are difficult to construct. The problem is further complicated by the continuous motion of the MAV, which introduces rapid camera motion while tracking objects. Additional challenges arise from significant appearance changes and partial or full occlusions that may cause drift in target tracking.

We propose an integrated framework comprising a novel set of algorithms that alleviates these problems to enable robust, on-board, real-time target detection and tracking in UAV imagery. Scene and camera motions are separated from the targets' motion using an algorithm based on image feature processing and projective geometry. Detected targets are tracked with Kalman filtering, while an overlap-rate-based data association mechanism followed by a tracking-persistency check discriminates true moving targets from false detections. The proposed framework does not apply explicit image transformations to detect potential targets, which reduces computational time and registration errors. Tracking performance is further enhanced by using P-N learning to learn the targets' appearances online. To this end, novel P-N constraints based on data association control the positive and negative samples, while a cascaded classifier detects the targets when association fails after the targets' appearance has been learned.
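As a rough illustration of the motion-separation step, the sketch below matches binary point features (via OpenCV's ORB, which pairs FAST keypoints with a BRIEF-style descriptor) between consecutive frames, fits a homography with RANSAC, and treats the RANSAC outliers as candidate moving targets. This is a minimal sketch under a planar-scene assumption; the function name and parameter values are illustrative, not the framework's actual implementation.

```python
import cv2
import numpy as np

def detect_moving_candidates(prev_gray, curr_gray):
    """Separate camera-induced motion from target motion (illustrative).

    Features matched between consecutive frames are used to estimate a
    homography under a planar-scene assumption; matches that RANSAC
    rejects as outliers move inconsistently with the camera and are
    returned as candidate moving targets.
    """
    # ORB = FAST keypoints + a BRIEF-style binary descriptor
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return []

    # Binary descriptors are compared by Hamming distance
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < 4:  # a homography needs at least 4 correspondences
        return []

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # mask[i] == 1 for inliers (camera/scene motion), 0 for outliers
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    if H is None:
        return []
    mask = mask.ravel()
    return [tuple(dst[i, 0]) for i in range(len(matches)) if mask[i] == 0]
```

In the full framework, such outlier features would then be grouped into candidate targets (the outlier-clustering phase illustrated below) before being handed to the tracker.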

The Ubiquitous & Visual Computing group also develops different MAV models, including the Nile University Research Quad-rotor (NURQ) used for experimental data collection (shown in the figure on the right). Quad-rotors in general have several advantages over fixed-wing UAVs, including low cost, the ability to hover, and vertical take-off and landing, which enable them to be used indoors and in constrained urban areas. Assembled in our Ubiquitous & Visual Computing lab, NURQ is 50 cm in diameter with a total weight of 1735 g. A 3-cell lithium-polymer battery (11.1 V, 2200 mAh, 45C constant / 90C burst discharge rate) gives the quad-rotor a flight duration of up to 10 minutes (or more, depending on battery and useful payload). The 10-inch propellers also make it safe for indoor use. The autopilot board is an ArduPilot Mega with an ATmega2560 processor running at 16 MHz and gyroscope, accelerometer, magnetometer, and barometer sensors. An LV-EZ0 sonar provides altitude measurements at low altitudes up to 6.5 meters; above this, the barometer serves as the altitude indicator. The open-source ArduCopter autopilot software controls the quad-rotor and performs position and altitude stabilization. A camera mounted under the quad-rotor captures 320×240 images at 60 fps. The payload also includes an embedded computer board, remote communication modules, and a number of additional sensors.
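As a rough sketch of the altitude-source switching described above (the switching logic and function name are our illustration, not the ArduPilot implementation):

```python
# Hypothetical sketch of the altitude-source switching described above:
# trust the LV-EZ0 sonar inside its ~6.5 m range, otherwise fall back
# to the barometric estimate.
SONAR_MAX_RANGE_M = 6.5

def fused_altitude(sonar_m: float, baro_m: float) -> float:
    """Return the preferred altitude estimate in meters."""
    if 0.0 < sonar_m <= SONAR_MAX_RANGE_M:
        return sonar_m   # sonar is accurate at low altitudes
    return baro_m        # beyond sonar range, use the barometer
```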

One proposed solution to target detection and tracking in MAV imagery comprises three phases: fast point feature detection (left), outlier clustering (middle), and target tracking with data association and cascaded classifiers (right).
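For the tracking phase on the right, the following minimal sketch pairs a constant-velocity Kalman filter (via OpenCV's cv2.KalmanFilter) with an overlap-rate (intersection-over-union) association test; the noise covariance and the 0.3 threshold are illustrative assumptions, not the framework's tuned values.

```python
import cv2
import numpy as np

def make_kalman(x, y):
    # Constant-velocity model: state = [x, y, vx, vy], measurement = [x, y]
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2  # assumed value
    kf.statePost = np.array([[x], [y], [0], [0]], np.float32)
    return kf

def overlap_rate(a, b):
    # Intersection-over-union of two boxes given as (x, y, w, h)
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def associate(predicted_box, detections, thresh=0.3):
    # Keep the detection that best overlaps the Kalman-predicted box;
    # returning None signals an association failure.
    best = max(detections, key=lambda d: overlap_rate(predicted_box, d),
               default=None)
    return best if best and overlap_rate(predicted_box, best) >= thresh else None
```

On association failure, the framework falls back to the cascaded appearance classifier described above, while repeated successful associations feed the tracking-persistency check.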



Nile University Research Quad-rotor (NURQ).



A test-bed has been implemented to enable automated quantitative evaluation of the different detection and tracking algorithms. It comprises two functionalities: ground truth labeling and tracking output evaluation. Shown above is the GUI of our test-bed.
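As a hedged sketch of the kind of score such a test-bed might compute, the function below greedily matches tracker output against labeled ground-truth boxes using the same overlap measure and reports per-frame precision and recall; the 0.5 threshold and box format are assumptions, not the test-bed's actual specification.

```python
def evaluate_frame(pred_boxes, gt_boxes, thresh=0.5):
    """Greedy per-frame scoring: a predicted box counts as a true
    positive if it overlaps an unmatched ground-truth box by >= thresh.
    Boxes are (x, y, w, h) tuples."""
    def iou(a, b):  # same overlap-rate measure used for data association
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2 = min(a[0] + a[2], b[0] + b[2])
        y2 = min(a[1] + a[3], b[1] + b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        union = a[2] * a[3] + b[2] * b[3] - inter
        return inter / union if union > 0 else 0.0

    unmatched = list(gt_boxes)
    tp = 0
    for p in pred_boxes:
        best = max(unmatched, key=lambda g: iou(p, g), default=None)
        if best is not None and iou(p, best) >= thresh:
            tp += 1
            unmatched.remove(best)
    fp = len(pred_boxes) - tp         # spurious tracks
    fn = len(unmatched)               # missed ground-truth targets
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 1.0
    return precision, recall
```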



Matching BRIEF descriptors by their Hamming distance improves the robustness of the framework. Note that a large structure (an electricity tower) between the camera and the target violates the planar-scene assumption made when computing the homography and partially occludes the target. Nevertheless, the proposed framework still detects and tracks the target accurately.
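Since BRIEF descriptors are binary strings, the Hamming distance is simply the number of differing bits; a minimal, self-contained illustration (using random placeholder descriptors rather than ones extracted from real frames):

```python
import numpy as np

# A BRIEF descriptor is a 256-bit binary string stored as 32 uint8 bytes.
# The Hamming distance between two descriptors is the number of bits in
# which they differ: XOR the bytes, then count the set bits.
rng = np.random.default_rng(0)
d1 = rng.integers(0, 256, 32, dtype=np.uint8)  # placeholder descriptors
d2 = rng.integers(0, 256, 32, dtype=np.uint8)

hamming = int(np.unpackbits(d1 ^ d2).sum())
print(f"Hamming distance: {hamming} of 256 bits differ")
```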


 
Copyright © 2010-2014 Ubiquitous & Visual Computing Group