Particle Filter Tracking Architecture for use Onboard Unmanned Aerial Vehicles

A Thesis
Presented to
The Academic Faculty

by

Ben T. Ludington

In Partial Fulfillment of the Requirements for the Degree
Doctor of Philosophy

School of Electrical and Computer Engineering
Georgia Institute of Technology
December 2006

Particle Filter Tracking Architecture for use Onboard Unmanned Aerial Vehicles

Approved by:

Dr. George Vachtsevanos, Advisor
School of Electrical and Computer Engineering
Georgia Institute of Technology

Dr. Anthony Yezzi
School of Electrical and Computer Engineering
Georgia Institute of Technology

Dr. Bonnie Heck Ferri
School of Electrical and Computer Engineering
Georgia Institute of Technology

Dr. Eric Johnson
Daniel Guggenheim School of Aerospace Engineering
Georgia Institute of Technology

Dr. Patricio Vela
School of Electrical and Computer Engineering
Georgia Institute of Technology

Date Approved: November 10, 2006

ACKNOWLEDGEMENTS

This work was financially supported by both the School of Electrical and Computer Engineering at Georgia Tech and DARPA's HURT project.

This work would not have been possible without the advice, guidance, and mentoring of my advisor, Professor George Vachtsevanos. The many conversations we have had about this work over the past years have been invaluable. Professor Eric Johnson, director of the Georgia Tech UAV Research Facility, provided additional guidance and support with simulation and flight testing. I also thank the other members of my committee, Professor Bonnie Heck, Professor Patricio Vela, and Professor Tony Yezzi.

Several students and co-workers were also instrumental in the completion of this work: Johan Reimann, Dr. Graham Drozeski, Dr. Suresh Kannan, Nimrod Rooz, Phillip Jones, Wayne Pickell, Henrik Christophersen, Dr. Liang Tang, Alan Wu, Sharon Lawrence and others.

Finally, my fiancee, Leah, helped me in so many ways. I do not want to think about doing this without her being there every step of the way.

TABLE OF CONTENTS

ACKNOWLEDGEMENTS
LIST OF TABLES
LIST OF FIGURES
NOMENCLATURE
SUMMARY

1 INTRODUCTION AND MOTIVATION
  1.1 System Overview
    1.1.1 Particle Filter
    1.1.2 Closing the Loop
  1.2 General Assumptions
  1.3 Other Application Domains
  1.4 Organization Of This Thesis

2 PARTICLE FILTER BACKGROUND
  2.1 Problem Description and Bayesian Tracking
  2.2 Other State Estimation Tools
  2.3 Particle Filter Fundamentals
  2.4 Model Adaptation
    2.4.1 System Model
    2.4.2 Measurement Model and Data Fusion
  2.5 Computational Complexity
  2.6 Performance Estimation
    2.6.1 Measurement Residuals
    2.6.2 Kullback-Leibler Distance
    2.6.3 Change Detection Approaches

3 BASELINE PARTICLE FILTER FOR VISUAL TRACKING
  3.1 Models
    3.1.1 System Update Model
    3.1.2 Measurement Model
  3.2 Initialization
  3.3 Results

4 PARTICLE FILTER ADAPTATION
  4.1 System Update Model
  4.2 Measurement Model
  4.3 Efficiency Improvements
  4.4 Performance Estimate Using a Multi-Layer Perceptron Neural Network
  4.5 Results

5 TARGET TRACKING BACKGROUND
  5.1 Visual Servoing
    5.1.1 Position-Based Visual Servoing
    5.1.2 Image-Based Visual Servoing
  5.2 Visual Servoing and Unmanned Aerial Systems

6 CLOSED LOOP TRACKING IMPLEMENTATION
  6.1 GTMax Unmanned Helicopter
  6.2 Coordinate System Transformation
  6.3 Linear Prediction
  6.4 Camera Command and Waypoint Generation

7 RESULTS
  7.1 Simulation Results
    7.1.1 Tracking With No Obstructions
    7.1.2 Tracking With Obstructions
  7.2 Flight Test Results

8 CONCLUSIONS AND POSSIBLE FUTURE WORK
  8.1 Contributions
  8.2 Possible Future Work
    8.2.1 Further Comparisons to Other Techniques
    8.2.2 Target Recognition
    8.2.3 Multiple Aircraft
    8.2.4 Multiple Targets
    8.2.5 Non-Steerable Cameras

APPENDIX A — PROOF OF MOTION METRIC

APPENDIX B — A NOTE ABOUT UPDATING A PORTION OF THE PARTICLE SET AND CONVERGENCE
APPENDIX C — PARTICLE FILTER RESULT MOVIES

REFERENCES

RELATED PUBLICATIONS

LIST OF TABLES

1 Parameters used for the baseline particle filter.
2 Ranges of parameters used for the adaptive particle filter.
3 Summary of the case study comparisons.
4 Characteristics of the Yamaha RMax airframe.
5 GTMax Baseline Avionics.

LIST OF FIGURES

1 Frame from the GTMax unmanned helicopter. Due to clutter and the limited resolution of the onboard camera, it can be difficult for both human operators and traditional automated systems to estimate the state of the target, which is a sports utility vehicle near the center of the frame. This work seeks to provide an automated solution that will track such a target under similar tracking conditions.
2 Vehicle tracking movie taken from the GTMax unmanned helicopter. Like Figure 1, this movie illustrates the difficulties encountered in tracking a vehicle from an aerial platform. This is a movie; in electronic copies of the thesis, click on the image to play the movie.
3 Graphical overview of the system.
4 Typical image captured from a camera mounted on a UAV.
5 Measurement likelihood distribution. The likelihood distribution was generated using color and motion cues.
6 Desired particle filter output when attempting to track a soldier in an urban environment.
7 Typical frame from the GTMax unmanned helicopter. The sports utility vehicle near the center of the frame is the target of interest.
8 The right side shows the difference pixels of the left side image. Notice the number of difference pixels in the vicinity of the target.
9 System diagram of the automatic initialization routine.
10 Three typical outputs of the automatic initialization routine. In all cases, the majority of the particles are placed in the vicinity of the target.
11 Frames 8, 23, 38, 53, 68, and 86 of the baseline particle filter output used to track a soldier in an urban environment. The output movie is included in Appendix C, Figure 55.
12 Tracking error of the baseline particle filter used to track a soldier in an urban environment.
13 Processing time of the baseline particle filter used to track a soldier in an urban environment.
14 Frames 32, 95, 158, 222, 285, and 348 of the baseline particle filter output used to track a SUV in a rural environment. The output movie is included in Appendix C, Figure 56.
15 Tracking error of the baseline particle filter used to track a SUV in a rural environment.
16 Frames 17, 50, 83, 116, 149, and 182 of the baseline particle filter output used to track a van in an urban environment. The output movie is included in Appendix C, Figure 57.
17 Tracking error of the baseline particle filter used to track a van in an urban environment.
18 Frames 8, 23, 38, 53, 68, and 86 of the adaptive particle filter output used to track a soldier in an urban environment. The output movie is included in Appendix C, Figure 58.
19 Tracking error of the adaptive particle filter used to track a soldier in an urban environment.
20 Processing time of the adaptive particle filter used to track a soldier in an urban environment.
21 Number of particles used by the adaptive particle filter to track a soldier in an urban environment.
22 Output of the neural network performance estimate used by the adaptive particle filter to track a soldier in an urban environment.
23 Frames 32, 95, 158, 222, 285, and 348 of the adaptive particle filter output used to track a SUV in a rural environment. The output movie is included in Appendix C, Figure 59.
24 Tracking error of the adaptive particle filter used to track a SUV in a rural environment.
25 Output of the neural network performance estimate used by the adaptive particle filter to track a SUV in a rural environment.
26 Number of particles used by the adaptive particle filter to track a SUV in a rural environment.
27 Frames 17, 50, 83, 116, 149, and 182 of the adaptive particle filter output used to track a van in an urban environment. The output movie is included in Appendix C, Figure 60.
28 Tracking error of the adaptive particle filter used to track a van in an urban environment.
29 Output of the neural network performance estimate used by the adaptive particle filter to track a van in an urban environment.
30 Number of particles used by the adaptive particle filter to track a van in an urban environment.
31 System diagram of a position-based visual servoing system.
32 System diagram of an image-based visual servoing system.
33 GTMax unmanned research helicopter.
34 Axis 213 camera. This camera is used onboard the GTMax.
35 Screenshot of the GTMax simulation tools. The scene window on the right shows the simulated camera view that can be used for tracking testing.
36 Simplified geometry used for coordinate system transformation. The particle filter yields e_h and e_v, which are used to find θ_e and ψ_e.
37 Waypoint generation, case one. No obstructions are known.
Therefore, the next waypoint will be in the direction of the target and the existing GTMax software is used. The gray circle represents the reachable points. The same result will occur even if the target is outside of the reachable points.
38 Waypoint generation, case two. A linear obstruction is known and the lines of sight are not reachable. Therefore, the next waypoint is the nearest point on one of the lines of sight. The dashed circle represents the reachable points.
39 Waypoint generation, case three. A linear obstruction is known and the lines of sight are reachable. Therefore, the next waypoint is the reachable point on one of the lines of sight that is nearest the target. The dashed circle represents the reachable points, and the gray region represents the reachable points that are not occluded.
40 Target used to test the particle filter in simulation. The left image was taken near the target, while the right image was taken from an altitude of 150 feet.
41 Simulated paths of the target, helicopter, and commanded camera position.
42 Simulated path of the target, and particle filter output.
43 Simulated paths of the target, helicopter, commanded camera position, and particle filter output.
44 Position of the target in the image.
45 Moving average of the tracking error, performance estimator, and number of particles. The window for the moving average is 50 frames.
46 Simulated paths of the target, helicopter, and commanded camera position. The obstruction is represented by the thick black line.
47 Simulated path of the target, and particle filter output. The obstruction is represented by the thick black line.
48 Simulated paths of the target, helicopter, commanded camera position, and particle filter output.
49 Position of the target in the image.
50 Moving average of the tracking error, performance estimator, and number of particles. The window for the moving average is 50 frames.
51 Frames collected during particle filter flight testing.
52 Frames collected during adaptive particle filter flight testing. The movie collected during this portion of the flight is included in Appendix C, Figure 61.