
Vision-Based Estimation and Tracking Using Multiple Unmanned Aerial Vehicles

by

Mingfeng Zhang

A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy
Graduate Department of Institute for Aerospace Studies
University of Toronto

Copyright © 2013 by Mingfeng Zhang

Abstract

Vision-Based Estimation and Tracking Using Multiple Unmanned Aerial Vehicles
Mingfeng Zhang
Doctor of Philosophy
Graduate Department of Institute for Aerospace Studies
University of Toronto
2013

Small unmanned aerial vehicles (UAVs) have very limited payload capacity and power supply, yet they are envisioned to be deployed in complicated environments to perform challenging tasks; it is therefore imperative to improve their overall autonomy and energy efficiency. This thesis presents a vision-based target surveillance system for small fixed-wing UAVs, which provides them with the capability of autonomously localizing and tracking an uncooperative ground moving target (GMT) using passive vision sensors such as monocular cameras. This system could greatly improve the autonomy of UAVs, since target estimation and tracking are critical functions involved in many UAV missions.

In this thesis, the UAV-GMT tracking problem is studied from various perspectives, and three different tracking algorithms are developed. The first tracking algorithm is based on sliding mode control; it enables a fixed-wing UAV to maintain a constant distance relative to a GMT moving at speeds up to the UAV's cruising speed. The second tracking algorithm is designed to deal with an evasive GMT. It is derived in the framework of pursuit-evasion game theory and guarantees persistent observation of the GMT. The third, based on visual servoing control, directly uses visual measurements as feedback signals to close the control loop, so it does not require the GMT's states to assist tracking.

The second part of this thesis presents a formation flight algorithm which enables multi-UAV cooperation in target tracking and target motion analysis. The motion of the formation is solely determined by its virtual leader; therefore, tracking algorithms developed for a single UAV can be directly extended to a multi-UAV formation through implementation on the virtual leader.

The third part of this thesis develops a cooperative nonlinear filter to recover the states of a GMT using multiple vision-enabled UAVs. By fusing visual measurements from multiple sources, the observability of this bearing-only estimation problem is ensured. Special attention is paid in the filter design to accommodate the GMT's unknown dynamics.

The tracking, estimation, and formation algorithms developed in this thesis are verified in extensive numerical simulations.

Acknowledgements

This thesis would not have been possible without the support of many people. I would like to express my deepest gratitude to my supervisor, Prof. Hugh H.T. Liu, for his guidance, support, encouragement, and patience throughout my research period at UTIAS. I also want to thank my doctoral examination committee members, Prof. Peter R. Grant and Prof. Christopher J. Damaren, for the valuable feedback and insightful suggestions they provided on my research progress.

I am thankful to colleagues and friends in the Flight Systems and Control group, especially Ms. Chen Gao, Mr. Moe Elnabelsy, Mr. Denys Bohdanov, Dr. Sohrab Haghighat, Dr. Keith Leung, and Mr. Yoshitsugu Hitachi, for their generous help.

I am grateful to my family and friends, who have always believed in me and helped me reach my goals. I could not have come this far without their continuous support and encouragement over the years.

Contents

1 Introduction  1
  1.1 Background and Motivation  1
  1.2 Literature Review  3
    1.2.1 Vision Sensors in UAV Applications  3
    1.2.2 Vision-Based Estimation and Tracking  6
    1.2.3 Visual Servoing Control  9
    1.2.4 Motion Control of Fixed-Wing UAVs  10
  1.3 Vehicle and Sensor Modeling  13
    1.3.1 UAV Model  13
    1.3.2 Target Model  16
    1.3.3 Vision Sensor  17
  1.4 Thesis Contributions and Outline  19

2 Loitering-Based Tracking  22
  2.1 Introduction  22
  2.2 Problem Formulation  23
  2.3 Tracking Law Based on Sliding Mode Control  25
  2.4 Simulation and Results  29
  2.5 Summary  30

3 Persistent Tracking  36
  3.1 Introduction  36
  3.2 Problem Formulation  38
    3.2.1 Relative Motion Model  38
    3.2.2 Definition of Persistent Tracking  40
  3.3 Viability Set and Tracking Algorithm  41
    3.3.1 Viability Set  41
    3.3.2 Persistent Tracking Algorithm  45
  3.4 Simulation with a Unicycle UAV  48
  3.5 Implementation with a 6DOF Nonlinear UAV  53
    3.5.1 UAV Dynamic Model  53
    3.5.2 Flight Controller Design  55
    3.5.3 Simulation Results  57
  3.6 Experiments with Ground Vehicles  58
    3.6.1 Experiment Setup  63
    3.6.2 Results and Analysis  64
  3.7 Summary  64

4 Image-Based Target Tracking  70
  4.1 Introduction  70
  4.2 Problem Formulation  71
    4.2.1 System Models and Tracking Geometry  71
    4.2.2 Relative Motion in Image Space  71
  4.3 Image-Based Tracking Law  73
    4.3.1 Tracking Stationary Targets  73
    4.3.2 Tracking a Moving Target  74
  4.4 Simulation and Results  75
    4.4.1 Simulation with Stationary Targets  75
    4.4.2 Simulation with a Moving Target  76
  4.5 Summary  78

5 Cooperative Tracking Using Multiple UAVs  81
  5.1 Introduction  81
  5.2 Problem Formulation  82
  5.3 Formation Control Law  84
  5.4 Simulation and Results  87
    5.4.1 Multi-UAV Path Following  87
    5.4.2 Cooperative Persistent Tracking  88
    5.4.3 Cooperative Loitering-Based Tracking  93
  5.5 Summary  97

6 Cooperative Estimation Using Multiple UAVs  103
  6.1 Introduction  103
  6.2 Preliminaries  104
    6.2.1 Vehicle Model  104
    6.2.2 Measurement Model  104
  6.3 Cooperative Nonlinear Filter  106
    6.3.1 Filter Formulation  106
    6.3.2 Filter Design  109
    6.3.3 Filter Synthesis  112
  6.4 Simulation and Analysis  114
    6.4.1 Simulation Results  114
    6.4.2 Performance Comparison  117
  6.5 Summary  120

7 Conclusions and Future Work  121

A Projection Operator  124

Bibliography  125

List of Tables

2.1 Gains and parameters of the loitering-based tracking algorithm  30
3.1 Physical parameters of a flying wing UAV  54
3.2 Parameters in the persistent tracking simulation  58
3.3 Speed and turning rate of robots used in persistent tracking experiments  63
5.1 Initial states of UAVs in path following  87
5.2 Gains of the formation control algorithm  88
5.3 Initial states of UAVs in cooperative persistent tracking  92

List of Figures

1.1 System architecture for vision-based tracking and estimation  14
1.2 The FSC flying wing UAV  15
1.3 Camera perspective projection model  18
1.4 FOV of an onboard camera  19
2.1 Tracking a moving target by a UAV  24
2.2 The signum and saturation functions  28
2.3 GMT speed and heading angle  31
2.4 UAV and GMT 2D trajectories in an inertial frame  31
2.5 The distance error and the UAV heading angle error  32
2.6 UAV heading angle and angle λ  33
2.7 UAV turning rate  34
2.8 The sliding manifold variable  34
3.1 UAV and GMT motion model in different frames  38
3.2 Simplified relative motion model in UAV frame  39
3.3 Configuration of the semipermeable surface  42
3.4 Persistent tracking algorithm  46
3.5 Control flow of the persistent tracking algorithm  47
3.6 Simulation result with GMT running away from UAV at full speed  49
3.7 Simulation result with GMT running away from UAV at half speed  50
3.8 Simulation result with constant velocity GMT  51
3.9 Simulation result with stationary GMT  52
3.10 UAV dimensional turning rate in simulation  53
3.11 Speed control loop  56
3.12 Inner loop of altitude controller  56
3.13 Outer loop of altitude controller  56
3.14 Turning rate control loop  57
3.15 Augmented UAV in persistent tracking  58
3.16 Simulation result with 6DOF UAV and GMT running away from UAV at full speed  59
3.17 Simulation result with 6DOF UAV and GMT running away from UAV at half speed  60
3.18 Simulation result with 6DOF UAV and constant velocity GMT  61
3.19 Simulation result with 6DOF UAV and stationary GMT  62
3.20 Ground robots and the Vicon motion capture system  63
3.21 Experimental result with GMT running away from UAV at full speed  65
3.22 Experimental result with GMT running away from UAV at half speed  66
3.23 Experimental result with constant velocity GMT  67
3.24 Experimental result with stationary GMT  68
3.25 Experimental result with manually-operated GMT  69
4.1 UAV 3D trajectory in tracking stationary targets  76
4.2 UAV 2D trajectory in tracking stationary targets  77
4.3 UAV turning rate in tracking stationary targets  77
4.4 GMT speed and heading angle  78
4.5 UAV and GMT 3D trajectories in image-based tracking  79
4.6 UAV and GMT 2D trajectories in image-based tracking  79
4.7 Relative distance between the UAV and the GMT  80
4.8 UAV turning rate in tracking a GMT  80
5.1 Geometry of virtual structure formation in an inertial frame  83
5.2 Trajectories of a three-UAV formation in path following  89
5.3 Trajectories of a three-UAV formation in path following (0-100 sec)  90
5.4 Speeds of a three-UAV formation in path following  90
5.5 Heading angles of a three-UAV formation in path following  91
5.6 Distance error and angle separation of three UAVs in formation  91
5.7 A four-UAV formation deployed in persistent tracking  92
5.8 Target tracking using four UAVs in formation  93
5.9 Speeds of four UAVs in formation while tracking a GMT  94
5.10 Heading angles of four UAVs in formation while tracking a GMT  94
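A note on the bearing-only estimation idea mentioned in the Abstract and developed in Chapter 6: a single monocular camera gives the direction to the GMT but not its range, so one UAV alone cannot instantaneously localize the target; fusing bearings from two or more UAVs restores observability. The sketch below is only a minimal illustration of that geometric fact, not the cooperative nonlinear filter of the thesis. It assumes a static target, noise-free measurements, and known UAV poses; the function names, focal-length parameter, and example positions are invented for the illustration.

```python
import numpy as np

def pixel_to_bearing(u, v, f):
    """Pinhole-camera model (cf. the perspective projection model of Fig. 1.3):
    turn an image-plane measurement (u, v), with focal length f in pixels,
    into a unit bearing vector expressed in the camera frame."""
    d = np.array([u, v, f], dtype=float)
    return d / np.linalg.norm(d)

def triangulate(positions, bearings):
    """Least-squares intersection of bearing rays from several UAVs.

    positions -- camera positions in the inertial frame (3-vectors)
    bearings  -- unit bearing vectors to the target in the inertial frame
                 (camera-frame bearings rotated by the known UAV/gimbal attitude)

    Each ray constrains the target point p via (I - b b^T)(p - x) = 0.
    One ray leaves the range along b unobservable; stacking rays from two
    or more distinct viewpoints makes the normal matrix invertible.
    """
    A = np.zeros((3, 3))
    rhs = np.zeros(3)
    for x, b in zip(positions, bearings):
        P = np.eye(3) - np.outer(b, b)  # projector onto the plane normal to b
        A += P
        rhs += P @ x
    return np.linalg.solve(A, rhs)

# Illustrative numbers only: two UAVs observing a ground target at the origin.
uav_positions = [np.array([100.0, 0.0, 80.0]), np.array([0.0, 120.0, 90.0])]
bearings = [-x / np.linalg.norm(x) for x in uav_positions]  # exact bearings toward the origin
print(triangulate(uav_positions, bearings))  # approximately [0, 0, 0]
```

The thesis's actual estimator must additionally cope with a moving target of unknown dynamics and with noisy measurements, which a one-shot least-squares ray intersection such as this does not address.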

