
Appearance-Based Navigation, Localization, Mapping, and Map Merging for Heterogeneous Teams of Robots (PDF)

167 Pages·2013·23.11 MB·English
by Görkem Erinç


UC Merced Electronic Theses and Dissertations

Title: Appearance-Based Navigation, Localization, Mapping, and Map Merging for Heterogeneous Teams of Robots
Permalink: https://escholarship.org/uc/item/4hb1j7fs
Author: Erinc, Gorkem
Publication Date: 2013
Peer reviewed | Thesis/dissertation
eScholarship.org, powered by the California Digital Library, University of California

University of California, Merced

Appearance-Based Navigation, Localization, Mapping, and Map Merging for Heterogeneous Teams of Robots

A dissertation submitted in partial satisfaction of the requirements for the degree Doctor of Philosophy in Electrical Engineering and Computer Science

by Görkem Erinç

2013

© Copyright by Görkem Erinç 2013

Abstract of the Dissertation

Spatial awareness is a vital component for most autonomous robots operating in unstructured environments. Appearance-based maps are emerging as an important class of spatial representations for robots. Requiring only a camera instead of an expensive sensor such as a laser range finder, appearance-based maps provide a world model well suited to human perception and offer a natural way to exchange information between robots and humans. In this dissertation, we embrace this representation and present a framework that provides navigation, localization, mapping, and map merging capabilities to heterogeneous multi-robot systems using exclusively monocular vision. Our first contribution integrates ideas from separately proposed solutions into a robust appearance-based localization and mapping framework that does not suffer from the individual issues of the original methods. Next, we introduce a novel visual navigation algorithm that steers a robot between two images through the shortest possible path.
Thanks to its invariance to changes in the tilt angle and elevation of the cameras, images collected by another robot with a completely different morphology and camera placement can be used for navigation. Furthermore, we tackle the problem of merging two or more appearance-based maps independently built by robots operating in the same environment, and propose an anytime algorithm that aims to quickly identify the most advantageous parts to merge. Noting the lack of evaluation criteria for appearance-based maps, we introduce a task-specific quality metric that measures the utility of a map with respect to three major robotic tasks: localization, mapping, and navigation. Additionally, to measure the quality of merged appearance-based maps and the performance of the merging algorithm, we propose the use of algebraic connectivity, a concept borrowed from graph theory. Finally, we introduce a machine-learning-based WiFi localization technique that we later embrace as the core of our novel heterogeneous map merging algorithm. All algorithms introduced in this dissertation are validated on real robots.

The dissertation of Görkem Erinç is approved, and it is acceptable in quality and form for publication on microfilm and electronically.

Ming-Hsuan Yang
David C. Noelle
Stefano Carpin, Committee Chair

University of California, Merced, 2013

Dedicated to my father...

Table of Contents

1 Introduction
  1.1 Dissertation Contributions
2 Related Work
  2.1 Visual Navigation
    2.1.1 PBVS methods
    2.1.2 IBVS methods
  2.2 Vision-Based Localization and Mapping
  2.3 Map Merging
  2.4 WiFi Localization
3 Visual Navigation
  3.1 Problem Formulation
    3.1.1 System Model
    3.1.2 Camera Model
    3.1.3 Epipolar Geometry
  3.2 Navigation Algorithm for Heterogeneous Multi-Robot Systems
    3.2.1 Single Robot Navigation
    3.2.2 Multi-Robot Navigation
  3.3 Results
    3.3.1 Simulation
    3.3.2 Implementation on a Multi-Robot System
  3.4 Conclusions
4 Appearance-Based Localization and Mapping
  4.1 Definition
  4.2 Image Representation
  4.3 Data Structures and Training
  4.4 Localization in Appearance-Based Maps
  4.5 Appearance-Based Map Building
  4.6 Planning and Navigation on Appearance-Based Maps
  4.7 Quality Assessment of Appearance-Based Maps
    4.7.1 Evaluation of Visual Maps
      4.7.1.1 Localization
      4.7.1.2 Planning
      4.7.1.3 Navigation
  4.8 Conclusions
5 Appearance-Based Map Merging
  5.1 Problem Formulation
  5.2 Measuring the Quality of Merged Maps
    5.2.1 Defining Entanglement in Terms of Connectivity
    5.2.2 Algebraic Connectivity
      5.2.2.1 Properties of Algebraic Connectivity
  5.3 Merging Algorithm
    5.3.1 Merging Two Maps
    5.3.2 Merging Multiple Maps
  5.4 Results
  5.5 Conclusions
6 Heterogeneous Map Merging
  6.1 Problem Formulation
  6.2 Clustering
  6.3 WiFi Localization and Mapping
    6.3.1 Classification
      6.3.1.1 Description
      6.3.1.2 Classification Algorithms
      6.3.1.3 Results
    6.3.2 Regression
      6.3.2.1 Results
    6.3.3 Monte Carlo Localization
      6.3.3.1 Results
  6.4 Merging Algorithm
    6.4.1 Map Overlap Estimation
      6.4.1.1 OCC Algorithms
      6.4.1.2 Results
    6.4.2 Probability Distribution Function
    6.4.3 Edge-Based Refinement and Regression
    6.4.4 Results
      6.4.4.1 Full Map Merge
      6.4.4.2 Partial Map Merge
  6.5 Conclusions
7 Conclusions
References

List of Figures

3.1 Pinhole camera model
3.2 Transformation from normalized coordinates to coordinates in pixels
3.3 Epipolar geometry
3.4 Illustration of the visual navigation problem
3.5 Four-step navigation algorithm
3.6 Tilt-correction process
3.7 Controller error profiles plotted for each stage of the navigation algorithm
3.8 Distance and heading errors between the actual and target views plotted during the four stages of the navigation algorithm
3.9 Heterogeneous robot team used to test the navigation algorithm
3.10 Sample results of the servoing algorithm tested on the Create robot
3.11 Sample results of the servoing algorithm tested on the P3AT robot
3.12 Sample results of the servoing algorithm tested on the Create robot using the map generated by the P3AT robot
3.13 Sample results of the servoing algorithm tested on the P3AT robot using the map generated by the Create robot
4.1 Overview of dictionary learning, map building and localization procedures
4.2 A simple appearance-based map with edges inserted between sufficiently similar images
4.3 Sample image with extracted SIFT features
4.4 A representative set of random training images collected from online repositories
4.5 Performance comparison of the majority voting scheme and the pairwise image matching method
4.6 Snapshots of the graphical user interface during the map building process
4.7 Sample path followed to create an appearance-based map
4.8 Waypoint images and their corresponding final views from both robots
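The algebraic connectivity the abstract proposes as a quality measure for merged maps is a standard graph-theoretic quantity: the second-smallest eigenvalue of the graph Laplacian (the Fiedler value). A minimal sketch of computing it, assuming the appearance-based map is given as an undirected adjacency matrix (the function name and toy graphs below are ours, not the dissertation's):

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A.

    adj: symmetric adjacency matrix of an undirected graph.
    Returns (near) 0 for a disconnected graph; larger values indicate
    a more tightly connected map.
    """
    adj = np.asarray(adj, dtype=float)
    degree = np.diag(adj.sum(axis=1))
    laplacian = degree - adj
    eigenvalues = np.linalg.eigvalsh(laplacian)  # sorted ascending
    return eigenvalues[1]

# A path graph on 4 vertices vs. a 4-cycle: closing the loop adds a
# redundant route, so the cycle's algebraic connectivity is higher.
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]])
cycle = path.copy()
cycle[0, 3] = cycle[3, 0] = 1
assert algebraic_connectivity(cycle) > algebraic_connectivity(path)
```

This matches the intuition behind using it for merged maps: a merge that only welds two maps at a single image leaves a near-disconnected graph with a tiny Fiedler value, while a merge that finds many cross-links raises it.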

Description:
These vision-based solutions work in image space and do not necessarily need metric localization for mapping or navigation purposes.
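A rough illustration of this image-space idea, consistent with the appearance-based maps described above (vertices are images, edges connect sufficiently similar views, and navigation is shortest-path search over the graph). The scalar `similarity` callable and threshold here are placeholders for illustration; the dissertation matches SIFT features between images instead:

```python
import heapq

def build_appearance_graph(images, similarity, threshold):
    """Connect every pair of images whose similarity exceeds a threshold.

    similarity: placeholder callable returning a score in [0, 1].
    Edge weights favor traversing highly similar (easy to servo) pairs.
    """
    graph = {i: {} for i in range(len(images))}
    for i in range(len(images)):
        for j in range(i + 1, len(images)):
            s = similarity(images[i], images[j])
            if s > threshold:
                graph[i][j] = graph[j][i] = 1.0 - s
    return graph

def shortest_image_path(graph, start, goal):
    """Dijkstra over the appearance graph: the sequence of waypoint
    images a robot should servo through from start view to goal view."""
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph[node].items():
            if nxt not in seen:
                heapq.heappush(frontier, (cost + weight, nxt, path + [nxt]))
    return None  # goal lies in a disconnected part of the map
```

Note that no metric pose appears anywhere: the plan is a chain of images, and each hop is executed by visual servoing toward the next image, which is what lets robots with different morphologies reuse each other's maps.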
