Hybrid Aerial and Ground-based Mobile Robot for Retail Inventory, by Yibo Lyu (dissertation, PDF, 106 pages, 2.85 MB, English)
Hybrid Aerial and Ground-based Mobile Robot for Retail Inventory

by

Yibo Lyu

A dissertation submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for the Degree of Doctor of Philosophy

Auburn, Alabama
August 4, 2018

Keywords: autonomous mobile robot, RFID-based inventory, SLAM, UAV, Parallel Tracking and Mapping (PTAM)

Copyright 2018 by Yibo Lyu

Approved by

Thaddeus A. Roppel, Chair, Associate Professor of Electrical and Computer Engineering
Shiwen Mao, Samuel Ginn Endowed Professor of Electrical and Computer Engineering
John Y. Hung, Professor of Electrical and Computer Engineering
Robert Dean, Professor of Electrical and Computer Engineering

Abstract

Radio frequency identification (RFID) technology is increasingly used in retail stores and warehouses because it can help to improve the inventory process, enable automatic checkout, and reduce shoplifting. This dissertation introduces two related robotic systems designed to support RFID-based inventory counting: the first is a ground-based mobile robot, and the second is an unmanned aerial vehicle (UAV). The UAV is intended to supplement the vertical reach of the ground vehicle.

The ground-based robot uses two novel methods to create a map and plan its path: (1) auto-mapping, a semi-autonomous algorithm that guides the robot through the space to be inventoried, and (2) multi-layer mapping, which fuses a structured-light sensor (e.g., Kinect) with a light detection and ranging (LIDAR) sensor. The experimental results show that the map made by the new auto-mapping method is as good as one made manually. Compared against the ground-truth map, the multi-layer map is more accurate than a map made by traditional simultaneous localization and mapping (SLAM).

This dissertation also presents a control algorithm that guides the UAV to a landing pad on the ground-based robot. This algorithm is named PTAM, for parallel tracking and mapping.
PTAM is a camera tracking system that can work without a pre-made map or landmarks, and it uses a 2-D camera to perform 3-D tracking. The UAV can therefore track its position and scale using this algorithm. It can then generate a 3-D path according to inventory needs, and the UAV will follow this path point by point. The experimental results show that the UAV can complete a pre-set flight path with tolerable error. The flight path includes taking off from a landmark, moving from point to point as pre-set, and returning to the landmark.

Acknowledgments

I would first like to thank my advisor, Dr. Thaddeus Roppel, for his support and guidance during my time at Auburn University. I would like to thank Justin Patton for introducing me to the RFID world and providing me with all the resources for my research. I would like to thank Dr. Senthilkumar CP for his instruction in RFID technology. I would like to thank Dr. Shiwen Mao, Dr. John Y. Hung, and Dr. Robert N. Dean for serving on my dissertation committee.

My sincere thanks also go to my wife for standing behind me throughout my life. She has been my inspiration and motivation for continuing to improve my knowledge. My dissertation could not have been finished without her help and support. I would also like to thank my parents for their wise counsel and sympathetic ear. You are always there for me.

Finally, there are my friends. We were not only able to support each other, but also happy to talk about things other than just our papers. Lastly, I would like to thank all my colleagues in our lab and at Auburn University. I have enjoyed working here for the pleasant atmosphere, and the brainstorming here also contributed to my dissertation. A special thanks goes to Dr. Jian Zhang, who gave me much help and instruction with my research work.

Table of Contents

Abstract
Acknowledgments
Table of Contents
List of Tables
List of Illustrations
List of Abbreviations
Chapter 1 Introduction
  1.1 Introduction
  1.2 Goals and Methods
Chapter 2 Literature Review
  2.1 Mapping
  2.2 Robot Mapping
  2.3 UAV
Chapter 3 Background
  3.1 Model & Type of RFID System
  3.2 UHF Passive RFID System
Chapter 4 Mapping
  4.1 SLAM
  4.2 Auto-mapping
    4.2.1 Frontier-based Exploration Algorithm
    4.2.2 Auto-mapping Algorithm
  4.3 Multi-layer Mapping
    4.3.1 Features for Kinect & LIDAR
    4.3.2 Optimized Process for SLAM
    4.3.3 Multi-layer Mapping Algorithm
  4.4 Experiment & Results
    4.4.1 Robot Platform Overview
    4.4.2 Auto-mapping
    4.4.3 Multi-layer Mapping
Chapter 5 Unmanned Aerial Vehicle (UAV)
  5.1 UAV Model
  5.2 PTAM
  5.3 Monocular UAV Control Platform
  5.4 Control Algorithm
    5.4.1 High-level Control Algorithm
    5.4.2 Low-level Control Method
  5.5 Experiment & Results
    5.5.1 Platform Overview
    5.5.2 Experiment & Results
Chapter 6 Conclusion & Future Work
  6.1 Conclusion
  6.2 Future Work
References

List of Tables

Table 1 Comparison of errors of maps made by human & auto-mapping
Table 2 Pose estimation errors of LIDAR and Kinect
Table 3 Errors of obstacle positions in merged-sensor map and Kinect-layer map
Table 4 Errors of obstacle positions and sizes

List of Illustrations

Figure 2.1 Essential SLAM problem
Figure 2.2 Ground truth and estimated camera trajectory in 2D
Figure 2.3 Output voxel grid (1 cm resolution)
Figure 2.4 Typical operation of PTAM
Figure 3.1 A basic RFID system
Figure 3.2 Frequency spectrum for RFID
Figure 3.3 Passive, semi-passive, and active RFID systems
Figure 3.4 Simulation of received power distribution
Figure 4.1 Process of SLAM
Figure 4.2 Example of frontier detection (an 8-neighbor world is assumed)
Figure 4.3 Navigation stack in ROS
Figure 4.4 Map shown in rviz
Figure 4.5 Process of multi-layer mapping
Figure 4.6 (a) Photo of shoe rack, (b) depth image of shoe rack
Figure 4.7 Robot system components
Figure 4.8 Robot description
Figure 4.9 Kinect field of view
Figure 4.10 (a) Experiment sales floor, (b) blueprint map
Figure 4.11 Initial map for auto-mapping
Figure 4.12 Finished auto-mapping map
Figure 4.13 (a) Map made by auto-mapping, (b) map made by human, (c) blueprint map
Figure 4.14 (a) LIDAR and Kinect pose estimation trails while mapping, (b) position comparison of LIDAR, Kinect, and ground truth
Figure 4.15 (a) LIDAR map, (b) Kinect map, (c) blueprint map
Figure 4.16 (a) Merged-sensor map, (b) Kinect-layer map, (c) blueprint map
Figure 4.17 (a) Kinect-layer map, (b) blueprint map
Figure 5.1 Simplified quadrotor motor in hovering
Figure 5.2 Throttle movement
Figure 5.3 Roll movement
Figure 5.4 Pitch movement
Figure 5.5 Yaw movement
Figure 5.6 Flow chart for tracking process
Figure 5.7 Flow chart for mapping process
Figure 5.8 A symbiotic autonomous ground robot and drone system operating in a warehouse
Figure 5.9 The drone autonomously takes off, navigates, and lands back
Figure 5.10 Top view of front camera model
