Towards accurate earth-referenced LiDAR mapping with an autonomous mobile robot.

dc.contributor.author: Young, Matthew
dc.date.accessioned: 2020-07-27T01:30:55Z
dc.date.available: 2020-07-27T01:30:55Z
dc.date.issued: 2020
dc.description.abstract: There is enormous potential in the geospatial industry for mobile robotics to automate terrain mapping. Mobile robots can reduce operator-induced errors and perform mapping tasks autonomously without supervision. This thesis seeks to quantify the accuracy and speed of an autonomous mapping robot by comparing it to conventional survey methods. To do this, a proof-of-concept robot was developed from a "Jackal" Unmanned Ground Vehicle (UGV), controlled by the Robot Operating System (ROS). High-accuracy GNSS, IMU and wheel encoder data were fused in an open-source Unscented Kalman Filter (UKF) to localize the robot in UTM coordinates, and the ROS navigation stack was used to achieve autonomous navigation. This robot was used to map two sites by measuring its own position as it autonomously navigated between predefined waypoints. It achieved a mean vertical accuracy of 8-17 mm and mapped the sites in approximately 13-14 minutes. By comparison, a selection of conventional GNSS RTK, total station and aerial photogrammetry methods achieved accuracies in the range 0-70 mm and took either a few minutes or approximately an hour to complete. The robot was then upgraded with a LiDAR unit, and a novel method was developed for accurately aligning and registering the scans to produce a point cloud that could be compared to one collected by a scanning total station. This method, termed Anchor Cloud Mapping (ACM), was inspired by the calibration procedures used by survey-grade GNSS receivers and total stations. Its core principle is that the robot's trajectory is divided into independent sections, and the clouds in each section are registered using a keyscan, mesh-based Generalized Iterative Closest Point (GICP) method. Each cloud is then pivoted around a calibrated stationary robot pose and aligned to best fit the UKF trajectory. When compared to a Trimble SX10 scanning total station (which is accurate to 2.5 mm), the robot/ACM cloud has a median point-to-point accuracy of 51-59 mm and contains several times more points, which are more evenly distributed throughout the environment. The robot can autonomously survey an area in minutes, while the SX10 requires several hours.
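As a rough illustration of the scan-registration step the abstract describes, the sketch below aligns one LiDAR scan to a key scan and expresses the result in an earth-referenced (UTM) frame using a filter pose. This is a minimal sketch only: it assumes the Open3D library, substitutes ordinary point-to-plane ICP for the thesis's keyscan, mesh-based GICP, and uses hypothetical file names, pose values and correspondence tolerance.

# Minimal sketch only: aligns a new LiDAR scan to a key scan with
# point-to-plane ICP (a stand-in for the thesis's mesh-based GICP) and
# places the result in an earth-referenced (UTM) frame via a UKF pose.
# Open3D, the file names and the pose values are assumptions.
import numpy as np
import open3d as o3d

def planar_pose_to_matrix(easting, northing, altitude, yaw):
    """Build a 4x4 transform from a simplified planar pose (hypothetical inputs)."""
    T = np.eye(4)
    c, s = np.cos(yaw), np.sin(yaw)
    T[0, 0], T[0, 1] = c, -s
    T[1, 0], T[1, 1] = s, c
    T[:3, 3] = [easting, northing, altitude]
    return T

# Hypothetical inputs: two scans and the UKF pose of the key scan in UTM.
key_scan = o3d.io.read_point_cloud("key_scan.pcd")
new_scan = o3d.io.read_point_cloud("new_scan.pcd")
T_utm_key = planar_pose_to_matrix(easting=573000.0, northing=5180000.0,
                                  altitude=20.0, yaw=0.3)

# Point-to-plane ICP needs normals on the target (key) scan.
key_scan.estimate_normals(
    o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30))

# Register the new scan against the key scan (scan-to-keyscan alignment).
result = o3d.pipelines.registration.registration_icp(
    new_scan, key_scan,
    0.5,          # max correspondence distance in metres, assumed tolerance
    np.eye(4),    # initial guess
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

# Chain the transforms: new scan -> key scan frame -> UTM frame.
T_utm_new = T_utm_key @ result.transformation
new_scan.transform(T_utm_new)
o3d.io.write_point_cloud("new_scan_utm.pcd", new_scan)

In the thesis's ACM pipeline the earth-referenced pose comes from the UKF trajectory and each registered cloud is additionally pivoted about a calibrated stationary robot pose; the sketch above only shows the simpler chaining of a scan-to-keyscan registration with a georeferenced key-scan pose.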
dc.identifier.uri: https://hdl.handle.net/10092/100762
dc.identifier.uri: http://dx.doi.org/10.26021/962
dc.language: English
dc.language.iso: en
dc.publisher: University of Canterbury
dc.rights: All Rights Reserved
dc.rights.uri: https://canterbury.libguides.com/rights/theses
dc.title: Towards accurate earth-referenced LiDAR mapping with an autonomous mobile robot.
dc.type: Theses / Dissertations
thesis.degree.discipline: Mechanical Engineering
thesis.degree.grantor: University of Canterbury
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy
uc.bibnumber: 2940646
uc.college: Faculty of Engineering
Files
Original bundle: Young, Matthew_Final PhD Thesis.pdf (58.77 MB, Adobe Portable Document Format)
License bundle: license.txt (1.71 KB, Item-specific license agreed upon to submission)