Visual-inertial odometry for windblown vegetated environments.

dc.contributor.authorSchofield, Sam
dc.date.accessioned2023-09-14T02:34:11Z
dc.date.available2023-09-14T02:34:11Z
dc.date.issued2023
dc.description.abstractAerial manipulation (performing interaction tasks with unmanned aerial vehicles) can revolutionise the forestry and arboriculture industries by automating tree pruning and similar tasks. A major challenge currently preventing aerial manipulation from being used in these industries is the inability to accurately estimate the UAV's pose (position and orientation) in the target environments, specifically those with abundant windblown vegetation. Standard methods of UAV pose estimation, such as GPS, do not provide enough precision to allow aerial manipulation, while more precise methods, like visual odometry (VO), do not work reliably in environments dominated by such scene motion. This work takes several steps towards remedying the latter problem. First, a novel approach was proposed for aligning a motion-capture rigid body with the optical centre of a camera. The proposed method was more accurate than the existing approach and removed the need for a specially prepared calibration target. This work enabled more accurate ground truth trajectories to be captured and was also used when lab-testing a tree-pruning UAV that used a combination of camera and motion capture data for navigation. Next, a novel analysis of the state-estimation requirements of UAVs was performed to determine how existing VO algorithms would need to be improved to enable aerial manipulation in windblown vegetated environments. Preliminary experiments, completed in simulation and then verified using a real UAV, showed that VO accuracy was not a bottleneck to UAV position-hold performance in normal operating conditions. This result directed later research to focus on improving the robustness of VO in dynamic vegetated environments (such as forests) instead of trying to improve VO precision further.
Several state-of-the-art visual-inertial odometry (VIO) algorithms were evaluated in scenes containing moving vegetation to determine whether existing algorithms met the desired robustness requirements. A preliminary analysis was performed using real data, followed by a more in-depth evaluation using a semi-synthetic dataset (consisting of real IMU data and synthetic images). These experiments confirmed the assumption that existing VIO algorithms were not suitable for use onboard a UAV performing manipulation tasks in a windblown forest. While generating the aforementioned semi-synthetic dataset, a major flaw in existing methods for producing semi-synthetic data was discovered and addressed. A detailed comparison showed that VIO algorithms that were evaluated using data generated with the proposed approach performed significantly (79%) better than when evaluated using data generated with existing methods. Finally, a novel outlier detection method was proposed that utilised the difference in nature between camera-induced and scene-motion-induced optical flow. Experiments showed that, when used as a pre-processing step to RANSAC, the proposed method significantly improved visual odometry accuracy (up to 20%) with minimal computational overhead in windblown forest environments.
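The pre-RANSAC filtering idea described above can be illustrated with a minimal sketch: camera ego-motion tends to induce spatially coherent optical flow across the image, whereas windblown vegetation produces locally inconsistent flow, so feature tracks whose flow disagrees strongly with the dominant direction can be discarded before geometric estimation. This is a generic angular-consistency heuristic written for illustration only (the function name, threshold, and test data are assumptions); it is not the thesis's actual outlier-detection method.

```python
import numpy as np

def prefilter_flow(flow, angle_tol_deg=30.0):
    """Keep flow vectors roughly consistent with the dominant flow direction.

    Illustrative heuristic only: the thesis uses a more principled model of
    camera- vs scene-motion-induced flow. `flow` is an (N, 2) array of
    optical-flow vectors for tracked features; returns a boolean inlier mask
    that could be used to drop features before running RANSAC.
    """
    # Estimate the dominant flow direction from the per-axis median vector,
    # which is robust to a minority of vegetation-induced outliers.
    dominant = np.median(flow, axis=0)
    dom_angle = np.arctan2(dominant[1], dominant[0])
    angles = np.arctan2(flow[:, 1], flow[:, 0])
    # Wrap the angular difference into [-pi, pi] via the complex exponential.
    diff = np.angle(np.exp(1j * (angles - dom_angle)))
    return np.abs(diff) < np.deg2rad(angle_tol_deg)

# Hypothetical data: camera motion induces mostly rightward flow; the last
# two vectors mimic windblown points moving against the dominant direction.
flow = np.array([[1.0, 0.1], [0.9, -0.1], [1.1, 0.0], [-0.8, 0.9], [0.1, -1.0]])
mask = prefilter_flow(flow)
```

A real pipeline would apply the surviving mask to the feature correspondences fed into essential-matrix RANSAC; the benefit claimed in the abstract is that removing scene-motion outliers up front reduces the chance of RANSAC converging on a vegetation-dominated motion hypothesis.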
dc.identifier.urihttps://hdl.handle.net/10092/106173
dc.identifier.urihttps://doi.org/10.26021/15047
dc.languageEnglish
dc.language.isoen
dc.rightsAll Rights Reserved
dc.rights.urihttps://canterbury.libguides.com/rights/theses
dc.titleVisual-inertial odometry for windblown vegetated environments.
dc.typeTheses / Dissertations
thesis.degree.disciplineComputer Science
thesis.degree.grantorUniversity of Canterbury
thesis.degree.levelDoctoral
thesis.degree.nameDoctor of Philosophy
uc.collegeFaculty of Engineering
Files
Original bundle
Name: Schofield, Sam_Final PhD Thesis.pdf
Size: 70.51 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format:
Description: Item-specific license agreed upon to submission