Intense Navigation: Using Active Sensor Intensity Observations To Improve Localization and Mapping


Authors

Hewitt, Robert

Type

thesis

Language

eng

Keyword

Robotics, Time-of-Flight, Localization, Mapping, Visual Odometry, SLAM, LiDAR, Navigation

Abstract

Where am I? This question continues to be one of the fundamental challenges posed in robotics research. The ability of a robot to localize itself and map its environment has proven to be a difficult and rich research problem. While significant progress has been made, it remains a difficult task to perform in dynamic, 3D environments over long distances. Stereo cameras are a proven workhorse for Visual Odometry (VO) and three-dimensional Simultaneous Localization and Mapping (SLAM), but they require reliable lighting conditions and matching regions in both images. Light Detection and Ranging (LiDAR) sensors provide an alternative: they are lighting-invariant, provide dense depth information directly, and return intensity information that resembles grayscale camera images. In many cases where lighting is unavailable or inconsistent, such as underground mining or planetary exploration, LiDAR is particularly suited to the task of localization. Both VO and SLAM can use a type of nonlinear optimization called bundle adjustment (BA) to solve for the optimal sensor pose and landmark positions given a set of matched observations at two or more separate poses. This thesis develops a version of BA, called IntenseBA. The algorithm estimates a map of landmarks, augmenting the standard three-dimensional point landmark with surface normal and reflectivity states. Because LiDAR intensity observations depend on the sensor pose and these landmark states, the algorithm can use observations of the landmarks to probabilistically determine the most likely estimate of the sensor pose and landmarks. The problem is shown to be observable in all states, and its sensitivity to noise in each observation is analyzed through simulation. A calibration procedure and an analysis of modern keypoint algorithms are presented, which allow the theoretical model to be applied to real data from a SwissRanger SR4000 Time-of-Flight (ToF) camera.
Experiments were conducted in the European Space Agency's Planetary Utilisation Testbed, which emulates a Martian terrain. These experiments tested the IntenseBA algorithm (used to perform VO) and show that it can accurately estimate all states and improve accuracy over traditional and state-of-the-art approaches by incorporating these additional observations.
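To illustrate the idea the abstract describes, the following is a minimal sketch of an intensity observation model of the kind IntenseBA could use in its residuals: predicted intensity depends on the range to the landmark, the angle between the beam and the surface normal, and the landmark's reflectivity. The Lambertian cosine term and the 1/r² falloff here are common modeling assumptions, not the thesis's exact formulation; function names and parameters are hypothetical.

```python
import numpy as np

def predict_intensity(sensor_pos, landmark_pos, normal, reflectivity):
    """Predicted LiDAR return intensity for one landmark (assumed model:
    Lambertian reflection with inverse-square range falloff)."""
    beam = landmark_pos - sensor_pos
    r = np.linalg.norm(beam)                 # range to the landmark
    beam_dir = beam / r                      # unit beam direction
    cos_theta = max(0.0, float(-beam_dir @ normal))  # incidence-angle term
    return reflectivity * cos_theta / r**2

def intensity_residual(measured, sensor_pos, landmark_pos, normal, reflectivity):
    """Residual a nonlinear least-squares solver (e.g. Gauss-Newton in BA)
    would minimize jointly with the usual geometric reprojection terms."""
    return measured - predict_intensity(sensor_pos, landmark_pos,
                                        normal, reflectivity)
```

In a BA framework these intensity residuals would be stacked alongside the geometric ones, coupling the pose estimate to the augmented surface-normal and reflectivity states.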


License

Queen's University's Thesis/Dissertation Non-Exclusive License for Deposit to QSpace and Library and Archives Canada
ProQuest PhD and Master's Theses International Dissemination Agreement
Intellectual Property Guidelines at Queen's University
Copying and Preserving Your Thesis
This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
