Contributions to Real-time Metric Localisation with Wearable Vision Systems

Daniel Gutierrez


With the rapid development of electronics and computer science in recent years, cameras have become omnipresent, to the extent that almost everybody carries one at all times, embedded in their mobile phone. What makes cameras especially appealing is their ability to quickly capture a large amount of information about the environment, encoded in a single image or video, allowing us to immortalise special moments in our lives or to share reliable visual information about the environment with other people. However, while extracting information from an image may be trivial for us, computers require complex algorithms with a high computational burden to transform a raw image into useful information. In this sense, the same rapid development in computer science that enabled the widespread adoption of cameras has also made the real-time application of previously infeasible algorithms possible.

Among the current fields of research in the computer vision community, this thesis is especially concerned with metric localisation and mapping algorithms. These algorithms are a key component of many practical applications, such as robot navigation, augmented reality, and 3D reconstruction of the environment.

The goal of this thesis is to delve into visual localisation and mapping, paying special attention to conventional and unconventional cameras which can be easily worn or handled by a human. In this thesis I contribute to the following aspects of visual odometry and SLAM (Simultaneous Localisation and Mapping):

- Generalised monocular SLAM for central catadioptric cameras

- Resolution of the scale problem in monocular vision

- Dense RGB-D odometry

- Robust place recognition

- Pose-graph optimisation
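To give a flavour of the last item, pose-graph optimisation corrects accumulated odometry drift by finding the set of poses that best satisfies all relative-motion constraints, including loop closures, in a least-squares sense. The following is a minimal illustrative sketch, not code from the thesis: a 1D pose graph with two odometry edges and one conflicting loop-closure edge, solved with a single Gauss-Newton step (exact here because the problem is linear). All variable names and measurement values are invented for the example.

```python
import numpy as np

# Three poses on a line. Odometry claims each step is 1.0 m, but a
# loop-closure measurement between pose 0 and pose 2 reports 2.1 m.
# Least squares spreads the 0.1 m discrepancy across the edges.
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.1)]  # (from, to, measured offset)
n = 3

x = np.array([0.0, 1.0, 2.0])  # initial guess: raw odometry

# Build the Gauss-Newton system J^T J dx = -J^T r,
# with an extra prior row anchoring pose 0 (removes the gauge freedom).
J = np.zeros((len(edges) + 1, n))
r = np.zeros(len(edges) + 1)
for k, (i, j, z) in enumerate(edges):
    r[k] = (x[j] - x[i]) - z      # residual of edge k
    J[k, i], J[k, j] = -1.0, 1.0  # derivative of the residual w.r.t. poses
J[-1, 0] = 1.0                    # prior: pose 0 stays at the origin
r[-1] = x[0]

dx = np.linalg.solve(J.T @ J, -J.T @ r)
x += dx
print(x)  # corrected poses; the 0.1 m conflict is shared among the edges
```

In real SLAM systems the poses live on SE(2) or SE(3), the residuals are nonlinear, and the sparse normal equations are solved iteratively, but the structure of the problem is the same.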


Keywords: Computer Vision; Motion, Tracking, Video Analysis; 3D and Stereo

Copyright (c) 2016 Daniel Gutierrez