Monocular Localization in Feature-Annotated 3D Polygon Maps

Abstract

Localization in six degrees of freedom is becoming increasingly relevant, especially in indoor environments where GPS is unavailable. To localize autonomous vehicles such as UAVs in these areas, reliable self-localization methods using lightweight sensors are required. In this paper, we present an approach to precisely localize systems with monocular cameras in polygonal 3D maps annotated with keypoints and feature descriptors computed from LiDAR data and associated reference images. Our contribution consists of offline map computation from high-resolution 3D point clouds and corresponding reference images, as well as online localization within these maps using low-cost sensors. During localization, features extracted from the vehicle’s camera image stream are matched against the reference map. The proposed method runs in real time and is suitable for precise global localization. The evaluation shows results comparable to the state of the art, with high re-localization accuracy.
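
The abstract does not give implementation details; as a rough illustration of the online step it describes (matching camera-image features against a keypoint- and descriptor-annotated map, then recovering a 6-DoF pose), the sketch below assumes an OpenCV-style pipeline with ORB descriptors, a map stored as 3D keypoint positions plus binary descriptors, and known camera intrinsics. The function name localize_frame and all parameters are hypothetical and not taken from the paper.

```python
import numpy as np
import cv2

def localize_frame(frame_gray, map_points_3d, map_descriptors, K, dist_coeffs):
    """Estimate a 6-DoF camera pose for one frame against a feature-annotated map.

    map_points_3d:   (N, 3) array of 3D keypoint positions from the reference map
    map_descriptors: (N, 32) uint8 array of binary descriptors stored with the map
    K, dist_coeffs:  intrinsics and distortion coefficients of the onboard camera
    """
    # Detect and describe features in the live camera image (ORB as a placeholder).
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None

    # Match image descriptors against the descriptors stored in the map.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, map_descriptors)
    if len(matches) < 6:
        return None

    # Collect 2D-3D correspondences from the matches.
    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # Robustly estimate the pose with PnP + RANSAC to reject outlier matches.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_pts, image_pts, K, dist_coeffs,
        reprojectionError=3.0, iterationsCount=100)
    if not ok:
        return None

    # Convert to a 4x4 map-to-camera transform.
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T
```

In such a pipeline, the RANSAC threshold and the descriptor type would be tuned to the map and camera used; the values above are placeholders, not parameters reported by the authors.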

Publication
Proc. 10th European Conference on Mobile Robots (ECMR), Bonn, Germany