Abstract
Elevation maps are a popular data structure for representing the environment of a mobile robot operating outdoors or on non-flat surfaces. An elevation map stores, in each cell of a discrete grid, the height of the surface at the corresponding place in the environment. However, this 2½-dimensional representation is disadvantageous for mapping with mobile robots operating on the ground, since vertical or overhanging objects cannot be represented appropriately. Furthermore, such objects can lead to registration errors when two elevation maps have to be matched. In this paper, an approach is proposed that allows a mobile robot to deal with vertical and overhanging objects in elevation maps. The approach classifies the points in the environment according to whether or not they correspond to such objects. Also presented is a variant of the ICP algorithm that utilizes this classification of cells during data association. Additionally, it is shown how the constraints computed by the ICP algorithm can be applied to determine globally consistent alignments. Experiments carried out with a real robot in an outdoor environment demonstrate that the proposed approach yields highly accurate elevation maps, even in the case of loops, and that the proposed classification increases the robustness of the scan-matching process.
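The abstract does not give implementation details, so the following Python sketch is only illustrative: the cell classes, the gap/variance thresholds, and the function names (`build_elevation_map`, `associate`, `gap_thresh`, `var_thresh`) are all assumptions, not the paper's actual method. It shows the two ideas the abstract describes: labeling grid cells that contain vertical or overhanging structure, and restricting ICP-style data association to cells of the same class.

```python
import math
from statistics import pvariance

# Hypothetical cell classes; the paper's actual taxonomy may differ.
CELL_FLAT, CELL_VERTICAL, CELL_OVERHANG = 0, 1, 2

def build_elevation_map(points, resolution=0.25, gap_thresh=1.0, var_thresh=0.05):
    """Accumulate 3D points (x, y, z) into a 2.5D grid and label each cell.

    Returns two dicts keyed by (i, j) cell index: surface height and class.
    Thresholds are illustrative assumptions, not values from the paper.
    """
    cells = {}
    for x, y, z in points:
        key = (math.floor(x / resolution), math.floor(y / resolution))
        cells.setdefault(key, []).append(z)

    heights, labels = {}, {}
    for key, zs in cells.items():
        zs.sort()
        heights[key] = zs[-1]  # store the topmost measurement as the height
        max_gap = max((b - a for a, b in zip(zs, zs[1:])), default=0.0)
        if max_gap > gap_thresh:
            # A large vertical gap between measurements in one cell suggests
            # an overhanging structure (e.g. a tree canopy or bridge).
            labels[key] = CELL_OVERHANG
        elif len(zs) > 1 and pvariance(zs) > var_thresh:
            # A wide spread of heights suggests a vertical surface (a wall).
            labels[key] = CELL_VERTICAL
        else:
            labels[key] = CELL_FLAT
    return heights, labels

def associate(labels_a, labels_b, max_dist=2):
    """Class-aware data association sketch: a cell of map A is paired only
    with the nearest map-B cell of the SAME class, mirroring the idea of
    using the classification during the ICP correspondence search."""
    pairs = []
    for key_a, label in labels_a.items():
        best, best_d = None, max_dist
        for key_b, label_b in labels_b.items():
            if label_b != label:
                continue  # never match a wall cell against an overhang cell
            d = abs(key_a[0] - key_b[0]) + abs(key_a[1] - key_b[1])
            if d < best_d:
                best, best_d = key_b, d
        if best is not None:
            pairs.append((key_a, best))
    return pairs
```

Rejecting correspondences between differently classified cells is one plausible way the classification could make scan matching more robust: a wall cell in one map can no longer be pulled toward a flat ground cell in the other.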
