Abstract
In this paper, we present an approach to the problem of actively controlling the configuration of a team of mobile agents equipped with cameras so as to optimize the quality of the estimates derived from their measurements. The issue of optimizing the robots' configuration is particularly important in the context of teams equipped with vision sensors, since most estimation schemes of interest will involve some form of triangulation.
We provide a theoretical framework for tackling the sensor planning problem, and a practical computational strategy inspired by work on particle filtering for implementing the approach. We then extend our framework by showing how modeled system dynamics and configuration space obstacles can be handled. These ideas have been applied to a target tracking task, and demonstrated both in simulation and with actual robot platforms. The results indicate that the framework is able to solve fairly difficult sensor planning problems online without requiring excessive amounts of computational resources.
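As an illustration of the kind of estimation machinery the abstract refers to, the sketch below shows a minimal bootstrap particle filter that localizes a moving target from bearing (triangulation) measurements taken by two cameras. This is not the authors' algorithm; all names, noise parameters, and the random-walk motion model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bearing(cam, target):
    """Bearing (radians) from a camera position to the target."""
    d = target - cam
    return np.arctan2(d[1], d[0])

def particle_filter_step(particles, weights, cams, z,
                         sigma_motion=0.1, sigma_bearing=0.05):
    """One predict/update/resample cycle of a bootstrap particle filter.

    particles : (N, 2) candidate target positions
    weights   : (N,) normalized importance weights
    cams      : list of camera positions, each shape (2,)
    z         : one bearing measurement per camera
    """
    # Predict: propagate particles through a random-walk motion model.
    particles = particles + rng.normal(0.0, sigma_motion, particles.shape)

    # Update: reweight by the likelihood of each observed bearing.
    for cam, zi in zip(cams, z):
        pred = np.arctan2(particles[:, 1] - cam[1], particles[:, 0] - cam[0])
        err = np.angle(np.exp(1j * (zi - pred)))  # wrapped angular error
        weights = weights * np.exp(-0.5 * (err / sigma_bearing) ** 2)
    weights = weights / weights.sum()

    # Resample: multinomial resampling to combat weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Two cameras observe a stationary target; the particle cloud
# concentrates near the triangulated position.
cams = [np.array([0.0, 0.0]), np.array([10.0, 0.0])]
target = np.array([5.0, 5.0])
particles = rng.uniform(0.0, 10.0, (2000, 2))
weights = np.full(2000, 1.0 / 2000)
for _ in range(20):
    z = [bearing(c, target) + rng.normal(0.0, 0.05) for c in cams]
    particles, weights = particle_filter_step(particles, weights, cams, z)
estimate = particles.mean(axis=0)
```

Because the bearing likelihood depends on camera placement, the spread of the resulting particle cloud is one natural quality measure a sensor planner could optimize by moving the cameras.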