Abstract
Most approaches to controlling autonomous systems require extensive pre-mission preparation or intensive effort by the human operator, or impose strong limitations on the range of missions that can be accomplished. In this paper we describe an approach called Cognitive Patterns that promises to alleviate these challenges by replicating three key processes of human cognition—pattern generation, perception/action, and adaptation—and instantiating them in a new architecture that can then be embedded into an autonomous system. An early version of this approach connected high-level knowledge representations in an ontology with a robot's sensing and acting abilities, and its advantages were demonstrated in a simulation environment. A more refined version, called Cognitive Patterns Knowledge Generation and based on lessons learned, can deal with anomalies, unexpected events, and uncertainties; it is described in terms of its components, their interactions, and their benefits.