Abstract
Automated data collection, assessment, and diagnosis capabilities are key requirements of an instructional framework that supports performance evaluation and debriefing of multiple teams participating in distributed simulation-based exercises. This paper discusses methods and issues in applying automated performance data collection and assessment capabilities as part of a prototype Debriefing Distributed Simulation-Based Exercises (DDSBE) system. The automated data collection process obtains data from local and distributed simulation systems and operator consoles to assess individual, team, and multi-team performance on training objectives during critical scenario events. The automated assessment component accesses expected performance data associated with those events and compares the actions observed within a specified time window against the set of expected responses. Performance is assessed at the multi-team, team, and individual levels as appropriate. Automated and observer-based semiautomated assessments are integrated into data products suitable for diagnostic analysis and for single- and multi-team debrief development. Preliminary data from an initial demonstration are discussed.
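The window-based assessment the abstract describes can be sketched in a few lines. This is not code from the DDSBE system; it is a minimal illustrative sketch, and all names (`Action`, `assess_event`, the actor and action labels) are hypothetical. It assumes each critical event defines a set of expected (actor, action) responses and a time window, and scores each expected response as met or unmet based on the observed action stream.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    actor: str   # individual or team identifier (hypothetical labels)
    name: str    # action name, e.g. "issue_warning"
    t: float     # timestamp in seconds

def assess_event(event_t, window, expected, observed):
    """Score one critical scenario event.

    expected: set of (actor, action-name) pairs the scenario calls for
    observed: list of Action records from the data-collection stream
    Returns a dict mapping each expected pair to True if an observed
    action matched it within [event_t, event_t + window], else False.
    """
    lo, hi = event_t, event_t + window
    seen = {(a.actor, a.name) for a in observed if lo <= a.t <= hi}
    return {pair: pair in seen for pair in expected}
```

Per-pair results like these could then be rolled up by actor to give individual, team, and multi-team scores for debrief products.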
