Abstract
Intelligence analysis is a high-stakes domain that poses challenges to effective individual and collaborative cognition. The design of support tools and analytical pedagogy could benefit from an understanding of how challenges reported in the broader decision-making literature generalize to, and are manifested in, more naturalistic settings. The objective of this research was to elicit challenges for cognition in collaborative intelligence analysis. Two complementary research methods were used: unstructured interviews with 46 analysts and supervisors, and observations of eight teams of military intelligence analysts conducting a training scenario. Interviews with designers, educators, and practitioners in the intelligence community revealed trends in unsupported cognitive work and cultural challenges, whereas observations from a training exercise for army intelligence analysts instantiated other cognitive challenges of collaborative analysis. This study indicates that analytical style (shaped partly by tradition and partly by individual reasoning tendencies) can result in premature narrowing, difficulty in reframing, and getting lost in the details. The study also illustrates the effects of friction within and across federated teams, how variable tempo can produce inexpert behavior, and considerations for the design of analytical support tools. This work suggests the value of complementary research methods for studying other domains involving collaborative work, and it is likely that these cognitive challenges affect other collaborative analysis domains as well. Finally, this study suggests that the effects of individual cognitive challenges are difficult to isolate in naturalistic settings and should most likely be considered collectively rather than independently.