Abstract
Influence operations (IOs) are coordinated efforts by an actor, whether an individual or a group, to interfere in the process of meaning-making in order to manipulate or corrupt public debate (e.g., Bergh, 2020), often by spreading dis- or misinformation. IOs are frequently conducted through social media and built around political, social, and/or 'hot-button' issues and narratives. While the spread and correction of misinformation is increasingly studied, less research examines IO mitigation using human participants and controlled experiments (rather than post hoc metrics). We conducted a lightly scoped literature review of IO research involving social media, which revealed emergent difficulties and challenges in designing and studying IOs, as well as paths toward improved experimentation grounded in human factors research.