Abstract
Objective
This experiment examined how the error bias of an imperfect automated decision aid affected trust and dependency behaviors in a simulated drone collision-avoidance task.
Background
Prior work on human-automation interaction indicates that error biases (misses versus false alarms) have asymmetrical effects on compliance and reliance. However, it is unclear whether this asymmetry arises from the unbalanced perceptual salience of the automation errors or from operators' trust in the automated system.
Method
Sixty-eight participants performed a drone monitoring task with the assistance of a collision-avoidance aid whose error bias varied (i.e., miss-prone vs. false-alarm-prone). Participants' automation trust ratings and dependency behaviors (i.e., compliance and reliance) were measured.
Results
With the two error biases made equally salient, participants showed similar decreases in trust across multiple factors of automation trust when interacting with unreliable aids. Compliance rates were higher when interacting with a miss-prone system than with a false-alarm-prone system, whereas reliance rates showed the opposite pattern.
Conclusion
Error bias systematically determines compliance and reliance behaviors. Saliency-matched false alarms and misses by automation degrade trust, potentially undermining the development of performance-based trust.
Application
Designers of automated systems should consider how different error types systematically affect dependency behaviors, so as to create transparent systems in which operators' trust is properly calibrated to the automation's capability.
