Abstract
Traditional approaches to human-machine interface (HMI) design focus on visual displays and manual inputs, but these do not exploit the full range of means by which humans can perceive and interact with their environment. For wearable computing systems, the selection of modalities depends greatly on proper consideration of human cognitive capabilities. The Multimodal Interface Research Platform (MIRP) is a wearable platform for evaluating task-relevant human performance by presenting information through three modalities: visual (via a head-mounted display), auditory (via earphones), and haptic (via four vibrating actuators on the shoulders). Within a predetermined task scenario, MIRP monitors and records the user's interactions with the system, collecting reaction time and a coarse accuracy measure of whether each message was understood. This enables observations about simple reaction time across different alert/message modalities, as well as inferences about their understandability.