Abstract
Introduction:
To date, brain–computer interfaces (BCIs) have not achieved reliable real-time communication through auditory or tactile modalities. Such interfaces would be crucial for brain-injured patients with severe motor impairments who are also blind or deaf. This study validates the functionality of the NeuroCommTrainer, a mobile and easy-to-use multimodal BCI with flex-printed electrode strips that does not require vision and adapts to users’ attentiveness levels to initiate stimulation.
Methods:
In a study of 20 healthy participants, we evaluated auditory and vibrotactile oddball paradigms to train the system to differentiate event-related potentials (ERPs) elicited by rare versus frequent stimuli. In real-time online sessions, the system monitored participants’ attentiveness and adaptively initiated stimulation when mental focus was detected.
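An oddball paradigm presents a stream of mostly frequent (standard) stimuli interspersed with occasional rare (target) stimuli whose ERPs the classifier must distinguish. The sketch below illustrates how such a stimulus sequence might be generated; the rare-stimulus probability, sequence length, and minimum gap between targets are illustrative assumptions, not parameters reported in the study.

```python
import random

def oddball_sequence(n_stimuli=300, rare_prob=0.2, min_gap=2, seed=0):
    """Generate an oddball stimulus sequence: mostly 'frequent' stimuli
    with occasional 'rare' targets, enforcing at least `min_gap`
    frequent stimuli between consecutive rare ones (a common constraint
    so target ERPs do not overlap)."""
    rng = random.Random(seed)
    sequence = []
    since_rare = min_gap  # allow a rare stimulus right at the start
    for _ in range(n_stimuli):
        if since_rare >= min_gap and rng.random() < rare_prob:
            sequence.append("rare")
            since_rare = 0
        else:
            sequence.append("frequent")
            since_rare += 1
    return sequence
```

In a calibration session, ERP epochs time-locked to each stimulus in such a sequence would then be labeled rare/frequent to train the classifier.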
Results:
The NeuroCommTrainer successfully captured auditory and tactile ERPs, achieving a stimulus classification accuracy of 75% in the calibration session; this performance was not yet matched in the online session, where 34% of targets were found (chance level = 16.7%).
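The 16.7% chance level corresponds to guessing one of six possible targets (1/6). Whether a 34% hit rate is significantly above chance depends on the number of online trials, which the abstract does not state; the sketch below shows how the comparison could be checked with an exact binomial tail probability, using a purely hypothetical trial count for illustration.

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): probability of observing at
    least k hits in n trials if each hit occurs with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers (not from the study): 50 online trials,
# 6 possible targets -> chance level 1/6 ~= 16.7%.
n_trials = 50
chance = 1 / 6
hits = round(0.34 * n_trials)  # 34% found targets

p_value = binom_sf(hits, n_trials, chance)
```

Under these assumed numbers the tail probability is small, i.e., a 34% hit rate would be unlikely to arise from guessing alone; the actual significance naturally depends on the study's real trial counts.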
Discussion:
The presented early-stage prototype of the NeuroCommTrainer requires several improvements before clinical application in brain-damaged patients, including refined algorithms to reduce classification variance across participants and attentiveness detection tuned specifically to the brain activity of the targeted patient group. The present study takes a critical step in this direction and shows that a transition into a practicable communication system for brain-damaged patients may be achievable in the future.
Impact Statement
Efforts to facilitate communication through brain–computer interfaces for patients with brain injuries have thus far achieved limited success. This is primarily due to the systems’ lack of adaptability and responsiveness to the users’ unique mental states and fluctuating arousal levels throughout the day. Our multimodal NeuroCommTrainer system combines long-term electroencephalography-based attentiveness monitoring with state-dependent stimulus delivery and neural response classification. We show that we can detect chosen targets and thus user responses significantly better than chance. Our system opens up an avenue for brain-damaged patients to eventually communicate yes/no answers via the BCI.
