Abstract
Any virtual reality (VR) system may be considered a bidirectional interface between a patient and effector devices. Most standard VR configurations present a computer-generated virtual environment to the patient, measure the patient's response, and then make usually rather stereotyped changes in the computer-generated presentation; the patient's behavior is thus molded to the computer's "conception" of reality. In contrast, techniques of quantitative patient videospace analysis developed in this laboratory provide for real-world input/output in patient and clinician interfaces. Monitoring of patient movement during epileptic seizures, with spatially oriented time-domain and spectral processing of video intensity data, leads to a mapping procedure that allows pattern recognition of ictal behaviors, providing the substrate for the clinician to observe the behavior virtually and with greater analytic detail. A similar technique, based on real-world videospace and applied to patients with movement disorders, produces signals that should be suitable for controlling haptic therapeutic and assistive devices. Close linking of trigger signals to video material presented to subjects in simulation training environments provides a methodology for monitoring cognitive responses through event-related evoked potentials. Such monitoring provides a basis for closing psychophysical feedback loops to increase the effectiveness of training paradigms and cognitive therapies. The technique also provides for bidirectional patient interaction by tying video displays to physiologic responses and to electrophysiologically measurable cognitive responses, thus suggesting enhanced modes of biofeedback and cognitive feedback approaches.
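The abstract's "spatially oriented time-domain and spectral processing of video intensity data" can be illustrated with a minimal sketch: divide each video frame into a spatial grid, track the mean pixel intensity of each region over time, and take the power spectrum of each regional signal to find its dominant movement frequency. This is an assumed, simplified rendering of the general idea, not the authors' actual pipeline; the function names and the grid parameterization are hypothetical.

```python
import numpy as np

def regional_intensity_series(frames, grid=(4, 4)):
    """Mean pixel intensity per spatial region for each video frame.

    frames: array of shape (n_frames, height, width), grayscale.
    Returns an array of shape (grid_rows, grid_cols, n_frames):
    one intensity time series per spatial region.
    """
    n, h, w = frames.shape
    gr, gc = grid
    series = np.empty((gr, gc, n))
    for i in range(gr):
        for j in range(gc):
            # Slice out one rectangular region and average its pixels.
            block = frames[:, i * h // gr:(i + 1) * h // gr,
                              j * w // gc:(j + 1) * w // gc]
            series[i, j] = block.mean(axis=(1, 2))
    return series

def dominant_movement_frequency(series, fps=30.0):
    """Dominant non-DC frequency (Hz) of each region's intensity signal.

    Rhythmic movement (e.g., ictal clonic activity) in a region modulates
    its intensity signal; the spectral peak estimates the movement rate.
    """
    detrended = series - series.mean(axis=-1, keepdims=True)
    power = np.abs(np.fft.rfft(detrended, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(series.shape[-1], d=1.0 / fps)
    # Skip the DC bin when locating the spectral peak.
    return freqs[power[..., 1:].argmax(axis=-1) + 1]
```

For example, a 3 Hz oscillation confined to one corner of the frame shows up as a 3 Hz spectral peak in that region's signal only, giving a spatial map of movement rhythms of the kind the mapping procedure builds on.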
