Abstract
Direction of self-motion during walking is indicated by multiple cues, including optic flow, non-visual sensory cues, and motor prediction. I measured the variability of walking direction with and without visual feedback, and tested whether visual and non-visual cues are weighted in a statistically optimal manner. Open-loop walking in an immersive virtual environment was used to assess the accuracy of perceived walking direction. Observers walked toward a target 4 m away either with no vision or with vision during the first 1 m of walking. Three simulated environments were tested: a target alone, a target with textured ground, and a target with textured ground and scattered posts. With no vision, variability in walking direction averaged 3 deg. Visual feedback during the initial movement reduced variability to about 1.5 deg, with no effect of visual environment. Based on these measures, an optimal estimator would strongly weight visual information. A second experiment measured the perceptual weighting of visual and non-visual cues. Optic flow specified a conflicting heading direction (±5 deg), and the bias in walking direction was used to infer cue weights. The observed visual weights were smaller than predicted (33–43% vs. 71%) and varied with the visual environment. Non-visual information thus had more influence than expected given the relative reliabilities of the cues. This is consistent with some studies of visually guided walking that have observed a limited role of optic flow in online control.
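To make the optimal-weighting prediction concrete, the following is a hedged sketch of the standard reliability-weighted cue-combination arithmetic, using the rounded variabilities quoted in the abstract; the exact 71% prediction presumably derives from unrounded, per-observer data, so the numbers here are illustrative only.

```latex
% Assume independent Gaussian cues with variances \sigma_v^2 (visual) and
% \sigma_n^2 (non-visual). With \sigma_n = 3 deg (no-vision condition) and
% combined variability \sigma_c = 1.5 deg, optimal combination implies
\frac{1}{\sigma_c^2} = \frac{1}{\sigma_v^2} + \frac{1}{\sigma_n^2}
\quad\Rightarrow\quad
\sigma_v^2 = \left(\frac{1}{1.5^2} - \frac{1}{3^2}\right)^{-1} = 3\ \text{deg}^2 .

% The predicted (optimal) visual weight is then
w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_n^2}
    = \frac{\sigma_n^2}{\sigma_v^2 + \sigma_n^2}
    = \frac{9}{3 + 9} = 0.75 ,

% close to the reported 71%. In the cue-conflict experiment, with a
% \pm 5 deg discrepancy, the empirical visual weight follows from the
% observed bias in walking direction:
w_v = \frac{\text{bias}}{5^{\circ}} .
```

Observed weights of 33–43% therefore correspond to walking biases of roughly 1.7–2.2 deg toward the optic-flow-specified direction, well short of the ~3.5 deg an optimal combiner would produce.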
