Abstract
The question of how channels tuned to different motion directions contribute to motion perception has been investigated by using motion adaptation to silence certain channels and then measuring performance in a fine motion-discrimination task. To help constrain models of how the channels become integrated, we examined whether changes in performance stem from reduced accuracy (bias) or from reduced precision (sensitivity) in direction judgments. On a given trial, the observer first adapted to a field of dots moving coherently in a given direction (ranging ±180° from upward), then judged whether the motion of an ensuing test stimulus (ranging ±3°) was left or right of the reference. Bias and sensitivity of the psychometric fits were computed for each adapter direction. Relative to baseline performance, post-adaptation judgments showed significant changes in sensitivity that were tightly correlated with overall performance, whereas bias shifts were weaker and less systematic. Both performance and sensitivity suffered the largest losses at ±60°, with some enhancement at 180°; no comparable trends emerged for bias. A regression model with precision as the sole predictor captured 97% of the variation in performance, and adding bias to the model yielded no gains. Our findings on fine motion discrimination question the idealized notion of a pure feature detector, as the main impact of adaptation in such a system would be to bias direction judgments away from the adapted direction.
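As a minimal sketch of how bias and sensitivity can be extracted from left/right judgments, the example below fits a cumulative-Gaussian psychometric function to simulated responses. All specifics here (the true parameter values, trial counts, and the ±3° range of test offsets) are illustrative assumptions, not the paper's actual procedure or data; the mean of the fit stands in for bias and the inverse of its spread for sensitivity.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical observer: small rightward bias, moderate precision
true_mu, true_sigma = 0.5, 1.2           # degrees (assumed for illustration)
offsets = np.linspace(-3, 3, 13)         # test directions relative to reference
n_trials = 200                           # simulated trials per offset

# Probability of a "rightward" response follows a cumulative Gaussian
p_right = norm.cdf(offsets, loc=true_mu, scale=true_sigma)
k_right = rng.binomial(n_trials, p_right)  # simulated response counts

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: mu = bias, sigma = inverse of sensitivity."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu_hat, sigma_hat), _ = curve_fit(
    psychometric, offsets, k_right / n_trials, p0=[0.0, 1.0]
)

bias = mu_hat                  # horizontal shift of the curve (accuracy)
sensitivity = 1.0 / sigma_hat  # steeper slope = higher precision
print(f"bias = {bias:.2f} deg, sensitivity = {sensitivity:.2f} per deg")
```

Comparing `bias` and `sensitivity` across adapter directions, against a no-adaptation baseline, is the kind of analysis the abstract describes: adaptation-induced changes in the slope term (precision) versus the shift term (accuracy).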