J Neurophysiol. 2011 Jan;105(1):200-8. doi: 10.1152/jn.00725.2009. Epub 2010 Nov 10.

Population anisotropy in area MT explains a perceptual difference between near and far disparity motion segmentation

Calabro FJ, Vaina LM.

Abstract

Segmentation of the visual scene into relevant object components is a fundamental process for successfully interacting with our surroundings. Many visual cues, including motion and binocular disparity, support segmentation, yet the mechanisms that use these cues are unclear. We used a psychophysical motion discrimination task in which noise dots were displaced in depth to investigate the role of segmentation through disparity cues in visual motion stimuli (experiment 1). We found a subtle but significant bias indicating that near-disparity noise disrupted the segmentation of motion more than equidistant far-disparity noise. A control experiment showed that the near-far difference could not be attributed to attention (experiment 2). To account for the near-far bias, we constructed a biologically constrained model using recordings from neurons in the middle temporal area (MT) to simulate human observers' performance on experiment 1. Performance of the model of MT neurons showed a near-disparity skew similar to that shown by human observers. To isolate the cause of the skew, we simulated performance of a model containing units derived from the properties of MT neurons, using phase-modulated Gabor disparity tuning. With a skewed-normal population distribution of preferred disparities, the model reproduced the elevated motion discrimination thresholds for near-disparity noise, whereas a skewed-normal population of phases (creating individually asymmetric units) did not lead to any performance skew. These results suggest that the properties of neurons in area MT are computationally sufficient to perform disparity segmentation during motion processing and produce disparity biases similar to those shown by human observers.

PMID: 21068268
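
To make the model components named in the abstract concrete, the sketch below is a minimal, hypothetical Python illustration of a phase-modulated Gabor disparity-tuning curve and a skewed-normal population distribution of preferred disparities. The function name, parameter values, and skew direction are assumptions chosen for illustration; they are not the fitted values or code reported in the paper.

```python
import numpy as np
from scipy import stats

def gabor_disparity_tuning(d, pref_disparity, sigma=0.4, freq=0.6, phase=0.0,
                           amplitude=1.0, baseline=0.2):
    """Phase-modulated Gabor disparity tuning curve (illustrative parameters only).

    d              : binocular disparity (deg); negative = near, positive = far
    pref_disparity : center of the Gaussian envelope (deg)
    sigma          : envelope width (deg)
    freq           : carrier frequency (cycles per deg of disparity)
    phase          : carrier phase (rad); a nonzero phase makes the curve asymmetric
    """
    envelope = np.exp(-(d - pref_disparity) ** 2 / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * freq * (d - pref_disparity) + phase)
    return baseline + amplitude * envelope * carrier

# Skewed-normal population of preferred disparities (hypothetical skew toward near).
n_units = 500
preferred = stats.skewnorm.rvs(a=-4, loc=0.0, scale=0.5, size=n_units,
                               random_state=np.random.default_rng(0))

# Mean drive of the model population to stimuli placed at each disparity,
# from near (-) to far (+); an asymmetric preferred-disparity distribution
# yields asymmetric population responses to near versus far noise.
disparities = np.linspace(-1.5, 1.5, 301)
responses = np.array([gabor_disparity_tuning(disparities, p) for p in preferred])
population_drive = responses.mean(axis=0)
```

Under these assumptions, skewing the distribution of preferred disparities changes the population response to near versus far noise even when each unit's tuning curve is symmetric, whereas skewing the phases alters individual curves without necessarily biasing the population readout, which is the contrast the abstract describes.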