Humans are remarkably proficient at distinguishing symmetric from non-symmetric visual patterns, yet the neural mechanisms underlying this ability remain unclear. Here we examine symmetry perception along a dimension that can help constrain the nature of these mechanisms. Specifically, we study whether and how human performance in classifying patterns as bilaterally symmetric versus non-symmetric changes as a function of the spatial separation between the flanks. Using briefly flashed stimuli with flank separations of 6 to 54 degrees, we find that classification performance declines significantly with increasing inter-flank distance but remains well above chance even at the largest separations. Response time increases progressively as the separation between the flanks grows. Baseline studies show that these performance changes cannot be attributed solely to reduced acuity in the visual periphery or to increased conduction times for relaying information from those locations. The findings argue for adapting current feedforward models of symmetry perception to better accord with the empirical data, and point to the possible involvement of recurrent processing, as suggested by recent computational results.
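To make the two-flank classification task concrete, the following is a minimal sketch of how such stimuli might be generated. The abstract does not specify the stimulus type, so dot patterns, the patch size (`flank_width_deg`), the dot count, and the function name `make_flank_pattern` are all illustrative assumptions; only the mirror-symmetry manipulation and the variable flank separation come from the description above.

```python
import numpy as np

def make_flank_pattern(separation_deg, n_dots=30, flank_width_deg=4.0,
                       symmetric=True, rng=None):
    """Return (x, y) dot coordinates, in degrees of visual angle, for a
    two-flank display centered on fixation. If `symmetric`, the right
    flank mirrors the left about the vertical midline (x = 0); otherwise
    the two flanks are drawn independently. (Hypothetical stimulus
    parameters; not taken from the paper.)"""
    rng = np.random.default_rng() if rng is None else rng
    half_sep = separation_deg / 2.0
    # Left flank: dots scattered in a square patch centered at -half_sep.
    left_x = rng.uniform(-half_sep - flank_width_deg / 2,
                         -half_sep + flank_width_deg / 2, n_dots)
    left_y = rng.uniform(-flank_width_deg / 2, flank_width_deg / 2, n_dots)
    if symmetric:
        # Reflect the left flank about the vertical axis through fixation.
        right_x, right_y = -left_x, left_y.copy()
    else:
        # Independent right flank: same patch location, uncorrelated dots.
        right_x = rng.uniform(half_sep - flank_width_deg / 2,
                              half_sep + flank_width_deg / 2, n_dots)
        right_y = rng.uniform(-flank_width_deg / 2, flank_width_deg / 2,
                              n_dots)
    return np.concatenate([left_x, right_x]), np.concatenate([left_y, right_y])

# Example: one symmetric and one non-symmetric trial at the largest
# separation reported in the abstract (54 degrees).
sym_x, sym_y = make_flank_pattern(54.0, symmetric=True)
asym_x, asym_y = make_flank_pattern(54.0, symmetric=False)
```

Under this construction, the informative correspondence between dots lies entirely across the inter-flank gap, which is what makes the separation manipulation a probe of long-range spatial integration.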
Keywords: Bilateral symmetry; Computational models; Spatial integration.