Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data

PLoS One. 2016 Jan 19;11(1):e0146500. doi: 10.1371/journal.pone.0146500. eCollection 2016.

Abstract

Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model, e.g. a Gaussian or another bell-shaped curve, to the responses measured for a small subset of discrete stimuli in the relevant dimension. However, because neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine the appropriate model reliably from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well fitted, that the best model generally varies between neurons, and that statistical comparisons of neuronal responses across different experimental conditions are affected both quantitatively and qualitatively by the specific model choice. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without fitting any model. In our attentional datasets, we demonstrate that these data-driven methods provide descriptions of tuning curve features, such as preferred stimulus direction or attentional gain modulation, that agree with fit-based approaches whenever a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus-specific. Based on these proofs of concept, we conclude that our data-driven methods can reliably extract relevant tuning information from neuronal recordings, including cells whose seemingly haphazard response curves defy conventional fitting approaches.
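
To illustrate the contrast the abstract draws between fit-based and model-free estimation, the sketch below compares a Gaussian tuning-curve fit with a simple vector-average (circular mean) estimate of a neuron's preferred motion direction from sparse, noisy responses. This is a minimal, hypothetical example, not the authors' implementation: the simulated data, function names, and the particular model-free estimator are assumptions chosen for illustration only.

```python
# Minimal sketch (not the paper's code): fit-based vs. model-free estimate
# of a preferred motion direction from sparse, noisy responses.

import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Probed stimulus directions (deg) and a simulated "ground-truth" neuron.
directions = np.arange(0, 360, 30)                 # 12 discrete directions
true_pref, true_width, baseline, gain = 160.0, 40.0, 5.0, 20.0

def wrapped_gauss(theta, pref, width, base, amp):
    """Gaussian tuning curve evaluated on circular angular distance."""
    d = np.angle(np.exp(1j * np.deg2rad(theta - pref)))   # wrap to [-pi, pi]
    return base + amp * np.exp(-0.5 * (np.rad2deg(d) / width) ** 2)

# Trial-averaged responses corrupted by spiking-like (Poisson) noise.
responses = rng.poisson(
    wrapped_gauss(directions, true_pref, true_width, baseline, gain)
).astype(float)

# --- Fit-based estimate: least-squares Gaussian fit ---------------------
p0 = [directions[np.argmax(responses)], 30.0,
      responses.min(), np.ptp(responses)]
popt, _ = curve_fit(wrapped_gauss, directions, responses, p0=p0, maxfev=10000)
pref_fit = popt[0] % 360

# --- Model-free estimate: circular vector average of the raw responses --
phases = np.deg2rad(directions)
vector = np.sum((responses - responses.min()) * np.exp(1j * phases))
pref_modelfree = np.rad2deg(np.angle(vector)) % 360

print(f"true preferred direction   : {true_pref:.1f} deg")
print(f"Gaussian-fit estimate      : {pref_fit:.1f} deg")
print(f"model-free vector estimate : {pref_modelfree:.1f} deg")
```

The vector-average estimator needs no assumption about the functional form of the tuning curve, which is the practical advantage the abstract emphasizes when no single model clearly fits a neuron's responses; the paper's actual model-free feature extraction is more elaborate than this toy comparison.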

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Artifacts*
  • Attention / physiology*
  • Fixation, Ocular / physiology
  • Limit of Detection
  • Macaca mulatta
  • Male
  • Models, Neurological
  • Normal Distribution
  • Orientation / physiology
  • Photic Stimulation*
  • Statistics as Topic / methods*
  • Statistics as Topic / standards
  • Visual Cortex / physiology*

Grants and funding

This research was supported by the Volkswagen Foundation (www.volkswagenstiftung.de), grant I/79868, by the Bernstein Center of Computational Neuroscience Goettingen (www.bccn-goettingen.de), grants 01GQ0433 and 01GQ1005C of the Bundesministerium fuer Bildung und Forschung (BMBF, www.bmbf.de), and by the German Research Foundation (DFG, www.dfg.de) Collaborative Research Center 889 “Cellular Mechanisms of Sensory Processing”. DB was supported by the Marie Curie career development fellowship (http://ec.europa.eu/research/mariecurieactions/) FP7-IEF 330792 (“DynViB”).