A Stable Visual World in Primate Primary Visual Cortex

Adam P. Morris, Bart Krekelberg

Research output: Contribution to journal › Article › Research › peer-review

Abstract

Humans and other primates rely on eye movements to explore visual scenes and to track moving objects. As a result, the image that is projected onto the retina—and propagated throughout the visual cortical hierarchy—is almost constantly changing and makes little sense without taking into account the momentary direction of gaze. How is this achieved in the visual system? Here, we show that in primary visual cortex (V1), the earliest stage of cortical vision, neural representations carry an embedded “eye tracker” that signals the direction of gaze associated with each image. Using chronically implanted multi-electrode arrays, we recorded the activity of neurons in area V1 of macaque monkeys during tasks requiring fast (exploratory) and slow (pursuit) eye movements. Neurons were stimulated with flickering, full-field luminance noise at all times. As in previous studies, we observed neurons that were sensitive to gaze direction during fixation, despite comparable stimulation of their receptive fields. We trained a decoder to translate neural activity into metric estimates of gaze direction. This decoded signal tracked the eye accurately not only during fixation but also during fast and slow eye movements. After a fast eye movement, the eye-position signal arrived in V1 at approximately the same time as the new visual information arriving from the retina. Using simulations, we show that this V1 eye-position signal could be used to take into account the sensory consequences of eye movements and map the fleeting positions of objects on the retina onto their stable position in the world.

Visual input arrives as a series of snapshots, each taken from a different line of sight, due to eye movements from one part of a scene to another. How do we nevertheless see a stable visual world? Morris and Krekelberg show that in primary visual cortex, the neural representation of each snapshot includes “metadata” that tracks gaze direction.
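The abstract's core computation—reading gaze direction out of V1 population activity and using it to convert retinal coordinates into world coordinates—can be illustrated with a minimal sketch. The code below is not the paper's actual decoder or data: the synthetic "gain-field" responses, the array size, the noise levels, and the linear least-squares readout are all illustrative assumptions.

```python
# Minimal sketch of the decoding idea in the abstract:
# (1) learn a mapping from V1 population activity to gaze direction,
# (2) use the decoded gaze to remap retinal positions into world
#     coordinates (world = retina + gaze).
# All data are synthetic; the linear readout is an assumption, not
# the study's method.
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_trials = 96, 500                      # e.g., one array's worth of units
gaze = rng.uniform(-10, 10, size=(n_trials, 2))    # true (x, y) gaze, in degrees

# Synthetic gain-field responses: each neuron's rate depends linearly on
# eye position plus noise (a common first-order model of gaze tuning).
gains = rng.normal(size=(2, n_neurons))
baseline = rng.uniform(10, 30, size=n_neurons)
rates = baseline + gaze @ gains + rng.normal(scale=2.0, size=(n_trials, n_neurons))

# Train: ordinary least squares from rates (plus an intercept) to gaze.
X = np.column_stack([rates, np.ones(n_trials)])
W, *_ = np.linalg.lstsq(X, gaze, rcond=None)

# Decode gaze on held-out trials.
gaze_test = rng.uniform(-10, 10, size=(100, 2))
rates_test = baseline + gaze_test @ gains + rng.normal(scale=2.0, size=(100, n_neurons))
gaze_hat = np.column_stack([rates_test, np.ones(100)]) @ W
print("median decoding error (deg):",
      np.median(np.linalg.norm(gaze_hat - gaze_test, axis=1)))

# Remap: a fixed object in the world lands at different retinal positions
# as the eye moves; adding the decoded gaze recovers a stable estimate.
world_obj = np.array([5.0, -3.0])       # fixed object location in the world
retinal_obj = world_obj - gaze_test     # its retinal position on each trial
world_hat = retinal_obj + gaze_hat      # stable across eye movements
print("median remapping error (deg):",
      np.median(np.linalg.norm(world_hat - world_obj, axis=1)))
```

The last step is the point of the sketch: because the decoded gaze signal travels with each retinal "snapshot," adding it to an object's retinal position yields a world-centered estimate that stays stable across eye movements.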

Original language: English
Pages (from-to): 1471-1480
Number of pages: 10
Journal: Current Biology
Volume: 29
Issue number: 9
DOI: 10.1016/j.cub.2019.03.069
Publication status: Published - 6 May 2019

Keywords

  • computation
  • electrophysiology
  • eye position
  • population coding
  • primary visual cortex
  • vision

Cite this

Morris, A. P., & Krekelberg, B. (2019). A Stable Visual World in Primate Primary Visual Cortex. Current Biology, 29(9), 1471-1480. https://doi.org/10.1016/j.cub.2019.03.069