Neural markers of predictive coding under perceptual uncertainty revealed with hierarchical frequency tagging

Noam Gordon, Roger Koenig-Robert, Naotsugu Tsuchiya, Jeroen JA van Boxtel, Jakob Hohwy

Research output: Contribution to journal › Article › Research › peer-review

31 Citations (Scopus)


There is a growing understanding that both top-down and bottom-up signals underlie perception. But it is not known how these signals integrate with each other and how this depends on the perceived stimuli’s predictability. ‘Predictive coding’ theories describe this integration in terms of how well top-down predictions fit with bottom-up sensory input. Identifying neural markers for such signal integration is therefore essential for the study of perception and predictive coding theories. To achieve this, we combined EEG methods that preferentially tag different levels in the visual hierarchy. Importantly, we examined intermodulation components as a measure of integration between these signals. Our results link the different signals to core aspects of predictive coding, and suggest that top-down predictions indeed integrate with bottom-up signals in a manner that is modulated by the predictability of the sensory input, providing evidence for predictive coding and opening new avenues to studying such interactions in perception.
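The intermodulation (IM) logic used here can be illustrated with a toy simulation: when two frequency-tagged signals are combined nonlinearly (as integration of top-down and bottom-up signals would imply), spectral energy appears at sums and differences of the tagging frequencies, which a purely linear mixture cannot produce. The sketch below is illustrative only; the frequencies, the product nonlinearity, and the noise level are hypothetical choices, not the study's actual stimulation parameters.

```python
import numpy as np

fs = 500.0                      # sampling rate in Hz (illustrative)
t = np.arange(0, 10, 1 / fs)    # 10 s of simulated "EEG"
f_low, f_high = 1.2, 10.0       # hypothetical tagging frequencies

# A nonlinearity (here a simple product term) stands in for signal
# integration: it creates energy at intermodulation frequencies
# f_high - f_low and f_high + f_low, absent from a linear mixture.
s_low = np.sin(2 * np.pi * f_low * t)
s_high = np.sin(2 * np.pi * f_high * t)
rng = np.random.default_rng(0)
signal = s_low + s_high + 0.5 * s_low * s_high \
    + 0.2 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amplitude_at(f):
    """Spectral amplitude at the FFT bin nearest frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

# Fundamentals and intermodulation components all rise above the noise floor.
for f in (f_low, f_high, f_high - f_low, f_high + f_low):
    print(f"{f:5.1f} Hz: amplitude {amplitude_at(f):8.1f}")
```

In a linear mixture (dropping the product term) the peaks at 8.8 Hz and 11.2 Hz vanish, which is what makes IM components a usable marker of integration between the tagged signals.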

Original language: English
Article number: e22749
Number of pages: 17
Publication status: Published - 28 Feb 2017


Keywords
  • hierarchical frequency tagging
  • human
  • intermodulation
  • predictive coding
  • semantic wavelet-induced frequency tagging
