Content and misrepresentation in hierarchical generative models

Research output: Contribution to journal › Article › Research › peer-review

15 Citations (Scopus)

Abstract

In this paper, we consider how certain longstanding philosophical questions about mental representation may be answered on the assumption that cognitive and perceptual systems implement hierarchical generative models, such as those discussed within the prediction error minimization (PEM) framework. We build on existing treatments of representation via structural resemblance, such as those in Gładziejewski (Synthese 193(2):559–582, 2016) and Gładziejewski and Miłkowski (Biol Philos, 2017), to argue for a representationalist interpretation of the PEM framework. We further motivate the proposed approach to content by arguing that it is consistent with approaches implicit in theories of unsupervised learning in neural networks. In the course of this discussion, we argue that the structural representation proposal, properly understood, has more in common with functional-role than with causal/informational or teleosemantic theories. In the remainder of the paper, we describe the PEM framework for approximate Bayesian inference in some detail, and discuss how structural representations might arise within the proposed Bayesian hierarchies. After explicating the notion of variational inference, we define a subjectively accessible measure of misrepresentation for hierarchical Bayesian networks by appeal to the Kullback–Leibler divergence between posterior generative and approximate recognition densities, and discuss a related measure of objective misrepresentation in terms of correspondence with the facts.
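The misrepresentation measure described in the abstract rests on the Kullback–Leibler divergence between two probability densities. As an informal illustration only (the paper's densities are defined over a hierarchical Bayesian network; the discrete distributions `p` and `q` below are hypothetical stand-ins for a generative posterior and an approximate recognition density), the quantity can be sketched as:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback–Leibler divergence D_KL(p || q) for discrete distributions.

    Returns 0 when p and q coincide; larger values indicate greater
    mismatch between the two distributions (here, greater
    "misrepresentation" in the abstract's subjective sense).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical example: posterior p vs. approximate recognition density q
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(kl_divergence(p, q))
```

Note that the divergence is asymmetric (D_KL(p || q) ≠ D_KL(q || p) in general), which matters for which density is treated as the "target" in the measure.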

Original language: English
Pages (from-to): 2387-2415
Number of pages: 29
Journal: Synthese
Volume: 195
Issue number: 6
DOI: 10.1007/s11229-017-1435-7
Publication status: Published - 1 Jun 2018

Keywords

  • Functional role semantics
  • Generative model
  • Kullback–Leibler divergence
  • Misrepresentation
  • Prediction error minimization
  • Problem of content
  • Recognition model
  • Structural resemblance
  • Unsupervised learning
  • Variational Bayesian inference

Cite this

@article{9b6c8c517cca47c18ff23cd21efbe15c,
title = "Content and misrepresentation in hierarchical generative models",
abstract = "In this paper, we consider how certain longstanding philosophical questions about mental representation may be answered on the assumption that cognitive and perceptual systems implement hierarchical generative models, such as those discussed within the prediction error minimization (PEM) framework. We build on existing treatments of representation via structural resemblance, such as those in Gładziejewski (Synthese 193(2):559–582, 2016) and Gładziejewski and Miłkowski (Biol Philos, 2017), to argue for a representationalist interpretation of the PEM framework. We further motivate the proposed approach to content by arguing that it is consistent with approaches implicit in theories of unsupervised learning in neural networks. In the course of this discussion, we argue that the structural representation proposal, properly understood, has more in common with functional-role than with causal/informational or teleosemantic theories. In the remainder of the paper, we describe the PEM framework for approximate Bayesian inference in some detail, and discuss how structural representations might arise within the proposed Bayesian hierarchies. After explicating the notion of variational inference, we define a subjectively accessible measure of misrepresentation for hierarchical Bayesian networks by appeal to the Kullback–Leibler divergence between posterior generative and approximate recognition densities, and discuss a related measure of objective misrepresentation in terms of correspondence with the facts.",
keywords = "Functional role semantics, Generative model, Kullback–Leibler divergence, Misrepresentation, Prediction error minimization, Problem of content, Recognition model, Structural resemblance, Unsupervised learning, Variational Bayesian inference",
author = "Alex Kiefer and Jakob Hohwy",
year = "2018",
month = "6",
day = "1",
doi = "10.1007/s11229-017-1435-7",
language = "English",
volume = "195",
pages = "2387--2415",
journal = "Synthese",
issn = "0039-7857",
publisher = "Springer-Verlag London Ltd.",
number = "6",
}

Content and misrepresentation in hierarchical generative models. / Kiefer, Alex; Hohwy, Jakob.

In: Synthese, Vol. 195, No. 6, 01.06.2018, p. 2387-2415.

Research output: Contribution to journal › Article › Research › peer-review

TY - JOUR

T1 - Content and misrepresentation in hierarchical generative models

AU - Kiefer, Alex

AU - Hohwy, Jakob

PY - 2018/6/1

Y1 - 2018/6/1

N2 - In this paper, we consider how certain longstanding philosophical questions about mental representation may be answered on the assumption that cognitive and perceptual systems implement hierarchical generative models, such as those discussed within the prediction error minimization (PEM) framework. We build on existing treatments of representation via structural resemblance, such as those in Gładziejewski (Synthese 193(2):559–582, 2016) and Gładziejewski and Miłkowski (Biol Philos, 2017), to argue for a representationalist interpretation of the PEM framework. We further motivate the proposed approach to content by arguing that it is consistent with approaches implicit in theories of unsupervised learning in neural networks. In the course of this discussion, we argue that the structural representation proposal, properly understood, has more in common with functional-role than with causal/informational or teleosemantic theories. In the remainder of the paper, we describe the PEM framework for approximate Bayesian inference in some detail, and discuss how structural representations might arise within the proposed Bayesian hierarchies. After explicating the notion of variational inference, we define a subjectively accessible measure of misrepresentation for hierarchical Bayesian networks by appeal to the Kullback–Leibler divergence between posterior generative and approximate recognition densities, and discuss a related measure of objective misrepresentation in terms of correspondence with the facts.

AB - In this paper, we consider how certain longstanding philosophical questions about mental representation may be answered on the assumption that cognitive and perceptual systems implement hierarchical generative models, such as those discussed within the prediction error minimization (PEM) framework. We build on existing treatments of representation via structural resemblance, such as those in Gładziejewski (Synthese 193(2):559–582, 2016) and Gładziejewski and Miłkowski (Biol Philos, 2017), to argue for a representationalist interpretation of the PEM framework. We further motivate the proposed approach to content by arguing that it is consistent with approaches implicit in theories of unsupervised learning in neural networks. In the course of this discussion, we argue that the structural representation proposal, properly understood, has more in common with functional-role than with causal/informational or teleosemantic theories. In the remainder of the paper, we describe the PEM framework for approximate Bayesian inference in some detail, and discuss how structural representations might arise within the proposed Bayesian hierarchies. After explicating the notion of variational inference, we define a subjectively accessible measure of misrepresentation for hierarchical Bayesian networks by appeal to the Kullback–Leibler divergence between posterior generative and approximate recognition densities, and discuss a related measure of objective misrepresentation in terms of correspondence with the facts.

KW - Functional role semantics

KW - Generative model

KW - Kullback–Leibler divergence

KW - Misrepresentation

KW - Prediction error minimization

KW - Problem of content

KW - Recognition model

KW - Structural resemblance

KW - Unsupervised learning

KW - Variational Bayesian inference

UR - http://www.scopus.com/inward/record.url?scp=85019886952&partnerID=8YFLogxK

U2 - 10.1007/s11229-017-1435-7

DO - 10.1007/s11229-017-1435-7

M3 - Article

AN - SCOPUS:85019886952

VL - 195

SP - 2387

EP - 2415

JO - Synthese

JF - Synthese

SN - 0039-7857

IS - 6

ER -