Capturing contextual morality: applying game theory on smartphones

Niels Van Berkel, Simo Hosio, Benjamin Tag, Jorge Goncalves

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research

2 Citations (Scopus)


In order to build fairer Artificial Intelligence applications, a thorough understanding of human morality is required. Given the variable nature of human moral values, AI algorithms will have to adjust their behaviour based on the moral values of their users in order to align with end-user expectations. Quantifying human moral values is, however, a challenging task that cannot easily be completed using, for example, surveys. To address this problem, we propose the use of game theory in longitudinal mobile sensing deployments. Game theory has long been used in disciplines such as Economics to quantify human preferences by asking participants to choose between a set of hypothetical options and outcomes. The behaviour observed in these games, combined with the use of mobile sensors, enables researchers to obtain unique insights into the effect of context on participant convictions.
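To make the proposed approach concrete, the sketch below shows how one round of a classic economic game (a dictator game, used here purely as an example) might be logged alongside sensed smartphone context. All class names, fields, and context attributes are illustrative assumptions, not the authors' actual implementation:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical sketch: one dictator-game round paired with a context
# snapshot from mobile sensing. Field names are assumptions for
# illustration only.

@dataclass
class ContextSnapshot:
    location_type: str   # e.g. "home", "work", "transit"
    hour_of_day: int     # 0-23, from the device clock
    alone: bool          # e.g. inferred from nearby Bluetooth devices

@dataclass
class DictatorGameTrial:
    endowment: int       # points the participant may split
    amount_given: int    # points allocated to the anonymous recipient
    context: ContextSnapshot

    def generosity(self) -> float:
        """Fraction of the endowment shared with the other player."""
        return self.amount_given / self.endowment

def log_trial(trial: DictatorGameTrial) -> str:
    """Serialise a trial to JSON for later analysis of context effects."""
    record = asdict(trial)
    record["generosity"] = trial.generosity()
    return json.dumps(record)

trial = DictatorGameTrial(
    endowment=10,
    amount_given=4,
    context=ContextSnapshot(location_type="home", hour_of_day=21, alone=True),
)
print(log_trial(trial))
```

Collecting many such records per participant over a longitudinal deployment would allow the observed game behaviour (here, the generosity ratio) to be related to contextual variables such as location or social setting.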

Original language: English
Title of host publication: Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers
Editors: Saeed Abdullah
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 5
ISBN (Electronic): 9781450368698
Publication status: Published - 2019
Externally published: Yes
Event: Workshop on Longitudinal Mobile, Wearable and Ubiquitous Data Collection from Human Subject Studies 2019 - London, United Kingdom
Duration: 9 Sept 2019 - 9 Sept 2019


Conference: Workshop on Longitudinal Mobile, Wearable and Ubiquitous Data Collection from Human Subject Studies 2019
Abbreviated title: LDC 2019
Country/Territory: United Kingdom


Keywords:
  • Artificial intelligence
  • Context
  • Ethics
  • Game theory
  • Mobile sensing
  • Moral
