Measuring player skill using dynamic difficulty adjustment

Simon Demediuk, Marco Tamassia, William L. Raffe, Fabio Zambetta, Florian 'Floyd' Mueller, Xiaodong Li

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

9 Citations (Scopus)

Abstract

Video games have a long history of use for educational and training purposes, as they provide increased motivation and learning for players. One limitation of using video games in this manner is that players still need to be tested outside the game environment to assess their learning outcomes. Traditionally, determining a player's skill level in a competitive game requires players to compete directly with each other. Through the application of the Adaptive Training Framework, this work presents a novel method to determine the skill level of a player after each interaction with the video game. This is done by measuring the effort of a Dynamic Difficulty Adjustment agent, without the need for direct competition between players. The experiments conducted in this research show that by measuring the players' Heuristic Value Average, we can obtain the same ranking of players as state-of-the-art ranking systems, without the need for direct competition.
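The abstract does not spell out how the Heuristic Value Average is computed, but the idea lends itself to a minimal sketch. Assuming, hypothetically, that the measure is the mean heuristic value of the moves a Dynamic Difficulty Adjustment agent selects while playing against a given player, and that stronger players force the agent toward higher-valued (harder) choices, skill ranking without head-to-head matches could look like the following; `SkillTracker`, `record`, and `rank_players` are illustrative names, not the authors' implementation.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class SkillTracker:
    """Tracks a per-player Heuristic Value Average (HVA).

    Assumption (not from the paper): the HVA is the running mean of the
    heuristic values a DDA agent assigns to the actions it selects
    against a player. A stronger player forces higher-valued actions,
    so a higher HVA implies a higher skill estimate.
    """
    values: list[float] = field(default_factory=list)

    def record(self, heuristic_value: float) -> None:
        # Called once per agent decision (e.g., per MCTS move choice).
        self.values.append(heuristic_value)

    @property
    def hva(self) -> float:
        # Guard against an empty session before any decisions are logged.
        return mean(self.values) if self.values else 0.0

def rank_players(trackers: dict[str, SkillTracker]) -> list[str]:
    # Rank players by HVA alone -- no direct competition required.
    return sorted(trackers, key=lambda p: trackers[p].hva, reverse=True)

# Example: two players, values logged by a (hypothetical) MCTS-based DDA agent.
trackers = {"alice": SkillTracker(), "bob": SkillTracker()}
for v in (0.8, 0.9, 0.85):
    trackers["alice"].record(v)
for v in (0.4, 0.5, 0.45):
    trackers["bob"].record(v)
print(rank_players(trackers))  # ['alice', 'bob']
```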

Original language: English
Title of host publication: Proceedings of the Australasian Computer Science Week Multiconference, ACSW 2018
Editors: Minh Ngoc Dinh
Place of Publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 7
ISBN (Electronic): 9781450354363
Publication status: Published - 2018
Externally published: Yes
Event: Interactive Entertainment 2018, Brisbane, Australia
Duration: 29 Jan 2018 – 2 Feb 2018

Conference

Conference: Interactive Entertainment 2018
Abbreviated title: IE 2018
Country/Territory: Australia
City: Brisbane
Period: 29/01/18 – 2/02/18

Keywords

  • Dynamic Difficulty Adjustment
  • Monte Carlo Tree Search
  • Player Skill Ranking
  • Video Games
