Abstract
The generation of music that adapts dynamically to game content and player actions plays an important role in building more immersive, memorable, and emotive game experiences. To date, the development of adaptive music systems (AMSs) for video games has been limited both by the algorithms available for real-time music generation and by the sparse modeling of player action, game-world context, and emotion in current games. We propose that these issues must be addressed in tandem for the quality and flexibility of adaptive game music to improve significantly. Cognitive models of knowledge organization and emotional effect are integrated with multimodal, multiagent composition techniques to produce a novel AMS. The system was integrated into two stylistically distinct games. In both games, players reported higher overall immersion and a stronger correlation between the music and game-world concepts with the AMS than with the original soundtracks.
| Original language | English |
|---|---|
| Pages (from-to) | 270-280 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Games |
| Volume | 12 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Sept 2020 |
Keywords
- Agent-based modeling
- computer generated music
- neural networks