Exploring Real-Time Music-to-Image Systems for Creative Inspiration in Music Creation

Meng Yang, Maria Teresa Llano, Jon McCormack

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

This paper presents a study on the use of a real-time music-to-image system as a mechanism to support and inspire musicians during their creative process. The system takes MIDI messages from a keyboard as input, which are then interpreted and analysed using state-of-the-art generative AI models. Based on the perceived emotion and musical structure, the system's interpretation is converted into visual imagery that is presented in real time to musicians. We conducted a user study in which musicians improvised and composed using the system. Our findings show that most musicians found the generated images to be a novel mechanism for inspiration while playing, evidencing the potential of music-to-image systems to inspire and enhance their creative process.
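The abstract does not give implementation details, but as a rough illustration of the kind of pipeline it describes, the sketch below reads MIDI note events from a keyboard (assuming the mido library), maps recent notes to a crude valence/arousal estimate, and turns that estimate into a text prompt for an image generator. The emotion heuristic, the prompt wording, and the render_image placeholder are assumptions made for illustration only, not the authors' method.

    # Illustrative sketch only; not the implementation described in the paper.
    # Assumes the `mido` library for MIDI input; `render_image` is a hypothetical
    # stand-in for whatever generative image model produces the visuals.
    import time
    import mido

    def estimate_emotion(notes):
        # Rough valence/arousal heuristic (assumption, not the paper's model):
        # louder and denser playing -> higher arousal; higher average pitch -> higher valence.
        if not notes:
            return 0.0, 0.0
        avg_velocity = sum(v for _, v in notes) / len(notes)
        avg_pitch = sum(p for p, _ in notes) / len(notes)
        arousal = min(1.0, (avg_velocity / 127) * (len(notes) / 32))
        valence = (avg_pitch - 36) / (96 - 36)      # map MIDI pitches 36..96 to 0..1
        return max(0.0, min(1.0, valence)), arousal

    def build_prompt(valence, arousal):
        mood = "bright, uplifting" if valence > 0.5 else "dark, melancholic"
        energy = "dynamic, swirling" if arousal > 0.5 else "calm, still"
        return f"abstract landscape, {mood}, {energy} atmosphere"

    def render_image(prompt):
        # Placeholder for a call to a text-to-image model; here we only
        # print the prompt that would be sent.
        print("generate:", prompt)

    window, last_render = [], time.monotonic()
    with mido.open_input() as port:                 # default MIDI input (the keyboard)
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0:
                window.append((msg.note, msg.velocity))
            if time.monotonic() - last_render > 4:  # re-render every few seconds of playing
                valence, arousal = estimate_emotion(window)
                render_image(build_prompt(valence, arousal))
                window, last_render = [], time.monotonic()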
Original language: English
Title of host publication: Proceedings of the 15th International Conference on Computational Creativity, ICCC'24
Editors: Pedro Martins, Teresa Llano
Place of Publication: Jönköping, Sweden
Publisher: Association for Computational Creativity (ACC)
Number of pages: 10
Publication status: Published - 2024
Event: International Conference on Computational Creativity 2024 - Jönköping, Sweden
Duration: 17 Jun 2024 – 21 Jun 2024
Conference number: 15th
https://computationalcreativity.net/iccc24/ (Website)

Conference

Conference: International Conference on Computational Creativity 2024
Abbreviated title: ICCC'24
Country/Territory: Sweden
City: Jönköping
Period: 17/06/24 – 21/06/24
Internet address: https://computationalcreativity.net/iccc24/

Keywords

  • Music
  • AI
  • Emotion
