Abstract
Affective computing is an emerging research area that provides insight into humans' mental states through human-machine interaction. During the interaction process, bio-signal analysis is essential for detecting human affective changes. Machine learning methods for analysing bio-signals are currently the state of the art in affect detection, but most empirical works deploy traditional machine learning methods rather than deep learning models because of the need for explainability. In this paper, we propose a deep learning model that processes multimodal-multisensory bio-signals for affect recognition. It supports batch training over signals with different sampling rates at the same time, and our results show significant improvement over the state of the art. Furthermore, the results are interpreted at the sensor and signal levels to improve the explainability of our deep learning model.
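The abstract notes that the model supports batch training over signals with different sampling rates. The paper's own mechanism is not reproduced here; as an illustrative assumption only, one common way to make such signals batchable is to interpolate each one onto a shared time grid. The function name, the linear-interpolation choice, and the example sampling rates below are all hypothetical, not taken from the paper:

```python
import numpy as np

def align_signals(signals, target_len):
    """Linearly interpolate each 1-D bio-signal onto a common
    time grid so signals recorded at different sampling rates
    can be stacked into a single training batch."""
    aligned = []
    for sig in signals:
        src = np.linspace(0.0, 1.0, num=len(sig))   # original time axis
        dst = np.linspace(0.0, 1.0, num=target_len)  # shared time axis
        aligned.append(np.interp(dst, src, sig))
    return np.stack(aligned)  # shape: (n_signals, target_len)

# Hypothetical example: a 700 Hz chest signal and a 64 Hz wrist
# signal, each covering the same one-second window.
ecg = np.sin(np.linspace(0.0, 2.0 * np.pi, 700))
bvp = np.cos(np.linspace(0.0, 2.0 * np.pi, 64))
batch = align_signals([ecg, bvp], target_len=256)
print(batch.shape)  # (2, 256)
```

Other alignment strategies (resampling with anti-aliasing filters, or per-modality encoders that avoid resampling entirely) are equally plausible; this sketch only shows why a shared grid makes batching possible.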
Original language | English |
---|---|
Title of host publication | Proceedings of the 28th ACM International Conference on Information and Knowledge Management |
Editors | Peng Cui, Elke Rundensteiner, David Carmel, Qi He, Jeffrey Xu Yu |
Place of Publication | New York NY USA |
Publisher | Association for Computing Machinery (ACM) |
Pages | 2069-2072 |
Number of pages | 4 |
ISBN (Electronic) | 9781450369763 |
DOIs | |
Publication status | Published - 2019 |
Event | ACM International Conference on Information and Knowledge Management 2019 - Beijing, China. Duration: 3 Nov 2019 → 7 Nov 2019. Conference number: 28th. http://www.cikm2019.net/ https://dl.acm.org/doi/proceedings/10.1145/3357384 |
Conference
Conference | ACM International Conference on Information and Knowledge Management 2019 |
---|---|
Abbreviated title | CIKM 2019 |
Country/Territory | China |
City | Beijing |
Period | 3/11/19 → 7/11/19 |
Internet address | http://www.cikm2019.net/ |
Keywords
- Affect recognition
- Deep learning
- Explainability
- Multimodal fusion