Abstract
To support research and development of next-generation multimodal interfaces for complex collaborative tasks, a comprehensive new infrastructure has been created for collecting and analyzing time-synchronized audio, video, and pen-based data during multi-party meetings. This infrastructure must be unobtrusive, must capture rich data from multiple information sources with high temporal fidelity, and must support the collection and annotation of simulation-driven studies of natural human-human-computer interaction. Furthermore, it must be flexibly extensible to facilitate exploratory research. This paper describes both the infrastructure put in place to record, encode, play back, and annotate the meeting-related media data, and the simulation environment used to prototype novel system concepts.
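The abstract itself contains no implementation detail, but the core idea of time-synchronized multimodal capture can be illustrated with a minimal sketch: events from independently recorded audio, video, and pen streams are stamped against a shared reference clock, merged into a single timeline, and queried by time window for playback or annotation. The names below (`MediaEvent`, `merge_streams`, `window`) are hypothetical and are not taken from the authors' system.

```python
from dataclasses import dataclass
from typing import Iterable, List

# Hypothetical record for one sample from any modality (audio frame,
# video frame, or pen stroke point), stamped on a shared reference clock.
@dataclass
class MediaEvent:
    modality: str      # e.g. "audio", "video", "pen"
    timestamp: float   # seconds on the common reference clock
    payload: bytes     # raw encoded sample data


def merge_streams(streams: Iterable[Iterable[MediaEvent]]) -> List[MediaEvent]:
    """Merge independently captured streams into one timeline ordered by
    the shared timestamp, so annotation and playback tools can step
    through all modalities together."""
    merged = [event for stream in streams for event in stream]
    merged.sort(key=lambda e: e.timestamp)
    return merged


def window(events: List[MediaEvent], start: float, end: float) -> List[MediaEvent]:
    """Return the events falling inside a time window -- the basic
    operation behind synchronized playback and annotation queries."""
    return [e for e in events if start <= e.timestamp < end]
```

In such a design, the hard engineering problems lie in keeping the capture devices on a common clock and in encoding the streams without drift; the sketch only shows the downstream alignment step under that assumption.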
Original language | English |
---|---|
Title of host publication | ICMI '06 - Proceedings of the 8th international conference on Multimodal interfaces |
Place of Publication | New York, NY, USA |
Publisher | Association for Computing Machinery (ACM) |
Pages | 209-216 |
Number of pages | 8 |
ISBN (Print) | 159593541X, 9781595935410 |
DOIs | |
Publication status | Published - 2006 |
Externally published | Yes |
Event | International Conference on Multimodal Interfaces 2006 - Banff, Canada Duration: 2 Nov 2006 → 4 Nov 2006 Conference number: 8th https://dl.acm.org/doi/proceedings/10.1145/1180995 (Proceedings) |
Conference
Conference | International Conference on Multimodal Interfaces 2006 |
---|---|
Abbreviated title | ICMI 2006 |
Country/Territory | Canada |
City | Banff |
Period | 2/11/06 → 4/11/06 |
Internet address | |
Keywords
- Annotation tools
- Data collection infrastructure
- Meeting
- Multi-party
- Multimodal interfaces
- Simulation studies
- Synchronized media