Multiparty multimodal interaction: A preliminary analysis

Philip R. Cohen, Rachel Coulston, Kelly Krout

Research output: Contribution to conference › Paper › peer-review

2 Citations (Scopus)

Abstract

When people work together, they often talk about the objects in their environment. Not surprisingly, their dialogues are multimodal, incorporating speech, gesture, gaze, haptics, and perhaps other modalities. However, proponents of technology may be troubled to learn that despite the current state and future promise of spoken and multimodal research, many of these workers do not particularly want to talk to machines - they want to converse with their colleagues. Still, if there were unobtrusive computer support for their multimodal dialogues, these same individuals would be pleased to benefit from digital technology. This paper offers a first step towards building such multimodal systems for supporting face-to-face collaborative work by providing both qualitative and quantitative analyses of multiparty multimodal dialogues in a field setting.

Original language: English
Pages: 201-204
Number of pages: 4
Publication status: Published - 1 Jan 2002
Externally published: Yes
Event: 7th International Conference on Spoken Language Processing, ICSLP 2002 - Denver, United States of America
Duration: 16 Sep 2002 - 20 Sep 2002

Conference

Conference: 7th International Conference on Spoken Language Processing, ICSLP 2002
Country/Territory: United States of America
City: Denver
Period: 16/09/02 - 20/09/02