Flexi-modal and multi-machine user interfaces

B. Myers, R. Malkin, M. Bett, A. Waibel, B. Bostwick, R.C. Miller, Jie Yang, M. Denecke, E. Seemann, Jie Zhu, Choon Hong Peck, D. Kong (+2 others)
Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces
We describe a system that facilitates collaboration using multiple modalities, including speech, handwriting, gestures, gaze tracking, direct manipulation, large projected touch-sensitive displays, laser pointer tracking, regular monitors with a mouse and keyboard, and wirelessly networked handhelds. Our system allows multiple, geographically dispersed participants to simultaneously and flexibly mix different modalities, using the right interface at the right time on one or more machines. This paper discusses each of the modalities provided, how they were integrated in the system architecture, and how the user interface enabled one or more people to flexibly use one or more devices.
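A common way to integrate many input modalities into one architecture, as the abstract describes, is to route every recognizer's output through a shared event bus so that any machine or device can contribute input. The sketch below is purely illustrative: the names (`ModalityEvent`, `EventBus`) and the publish/subscribe design are assumptions for exposition, not the paper's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical sketch of modality-event dispatch; names and design
# are illustrative assumptions, not taken from the paper.

@dataclass
class ModalityEvent:
    modality: str   # e.g. "speech", "gesture", "laser"
    device: str     # which machine or device produced the input
    payload: str    # recognized content (utterance, stroke, pointer position)

class EventBus:
    """Routes events from any modality on any device to subscribed handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[ModalityEvent], None]]] = {}

    def subscribe(self, modality: str,
                  handler: Callable[[ModalityEvent], None]) -> None:
        self._handlers.setdefault(modality, []).append(handler)

    def publish(self, event: ModalityEvent) -> None:
        for handler in self._handlers.get(event.modality, []):
            handler(event)

# Example: two modalities from two different devices feed one shared log.
log: List[str] = []
bus = EventBus()
bus.subscribe("speech", lambda e: log.append(f"speech from {e.device}: {e.payload}"))
bus.subscribe("laser", lambda e: log.append(f"laser from {e.device}: {e.payload}"))
bus.publish(ModalityEvent("speech", "handheld-1", "next slide"))
bus.publish(ModalityEvent("laser", "projector-wall", "(0.42, 0.77)"))
```

Decoupling recognizers from consumers this way lets participants mix modalities freely, since a handler does not need to know which device or recognizer produced an event.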
doi:10.1109/icmi.2002.1167019 dblp:conf/icmi/MyersMBWBMYDSZPKNS02