11,649 Hits in 5.3 sec

Fusion in multimodal interactive systems

Bruno Dumas, Beat Signer, Denis Lalanne
2012 Proceedings of the 4th ACM SIGCHI symposium on Engineering interactive computing systems - EICS '12  
We present a novel multimodal fusion algorithm for the development of adaptive interactive systems which is based on hidden Markov models (HMMs).  ...  Multimodal interfaces have been shown to be ideal candidates for interactive systems that adapt to a user either automatically or based on user-defined rules.  ...  Bruno Dumas is supported by MobiCraNT, a project forming part of the Strategic Platforms programme by the Brussels Institute for Research and Innovation (Innoviris).  ... 
doi:10.1145/2305484.2305490 dblp:conf/eics/DumasSL12 fatcat:fa6waqjtkzfzfiq6qfhnpdcevq

Towards automatic evaluation of multimodal user interfaces

J. Coutaz, D. Salber, S. Balbo
1993 Knowledge-Based Systems  
Our goal is to provide software support for performing this difficult task. This article presents an early analysis and experience towards the automatic evaluation of multimodal user interfaces.  ...  ACKNOWLEDGEMENTS This article was influenced by stimulating discussions with our colleagues of the pole IHM-Multimodal of PRC Communication Homme-Machine including Jean Caelen (ICP, Grenoble) and Claude  ... 
doi:10.1016/0950-7051(93)90018-o fatcat:mlfk43w3vfa7xk6mqnkzllxzwe

Towards automatic evaluation of multimodal user interfaces

Sandrine Balbo, Joëlle Coutaz, Daniel Salber
1993 Proceedings of the 1st international conference on Intelligent user interfaces - IUI '93  
Our goal is to provide software support for performing this difficult task. This article presents an early analysis and experience towards the automatic evaluation of multimodal user interfaces.  ...  ACKNOWLEDGEMENTS This article was influenced by stimulating discussions with our colleagues of the pole IHM-Multimodal of PRC Communication Homme-Machine including Jean Caelen (ICP, Grenoble) and Claude  ... 
doi:10.1145/169891.169972 dblp:conf/iui/BalboCS93 fatcat:wjuul3xae5fxxcq6zpbko4vrai

Engineering device-spanning, multimodal web applications using a model-based design approach

Sebastian Feuerstack, Ednaldo Brigante Pizzolato
2012 Proceedings of the 18th Brazilian symposium on Multimedia and the web - WebMedia '12  
We propose a model-based run-time framework to design and execute multimodal interfaces for the web.  ...  We prove our approach by checking its conformance against common requirements for multimodal frameworks, classify it based on characteristics identified by others, and present initial results of a performance  ...  Further on, Sebastian Feuerstack is grateful to the Deutsche Forschungsgemeinschaft (DFG) for the financial support of his work.  ... 
doi:10.1145/2382636.2382646 dblp:conf/webmedia/FeuerstackP12 fatcat:4mep6tcanzez5fpx76jfntqcei

Design guidelines for adaptive multimodal mobile input solutions

Bruno Dumas, María Solórzano, Beat Signer
2013 Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services - MobileHCI '13  
Based on a detailed analysis of the state of the art, we propose eight design guidelines for adaptive multimodal mobile input solutions.  ...  One particular research direction is the automatic context-aware adaptation of input modalities in multimodal mobile interfaces.  ...  ACKNOWLEDGMENTS Bruno Dumas is supported by MobiCraNT, a project forming part of the Strategic Platforms programme by the Brussels Institute for Research and Innovation (Innoviris).  ... 
doi:10.1145/2493190.2493227 dblp:conf/mhci/DumasSS13 fatcat:caqh2pb5rvfvfi6g2do4drnzte

Supporting Mobile Multimodal Interaction with a Rule-Based Framework [article]

Andreas Möller, Stefan Diewald, Luis Roalter, Matthias Kranz
2014 arXiv   pre-print
To reduce this effort, we created a framework that simplifies and accelerates the creation of multimodal applications for prototyping and research.  ...  Building multimodal applications requires various APIs with different paradigms, high-level interpretation of contextual data, and a method for fusing individual inputs and outputs.  ...  Acknowledgment We thank our student Max Walker for his help with extending the M3I framework and with creating the graphical end user toolkit on top of the framework.  ... 
arXiv:1406.3225v1 fatcat:d62d6fzd4fhaha35hvyibzwpbi

The application of ubiquitous multimodal synchronous data capture in CAD

Aparajithan Sivanathan, Theodore Lim, James Ritchie, Raymond Sung, Zoe Kosmadoudi, Ying Liu
2015 Computer-Aided Design  
This paper proposes a generic framework for ubiquitous multimodal synchronous data capture, based around the capture of CAD system activities, to monitor and log a variety of inputs, interactions and biophysical  ...  of metadata in CAD environments, providing a new perspective on, and a new way of investigating, CAD-based design activities.  ...  The authors would finally like to offer their gratitude to the industrial collaborators for their involvement in the research.  ... 
doi:10.1016/j.cad.2013.10.001 fatcat:vet34wrwtrarfp5wtvvqpakyvm

Towards learned feedback for enhancing trust in information seeking dialogue for radiologists

Daniel Sonntag
2011 Proceedings of the 15th international conference on Intelligent user interfaces - IUI '11  
This means we can semi-automatically learn models to improve the question feedback and trust in the multimodal QA system for radiologists.  ...  Speech Dialogue System The generic framework follows a programming model which eases the interface to external third-party components (e.g., the automatic speech recognizer (ASR), natural language understanding  ... 
doi:10.1145/1943403.1943473 dblp:conf/iui/Sonntag11 fatcat:zpa3xiimxbhpbbdjyqhfeakjtu

A Framework to Develop Adaptive Multimodal Dialog Systems for Android-Based Mobile Devices [chapter]

David Griol, José Manuel Molina
2014 Lecture Notes in Computer Science  
In this paper we propose a framework to facilitate the software engineering life cycle for multimodal interfaces in Android.  ...  Despite the usefulness of such classes, there are no strategies defined for multimodal interface development for Android systems, and developers create ad-hoc solutions that make apps costly to implement  ...  Proposed Framework to Develop Adaptive Multimodal Dialog Systems for Android-Based Mobile Devices Figure 1 shows the proposed framework.  ... 
doi:10.1007/978-3-319-07617-1_3 fatcat:u2krizfupvhkrkuybzgfechwuu

A Framework for Rapid Prototyping of Multimodal Interaction Concepts

Ronny Seiger, Florian Niebling, Mandy Korzetz, Tobias Nicolai, Thomas Schlegel
2015 ACM SIGCHI Symposium on Engineering Interactive Computing System  
By applying Connect, model-based prototypes of multimodal interaction concepts involving multiple devices can be created, evaluated and refined during the entire engineering process.  ...  Thus, the need for integrated development and evaluation of suitable interaction concepts for ubiquitous systems increases.  ...  A model-based approach for dynamically creating multimodal user interfaces composed of several components is described by Feuerstack and Pizzolato [8] as part of their MINT framework.  ... 
dblp:conf/eics/SeigerNKNS15 fatcat:uh42ufri5jdexhyowzdlfzfxhm

MultiPro: Prototyping Multimodal UI with Anthropomorphic Agents

Philipp Kulms, Herwin van Welbergen, Stefan Kopp
2018 Mensch & Computer  
Modern user interfaces (UI) often provide natural and multimodal interaction, sometimes modelled in the form of a conversation with an anthropomorphic agent embedded in the system.  ...  Our multimodal prototyping framework MultiPro helps designers in rapidly designing UIs to explore these questions.  ...  The framework offers support for full multimodal system design: MultiPro provides an interface for input and output modalities, supports multimodal fusion and fission, interaction flow management, and  ... 
doi:10.18420/muc2018-mci-0236 dblp:conf/mc/KulmsWK18 fatcat:ijo6vh3a55dlxlwu7orekph64i

A graphical editor for the SMUIML multimodal user interaction description language

Bruno Dumas, Beat Signer, Denis Lalanne
2014 Science of Computer Programming  
The presented graphical editor represents the third component of a triad of tools for the development of multimodal user interfaces, consisting of an XML-based modelling language, a framework for the authoring  ...  of multimodal interfaces and a graphical editor.  ...  The work on HephaisTK and SMUIML has been funded by the Hasler Foundation in the context of the MeModules project and by the Swiss National Center of Competence in Research on Interactive Multimodal Information  ... 
doi:10.1016/j.scico.2013.04.003 fatcat:gxgatkwbcbhxphlez7q2em2hru

Haptic-Enabled Multimodal Interface for the Planning of Hip Arthroplasty

N.G. Tsagarakis, J.O. Gray, D.G. Caldwell, C. Zannoni, M. Petrone, D. Testi, M. Viceconti
2006 IEEE Multimedia  
four specific tasks that form the basis for our multimodal hip arthroplasty planning environment: ❚ Preparing the subject-specific musculoskeletal model.  ...  We developed the Multisense demonstrator on top of a multimodal application framework (MAF) 7 that supports multimodal visualization, interaction, and improved synchronization of multiple cues.  ...  Multimodal application framework The multimodal application framework (MAF) is a software library for rapidly developing innovative multimodal environments for medical applications.  ... 
doi:10.1109/mmul.2006.55 fatcat:liguvutrirhe7oomw5vmewzqay

Multimodal interaction on mobile phones

Marcos Serrano, Laurence Nigay, Rachel Demumieux, Jérôme Descos, Patrick Losquin
2006 Proceedings of the 8th conference on Human-computer interaction with mobile devices and services - MobileHCI '06  
By reusing and assembling components, ACICARE enables the rapid development of multimodal interfaces as well as the automatic capture of multimodal usage for in-field evaluations.  ...  In this paper we address this problem by describing a component-based approach, called ACICARE, for developing and evaluating multimodal interfaces on mobile phones.  ...  task force creating human-machine interfaces similar to human-human communication of the European Sixth Framework Programme (FP6-2002-IST1-507609).  ... 
doi:10.1145/1152215.1152242 dblp:conf/mhci/SerranoNDDL06 fatcat:hloik76yprfrbl5nfhjho3fide

A user interface framework for multimodal VR interactions

Marc Erich Latoschik
2005 Proceedings of the 7th international conference on Multimodal interfaces - ICMI '05  
This article presents a User Interface (UI) framework for multimodal interactions targeted at immersive virtual environments.  ...  The framework introduces a Knowledge Representation Layer which augments objects of the simulated environment with Semantic Entities as a central object model that bridges and interfaces Virtual Reality  ...  AI-representation for the KRL as well as a neural network layer which will support the KRL as well as the matching stage of the gesture processing.  ... 
doi:10.1145/1088463.1088479 dblp:conf/icmi/Latoschik05 fatcat:u6l5l7zqyzdszosbxd4qrne6di
Showing results 1 — 15 out of 11,649