8,620 Hits in 3.4 sec

Event-based Synchronization of Model-Based Multimodal User Interfaces

Marco Blumendorf, Sebastian Feuerstack, Sahin Albayrak
2006 ACM/IEEE International Conference on Model Driven Engineering Languages and Systems  
In this paper we present an approach allowing the event-based synchronization of distributed user interfaces based on a multilevel user interface model.  ...  We also describe a runtime system we created, allowing the execution of model-based user interface descriptions and the distribution of user interfaces across various devices and modalities using channels  ...  We thank the German Federal Ministry of Economics and Technology for supporting our work as part of the Service Centric Home project in the "Next Generation Media" program.  ... 
dblp:conf/models/BlumendorfFA06 fatcat:l7tln6izx5cf7hp7a5uw4dg6da
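The event-based synchronization idea described in this entry can be illustrated with a minimal sketch (this is an illustrative toy, not the authors' actual runtime system; the class names, the "model.changed" topic, and the two-view setup are all invented for the example): a publish/subscribe bus broadcasts model changes as events, so every registered view, whatever its device or modality, converges on the same state.

```python
from typing import Callable, Dict, List


class EventBus:
    """Minimal publish/subscribe hub: model changes are broadcast as events."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        for handler in self._subscribers.get(topic, []):
            handler(payload)


class View:
    """One UI endpoint (e.g., a device or modality) kept in sync via events."""

    def __init__(self, name: str, bus: EventBus) -> None:
        self.name = name
        self.state: dict = {}
        bus.subscribe("model.changed", self.on_model_changed)

    def on_model_changed(self, payload: dict) -> None:
        # Every subscribed view applies the same change, so all views converge.
        self.state.update(payload)


bus = EventBus()
gui, voice = View("gui", bus), View("voice", bus)
bus.publish("model.changed", {"volume": 7})
assert gui.state == voice.state == {"volume": 7}
```

The point of the sketch is only the synchronization pattern: no view talks to another view directly; all coordination flows through events on a shared channel.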

Engineering device-spanning, multimodal web applications using a model-based design approach

Sebastian Feuerstack, Ednaldo Brigante Pizzolato
2012 Proceedings of the 18th Brazilian symposium on Multimedia and the web - WebMedia '12  
We propose a model-based run-time framework to design and execute multi-modal interfaces for the web.  ...  We prove our approach by checking its conformance against common requirements for multimodal frameworks, classify it based on characteristics identified by others, and present initial results of a performance  ...  Furthermore, Sebastian Feuerstack is grateful to the Deutsche Forschungsgemeinschaft (DFG) for the financial support of his work.  ... 
doi:10.1145/2382636.2382646 dblp:conf/webmedia/FeuerstackP12 fatcat:4mep6tcanzez5fpx76jfntqcei

Multimodal interaction with XForms

Mikko Honkala, Mikko Pohja
2006 Proceedings of the 6th international conference on Web engineering - ICWE '06  
The model separates modality-independent parts from the modality-dependent parts, thus automatically providing most of the user interface to all modalities.  ...  Thus, for ease-of-authoring and maintainability, it is necessary to provide a cross-modal user interface language, whose semantic level is higher.  ...  Also, we would like to thank Alessandro Cogliati, the developer of the CSS engine, and Petri Vuorimaa, the leader of the research group.  ... 
doi:10.1145/1145581.1145624 dblp:conf/icwe/HonkalaP06 fatcat:bssvnu3cgzardpx72fogqeptzm

Description languages for multimodal interaction: a set of guidelines and its illustration with SMUIML

Bruno Dumas, Denis Lalanne, Rolf Ingold
2010 Journal on Multimodal User Interfaces  
This article introduces the problem of modeling multimodal interaction, in the form of markup languages.  ...  After an analysis of the current state of the art in multimodal interaction description languages, nine guidelines for languages dedicated at multimodal interaction description are introduced, as well  ...  SMUIML SMUIML stands for Synchronized Multimodal User Interaction Modeling Language.  ... 
doi:10.1007/s12193-010-0043-3 fatcat:j4q6einikfcivoxrk4fm46gt4i

Flexible Multimodal Architecture for CAD Application [article]

M. Dellisanti, M. Fiorentino, G. Monno, A. E. Uva
2007 Smart Tools and Applications in Graphics  
The flexibility of the system is ensured by the use of a hierarchical XML-based configuration structure.  ...  This framework can be applied to test the advantage of multimodal interfaces in VRAD applications and provides tools to explore the synergy of different input/output techniques  ...  Acknowledgements We would like to thank Florin Girbacia (University of Transylvania, Romania) for implementing/testing the voice command module.  ... 
doi:10.2312/localchapterevents/italchap/italianchapconf2007/113-118 dblp:conf/egItaly/FabianoFMU07 fatcat:tuhnvbazqfgjbh2hq45fdlkkbm

Strengths and weaknesses of software architectures for the rapid creation of tangible and multimodal interfaces

Bruno Dumas, Denis Lalanne, Dominique Guinard, Reto Koenig, Rolf Ingold
2008 Proceedings of the 2nd international conference on Tangible and embedded interaction - TEI '08  
Finally, the article stresses the major issues associated with the development of toolkits allowing the creation of multimodal and tangible interfaces, and presents our future objectives.  ...  This paper reviews the challenges associated with the development of tangible and multimodal interfaces and exposes our experiences with the development of three different software architectures to rapidly  ...  The fusion and dialog managers of HephaisTK are scripted by means of a SMUIML (Synchronized Multimodal User Interfaces Modelling Language) XML file [8].  ... 
doi:10.1145/1347390.1347403 dblp:conf/tei/DumasLGKI08 fatcat:mhjpkfpj35dbvduighse3zznni

A Novel Dialog Model for the Design of Multimodal User Interfaces [chapter]

Robbie Schaefer, Steffen Bleul, Wolfgang Mueller
2005 Lecture Notes in Computer Science  
Variation among mobile devices with differing capabilities and interaction modalities, as well as changing user context in nomadic applications, poses huge challenges to the design of user interfaces  ...  In this short paper, we present a new dialog model for multimodal interaction together with an advanced control model, which can either be used for direct modeling by an interface designer or in conjunction  ...  [2]), we present MIPIM (Multimodal Interface Presentation and Interaction Model), a new dialog model for the design of multimodal user interfaces.  ... 
doi:10.1007/11431879_13 fatcat:dfu3jm2ffffr3d6eus2wuww6f4

Benchmarking fusion engines of multimodal interactive systems

Bruno Dumas, Rolf Ingold, Denis Lalanne
2009 Proceedings of the 2009 international conference on Multimodal interfaces - ICMI-MLMI '09  
This article proposes an evaluation framework to benchmark the performance of multimodal fusion engines.  ...  It then discusses the importance of evaluation as a means to assess fusion engines, not only from the user perspective, but also at a performance level.  ...  is to achieve the evaluation of a multimodal interface in a step-by-step manner, and base later evaluations on the results of the former ones.  ... 
doi:10.1145/1647314.1647345 dblp:conf/icmi/DumasIL09 fatcat:ja4jfxrjhfewbjtwzkt5xckuky
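To give a concrete sense of what a fusion engine under benchmark must do, here is a toy sketch of temporal fusion (not any of the engines evaluated in the paper; the 0.5-second window, the event fields, and the pairing rule are all assumptions chosen for illustration): events from different modalities that arrive close together in time are combined into a single multimodal command, in the spirit of "put-that-there"-style interaction.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ModalityEvent:
    modality: str     # e.g. "speech" or "gesture"
    value: str        # recognized token, e.g. "delete" or "point:obj3"
    timestamp: float  # arrival time in seconds


def fuse(events: List[ModalityEvent], window: float = 0.5) -> List[Tuple[str, str]]:
    """Pair events from *different* modalities within `window` seconds."""
    events = sorted(events, key=lambda e: e.timestamp)
    fused, used = [], set()
    for i, a in enumerate(events):
        if i in used:
            continue
        for j in range(i + 1, len(events)):
            b = events[j]
            if j in used or b.modality == a.modality:
                continue
            if b.timestamp - a.timestamp <= window:
                fused.append((a.value, b.value))
                used.update({i, j})
                break
    return fused


# A speech command and a pointing gesture 0.18 s apart fuse into one command;
# an unpaired event is simply left out of the result.
cmds = fuse([
    ModalityEvent("speech", "delete", 1.00),
    ModalityEvent("gesture", "point:obj3", 1.18),
])
assert cmds == [("delete", "point:obj3")]
```

Real fusion engines handle far more (probabilistic scores, n-best lists, partial results), which is exactly why the article argues for systematic benchmarking rather than ad hoc testing.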


Nadia Elouali, Xavier Le Pallec, José Rouillard, Jean-Claude Tarby
2014 Proceedings of the extended abstracts of the 32nd annual ACM conference on Human factors in computing systems - CHI EA '14  
We introduce the M4L modeling language and the MIMIC framework, which aim to easily produce sensor-based multimodal mobile applications by generating up to 100% of their interfaces.  ...  We then present our model-based solution.  ...  It allows direct and graphical manipulation of the M4L's concepts to model multimodal mobile interfaces.  ... 
doi:10.1145/2559206.2581222 dblp:conf/chi/EloualiPRT14 fatcat:5oxdmnq4gbh5povlqleldv7hwq

Haptic-Enabled Multimodal Interface for the Planning of Hip Arthroplasty

N.G. Tsagarakis, J.O. Gray, D.G. Caldwell, C. Zannoni, M. Petrone, D. Testi, M. Viceconti
2006 IEEE Multimedia  
We developed the Multisense demonstrator on top of a multimodal application framework (MAF) [7] that supports multimodal visualization, interaction, and improved synchronization of multiple cues.  ...  Multimodal environments seek to create computational scenarios that fuse sensory data (sight, sound, touch, and perhaps smell) to form an advanced, realistic, and intuitive user interface.  ...  Interaction and synchronization model User interaction involves the I/O devices, views subsystem, and operation subsystem.  ... 
doi:10.1109/mmul.2006.55 fatcat:liguvutrirhe7oomw5vmewzqay

Multimodal Interfaces: A Survey of Principles, Models and Frameworks [chapter]

Bruno Dumas, Denis Lalanne, Sharon Oviatt
2009 Lecture Notes in Computer Science  
Modeling of multimodal interaction as well as tools allowing rapid creation of multimodal interfaces are then presented.  ...  The grand challenge of multimodal interface creation is to build reliable processing systems able to analyze and understand multiple communication means in real-time.  ...  conditional clauses and loops. • Extensible event definition mechanisms are also needed for communication between user interface objects and the interaction model. • Data Modeling should be carefully  ... 
doi:10.1007/978-3-642-00437-7_1 fatcat:2kpxjb4kqfcupkrxeexlvwi3su

A graphical editor for the SMUIML multimodal user interaction description language

Bruno Dumas, Beat Signer, Denis Lalanne
2014 Science of Computer Programming  
The presented graphical editor represents the third component of a triad of tools for the development of multimodal user interfaces, consisting of an XML-based modelling language, a framework for the authoring of multimodal interfaces and a graphical editor.  ...  The work on HephaisTK and SMUIML has been funded by the Hasler Foundation in the context of the MeModules project and by the Swiss National Center of Competence in Research on Interactive Multimodal Information  ... 
doi:10.1016/j.scico.2013.04.003 fatcat:gxgatkwbcbhxphlez7q2em2hru

Nucleome Browser: An integrative and multimodal data navigation platform for 4D Nucleome [article]

Xiaopeng Zhu, Yang Zhang, Yuchuan Wang, Dechao Tian, Andrew S. Belmont, Jason R. Swedlow, Jian Ma
2022 bioRxiv   pre-print
(e.g., genomics, imaging, 3D genome structure models, and single-cell data) and external data portals by a new adaptive communication mechanism.  ...  Nucleome Browser provides a scalable solution for integrating massive amounts of 4D Nucleome data to navigate multiscale nuclear structure and function in a wide range of biological contexts, enabling  ...  The authors are also grateful to members of the Ma lab for helpful discussions and comments on the manuscript.  ... 
doi:10.1101/2022.02.21.481225 fatcat:zgy4bbz6nzbmxflg2rgjdqdagy

Single application model, multiple synchronized views

R. Hosn, S.H. Maes, T.V. Raman
2001 IEEE International Conference on Multimedia and Expo, 2001. ICME 2001.  
A user interface is a means to an end: its primary goal is to capture user intent and communicate the results of the requested computation.  ...  We describe an application framework that enables tightly synchronized multimodal user interaction.  ...  The framework also leverages evolving XML-based industry standards for modeling application content, representing user interaction, as well as for communicating the results of user interaction among various  ... 
doi:10.1109/icme.2001.1237813 dblp:conf/icmcs/HosnMR01 fatcat:qmsrx76qlnbzhpwvgmvltb2kte
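The "single application model, multiple synchronized views" idea in this entry can be sketched in a few lines (again an illustrative toy, not the paper's framework; the login example and the renderer functions are invented): a modality-independent model holds the interaction state, and each modality derives its own concrete presentation from that one shared state, so the views can never drift apart.

```python
class LoginModel:
    """Modality-independent application state: *what* is asked, not *how*."""

    def __init__(self) -> None:
        self.fields = {"username": None, "password": None}

    def submit(self, field: str, value: str) -> bool:
        """Record one answered field; return True once the form is complete."""
        self.fields[field] = value
        return all(v is not None for v in self.fields.values())


def render_gui(model: LoginModel) -> list:
    # Visual presentation: one text box per still-unanswered field.
    return [f"[textbox:{name}]" for name, v in model.fields.items() if v is None]


def render_voice(model: LoginModel) -> list:
    # Spoken presentation of exactly the same abstract state.
    return [f"Please say your {name}." for name, v in model.fields.items() if v is None]


m = LoginModel()
m.submit("username", "alice")
# Both renderings now reflect the same remaining interaction state:
assert render_gui(m) == ["[textbox:password]"]
assert render_voice(m) == ["Please say your password."]
```

Because both renderers are pure functions of the model, answering a field in either modality (typing or speaking) updates every view in the same step, which is the essence of the tight synchronization the paper describes.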

Delivering Interactive Multimedia Services in Dynamic Pervasive Computing Environments

Cristian Hesselman, Pablo Cesar, Ishan Vaishnavi, Mathieu Boussard, Ralf Kernchen, Stefan Meissner, Antonietta Spedalieri, Albert Sinfreu, Christian Räck
2008 Proceedings of the First International Conference on Ambient Media and Systems  
The overall goal is to enhance the experience of mobile users by intelligently adapting the way a service is presented, in particular by adapting the way the user receives multimedia content from the service  ...  ., from gesture-based input to voice commands). These changes are triggered by changes in the user's context (e.g., when the user gets into his car).  ...  Future work includes a more detailed study of mechanisms for media synchronization, advanced multimodal interaction, and device sharing (e.g., when two users share a device such as a wall-mounted display  ... 
doi:10.4108/icst.ambisys2008.2908 dblp:conf/ambisys/HesselmanCVBKMS08 fatcat:zhpuukoz6zb2blesujar43lydy