20,795 Hits in 3.4 sec

A prototype user interface for a mobile electronic clinical note entry system

Atif Zafar
2005 AMIA Annual Symposium Proceedings  
Based on a review of the literature on mobile device usability 1-4, we built a prototype user interface for mobile EMRs and held focus groups with clinician users whose feedback provided useful insight  ...  Both groups felt that a multi-modal input paradigm would take practice to use and might dissuade some users. All felt that mobile devices for EMRs are preferable to comparable fixed workstations.  ... 
pmid:16779451 pmcid:PMC1560684 fatcat:pgzq34pfpja73i4lai7zewkxhq

Macaw: An Extensible Conversational Information Seeking Platform [article]

Hamed Zamani, Nick Craswell
2019 arXiv   pre-print
It can also integrate with a user interface, which allows user studies and data collection in an interactive mode, where the back end can be fully algorithmic or a wizard-of-oz setup.  ...  Macaw supports multi-turn, multi-modal, and mixed-initiative interactions, and enables research for tasks such as document retrieval, question answering, recommendation, and structured data exploration  ...  The user interacts with the interface, and the interface produces a Message object from the user's current interaction.  ... 
arXiv:1912.08904v1 fatcat:wfdzexyxbrbcnppxjfchbizcqe

An Expert System to Support the Design of Human-Computer Interfaces [chapter]

Cecilia Sosa Arias Peixoto, Tiago Cinto
2011 Expert Systems for Human, Materials and Automation  
The expert system was also used in the development of intelligent adaptive interfaces for a data mining tool, aiming to provide friendly and appropriate user interfaces for the person using the tool.  ...  In this context, one of the basic requirements is the development of interfaces with high usability that accommodate different modes of interaction depending on the users, environments, and tasks to be performed.  ...  Multi-modal interfaces A multi-modal interactive system is a system that relies on the use of multiple human communication channels.  ... 
doi:10.5772/18287 fatcat:6md6tnz63fgd3otki7njryblzi

BreastScreening: On the Use of Multi-Modality in Medical Imaging Diagnosis [article]

Francisco Maria Calisto, Nuno Jardim Nunes, Jacinto Carlos Nascimento
2020 arXiv   pre-print
This paper describes the field research, design and comparative deployment of a multimodal medical imaging user interface for breast screening.  ...  We summarize our work with recommendations from the radiologists for guiding the future design of medical imaging interfaces.  ...  In our design explorations, we sought to integrate several image modalities and visualization to support insight. User Interface The User Interface (UI) consists of two main components: 4.  ... 
arXiv:2004.03500v1 fatcat:eteu5g6tufchtlx7rczzxxc7mq

Discovery of Multi-perspective Declarative Process Models [chapter]

Stefan Schönig, Claudio Di Ciccio, Fabrizio M. Maggi, Jan Mendling
2016 Lecture Notes in Computer Science  
It comprises a web-based responsive user interface which allows process participants to choose and perform tasks and offers enterprise content management (ECM) functionality.  ...  The Declarative Process Intermediate Language (DPIL) is a declarative process modelling language that allows for specifying multi-perspective and multi-modal flexible processes.  ...  It is multi-modal, meaning that both mandatory and recommended actions can be specified.  ... 
doi:10.1007/978-3-319-46295-0_6 fatcat:3lcbacfr4jbcvjsbypea2ioh2y


Rafizah Mohd Hanifa, Maizam Alias, Ida Aryanie Bahrudin, Miswan Surip, Zuraida Ibrahim, Rosfuzah Roslan
2015 Jurnal Teknologi  
One approach that has shown great potential in enhancing social interaction skills among autistic children is the multi-modal mind games approach.  ...  Cases of autism, a developmental disorder that disconnects individuals from their environment and from other people, are on the rise, with a 30% increase reported in Malaysia from 2008 to 2011.  ...  of the multi-modal approach.  ... 
doi:10.11113/jt.v75.5049 fatcat:xblwvktivbbxjdsyjtzhizboiy

Best practices on personalization and adaptive interaction techniques in the scope of Smart Homes and Active Assisted Living

Nikolaos Liappas, José Gabriel Teriús-Padrón, Eduardo Machado, Mohammad Reza Loghmani, Rebeca Isabel García-Betances, Markus Vincze, Iván Carrillo Quero, María Fernanda Cabrera-Umpiérrez
2019 Zenodo  
Assistive systems and emerging technologies are capable of supporting individuals with specific needs and diseases effectively.  ...  The recommendations arise as a result of previous research studies conducted within the MSCA-ITN project ACROSSING.  ...  The recommendations are presented per use case with the relevant elements to consider based on the specific target group. Fig. 1. System architecture based on the multi-modal approach.  ... 
doi:10.5281/zenodo.3734168 fatcat:h5e626a4mfdtnludsja6zsts44

Managing Personal Communication Environments in Next Generation Service Platforms

Ralf Kernchen, Matthieu Boussard, Cristian Hesselman, Claudia Villalonga, Eric Clavier, Anna V. Zhdanova, Pablo Cesar
2007 2007 16th IST Mobile and Wireless Communications Summit  
A user's current access to mobile services is defined by the user's mobile terminal as the single entry point to an operator's network. This comes with a set of limitations.  ...  Index Terms: next generation service enablers, multimodal interfaces, multi-device environments, context-awareness,  ...  The MDCS may also stick to the same modality for a particular service, but move to a different set of devices to interact with the user.  ... 
doi:10.1109/istmwc.2007.4299298 fatcat:k2gskbtm7vbpfm3237peocqv7a

Memory Grounded Conversational Reasoning

Seungwhan Moon, Pararth Shah, Rajen Subba, Anuj Kumar
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): System Demonstrations  
We demonstrate a conversational system which engages the user through a multi-modal, multi-turn dialog over the user's memories.  ...  To implement such a system, we collect a new corpus of memory grounded conversations, which comprises human-to-human role-playing dialogs given synthetic memory graphs with simulated attributes.  ...  Memory Grounded Conversations User Interface The goal of the demonstrated system is to establish a natural user interface (UI) for interacting with memories.  ... 
doi:10.18653/v1/d19-3025 dblp:conf/emnlp/MoonSSK19 fatcat:4jej6zf7grf5xfw26ft7cfalrm

Multimodal interaction for data visualization

Bongshin Lee, Arjun Srinivasan, John Stasko, Melanie Tory, Vidya Setlur
2018 Proceedings of the 2018 International Conference on Advanced Visual Interfaces - AVI '18  
This workshop will bring together researchers with expertise in visualization, interaction design, and natural user interfaces.  ...  It can help people stay in the flow of their visual analysis and presentation, with the strengths of one interaction modality offsetting the weaknesses of others.  ...  The goal of this workshop is to bring together researchers with expertise in visualization, interaction design, and natural user interfaces.  ... 
doi:10.1145/3206505.3206602 dblp:conf/avi/LeeSSTS18 fatcat:tfgmhyyozzc3vle4mzg3oqd3w4

Why Do We Click: Visual Impression-aware News Recommendation [article]

Jiahao Xun, Shengyu Zhang, Zhou Zhao, Jieming Zhu, Qi Zhang, Jingjie Li, Xiuqiang He, Xiaofei He, Tat-Seng Chua, Fei Wu
2021 arXiv   pre-print
To accurately capture users' interests, we propose to model multi-modal features, in addition to the news titles that are widely used in existing works, for news recommendation.  ...  with visual-semantic modeling for news recommendation.  ...  Figure 1: An illustration of impression-aware news recommendation. (a) The interface that users are browsing.  ... 
arXiv:2109.12651v1 fatcat:pcjk6p7c4rbbrgc2hovl6zyfku

Experimental Evaluation of a Multi-modal User Interface for a Robotic Service [chapter]

Alessandro Di Nuovo, Ning Wang, Frank Broz, Tony Belpaeme, Ray Jones, Angelo Cangelosi
2016 Lecture Notes in Computer Science  
The MMUI system offers users two main modalities for sending commands: a GUI, usually running on the tablet attached to the robot, and a SUI, with a wearable microphone on the user.  ...  This paper reports the experimental evaluation of a Multi-Modal User Interface (MMUI) designed to enhance the user experience in terms of service usability and to increase the acceptability of an assistive robot.  ...  Multi-modal Interface for Elderly-Robot Interaction  ... 
doi:10.1007/978-3-319-40379-3_9 fatcat:rflse5mdnfhjtjvqgyq3mqpoym

Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry

Jai Vardhan, Girijesh Prasad
2015 International Journal of Computer Applications  
Towards this end, we have developed a multimodal human-computer interface (HCI) by combining an eye-tracker with a soft-switch, which may be considered as representing another modality.  ...  Therefore, by combining multi-sensory modalities, we can make the whole process more natural and ensure enhanced performance even for disabled users.  ...  RELATED WORK - MULTI-MODAL HCI SYSTEMS FOR TEXT ENTRY Using multi-modal inputs may make the user experience more natural.  ... 
doi:10.5120/ijca2015907194 fatcat:zo2krmpmfzarnpwwfljidcuxga

Using dialog and context in a speech-based interface for an information visualization environment

Kenneth Cox, Rebecca E. Grinter, David Mantilla
2000 Proceedings of the working conference on Advanced visual interfaces - AVI '00  
We describe a speech-based interface to an information visualization (infoVis) system. Users ask natural-language questions about a given data domain.  ...  Users can interact with these views via speech or direct manipulation. If users give incomplete information, our interface guides them in clarifying their questions.  ...  A second benefit of our approach is that we do not replace the GUI with a speech interface, but rather support multi-modal interaction where users can seamlessly move between the modes.  ... 
doi:10.1145/345513.345342 dblp:conf/avi/CoxGHJM00 fatcat:5c6pyut5cvbsdokh3uyckumdou

Speech interface dialog with smart glasses

Aryan Firouzian, Petri Pulli, Matus Pleva, Jozef Juhar, Stanislav Ondas
2017 2017 15th International Conference on Emerging eLearning Technologies and Applications (ICETA)  
This paper describes the design of an elderly-user-friendly multi-modal user interface with different modules.  ...  Senior citizens suffering from mild and moderate dementia are the primary target group of the proposed system.  ...  [2] designed a gaze and speech multi-modal interface to select objects of different colors in the environment.  ... 
doi:10.1109/iceta.2017.8102483 fatcat:2m6vvtlq5ngjxjtfq2gyxxm33e