An Input Widget Framework for Multi-Modal and Multi-Device Environments
Third IEEE Workshop on Software Technologies for Future Embedded and Ubiquitous Systems (SEUS'05)
We propose an input widget framework that provides high-level abstractions for heterogeneous input devices, which we call meta-inputs, for distributed multi-modal applications. ...
In such multi-modal environments, application programmers must take into account how to adapt heterogeneous input events to multi-modal services. ...
Implementation: Our framework is designed for distributed multi-modal environments, which span heterogeneous platforms and languages. ...
doi:10.1109/seus.2005.4
dblp:conf/seus/KobayashiTKHAN05
fatcat:kljqed73ibaqvpf73efysa5qdu
A Novel Dialog Model for the Design of Multimodal User Interfaces
[chapter]
2005
Lecture Notes in Computer Science
To avoid a separate design for each device or modality, it is almost essential to employ a model-based approach. ...
Variation among mobile devices with different capabilities and interaction modalities, as well as changing user context in nomadic applications, poses huge challenges to the design of user interfaces ...
For this approach, any available input or output device with its respective modalities can be used, which requires a framework to synchronize the interaction, as presented, e.g., in the W3C's Multimodal ...
doi:10.1007/11431879_13
fatcat:dfu3jm2ffffr3d6eus2wuww6f4
Managing Personal Communication Environments in Next Generation Service Platforms
2007
2007 16th IST Mobile and Wireless Communications Summit
This framework defines functional components to enable multi-device delivery of communication and media service sessions, user input interpretation, terminal management and resource discovery. ...
Index Terms: next generation service enablers, multimodal interfaces, multi-device environments, context-awareness, ...
all members of the SPICE project for their contributions. ...
doi:10.1109/istmwc.2007.4299298
fatcat:k2gskbtm7vbpfm3237peocqv7a
The Implementation of Service Enabling with Spoken Language of a Multi-modal System Ozone
[chapter]
2006
Lecture Notes in Computer Science
In this paper we describe the architecture and key issues of the service-enabling layer of Ozone, a multi-modal system oriented toward new technologies and services for emerging nomadic societies ...
As a large multi-modal system, Ozone consists of many functional modules; spoken language plays an important role in facilitating use of the system. ...
Summary: We discussed the architecture and key issues in the implementation of the Ozone project, a complex system that uses multi-modality to make machines easier for people to use. ...
doi:10.1007/11939993_65
fatcat:wscu7xwbkjdtnbwdlt53mzkhn4
DireWolf - Distributing and Migrating User Interfaces for Widget-Based Web Applications
[chapter]
2013
Lecture Notes in Computer Science
The work presented opens the way for creating distributed Web applications which can access device-specific functionalities such as multi-touch, text input, etc. in a federated and usable manner. ...
For a single user it provides more flexible control over different parts of an application by enabling the simultaneous use of smart phones, tablets and computers. ...
As a next step beyond the personal multi-device distributed computing environment, we will extend DireWolf to support multi-device multi-user collaboration. ...
doi:10.1007/978-3-642-39200-9_10
fatcat:5mx6lpxyqneapnysrd36ez5gpm
Interaction with stereoscopic data on and above multi-touch surfaces
2011
Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces - ITS '11
With the increasing distribution of multi-touch-capable devices, multi-touch interaction is becoming more and more ubiquitous. ...
This research will evaluate multi-touch and gestural 3D interaction on and above interactive surfaces and explore the design space of interaction with stereoscopic data. ...
Much work has been carried out on the definition of frameworks and taxonomies for such gesture-based multi-touch input. ...
doi:10.1145/2076354.2076428
dblp:conf/tabletop/Daiber11
fatcat:uuo2ua7m4nakpn5yctrszbfgla
Challenges in mobile multi-device ecosystems
2016
mUX The Journal of Mobile User Experience
We base our findings on literature research and an expert survey. Specifically, we present grounded challenges relevant for the design, development and use of mobile multi-device environments. ...
Personal and intimate mobile and wearable devices such as head-mounted displays, smartwatches, smartphones and tablets are rarely part of such multi-device ecosystems. ...
For example, we need better support for creating user interface widgets that can adapt themselves to the manifold input and output configurations, or awareness [49], in mobile multi-device environments ...
doi:10.1186/s13678-016-0007-y
fatcat:whsfd2wrrfg5tpq5qmwwxivcc4
Caring, Sharing Widgets: A Toolkit of Sensitive Widgets
[chapter]
2000
People and Computers XIV — Usability or Else!
This paper describes a toolkit of widgets that are capable of presenting themselves in multiple modalities, and further of adapting their presentation to suit the contexts and environments ...
Although most of us communicate using multiple sensory modalities in our lives, and many of our computers are similarly capable of multi-modal interaction, most human-computer interaction is predominantly ...
Each has a corresponding modality mapper in every widget and an output device. (Figure 1: Toolkit architecture.) ...
doi:10.1007/978-1-4471-0515-2_17
dblp:conf/bcshci/CreaseBG00
fatcat:il5jnbjns5gvhafgqtplm545oi
IMMIView: a multi-user solution for design review in real-time
2009
Journal of Real-Time Image Processing
IMMIView is an interactive system that relies on multiple modalities and multi-user interaction to support collaborative design review. ...
We present a multi-modal fusion system developed to support multi-modal commands in a collaborative, co-located environment, i.e., with two or more users interacting at the same time on the same system ...
It also provides a simple Widget Manager for immersive environments. ...
doi:10.1007/s11554-009-0141-1
fatcat:hnh4a7xsj5hife6vo7l677v3ni
A Novel Design Approach for Multi-device Adaptable User Interfaces: Concepts, Methods and Examples
[chapter]
2011
Lecture Notes in Computer Science
To this end, this paper proposes a new integrative approach to multi-device user interface development for achieving device-independence by design and further pursuing improved levels of user experience for all, through adaptive presentational models for various devices and contexts of use. ...
Part of this work has been carried out in the framework of the European Commission co-funded AAL project REMOTE ("Remote health and social care for independent living of isolated elderly with chronic ...
doi:10.1007/978-3-642-21672-5_44
fatcat:jqpr7tudxrc4vixi25zfxpqysi
Multi-Touch interactions for control and display in interactive cockpits
2014
Proceedings of the International Conference on Human-Computer Interaction in Aerospace - HCI-Aero '14
Such interfaces and associated interaction techniques have demonstrated benefits, partly because the output device integrates input management, thus bridging the (classical) gap between input ...
This paper proposes a notation and its associated tool for describing multi-touch interactions in a complete and unambiguous way, thus mainly targeting reliability. ...
The paper has provided a thorough design framework for fusion and fission engines in the area of multi-touch interactions, addressing input management and output rendering on an equal basis. ...
doi:10.1145/2669592.2669650
fatcat:bwwnu6izkjdvnnedmuvhitknye
Medical Applications of Multi-Field Volume Rendering and VR Techniques
[article]
2004
EUROVIS 2005: Eurographics / IEEE VGTC Symposium on Visualization
This paper reports on a new approach for visualizing multi-field MRI or CT datasets in an immersive environment with medical applications. ...
The classification step is done at the desktop, taking advantage of the 2D mouse as a high accuracy input device. ...
Acknowledgments This work was funded in part by the Department of Energy VIEWS program, the DOE Computation Science Fellowship program, and the collaborative research centers (SFB) 374 and 382 of the German ...
doi:10.2312/vissym/vissym04/249-254
fatcat:yeyq3nenkvai3glk3eys6dqgqm
Enabling multi-user interaction in large high-resolution distributed environments
2011
Future Generation Computer Systems
widgets. ...
While the streaming middleware to enable this kind of work exists and the optical networking infrastructure is becoming more widely available, enabling multi-user interaction in such environments is still ...
systems, collaboration software for use on multi-gigabit networks, and advanced networking infrastructure. ...
doi:10.1016/j.future.2010.11.018
fatcat:sc6xeoolszhutkbsr3vxtzl5ku
vrLib: A Designer Oriented Interaction and 3D User Interface Library
[article]
2007
International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments
This project provides well-known and new interaction techniques and ready-to-use widgets, which are totally hardware-independent. ...
In this paper we propose vrLib, a toolkit which aims to make 3D graphical user interfaces and interaction very easy to design in virtual environments. ...
We are currently working on improving vrLib by adding multi-modal and multi-user interaction, and automatic generation of user interface from an XML document. ...
doi:10.2312/pe/ve2007short/037-038
dblp:conf/egve/SternbergerBB07
fatcat:kfozweirpjgrboq6cadcpdwxjq