A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2012.
Lecture Notes in Computer Science
Runtime models enable the implementation of highly adaptive applications but also require a rethinking of the way we approach models. Metamodels of runtime models must be supplemented with additional runtime concepts that influence how runtime models are built and reflected in the underlying runtime architectures. The goal of this work is to generalize common concepts found in different approaches utilizing runtime models and to provide a basis for their […]. After analyzing recent works dealing with runtime models, we present a metamodeling process for runtime models. Based on a meta-metamodel, it guides the creation of metamodels combining design-time and runtime concepts.
doi:10.1007/978-3-642-21210-9_21
The creation of user interfaces usually involves various people in different roles and several tools that are designed to support each specific role. In this paper we propose a tool for rapid prototyping that allows all parties involved to directly interact with the system under development. The tool is based on task tree development and integrates the system designer, the user interface designer, the usability expert, and the user interface developer in a common process. The final system is derived from two sources: the task model specified by the system architect and the final user interface specified by the user interface developer and designer. Aggregating the runtime system and the design tools into one complete integrated system is our approach to bridging the gap between the user interface designer working on system mock-ups and the actual developers implementing the system.
dblp:conf/gi/FeuerstackBA06
Offering user interfaces for interactive applications that are flexible enough to be adapted to various context-of-use scenarios, such as supporting different display sizes or addressing various input styles, requires an adaptive layout. We describe an approach for layout derivation that is embedded in a model-based user interface generation process. Through an interactive and tool-supported process we can efficiently create a layout model that is composed of interpretations of the other design models and is consistent with the application design. By shifting the decision about which interpretations are relevant for a specific context-of-use scenario from design time to runtime, we can flexibly adapt the layout to consider new device capabilities, user demands, and user interface distributions. We present our runtime environment, which evaluates the relevant layout-model information to constraints as they are required and reassembles the user interface parts according to the updated containment, order, orientation, and size information of the layout model. Finally, we present results of an evaluation we performed to test the design-time and runtime efficiency of our model-based layouting approach.
doi:10.1145/1385569.1385605 dblp:conf/avi/FeuerstackBSA08
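The core idea of the abstract above, evaluating a layout model's interpretations against the current context of use at runtime, can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation; all names (`Context`, `LayoutInterpretation`, `derive_layout`, the guard conditions) are hypothetical.

```python
# Sketch: a layout model whose "interpretations" of the design models are
# turned into concrete layout constraints at runtime, depending on the
# context of use (here reduced to the available screen width).
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Context:
    screen_width: int  # pixels available on the current device

@dataclass
class LayoutInterpretation:
    """One possible reading of the design models, guarded by a context condition."""
    name: str
    applies: Callable[[Context], bool]
    orientation: str          # "horizontal" or "vertical"
    order: List[str]          # containment/order of UI parts

def derive_layout(interpretations, context):
    """Pick the first interpretation whose guard holds and emit its constraints."""
    for interp in interpretations:
        if interp.applies(context):
            return {"orientation": interp.orientation, "order": interp.order}
    raise ValueError("no interpretation matches the current context")

interpretations = [
    LayoutInterpretation("wide", lambda c: c.screen_width >= 800,
                         "horizontal", ["menu", "content", "details"]),
    LayoutInterpretation("narrow", lambda c: c.screen_width < 800,
                         "vertical", ["content", "menu"]),
]

print(derive_layout(interpretations, Context(screen_width=1024)))
# a phone-sized context instead selects the "narrow", vertical interpretation
```

Because the guards are evaluated on demand rather than baked in at design time, a new device capability only requires adding an interpretation, not regenerating the user interface.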
Lecture Notes in Computer Science
In this paper we analyse the requirements and challenges that ambient assisted living and smart environments pose on interactive systems. We present a framework for the provisioning of user interfaces for such environments. The framework incorporates model-based user interface development technologies to create a runtime system that manages interaction resources and context information to adapt interaction. This approach allows the creation of adaptive and multimodal interactive ambient assisted living applications.
Keywords: smart environments, multimodal interaction, model-based user interface development, ambient assisted living, multi-access service platform.
doi:10.1007/978-3-642-02710-9_18
Smart environments utilize computers as tools supporting users in their daily lives, moving interaction with computers from a single system to a complex, distributed environment. User interfaces available in this environment need to adapt to the specifics of the various available devices and are distributed across several devices at the same time. A problem arising with distributed user interfaces is the required synchronization of the different parts. In this paper we present an approach to the event-based synchronization of distributed user interfaces based on a multilevel user interface model. We also describe a runtime system we created that allows the execution of model-based user interface descriptions and the distribution of user interfaces across various devices and modalities, using channels established between the system and the end devices.
dblp:conf/models/BlumendorfFA06
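The event-based synchronization described above can be pictured as a shared model element to which each end device attaches a channel: a change arriving from one device is propagated to all others. The following is a minimal sketch under that reading, with hypothetical names; it is not the runtime system from the paper.

```python
# Sketch: event-based synchronization of distributed UI parts through
# channels registered on a shared model element.
class ModelElement:
    def __init__(self, value=None):
        self.value = value
        self.channels = []          # one callback per connected device

    def connect(self, channel):
        self.channels.append(channel)

    def update(self, value, source=None):
        """Apply a change and notify every connected device except its origin."""
        self.value = value
        for channel in self.channels:
            if channel is not source:
                channel(value)

received = []
phone = lambda v: received.append(("phone", v))
tv = lambda v: received.append(("tv", v))

field = ModelElement("")
field.connect(phone)
field.connect(tv)

field.update("hello", source=phone)   # text typed on the phone
print(received)                        # only the TV channel is notified
```

Excluding the originating channel avoids the echo loop that would otherwise re-apply each update on the device it came from.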
In this paper we define a notion to describe consistency within and between models, which has been identified as an important issue when using model-based tools. We introduce the abstract syntax of models as attributed typed graphs and define a formalism of consistency based on this formal description. The application of the formalism is illustrated by an example.
Author Keywords: Model-Driven Engineering, Abstract Syntax, Consistency
ACM Classification Keywords: D.2.4 Software/Program Verification: Formal methods
General Terms: Theory
doi:10.1145/1996461.1996498 dblp:conf/eics/TrollmannBSA11
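To make "attributed typed graphs" concrete: a model is a graph whose nodes and edges carry types drawn from a type graph (the metamodel), and one basic consistency condition is that every element's type exists and every edge connects nodes of the declared source and target types. The toy check below illustrates only this conformance aspect; the paper's formalism is considerably richer, and all names here are invented for illustration.

```python
# Toy rendering of models as typed graphs plus a type-conformance check.
type_graph = {
    "nodes": {"Task", "UIElement"},
    # edge type -> (required source node type, required target node type)
    "edges": {"renders": ("UIElement", "Task")},
}

model = {
    "nodes": {"t1": "Task", "b1": "UIElement"},
    "edges": [("b1", "renders", "t1")],
}

def conforms(model, type_graph):
    """True iff every node/edge type exists and edge endpoints match the type graph."""
    for node, ntype in model["nodes"].items():
        if ntype not in type_graph["nodes"]:
            return False
    for src, etype, tgt in model["edges"]:
        if etype not in type_graph["edges"]:
            return False
        src_t, tgt_t = type_graph["edges"][etype]
        if model["nodes"][src] != src_t or model["nodes"][tgt] != tgt_t:
            return False
    return True

print(conforms(model, type_graph))  # True
```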
Smart environments bring together multiple users, (interaction) resources, and services. This creates complex and unpredictable interactive computing environments that are hard to understand. Users thus have difficulties building up a mental model of such interactive systems. To address this issue, users need possibilities to evaluate the state of these systems and to adapt them according to their needs. In this work we describe the requirements and functionalities for evaluating and adapting interactive spaces in smart environments from the system and the user perspective. Furthermore, we present a model-based implementation of these capabilities, which is accessible to the user in the form of a meta user interface.
doi:10.1145/1502650.1502725 dblp:conf/iui/RoscherBA09
Enriched with more and more intelligent devices, modern homes rapidly transform into smart environments. Their growing capabilities enable the implementation of a new generation of ubiquitous applications, but also raise the complexity of the development. Developers of applications for smart environments must cope with a multitude of sensors, devices, users, and thus contexts. We present a model-based approach for modeling of, reasoning about, and controlling smart environments. A context model provides adaptive applications with a unified access to the smart home environment and, through a unique approach of utilizing executable models, also reflects its state at runtime. The presented approach supports runtime user interface adaptation and reconfiguration for seamless interaction and has been successfully utilized to build several context-adaptive applications running in our smart home testbed.
Keywords: smart environments; context models; executable models; ambient intelligence
doi:10.1109/percomw.2010.5470513 dblp:conf/percom/LehmannRBA10
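A context model that both offers unified access and reflects runtime state can be sketched as a small observable key-value store: sensor adapters write into it, applications query it and subscribe to changes. This is a hedged illustration of the idea only; the class and key names are hypothetical, not the paper's API.

```python
# Sketch: a context model as the single point of access to the environment,
# kept in sync at runtime by sensor updates and observable by applications.
class ContextModel:
    def __init__(self):
        self._state = {}
        self._subscribers = []

    def update(self, key, value):
        """Called by sensor adapters; keeps the model in sync with the home."""
        self._state[key] = value
        for callback in self._subscribers:
            callback(key, value)

    def query(self, key):
        """Unified read access for adaptive applications."""
        return self._state.get(key)

    def subscribe(self, callback):
        """Lets applications react to context changes, e.g. to adapt their UI."""
        self._subscribers.append(callback)

ctx = ContextModel()
events = []
ctx.subscribe(lambda k, v: events.append((k, v)))
ctx.update("livingroom.temperature", 21.5)
ctx.update("user.location", "kitchen")
print(ctx.query("user.location"))  # "kitchen"
```

An application never talks to a concrete sensor; it only sees the model, which is what makes the access "unified" across heterogeneous devices.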
Developing Ambient Intelligence
Marco Blumendorf, DAI-Labor, Technische Universität Berlin. Marco Blumendorf is a PhD student at the DAI-Labor of the Technical University of Berlin. […]
doi:10.1007/978-2-287-47610-5_1
With the increasing importance of computers in all areas of life, new and innovative interaction concepts gain importance as current windows, icons, menus, and pointing concepts are rendered unusable. Well-known graphical user interfaces currently move towards enhanced and multimodal interaction capabilities. In this paper we describe our approach to support this transition by extending graphical interfaces with multimodal interaction capabilities. Major aspects we focus on are the conveyance of the usable modalities as well as the fluent transitions between different modality combinations when the interaction context changes.
doi:10.14236/ewic/create2010.34
Lecture Notes in Computer Science
Model-based user interface development is grounded on the idea of utilizing models at design time to derive user interfaces from the modeled information. There is, however, an increasing demand for user interfaces that adapt to the context of use at runtime. The shift from design time to runtime means that different design decisions are postponed until runtime. Utilizing user interface models at runtime provides a possibility to use the same basis of information for these postponed decisions. The approach we are following goes even one step further: instead of only postponing several design decisions, we aim at the utilization of stateful and executable models at runtime to completely express the user interaction and the user interface logic in a model-based way.
doi:10.1007/978-3-540-70569-7_22
Runtime Architecture: To provide the required models, we utilize a model-based runtime system for ubiquitous UIs, called the Multi-Access Service Platform (MASP, available at http://masp.dai-labor.de) (Blumendorf …).
doi:10.14236/ewic/hci2011.83
Lecture Notes in Computer Science
In this demonstration we present the Multi-Access Service Platform (MASP), a model-based runtime architecture for user interface development based on the idea of dynamic executable models. Such models are self-contained and complete, as they contain the static structure, the dynamic state information, as well as the execution logic. Utilizing dynamic executable models allows us to implement a rapid prototyping approach and provide mechanisms for the extension of the UI modeling language of the MASP.
doi:10.1007/978-3-540-70569-7_29
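One way to read "static structure + dynamic state + execution logic" bundled in a single model object is a tiny task model that knows its tasks, tracks which are enabled or done, and encodes how completing one task enables the next. This is a deliberately minimal sketch of the concept, not the MASP modeling language; all identifiers are invented.

```python
# Sketch: an executable task model bundling structure, state, and logic.
class ExecutableTaskModel:
    def __init__(self, tasks):
        self.tasks = tasks                             # static structure: ordered tasks
        self.state = {t: "waiting" for t in tasks}     # dynamic state per task
        self.state[tasks[0]] = "enabled"

    def complete(self, task):
        """Execution logic: finishing an enabled task enables its successor."""
        if self.state.get(task) != "enabled":
            raise ValueError(f"{task} is not enabled")
        self.state[task] = "done"
        idx = self.tasks.index(task)
        if idx + 1 < len(self.tasks):
            self.state[self.tasks[idx + 1]] = "enabled"

m = ExecutableTaskModel(["login", "select item", "checkout"])
m.complete("login")
print(m.state)  # "login" is done, "select item" is now enabled
```

Because state and logic live inside the model, inspecting `m.state` at any moment is equivalent to inspecting the running interaction, which is what makes such a model usable at runtime rather than only as a design artifact.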
The ongoing utilization of computer technologies in all areas of life leads to the development of smart environments comprising numerous networked devices and resources. Interacting in and with such environments requires new interaction paradigms, abstracting from single interaction devices to utilize the environment as an interaction space. Using a networked set of interaction resources allows supporting multiple modalities and new interaction techniques, but also requires the consideration of the varying set of devices and the adaptation to this set at runtime. While the generation of user interfaces based on UI models, although still challenging, has been widely researched, the runtime processing and delivery of the derivable user interfaces has gained less attention. Delivering distributed user interfaces while maintaining their interdependencies and keeping them synchronized is not a trivial problem. In this paper we present an approach to realizing a runtime environment capable of distributing user interfaces to a varying set of devices to support multimodal interaction, based on a user interface model and the management of interaction resources.
doi:10.1007/978-3-540-85379-4_14
The growing spread of computers into all areas of life poses new challenges for scientists and programmers across the various fields of computer science. Networked devices form intelligent environments that integrate a wide variety of devices, sensors, and actuators, gradually initiating a paradigm shift towards ubiquitous computing. As computer technology increasingly pervades our daily lives, so does the need to make the growing complexity manageable through novel user interfaces on the one hand, and to hide it from the user on the other. This work coins the term Ubiquitous User Interface to denote interfaces that allow a multitude of users to interact with a set of services in changing situations, using different devices and multiple modalities. The development and provisioning of such user interfaces places new demands on both design time and runtime. The use of models and modeling technologies is a promising way to master the rising complexity of software. This work describes a model-based approach that combines executable user interface models with a runtime architecture in order to address the growing complexity of user interfaces. Executable models identify the common building blocks of dynamic, self-contained models that combine design-time and runtime aspects. Bridging the gap between design time and runtime within a single model makes design information available for runtime decisions and enables reasoning about the semantics of interaction and presentation. Based on the concept of executable models, a set of metamodels is introduced that takes up design aspects of current user interface description languages and adds runtime aspects such as state information [...]
doi:10.14279/depositonce-2229