
A generic framework for executable gestural interaction models

Romuald Deshayes, Tom Mens, Philippe Palanque
2013 IEEE Symposium on Visual Languages and Human Centric Computing  
An alternative for executable specification of gestural interaction is the use of heterogeneous modeling.  ...  Integrating new input devices and their associated interaction techniques into interactive applications has always been  ... 
doi:10.1109/vlhcc.2013.6645240 dblp:conf/vl/DeshayesPM13 fatcat:iawtjkt5czfqhfmckgjhdmupsa

PetriNect: A tool for executable modeling of gestural interaction

Romuald Deshayes, Tom Mens, Philippe Palanque
2013 IEEE Symposium on Visual Languages and Human Centric Computing  
Mouse and keyboard have been the main devices for human-computer interaction for a few decades.  ...  A recent trend consists of controlling the computer using natural interaction such as unconstrained human gestures.  ... 
doi:10.1109/vlhcc.2013.6645266 dblp:conf/vl/DeshayesMP13 fatcat:ojd7yzjgt5aplnp37vdh5rkaqa

Executable Models for Human-Computer Interaction [chapter]

Marco Blumendorf, Grzegorz Lehmann, Sebastian Feuerstack, Sahin Albayrak
Lecture Notes in Computer Science  
Instead of only postponing several design decisions, we aim at the utilization of stateful and executable models at runtime to completely express the user interaction and the user interface logic in a  ...  Utilizing user interface models at runtime provides the possibility of using the same basis of information for these postponed decisions. The approach we are following goes even one step further.  ... 
doi:10.1007/978-3-540-70569-7_22 fatcat:a3tgtnzcanaqpcsavruzjt4kyq

From Human-Computer Interaction to Human-Robot Social Interaction [article]

Tarek Toumi, Abdelmadjid Zidani
2014 arXiv   pre-print
In this paper we propose to introduce works in both human-robot interaction and human-computer interaction and to make a bridge between them, i.e. to integrate emotions and capabilities concepts of the robot in the human-computer model to become adequate for human-robot interaction, and to discuss challenges related to the proposed model.  ... 
arXiv:1412.1251v1 fatcat:afci7wrtyjb47f5gxe4w6wfgzi

Towards a Grid Applicable Parallel Architecture Machine [chapter]

Karolj Skala, Zorislav Sojat
2004 Lecture Notes in Computer Science  
Based on this, interactive deformalized (i.e. natural-language-like) human-Grid interaction languages should be developed, enabling parallel programming to be done by inherently parallel model description  ...  This shall be attained by a Virtual Machine with inherent parallel execution, executing compiled and interpreted computer languages on a very complex p-code level.  ...  The multiplicity of the understanding and execution possibilities of such user programs is essential for the attainment of better human-machine interaction principles.  ... 
doi:10.1007/978-3-540-24688-6_18 fatcat:zex2ifkijffjtntrytga6aguey

Designing a human computer interface system based on cognitive model

M. Mayilvaganan, D. Kalpanadevi
2014 IEEE International Conference on Computational Intelligence and Computing Research  
The aim of this research is to focus on a Human Computer Interface (HCI) system designed in terms of interface style, interaction techniques, and tasks based on a cognitive model.  ...  It can provide an idea for developing a software tool to evaluate human knowledge and behaviour from the formation of classified usability metrics.  ...  Figure 3 shows the task analysis of the cognitive model based on human-computer interaction.  ... 
doi:10.1109/iccic.2014.7238347 fatcat:bmvta4zgwnflhfvu33caacusoa

Turning points in interaction with computers

F. E. Allen
1999 IBM Systems Journal  
In the first topic, programming languages, the power of modeling and simulation via executable models is seen as an old idea whose time is yet to come.  ...  Discovering essays like Branscomb's was one of the many delightful rewards of looking in the Systems Journal for papers representing turning points in interaction with computers.  ...  The development of interfaces by which humans and computers interact has brought about some of the most significant turning points in computing in the last 38 years.  ... 
doi:10.1147/sj.382.0135 fatcat:ul3jhb3un5bhbgwszlw2rsgb5u

Context-Awareness and Mobile HCI: Implications, Challenges and Opportunities [chapter]

Xiangang Qin, Chee-Wee Tan, Torkil Clemmensen
2017 Lecture Notes in Computer Science  
For this reason, Mobile Context-Awareness (MCA) and its implications for context-driven service innovations have been acknowledged as a promising future in Human Computer Interaction (HCI) [11].  ...  between humans and computers by giving the latter a more active role to play.  ...  Human-computer interaction can only be understood within a wider context, and any HCI model needs to provide an appropriate conceptual basis for studies of computer use in its cultural, organizational and  ... 
doi:10.1007/978-3-319-58481-2_10 fatcat:rf4q2pstfjbwdjqbvvrreee3qe

A Framework for a Priori Evaluation of Multimodal User Interfaces Supporting Cooperation [chapter]

Magnus Larsson, Gilles Coppin, Franck Poirier, Olivier Grisvard
2013 Human–Computer Interaction Series  
It is rather an infrastructure for multimodal human-computer interaction and cooperation. However, are we as designers equipped to meet the rapid evolution within the computer industry?  ...  The understanding that human interaction with computer systems and other humans varies depending on whether the actor/role is acting alone or in cooperation with other humans or computers, as well as on the  ... 
doi:10.1007/978-1-4471-5499-0_7 dblp:series/hci/LarssonCPG13 fatcat:ywfsdeuoljgrjjvvkg5tjsjdge

An Overview of the EPIC Architecture for Cognition and Performance With Application to Human-Computer Interaction

David E. Kieras, David E. Meyer
1997 Human-Computer Interaction  
His primary research field is computational cognitive modeling, with specific interests in human-computer interaction and human performance simulation.  ...  The EPIC (Executive Process-Interactive Control) cognitive architecture developed by Kieras and Meyer (1997) works especially well for modeling perceptual-motor intensive tasks, multiple tasks, and complex  ... 
doi:10.1207/s15327051hci1204_4 fatcat:oi3nmy2eijhxjcimzpuyihbx54

Action-Oriented Programming Model: Collective Executions and Interactions in the Fog

Niko Mäkitalo, Timo Aaltonen, Mikko Raatikainen, Aleksandr Ometov, Sergey Andreev, Yevgeni Koucheryavy, Tommi Mikkonen
2019 Journal of Systems and Software  
Highlights • The paper introduces six qualities for human-centric Fog Computing • The qualities help improve how humans experience multi-device computing • Based on the qualities, we redesign our Action-Oriented  ...  In contrast, Fog Computing approaches, where devices communicate and orchestrate their operations collectively and closer to the origin of data, lack adequate tools for programming secure interactions  ...  Hence, we contribute the Action-Oriented Programming model for the purposes of coordinating interactions between machines to augment humans.  ... 
doi:10.1016/j.jss.2019.110391 fatcat:6zzdwhk76rhwhnkvjmhkdw5b24

High-Level Modeling of Software-Management Interactions and Tasks for Autonomic Computing

Edin Arnautovic, Hermann Kaindl, J. Falb, Roman Popp
2008 Fourth International Conference on Autonomic and Autonomous Systems (ICAS'08)  
Such discourse models are based on insights from theories of human communication. This should make them "natural" for humans to define and understand.  ...  Our well-defined models of interactions and tasks as well as their operationalization should facilitate their execution and automation.  ...  Our work relates both to the field of interaction modeling (between humans and computers as well as between computers) and to the field of autonomic and self-managed software systems.  ... 
doi:10.1109/icas.2008.46 dblp:conf/icas/ArnautovicKFP08 fatcat:rh2hn5smmbd7zoeny2j3ugoh5q

Are AI Tools Going to Be the New Designers? A Taxonomy for Measuring the Level of Automation of Design Activities

S. Altavilla, E. Blanco
2020 Proceedings of the Design Society: DESIGN Conference  
Advancement in AI allows thinking about new Designer-AI tool interactions in the design process.  ...  The paper is based on the literature on the concept of Levels of Automation in cognitive engineering, manufacturing and robotics, and proposes a grid of characterisation of the Level of Automation for  ...  Level 3 (across the grid's columns): Designer (Direction) / Computer (Execution). The tool generates  ... 
doi:10.1017/dsd.2020.286 fatcat:johgfkm3sbejhj4a5qwclwuire

Introduction to human-robot interaction

Jean Scholtz, Holly A. Yanco, Jill L. Drury
2006 Proceedings of the 11th international conference on Intelligent user interfaces - IUI '06  
6. Computer allows the human limited time to veto before automatic execution. 7. Computer executes automatically, then necessarily informs the human. 8. Computer informs the human after automatic execution only if the human asks. 9. Computer informs the human after execution only if it decides to.  ...  of Efficient Human Robot Interaction [Goodrich 2003]  ...  (a short sketch of these levels appears after this entry)
doi:10.1145/1111449.1111459 dblp:conf/iui/ScholtzYD06 fatcat:zeurdexrcnhcfkyiscj67phnle
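
As a reading aid only (not from the paper), here is a minimal sketch that encodes the four automation levels quoted in the excerpt above as a Python IntEnum; the class and member names are hypothetical, and the numeric values follow the Sheridan-style scale the excerpt implies.

    from enum import IntEnum

    class AutomationLevel(IntEnum):
        # Hypothetical encoding of the automation levels quoted in the excerpt
        # (a Sheridan-style scale of who executes and who gets informed).
        VETO_WINDOW_BEFORE_EXECUTION = 6  # human has limited time to veto before automatic execution
        EXECUTE_THEN_ALWAYS_INFORM = 7    # computer executes, then necessarily informs the human
        INFORM_ONLY_IF_ASKED = 8          # after execution, informs the human only if asked
        INFORM_IF_IT_DECIDES_TO = 9       # after execution, informs the human only if it decides to

    # Higher values mean less human involvement after execution.
    assert AutomationLevel.INFORM_IF_IT_DECIDES_TO > AutomationLevel.EXECUTE_THEN_ALWAYS_INFORM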

EXE-SPEM: Towards Cloud-based Executable Software Process Models

Sami Alajrami, Barbara Gallina, Alexander Romanovsky
2016 Proceedings of the 4th International Conference on Model-Driven Engineering and Software Development  
EXE-SPEM is our extension of the Software and Systems Process Engineering (SPEM2.0) Meta-model to support creating cloud-based executable software process models.  ...  Since SPEM2.0 is a visual modelling language, we introduce an XML notation meta-model and mapping rules from EXE-SPEM to this notation which can be executed in a workflow engine.  ... 
doi:10.5220/0005740605170526 dblp:conf/modelsward/AlajramiGR16 fatcat:iucjn5qxr5acxfpywhcjo6tcfy
Showing results 1-15 of 434,190