
Integrating context-free and context-dependent attentional mechanisms for gestural object reference

Gunther Heidemann, Robert Rae, Holger Bekel, Ingo Bax, Helge Ritter
2004 Machine Vision and Applications  
To evaluate hand movements for pointing gestures and to recognise object references, an approach to integrating bottom-up generated feature maps and top-down propagated recognition results is introduced  ...  Modules for context-free focus of attention work in parallel with the hand gesture recognition. In contrast to other approaches, the fusion of the two branches is on the sub-symbolic level.  ...  This work was supported within the project VAMPIRE (Visual Active Memory Processes and Interactive REtrieval), which is part of the IST programme (IST-2001-34401).  ...
doi:10.1007/s00138-004-0157-2 fatcat:hiebvzudzrclzlglksrufu5tku

Integrating Context-Free and Context-Dependent Attentional Mechanisms for Gestural Object Reference [chapter]

Gunther Heidemann, Robert Rae, Holger Bekel, Ingo Bax, Helge Ritter
2003 Lecture Notes in Computer Science  
To evaluate hand movements for pointing gestures and to recognise object references, an approach to integrating bottom-up generated feature maps and top-down propagated recognition results is introduced  ...  Modules for context-free focus of attention work in parallel with the hand gesture recognition. In contrast to other approaches, the fusion of the two branches is on the sub-symbolic level.  ...  This work was supported within the project VAMPIRE (Visual Active Memory Processes and Interactive REtrieval), which is part of the IST programme (IST-2001-34401).  ...
doi:10.1007/3-540-36592-3_3 fatcat:vqemcvw5ere4xmvitcxex73nwy

Ecological Interfaces: Extending the Pointing Paradigm by Visual Context [chapter]

Antonella De Angeli, Laurent Romary, Frederic Wolff
1999 Lecture Notes in Computer Science  
The proposal emphasises the role of the visual context on gestural communication. It is aimed at extending the concept of affordances to explain referring gesture variability.  ...  A discussion of practical implications of our findings for software architecture design is presented.  ...  Visual attention is a fundamental precondition for gestural communication.  ... 
doi:10.1007/3-540-48315-2_8 fatcat:eutz4wq5tbhbjdgudobg3qm5tu

Natural deictic communication with humanoid robots

Osamu Sugiyama, Takayuki Kanda, Michita Imai, Hiroshi Ishiguro, Norihiro Hagita
2007 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems  
person recognizes the pointing gesture and pays attention to the indicated object.  ...  Object Indication: This is the process of generating a robot's attention-drawing behavior with pointing and a reference term in order to confirm the recognized object with the interacting person.  ...  ACKNOWLEDGEMENT This research was supported by the Ministry of Internal Affairs and Communications of Japan.  ...
doi:10.1109/iros.2007.4399120 dblp:conf/iros/SugiyamaKIIH07 fatcat:oukzfgnwlbazvmiwz6kkaz5yhi

Visual perception, language and gesture: A model for their understanding in multimodal dialogue systems

Frédéric Landragin
2006 Signal Processing  
The way we see the objects around us determines the speech and gestures we use to refer to them. The gestures we produce structure our visual perception.  ...  This model may be useful for any kind of human-machine dialogue system that focuses on deep comprehension. We show how a referring act takes place within a contextual subset of objects.  ...  For other objects, familiarity depends on the user.  ...
doi:10.1016/j.sigpro.2006.02.046 fatcat:3yygaktsgfhqxirdrqw3a5xzsq

Machine perception of real-time multimodal natural dialogue [chapter]

Kris Thórisson
2002 Advances in Consciousness Research  
(2) Hands: Deictic gesture - pointing at objects, and iconic gesture illustrating tilting (in the context of 3-D viewpoint shifts)  ...  Ymir already provides the framework for perceiving facial gesture in context, and the full range of manual gesture.  ...
doi:10.1075/aicr.35.11tho fatcat:ah4yrysvlbfsvdmzloljbplncu

Page 2107 of Psychological Abstracts Vol. 86, Issue 6 [page]

1999 Psychological Abstracts  
Gesture and the semantic category expressed by co-occurring but unheard speech (e.g., verbal reference to objects, actions) were counterbalanced in the events presented.  ...  Attention-directing events involved 3 types of gesture: display, demonstration, and pointing.  ...

Toward Natural Gesture/Speech Control of a Large Display [chapter]

Sanshzar Kettebekov, Rajeev Sharma
2001 Lecture Notes in Computer Science  
We discuss the evolution of a computational framework for gesture and speech integration that was used to develop an interactive testbed (iMAP).  ...  The user studies conducted illustrate the significance of accounting for the temporal alignment of gesture and speech parts in semantic mapping.  ...  Acknowledgment The financial support of the National Science Foundation (CAREER Grant IIS-00-81935 and Grant IRI-96-34618) and U. S. Army Research Laboratory (under Cooperative Agreement No.  ...
doi:10.1007/3-540-45348-2_20 fatcat:ayvjisb5lnfxvjtj6kuemiiamy

Toward Natural Gesture/Speech Control of a Large Display [article]

S. Kettebekov, R. Sharma
2001 arXiv   pre-print
We discuss the evolution of a computational framework for gesture and speech integration that was used to develop an interactive testbed (iMAP).  ...  The user studies conducted illustrate the significance of accounting for the temporal alignment of gesture and speech parts in semantic mapping.  ...  Acknowledgment The financial support of the National Science Foundation (CAREER Grant IIS-00-81935 and Grant IRI-96-34618) and U. S. Army Research Laboratory (under Cooperative Agreement No.  ...
arXiv:cs/0105026v1 fatcat:js2lpykt3vb3lhchfoc5vgqs2a

Enabling collaborative geoinformation access and decision‐making through a natural, multimodal interface

A. M. MacEachren, G. Cai, R. Sharma, I. Rauschert, I. Brewer, L. Bolelli, B. Shaparenko, S. Fuhrmann, H. Wang
2005 International Journal of Geographical Information Science  
and integrated speech-gesture interaction.  ...  system, and introduce the approach implemented for enabling mixed-initiative human-computer dialogue.  ...  BCS-0113030, IIS-97-33644, IIS-0081935 and EIA-0306845.  ... 
doi:10.1080/13658810412331280158 fatcat:67dyu2olfvg5piy6t4ip6yt4e4

Deixis, Meta-Perceptive Gaze Practices, and the Interactional Achievement of Joint Attention

Anja Stukenbrock
2020 Frontiers in Psychology  
The analysis draws on a model of the interactional organization of deictic reference and joint attention that serves as a sequential framework for investigating the functions of eye gaze.  ...  The paper investigates the use of gaze along with deictics and embodied pointing to accomplish reference and joint attention in naturally occurring social interaction.  ...  This research was funded by the Swiss National Science Foundation (SNSF) as part of the project 179108 "Deixis and Joint Attention: Vision in Interaction" (DeJA-VI).  ... 
doi:10.3389/fpsyg.2020.01779 pmid:33041877 pmcid:PMC7518716 fatcat:fqgiyybygfepno2nrotjfucpjm

Cognitive Principles in Robust Multimodal Interpretation

J. Y. Chai, Z. Prasov, S. Qu
2006 The Journal of Artificial Intelligence Research  
Multimodal conversational interfaces provide a natural means for users to communicate with computer systems through multiple modalities such as speech and gesture.  ...  Inspired by previous investigations of cognitive status in multimodal human-machine interaction, we have developed a greedy algorithm for interpreting user referring expressions (i.e., multimodal reference  ...  The authors would like to thank anonymous reviewers for their valuable comments and suggestions.  ...
doi:10.1613/jair.1936 fatcat:jhqnxozrcrfjbdstj7jngkna4i

Coordination and context-dependence in the generation of embodied conversation

Justine Cassell, Matthew Stone, Hao Yan
2000 Proceedings of the first international conference on Natural language generation - INLG '00  
Our agent plans each utterance so that multiple communicative goals may be realized opportunistically by a composite action including not only speech but also coverbal gesture that fits the context and  ...  We accomplish this by reasoning from a grammar which describes gesture declaratively in terms of its discourse function, semantics and synchrony with speech.  ...  We thank Nancy Green, James Lester, Jeff Rickel, Candy Sidner, and anonymous reviewers for comments on this and earlier drafts.  ... 
doi:10.3115/1118253.1118277 dblp:conf/inlg/CassellSY00 fatcat:k3bbz2rpqrhkpbjw3vu33tmxie

Micro Workflow Gestural Analysis: Representation in Social Business Processes [chapter]

Ben Jennings, Anthony Finkelstein
2010 Lecture Notes in Business Information Processing  
These techniques can be applied in differing expert-driven problem domains, and the resultant data from such analysis of gestural metadata can help to build a reputational representation of human agents within specific business processes, which will assist in finding the most appropriate human agent for a given task.  ...  Reputation, in the context of flexible micro workflows and Passive/Active Gesture Analysis (see section 5), refers to a body of data which can be acquired, analysed and represented programmatically via a  ...
doi:10.1007/978-3-642-12186-9_26 fatcat:xlsvhpuwlja6fd2zjvppyxepoq

Multisensory Integration: Flexible Use of General Operations

Nienke van Atteveldt, Micah M. Murray, Gregor Thut, Charles E. Schroeder
2014 Neuron  
This suggests that multisensory integration is flexible and context dependent and underlines the need for dynamically adaptive neuronal integration mechanisms.  ...  Research into the anatomical substrates and "principles" for integrating inputs from separate sensory surfaces has yielded divergent findings.  ...  Acknowledgments This work was supported by the Dutch Organization for Scientific Research (NWO, grant 451-07-020 to NvA), the Swiss National Science Foundation (SNSF, grant 320030-149982 to MMM), the National  ... 
doi:10.1016/j.neuron.2014.02.044 pmid:24656248 pmcid:PMC4090761 fatcat:qesv45jkejbqnam3meh32fbqma
Showing results 1 — 15 out of 42,464 results