
Active vision techniques for visually mediated interaction

A. Jonathan Howell, Hilary Buxton
2002 Image and Vision Computing  
Radial Basis Function (RBF) networks have been trained for gesture-based communication with colour/motion cues to direct face detection and capture 'attentional frames'.  ...  We use these methods for behaviour (user-camera) coordination in an integrated system.  ...  Acknowledgements The authors gratefully acknowledge the invaluable discussion, help and facilities provided by Shaogang Gong, Jamie Sherrah and Stephen McKenna and funding under the EPSRC ISCANIT and EU  ... 
doi:10.1016/s0262-8856(02)00095-1 fatcat:cgon2bsuozhkneotwo5ho7iem4

Towards joint attention for a domestic service robot - person awareness and gesture recognition using Time-of-Flight cameras

David Droeschel, Jörg Stückler, Dirk Holz, Sven Behnke
2011 2011 IEEE International Conference on Robotics and Automation  
For perceiving showing and pointing gestures and for estimating the pointing direction a Time-of-Flight camera is used.  ...  Joint attention between a human user and a robot is essential for effective human-robot interaction.  ...  ACKNOWLEDGMENT This work has been supported partially by grant BE 2556/2-3 of German Research Foundation (DFG).  ... 
doi:10.1109/icra.2011.5980067 dblp:conf/icra/DroeschelSHB11 fatcat:yhmuqtlusbcynekevak4xv7bnu

Gaze and Gestures in Telepresence: multimodality, embodiment, and roles of collaboration [article]

Mauro Cherubini, Rodrigo de Oliveira, Nuria Oliver, Christian Ferran
2010 arXiv   pre-print
This paper proposes a controlled experiment to further investigate the usefulness of gaze awareness and gesture recognition in the support of collaborative work at a distance.  ...  move while collaborating at a distance and c) avoid asymmetries of communication between collaborators.  ...  ACKNOWLEDGMENTS Telefónica I+D participates in Torres Quevedo subprogram (MICINN), cofinanced by the European Social Fund, for Researchers recruitment.  ... 
arXiv:1001.3150v1 fatcat:escxkwkrfjc33fxlbqiezw2baq

Interacting with a mobile robot: Evaluating gestural object references

J. Schmidt, N. Hofemann, A. Haasch, J. Fritsch, G. Sagerer
2008 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems  
This paper describes a novel integrated vision system combining different algorithms for pose tracking, gesture detection, and object attention in order to enable a mobile robot to resolve gesture-based  ...  of gestural object references.  ...  The resulting hand trajectories of the body tracker are used as input for the gesture recognition.  ... 
doi:10.1109/iros.2008.4650649 dblp:conf/iros/SchmidtHHFS08 fatcat:qprhdngh7vcqtcr7bd2svkkkye

Gaze, Posture and Gesture Recognition to Minimize Focus Shifts for Intelligent Operating Rooms in a Collaborative Support System

Juan P. Wachs
2010 International Journal of Computers Communications & Control  
I describe how machine vision techniques are used to extract spatio-temporal measures and to interact with the system, and how computer graphics techniques can be used to display visual medical information  ...  Intelligent operating rooms minimize surgeon's focus shifts by minimizing both the focus spatial offset (distance moved by surgeon's head or gaze to the new target) and the movement spatial offset (distance  ...  Once the image is projected, the surgeon can interact with the images using hand gestures  ... 
doi:10.15837/ijccc.2010.1.2467 fatcat:t4nfjeoxnfaovc7kyqps57wtvu

Gesture-based attention direction for a telepresence robot: Design and experimental study

Keng Peng Tee, Rui Yan, Yuanwei Chua, Zhiyong Huang, Somchaya Liemhetcharat
2014 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems  
Experiment results are very promising and show that i) the average gesture recognition rate is 92%, ii) the gesture-based attention direction rate is 90%, and iii) only by considering the 3 types of audio-visual  ...  Gesture-based attention direction is realized by combining Localist Attractor Network (LAN) and Short-Term Memory (STM).  ...  The authors are grateful to Mr Jason Teo Kok Yung for design work and Ms Yow Ai Ping for help with the experiments.  ... 
doi:10.1109/iros.2014.6943138 dblp:conf/iros/TeeYCHL14 fatcat:5yyqiy66vvazxklccguarwczo4

KINterestTV - Towards Non–invasive Measure of User Interest While Watching TV [chapter]

Julien Leroy, François Rocca, Matei Mancas, Radhwan Ben Madhkour, Fabien Grisard, Tomas Kliegr, Jaroslav Kuchar, Jakub Vit, Ivan Pirner, Petr Zimmermann
2014 IFIP Advances in Information and Communication Technology  
Our approach is tested during an experiment simulating the attention changes of a user in a scenario involving second screen (tablet) interaction, a behavior that has become common for spectators and a typical source of attention switches.  ...  This additional software is used for annotation and generation of a database for face recognition and active appearance modeling.  ... 
doi:10.1007/978-3-642-55143-7_8 fatcat:j5jrqja4xzczjmtl63t4b7xqpu

A Gesture-based Tool for Sterile Browsing of Radiology Images

J. P. Wachs, H. I. Stern, Y. Edan, M. Gillam, J. Handler, C. Feied, M. Smith
2008 Journal of the American Medical Informatics Association (JAMIA)
This paper presents "Gestix," a vision-based hand gesture capture and recognition system that interprets in real-time the user's gestures for navigation and manipulation of images in an electronic medical  ...  , supporting their focus of attention, and providing fast response times.  ...  Subsequent tests, using students, resulted in gesture recognition accuracy of 96% (for the eight gestures used in the system).  ... 
doi:10.1197/jamia.m2410 pmid:18451034 pmcid:PMC2410001 fatcat:vn5wiuri35c5nib3gbwnymi2zm

A multi-modal object attention system for a mobile robot

A. Haasch, N. Hofemann, J. Fritsch, G. Sagerer
2005 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems  
We present a multi-modal object attention system that is able to identify objects referenced by the user with gestures and verbal instructions.  ...  This way, the growing knowledge base of the robot companion improves the interaction quality as the robot can more easily focus its attention on objects it has been taught previously.  ...  Its task is to focus the robot's attention on objects that become of interest because the user is referring to them using multiple modalities.  ... 
doi:10.1109/iros.2005.1545191 dblp:conf/iros/HaaschHFS05 fatcat:ciskpzeacrckrpeypiglmauhci

Multimodal human–computer interaction: A survey

Alejandro Jaimes, Nicu Sebe
2007 Computer Vision and Image Understanding  
In particular, we focus on body, gesture, gaze, and affective interaction (facial expression recognition, and emotion in audio).  ...  We discuss user and task modeling, and multimodal fusion, highlighting challenges, open issues, and emerging applications for Multimodal Human Computer Interaction (MMHCI) research.  ...  A review of vision-based HCI is presented in [62] with a focus on head tracking, face and facial expression recognition, eye tracking, and gesture recognition.  ... 
doi:10.1016/j.cviu.2006.10.019 fatcat:gzaoce4i2zedxclvpu77z5ndry

Multimodal Human Computer Interaction: A Survey [chapter]

Alejandro Jaimes, Nicu Sebe
2005 Lecture Notes in Computer Science  
In particular, we focus on body, gesture, gaze, and affective interaction (facial expression recognition, and emotion in audio).  ...  We discuss user and task modeling, and multimodal fusion, highlighting challenges, open issues, and emerging applications for Multimodal Human Computer Interaction (MMHCI) research.  ...  A review of vision-based HCI is presented in [62] with a focus on head tracking, face and facial expression recognition, eye tracking, and gesture recognition.  ... 
doi:10.1007/11573425_1 fatcat:ale6vmjoungs3gh7xx4dksigva

Gesture-Based Human-Machine Interaction: Taxonomy, Problem Definition, and Analysis

Alessandro Carfi, Fulvio Mastrogiovanni
2021 IEEE Transactions on Cybernetics  
The possibility for humans to interact with physical or virtual systems using gestures has been vastly explored by researchers and designers in the last 20 years to provide new and intuitive interaction  ...  The main contributions of this article can be summarized as follows: 1) we provide a broad definition for the notion of functional gesture in HMI; 2) we design a flexible and expandable gesture taxonomy  ...  Only a few works clearly specify their focus on hand-based gestures [47] or organise gestures as hand-arm, head-face and body [29].  ... 
doi:10.1109/tcyb.2021.3129119 pmid:34910648 fatcat:okcqrqjp6nhk3k4urync2syxwa

AIDIA - Adaptive Interface for Display InterAction

B. Stenger, T.E. Woodley, T.-K. Kim, C. Hernandez, R. Cipolla
2008 Procedings of the British Machine Vision Conference 2008  
An attention mechanism based on face and hand detection allows users in the camera's field of view to take control of the interface. Face recognition is used for identification and customisation.  ...  This paper presents a vision-based system for interaction with a display via hand pointing.  ...  (c) fist tracking for moving a pointer and recognition of hand gestures such as a 'thumb up' or a 'shake' gesture for item selection.  ... 
doi:10.5244/c.22.78 dblp:conf/bmvc/StengerWKHC08 fatcat:n2ishu2zcfbwpldhxhu7jj6jzm

A Bimodal Face and Body Gesture Database for Automatic Analysis of Human Nonverbal Affective Behavior

H. Gunes, M. Piccardi
2006 18th International Conference on Pattern Recognition (ICPR'06)  
There also exist a number of gesture databases of static and dynamic hand postures and dynamic hand gestures.  ...  Accordingly, in this paper, we present a bimodal database recorded by two high-resolution cameras simultaneously for use in automatic analysis of human nonverbal affective behavior.  ...  In the second stage they performed the same expressions using their face and body simultaneously. Examples of the data recorded by camera 1 (for body) and camera 2 (for face) can be seen in Fig. 3.  ... 
doi:10.1109/icpr.2006.39 dblp:conf/icpr/GunesP06 fatcat:fjchiesdvjcidk47ggm7ikqgbm