1,574 Hits in 6.1 sec

Depth camera based hand gesture recognition and its applications in Human-Computer-Interaction

Zhou Ren, Jingjing Meng, Junsong Yuan
2011 2011 8th International Conference on Information, Communications & Signal Processing  
Although the market for hand gesture based HCI is huge, building a robust hand gesture recognition system remains a challenging problem for traditional vision-based approaches, which are greatly limited  ...  We then introduce several HCI applications built on top of an accurate and robust hand gesture recognition system based on FEMD.  ...  Acknowledgment This work was supported in part by the Nanyang Assistant Professorship (SUG M58040015) to Dr. Junsong Yuan.  ... 
doi:10.1109/icics.2011.6173545 dblp:conf/IEEEicics/RenMY11 fatcat:45qpf3grgnfv3g27o4pqvrwyzq

User-Centric Design of a Vision System for Interactive Applications

S. Borkowski, J.L. Crowley, J. Letessier, F. Berard
2006 Fourth IEEE International Conference on Computer Vision Systems (ICVS'06)  
Our framework is designed to allow developers unfamiliar with vision to use computer vision as an interaction modality.  ...  We validate our approach with an implementation of standard GUI widgets (buttons and sliders) based on computer vision.  ...  Finally, since the contract offered by a VB input system cannot be expressed simply as a part of an API, we propose to document it explicitly.  ... 
doi:10.1109/icvs.2006.61 dblp:conf/icvs/BorkowskiCLB06 fatcat:urfcfspomveipeeqzalzzrhx4a

The past, present, and future of gaze-enabled handheld mobile devices

Mohamed Khamis, Florian Alt, Andreas Bulling
2018 Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services - MobileHCI '18  
research, such as visibility of the user's eyes, lighting conditions, and privacy implications.  ...  To this end, we discuss how research developed from building hardware prototypes, to accurate gaze estimation on unmodified smartphones and tablets.  ...  ACKNOWLEDGEMENTS This work was funded, in part, by the Cluster of Excellence on Multimodal Computing and Interaction (MMCI) at Saarland University, Germany, and by the Bavarian State Ministry of Education  ... 
doi:10.1145/3229434.3229452 dblp:conf/mhci/KhamisAB18 fatcat:wzebnfzaabfgbnwe7cmdumlnwu

ToF-sensors: New dimensions for realism and interactivity

Andreas Kolb, Erhardt Barth, Reinhard Koch
2008 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops  
The estimation of a range map by image analysis or laser scan techniques is still a time-consuming and expensive part of such systems.  ...  A lower-priced, fast, and robust alternative for distance measurement is the Time-of-Flight (ToF) camera.  ...  A MESA SR3000 camera has been used. User Interaction An important application area is that of interactive systems such as alternative input devices, games, animated avatars, etc.  ... 
doi:10.1109/cvprw.2008.4563159 dblp:conf/cvpr/KolbBK08 fatcat:ffiserkhtbdbfjqpad7xeycgty

Planar mirrors for image-based robot localization and 3-D reconstruction

Gian Luca Mariottini, Stefano Scheggi, Fabio Morbidi, Domenico Prattichizzo
2012 Mechatronics (Oxford)  
These systems have recently received increasing attention because, unlike stereo cameras, they can capture two views of the same scene without the need for hardware multi-camera synchronization and calibration  ...  Planar catadioptric vision sensors consist of a pinhole camera observing a scene reflected in two (or more) planar mirrors.  ...  Acknowledgements The authors are grateful to Francesco Chinello and Valerio Savini for their precious support in the preparation of the experimental setup.  ... 
doi:10.1016/j.mechatronics.2011.09.004 fatcat:2kxkrizq3bfixo5m2he3p62huu

A Multi-Gesture Interaction System Using a 3-D Iris Disk Model for Gaze Estimation and an Active Appearance Model for 3-D Hand Pointing

Michael J. Reale, Shaun Canavan, Lijun Yin, Kaoning Hu, Terry Hung
2011 IEEE transactions on multimedia  
We also propose a novel eye gaze estimation approach for point-of-regard (POR) tracking on a viewing screen.  ...  To track head, eye, and mouth movements, we present a two-camera system that detects the face from a fixed, wide-angle camera, estimates a rough location for the eye region using an eye detector based  ...  ACKNOWLEDGMENT The authors would like to thank Mr. X. Zhang for his help on the experiments and video recording/editing.  ... 
doi:10.1109/tmm.2011.2120600 fatcat:22s37ixwsfehlj64qjifrfnjiu

A Practical Paradigm and Platform for Video-Based Human-Computer Interaction

Jason J. Corso, Guangqi Ye, Darius Burschka, Gregory D. Hager
2008 Computer  
Using video in HCI is difficult, as evidenced by the absence of video-based interaction systems in production.  ...  the computer system (with little or no mediation), the system must supply continuous feedback to the user throughout the interaction and immediately there-  ...  Examples of Video-Based HCI: Most video-based approaches  ...  Corso is an assistant professor of computer science and engineering at the  ... 
doi:10.1109/mc.2008.141 fatcat:lc6pcj3l7fesjcdl5m2axc5b5i

Corneal Imaging Revisited: An Overview of Corneal Reflection Analysis and Applications

Christian Nitschke, Atsushi Nakazawa, Haruo Takemura
2013 IPSJ Transactions on Computer Vision and Applications  
These corneal reflections can be extracted from an image of the eye by modeling the eye-camera geometry as a catadioptric imaging system.  ...  As a result, one obtains the visual information of the environment and the relation to the observer (view, gaze), which allows for application in a number of fields.  ...  When captured by a camera, the compound of cornea and camera acts as a non-central catadioptric imaging system that requires per-frame calibration through eye pose estimation.  ... 
doi:10.2197/ipsjtcva.5.1 fatcat:gws5uirlibhfnf72bmd5mmqly4

A Hybrid View in a Laparoscopic Surgery Training System

Chuan Feng, Jerzy W. Rozenblit, Allan J. Hamilton
2007 14th Annual IEEE International Conference and Workshops on the Engineering of Computer-Based Systems (ECBS'07)  
A digital camera and magnetic position sensors are used to detect laparoscopic instruments in the system.  ...  To minimize the potential hazards of laparoscopic surgery, an assistive training system is being developed.  ...  A virtual camera algorithm is applied as follows: assume the camera is fixed at a specific position after calibration.  ... 
doi:10.1109/ecbs.2007.6 dblp:conf/ecbs/FengRH07 fatcat:kjvz63bklngajaapoldrkqszkm

3D gaze estimation from 2D pupil positions on monocular head-mounted eye trackers

Mohsen Mansouryar, Julian Steil, Yusuke Sugano, Andreas Bulling
2016 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications - ETRA '16  
We present a novel 3D gaze estimation method for monocular head-mounted eye trackers.  ...  In contrast to previous work, our method does not aim to infer 3D eyeball poses but directly maps 2D pupil positions to 3D gaze directions in scene camera coordinate space.  ...  This work was funded, in part, by the Cluster of Excellence on Multimodal Computing and Interaction (MMCI) at Saarland University, the Alexander von Humboldt Foundation, as well as a JST CREST research  ... 
doi:10.1145/2857491.2857530 dblp:conf/etra/MansouryarSSB16 fatcat:zfda7nux7ndetl4rtmaxltquby

A Novel Human Computer Interaction Platform based College Mathematical Education Methodology [article]

Zhiyan Li
2016 arXiv   pre-print
As a further step, we implement the system by rewriting the script code to build up a personalized HCI-assisted education scenario.  ...  This article proposes an analysis of a novel human computer interaction (HCI) platform based college mathematical education methodology.  ...  Kinect HCI System Structure Because its depth camera provides a depth image, each pixel records the calibrated depth of the corresponding point in the scene, with a resolution of a few centimeters.  ... 
arXiv:1602.00801v2 fatcat:wwkeaz7umzhddep24f6bujocbq

Mo2Cap2: Real-time Mobile 3D Motion Capture with a Cap-mounted Fisheye Camera [article]

Weipeng Xu, Avishek Chatterjee, Michael Zollhoefer, Helge Rhodin, Pascal Fua, Hans-Peter Seidel, Christian Theobalt
2019 arXiv   pre-print
In addition to the novel hardware setup, our other main contributions are: 1) a large ground truth training corpus of top-down fisheye images and 2) a novel disentangled 3D pose estimation approach that  ...  We tackle these challenges based on a novel lightweight setup that converts a standard baseball cap to a device for high-quality pose estimation based on a single cap-mounted fisheye camera.  ...  In particular, our system provides a novel, natural human-computer interaction (HCI) solution for recent popular virtual reality (VR) and augmented reality (AR) systems.  ... 
arXiv:1803.05959v2 fatcat:2u65qvvwufe5vi2z3vtyz3i4mq

Real-time gaze estimation via pupil center tracking

Dario Cazzato, Fabio Dominio, Roberto Manduchi, Silvia M. Castro
2018 Paladyn: Journal of Behavioral Robotics  
The possibility of a real-time implementation, combined with the good quality of gaze tracking, makes this system suitable for various HCI applications.  ...  with a wide range of distances from the camera.  ...  Manduchi has been supported by Reader's Digest/Research to Prevent Blindness.  ... 
doi:10.1515/pjbr-2018-0002 fatcat:ibhix4fetnctrodxeuisswdnsa

Eye Gaze Techniques for Human Computer Interaction: A Research Survey

Anjana Sharma, Pawanesh Abrol
2013 International Journal of Computer Applications  
The use of the gaze as a human computer interface in different fields is an example of high end applications of these techniques.  ...  Human Computer Interaction (HCI) is an emerging technology.  ...  A novel gaze tracking system called FreeGaze applies a few geometric and image processing operations, using a calibrated camera with a single light source and a single glint.  ... 
doi:10.5120/12386-8738 fatcat:3yheb75eqngxnev57rvgjwkndy

View-based Location and Tracking of Body Parts for Visual Interaction

A. Micilotta, R. Bowden
2004 Proceedings of the British Machine Vision Conference 2004  
This paper presents a real time approach to locate and track the upper torso of the human body.  ...  Furthermore, we present a novel method to disambiguate the hands of the subject and to predict the likely position of elbows.  ...  The overall objective of this work is to create a visual HCI tool using an un-calibrated monocular camera system in a cluttered environment.  ... 
doi:10.5244/c.18.87 dblp:conf/bmvc/MicilottaB04 fatcat:z4foeafqljannj62w6jfwmb36i
Showing results 1 — 15 out of 1,574 results