
User-assisted reflection detection and feature point tracking

Mohamed A. Elgharib, François Pitié, Anil Kokaram, Venkatesh Saligrama
2013 Proceedings of the 10th European Conference on Visual Media Production - CVMP '13  
The third contribution of this paper is an application for reflection detection. Here we explore better feature point tracking for the regions detected as reflection.  ...  User feedback is common in post-production video manipulation tools. Hence in the second contribution we propose an effective way of integrating a few user-assisted masks to improve detection rates.  ...  USER ASSISTANCE FOR ROBUST DETECTION Missed detections could occur if the reflections contain weak feature points (see Fig. 2, red).  ... 
doi:10.1145/2534008.2534011 dblp:conf/cvmp/ElgharibPKS13 fatcat:qt5k56bzwjbsdcqjw3xiv6zqiy

Eye Tracking and Head Movement Detection: A State-of-Art Survey

Amer Al-Rahayfeh, Miad Faezipour
2013 IEEE Journal of Translational Engineering in Health and Medicine  
INDEX TERMS Eye tracking, eye detection, head movement detection.  ...  Eye-gaze detection and tracking have been an active research field in the past years as it adds convenience to a variety of applications.  ...  Therefore, they used inter-frame difference in the coordinate of feature points to reflect the head movement.  ... 
doi:10.1109/jtehm.2013.2289879 pmid:27170851 pmcid:PMC4839304 fatcat:phkou4gdpfcldcelsqz3bulpse

Robust pupil detection for gaze-based user interface

Wen-Hung Liao, Li-Chiang Yu
2010 Proceedings of the 2010 workshop on Eye gaze in intelligent human machine interaction - EGIHMI '10  
However, the operation is easily hindered by excessive corneal reflection and noise which prevent correct detection and precise localization of the pupil.  ...  We demonstrated several gaze-based user interfaces, including web-browsing and photo-viewing, with a wearable eye tracker constructed using the proposed method.  ...  Extraction of Feature Points Instead of simply applying edge detection to obtain candidate feature points, we investigate the intensity distribution in a neighborhood to select potential boundary points  ... 
doi:10.1145/2002333.2002341 fatcat:ktmzhlghx5cxncnx4cywdzf4ly

Inclusive Design: Accessibility Settings for People with Cognitive Disabilities [article]

Trae Waggoner, Julia Ann Jose, Ashwin Nair, Sudarsan Manikandan
2021 arXiv   pre-print
Unfortunately, for those who require more unique and sometimes challenging accommodations, such as people with Amyotrophic lateral sclerosis ( ALS), the most commonly used accessibility features are simply  ...  The purpose of this paper is to suggest a more affordable and readily available option for ALS assistive technology that can be implemented on a smartphone or tablet.  ...  The predictive text and auto-correct features would still be available and the user would blink to select it.  ... 
arXiv:2110.05688v1 fatcat:d4rt6isytfbutmvb7slmqebzkq

3D point of gaze estimation using head-mounted RGB-D cameras

Christopher McMurrough, Christopher Conly, Vassilis Athitsos, Fillia Makedon
2012 Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility - ASSETS '12  
The device consists of an eye tracking camera and forward facing RGB-D scene camera which, together, provide an estimate of the user gaze vector and its intersection with a 3D point in space.  ...  This paper presents a low-cost, wearable headset for 3D Point of Gaze (PoG) estimation in assistive applications.  ...  An accurate estimate of the 3D user PoG within an environment is clearly useful, as it can be used to detect user attention and intention to interact [1].  ... 
doi:10.1145/2384916.2384994 dblp:conf/assets/McMurroughCAM12 fatcat:64clx4jlkzchrmrag5yuhfcgzy

Multi-modal object of interest detection using eye gaze and RGB-D cameras

Christopher McMurrough, Jonathan Rich, Christopher Conly, Vassilis Athitsos, Fillia Makedon
2012 Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction - Gaze-In '12  
The device consists of an eye tracking camera and forward facing RGB-D scene camera which are able to provide an estimate of the user gaze vector and its intersection with a 3D point in space.  ...  This paper presents a low-cost, wearable headset for mobile 3D Point of Gaze (PoG) estimation in assistive applications.  ...  The LED also produces a corneal reflection on the user's eye, which can be seen by the camera and exploited to enhance tracking accuracy.  ... 
doi:10.1145/2401836.2401838 fatcat:rtifj4z6ajaybajp3dc76xm3da

A HRI Framework Based on Eye Tracking [chapter]

Xiaodong Zhao, Zheng Liu, Shiwei Cheng
2022 Frontiers in Artificial Intelligence and Applications  
These intents are then sent to an assistive robot and guide the robot to accomplish specific tasks such as using keys to open the door.  ...  The framework is designed for people with normal vision, and it will track and analyze their eye movements to infer their intents in a smart home environment.  ...  ), and Zhejiang Provincial Key Laboratory of Integration of Healthy Smart Kitchen System (No. 2020F04).  ... 
doi:10.3233/faia220048 fatcat:wv5bgutlnzckzgd24wh25ybc5q

User-assisted Video Reflection Removal [article]

Amgad Ahmed, Suhong Kim, Mohamed Elgharib, Mohamed Hefeeda
2020 arXiv   pre-print
This, however, is a challenging and ill-posed problem as there is an infinite number of valid decompositions. To address this problem, we propose a user-assisted method for video reflection removal.  ...  We show that user-assistance significantly improves the layer separation results.  ...  Color information is essential for the tracker to detect and track a feature, and if there is not enough color information, target tracking is hard or not possible.  ... 
arXiv:2009.03281v1 fatcat:elex4o5otvgmfgjr2j3p2wjtwa

Survey on Key Technologies of Eye Gaze Tracking

Jing-Yao HU, Yong-Yue XING, Lin-Na LIU, Xiao-Cui ZHANG, Qing Li, Jian-Nan CHI
2017 DEStech Transactions on Computer Science and Engineering  
Despite active research and significant progress in the last 30 years, gaze estimation remains challenging due to the light conditions, eye detection and calibration.  ...  We present a detailed review of recent techniques for eye detection and gaze estimation. We also survey methods for gaze estimation and summarize the advantages and disadvantages of these systems.  ...  Overview of Gaze Tracking Methods of Eye Feature Detection Research in eye detection and tracking focuses on two areas: eye localization in the image and gaze estimation.  ... 
doi:10.12783/dtcse/aice-ncs2016/5623 fatcat:rec377l35nc47mnesh6kpgbehq

Visual Sensor Fusion Based Autonomous Robotic System for Assistive Drinking

Pieter Try, Steffen Schöllmann, Lukas Wöhle, Marion Gebhard
2021 Sensors  
The sensor fusion algorithm is implemented in a visual tracking system which consists of a 2-D camera and a single point time-of-flight distance sensor.  ...  This system features an abort command that is triggered by turning the head and unambiguous tracking of multiple faces which enable safe human robot interaction.  ...  The starting point of the navigation is an arbitrary pose where the cup is positioned in front of the user and the visual tracking system is oriented towards the user.  ... 
doi:10.3390/s21165419 pmid:34450861 pmcid:PMC8401834 fatcat:u5nymjzwtnelhbec46fhgtf524

Guided Text Spotting for Assistive Blind Navigation in Unfamiliar Indoor Environments [chapter]

Xuejian Rong, Bing Li, J. Pablo Muñoz, Jizhong Xiao, Aries Arditi, Yingli Tian
2016 Lecture Notes in Computer Science  
The density of extracted text-specific feature points serves as an efficient text indicator to guide the user closer to text-likely regions for better recognition performance.  ...  Specifically, a novel spatial-temporal text localization algorithm is proposed to localize and prune text regions, by integrating stroke-specific features with a subsequent text tracking process.  ...  The candidate text regions are then tracked based on the feature points across consecutive video frames to reduce average computational load, eliminate occasional false alarms, and guide the blind user  ... 
doi:10.1007/978-3-319-50832-0_2 fatcat:mrtt5tfsobh4bp5wbrg3qrwbsu

Assistive Devices Analysis for Visually Impaired Persons: A Review on Taxonomy

Sadia Zafar, Muhammad Asif, Maaz Bin Ahmad, Taher M. Ghazal, Tauqeer Faiz, Munir Ahmad, Muhammad Adnan Khan
2022 IEEE Access  
VIPs need assistance in performing daily life tasks like object/obstacle detection and recognition, navigation, and mobility, particularly in indoor and outdoor environments.  ...  Finally, feedback is provided to the user through auditory and/or vibratory means. It is observed that most of the existing devices are constrained in their abilities.  ...  Her research interests include image and video processing, computer vision and machine learning, and web engineering.  ... 
doi:10.1109/access.2022.3146728 fatcat:gpjr5ip4abf3nnkj74upi2d42i

An interactive Augmented Reality system: A prototype for industrial maintenance training applications

Bassem Besbes, Sylvie Naudet Collette, Mohamed Tamaazousti, Steve Bourgeois, Vincent Gay-Bellile
2012 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)  
The training leverages user interactions by simply pointing on a specific object component.  ...  Index Terms: MR/AR applications [industrial and military MR/AR applications]; Sensors [vision-based registration and tracking]; User interaction [interaction techniques for MR/AR]  ...  In order to take advantage of these features, the AR system must involve tools of wider applicability and must assure an intuitive interaction with the user.  ... 
doi:10.1109/ismar.2012.6402568 dblp:conf/ismar/BesbesNTBG12 fatcat:x76qyulimnektiiff25gkv2k2a

Design and Evaluation of Vision-based Head and Face Tracking Interfaces for Assistive Input [article]

Chamin Morikawa, Michael J. Lyons
2017 arXiv   pre-print
To illustrate this concretely we describe work from our own research in which we developed two vision-based facial feature tracking algorithms for human computer interaction and assistive input.  ...  Facial gesture interfaces open new possibilities for assistive input technologies. This chapter gives an overview of research aimed at developing vision-based head and face-tracking interfaces.  ...  Specifically these systems (a) initialize the face tracking by blink detection and nostril detection (b) track the tip of the nose for pointing (c) detect mouth opening to allow the user to input click  ... 
arXiv:1707.08019v2 fatcat:6bebqxu66bejrnre2cjrf4njom

Low-Complexity Pupil Tracking for Sunglasses-Wearing Faces for Glasses-Free 3D HUDs

Dongwoo Kang, Hyun Sung Chang
2021 Applied Sciences  
For bare faces with unobstructed eyes, we applied our previous regression-algorithm-based method that uses scale-invariant feature transform features.  ...  Performing real-time pupil localization and tracking is complicated by drivers wearing facial accessories such as masks, caps, or sunglasses.  ...  In addition, the eye region classifications were assisted by the cascade-AdaBoost classifier and LBP features.  ... 
doi:10.3390/app11104366 fatcat:w63oyhkjqnclzelc4ycffroxgq