Natural Pursuits for Eye Tracker Calibration

Michaela Murauer, Michael Haslgrübler, Alois Ferscha
2018 Proceedings of the 5th international Workshop on Sensor-based Activity Recognition and Interaction - iWOAR '18  
In this thesis, Natural Pursuit Calibration is presented, which is a comfortable, unobtrusive technique enabling ongoing attention detection and eye tracker calibration within an off-screen context.  ...  Due to the characteristics of the calibration process, it can be executed simultaneously with any primary task, without active user participation.  ...  Conclusion: Within the scope of this work, Natural Pursuit Calibration was introduced, an unobtrusive calibration technique for mobile eye trackers that applies smooth pursuit calibration, without  ... 
doi:10.1145/3266157.3266207 dblp:conf/iwoar/MurauerHF18 fatcat:cjsaa6ge2zaybipccacirsupri
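The entry above describes calibrating a mobile eye tracker from smooth pursuit of naturally moving targets, without active user participation. As a rough illustration of that idea, the sketch below fits an affine mapping from raw gaze samples to the positions of a target the user was following; the function names, the affine model, and the use of NumPy least squares are assumptions for illustration, not the method of the paper.

```python
import numpy as np

def fit_affine_calibration(raw_gaze, target_pos):
    """Fit a 3x2 affine matrix mapping raw gaze samples to target positions.

    raw_gaze, target_pos: (N, 2) arrays of samples matched in time while the
    user smoothly pursues a moving target.
    """
    n = raw_gaze.shape[0]
    A = np.hstack([raw_gaze, np.ones((n, 1))])       # augment with a bias term
    M, *_ = np.linalg.lstsq(A, target_pos, rcond=None)
    return M

def apply_calibration(M, raw_point):
    """Map one raw gaze point through the fitted affine transform."""
    x, y = raw_point
    return np.array([x, y, 1.0]) @ M
```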

Eye Tracking as a New Interface for Image Retrieval [chapter]

O K Oyekoya, F W M Stentiford
2006 Computer Communications and Networks  
More natural methods of interaction are in demand to replace devices such as the keyboard and the mouse, and it is becoming more important to develop the next generation of human-computer interfaces that  ...  Human behaviour depends on highly developed abilities to perceive and interpret visual information and provides a medium for the next generation of image retrieval interfaces.  ...  Acknowledgements: The authors would like to thank the EPSRC, BT Exact, SIRA and the Imaging Faraday Partnership for their support in this work.  ... 
doi:10.1007/978-1-84628-429-8_17 fatcat:hibjjmrbkvbdzfpgvbz72hrf4a

Eye Tracking as a New Interface for Image Retrieval

O K Oyekoya, F W M Stentiford
2004 BT technology journal  
More natural methods of interaction are in demand to replace devices such as the keyboard and the mouse, and it is becoming more important to develop the next generation of human-computer interfaces that  ...  Human behaviour depends on highly developed abilities to perceive and interpret visual information and provides a medium for the next generation of image retrieval interfaces.  ...  Acknowledgements: The authors would like to thank the EPSRC, BT Exact, SIRA and the Imaging Faraday Partnership for their support in this work.  ... 
doi:10.1023/b:bttj.0000047130.98920.2b fatcat:wkocgidslnhkxg6vl35t2dqnqu

Neuro-Developmental Engineering: towards Early Diagnosis of Neuro-Developmental Disorders [chapter]

Domenico Campolo, Fabrizio Taffoni, Giuseppina Schiavone, Domenico Formica, Eugenio Guglielmelli, Flavio Keller
2010 New Developments in Biomedical Engineering  
Mechanical sensing: typically used for body motion capture; it uses angle and range measurements with the help of gears and bend sensors; very accurate but bulky, often limiting mobility.  ...  Calibration Procedure for the AVVC eye tracker: During the calibration procedure, movements of the pupil in pixels are transformed to eye position in degrees.  ... 
doi:10.5772/7595 fatcat:lnljxdvuwjbx5ephyz4g2m6jhe
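The calibration step quoted above converts pupil displacement in image pixels to eye rotation in degrees. A minimal sketch of such a mapping is given below, assuming a simple per-axis linear fit against calibration targets at known visual angles; the linear model and the function names are illustrative assumptions, not the AVVC procedure itself.

```python
import numpy as np

def fit_pixels_to_degrees(pupil_px, target_deg):
    """Fit a linear map from pupil position (pixels) to gaze angle (degrees).

    pupil_px:   (N,) pupil coordinates recorded while fixating known targets.
    target_deg: (N,) corresponding target eccentricities in degrees.
    """
    slope, intercept = np.polyfit(pupil_px, target_deg, deg=1)
    return slope, intercept

def pixels_to_degrees(px, slope, intercept):
    """Convert a pupil coordinate in pixels to an eye position in degrees."""
    return slope * px + intercept
```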

Review of eye tracking metrics involved in emotional and cognitive processes

Vasileios Skaramagkas, Giorgos Giannakakis, Emmanouil Ktistakis, Dimitris Manousos, Ioannis Karatzanis, Nikolaos S. Tachos, Evanthia Tripoliti, Kostas Marias, Dimitrios I. Fotiadis, Manolis Tsiknakis
2021 Zenodo  
Although eye tracking is gaining ground in the research community, it is not yet a popular approach for the detection of emotional and cognitive states.  ...  Eye behaviour provides valuable information revealing one's higher cognitive functions and state of affect.  ...  Robust eye trackers, in terms of accuracy, portability and ease of use, have been developed that are able to unobtrusively monitor eye movements in real time.  ... 
doi:10.5281/zenodo.4740077 fatcat:twqx7mlwjbehlmeox7mftucwam

Eyewear Computing – Augmenting the Human with Head-mounted Wearable Assistants (Dagstuhl Seminar 16042)

Andreas Bulling, Ozan Cakmakci, Kai Kunze, James M. Rehg, Marc Herbstritt
2016 Dagstuhl Reports  
In contrast to several previous Dagstuhl seminars, we used an ignite talk format to reduce the time of talks to one half-day and to leave the rest of the week for hands-on sessions, group work, general  ...  The seminar was composed of workshops and tutorials on head-mounted eye tracking, egocentric vision, optics, and head-mounted displays.  ...  Bulling, "Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency", in Proc. of the 28th ACM Symp. on User Interface Software and Technology (UIST'15), pp. 363-372, 2015.  ... 
doi:10.4230/dagrep.6.1.160 dblp:journals/dagstuhl-reports/BullingCKR16 fatcat:d3p4lmvat5aczhawlazkd2chpa

Workload-Aware Systems and Interfaces for Cognitive Augmentation [article]

Thomas Kosch
2020 arXiv   pre-print
This temporarily mobilizes additional resources to deal with the workload at the cost of accelerated mental exhaustion.  ...  The human body expresses the use of cognitive resources through physiological responses when confronted with cognitive workload.  ...  Researchers have also used smooth pursuit to calibrate eye trackers. Pfeuffer et al. [303] investigated this approach by using animations on a display to implicitly calibrate an eye tracker.  ... 
arXiv:2010.07703v2 fatcat:v3cfq4cs2jhuzlym3za2r4b3le

Ubiquitous Gaze Sensing and Interaction (Dagstuhl Seminar 18252)

Lewis Chuang, Andrew Duchowski, Pernilla Qvarfordt, Daniel Weiskopf, Michael Wagner
2019 Dagstuhl Reports  
tracking across a diverse set of disciplines: geo-information systems, medicine, aviation, psychology, and neuroscience, to explore future applications and to identify requirements for reliable gaze sensing  ...  Eye tracking is no longer restricted to a well-controlled laboratory setting but is moving into everyday settings.  ...  This has profound implications for design: whether a user is looking at a moving object can be robustly detected from the correlation of eye movement with the object's motion, as implemented in the Pursuits  ... 
doi:10.4230/dagrep.8.6.77 dblp:journals/dagstuhl-reports/ChuangDQW18 fatcat:7nfkzimerrb3hpcb25bwy7sf5e
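The last snippet above refers to Pursuits-style interaction, in which following a moving object is detected by correlating the gaze trace with the object's trajectory. A minimal sketch of that test is shown below, assuming matched samples over a sliding window; the per-axis Pearson correlation and the 0.8 threshold are illustrative assumptions rather than the published implementation.

```python
import numpy as np

def is_followed(gaze_xy, object_xy, threshold=0.8):
    """Decide whether the gaze trace follows a moving object.

    gaze_xy, object_xy: (N, 2) trajectories sampled over the same time window.
    Returns True when both axes correlate above the threshold.
    """
    corr_x = np.corrcoef(gaze_xy[:, 0], object_xy[:, 0])[0, 1]
    corr_y = np.corrcoef(gaze_xy[:, 1], object_xy[:, 1])[0, 1]
    return min(corr_x, corr_y) > threshold
```

With several candidate objects on screen, one would typically select the object whose trajectory correlates most strongly with the gaze, provided it exceeds the threshold.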

Using Dynamics of Eye Movements, Speech Articulation and Brain Activity to Predict and Track mTBI Screening Outcomes

James R. Williamson, Doug Sturim, Trina Vian, Joseph Lacirignola, Trey E. Shenk, Sophia Yuditskaya, Hrishikesh M. Rao, Thomas M. Talavage, Kristin J. Heaton, Thomas F. Quatieri
2021 Frontiers in Neurology  
The focus of the present study is to investigate the potential for using passive measurements of fine motor movements (smooth pursuit eye tracking and read speech) and resting state brain activity (measured  ...  Unobtrusive measurement and tracking of cognitive functioning is needed to enable preventative interventions for people at elevated risk of concussive injury.  ...  Acknowledgments: The authors would like to thank Laurel Keyes for early data collection support.  ... 
doi:10.3389/fneur.2021.665338 fatcat:m6ryb4aqnbgera2ssu5h43xv34

Technologies for Multimodal Interaction in Extended Reality—A Scoping Review

Ismo Rakkolainen, Ahmed Farooq, Jari Kangas, Jaakko Hakulinen, Jussi Rantala, Markku Turunen, Roope Raisamo
2021 Multimodal Technologies and Interaction  
Our purpose was to provide a succinct, yet clear, insightful, and structured overview of emerging, underused multimodal technologies beyond standard video and audio for XR interaction, and to find research  ...  We conclude with our perspective on promising research avenues for multimodal interaction technologies.  ...  VOG-based trackers typically require calibration before the gaze point can be estimated.  ... 
doi:10.3390/mti5120081 fatcat:3oea4tqlwfcojfvnohv72id7i4

User Interfaces for Mobile Augmented Reality Systems [article]

Steve Feiner
2003 International Conference on Vision, Video and Graphics  
This talk provides an overview of work that explores user interface design issues for mobile augmented reality systems, which use tracked see-through and hear-through displays to overlay virtual graphics  ...  What should user interfaces look like when they become an integral part of how we experience the world around us?  ...  ., 2001) we describe a futuristic scenario of using such an unobtrusive mobile helper interface.  ... 
doi:10.2312/vvg.20031017 dblp:conf/vvg/Feiner03 fatcat:5mztekgvszg33ag6lyoyvxkit4

Visual and narrative priorities of the blind and non-blind: eye tracking and audio description

Elena Di Giovanni
2013 Perspectives: Studies in Translatology  
Conclusion: The end of the account of this complex experiment brings us back to the beginning, but with a new perspective: the seemingly oxymoronic nature of the relationship between eye tracking research  ...  A careful use of these data is obviously necessary, just as it is necessary to consider the precise sequence of the saccades and smooth pursuits.  ... 
doi:10.1080/0907676x.2013.769610 fatcat:2iywcbma6ra37o2oy2zybgbmn4

On Driver Behavior Recognition for Increased Safety: A Roadmap

Luca Davoli, Marco Martalò, Antonio Cilfone, Laura Belli, Gianluigi Ferrari, Roberta Presta, Roberto Montanari, Maura Mengoni, Luca Giraldi, Elvio G. Amparore, Marco Botta, Idilio Drago (+3 others)
2020 Safety  
algorithms used for human emotion classification.  ...  In this paper, we first review the state-of-the-art of emotional and cognitive analysis for ADAS: we consider psychological models, the sensors needed for capturing physiological signals, and the typical  ...  Inertial Sensors: Considering mobile sensing elements, often worn by the person to be monitored, the adoption of Inertial Measurement Units (IMUs) allows for estimation of the motion level.  ... 
doi:10.3390/safety6040055 fatcat:rntcjtvwkjaa5mrxsc2lji5kdy
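The snippet above notes that worn IMUs can be used to estimate a person's motion level. As one hypothetical way to turn raw accelerometer samples into such an index, the sketch below uses the variance of the acceleration magnitude over fixed windows; the window length and the variance-based measure are assumptions for illustration only.

```python
import numpy as np

def motion_level(accel_xyz, window=50):
    """Estimate a per-window motion index from raw accelerometer samples.

    accel_xyz: (N, 3) array of accelerometer readings.
    Returns one variance value per non-overlapping window of `window` samples.
    """
    magnitude = np.linalg.norm(accel_xyz, axis=1)            # combine the 3 axes
    n_windows = len(magnitude) // window
    windows = magnitude[: n_windows * window].reshape(n_windows, window)
    return windows.var(axis=1)                               # higher = more motion
```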

Gaze Contingent Robotic Control in Minimally Invasive Surgery

Kenko Fujii, Guang-Zhong Yang
2015
A gaze parameter based framework is introduced to assess the use of a new field-of-view expansion technique for improved visualisation and camera trajectory comprehension when disorientated.  ...  To further improve the ergonomics of the gaze contingent system, an online calibration algorithm is integrated into the system. Throughout the thesis, detailed validation and discussion of t [...]  ...  Details of the calibration procedure for feature-based eye trackers were also discussed.  ... 
doi:10.25560/24562 fatcat:7qrzlrzg4jh4lh34wuzmunyqky

Fast and Secure Authentication in Virtual Reality Using Coordinated 3D Manipulation and Pointing

Florian Mathis, John H. Williamson, Kami Vaniea, Mohamed Khamis
2021 ACM Transactions on Computer-Human Interaction  
We found that entering a four-symbol RubikAuth password is fast: 1.69–3.5 s using controller tapping, 2.35–4.68 s using head pose and 2.39–4.92 s using eye gaze, and highly resilient to observations:  ...  We conclude with an in-depth discussion of authentication systems for VR and outline five learned lessons for designing and evaluating authentication schemes.  ...  of Edinburgh (award number #65040).  ... 
doi:10.1145/3428121 fatcat:mkpestozozfmdf76qi57wpu2ha