
Multiple-Gaze Geometry: Inferring Novel 3D Locations from Gazes Observed in Monocular Video [chapter]

Ernesto Brau, Jinyan Guan, Tanya Jeffries, Kobus Barnard
2018 Lecture Notes in Computer Science  
Conversely, knowing 3D locations of scene elements that draw visual attention, such as other people in the scene, can help infer gaze direction.  ...  We provide a Bayesian generative model for the temporal scene that captures the joint probability of camera parameters, locations of people, their gaze, what they are looking at, and locations of visual  ...  The top row shows a visualization of the results of the baseline algorithm (MGG-BASELINE), in which the yaw of the gaze direction is set based on the walking directions, and the static objects are estimated  ... 
doi:10.1007/978-3-030-01225-0_38 fatcat:dg6tki5svbabbk4ocinburm6ge
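
The geometric intuition behind inferring a 3D location from several observed gazes can be illustrated with a small sketch (illustrative only; the paper itself formulates this as a Bayesian generative model over camera parameters, people, and gaze). Given assumed eye positions and gaze directions, the point closest to all gaze rays in the least-squares sense has a closed-form solution; the function and variable names below are hypothetical.

```python
# Illustrative geometry only, not the paper's Bayesian model: the point
# closest (in least squares) to several gaze rays, each ray given by an
# eye position (origin) and a gaze direction.
import numpy as np

def triangulate_gaze_point(origins, directions):
    """origins, directions: (N, 3) arrays of ray origins and directions."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ o
    # Solvable as long as the rays are not all parallel.
    return np.linalg.solve(A, b)
```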

Using gaze information to improve image difference metrics

Marius Pedersen, Jon Y. Hardeberg, Peter Nussbaum, Bernice E. Rogowitz, Thrasyvoulos N. Pappas
2008 Human Vision and Electronic Imaging XIII  
A frequency map from the eye tracker data was applied as a weighting to the image difference metrics.  ...  We carried out a psychophysical experiment with 25 observers along with a recording of the observers' gaze positions.  ...  stripe in the middle of the image have a larger visual difference from the original.  ... 
doi:10.1117/12.764468 dblp:conf/hvei/PedersenHN08 fatcat:sprkvaqc7bckncaq6tazspslzi
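
The weighting idea mentioned in the snippet above can be sketched as follows, assuming fixation points recorded by an eye tracker and a simple per-pixel absolute difference as the underlying metric; the function names, the Gaussian smoothing step, and the sigma value are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch: weight a per-pixel image-difference map by a gaze
# frequency map accumulated from fixation points.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_frequency_map(fixations, shape, sigma=30.0):
    """Accumulate fixation points into a smoothed, normalized frequency map."""
    freq = np.zeros(shape, dtype=np.float64)
    for x, y in fixations:                      # fixation coordinates in pixels
        if 0 <= y < shape[0] and 0 <= x < shape[1]:
            freq[int(y), int(x)] += 1.0
    freq = gaussian_filter(freq, sigma=sigma)   # spread fixations over a neighbourhood
    return freq / freq.max() if freq.max() > 0 else freq

def gaze_weighted_difference(original, reproduction, fixations):
    """Mean per-pixel difference, weighted by where observers actually looked."""
    diff = np.abs(original.astype(np.float64) - reproduction.astype(np.float64))
    if diff.ndim == 3:                          # collapse colour channels
        diff = diff.mean(axis=2)
    w = gaze_frequency_map(fixations, diff.shape)
    return (w * diff).sum() / (w.sum() + 1e-12)
```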

Aggregated Gaze Data Visualization Using Contiguous Irregular Cartograms

Vassilios Krassanakis
2021 Digital  
Gaze data visualization constitutes one of the most critical processes during eye-tracking analysis.  ...  In the present study, contiguous irregular cartograms are used as a method to visualize eye-tracking data captured by several observers during the observation of a visual stimulus.  ...  Data Availability Statement: Not applicable. Conflicts of Interest: The author declares no conflict of interest.  ... 
doi:10.3390/digital1030010 fatcat:vduci4kmhfdhpp5vgt32hy2v5m

BarvEye - Bifocal Active Gaze Control for Autonomous Driving

Ernst Dieter Dickmanns
2015 Proceedings of the 10th International Conference on Computer Vision Theory and Applications  
For approaching human levels of performance, larger knowledge bases on separate levels for a) image features, b) objects / subjects, and c) situations in application domains have to be developed in connection  ...  Visual ranges of more than 200 m and simultaneous fields of view of at least 100° seem to be minimal requirements; potential viewing angles of more than 200° are desirable at road crossings and at traffic  ...  Figure 3: Position of traffic sign in wide angle image (curves 1), and gaze direction of the yaw platform (curve 2) for detecting, tracking, and high-resolution imaging of the sign.  ... 
doi:10.5220/0005258904280436 dblp:conf/visapp/Dickmanns15a fatcat:mgignbqtcff7jfbx5v5kjjp7yu

Gaze stabilization in mantis shrimp in response to angled stimuli

Ilse M. Daly, Martin J. How, Julian C. Partridge, Nicholas W. Roberts
2019 Journal of Comparative Physiology A. Sensory, neural, and behavioral physiology  
In this work, we reinforce this finding, demonstrating that the yaw gaze stabilization response of the mantis shrimp is robust to the ambiguous motion cues arising from the motion of striped visual gratings  ...  When it comes to mantis shrimp, however, the situation becomes complicated due to the complexity of their visual system and their range of eye movements.  ...  IMD and JCP analysed the data. All authors helped with experimental design as well as with writing and editing the manuscript.  ... 
doi:10.1007/s00359-019-01341-5 pmid:31093738 pmcid:PMC6647723 fatcat:4udekvercrfsrldwpsvzj5rahm

Saliency-Based Gaze Visualization for Eye Movement Analysis

Sangbong Yoo, Seongmin Jeong, Seokyeon Kim, Yun Jang
2021 Sensors  
Gaze movement and visual stimuli have been utilized to analyze human visual attention intuitively. Gaze behavior studies mainly show statistical analyses of eye movements and human visual attention.  ...  Therefore, it is not easy to analyze how visual stimuli affect gaze movements since existing techniques focus excessively on the eye movement data.  ...  Data Availability Statement: Not applicable. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s21155178 fatcat:5vgryneurvbaljpvreodc25vf4

Visual gaze control during peering flight manoeuvres in honeybees

N. Boeddeker, J. M. Hemmi
2009 Proceedings of the Royal Society of London. Biological Sciences  
Gaze stabilization reduces motion blur and prevents image rotations. It also assists in depth perception based on translational optic flow.  ...  By having bees fly through an oscillating, patterned drum, we show that head stabilization is based mainly on visual motion cues.  ...  In every image of the video sequence, the pixel coordinates of the two eyes were then determined automatically by shifting and rotating the templates of the two eyes until the best match was found, determined  ... 
doi:10.1098/rspb.2009.1928 pmid:20007175 pmcid:PMC2842814 fatcat:vpcqub6h6bgrhefvlrnn3odtyu
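
The automatic eye localisation described in the snippet (shifting and rotating eye templates until the best match is found) corresponds to a standard rotated template search. A minimal OpenCV sketch is given below; the angle range, scoring method, and function names are illustrative assumptions rather than the authors' code.

```python
# Rough sketch of rotated template matching: slide a template over the frame
# at several rotations and keep the best-scoring position.
import cv2
import numpy as np

def locate_eye(frame_gray, template_gray, angles=range(-20, 21, 5)):
    best = (-1.0, None, 0)                       # (score, top-left position, angle)
    h, w = template_gray.shape
    for angle in angles:
        M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
        rotated = cv2.warpAffine(template_gray, M, (w, h))
        res = cv2.matchTemplate(frame_gray, rotated, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(res)
        if max_val > best[0]:
            best = (max_val, max_loc, angle)
    score, (x, y), angle = best
    return (x + w // 2, y + h // 2), angle, score  # eye centre, rotation, confidence
```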

Polarization contrasts and their effect on the gaze stabilisation of crustaceans

Christian Drerup, Martin J. How
2021 Journal of Experimental Biology  
Many animals go to great lengths to stabilise their eyes relative to the visual scene and do so to enhance the localisation of moving objects and to functionally partition the visual system relative to  ...  This work therefore suggests that the gaze stabilisation in many crustaceans cannot be elicited by the polarization of light alone.  ...  Fiddler crab collection was carried out with the authorization of the Consejería de Agricultura, Ganadería, Pesca y Desarrollo Sostenible, Junta de Andalucía.  ... 
doi:10.1242/jeb.229898 pmid:33692078 pmcid:PMC8077661 fatcat:uemfg5hxorfsje5qzjjdna456q

Single web camera robust interactive eye-gaze tracking method

A. Wojciechowski, K. Fornalczyk
2015 Bulletin of the Polish Academy of Sciences: Technical Sciences  
The paper presents a very effective, low-cost, computer-vision-based, interactive eye-gaze tracking method.  ...  Eye-gaze tracking is an aspect of human-computer interaction still growing in popularity. Tracking the human gaze point can help control user interfaces and may help evaluate graphical user interfaces.  ...  Subtraction of the left and right eye images (one of the eye images was mirrored) allowed the gaze direction to be evaluated.  ... 
doi:10.1515/bpasts-2015-0100 fatcat:ny4fuydvrrfotdvcyxjrv774uq
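
The subtraction cue mentioned in the snippet can be sketched as follows: mirror one eye patch, subtract it from the other, and read the horizontal asymmetry of the residual as a rough gaze-direction cue. This is a hedged illustration of the general idea, not the paper's algorithm; the sign convention and normalization are assumptions.

```python
# Minimal sketch: mirrored subtraction of the two eye patches as a coarse
# horizontal gaze cue.
import numpy as np

def gaze_cue_from_eye_pair(left_eye, right_eye):
    """left_eye, right_eye: grayscale patches of equal size (H x W)."""
    mirrored = np.fliplr(right_eye.astype(np.float64))
    residual = np.abs(left_eye.astype(np.float64) - mirrored)
    half = residual.shape[1] // 2
    left_mass = residual[:, :half].sum()
    right_mass = residual[:, half:].sum()
    # Positive values suggest gaze towards one side, negative towards the other.
    return (right_mass - left_mass) / (left_mass + right_mass + 1e-12)
```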

Effects of Experts' Annotations on Fashion Designers Apprentices' Gaze Patterns and Verbalisations

Alessia E. Coppi, Catharine Oertel, Alberto Cattaneo
2021 Zenodo  
durations and gaze coverage) and verbalisations (i.e., image descriptions) are affected.  ...  Therefore, this study focuses on trying to convey a professional way to look at images by exposing apprentices to images annotated (e.g., circles) by experts and identifying if their gaze (e.g., fixation  ...  Hypotheses Based on the above-mentioned literature, annotations appear to be an effective technique for promoting visual expertise, often combined with eye tracking techniques.  ... 
doi:10.5281/zenodo.5751516 fatcat:xpsgyqgjcbbivc37z4asaubryq

Downward Gazing for Steadiness [article]

Yogev Koren, Rotem Mairon, Ilay Sofer, Yisrael Parmet, Ohad Ben-Shahar, Simona Bar-Haim
2020 bioRxiv   pre-print
AbstractWhen walking on an uneven surface or complex terrain, humans tend to gaze downward. Previous investigations indicate that visual information can be used for online control of stepping.  ...  Moreover, this evidence raises concerns regarding the way we interpret gaze behavior without the knowledge of the type and use of the information gathered.  ...  Acknowledgments This research was supported by the Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Initiative and by the Marcus Endowment Fund both at Ben-Gurion University of  ... 
doi:10.1101/2020.02.28.969162 fatcat:fqexgx6mrfflfiszkskupgxs3m

Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations

Anuradha Kar, Peter Corcoran
2018 Sensors  
eye tracking systems.  ...  An eye tracker's accuracy and system behavior play critical roles in determining the reliability and usability of eye gaze data obtained from them.  ...  Project ID: 13/SPP/I2868 on "Next Generation Imaging for Smartphone and Embedded Platforms". Conflicts of Interest: The authors declare no conflict of interest. Sensors 2018, 18, 3151  ... 
doi:10.3390/s18093151 pmid:30231547 pmcid:PMC6165570 fatcat:shu3ulrbzze3vbbocdyoifbcva
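
One of the standard quantitative metrics in this line of work is angular accuracy, i.e. the error between estimated and true on-screen gaze points expressed in degrees of visual angle. A minimal sketch follows; the screen geometry parameters (pixel pitch, viewing distance) and the simplification that the eye lies on the screen normal are assumptions for illustration only.

```python
# Angular gaze-estimation error in degrees of visual angle, under a simplified
# geometry where the eye sits on the normal through the screen origin.
import numpy as np

def angular_error_deg(est_px, true_px, px_per_mm, viewing_distance_mm):
    """est_px, true_px: (N, 2) arrays of on-screen gaze points in pixels."""
    est_mm = np.asarray(est_px, dtype=np.float64) / px_per_mm
    true_mm = np.asarray(true_px, dtype=np.float64) / px_per_mm
    d = viewing_distance_mm
    # Vectors from the eye to each on-screen point.
    v_est = np.column_stack([est_mm, np.full(len(est_mm), d)])
    v_true = np.column_stack([true_mm, np.full(len(true_mm), d)])
    cos = np.sum(v_est * v_true, axis=1) / (
        np.linalg.norm(v_est, axis=1) * np.linalg.norm(v_true, axis=1))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))  # per-sample errors
```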

Gaze Data for the Analysis of Attention in Feature Films

Katherine Breeden, Pat Hanrahan
2017 ACM Transactions on Applied Perception  
We also thank Tilke Judd and colleagues for making their data available. Video encoding and frame extraction were performed using FFmpeg (http://ffmpeg.org/) and HandBrake (https://handbrake.fr).  ...  Many previous studies using eye tracking to probe visual attention have focused on static images; these studies have explored the influence of both low- and high-level image features.  ...  Aggregate analysis of the spatial distribution of these gaze points reveals subtle differences when compared to eye tracking still images.  ... 
doi:10.1145/3127588 fatcat:7h6nwhhiobckthdiso3gma6u54

Audio- and Gaze-driven Facial Animation of Codec Avatars [article]

Alexander Richard, Colin Lea, Shugao Ma, Juergen Gall, Fernando de la Torre, Yaser Sheikh
2020 arXiv   pre-print
In this paper we describe the first approach to animate these parametric models in real-time which could be deployed on commodity virtual reality hardware using audio and/or eye tracking.  ...  the supplemental video which demonstrates our ability to generate full face motion far beyond the typically neutral lip articulations seen in competing work: https://research.fb.com/videos/audio-and-gaze-driven-facial-animation-of-codec-avatars  ...  Figure 1 shows images of all three subjects. Tracked 3D meshes, a tracked deep active appearance model, and head pose were extracted in a similar manner as described in Lombardi et al. [27] .  ... 
arXiv:2008.05023v1 fatcat:buynbu3ztbbslkwkfjwnuxcu6u

A Tree-Structured Model of Visual Appearance Applied to Gaze Tracking [chapter]

Jeffrey B. Mulligan
2005 Lecture Notes in Computer Science  
The ideas are illustrated with examples from an outdoor gaze-tracking application.  ...  Repeated application of the partitioning procedure results in a tree-structured representation of the image space.  ...  When images of the eye are collected in the laboratory, illumination can be carefully controlled, and the variations in pose are often restricted (e.g., we may only be interested in tracking the gaze within  ... 
doi:10.1007/11595755_37 fatcat:irg3pfpq2fc5tdu4pgwzqqhjua
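
The snippet describes repeatedly partitioning the space of eye images to obtain a tree-structured representation. One common way to realize such a recursive partition is a binary split tree built with 2-means clustering over image feature vectors; the sketch below is a generic illustration under that assumption, not the paper's specific procedure.

```python
# Generic recursive 2-means partition of image feature vectors into a binary
# tree; purely illustrative of a tree-structured representation of image space.
import numpy as np
from sklearn.cluster import KMeans

class Node:
    def __init__(self, indices, left=None, right=None):
        self.indices = indices          # indices of images reaching this node
        self.left, self.right = left, right

def build_tree(features, indices=None, min_size=10, depth=0, max_depth=8):
    if indices is None:
        indices = np.arange(len(features))
    if len(indices) < min_size or depth >= max_depth:
        return Node(indices)            # leaf: too few images or tree deep enough
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(features[indices])
    left_idx, right_idx = indices[labels == 0], indices[labels == 1]
    if len(left_idx) == 0 or len(right_idx) == 0:
        return Node(indices)            # degenerate split; stop here
    return Node(indices,
                build_tree(features, left_idx, min_size, depth + 1, max_depth),
                build_tree(features, right_idx, min_size, depth + 1, max_depth))
```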