Integrating Human Gaze into Attention for Egocentric Activity Recognition [article]

Kyle Min, Jason J. Corso
2020 arXiv   pre-print
In this work, we introduce an effective probabilistic approach to integrate human gaze into spatiotemporal attention for egocentric activity recognition.  ...  Our method outperforms all the previous state-of-the-art approaches on EGTEA, which is a large-scale dataset for egocentric activity recognition provided with gaze measurements.  ...  Acknowledgement: We thank Ryan Szeto and Christina Jung for their valuable comments. This research was, in part, supported by NIST grant 60NANB17D191.  ...  (A hedged sketch of this gaze-as-attention-prior mechanism follows below.)
arXiv:2011.03920v1 fatcat:jsrsmciatfbvllzuy66firfbom
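
A minimal sketch of the general mechanism this abstract describes: fold a measured gaze point into a spatial attention map as a probabilistic prior. The Gaussian prior, all tensor shapes, and the mixing weight `alpha` are illustrative assumptions, not the authors' actual model.

```python
# Sketch only: gaze point as a Gaussian prior mixed with a learned attention map.
import torch

def gaussian_prior(gaze_xy, h, w, sigma=0.1):
    """Gaze point (x, y) in [0,1]^2 -> (h, w) heatmap normalized to sum to 1."""
    ys = torch.linspace(0.0, 1.0, h).view(h, 1)
    xs = torch.linspace(0.0, 1.0, w).view(1, w)
    d2 = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2
    heat = torch.exp(-d2 / (2 * sigma ** 2))
    return heat / heat.sum()

def gaze_guided_pool(features, attn_logits, gaze_xy, alpha=0.5):
    """features: (c, h, w); attn_logits: (h, w). Returns a (c,) descriptor."""
    c, h, w = features.shape
    learned = torch.softmax(attn_logits.flatten(), dim=0).view(h, w)
    # Hypothetical fixed mixing; a real model would learn this combination.
    attn = alpha * gaussian_prior(gaze_xy, h, w) + (1 - alpha) * learned
    return (features * attn).sum(dim=(1, 2))  # attention-weighted pooling

feats = torch.randn(256, 7, 7)
vec = gaze_guided_pool(feats, torch.randn(7, 7), torch.tensor([0.4, 0.6]))
print(vec.shape)  # torch.Size([256])
```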

An Introduction to the 3rd Workshop on Egocentric (First-Person) Vision

Steve Mann, Kris M. Kitani, Yong Jae Lee, M. S. Ryoo, Alireza Fathi
2014 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops  
with inside-out cameras (a camera to capture eye gaze and an outward-looking camera), recognizing human interactions and modeling focus of attention.  ...  We believe that this human-centric characteristic of egocentric vision can have a large impact on the way we approach central computer vision tasks such as visual detection, recognition, prediction, and  ...  During this time, Land and Hayhoe [30] also began exploring the relationship between eye gaze and hand motion, laying the conceptual groundwork for later integrating outside and inside egocentric vision  ... 
doi:10.1109/cvprw.2014.133 dblp:conf/cvpr/MannKLRF14 fatcat:3scdftvmbzgudkkbljndxawwjy

4th international workshop on pervasive eye tracking and mobile eye-based interaction

Thies Pfeiffer, Sophie Stellmach, Yusuke Sugano
2014 Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct Publication - UbiComp '14 Adjunct  
With the recent growth of interest in smart glass devices and low-cost eye trackers, however, gaze-based techniques for mobile computing are becoming increasingly important.  ...  Previous work on eye tracking and eye-based human-computer interfaces mainly concentrated on making use of the eyes in traditional desktop settings.  ...  For example, egocentric videos have been used to reconstruct 3D social gaze to analyze human-human interaction [15], and eye gaze information has been incorporated into first-person activity recognition  ... 
doi:10.1145/2638728.2641686 dblp:conf/huc/PfeifferSS14 fatcat:xxhe3ubl3rcidoxsapjmw4zb3m

Attention is All We Need: Nailing Down Object-centric Attention for Egocentric Activity Recognition [article]

Swathikiran Sudhakaran, Oswald Lanz
2018 arXiv   pre-print
In this paper we propose an end-to-end trainable deep neural network model for egocentric activity recognition.  ...  We learn highly specialized attention maps for each frame using class-specific activations from a CNN pre-trained for generic image recognition, and use them for spatio-temporal encoding of the video with  ...  [Figure 5: Spatial attention for (a) close coffee, (b) close honey, (c) close mustard]  ...  (A hedged sketch of this class-activation attention idea follows below.)
arXiv:1807.11794v1 fatcat:sqentueowrba5oesvn2wgxxsky
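
A rough sketch of the mechanism named in this abstract: derive a spatial attention map from the class-specific activations (a class activation map) of a CNN pre-trained for image recognition, then pool frame features under it. `fc_weights` and all shapes are hypothetical stand-ins, not the paper's model.

```python
# Sketch only: class-activation-map (CAM) attention over frame features.
import torch

def cam_attention(feature_maps, fc_weights):
    """feature_maps: (c, h, w); fc_weights: (num_classes, c) classifier head."""
    logits = fc_weights @ feature_maps.mean(dim=(1, 2))  # global-avg-pool scores
    top_class = logits.argmax()
    # CAM: class-specific weighted sum of the feature maps.
    cam = torch.einsum("c,chw->hw", fc_weights[top_class], feature_maps)
    cam = torch.relu(cam)
    return cam / (cam.sum() + 1e-8)  # normalized spatial attention

feats = torch.randn(512, 7, 7)
weights = torch.randn(1000, 512)                   # e.g. an ImageNet head
attn = cam_attention(feats, weights)
frame_descriptor = (feats * attn).sum(dim=(1, 2))  # (512,) per-frame encoding
```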

In the Eye of the Beholder: Gaze and Actions in First Person Video [article]

Yin Li, Miao Liu, James M. Rehg
2020 arXiv   pre-print
We further sample from these stochastic units, generating an attention map to guide the aggregation of visual features for action recognition.  ...  Moving beyond the dataset, we propose a novel deep model for joint gaze estimation and action recognition in FPV.  ...  This work was also partially supported by the Intel Science and Technology Center for Pervasive Computing (ISTC-PC). The work was developed during the first author's Ph.D. at Georgia Tech.  ...  (A hedged sketch of stochastic attention sampling follows below.)
arXiv:2006.00626v2 fatcat:dhgnxh77pbf7fh32anb72wd4ym
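
A hypothetical sketch of "sampling from stochastic units to generate an attention map": treat predicted fixation logits as a categorical distribution over spatial locations and draw a differentiable sample with Gumbel-softmax. This illustrates the general mechanism under our own assumptions, not the paper's architecture.

```python
# Sketch only: differentiable sampling of a spatial attention map.
import torch
import torch.nn.functional as F

def sample_attention(gaze_logits, tau=0.5):
    """gaze_logits: (h, w) unnormalized log-probabilities of fixation."""
    h, w = gaze_logits.shape
    flat = gaze_logits.flatten().unsqueeze(0)             # (1, h*w)
    sample = F.gumbel_softmax(flat, tau=tau, hard=False)  # stochastic, trainable
    return sample.view(h, w)

attn = sample_attention(torch.randn(7, 7))
video_feats = torch.randn(1024, 7, 7)
clip_vec = (video_feats * attn).sum(dim=(1, 2))  # gaze-guided feature pooling
```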

Eyewear Computing – Augmenting the Human with Head-mounted Wearable Assistants (Dagstuhl Seminar 16042)

Andreas Bulling, Ozan Cakmakci, Kai Kunze, James M. Rehg, Marc Herbstritt
2016 Dagstuhl Reports  
during the seminar, 4) an article in ACM Interactions entitled "Eyewear Computers for Human-Computer Interaction", as well as 5) two follow-up workshops on "Egocentric Perception, Interaction, and Computing"  ...  augmentation and task guidance, eyewear computing for gaming, as well as prototyping of VR applications, 2) a list of datasets and research tools for eyewear computing, 3) three small-scale datasets recorded  ...  If gaze estimation can be naturally integrated into daily-life scenarios with these techniques, collaboration between eyewear computers and human attention will have greater potential for future investigation.  ... 
doi:10.4230/dagrep.6.1.160 dblp:journals/dagstuhl-reports/BullingCKR16 fatcat:d3p4lmvat5aczhawlazkd2chpa

Digging Deeper Into Egocentric Gaze Prediction

Hamed Rezazadegan Tavakoli, Esa Rahtu, Juho Kannala, Ali Borji
2019 2019 IEEE Winter Conference on Applications of Computer Vision (WACV)  
This paper digs deeper into factors that influence egocentric gaze.  ...  where the tasks or sequences are similar, and (7) task and activity recognition can benefit from gaze prediction.  ...  This analysis shows that gaze can be used to further improve activity and task recognition and has applications in egocentric vision.  ... 
doi:10.1109/wacv.2019.00035 dblp:conf/wacv/TavakoliRKB19 fatcat:vgzfer6zhrfjjnmpqjzvvm6cfi

Learning to Recognize Daily Actions Using Gaze [chapter]

Alireza Fathi, Yin Li, James M. Rehg
2012 Lecture Notes in Computer Science  
We demonstrate improvements in action recognition rates and gaze prediction accuracy relative to state-of-the-art methods on two new datasets that contain egocentric videos of daily activities and gaze  ...  We present a probabilistic generative model for simultaneously recognizing daily actions and predicting gaze locations in videos recorded from an egocentric camera.  ...  In this paper, we address the question of how such gaze measurements could be useful for activity recognition in egocentric video.  ...  (An illustrative factorization of such a joint model follows below.)
doi:10.1007/978-3-642-33718-5_23 fatcat:ay4dgavy3bfvljxxyii2hs3efy
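
One way to write down a joint generative model over action a, gaze g, and observed video features x, in the spirit of this abstract. The notation and factorization are ours for illustration; the paper's exact model may differ.

```latex
% Illustrative factorization; not necessarily the paper's exact model.
\begin{aligned}
p(a, g, x) &= p(a)\, p(g \mid a)\, p(x \mid g, a), \\
\hat{a} &= \arg\max_{a} \sum_{g} p(a)\, p(g \mid a)\, p(x \mid g, a)
  && \text{(recognize the action by marginalizing over gaze)}, \\
\hat{g} &= \arg\max_{g}\; p(g \mid \hat{a})\, p(x \mid g, \hat{a})
  && \text{(predict gaze given the recognized action)}.
\end{aligned}
```

Coupling the two inference problems this way lets gaze act as a latent attention variable during recognition, while the recognized action in turn constrains where gaze is likely to fall.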

Recognition of Activities of Daily Living with Egocentric Vision: A Review

Thi-Hoa-Cuc Nguyen, Jean-Christophe Nebel, Francisco Florez-Revuelta
2016 Sensors  
This paper presents a review of the state of the art of egocentric vision systems for the recognition of ADLs following a hierarchical structure: motion, action and activity levels, where each level provides  ...  Video-based recognition of activities of daily living (ADLs) is being used in ambient assisted living systems in order to support the independent living of older people.  ...  According to this, human behaviour analysis tasks are classified [9] into motion, action, activity and behaviour (Figure 1).  ... 
doi:10.3390/s16010072 pmid:26751452 pmcid:PMC4732105 fatcat:okm2fswkjrdzleelae46u3nfna

Activity Recognition in Egocentric Life-Logging Videos [chapter]

Sibo Song, Vijay Chandrasekhar, Ngai-Man Cheung, Sanath Narayan, Liyuan Li, Joo-Hwee Lim
2015 Lecture Notes in Computer Science  
Activities in LENA can also be grouped into 5 top-level categories to meet the varied demands of activity-analysis research.  ...  We evaluate state-of-the-art activity recognition using LENA in detail and also analyze the performance of popular descriptors in egocentric activity recognition.  ...  The Georgia Tech Egocentric Activities Gaze(+) datasets [6] consist of two datasets which contain gaze location information associated with egocentric  ... 
doi:10.1007/978-3-319-16634-6_33 fatcat:lq26nyp6xzhyrmoeehrmaykwbe

Teacher's Perception in the Classroom [article]

Ömer Sümer, Patricia Goldberg, Kathleen Stürmer, Tina Seidel, Peter Gerjets, Ulrich Trautwein, Enkelejda Kasneci
2018 arXiv   pre-print
A teacher's ability to engage all students in active learning processes in the classroom is a crucial prerequisite for enhancing student achievement.  ...  Teachers' attentional processes provide important insights into teachers' ability to focus their attention on relevant information in the complexity of classroom interaction and distribute their attention  ...  However, their work does not include eye tracking and gaze estimation for a fine-scale analysis of human attention.  ... 
arXiv:1805.08897v1 fatcat:pih4ginf5bbsjb6jjzlnhh2ixa

Predicting Gaze in Egocentric Video by Learning Task-dependent Attention Transition [article]

Yifei Huang, Minjie Cai, Zhenqiang Li, Yoichi Sato
2018 arXiv   pre-print
Experiments on public egocentric activity datasets show that our model significantly outperforms state-of-the-art gaze prediction methods and is able to learn meaningful transition of human attention.  ...  We present a new computational model for gaze prediction in egocentric videos by exploring patterns in temporal shift of gaze fixations (attention transition) that are dependent on egocentric manipulation  ...  and predict human gaze in egocentric video [37].  ... 
arXiv:1803.09125v3 fatcat:rqqcy636dvg53ptodus5xwhc4y

Predicting Gaze in Egocentric Video by Learning Task-Dependent Attention Transition [chapter]

Yifei Huang, Minjie Cai, Zhenqiang Li, Yoichi Sato
2018 Lecture Notes in Computer Science  
Experiments on public egocentric activity datasets show that our model significantly outperforms state-of-the-art gaze prediction methods and is able to learn meaningful transition of human attention.  ...  We present a new computational model for gaze prediction in egocentric videos by exploring patterns in temporal shift of gaze fixations (attention transition) that are dependent on egocentric manipulation  ...  and predict human gaze in egocentric video [37].  ...  (A toy sketch of the attention-transition idea follows below.)
doi:10.1007/978-3-030-01225-0_46 fatcat:txpm32frjfe6zbfyk5jewuolue
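
A toy sketch of the attention-transition idea: a small convolutional module that rolls the gaze/attention map forward in time, conditioned on current frame features. The module design and layer sizes are illustrative assumptions, not the published architecture.

```python
# Sketch only: predict the next attention map from the current one.
import torch
import torch.nn as nn

class AttentionTransition(nn.Module):
    """Predicts the next gaze map from the current map plus frame features."""
    def __init__(self, feat_channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(feat_channels + 1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, prev_attn, frame_feats):
        # prev_attn: (b, 1, h, w); frame_feats: (b, c, h, w)
        logits = self.net(torch.cat([prev_attn, frame_feats], dim=1))
        b, _, h, w = logits.shape
        # Normalize over space so the output is again a valid attention map.
        return torch.softmax(logits.view(b, -1), dim=1).view(b, 1, h, w)

model = AttentionTransition()
attn = torch.softmax(torch.randn(2, 49), dim=1).view(2, 1, 7, 7)
next_attn = model(attn, torch.randn(2, 64, 7, 7))  # rolled forward in time
```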

Forecasting user attention during everyday mobile interactions using device-integrated and wearable sensors

Julian Steil, Philipp Müller, Yusuke Sugano, Andreas Bulling
2018 Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services - MobileHCI '18  
We instead study attention forecasting -- the challenging task of predicting users' gaze behaviour (overt visual attention) in the near future.  ...  We propose a proof-of-concept method that uses device-integrated sensors and body-worn cameras to encode rich information on device usage and users' visual scene.  ...  Acknowledgements: We would like to thank all participants for their help with the data collection, as well as Preeti Dolakasharia, Nahid Akhtar and Muhammad Muaz Usmani for their help with the annotation  ...  (A toy forecasting sketch follows below.)
doi:10.1145/3229434.3229439 dblp:conf/mhci/SteilMSB18 fatcat:hgsflge6knez7i43az5upjaltq
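
A proof-of-concept-style sketch of attention forecasting under stated assumptions: synthetic data and a hypothetical binary label such as "the user will shift gaze to the device within the next few seconds". It slides a window over fused sensor features and fits a plain classifier; the real pipeline and features differ.

```python
# Sketch only: sliding-window forecasting over fused sensor features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
T, d, win = 1000, 16, 10             # time steps, fused feature dim, window size
sensors = rng.normal(size=(T, d))    # stand-in for IMU + camera-derived features
labels = rng.integers(0, 2, size=T)  # stand-in for future-attention annotations

# Each sample: the flattened last `win` steps of sensor features.
X = np.stack([sensors[t - win:t].ravel() for t in range(win, T)])
y = labels[win:]

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("train accuracy:", clf.score(X, y))
```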

Recognition of Activities from Eye Gaze and Egocentric Video [article]

Anjith George, Aurobinda Routray
2018 arXiv   pre-print
This paper presents a framework for recognition of human activity from egocentric video and eye tracking data obtained from a head-mounted eye tracker.  ...  The combination of features achieves better accuracy for activity classification than the individual features alone.  ...  Recent works in egocentric video-based (first-person view) activity recognition [4], [5], [6] have shown great promise in providing insights into various activities.  ...  (A minimal feature-fusion sketch follows below.)
arXiv:1805.07253v1 fatcat:zthzt2p6ujhpnmzax7zssj7d6y
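
A minimal feature-fusion sketch mirroring this abstract's claim that combined gaze and video features classify activities better than either alone. The data, dimensionalities, and feature extractors are synthetic stand-ins, not the paper's pipeline.

```python
# Sketch only: concatenate gaze and video descriptors, train one classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
gaze_feats = rng.normal(size=(n, 32))    # e.g. fixation/saccade statistics
video_feats = rng.normal(size=(n, 128))  # e.g. pooled CNN frame descriptors
y = rng.integers(0, 5, size=n)           # five hypothetical activity classes

fused = np.concatenate([gaze_feats, video_feats], axis=1)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, fused, y, cv=5).mean())  # fused-feature accuracy
```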
Showing results 1 — 15 out of 2,260 results