Recognition of Activities from Eye Gaze and Egocentric Video [article]

Anjith George, Aurobinda Routray
2018 arXiv pre-print
This paper presents a framework for recognizing human activity from egocentric video and eye-tracking data obtained from a head-mounted eye tracker. Three channels of information, namely eye movement, ego-motion, and visual features, are combined for the classification of activities. Image features were extracted using a pre-trained convolutional neural network. Eye and ego-motion are quantized, and the windowed histograms are used as the features. The combination of features obtains better accuracy for activity classification than the individual features.
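The quantize-then-histogram step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bin count, window length, and the use of a 1-D speed signal are all assumptions made for the example.

```python
import numpy as np

def windowed_histograms(samples, n_bins, window, step):
    """Quantize a 1-D motion signal into n_bins discrete symbols,
    then build a normalized histogram over each sliding window."""
    # Quantize: map each sample to an integer bin index in [0, n_bins)
    edges = np.linspace(samples.min(), samples.max(), n_bins + 1)
    symbols = np.clip(np.digitize(samples, edges) - 1, 0, n_bins - 1)
    feats = []
    for start in range(0, len(symbols) - window + 1, step):
        hist = np.bincount(symbols[start:start + window], minlength=n_bins)
        feats.append(hist / window)  # normalize counts to a distribution
    return np.array(feats)

# Hypothetical example: a synthetic eye-movement speed signal,
# 8 quantization bins, 50-sample windows with 50% overlap
rng = np.random.default_rng(0)
eye_speed = rng.random(200)
F = windowed_histograms(eye_speed, n_bins=8, window=50, step=25)
print(F.shape)  # (7, 8): 7 windows, one 8-bin histogram each
```

Per-window histogram rows like these would then be concatenated with the CNN image features before classification.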
arXiv:1805.07253v1