A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
File type: application/pdf.
Generating accurate 3D gaze vectors using synchronized eye tracking and motion capture
[article]
2021
bioRxiv
pre-print
Assessing gaze behaviour during real-world tasks is difficult; dynamic bodies moving through dynamic worlds make finding gaze fixations challenging. Current approaches involve laborious coding of pupil positions overlaid on video. One solution is to combine eye tracking with motion tracking to generate 3D gaze vectors. When combined with tracked or known object locations, fixation detection can be automated. Here we use combined eye and motion tracking and explore how linear regression models
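The abstract describes mapping eye-tracker output to 3D gaze vectors with linear regression. A minimal sketch of that idea, assuming synthetic data and an assumed linear relationship between 2D pupil position and 3D gaze direction (all names and shapes here are illustrative, not the paper's actual pipeline):

```python
import numpy as np

# Hypothetical sketch: fit a linear map from 2D pupil positions to 3D gaze
# directions, one simple instance of the regression approach the abstract
# mentions. The calibration data below are synthetic (assumed, not from
# the paper).
rng = np.random.default_rng(0)

pupil = rng.uniform(-1.0, 1.0, size=(200, 2))             # (x, y) pupil positions
A_true = np.array([[0.8, 0.1],                            # assumed ground-truth
                   [0.0, 0.9],                            # linear mapping for the
                   [0.2, -0.3]])                          # synthetic example
gaze = pupil @ A_true.T + np.array([0.0, 0.0, 1.0])       # raw 3D directions
gaze /= np.linalg.norm(gaze, axis=1, keepdims=True)       # normalize to unit vectors

# Design matrix with a bias column; least-squares fit for each gaze component.
X = np.hstack([pupil, np.ones((pupil.shape[0], 1))])
W, *_ = np.linalg.lstsq(X, gaze, rcond=None)

def predict_gaze(p):
    """Map a pupil position (x, y) to a unit 3D gaze vector."""
    v = np.append(p, 1.0) @ W
    return v / np.linalg.norm(v)

print(predict_gaze([0.1, -0.2]))
```

In a real setup the regression targets would come from calibration trials where motion capture provides the known 3D fixation points; here they are simulated only to show the fitting step.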
doi:10.1101/2021.10.22.465332
fatcat:dpwbdjab6fd7vfjq3invam2y4y