GazeBase is a large-scale longitudinal dataset consisting of 12,334 monocular (left-eye) eye-movement recordings captured from 322 college-aged subjects. Subjects completed a battery of seven tasks in two adjacent sessions during each round of recording: 1) a fixation task, 2) a horizontal saccade task, 3) a random oblique saccade task, 4) a reading task, 5/6) two free viewing of cinematic video tasks, and 7) a gaze-driven gaming task. A total of nine rounds of recording were conducted over a 37-month period, with participants in each subsequent round recruited exclusively from prior rounds. All data were collected using an EyeLink 1000 eye tracker at a 1,000 Hz sampling rate, with a calibration and validation protocol performed before each task to ensure data quality. Due to its large number of participants and longitudinal nature, GazeBase is well suited for exploring research hypotheses in eye movement biometrics, along with other emerging applications of machine learning to eye-movement signal analysis. Eye movement classification labels produced by the instrument's real-time parser are provided for a relevant subset of GazeBase, along with pupil area. Full details regarding the experimental protocol are available in the corresponding data descriptor manuscript.
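As an illustration of how a single recording might be loaded for analysis, the sketch below assumes a per-sample CSV layout containing a timestamp, horizontal and vertical gaze position, pupil area, and the parser's classification label. The column names (`n`, `x`, `y`, `dP`, `lab`) and units are assumptions made for this example only; consult the data descriptor manuscript for the authoritative file schema.

```python
import csv
import io

# Synthetic stand-in for one recording file. Column names and units are
# assumptions for illustration, not the confirmed GazeBase schema:
#   n   - sample timestamp in ms (1,000 Hz sampling)
#   x,y - gaze position in degrees of visual angle
#   dP  - pupil area
#   lab - eye movement classification label from the real-time parser
SAMPLE = """n,x,y,dP,lab
0,0.12,-0.05,1032.0,1
1,0.13,-0.04,1031.0,1
2,0.15,-0.02,1030.5,2
"""

def load_recording(fileobj):
    """Parse a per-sample CSV into a list of dicts with numeric fields."""
    reader = csv.DictReader(fileobj)
    samples = []
    for row in reader:
        samples.append({
            "t_ms": int(row["n"]),
            "x_deg": float(row["x"]),
            "y_deg": float(row["y"]),
            "pupil_area": float(row["dP"]),
            "label": int(row["lab"]),
        })
    return samples

samples = load_recording(io.StringIO(SAMPLE))
print(len(samples), samples[0]["x_deg"])
```

At a 1,000 Hz sampling rate, consecutive timestamps differ by 1 ms, so a real recording of a one-minute task would contain roughly 60,000 such samples.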
Funding
CAREER: Secure and Trustworthy Ocular Biometrics
Directorate for Computer & Information Science & Engineering