
EyeTrackingVRDataset

Version 5 2024-10-14, 21:23
Version 4 2024-10-11, 20:20
Version 3 2024-10-11, 18:54
Version 2 2024-10-11, 18:52
Version 1 2024-05-06, 15:22
dataset
posted on 2024-10-14, 21:23, authored by Colin Rubow, Chia-Hsuan Tsai, Haohan Zhang, Daniel S. Brown

A multimodal dataset of paired head and eye movements acquired in controlled virtual reality environments. The data comprise head and eye movements from 20 volunteers who interacted with four virtual reality environments requiring coordinated head and eye behavior: two visual pursuit tasks and two visual search tasks. Each participant performed each task three times, yielding approximately 1080 seconds of paired head and eye movement per participant. This dataset enables research into predictive models of intended head movement conditioned on gaze, with applications in augmented and virtual reality experiences and in assistive devices such as powered neck exoskeletons for individuals with head-neck mobility limitations. It also supports biobehavioral and mechanistic studies of the variability in head and eye movement across participants and tasks.
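As a quick back-of-envelope check of the figures quoted above (assuming, hypothetically, that the three repetitions of the four tasks are of roughly equal length), the per-trial duration and total recording time work out as follows:

```python
# Figures stated in the dataset description.
participants = 20
tasks = 4                      # 2 pursuit + 2 search tasks
repetitions = 3                # each task performed three times
seconds_per_participant = 1080 # approximate, as stated

# Derived quantities (assuming equal-length trials).
trials_per_participant = tasks * repetitions
seconds_per_trial = seconds_per_participant / trials_per_participant
total_hours = participants * seconds_per_participant / 3600

print(trials_per_participant)  # 12 trials per participant
print(seconds_per_trial)       # ~90 s per trial
print(total_hours)             # 6.0 h of paired recordings overall
```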

Funding

R21-EB035378
