
A Human Factors Evaluation of Mixed Reality Technologies for Command and Control Applications

journal contribution
posted on 2018-01-03, 09:56 authored by Christopher Bibb
Technical paper presented at the 2017 Defence and Security Doctoral Symposium.

This research assesses the Human Factors aspects of adopting Mixed Reality (MxR) technologies for advanced interaction and visualisation techniques within future cockpit environments, where the role of the occupant is envisaged to change from that of a pilot to a mission management specialist. Specifically, the work sets out to assess the impact on human perceptual-motor performance, cognition, workload and situational awareness of replacing physical display and control facilities with virtual alternatives. The use of fully virtual displays, viewed through a head-mounted display, allows for the rapid integration of new advanced sensor visualisation methods and supports the tailoring of bespoke, task-oriented interface layouts to the end user's immediate (and dynamically changing) needs, in contrast to the rigid nature of physical displays. Furthermore, a virtual display can reduce the time and cost of changes and upgrades without incurring extensive physical modifications to the platform. For continuous gross input tasks, early results indicate that the MxR system offers modest improvements in response time compared with existing systems (touchscreen and hands-on-throttle-and-stick, or HOTAS, input methods), with reduced physical and cognitive workload over prolonged use. However, for complex interaction tasks, the MxR system was subjectively rated as a more cumbersome display and interface method.

Funding

iCASE studentship funded by the EPSRC and industrial sponsor BAE Systems
