
SNaSI: Social navigation through subtle interactions with an AI agent

Version 2 2019-04-22, 20:53
Version 1 2019-04-11, 07:59
journal contribution
posted on 2019-04-22, 20:53, authored by RTD Conference, Rebecca Kleinberger, Joshua Huburn, Martin Grayson, Cecily Morrison
Advances in technology have set the stage for intelligent visual agents, with many initial applications being created for people who are blind or have low vision. While most focus on spatial navigation, recent literature suggests that supporting social navigation could be particularly powerful, providing appropriate cues that allow blind and low vision people to enter into and sustain social interaction. A particularly poignant design challenge in enabling social navigation is managing agent interaction in a way that augments rather than disturbs social interaction. Usage of existing agent-like technologies has surfaced some of the difficulties in this regard. In particular, it is difficult to talk to a person while an agent is speaking to them. It is also difficult to speak with someone who is fiddling with a device to manipulate their agent. In this paper we present SNaSI, a wearable designed to provoke thinking about how we can support social navigation through subtle interaction. Specifically, we are interested in generating thinking about the triangular relationship between a blind user, a communication partner, and the system containing an AI agent. We explore how notions of subtlety, but not invisibility, can enable this triadic relationship. SNaSI builds upon previous research on sensory substitution and the work of Bach-y-Rita (Bach-y-Rita 2003), but explores those ideas in the form of a social instrument.
