
Data_Sheet_1_Humans in XAI: increased reliance in decision-making under uncertainty by using explanation strategies.zip

Dataset (1.11 MB), posted on 2024-03-08, authored by Olesja Lammert, Birte Richter, Christian Schütze, Kirsten Thommes, and Britta Wrede
Introduction

Although decision support systems (DSS) based on artificial intelligence (AI) increasingly explain opaque features of the decision process to computer and data scientists, especially when uncertainty is involved, little attention has been paid to making the process transparent to end users.

Methods

This paper compares four distinct explanation strategies employed by a DSS, represented by the social agent Floka, designed to assist end users in making decisions under uncertainty. Using an economic experiment with 742 participants who make lottery choices according to the Holt and Laury paradigm, we contrast two explanation strategies offering accurate information (transparent vs. guided) with two strategies prioritizing human-centered explanations (emotional vs. authoritarian) and a baseline (no explanation).
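For readers unfamiliar with the paradigm, the Holt and Laury task presents each participant with a fixed menu of ten paired lottery choices in which the probability of the high payoff rises row by row. The Python sketch below reconstructs this menu using the canonical payoff values from Holt and Laury (2002); the exact stakes used in this experiment are not stated in the abstract and may differ.

# A minimal sketch of the Holt and Laury (2002) lottery menu. Payoffs are
# the canonical values from the original paradigm, given here only for
# illustration; this study's actual stakes may differ.

def holt_laury_menu(safe=(2.00, 1.60), risky=(3.85, 0.10), rows=10):
    """Build the 10-decision menu: in row k, the high payoff of each
    option occurs with probability k/rows."""
    menu = []
    for k in range(1, rows + 1):
        p = k / rows
        ev_safe = p * safe[0] + (1 - p) * safe[1]
        ev_risky = p * risky[0] + (1 - p) * risky[1]
        menu.append({"p_high": p, "ev_safe": ev_safe, "ev_risky": ev_risky})
    return menu

if __name__ == "__main__":
    for row in holt_laury_menu():
        # A risk-neutral decision maker switches to the risky option once
        # its expected value exceeds the safe option's (row 5 with these
        # payoffs); earlier or later switch points indicate risk seeking
        # or risk aversion, respectively.
        choice = "risky" if row["ev_risky"] > row["ev_safe"] else "safe"
        print(f"p={row['p_high']:.1f}  EV_safe={row['ev_safe']:.2f}  "
              f"EV_risky={row['ev_risky']:.2f}  -> {choice}")

The row at which a participant switches from the safe to the risky lottery serves as a behavioral measure of risk attitude, which is what makes the paradigm suitable for studying reliance under uncertainty.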

Results and discussion

Our findings indicate that a guided explanation strategy results in higher user reliance than a transparent strategy. Furthermore, our results suggest that user reliance is contingent on the chosen explanation strategy, and, in some instances, the absence of an explanation can also lead to increased user reliance.
