
When Do People Want an Explanation from a Robot? (HRI'24)

This repository contains the data and code for the paper "When Do People Want an Explanation from a Robot?", which you can read here. The anonymized questionnaire responses are in the data folder, the analysis code is provided as a Jupyter notebook, and the questionnaire used to gather the data is available as a PDF.
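If you want a quick look at the responses before opening the notebook, a minimal pandas sketch like the one below should work; note that the file name data/questionnaire_responses.csv is a placeholder, so substitute the actual CSV shipped in the data folder.

import pandas as pd

# Load the anonymized questionnaire responses.
# NOTE: the file name below is a placeholder, not necessarily the
# name used in this repository -- check the data folder.
responses = pd.read_csv("data/questionnaire_responses.csv")

# Basic sanity checks: number of rows (participants) and recorded columns.
print(responses.shape)
print(responses.columns.tolist())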

Reference

If you want to refer to or use the data, findings, videos, etc., please cite:

@inproceedings{wachowiak_hri2024,
  author = {Wachowiak, Lennart and Fenn, Andrew and Kamran, Haris and Coles, Andrew and Celiktutan, Oya and Canal, Gerard},
  title = {When Do People Want an Explanation from a Robot?},
  year = {2024},
  isbn = {9798400703225},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  doi = {10.1145/3610977.3634990},
  booktitle = {Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction},
  pages = {752–761},
  numpages = {10},
  keywords = {error mitigation, explainability, hri, human-agent interaction, user study, user-centered ai, xai},
  location = {Boulder, CO, USA},
  series = {HRI '24}
}

Abstract

Explanations are a critical topic in AI and robotics, and their importance in generating trust and allowing for successful human–robot interactions has been widely recognized. However, it is still an open question when and in what interaction contexts users most want an explanation from a robot. In our pre-registered study with 186 participants, we set out to identify a set of scenarios in which users show a strong need for explanations. Participants are shown 16 videos portraying seven distinct situation types, from successful human–robot interactions to robot errors and robot inabilities. Afterwards, they are asked to indicate if and how they wish the robot to communicate subsequent to the interaction in the video.

The results provide a set of interactions, grounded in literature and verified empirically, in which people show the need for an explanation. Moreover, we can rank these scenarios by how strongly users think an explanation is necessary and find statistically significant differences. Comparing giving explanations with other possible response types, such as the robot apologizing or asking for help, we find that why-explanations are always among the two highest-rated responses, with the exception of when the robot simply acts normally and successfully. This stands in stark contrast to the other possible response types that are useful in a much more restricted set of situations. Lastly, we test for factors of an individual that might influence their response preferences, for example, their general attitude towards robots, but find no significant correlations. Our results can guide roboticists in designing more user-centered and transparent interactions and let explainability researchers develop more pinpointed explanations.

HRI Videos

You can check out the stimuli used in the questionnaire on YouTube, ordered as in the PDF (the actual order was randomized for each participant).
