Research Articles

Human-Centered Explainability for Intelligent Vehicles—A User Study

Pages 3237-3253 | Received 29 Dec 2022, Accepted 09 May 2023, Published online: 28 May 2023
Abstract

Advances in artificial intelligence (AI) are leading to an increased use of algorithm-generated user-adaptivity in everyday systems. Explainable AI aims to make algorithmic decision-making more transparent to humans. As future vehicles become more intelligent and user-adaptive, explainability will play an important role in ensuring that drivers understand the AI system’s functionalities and outputs. However, when integrating explainability into in-vehicle features, there is a lack of knowledge about user needs and requirements and about how to address them. We conducted a study with 59 participants focusing on how end-users evaluate explainability in the context of user-adaptive comfort and infotainment features. Results show that explanations foster perceived understandability and transparency of the system, but that the need for explanation may vary between features. Additionally, we found that insufficiently designed explanations can decrease acceptance of the system. Our findings underline the requirement for a user-centered approach in explainable AI and indicate directions for future research.

Acknowledgments

We thank everyone who supported our work by participating in this study. We also thank Lena Rittger for supporting the qualitative data analysis.

Ethical approval

The study was approved by the ethics committee of the Technical University of Munich under the approval number 718/21 S.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data that support the findings of this study are available from the corresponding author, Julia Graefe, upon reasonable request.

Additional information

Funding

The authors thank the AUDI AG for funding this work.

Notes on contributors

Julia Graefe

Julia Graefe studied Ergonomics—Human Factors Engineering at the Technical University of Munich. In March 2021 she started working as a research associate at the Chair of Ergonomics (Technical University of Munich). In cooperation with AUDI AG, her research focuses on human-centered explainability of adaptive in-vehicle systems.

Selma Paden

Selma Paden studied Industrial Engineering with Business Studies at the University of Applied Sciences Merseburg and graduated in 2022. From November 2021 to May 2022, she wrote her master’s thesis at AUDI AG in the field of human-centered explainability of adaptive in-vehicle systems.

Doreen Engelhardt

Doreen Engelhardt graduated in Applied Media and Communication Studies at the Technical University of Ilmenau. Since 2013 she has worked as a project lead at AUDI AG, Ingolstadt. She is responsible for pre-development projects covering innovations around empathic digital assistants, culturally adaptive UX, and artificial intelligence.

Klaus Bengler

Klaus Bengler graduated in Psychology at the University of Regensburg and received his doctorate in 1994 in cooperation with BMW. In 1997 he joined BMW. Since May 2009 he has led the Chair of Ergonomics at the Technical University of Munich.
