A social evaluation of the perceived goodness of explainability in machine learning

Pages 29-50 | Received 22 Nov 2020, Accepted 24 Jun 2021, Published online: 25 Jul 2021

ABSTRACT

Machine learning in decision support systems already outperforms pre-existing statistical methods. However, its predictions face challenges because the underlying calculations are often complex and not all model predictions are traceable. In fact, many well-performing models are black boxes to the user, who consequently cannot interpret and understand the rationale behind a model's prediction. Explainable artificial intelligence has emerged as a field of study to counteract this. However, current research often neglects the human factor. Against this backdrop, we derived and examined factors that influence the goodness of a model's explainability in a social evaluation with end users. We implemented six common ML algorithms for four different benchmark datasets in a two-factor factorial design and asked potential end users to rate different factors in a survey. Our results show that the perceived goodness of explainability is moderated by the problem type and correlates strongly with trustworthiness as the most important factor.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Bayerisches Staatsministerium für Wirtschaft, Landesentwicklung und Energie (StMWi) [DIK0143/02].
