Research Article

Understanding the impact and design of AI teammate etiquette

Christopher Flathmann, Nathan J. McNeese, Beau G. Schelble, Bart Knijnenburg & Guo Freeman
Received 06 Jul 2022, Accepted 02 Mar 2023, Published online: 24 Mar 2023
 

ABSTRACT

Technical and practical advancements in Artificial Intelligence (AI) have led to AI teammates working alongside humans in an area known as human-agent teaming. While past research has shown that incorporating interaction rules and structures (i.e., etiquette) into AI tools and robotic teammates benefits trust, research has yet to explicitly examine etiquette for digital AI teammates. Given the long-standing importance of trust within human-agent teams, identifying etiquette’s impact on these teams is paramount. Thus, this study empirically evaluates the impact of AI teammate etiquette through a mixed-methods study that compares AI teammates that either adhere to or ignore traditional etiquette standards for machine systems. The quantitative results show that adherence to traditional etiquette leads to greater trust, perceived performance of the AI, and perceived performance of the team as a whole. However, the qualitative results reveal that not all traditional etiquette behaviors have universal appeal due to individual differences. This research provides the first empirical and explicit exploration of etiquette within human-agent teams, and its results should be used to further the design of specific etiquette behaviors for AI teammates.

Disclosure statement

No potential conflict of interest was reported by the author(s).

HCI editorial record

First received on 7/5/2022.

Additional information

Funding

This material is based upon work supported by the National Science Foundation under Grant No. 1829008.

Notes on contributors

Christopher Flathmann

Christopher Flathmann ([email protected], https://chrisflathmann.com/) is a researcher with an interest in human-AI teaming; he is a research assistant professor in the School of Computing of Clemson University.

Nathan J. McNeese

Nathan J. McNeese ([email protected], https://nathanmcneese.weebly.com/) is a researcher with an interest in human-AI teaming; he is a dean’s professor in the School of Computing of Clemson University.

Beau Schelble

Beau G. Schelble ([email protected]) is a researcher with an interest in human-AI teaming; he is a Ph.D. student in the School of Computing of Clemson University.

Bart Knijnenburg

Bart Knijnenburg ([email protected], https://www.usabart.nl) is a researcher with an interest in privacy decision-making and recommender systems; he is an assistant professor in the School of Computing of Clemson University.

Guo Freeman

Guo Freeman ([email protected], https://computing.clemson.edu/cugame/index.html) is a researcher with an interest in online communities; she is an assistant professor in the School of Computing of Clemson University.

