Research Article

Examining the Effect of Anthropomorphic Design Cues on Healthcare Chatbots Acceptance and Organization-Public Relationships: Trust in a Warm Human Vs. a Competent Machine

Received 13 Jun 2023, Accepted 28 Nov 2023, Published online: 11 Dec 2023

References

  • Acikgoz, F., & Vega, R. P. (2022). The role of privacy cynicism in consumer habits with voice assistants: A technology acceptance model perspective. International Journal of Human–Computer Interaction, 38(12), 1138–1152. https://doi.org/10.1080/10447318.2021.1987677
  • Bialkova, S. (2023). How to optimise interaction with chatbots? Key parameters emerging from actual application. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2023.2219963
  • Brown, J. E. H., & Halpern, J. (2021). AI chatbots cannot replace human interactions in the pursuit of more inclusive mental healthcare. SSM - Mental Health, 1, 100017. https://doi.org/10.1016/j.ssmmh.2021.100017
  • Chaiken, S. (1980). Heuristic versus systematic information processing and the use of source versus message cues in persuasion. Journal of Personality and Social Psychology, 39(5), 752–766. https://doi.org/10.1037/0022-3514.39.5.752
  • Chaix, B., Bibault, J.-E., Pienkowski, A., Delamon, G., Guillemassé, A., Nectoux, P., & Brouard, B. (2019). When chatbots meet patients: One-year prospective study of conversations between patients with breast cancer and a chatbot. JMIR Cancer, 5(1), e12856. https://doi.org/10.2196/12856
  • Chattaraman, V., Kwon, W.-S., Gilbert, J. E., & Ross, K. (2019). Should AI-based, conversational digital assistants employ social- or task-oriented interaction style? A task-competency and reciprocity perspective for older adults. Computers in Human Behavior, 90, 315–330. https://doi.org/10.1016/j.chb.2018.08.048
  • Chaves, A. P., & Gerosa, M. A. (2021). How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design. International Journal of Human–Computer Interaction, 37(8), 729–758. https://doi.org/10.1080/10447318.2020.1841438
  • Chen, J., Guo, F., Ren, Z., Li, M., & Ham, J. (2023). Effects of anthropomorphic design cues of chatbots on users’ perception and visual behaviors. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2023.2193514
  • Cheng, X., Zhang, X., Cohen, J., & Mou, J. (2022). Human vs. AI: Understanding the impact of anthropomorphism on consumer response to chatbots from the perspective of trust and relationship norms. Information Processing & Management, 59(3), 102940. https://doi.org/10.1016/j.ipm.2022.102940
  • Cheng, Y., & Jiang, H. (2020). How do AI-driven chatbots impact user experience? Examining gratifications, perceived privacy risk, satisfaction, loyalty, and continued use. Journal of Broadcasting & Electronic Media, 64(4), 592–614. https://doi.org/10.1080/08838151.2020.1834296
  • Cheng, Y., & Jiang, H. (2022). Customer–brand relationship in the era of artificial intelligence: Understanding the role of chatbot marketing efforts. Journal of Product & Brand Management, 31(2), 252–264. https://doi.org/10.1108/JPBM-05-2020-2907
  • Chow, J. C. L., Sanders, L., & Li, K. (2023a). Design of an educational chatbot using artificial intelligence in radiotherapy. AI, 4(1), 319–332. https://doi.org/10.3390/ai4010015
  • Chow, J. C. L., Sanders, L., & Li, K. (2023b). Impact of ChatGPT on medical chatbots as a disruptive technology. Frontiers in Artificial Intelligence, 6, 1166014. https://doi.org/10.3389/frai.2023.1166014
  • Christoforakos, L., Gallucci, A., Surmava-Große, T., Ullrich, D., & Diefenbach, S. (2021). Can robots earn our trust the same way humans do? A systematic exploration of competence, warmth, and anthropomorphism as determinants of trust development in HRI. Frontiers in Robotics and AI, 8, 640444. https://doi.org/10.3389/frobt.2021.640444
  • Dai, Z., & MacDorman, K. F. (2018). The doctor’s digital double: How warmth, competence, and animation promote adherence intention. PeerJ Computer Science, 4, e168. https://doi.org/10.7717/peerj-cs.168
  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008
  • De Valck, C., Bensing, J., Bruynooghe, R., & Batenburg, V. (2001). Cure-oriented versus care-oriented attitudes in medicine. Patient Education and Counseling, 45(2), 119–126. https://doi.org/10.1016/S0738-3991(00)00201-9
  • Drummond, D. (2021). Between competence and warmth: The remaining place of the physician in the era of artificial intelligence. NPJ Digital Medicine, 4(1), 85. https://doi.org/10.1038/s41746-021-00457-w
  • Ehrke, F., Bruckmüller, S., & Steffens, M. C. (2020). A double‐edged sword: How social diversity affects trust in representatives via perceived competence and warmth. European Journal of Social Psychology, 50(7), 1540–1554. https://doi.org/10.1002/ejsp.2709
  • Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114(4), 864–886. https://doi.org/10.1037/0033-295X.114.4.864
  • Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2019). A taxonomy of social cues for conversational agents. International Journal of Human-Computer Studies, 132, 138–161. https://doi.org/10.1016/j.ijhcs.2019.07.009
  • Fogg, B. J., & Tseng, H. (1999). The elements of computer credibility. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: The CHI Is the Limit (CHI ’99) (pp. 80–87). ACM. https://doi.org/10.1145/302979.303001
  • Go, E., & Sundar, S. S. (2019). Humanizing chatbots: The effects of visual, identity and conversational cues on humanness perceptions. Computers in Human Behavior, 97, 304–316. https://doi.org/10.1016/j.chb.2019.01.020
  • Hayes, A. F. (2017). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach. Guilford Publications.
  • Hon, L. C., & Grunig, J. E. (1999, November). Guidelines for measuring relationships in public relations. https://instituteforpr.org/measuring-relationships/
  • Howe, L. C., Leibowitz, K. A., & Crum, A. J. (2019). When your doctor “gets it” and “gets you”: The critical role of competence and warmth in the patient–provider interaction. Frontiers in Psychiatry, 10, 475. https://doi.org/10.3389/fpsyt.2019.00475
  • Huang, J.-W., & Lin, C.-P. (2011). To stick or not to stick: The social response theory in the development of continuance intention from organizational cross-level perspective. Computers in Human Behavior, 27(5), 1963–1973. https://doi.org/10.1016/j.chb.2011.05.003
  • IJzerman, H., & Semin, G. R. (2009). The thermometer of social relations: Mapping social proximity on temperature. Psychological Science, 20(10), 1214–1220. https://doi.org/10.1111/j.1467-9280.2009.02434.x
  • Ischen, C., Araujo, T., van Noort, G., Voorveld, H., & Smit, E. (2020). “I am here to assist you today”: The role of entity, interactivity and experiential perceptions in chatbot persuasion. Journal of Broadcasting & Electronic Media, 64(4), 615–639. https://doi.org/10.1080/08838151.2020.1834297
  • Jin, S. V., & Youn, S. (2021). Why do consumers with social phobia prefer anthropomorphic customer service chatbots? Evolutionary explanations of the moderating roles of social phobia. Telematics and Informatics, 62, 101644. https://doi.org/10.1016/j.tele.2021.101644
  • Kamal, S. A., Shafiq, M., & Kakria, P. (2020). Investigating acceptance of telemedicine services through an extended technology acceptance model (TAM). Technology in Society, 60, 101212. https://doi.org/10.1016/j.techsoc.2019.101212
  • Kang, E., & Kang, Y. A. (2023). Counseling chatbot design: The effect of anthropomorphic chatbot characteristics on user self-disclosure and companionship. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2022.2163775
  • Kang, H., & Kim, K. J. (2020). Feeling connected to smart objects? A moderated mediation model of locus of agency, anthropomorphism, and sense of connectedness. International Journal of Human-Computer Studies, 133, 45–55. https://doi.org/10.1016/j.ijhcs.2019.09.002
  • Kang, H., & Kim, K. J. (2022). Does humanization or machinization make the IoT persuasive? The effects of source orientation and social presence. Computers in Human Behavior, 129, 107152. https://doi.org/10.1016/j.chb.2021.107152
  • Kervyn, N., Fiske, S. T., & Malone, C. (2012). Brands as intentional agents framework: How perceived intentions and ability can map brand perception. Journal of Consumer Psychology: The Official Journal of the Society for Consumer Psychology, 22(2), 166–176. https://doi.org/10.1016/j.jcps.2011.09.006
  • Ki, E.-J., & Hon, L. C. (2007). Reliability and validity of organization-public relationship measurement and linkages among relationship indicators in a membership organization. Journalism & Mass Communication Quarterly, 84(3), 419–438. https://doi.org/10.1177/107769900708400302
  • Kilani, N., & Rajaobelina, L. (2022). Impact of live chat service quality on behavioral intentions and relationship quality: A meta-analysis. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2022.2144126
  • Kirmani, A., Hamilton, R. W., Thompson, D. V., & Lantzy, S. (2017). Doing well versus doing good: The differential effect of underdog positioning on moral and competent service providers. Journal of Marketing, 81(1), 103–117. https://doi.org/10.1509/jm.15.0369
  • Klein, K., & Martinez, L. F. (2022). The impact of anthropomorphism on customer satisfaction in chatbot commerce: An experimental study in the food sector. Electronic Commerce Research, 23(4), 2789–2825. https://doi.org/10.1007/s10660-022-09562-8
  • Lee, J., Lee, D., & Lee, J. G. (2022). Influence of rapport and social presence with an AI psychotherapy chatbot on users’ self-disclosure. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2022.2146227
  • Li, Y., & Shin, H. (2023). Should a luxury brand’s chatbot use emoticons? Impact on brand status. Journal of Consumer Behaviour, 22(3), 569–581. https://doi.org/10.1002/cb.2104
  • Liebrecht, C., Sander, L., & Van Hooijdonk, C. (2021). Too informal? How a chatbot’s communication style affects brand attitude and quality of interaction. In A. Følstad, T. Araujo, S. Papadopoulos, E. L.-C. Law, E. Luger, M. Goodwin, & P. B. Brandtzaeg (Eds.), Chatbot research and design (Vol. 12604, pp. 16–31). Springer International Publishing. https://doi.org/10.1007/978-3-030-68288-0_2
  • Liu, K., & Tao, D. (2022). The roles of trust, personalization, loss of privacy, and anthropomorphism in public acceptance of smart healthcare services. Computers in Human Behavior, 127, 107026. https://doi.org/10.1016/j.chb.2021.107026
  • Liu, S. X., Shen, Q., & Hancock, J. (2021). Can a social robot be too warm or too competent? Older Chinese adults’ perceptions of social robots and vulnerabilities. Computers in Human Behavior, 125, 106942. https://doi.org/10.1016/j.chb.2021.106942
  • Men, L. R., Zhou, A., & Sunny Tsai, W.-H. (2022). Harnessing the power of chatbot social conversation for organizational listening: The impact on perceived transparency and organization-public relationships. Journal of Public Relations Research, 34(1–2), 20–44. https://doi.org/10.1080/1062726X.2022.2068553
  • Meng, J., & Dai, Y. (2021). Emotional support from AI chatbots: Should a supportive partner self-disclose or not? Journal of Computer-Mediated Communication, 26(4), 207–222. https://doi.org/10.1093/jcmc/zmab005
  • Nadarzynski, T., Miles, O., Cowie, A., & Ridge, D. (2019). Acceptability of artificial intelligence (AI)-led chatbot services in healthcare: A mixed-methods study. Digital Health, 5. https://doi.org/10.1177/2055207619871808
  • Nass, C., Fogg, B. J., & Moon, Y. (1996). Can computers be teammates? International Journal of Human-Computer Studies, 45(6), 669–678. https://doi.org/10.1006/ijhc.1996.0073
  • Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153
  • Neef, C., Mai, V., & Richert, A. (2022). “I am scared of viruses, too”—Studying the impact of self-disclosure in chatbots for health-related applications. In M. Kurosu (Ed.), Human-computer interaction. User experience and behavior (Vol. 13304, pp. 515–530). Springer International Publishing. https://doi.org/10.1007/978-3-031-05412-9_35
  • OpenAI. (2022, November 30). Introducing ChatGPT. https://openai.com/blog/chatgpt
  • Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people. Cambridge University Press.
  • Rheu, M., Shin, J. Y., Peng, W., & Huh-Yoo, J. (2021). Systematic review: Trust-building factors and implications for conversational agent design. International Journal of Human–Computer Interaction, 37(1), 81–96. https://doi.org/10.1080/10447318.2020.1807710
  • Roy, R., & Naidoo, V. (2021). Enhancing chatbot effectiveness: The role of anthropomorphic conversational styles and time orientation. Journal of Business Research, 126, 23–34. https://doi.org/10.1016/j.jbusres.2020.12.051
  • Seeger, A. M., Pfeiffer, J., & Heinzl, A. (2021). Texting with humanlike conversational agents: Designing for anthropomorphism. Journal of the Association for Information Systems, 22(4), 931–967. https://doi.org/10.17705/1jais.00685
  • Seitz, L., Bekmeier-Feuerhahn, S., & Gohil, K. (2022). Can we trust a chatbot like a physician? A qualitative study on understanding the emergence of trust toward diagnostic chatbots. International Journal of Human-Computer Studies, 165, 102848. https://doi.org/10.1016/j.ijhcs.2022.102848
  • Sherman, S. J., & Corty, E. (1984). Cognitive heuristics. In R. S. Wyer, Jr., & T. K. Srull (Eds.), Handbook of social cognition. Lawrence Erlbaum Associates Publishers.
  • Shin, D. (2021). The effects of explainability and causability on perception, trust, and acceptance: Implications for explainable AI. International Journal of Human-Computer Studies, 146, 102551. https://doi.org/10.1016/j.ijhcs.2020.102551
  • Shuqair, S., Pinto, D. C., So, K. K. F., Rita, P., & Mattila, A. S. (2021). A pathway to consumer forgiveness in the sharing economy: The role of relationship norms. International Journal of Hospitality Management, 98, 103041. https://doi.org/10.1016/j.ijhm.2021.103041
  • Siddique, S., & Chow, J. C. L. (2021). Machine learning in healthcare communication. Encyclopedia, 1(1), 220–239. https://doi.org/10.3390/encyclopedia1010021
  • Song, S. W., & Shin, M. (2022). Uncanny valley effects on chatbot trust, purchase intention, and adoption intention in the context of e-commerce: The moderating role of avatar familiarity. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2022.2121038
  • Sundar, S. S. (2008). The MAIN model: A heuristic approach to understanding technology effects on credibility. In M. J. Metzger & A. J. Flanagin (Eds.), Digital media, youth, and credibility (pp. 73–100). MIT Press.
  • Sundar, S. S., Bellur, S., Oh, J., Jia, H., & Kim, H.-S. (2016). Theoretical importance of contingency in human-computer interaction: Effects of message interactivity on user engagement. Communication Research, 43(5), 595–625. https://doi.org/10.1177/0093650214534962
  • Sundar, S. S., & Marathe, S. S. (2010). Personalization versus customization: The importance of agency, privacy, and power usage. Human Communication Research, 36(3), 298–322. https://doi.org/10.1111/j.1468-2958.2010.01377.x
  • Tudor Car, L., Dhinagaran, D. A., Kyaw, B. M., Kowatsch, T., Joty, S., Theng, Y.-L., & Atun, R. (2020). Conversational agents in health care: Scoping review and conceptual analysis. Journal of Medical Internet Research, 22(8), e17158. https://doi.org/10.2196/17158
  • Vilaro, M. J., Wilson-Howard, D. S., Neil, J. M., Tavassoli, F., Zalake, M. S., Lok, B. C., Modave, F. P., George, T. J., Odedina, F. T., Carek, P. J., Mys, A. M., & Krieger, J. L. (2022). A subjective culture approach to cancer prevention: Rural black and white adults’ perceptions of using virtual health assistants to promote colorectal cancer screening. Health Communication, 37(9), 1123–1134. https://doi.org/10.1080/10410236.2021.1910166
  • Wiseman, R. M., Cuevas-Rodríguez, G., & Gomez-Mejia, L. R. (2012). Towards a social theory of agency. Journal of Management Studies, 49(1), 202–222. https://doi.org/10.1111/j.1467-6486.2011.01016.x
  • Xu, L., Sanders, L., Li, K., & Chow, J. C. L. (2021). Chatbot for health care and oncology applications using artificial intelligence and machine learning: Systematic review. JMIR Cancer, 7(4), e27850. https://doi.org/10.2196/27850
  • Xue, J., Zhou, Z., Zhang, L., & Majeed, S. (2020). Do brand competence and warmth always influence purchase intention? The moderating role of gender. Frontiers in Psychology, 11, 248. https://doi.org/10.3389/fpsyg.2020.00248
  • Yen, C., & Chiang, M.-C. (2021). Trust me, if you can: A study on the factors that influence consumers’ purchase intention triggered by chatbots based on brain image evidence and self-reported assessments. Behaviour & Information Technology, 40(11), 1177–1194. https://doi.org/10.1080/0144929X.2020.1743362
  • Youn, S., & Jin, S. V. (2021). “In A.I. we trust?” The effects of parasocial interaction and technopian versus luddite ideological views on chatbot-based customer relationship management in the emerging “feeling economy.” Computers in Human Behavior, 119, 106721. https://doi.org/10.1016/j.chb.2021.106721
  • Zamora, J. (2017). I’m sorry, Dave, I’m afraid I can’t do that: Chatbot perception and expectations. In Proceedings of the 5th International Conference on Human Agent Interaction (pp. 253–260). ACM. https://doi.org/10.1145/3125739.3125766
