Research Article

Effect of AI Chatbot’s Interactivity on Consumers’ Negative Word-of-Mouth Intention: Mediating Role of Perceived Empathy and Anger

Received 07 Feb 2023, Accepted 04 Jul 2023, Published online: 19 Jul 2023

References

  • Aaker, J. L., Garbinsky, E. N., & Vohs, K. D. (2012). Cultivating admiration in brands: Warmth, competence, and landing in the "golden quadrant." Journal of Consumer Psychology, 22(2), 191–194. https://doi.org/10.1016/j.jcps.2011.11.012
  • Adam, M., Wessel, M., & Benlian, A. (2021). AI-based chatbots in customer service and their effects on user compliance. Electronic Markets, 31(2), 427–445. https://doi.org/10.1007/s12525-020-00414-7
  • Aggarwal, P., Castleberry, S. B., Ridnour, R., & Shepherd, C. D. (2005). Salesperson empathy and listening: Impact on relationship outcomes. Journal of Marketing Theory and Practice, 13(3), 16–31. https://doi.org/10.1080/10696679.2005.11658547
  • Aguinis, H., & Bradley, K. J. (2014). Best practice recommendations for designing and implementing experimental vignette methodology studies. Organizational Research Methods, 17(4), 351–371. https://doi.org/10.1177/1094428114547952
  • Alicke, M. D., Braun, J. C., Glor, J. E., Klotz, M. L., Magee, J., Sederholm, H., & Siegel, R. (1992). Complaining behavior in social interaction. Personality and Social Psychology Bulletin, 18(3), 286–295. https://doi.org/10.1177/0146167292183004
  • Anderson, E. W. (1998). Customer satisfaction and word of mouth. Journal of Service Research, 1(1), 5–17. https://doi.org/10.1177/109467059800100102
  • Antonetti, P. (2020). More than just a feeling: A research agenda for the study of consumer emotions following corporate social irresponsibility (CSI). Australasian Marketing Journal, 28(2), 67–70. https://doi.org/10.1016/j.ausmj.2020.01.005
  • Antonetti, P., & Maklan, S. (2016). An extended model of moral outrage at corporate social irresponsibility. Journal of Business Ethics, 135(3), 429–444. https://doi.org/10.1007/s10551-014-2487-y
  • Aoki, N. (2020). An experimental study of public trust in AI chatbots in the public sector. Government Information Quarterly, 37(4), 101490. https://doi.org/10.1016/j.giq.2020.101490
  • Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173–1182. https://doi.org/10.1037//0022-3514.51.6.1173
  • Bellur, S., & Sundar, S. S. (2017). Talking health with a machine: How does message interactivity affect attitudes and cognitions? Human Communication Research, 43(1), 25–53. https://doi.org/10.1111/hcre.12094
  • Berry, L. L., Seiders, K., & Grewal, D. (2002). Understanding service convenience. Journal of Marketing, 66(3), 1–17. https://doi.org/10.1509/jmkg.66.3.1.18505
  • Bickmore, T., & Cassell, J. (2005). Social dialogue with embodied conversational agents. In J. Van Kuppevelt, L. Dybkjær, & N. O. Bernsen (Eds.), Advances in natural multimodal dialogue systems (pp. 23–54). Springer Science & Business Media.
  • Brandtzaeg, P. B., & Følstad, A. (2018). Chatbots: Changing user needs and motivations. Interactions, 25(5), 38–43. https://doi.org/10.1145/3236669
  • Cassell, J., Bickmore, T., Billinghurst, M., Campbell, L., Chang, K., Vilhjálmsson, H., & Yan, H. (1999, May). Embodiment in conversational interfaces: Rea [Paper presentation]. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 520–527). https://doi.org/10.1145/302979.303150
  • Chattaraman, V., Kwon, W. S., Gilbert, J. E., & Ross, K. (2019). Should AI-Based, conversational digital assistants employ a social- or task-oriented interaction style? A task-competency and reciprocity perspective for older adults. Computers in Human Behavior, 90, 315–330. https://doi.org/10.1016/j.chb.2018.08.048
  • Chaves, A. P., & Gerosa, M. A. (2021). How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design. International Journal of Human–Computer Interaction, 37(8), 729–758. https://doi.org/10.1080/10447318.2020.1841438
  • Cheng, Y., & Jiang, H. (2020b). How do AI-driven chatbots impact user experience? Examining gratifications, perceived privacy risk, satisfaction, loyalty, and continued use. Journal of Broadcasting & Electronic Media, 64(4), 592–614. https://doi.org/10.1080/08838151.2020.1834296
  • Chong, T., Yu, T., Keeling, D. I., & de Ruyter, K. (2021). AI chatbots on the services frontline addressing the challenges and opportunities of agency. Journal of Retailing and Consumer Services, 63, 102735. https://doi.org/10.1016/j.jretconser.2021.102735
  • Chung, M., Ko, E., Joung, H., & Kim, S. J. (2020). Chatbot e-service and customer satisfaction regarding luxury brands. Journal of Business Research, 117, 587–595. https://doi.org/10.1016/j.jbusres.2018.10.004
  • Coombs, W. T. (2021). Ongoing crisis communication: Planning, managing, and responding. Sage Publications.
  • Crolic, C., Thomaz, F., Hadi, R., & Stephen, A. T. (2022). Blame the bot: Anthropomorphism and anger in customer–chatbot interactions. Journal of Marketing, 86(1), 132–148. https://doi.org/10.1177/00222429211045687
  • Davis, M. H. (1983). Measuring individual differences in empathy: Evidence for a multidimensional approach. Journal of Personality and Social Psychology, 44(1), 113–126. https://doi.org/10.1037/0022-3514.44.1.113
  • Fernandes, T., & Oliveira, E. (2021). Understanding consumers’ acceptance of automated technologies in service encounters: Drivers of digital voice assistants adoption. Journal of Business Research, 122, 180–191. https://doi.org/10.1016/j.jbusres.2020.08.058
  • Fiske, S. T., Cuddy, A. J., & Glick, P. (2007). Universal dimensions of social cognition: Warmth and competence. Trends in Cognitive Sciences, 11(2), 77–83. https://doi.org/10.1016/j.tics.2006.11.005
  • Fitzgerald, P., & Leudar, I. (2010). On active listening in person-centred, solution-focused psychotherapy. Journal of Pragmatics, 42(12), 3188–3198. https://doi.org/10.1016/j.pragma.2010.07.007
  • Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e7785. https://doi.org/10.2196/mental.7785
  • Folger, R., & Cropanzano, R. (2001). Fairness theory: Justice as accountability. In J. Greenberg & R. Cropanzano (Eds.), Advances in organizational justice (pp. 1–55). Stanford University Press.
  • Følstad, A., & Brandtzaeg, P. B. (2020). Users’ experiences with chatbots: Findings from a questionnaire study. Quality and User Experience, 5(1), 3. https://doi.org/10.1007/s41233-020-00033-2
  • Forlizzi, J., Zimmerman, J., Mancuso, V., & Kwak, S. (2007, August). How interface agents affect the interaction between humans and computers. In Proceedings of the 2007 Conference on Designing Pleasurable Products and Interfaces (pp. 209–221).
  • Fu, H., Wu, D. C., Huang, S. S., Song, H., & Gong, J. (2015). Monetary or nonmonetary compensation for service failure? A study of customer preferences under various loci of causality. International Journal of Hospitality Management, 46, 55–64. https://doi.org/10.1016/j.ijhm.2015.01.006
  • Gabbott, M., Tsarenko, Y., & Mok, W. H. (2011). Emotional intelligence as a moderator of coping strategies and service outcomes in circumstances of service failure. Journal of Service Research, 14(2), 234–248. https://doi.org/10.1177/1094670510391078
  • Gefen, D., & Straub, D. W. (2004). Consumer trust in B2C e-Commerce and the importance of social presence: Experiments in e-Products and e-Services. Omega, 32(6), 407–424. https://doi.org/10.1016/j.omega.2004.01.006
  • Gnewuch, U., Morana, S., & Maedche, A. (2017, December). Towards designing cooperative and social conversational agents for customer service. In ICIS.
  • Go, E., & Sundar, S. S. (2019). Humanizing chatbots: The effects of visual, identity and conversational cues on humanness perceptions. Computers in Human Behavior, 97, 304–316. https://doi.org/10.1016/j.chb.2019.01.020
  • Goleman, D. (1998). Working with emotional intelligence. Bantam.
  • Gudykunst, W. B., & Nishida, T. (1984). Individual and cultural influences on uncertainty reduction. Communication Monographs, 51(1), 23–36. https://doi.org/10.1080/03637758409390181
  • Guha, A., Bressgott, T., Grewal, D., Mahr, D., Wetzels, M., & Schweiger, E. (2023). How artificiality and intelligence affect voice assistant evaluations. Journal of the Academy of Marketing Science, 51(4), 843–866. https://doi.org/10.1007/s11747-022-00874-7
  • Hanssen, L., Jankowski, N. W., & Etienne, R. (1996). Interactivity from the perspective of communication studies. Academia Research Monograph, 19, 61–73.
  • Hassenzahl, M. (2007). The hedonic/pragmatic model of user experience. In Towards a UX manifesto (pp. 10–14).
  • Hatfield, E., Rapson, R. L., & Le, Y. C. L. (2011). Emotional contagion and empathy. In J. Decety & W. Ickes (Eds.), The social neuroscience of empathy (p. 19). MIT Press.
  • Haugeland, I. K. F., Følstad, A., Taylor, C., & Bjørkli, C. A. (2022). Understanding the user experience of customer service chatbots: An experimental study of chatbot interaction design. International Journal of Human-Computer Studies, 161, 102788. https://doi.org/10.1016/j.ijhcs.2022.102788
  • Hayes, A. F., & Scharkow, M. (2013). The relative trustworthiness of inferential tests of the indirect effect in statistical mediation analysis: Does method really matter? Psychological Science, 24(10), 1918–1927. https://doi.org/10.1177/0956797613480187
  • Herr, P. M., Kardes, F. R., & Kim, J. (1991). Effects of word-of-mouth and product-attribute information on persuasion: An accessibility-diagnosticity perspective. Journal of Consumer Research, 17(4), 454–462. https://doi.org/10.1086/208570
  • Ho, A., Hancock, J., & Miner, A. S. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. The Journal of Communication, 68(4), 712–733. https://doi.org/10.1093/joc/jqy026
  • Hofeditz, L., Ehnis, C., Bunker, D., Brachten, F., & Stieglitz, S. (2019, June). Meaningful use of social bots? Possible applications in crisis communication during disasters. In ECIS.
  • Hoffman, M. L. (2001). Empathy and moral development: Implications for caring and justice. Cambridge University Press.
  • Hone, K. (2006). Empathic agents to reduce user frustration: The effects of varying agent characteristics. Interacting with Computers, 18(2), 227–245. https://doi.org/10.1016/j.intcom.2005.05.003
  • Huang, M. H., & Rust, R. T. (2018). Artificial intelligence in service. Journal of Service Research, 21(2), 155–172. https://doi.org/10.1177/1094670517752459
  • Ischen, C., Araujo, T., van Noort, G., Voorveld, H., & Smit, E. (2020). “I am here to assist you today”: The role of entity, interactivity and experiential perceptions in chatbot persuasion. Journal of Broadcasting & Electronic Media, 64(4), 615–639. https://doi.org/10.1080/08838151.2020.1834297
  • Jain, M., Kumar, P., Kota, R., & Patel, S. N. (2018, June). Evaluating and informing the design of chatbots. [Paper presentation]. 2018 Designing Interactive Systems Conference (pp. 895–906). https://doi.org/10.1145/3196709.3196735
  • Jiang, H., Cheng, Y., Yang, J., & Gao, S. (2022a). AI-powered chatbot communication with customers: Dialogic interactions, satisfaction, engagement, and customer behavior. Computers in Human Behavior, 134, 107329. https://doi.org/10.1016/j.chb.2022.107329
  • Jiang, Q., Zhang, Y., & Pian, W. (2022b). Chatbot as an emergency exists: Mediated empathy for resilience via human-AI interaction during the COVID-19 pandemic. Information Processing & Management, 59(6), 103074. https://doi.org/10.1016/j.ipm.2022.103074
  • Jin, Y., Fraustino, J. D., & Liu, B. F. (2016). The scared, the outraged, and the anxious: How crisis emotions, involvement, and demographics predict publics’ conative coping. International Journal of Strategic Communication, 10(4), 289–308. https://doi.org/10.1080/1553118X.2016.1160401
  • Kelley, S. W., & Davis, M. A. (1994). Antecedents to customer expectations for service recovery. Journal of the Academy of Marketing Science, 22(1), 52–61. https://doi.org/10.1177/0092070394221005
  • Kilani, N., & Rajaobelina, L. (2022). Impact of live chat service quality on behavioral intentions and relationship quality: A meta-analysis. International Journal of Human–Computer Interaction, 1–28. https://doi.org/10.1080/10447318.2022.2144126
  • Kim, T., & Ball, J. G. (2021). Unintended consequences of warmth appeals: An extension of the compensation effect between warmth and competence to advertising. Journal of Advertising, 50(5), 622–638. https://doi.org/10.1080/00913367.2021.1940393
  • Kim, Y., & Sundar, S. S. (2012). Anthropomorphism of computers: Is it mindful or mindless? Computers in Human Behavior, 28(1), 241–250. https://doi.org/10.1016/j.chb.2011.09.006
  • Kolodinsky, J., & Aleong, J. (1990). An integrated model of consumer complaint action applied to services: A pilot study. The Journal of Consumer Satisfaction, Dissatisfaction, and Complaining Behavior, 3, 61–70.
  • Kruglanski, A. W., Molinario, E., & Lemay, E. P. (2021). Coping with COVID-19-induced threats to self. Group Processes & Intergroup Relations, 24(2), 284–289. https://doi.org/10.1177/1368430220982074
  • Kull, A. J., Romero, M., & Monahan, L. (2021). How may I help you? Driving brand engagement through the warmth of an initial chatbot message. Journal of Business Research, 135, 840–850. https://doi.org/10.1016/j.jbusres.2021.03.005
  • Lau, G. T., & Ng, S. (2009). Individual and situational factors influencing negative word‐of‐mouth behavior. Canadian Journal of Administrative Sciences / Revue Canadienne des Sciences de L'Administration, 18(3), 163–178. https://doi.org/10.1111/j.1936-4490.2001.tb00253.x
  • Liu, B., & Sundar, S. S. (2018). Should machines express sympathy and empathy? Experiments with a health advice chatbot. Cyberpsychology, Behavior and Social Networking, 21(10), 625–636. https://doi.org/10.1089/cyber.2018.0110
  • Lu, X., & Jin, Y. (2020). Information vetting as a key component in social-mediated crisis communication: An exploratory study to examine the initial conceptualization. Public Relations Review, 46(2), 101891. https://doi.org/10.1016/j.pubrev.2020.101891
  • Lv, X., Yang, Y., Qin, D., Cao, X., & Xu, H. (2022). Artificial intelligence service recovery: The role of empathic response in hospitality customers’ continuous usage intention. Computers in Human Behavior, 126, 106993. https://doi.org/10.1016/j.chb.2021.106993
  • Mari, A., Mandelli, A., & Algesheimer, R. (2023). Shopping with voice assistants: How empathy affects individual and family decision-making outcomes. UZH Business Working Paper, University of Zurich, Institute of Business Administration.
  • Mazzarol, T., Sweeney, J. C., & Soutar, G. N. (2007). Conceptualizing word‐of‐mouth activity, triggers and conditions: An exploratory study. European Journal of Marketing, 41(11–12), 1475–1494.
  • McMillan, S. J., & Hwang, J. S. (2002). Measures of perceived interactivity: An exploration of the role of direction of communication, user control, and time in shaping perceptions of interactivity. Journal of Advertising, 31(3), 29–42. https://doi.org/10.1080/00913367.2002.10673674
  • Migacz, S. J., Zou, S., & Petrick, J. F. (2018). The “terminal” effects of service failure on airlines: Examining service recovery with justice theory. Journal of Travel Research, 57(1), 83–98. https://doi.org/10.1177/0047287516684979
  • Mori, M., MacDorman, K., & Kageki, N. (2012). The uncanny valley. IEEE Robotics & Automation Magazine, 19(2), 98–100. https://doi.org/10.1109/MRA.2012.2192811
  • Murtarelli, G., Gregory, A., & Romenti, S. (2021). A conversation-based perspective for shaping ethical human-machine interactions: The particular challenge of chatbots. Journal of Business Research, 129, 927–935. https://doi.org/10.1016/j.jbusres.2020.09.018
  • Naimark, M. (1990). Realness and interactivity. In B. Laurel & S. J. Mountford (Eds.), The art of human-computer interface design (pp. 455–459). Addison-Wesley Longman Publishing Co., Inc.
  • Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153
  • Neururer, M., Schlögl, S., Brinkschulte, L., & Groth, A. (2018). Perceptions on authenticity in chatbots. Multimodal Technologies and Interaction, 2(3), 60. https://doi.org/10.3390/mti2030060
  • Nguyen, H., & Masthoff, J. (2009, April). Designing empathic computers: The effect of multimodal empathic feedback using an animated agent. In Proceedings of the 4th International Conference on Persuasive Technology (pp. 1–9).
  • Nyer, P. U. (2000). An investigation into whether complaining can cause increased consumer satisfaction. Journal of Consumer Marketing, 17(1), 9–19. https://doi.org/10.1108/07363760010309500
  • Oliver, R. L. (1987). An investigation of the interrelationship between consumer (dis)satisfaction and complaint reports. In M. Wallendorf & P. Anderson (Eds.), Advances in consumer research (Vol. 14, pp. 218–222). Association for Consumer Research.
  • Paiva, A., Leite, I., Boukricha, H., & Wachsmuth, I. (2017). Empathy in virtual agents and robots: A survey. ACM Transactions on Interactive Intelligent Systems, 7(3), 1–40. https://doi.org/10.1145/2912150
  • Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1988). SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality. Journal of Retailing, 64(1), 12–40.
  • Park, G., Yim, M. C., Chung, J., & Lee, S. (2022). Effect of AI chatbot empathy and identity disclosure on willingness to donate: The mediation of humanness and social presence. Behaviour & Information Technology, 1–13. https://doi.org/10.1080/0144929X.2022.2105746
  • Pavlik, J. V. (1996). New media technology: Cultural and commercial perspectives. Prentice Hall.
  • Picard, R. W. (2000). Affective computing. MIT Press.
  • Puntoni, S., Reczek, R. W., Giesler, M., & Botti, S. (2021). Consumers and artificial intelligence: An experiential perspective. Journal of Marketing, 85(1), 131–151. https://doi.org/10.1177/0022242920953847
  • Rafaeli, S. (1988). Interactivity: From new media to communication. Sage Annual Review of Communication Research: Advancing Communication Science, 16(1), 110–134.
  • Rafaeli, A., Altman, D., Gremler, D. D., Huang, M.-H., Grewal, D., Iyer, B., Parasuraman, A., & de Ruyter, K. (2017). The future of frontline research: Invited commentaries. Journal of Service Research, 20(1), 91–99. https://doi.org/10.1177/1094670516679275
  • Rapp, A., Curti, L., & Boldi, A. (2021). The human side of human-chatbot interaction: A systematic literature review of ten years of research on text-based chatbots. International Journal of Human-Computer Studies, 151, 102630. https://doi.org/10.1016/j.ijhcs.2021.102630
  • Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people. Cambridge University Press.
  • Rheu, M., Shin, J. Y., Peng, W., & Huh-Yoo, J. (2021). Systematic review: Trust-building factors and implications for conversational agent design. International Journal of Human–Computer Interaction, 37(1), 81–96. https://doi.org/10.1080/10447318.2020.1807710
  • Rizomyliotis, I., Kastanakis, M. N., Giovanis, A., Konstantoulaki, K., & Kostopoulos, I. (2022). “How mAy I help you today?” The use of AI chatbots in small family businesses and the moderating role of customer affective commitment. Journal of Business Research, 153, 329–340. https://doi.org/10.1016/j.jbusres.2022.08.035
  • Roter, D. L., & Hall, J. A. (2004). Physician gender and patient-centered communication: A critical review of empirical research. Annual Review of Public Health, 25(1), 497–519. https://doi.org/10.1146/annurev.publhealth.25.101802.123134
  • Roy, R., & Naidoo, V. (2021). Enhancing chatbot effectiveness: The role of anthropomorphic conversational styles and time orientation. Journal of Business Research, 126, 23–34. https://doi.org/10.1016/j.jbusres.2020.12.051
  • Ruan, Y., & Mezei, J. (2022). When do AI chatbots lead to higher customer satisfaction than human frontline employees in online shopping assistance? Considering product attribute type. Journal of Retailing and Consumer Services, 68, 103059. https://doi.org/10.1016/j.jretconser.2022.103059
  • Rupp, D. E., Shao, R., Thornton, M. A., & Skarlicki, D. P. (2013). Applicants’ and employees’ reactions to corporate social responsibility: The moderating effects of first‐party justice perceptions and moral identity. Personnel Psychology, 66(4), 895–933. https://doi.org/10.1111/peps.12030
  • Salovey, P., & Mayer, J. D. (1990). Emotional intelligence. Imagination, Cognition and Personality, 9(3), 185–211. https://doi.org/10.2190/DUGG-P24E-52WK-6CDG
  • Schuetzler, R. M., Grimes, G. M., & Giboney, J. S. (2019). The effect of conversational agent skill on user behavior during deception. Computers in Human Behavior, 97, 250–259. https://doi.org/10.1016/j.chb.2019.03.033
  • Sheth, A., Yip, H. Y., Iyengar, A., & Tepper, P. (2019). Cognitive services and intelligent chatbots: Current perspectives and special issue introduction. IEEE Internet Computing, 23(2), 6–12. https://doi.org/10.1109/MIC.2018.2889231
  • Shum, H. Y., He, X. D., & Li, D. (2018). From Eliza to XiaoIce: Challenges and opportunities with social chatbots. Frontiers of Information Technology & Electronic Engineering, 19(1), 10–26. https://doi.org/10.1631/FITEE.1700826
  • Smith, A. K., Bolton, R. N., & Wagner, J. (1999). A model of customer satisfaction with service encounters involving failure and recovery. Journal of Marketing Research, 36(3), 356–372. https://doi.org/10.1177/002224379903600305
  • Stiles, W. B. (1987). "I have to talk to somebody": A fever model of disclosure. In V. J. Derlega & J. H. Berg (Eds.), Self-disclosure: Theory, research, and therapy (pp. 257–282). Springer Science & Business Media.
  • Sundar, S. S. (2008). The MAIN model: A heuristic approach to understanding technology effects on credibility. In M. J. Metzger & A. J. Flanagin (Eds.), Digital media, youth, and credibility (pp. 73–100). The MIT Press. https://doi.org/10.1162/dmal.9780262562324.073
  • Sundar, S. S., Go, E., Kim, H. S., & Zhang, B. (2015). Communicating art, virtually! Psychological effects of technological affordances in a virtual museum. International Journal of Human-Computer Interaction, 31(6), 385–401. https://doi.org/10.1080/10447318.2015.1033912
  • Sundar, S. S., & Limperos, A. M. (2013). Uses and grats 2.0: New gratifications for new media. Journal of Broadcasting & Electronic Media, 57(4), 504–525. https://doi.org/10.1080/08838151.2013.845827
  • Sundar, S. S., Bellur, S., Oh, J., Jia, H., & Kim, H. S. (2016). Theoretical importance of contingency in human-computer interaction: Effects of message interactivity on user engagement. Communication Research, 43(5), 595–625. https://doi.org/10.1177/0093650214534962
  • Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., DeCero, E., & Loggarakis, A. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research, 22(3), e16235. https://doi.org/10.2196/16235
  • Taylor, M., & Kent, M. L. (2004). Congressional web sites and their potential for public dialogue. Atlantic Journal of Communication, 12(2), 59–76. https://doi.org/10.1207/s15456889ajc1202_1
  • Taylor, M., & Kent, M. L. (2014). Dialogic engagement: Clarifying foundational concepts. Journal of Public Relations Research, 26(5), 384–398. https://doi.org/10.1080/1062726X.2014.956106
  • Thomas, P., Czerwinski, M., McDuff, D., Craswell, N., & Mark, G. (2018, March). Style and alignment in information-seeking conversation [Paper presentation]. The 2018 Conference on Human Information Interaction & Retrieval (pp. 42–51). https://doi.org/10.1145/3176349.3176388
  • Tsai, W. H. S., & Chuan, C. H. (2023). Humanizing chatbots for interactive marketing. In C. L. Wang (Ed.), The Palgrave handbook of interactive marketing (pp. 255–273). Springer Nature.
  • Tsai, W. H. S., Lun, D., Carcioppolo, N., & Chuan, C. H. (2021). Human versus chatbot: Understanding the role of emotion in health marketing communication for vaccines. Psychology & Marketing, 38(12), 2377–2392. https://doi.org/10.1002/mar.21556
  • Urakami, J., Moore, B. A., Sutthithatip, S., & Park, S. (2019, September). Users’ perception of empathic expressions by an advanced intelligent system. [Paper presentation]. 7th International Conference on Human-Agent Interaction (pp. 11–18). https://doi.org/10.1145/3349537.3351895
  • Van den Broeck, E., Zarouali, B., & Poels, K. (2019). Chatbot advertising effectiveness: When does the message get through? Computers in Human Behavior, 98, 150–157. https://doi.org/10.1016/j.chb.2019.04.009
  • Wei, C., Liu, M. W., & Keh, H. T. (2020). The road to consumer forgiveness is paved with money or apology? The roles of empathy and power in service recovery. Journal of Business Research, 118, 321–334. https://doi.org/10.1016/j.jbusres.2020.06.061
  • Wetzer, I. M., Zeelenberg, M., & Pieters, R. (2007). “Never eat in that restaurant, I did!”: Exploring why people engage in negative word‐of‐mouth communication. Psychology and Marketing, 24(8), 661–680. https://doi.org/10.1002/mar.20178
  • Witte, K. (1994). Fear control and danger control: A test of the extended parallel process model (EPPM). Communication Monographs, 61(2), 113–134. https://doi.org/10.1080/03637759409376328
  • Xiao, Z., Zhou, M. X., Chen, W., Yang, H., & Chi, C. (2020). If I hear you correctly: Building and evaluating interview chatbots with active listening skills [Paper presentation]. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–14). ACM. https://doi.org/10.1145/3313831.3376131
  • Xu, K., Chen, X., & Huang, L. (2022). Deep mind in social responses to technologies: A new approach to explaining the computers are social actors phenomena. Computers in Human Behavior, 134, 107321. https://doi.org/10.1016/j.chb.2022.107321
  • Zack, M. H. (1993). Interactivity and communication mode choice in ongoing management groups. Information Systems Research, 4(3), 207–239. https://doi.org/10.1287/isre.4.3.207
  • Zeelenberg, M., & Pieters, R. (2004). Beyond valence in customer dissatisfaction: A review and new findings on behavioral responses to regret and disappointment in failed services. Journal of Business Research, 57(4), 445–455. https://doi.org/10.1016/S0148-2963(02)00278-3
  • Zhang, J., Oh, Y. J., Lange, P., Yu, Z., & Fukuoka, Y. (2020). Artificial intelligence chatbot behavior change model for designing artificial intelligence chatbots to promote physical activity and a healthy diet: Viewpoint. Journal of Medical Internet Research, 22(9), e22845. https://doi.org/10.2196/22845
  • Zhang, L., Anjum, M. A., & Wang, Y. (2023). The impact of trust-building mechanisms on purchase intention towards metaverse shopping: The moderating role of age. International Journal of Human–Computer Interaction, 1–19. https://doi.org/10.1080/10447318.2023.2184594
  • Zhu, X., & Smith, R. A. (2021). Stigma, communication, and health. In T. L. Thompson, R. Parrott, & J. F. Nussbaum (Eds.), The Routledge handbook of health communication (pp. 77–90). Routledge.
