480
Views
0
CrossRef citations to date
0
Altmetric
Survey Article

More Unique, More Accepting? Integrating Sense of Uniqueness, Perceived Knowledge, and Perceived Empathy with Acceptance of Medical Artificial Intelligence

, , &
Received 09 Apr 2023, Accepted 30 Nov 2023, Published online: 21 Dec 2023

References

  • Adadi, A., Lahmer, M., & Nasiri, S. (2022). Artificial intelligence and COVID-19: A systematic umbrella review and roads ahead. Journal of King Saud University. Computer and Information Sciences, 34(8), 5898–5920. https://doi.org/10.1016/j.jksuci.2021.07.010
  • Ahmad, Z., Rahim, S., Zubair, M., & Abdul-Ghafar, J. (2021). Artificial intelligence (AI) in medicine, current applications and future role with special emphasis on its potential and promise in pathology: Present and future impact, obstacles including costs and acceptance among pathologists, practical and philosophical considerations. A comprehensive review. Diagnostic Pathology, 16(1), 24. https://doi.org/10.1186/s13000-021-01085-4
  • Alkan, O. K., Çeliktutan, O., Salam, H., Mahmoud, M., Buckley, G., & Phelan, N. (2023). Interactive technologies for AI in healthcare: Diagnosis, management, and assistance (ITAH). In Companion Proceedings of the 28th International Conference on Intelligent User Interfaces (pp. 193–195). https://doi.org/10.1145/3581754.3584167
  • Bartneck, C., Bleeker, T., Bun, J., Fens, P., & Riet, L. (2010). The influence of robot anthropomorphism on the feelings of embarrassment when interacting with robots. Paladyn, Journal of Behavioral Robotics, 1(2), 109–115. https://doi.org/10.2478/s13230-010-0011-3
  • Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International Journal of Social Robotics, 1(1), 71–81. https://doi.org/10.1007/s12369-008-0001-3
  • Bauchat, J. R., Seropian, M., & Jeffries, P. R. (2016). Communication and empathy in the patient-centered care model—Why simulation-based training is not optional. Clinical Simulation in Nursing, 12(8), 356–359. https://doi.org/10.1016/j.ecns.2016.04.003
  • Baudier, P., Kondrateva, G., Ammi, C., Chang, V., & Schiavone, F. (2023). Digital transformation of healthcare during the COVID-19 pandemic: Patients’ teleconsultation acceptance and trusting beliefs. Technovation, 120, 102547. https://doi.org/10.1016/j.technovation.2022.102547
  • Beldad, A. D., & Hegner, S. M. (2018). Expanding the technology acceptance model with the inclusion of trust, social influence, and health valuation to determine the predictors of German users’ willingness to continue using a fitness app: A structural equation modeling approach. International Journal of Human–Computer Interaction, 34(9), 882–893. https://doi.org/10.1080/10447318.2017.1403220
  • Bernotat, J., & Eyssel, F. (2017). A robot at home—How affect, technology commitment, and personality traits influence user experience in an intelligent robotics apartment. In 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 641–646). IEEE. https://doi.org/10.1109/ROMAN.2017.8172370
  • Bigman, Y. E., & Gray, K. (2018). People are averse to machines making moral decisions. Cognition, 181, 21–34. https://doi.org/10.1016/j.cognition.2018.08.003
  • Bogue, R. (2020). Robots in a contagious world. Industrial Robot: The International Journal of Robotics Research and Application, 47(5), 673–642. https://doi.org/10.1108/IR-05-2020-0101
  • Borau, S., Otterbring, T., Laporte, S., & Fosso Wamba, S. (2021). The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI. Psychology & Marketing, 38(7), 1052–1068. https://doi.org/10.1002/mar.21480
  • Botti, S., Orfali, K., & Iyengar, S. S. (2009). Tragic choices: Autonomy and emotional responses to medical decisions. Journal of Consumer Research, 36(3), 337–352. https://doi.org/10.1086/598969
  • Brislin, R. W. (1970). Back-translation for cross-cultural research. Journal of Cross-Cultural Psychology, 1(3), 185–216. https://doi.org/10.1177/135910457000100301
  • Burgess, E. R., Jankovic, I., Austin, M., Cai, N., Kapuundefinedcińska, A., Currie, S., Overhage, J. M., Poole, E. S., & Kaye, J. (2023). Healthcare AI treatment decision support: Design principles to enhance clinician adoption and trust. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1–19). https://doi.org/10.1145/3544548.3581251
  • Cadario, R., Longoni, C., & Morewedge, C. K. (2021). Understanding, explaining, and utilizing medical artificial intelligence. Nature Human Behaviour, 5(12), 1636–1642. https://doi.org/10.1038/s41562-021-01146-0
  • Calisto, F. M., Ferreira, A., Nascimento, J. C., & Gonçalves, D. (2017, October). Towards touch-based medical image diagnosis annotation. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (pp. 390–395). https://doi.org/10.1145/3132272.3134111
  • Calisto, F. M., Nunes, N., & Nascimento, J. C. (2022). Modeling adoption of intelligent agents in medical imaging. International Journal of Human–Computer Studies, 168, 102922. https://doi.org/10.1016/j.ijhcs.2022.102922
  • Calisto, F. M., Santiago, C., Nunes, N., & Nascimento, J. C. (2022). BreastScreening-AI: Evaluating medical intelligent agents for human–AI interactions. Artificial Intelligence in Medicine, 127, 102285. https://doi.org/10.1016/j.artmed.2022.102285
  • Canale, S. D., Louis, D. Z., Maio, V., Wang, X., Rossi, G., Hojat, M., & Gonnella, J. S. (2012). The relationship between physician empathy and disease complications: An empirical study of primary care physicians and their diabetic patients in Parma, Italy. Academic Medicine: Journal of the Association of American Medical Colleges, 87(9), 1243–1249. https://doi.org/10.1097/ACM.0b013e3182628fbf
  • Castelo, N., Bos, M. W., & Lehmann, D. R. (2019). Task-dependent algorithm aversion. Journal of Marketing Research, 56(5), 809–825. https://doi.org/10.1177/0022243719851788
  • Castillo, D., Canhoto, A. I., & Said, E. (2021). The dark side of AI-powered service interactions: Exploring the process of co-destruction from the customer perspective. The Service Industries Journal, 41(13–14), 900–925. https://doi.org/10.1080/02642069.2020.1787993
  • Caviola, L., Everett, J. A. C., & Faber, N. S. (2019). The moral standing of animals: Towards a psychology of speciesism. Journal of Personality and Social Psychology, 116(6), 1011–1029. https://doi.org/10.1037/pspp0000182
  • Chi, O. H., Denton, G., & Gursoy, D. (2020). Artificially intelligent device use in service delivery: A systematic review, synthesis, and research agenda. Journal of Hospitality Marketing & Management, 29(7), 757–786. https://doi.org/10.1080/19368623.2020.1721394
  • Choi, D., Oh, I.-S., & Colbert, A. E. (2015). Understanding organizational commitment: A meta-analytic examination of the roles of the five-factor model of personality and culture. The Journal of Applied Psychology, 100(5), 1542–1567. https://doi.org/10.1037/apl0000014
  • Choung, H., David, P., & Ross, A. (2023). Trust in AI and its role in the acceptance of AI technologies. International Journal of Human–Computer Interaction, 39(9), 1727–1739. https://doi.org/10.1080/10447318.2022.2050543
  • Chuah, S. H. W., Aw, E. C. X., & Yee, D. (2021). Unveiling the complexity of consumers’ intention to use service robots: An fsQCA approach. Computers in Human Behavior, 123, 106870. https://doi.org/10.1016/j.chb.2021.106870
  • Davenport, T., Guha, A., Grewal, D., & Bressgott, T. (2020). How artificial intelligence will change the future of marketing. Journal of the Academy of Marketing Science, 48(1), 24–42. https://doi.org/10.1007/s11747-019-00696-0
  • Dufour, L., Maoret, M., & Montani, F. (2020). Coupling high self-perceived creativity and successful newcomer adjustment in organizations: The role of supervisor trust and support for authentic self expression. Journal of Management Studies, 57(8), 1531–1555. https://doi.org/10.1111/joms.12547
  • Eastwood, J., Snook, B., & Luther, K. (2012). What people want from their professionals: Attitudes toward decision-making strategies: Attitudes toward decision-making strategies. Journal of Behavioral Decision Making, 25(5), 458–468. https://doi.org/10.1002/bdm.741
  • Epstein, R. M., & Peters, E. (2009). Beyond information exploring patients’ preferences. JAMA, 302(2), 195–197. https://doi.org/10.1001/jama.2009.984
  • Esmaeilzadeh, P., Mirzaei, T., & Dharanikota, S. (2021). Patients’ perceptions toward human–artificial intelligence interaction in health care: Experimental Study. Journal of Medical Internet Research, 23(11), e25856. https://doi.org/10.2196/25856
  • Esteva, A., Kuprel, B., Novoa, R. A., Ko, J., Swetter, S. M., Blau, H. M., & Thrun, S. (2017). Dermatologist-level classification of skin cancer with deep neural networks. Nature, 542(7639), 115–118. https://doi.org/10.1038/nature21056
  • Fan, A., Wu, L., Miao, L., & Mattila, A. S. (2020). When does technology anthropomorphism help alleviate customer dissatisfaction after a service failure? – The moderating role of consumer technology self-efficacy and interdependent self-construal. Journal of Hospitality Marketing & Management, 29(3), 269–290. https://doi.org/10.1080/19368623.2019.1639095
  • Fernández-Llamas, C., Conde, M. A., Rodríguez-Lera, F. J., Rodríguez-Sedano, F. J., & García, F. (2018). May I teach you? Students’ behavior when lectured by robotic vs. human teachers. Computers in Human Behavior, 80, 460–469. https://doi.org/10.1016/j.chb.2017.09.028
  • Ferrari, F., Paladino, M. P., & Jetten, J. (2016). Blurring human–machine distinctions: Anthropomorphic appearance in social robots as a threat to human distinctiveness. International Journal of Social Robotics, 8(2), 287–302. https://doi.org/10.1007/s12369-016-0338-y
  • Fiske, S., Cuddy, A., Glick, P., & Xu, J. (2002). A model of (often mixed) stereotype content: Competence and warmth respectively follow from perceived status and competition. Journal of Personality and Social Psychology, 82(6), 878–902. https://doi.org/10.1037//0022-3514.82.6.878
  • FRank, D. A., Elbæk, C. T., Børsting, C. K., Mitkidis, P., Otterbring, T., & Borau, S. (2021). Drivers and social implications of artificial intelligence adoption in healthcare during the COVID-19 pandemic. PLOS One, 16(11), e0259928. https://doi.org/10.1371/journal.pone.0259928
  • Fritsch, S. J., Blankenheim, A., Wahl, A., Hetfeld, P., Maassen, O., Deffge, S., Kunze, J., Rossaint, R., Riedel, M., Marx, G., & Bickenbach, J. (2022). Attitudes and perception of artificial intelligence in healthcare: A cross-sectional survey among patients. Digital Health, 8. https://doi.org/10.1177/20552076221116772
  • Gray, H. M., Gray, K., & Wegner, D. M. (2007). Dimensions of mind perception. Science, 315(5812), 619–619. https://doi.org/10.1126/science.1134475
  • Gray, K., & Wegner, D. M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125(1), 125–130. https://doi.org/10.1016/j.cognition.2012.06.007
  • Greenbaum, R. L., Hill, A., Mawritz, M. B., & Quade, M. J. (2017). Employee machiavellianism to unethical behavior: The role of abusive supervision as a trait activator. Journal of Management, 43(2), 585–609. https://doi.org/10.1177/0149206314535434
  • Guo, X., Zhang, X., & Sun, Y. (2016). The privacy–personalization paradox in mHealth services acceptance of different age groups. Electronic Commerce Research and Applications, 16, 55–65. https://doi.org/10.1016/j.elerap.2015.11.001
  • Gursoy, D., Chi, O. H., Lu, L., & Nunkoo, R. (2019). Consumers acceptance of artificially intelligent (AI) device use in service delivery. International Journal of Information Management, 49, 157–169. https://doi.org/10.1016/j.ijinfomgt.2019.03.008
  • Haleem, A., Javaid, M., & Khan, I. H. (2019). Current status and applications of artificial intelligence (AI) in medical field: An overview. Current Medicine Research and Practice, 9(6), 231–237. https://doi.org/10.1016/j.cmrp.2019.11.005
  • Haring, K. S., Silvera-Tawil, D., Matsumoto, Y., Velonaki, M., & Watanabe, K. (2014). Perception of an android robot in Japan and Australia: A cross-cultural comparison. In Social Robotics: 6th International Conference, ICSR 2014, Sydney, Nsw, Australia, October 27–29, 2014. Proceedings (Vol. 6, pp. 166–175). Springer International Publishing. https://doi.org/10.1007/978-3-319-11973-1_17
  • Harrison, M. A., & Hall, A. E. (2010). Anthropomorphism, empathy, and perceived communicative ability vary with phylogenetic relatedness to humans. Journal of Social, Evolutionary, and Cultural Psychology, 4(1), 34–48. https://doi.org/10.1037/h0099303
  • Harst, L., Lantzsch, H., & Scheibe, M. (2019). Theories predicting end-user acceptance of telemedicine use: Systematic review. Journal of Medical Internet Research, 21(5), e13117. https://doi.org/10.2196/13117
  • Haslam, N. (2006). Dehumanization: An integrative review. Personality and Social Psychology Review, 10(3), 252–264. https://doi.org/10.1207/s15327957pspr1003_4
  • Haslam, N., & Loughnan, S. (2014). Dehumanization and infrahumanization. Annual Review of Psychology, 65(1), 399–423. https://doi.org/10.1146/annurev-psych-010213-115045
  • Hayes, A. F. (2018). Partial, conditional, and moderated moderated mediation: Quantification, inference, and interpretation. Communication Monographs, 85(1), 4–40. https://doi.org/10.1080/03637751.2017.1352100
  • He, J., Baxter, S. L., Xu, J., Xu, J., Zhou, X., & Zhang, K. (2019). The practical implementation of artificial intelligence technologies in medicine. Nature Medicine, 25(1), 30–36. https://doi.org/10.1038/s41591-018-0307-0
  • Hojat, M., Louis, D. Z., Markham, F. W., Wender, R., Rabinowitz, C., & Gonnella, J. S. (2011). Physicians’ empathy and clinical outcomes for diabetic patients. Academic Medicine: Journal of the Association of American Medical Colleges, 86(3), 359–364. https://doi.org/10.1097/ACM.0b013e3182086fe1
  • Hsieh, P.-J. (2016). An empirical investigation of patients’ acceptance and resistance toward the health cloud: The dual factor perspective. Computers in Human Behavior, 63, 959–969. https://doi.org/10.1016/j.chb.2016.06.029
  • Hu, J., & Wang, R. (2023). Familiarity breeds trust? The relationship between dating app use and trust in dating algorithms via algorithm awareness and critical algorithm perceptions. International Journal of Human–Computer Interaction, 1–12. https://doi.org/10.1080/10447318.2023.2217014
  • Huo, W., Yuan, X., Li, X., Luo, W., Xie, J., & Shi, B. (2023). Increasing acceptance of medical AI: The role of medical staff participation in AI development. International Journal of Medical Informatics, 175, 105073. https://doi.org/10.1016/j.ijmedinf.2023.105073
  • Huo, W., Zhang, Z., Qu, J., Yan, J., Yan, S., Yan, J., & Shi, B. (2023). Speciesism and preference of human–artificial intelligence interaction: A study on medical artificial intelligence. International Journal of Human–Computer Interaction, 1–13. https://doi.org/10.1080/10447318.2023.2176985
  • Huo, W., Zheng, G., Yan, J., Sun, L., & Han, L. (2022). Interacting with medical artificial intelligence: Integrating self-responsibility attribution, human–computer trust, and personality. Computers in Human Behavior, 132, 107253. https://doi.org/10.1016/j.chb.2022.107253
  • Inman, J. J., Campbell, M. C., Kirmani, A., & Price, L. L. (2018). Our vision for the Journal of Consumer Research: It’s all about the consumer. Journal of Consumer Research, 44(5), 955–959. https://doi.org/10.1093/jcr/ucx123
  • Isbanner, S., O'Shaughnessy, P., Steel, D., Wilcock, S., & Carter, S. (2022). The adoption of artificial intelligence in health care and social services in Australia: Findings from a methodologically innovative national survey of values and attitudes (the AVA-AI study). Journal of Medical Internet Research, 24(8), e37611. https://doi.org/10.2196/37611
  • Ivanov, S. H., & Webster, C. (2017). Adoption of robots, artificial intelligence and service automation by travel, tourism and hospitality companies–A cost-benefit analysis. In International Scientific Conference “Contemporary Tourism – Traditions and Innovations” (pp. 19–21). Sofia University.
  • Jin, S. V., & Youn, S. (2023). Social presence and imagery processing as predictors of chatbot continuance intention in human–AI-interaction. International Journal of Human–Computer Interaction, 39(9), 1874–1886. https://doi.org/10.1080/10447318.2022.2129277
  • Kaplan, F. (2004). Who is afraid of the humanoid? Investigating cultural differences in the acceptance of robots. International Journal of Humanoid Robotics, 01(03), 465–480. https://doi.org/10.1142/S0219843604000289
  • Kaplan, A. D., Sanders, T., & Hancock, P. A. (2018). The relationship between extroversion and the tendency to anthropomorphize robots: A Bayesian analysis. Frontiers in Robotics and AI, 5, 135. https://doi.org/10.3389/frobt.2018.00135
  • Keel, S., Lee, P. Y., Scheetz, J., Li, Z., Kotowicz, M. A., MacIsaac, R. J., & He, M. (2018). Feasibility and patient acceptability of a novel artificial intelligence-based screening model for diabetic retinopathy at endocrinology outpatient services: A pilot study. Scientific Reports, 8(1), 4330. https://doi.org/10.1038/s41598-018-22612-2
  • Kelley, J. M., Kraft-Todd, G., Schapira, L., Kossowsky, J., & Riess, H. (2014). The influence of the patient-clinician relationship on healthcare outcomes: A systematic review and meta-analysis of randomized controlled trials. PLOS One, 9(4), e94207. https://doi.org/10.1371/journal.pone.0094207
  • Kelly, C. J., Karthikesalingam, A., Suleyman, M., Corrado, G., & King, D. (2019). Key challenges for delivering clinical impact with artificial intelligence. BMC Medicine, 17(1), 195. https://doi.org/10.1186/s12916-019-1426-2
  • Kelly, S., Kaye, S. A., & Oviedo-Trespalacios, O. (2022). What factors contribute to the acceptance of artificial intelligence? A systematic review. Telematics and Informatics, 77, 101925. https://doi.org/10.1016/j.tele.2022.101925
  • Kemp, E., Bui, M., Krishen, A., Homer, P. M., & LaTour, M. S. (2017). Understanding the power of hope and empathy in healthcare marketing. Journal of Consumer Marketing, 34(2), 85–95. https://doi.org/10.1108/JCM-04-2016-1765
  • Kerasidou, A. (2020). Artificial intelligence and the ongoing need for empathy, compassion and trust in healthcare. Bulletin of the World Health Organization, 98(4), 245–250. https://doi.org/10.2471/BLT.19.237198
  • Khairat, S., Marc, D., Crosby, W., & Al Sanousi, A. (2018). Reasons for physicians not adopting clinical decision support systems: Critical analysis. JMIR Medical Informatics, 6(2), e8912. https://doi.org/10.2196/medinform.8912
  • KIjsanayotin, B., Pannarunothai, S., & Speedie, S. M. (2009). Factors influencing health information technology adoption in Thailand’s community health centers: Applying the UTAUT model. International Journal of Medical Informatics, 78(6), 404–416. https://doi.org/10.1016/j.ijmedinf.2008.12.005
  • Kim, S. Y., Schmitt, B. H., & Thalmann, N. M. (2019). Eliza in the uncanny valley: Anthropomorphizing consumer robots increases their perceived warmth but decreases liking. Marketing Letters, 30(1), 1–12. https://doi.org/10.1007/s11002-019-09485-9
  • Kim, W. B., & Hur, H. J. (2023). What makes people feel empathy for AI chatbots? Assessing the role of competence and warmth. International Journal of Human–Computer Interaction, 1–14. https://doi.org/10.1080/10447318.2023.2219961
  • Klotz, A. C., Swider, B. W., & Kwon, S. H. (2022). Back-translation practices in organizational research: Avoiding loss in translation. The Journal of Applied Psychology, 108(5), 699–727. https://doi.org/10.1037/apl0001050
  • Komorowski, M., Celi, L., Badawi, O., Gordon, A. C., & Faisal, A. A. (2018). The artificial intelligence clinician learns optimal treatment strategies for sepsis in intensive care. Nature Medicine, 24(11), 1716–1720. https://doi.org/10.1038/s41591-018-0213-5
  • Kraus, J., Miller, L., Klumpp, M., Babel, F., Scholz, D., Merger, J., & Baumann, M. (2023). On the role of beliefs and trust for the intention to use service robots: An integrated trustworthiness beliefs model for robot acceptance. International Journal of Social Robotics, 1–24. https://doi.org/10.1007/s12369-022-00952-4
  • Kristensen, D. B., & Ruckenstein, M. (2018). Co-evolving with self-tracking technologies. New Media & Society, 20(10), 3624–3640. https://doi.org/10.1177/1461444818755650
  • Lalmuanawma, S., Hussain, J., & Chhakchhuak, L. (2020). Applications of machine learning and artificial intelligence for Covid-19 (SARS-CoV-2) pandemic: A review. Chaos, Solitons, and Fractals, 139, 110059. https://doi.org/10.1016/j.chaos.2020.110059
  • Leite, I., Pereira, A., Mascarenhas, S., Martinho, C., Prada, R., & Paiva, A. (2013). The influence of empathy in human–robot relations. International Journal of Human–Computer Studies, 71(3), 250–260. https://doi.org/10.1016/j.ijhcs.2012.09.005
  • Leyens, J., Cortes, B., Demoulin, S., Dovidio, J. F., Fiske, S. T., Gaunt, R., Paladino, M., Rodriguez‐Perez, A., Rodriguez‐Torres, R., & Vaes, J. (2003). Emotional prejudice, essentialism, and nationalism: The 2002 Tajfel Lecture. European Journal of Social Psychology, 33(6), 703–717. https://doi.org/10.1002/ejsp.170
  • Li, C. R., Zhang, E., & Han, J. T. (2021). Adoption of online follow-up service by patients: An empirical study based on the elaboration likelihood model. Computers in Human Behavior, 114, 106581. https://doi.org/10.1016/j.chb.2020.106581
  • Lin, Y., Huang, G., Ho, Y., & Lou, M. (2020). Patient willingness to undergo a two‐week free trial of a telemedicine service for coronary artery disease after coronary intervention: A mixed‐methods study. Journal of Nursing Management, 28(2), 407–416. https://doi.org/10.1111/jonm.12942
  • Liu, K., & Tao, D. (2022). The roles of trust, personalization, loss of privacy, and anthropomorphism in public acceptance of smart healthcare services. Computers in Human Behavior, 127, 107026. https://doi.org/10.1016/j.chb.2021.107026
  • Liu, R., Rong, Y., & Peng, Z. (2020). A review of medical artificial intelligence. Global Health Journal, 4(2), 42–45. https://doi.org/10.1016/j.glohj.2020.04.002
  • Liu, Y., Yan, W., Hu, B., Li, Z., & Lai, Y. L. (2022). Effects of personalization and source expertise on users’ health beliefs and usage intention toward health chatbots: Evidence from an online experiment. Digital Health, 8. https://doi.org/10.1177/20552076221129718
  • Longoni, C., Bonezzi, A., & Morewedge, C. K. (2019). Resistance to medical artificial intelligence. Journal of Consumer Research, 46(4), 629–650. https://doi.org/10.1093/jcr/ucz013
  • Loughnan, S., & Haslam, N. (2007). Animals and androids: Implicit associations between social categories and nonhumans. Psychological Science, 18(2), 116–121. https://doi.org/10.1111/j.1467-9280.2007.01858.x
  • Lu, L., Cai, R., & Gursoy, D. (2019). Developing and validating a service robot integration willingness scale. International Journal of Hospitality Management, 80, 36–51. https://doi.org/10.1016/j.ijhm.2019.01.005
  • Luo, X., Tong, S., Fang, Z., & Qu, Z. (2019). Frontiers: Machines vs. humans: The impact of artificial intelligence chatbot disclosure on customer purchases. Marketing Science, 38(6), 937–947. https://doi.org/10.1287/mksc.2019.1192
  • Lv, X., Liu, Y., Luo, J., Liu, Y., & Li, C. (2021). Does a cute artificial intelligence assistant soften the blow? The impact of cuteness on customer tolerance of assistant service failure. Annals of Tourism Research, 87, 103114. https://doi.org/10.1016/j.annals.2020.103114
  • Markus, A. F., Kors, J. A., & Rijnbeek, P. R. (2021). The role of explainability in creating trustworthy artificial intelligence for health care: A comprehensive survey of the terminology, design choices, and evaluation strategies. Journal of Biomedical Informatics, 113, 103655. https://doi.org/10.1016/j.jbi.2020.103655
  • McElroy, J. C., Hendrickson, A. R., Townsend, A. M., & DeMarie, S. M. (2007). Dispositional factors in internet use: Personality versus cognitive style. MIS Quarterly, 31(4), 809–820. https://doi.org/10.2307/25148821
  • Mende, M., Scott, M. L., van Doorn, J., Grewal, D., & Shanks, I. (2019). Service robots rising: How humanoid robots influence service experiences and elicit compensatory consumer responses. Journal of Marketing Research, 56(4), 535–556. https://doi.org/10.1177/0022243718822827
  • Nambisan, P. (2011). Information seeking and social support in online health communities: Impact on patients’ perceived empathy. Journal of the American Medical Informatics Association, 18(3), 298–304. https://doi.org/10.1136/amiajnl-2010-000058
  • Naneva, S., Sarda Gou, M., Webb, T. L., & Prescott, T. J. (2020). A systematic review of attitudes, anxiety, acceptance, and trust towards social robots. International Journal of Social Robotics, 12(6), 1179–1201. https://doi.org/10.1007/s12369-020-00659-4
  • Nissenbaum, H., & Walker, D. (1998). Will computers dehumanize education? A grounded approach to values at risk. Technology in Society, 20(3), 237–273. https://doi.org/10.1016/S0160-791X(98)00011-6
  • Palmeira, M., & Spassova, G. (2015). Consumer reactions to professionals who use decision aids. European Journal of Marketing, 49(3/4), 302–326. https://doi.org/10.1108/EJM-07-2013-0390
  • Pelau, C., Dabija, D. C., & Ene, I. (2021). What makes an AI device human-like? The role of interaction quality, empathy and perceived psychological anthropomorphic characteristics in the acceptance of artificial intelligence in the service industry. Computers in Human Behavior, 122, 106855. https://doi.org/10.1016/j.chb.2021.106855
  • Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. The Journal of Applied Psychology, 88(5), 879–903. https://doi.org/10.1037/0021-9010.88.5.879
  • Rosenthal-von Der Pütten, A. M., Schulte, F. P., Eimler, S. C., Sobieraj, S., Hoffmann, L., Maderwald, S., Brand, M., & Krämer, N. C. (2014). Investigations on empathy towards humans and robots using fMRI. Computers in Human Behavior, 33, 201–212. https://doi.org/10.1016/j.chb.2014.01.004
  • Sattar, K., Yusoff, M. S. B., Arifin, W. N., Mohd Yasin, M. A., & Mat Nor, M. Z. (2023). A scoping review on the relationship between mental wellbeing and medical professionalism. Medical Education Online, 28(1), 2165892. https://doi.org/10.1080/10872981.2023.2165892
  • Scopelliti, I., Morewedge, C. K., McCormick, E., Min, H. L., Lebrecht, S., & Kassam, K. S. (2015). Bias blind spot: Structure, measurement, and consequences. Management Science, 61(10), 2468–2486. https://doi.org/10.1287/mnsc.2014.2096
  • Seethamraju, R., Diatha, K. S., & Garg, S. (2018). Intention to use a mobile-based information technology solution for tuberculosis treatment monitoring – Applying a UTAUT model. Information Systems Frontiers, 20(1), 163–181. https://doi.org/10.1007/s10796-017-9801-z
  • Sela, Y., & Amichai-Hamburger, Y. (2023). “Baby, I Can’t Drive My Car”: How controllability mediates the relationship between personality and the acceptance of autonomous vehicles? International Journal of Human–Computer Interaction, 1–11. https://doi.org/10.1080/10447318.2023.2219965
  • Sensenig, J., & Brehm, J. W. (1968). Attitude change from an implied threat to attitudinal freedom. Journal of Personality and Social Psychology, 8(4), 324–330. https://doi.org/10.1037/h0021241
  • Seyitoğlu, F., & Ivanov, S. (2020). A conceptual framework of the service delivery system design for hospitality firms in the (post-) viral world: The role of service robots. International Journal of Hospitality Management, 91, 102661. https://doi.org/10.1016/j.ijhm.2020.102661
  • Simşek, O. F., & Demir, M. (2014). A cross-cultural investigation into the relationships among parental support for basic psychological needs, sense of uniqueness, and happiness. The Journal of Psychology, 148(4), 387–411. https://doi.org/10.1080/00223980.2013.805115
  • Şimşek, Ö. F., & Yalınçetin, B. (2010). I feel unique, therefore I am: The development and preliminary validation of the personal sense of uniqueness (PSU) scale. Personality and Individual Differences, 49(6), 576–581. https://doi.org/10.1016/j.paid.2010.05.006
  • Singh, P., King-Shier, K., & Sinclair, S. (2020). South Asian patients’ perceptions and experiences of compassion in healthcare. Ethnicity & Health, 25(4), 606–624. https://doi.org/10.1080/13557858.2020.1722068
  • Snyder, C. R., & Fromkin, H. L. (2012). Uniqueness: The human pursuit of difference. Springer Science & Business Media.
  • Sohn, K., & Kwon, O. (2020). Technology acceptance theories and factors influencing artificial Intelligence-based intelligent products. Telematics and Informatics, 47, 101324. https://doi.org/10.1016/j.tele.2019.101324
  • Somasekar, J., Sharma, A., & Ramesh, G. (2020). Machine learning and image analysis applications in the fight against COVID-19 pandemic: Datasets, research directions, challenges and opportunities. Materials Today: Proceedings. https://doi.org/10.1016/j.matpr.2020.09.352
  • Sousa, V. D., & Rojjanasrirat, W. (2011). Translation, adaptation and validation of instruments or scales for use in cross-cultural health care research: A clear and user-friendly guideline: Validation of instruments or scales. Journal of Evaluation in Clinical Practice, 17(2), 268–274. https://doi.org/10.1111/j.1365-2753.2010.01434.x
  • Spiro, H. (2009). Commentary: The practice of empathy. Academic Medicine: Journal of the Association of American Medical Colleges, 84(9), 1177–1179. https://doi.org/10.1097/ACM.0b013e3181b18934
  • Stai, B., Heller, N., McSweeney, S., Rickman, J., Blake, P., Vasdev, R., Edgerton, Z., Tejpaul, R., Peterson, M., Rosenberg, J., Kalapara, A., Regmi, S., Papanikolopoulos, N., & Weight, C. (2020). Public perceptions of artificial intelligence and robotics in medicine. Journal of Endourology, 34(10), 1041–1048. https://doi.org/10.1101/2019.12.15.19014985
  • Stawarz, K., Katz, D., Ayobi, A., Marshall, P., Yamagata, T., Santos-Rodriguez, R., Flach, P., & O’Kane, A. A. (2023). Co-designing opportunities for human-centred machine learning in supporting type 1 diabetes decision-making. International Journal of Human–Computer Studies, 173, 103003. https://doi.org/10.1016/j.ijhcs.2023.103003
  • Stein, J. P., Appel, M., Jost, A., & Ohler, P. (2020). Matter over mind? How the acceptance of digital entities depends on their appearance, mental prowess, and the interaction between both. International Journal of Human–Computer Studies, 142, 102463. https://doi.org/10.1016/j.ijhcs.2020.102463
  • Tao, D., Chen, Z., Qin, M., & Cheng, M. (2023). Modeling consumer acceptance and usage behaviors of m-Health: An integrated model of self-determination theory, task–technology fit, and the technology acceptance model. Healthcare, 11(11), 1550. https://doi.org/10.3390/healthcare11111550
  • Topol, E. J. (2019). High-performance medicine: The convergence of human and artificial intelligence. Nature Medicine, 25(1), 44–56. https://doi.org/10.1038/s41591-018-0300-7
  • Tran, A. Q., Nguyen, L. H., Nguyen, H. S. A., Nguyen, C. T., Vu, L. G., Zhang, M., Vu, T. M. T., Nguyen, S. H., Tran, B. X., Latkin, C. A., Ho, R. C. M., & Ho, C. S. H. (2021). Determinants of intention to use artificial intelligence-based diagnosis support system among prospective physicians. Frontiers in Public Health, 9, 755644. https://doi.org/10.3389/fpubh.2021.755644
  • Uzir, M. U. H., Al Halbusi, H., Lim, R., Jerin, I., Abdul Hamid, A. B., Ramayah, T., & Haque, A. (2021). Applied artificial intelligence and user satisfaction: Smartwatch usage for healthcare in Bangladesh during COVID-19. Technology in Society, 67, 101780. https://doi.org/10.1016/j.techsoc.2021.101780
  • Vaishya, R., Javaid, M., Khan, I. H., & Haleem, A. (2020). Artificial intelligence (AI) applications for COVID-19 pandemic. Diabetes & Metabolic Syndrome, 14(4), 337–339. https://doi.org/10.1016/j.dsx.2020.04.012
  • Valtolina, S., Barricelli, B. R., & Di Gaetano, S. (2020). Communicability of traditional interfaces VS chatbots in healthcare and smart home domains. Behaviour & Information Technology, 39(1), 108–132. https://doi.org/10.1080/0144929X.2019.1637025
  • Vanman, E. J., & Kappas, A. (2019). “Danger, Will Robinson!” The challenges of social robots for intergroup relations. Social and Personality Psychology Compass, 13(8), e12489. https://doi.org/10.1111/spc3.12489
  • Veazie, P. J., & Cai, S. (2007). A connection between medication adherence, patient sense of uniqueness, and the personalization of information. Medical Hypotheses, 68(2), 335–342. https://doi.org/10.1016/j.mehy.2006.04.077
  • Wade, V., Gray, L., & Carati, C. (2017). Theoretical frameworks in telemedicine research. Journal of Telemedicine and Telecare, 23(1), 181–187. https://doi.org/10.1177/1357633X15626650
  • Walther, J. B., Jang, J. W., & Hanna Edwards, A. A. (2018). Evaluating health advice in a Web 2.0 environment: The impact of multiple user-generated factors on HIV advice perceptions. Health Communication, 33(1), 57–67. https://doi.org/10.1080/10410236.2016.1242036
  • Wang, H., Zhang, J., Luximon, Y., Qin, M., Geng, P., & Tao, D. (2022). The determinants of user acceptance of mobile medical platforms: An investigation integrating the TPB, TAM, and patient-centered factors. International Journal of Environmental Research and Public Health, 19(17), 10758. https://doi.org/10.3390/ijerph191710758
  • Wang, J., Molina, M. D., & Sundar, S. S. (2020). When expert recommendation contradicts peer opinion: Relative social influence of valence, group identity and artificial intelligence. Computers in Human Behavior, 107, 106278. https://doi.org/10.1016/j.chb.2020.106278
  • Wang, S., Lilienfeld, S. O., & Rochat, P. (2015). The uncanny valley: Existence and explanations. Review of General Psychology, 19(4), 393–407. https://doi.org/10.1037/gpr0000056
  • Wang, Z., Walther, J. B., Pingree, S., & Hawkins, R. P. (2008). Health information, credibility, homophily, and influence via the Internet: Web sites versus discussion groups. Health Communication, 23(4), 358–368. https://doi.org/10.1080/10410230802229738
  • Waytz, A., Gray, K., Epley, N., & Wegner, D. M. (2010). Causes and consequences of mind perception. Trends in Cognitive Sciences, 14(8), 383–388. https://doi.org/10.1016/j.tics.2010.05.006
  • Wu, W., Wu, Y. J., & Wang, H. (2021). Perceived city smartness level and technical information transparency: The acceptance intention of health information technology during a lockdown. Computers in Human Behavior, 122, 106840. https://doi.org/10.1016/j.chb.2021.106840
  • Wu, W. Y., & Ke, C. C. (2015). An online shopping behavior model integrating personality traits, perceived risk, and technology acceptance. Social Behavior and Personality: An International Journal, 43(1), 85–97. https://doi.org/10.2224/sbp.2015.43.1.85
  • Xie, X., Han, Y., Anderson, A., & Ribeiro-Navarrete, S. (2022). Digital platforms and SMEs’ business model innovation: Exploring the mediating mechanisms of capability reconfiguration. International Journal of Information Management, 65, 102513. https://doi.org/10.1016/j.ijinfomgt.2022.102513
  • Yam, K. C., Bigman, Y. E., Tang, P. M., Ilies, R., De Cremer, D., Soh, H., & Gray, K. (2021). Robots at work: People prefer—And forgive—Service robots with perceived feelings. The Journal of Applied Psychology, 106(10), 1557–1572. https://doi.org/10.1037/apl0000834
  • Yang, Y., Liu, Y., Lv, X., Ai, J., & Li, Y. (2022). Anthropomorphism and customers’ willingness to use artificial intelligence service agents. Journal of Hospitality Marketing & Management, 31(1), 1–23. https://doi.org/10.1080/19368623.2021.1926037
  • Ye, T., Xue, J., He, M., Gu, J., Lin, H., Xu, B., & Cheng, Y. (2019). Psychosocial factors affecting Artificial Intelligence adoption in health care in China: Cross-sectional study. Journal of Medical Internet Research, 21(10), e14316. https://doi.org/10.2196/14316
  • Yin, J., Ngiam, K. Y., & Teo, H. H. (2021). Role of artificial intelligence applications in real-life clinical practice: Systematic review. Journal of Medical Internet Research, 23(4), e25759. https://doi.org/10.2196/25759
  • Yogeeswaran, K., Złotowski, J., Livingstone, M., Bartneck, C., Sumioka, H., & Ishiguro, H. (2016). The interactive effects of robot anthropomorphism and robot ability on perceived threat and support for robotics research. Journal of Human–Robot Interaction, 5(2), 29–47. https://doi.org/10.5898/JHRI.5.2.Yogeeswaran
  • Yokoi, R., Eguchi, Y., Fujita, T., & Nakayachi, K. (2021). Artificial intelligence is trusted less than a doctor in medical treatment decisions: influence of perceived care and value similarity. International Journal of Human–Computer Interaction, 37(10), 981–990. https://doi.org/10.1080/10447318.2020.1861763
  • Young, K. L., & Carpenter, C. (2018). Corrigendum to “Does Science Fiction Affect Political Fact? Yes and No: A Survey Experiment on ‘Killer Robots’”. International Studies Quarterly, 63(1), 213–213. https://doi.org/10.1093/isq/sqy044
  • Yun, J. H., Lee, E., & Kim, D. H. (2021). Behavioral and neural evidence on consumer responses to human doctors and medical artificial intelligence. Psychology & Marketing, 38(4), 610–625. https://doi.org/10.1002/mar.21445
  • Zhang, Z., & Zheng, L. (2021). Consumer community cognition, brand loyalty, and behaviour intentions within online publishing communities: An empirical study of Epubit in China. Learned Publishing, 34(2), 116–127. https://doi.org/10.1002/leap.1327
  • Złotowski, J., Yogeeswaran, K., & Bartneck, C. (2017). Can we control it? Autonomous robots threaten human identity, uniqueness, safety, and resources. International Journal of Human–Computer Studies, 100, 48–54. https://doi.org/10.1016/j.ijhcs.2016.12.008

Reprints and Corporate Permissions

Please note: Selecting permissions does not provide access to the full text of the article, please see our help page How do I view content?

To request a reprint or corporate permissions for this article, please click on the relevant link below:

Academic Permissions

Please note: Selecting permissions does not provide access to the full text of the article, please see our help page How do I view content?

Obtain permissions instantly via Rightslink by clicking on the button below:

If you are unable to obtain permissions via Rightslink, please complete and submit this Permissions form. For more information, please visit our Permissions help page.