Research Article

Toward a design theory for virtual companionship

Pages 194-234 | Received 08 Oct 2021, Accepted 23 May 2022, Published online: 18 Jul 2022

References

  • Abbass, H. A. (2019). Social integration of artificial intelligence: Functions, automation allocation logic and human-autonomy trust. Cognitive Computation, 11(2), 159–171. https://doi.org/10.1007/s12559-018-9619-0
  • Ahmad, R., Siemon, D., Gnewuch, U., & Robra-Bissantz, S. (2021) The benefits and caveats of personality-adaptive conversational agents in mental health care. AMCIS 2021 Proceedings
  • Ahmad, R., Siemon, D., Gnewuch, U., & Robra-Bissantz, S. (2022). Designing personality-adaptive conversational agents for mental health care. Inf Syst Front. https://doi.org/10.1007/s10796-022-10254-9
  • Al-Natour, S., Benbasat, I., & Cenfetelli, R. (2010) Trustworthy virtual advisors and enjoyable interactions: Designing for expressiveness and transparency. In: Proceedings of European Conference on Information Systems 2010
  • Altman, I., & Taylor, D. A. (1973). Social penetration: The development of interpersonal relationships. Holt, Rinehart & Winston.
  • Anderson, M., & Anderson, S. L. (2011). Machine ethics. Cambridge University Press.
  • Ashleigh, M. J., Higgs, M., & Dulewicz, V. (2012). A new propensity to trust scale and its relationship with individual well-being: Implications for HRM policies and practices. Human Resource Management Journal, 22, 360–376.
  • Asikis, T., & Pournaras, E. (2020). Optimization of privacy-utility trade-offs under informational self-determination. Future Generation Computer Systems, 109, 488–499. https://doi.org/10.1016/j.future.2018.07.018
  • Baumeister, R. F., & Leary, M. R. (1995). The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychological Bulletin, 117(3), 497. https://doi.org/10.1037/0033-2909.117.3.497
  • Becker, C., Kopp, S., & Wachsmuth, I. (2007). Why emotions should be integrated into conversational agents. In Conversational informatics: An engineering approach (pp. 49–67).
  • Benbasat, I., & Wang, W. (2005). Trust in and adoption of online recommendation agents. Journal of the Association for Information Systems, 6(3), 72–101. https://doi.org/10.17705/1jais.00065
  • Benke, I., Gnewuch, U., & Maedche, A. (2022). Understanding the impact of control levels over emotion-aware chatbots. Computers in Human Behavior, 129, 107122. https://doi.org/10.1016/j.chb.2021.107122
  • Berscheid, E. (1999). The greening of relationship science. American Psychologist, 54, 260–266.
  • Bickmore, T. W., Caruso, L., Clough-Gorr, K., & Heeren, T. (2005). ‘It’s just like you talk to a friend’ relational agents for older adults. Interacting with Computers, 17(6), 711–735. https://doi.org/10.1016/j.intcom.2005.09.002
  • Bickmore, T. W., & Picard, R. W. (2005). Establishing and maintaining long-term human-computer relationships. ACM Trans Comput-Hum Interact, 12, 293–327.
  • Bickmore, T., Trinh, H., Asadi, R., & Olafsson, S. (2018). Safety first: Conversational agents for health care. In R. J. Moore, M. H. Szymanski, R. Arar, & G.-J. Ren (Eds.), Studies in conversational UX design (pp. 33–57). Springer International Publishing.
  • Blau, P. M. (1968). Social exchange. International Encyclopedia of the Social Sciences, 7, 452–457.
  • Bødker, S., & Kyng, M. (2018). Participatory design that matters—facing the big issues. ACM Trans Comput-Hum Interact, 25(4), 1–4. https://doi.org/10.1145/3152421
  • Brown, T. B., Mann, B., Ryder, N. et al. (2020). Language models are few-shot learners. arXiv:2005.14165 [cs.CL]
  • Bukowski, W. M., Hoza, B., & Boivin, M. (1993). Popularity, friendship, and emotional adjustment during early adolescence. New Directions for Child and Adolescent Development, 1993(60), 23–37. https://doi.org/10.1002/cd.23219936004
  • Bukowski, W. M., Hoza, B., & Boivin, M. (1994). Measuring friendship quality during pre- and early adolescence: The development and psychometric properties of the friendship qualities scale. Journal of Social and Personal Relationships, 11(3), 471–484. https://doi.org/10.1177/0265407594113011
  • Cacioppo, J. T., & Patrick, W. (2008). Loneliness: Human nature and the need for social connection. W. W. Norton & Company.
  • Carpenter, A. M., & Greene, K. (2015). Social penetration theory.
  • Carruthers, P., & Smith, P. K. (1996). Theories of theories of mind. Cambridge University Press.
  • Cassell, J., Bickmore, T., Billinghurst, M. et al (1999) Embodiment in Conversational Interfaces: Rea. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, pp 520–527
  • Chandra, L., Seidel, S., & Gregor, S. (2015) Prescriptive knowledge in is research: conceptualizing design principles in terms of materiality, action, and boundary conditions. In: 2015 48th Hawaii international conference on system sciences. pp 4039–4048
  • Chaves, A. P., & Gerosa, M. A. (2021). How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design. International Journal of Human–Computer Interaction, 37(8), 729–758. https://doi.org/10.1080/10447318.2020.1841438
  • Choudhury, M. D., Lee, M. K., Zhu, H., & Shamma, D. A. (2020). Introduction to this special issue on unifying human computer interaction and artificial intelligence. Human–Computer Interaction, 35(5–6), 355–361. https://doi.org/10.1080/07370024.2020.1744146
  • Clark, H. H., & Schaefer, E. F. (1987). Collaborating on contributions to conversations. Language and Cognitive Processes, 2(1), 19–41. https://doi.org/10.1080/01690968708406350
  • Clark, H. H., & Brennan, S. E. (1991). Grounding in communication. In L. B. Resnick, J. Levine, & S. D. Behrend (Eds.), Perspectives on socially shared cognition (pp. 127–149). APA.
  • Clark, H. H. (1992). Arenas of language use. University of Chicago Press.
  • Clark, H. H. (1996). Using language. Cambridge University Press.
  • Clark, L., Doyle, P., Garaialde, D. et al (2019a). The state of speech in HCI: Trends, themes and challenges. Interacting with Computers, 31, 349–371.
  • Clark, L., Pantidi, N., Cooney, O. et al (2019b). What makes a good conversation? Challenges in designing truly conversational agents. In: Proceedings of the 2019 CHI conference on human factors in computing systems. Association for Computing Machinery, New York, NY, USA, pp 1–12
  • Clark, L., Ofemile, A., & Cowan, B. R. (2021). Exploring verbal uncanny valley effects with vague language in computer speech. In B. Weiss, J. Trouvain, M. Barkat-Defradas, & J. J. Ohala (Eds.), Voice attractiveness: Studies on sexy, likable, and charismatic speakers (pp. 317–330). Springer.
  • Collins, E., & Ghahramani, Z. (2021) LaMDA: Our breakthrough conversation technology. In: Google. Accessed 7 Sep 2021. https://blog.google/technology/ai/lamda/
  • Cornelissen, J., Höllerer, M. A., & Seidl, D. (2021). What theory is and can be: Forms of theorizing in organizational scholarship. Organization Theory, 2. https://doi.org/10.1177/26317877211020328
  • Danilava, S., Busemann, S., & Schommer, C. (2012). Artificial conversational companions: A requirements analysis. 282–289.
  • Dautenhahn, K. (2004) Robots we like to live with - a developmental perspective on a personalized, life-long robot companion. In: RO-MAN 2004. 13th IEEE international workshop on robot and human interactive communication (IEEE catalog No.04TH8759). pp 17–22
  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008
  • De Angeli, A., Lynch, P., & Johnson, G. (2001) Personifying the e-market. In: Proceedings of the 8th TC13 IFIP International Conference on Human-Computer Interaction (INTERACT ’01), pp 198–205
  • de Visser, E. J., Monfort, S. S., McKendrick, R. et al (2016). Almost human: Anthropomorphism increases trust resilience in cognitive agents. Journal of Experimental Psychology. Applied, 22, 331–349. https://doi.org/10.1037/xap0000092
  • Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. Plenum.
  • Demeure, V., Niewiadomski, R., & Pelachaud, C. (2011). How is believability of a virtual agent related to warmth, competence, personification, and embodiment? Presence, 20, 431–448. https://doi.org/10.1162/PRES_a_00065
  • Diederich, S., Brendel, A. B., & Kolbe, L. M. (2019a) Towards a taxonomy of platforms for conversational agent design. In: Proceedings of the international conference on Wirtschaftsinformatik. p 16
  • Diederich, S., Brendel, A. B., & Kolbe, L. M. (2019b) On conversational agents in information systems research: Analyzing the past to guide future work. In: Proceedings of the international conference on Wirtschaftsinformatik. p 16
  • Diederich, S., Brendel, A., Morana, S., & Kolbe, L. (2022). On the design of and interaction with conversational agents: an organizing and assessing review of human-computer interaction research. Journal of the Association for Information Systems, 23(1), 96–138. https://doi.org/10.17705/1jais.00724
  • Doyle, P. R., Edwards, J., Dumbleton, O. et al (2019) Mapping perceptions of humanness in intelligent personal assistant interaction. In: Proceedings of the 21st international conference on human-computer interaction with mobile devices and services. association for computing machinery, New York, NY, USA, pp 1–12
  • Doyle, P. R., Clark, L., & Cowan, B. R. (2021) What do we see in them? Identifying dimensions of partner models for speech interfaces using a psycholexical approach. In: Proceedings of the 2021 CHI conference on human factors in computing systems. association for computing machinery, New York, NY, USA, pp 1–14
  • Elshan, E., Zierau, N., Engel, C. et al. (2022). Understanding the design elements affecting user acceptance of intelligent agents: past, present and future. Inf Syst Front, https://doi.org/10.1007/s10796-021-10230-9
  • Elson, J. S., Derrick, D., & Ligon, G. (2018) Examining trust and reliance in collaborations between humans and automated agents. In: Proceedings of Hawaii international conference on system sciences 2018
  • Fehr, B. (1996). Friendship processes. SAGE.
  • Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2019). A taxonomy of social cues for conversational agents. International Journal of Human-Computer Studies, S1071581918305238. https://doi.org/10.1016/j.ijhcs.2019.07.009
  • Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2020). Gender bias in chatbot design. In A. Følstad, T. Araujo, S. Papadopoulos et al (Eds.), Chatbot research and design (pp. 79–93). Springer International Publishing.
  • Fischer-Hübner, S., Angulo, J., Karegar, F., & Pulls, T. (2016). Transparency, privacy and trust – technology for tracking and controlling my data disclosures: Does this work? In S. M. Habib, J. Vassileva, S. Mauw, & M. Mühlhäuser (Eds.), Trust management X (pp. 3–14). Springer International Publishing.
  • Fogg, B. J. (2002). Persuasive technology: Using computers to change what we think and do. Ubiquity, 2002. https://doi.org/10.1145/764008.763957
  • Fox, J., & Gambino, A. (2021). Relationship development with humanoid social robots: Applying interpersonal theories to human–robot interaction. Cyberpsychology, Behavior, and Social Networking. https://doi.org/10.1089/cyber.2020.0181
  • Gnewuch, U., Morana, S., & Maedche, A. (2017) Towards designing cooperative and social conversational agents for customer service. In: Proceedings of the 38th international conference on information systems (ICIS)
  • Gnewuch, U., Morana, S., Adam, M., & Maedche, A. (2018a) Faster is not always better: understanding the effect of dynamic response delays in human-chatbot interaction. In: proceedings of European conference on information systems 2018
  • Gnewuch, U., Morana, S., Heckmann, C., & Maedche, A. (2018b) Designing conversational agents for energy feedback. In: International conference on design science research in information systems and technology. Springer, pp 18–33
  • Gong, L. (2008). How social is social responses to computers? The function of the degree of anthropomorphism in computer representations. Computers in Human Behavior, 24, 1494–1509. https://doi.org/10.1016/j.chb.2007.05.007
  • Gouldner, A. W. (1960). The norm of reciprocity: A preliminary statement. American Sociological Review, 25, 161–178. https://doi.org/10.2307/2092623
  • Gregor, S. (2002). Design theory in information systems. Australasian Journal of Information Systems, 10(1). https://doi.org/10.3127/ajis.v10i1.439
  • Gregor, S. (2006). The nature of theory in information systems. MIS Quarterly, 30(3), 611–642. https://doi.org/10.2307/25148742
  • Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems, 8.
  • Gregor, S., & Hevner, A. R. (2013). Positioning and presenting design science research for maximum impact. MIS Quarterly, 37(2), 337–355. https://doi.org/10.25300/MISQ/2013/37.2.01
  • Gregor, S., Kruse, L. C., & Seidel, S. (2020). Research perspectives: the anatomy of a design principle. Journal of the Association for Information Systems, 21. https://doi.org/10.17705/1jais.00649
  • Guzman, A. (2017). Making AI Safe for Humans: A Conversation With Siri. In Socialbots and Their Friends (pp. 69–82). Routledge.
  • Harvey, P. H., Currie, E., Daryanani, P., & Augusto, J. C. (2016). Enhancing student support with a virtual assistant. In G. Vincenti, A. Bucciero, & C. Vaz de Carvalho (Eds.), E-learning, E-education, and online training (pp. 101–109). Springer International Publishing.
  • Hecht, M. L. (1978). The conceptualization and measurement of interpersonal communication satisfaction. Human Communication Research, 4, 253–264. https://doi.org/10.1111/j.1468-2958.1978.tb00614.x
  • Heerink, M., Kröse, B., Evers, V., & Wielinga, B. (2010). Relating conversational expressiveness to social presence and acceptance of an assistive social robot. Virtual Reality, 14, 77–84. https://doi.org/10.1007/s10055-009-0142-1
  • Hevner, A., March, S. T., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28, 75–105. https://doi.org/10.2307/25148625
  • Hildebrand, C., & Bergner, A. (2021). Conversational robo advisors as surrogates of trust: Onboarding experience, firm perception, and consumer financial decision making. J of the Acad Mark Sci, 49, 659–676. https://doi.org/10.1007/s11747-020-00753-z
  • Hinde, R. A. (1979). Towards understanding relationships. Academic Press.
  • Hobert, S. (2019) Say hello to ‘coding tutor’! Design and evaluation of a chatbot-based learning system supporting students to learn to program. ICIS 2019 Proceedings
  • Homans, G. C. (1958). Social behavior as exchange. American Journal of Sociology, 63(6), 597–606. https://doi.org/10.1086/222355
  • Horton, D., & Wohl, R. R. (1956). Mass communication and para-social interaction. Psychiatry, 19, 215–229. https://doi.org/10.1080/00332747.1956.11023049
  • Hu, P., Lu, Y., & Gong, Y. (2021). Dual humanness and trust in conversational AI: A person-centered approach. Computers in Human Behavior, 119, 106727. https://doi.org/10.1016/j.chb.2021.106727
  • Iivari, J. (2015). Distinguishing and contrasting two strategies for design science research. European Journal of Information Systems, 24(1), 107–115. https://doi.org/10.1057/ejis.2013.35
  • Iivari, J. (2019). A critical look at theories in design science research.
  • Kanda, T., Hirano, T., Eaton, D., & Ishiguro, H. (2004). Interactive robots as social partners and peer tutors for children: A field trial. Human–Computer Interaction, 19, 61–84. https://doi.org/10.1080/07370024.2004.9667340
  • Kankanhalli, A., Tan, B. C. Y., & Wei, K.-K. (2005). Contributing knowledge to electronic knowledge repositories: An empirical investigation. MIS Quarterly, 29, 113–143.
  • Kaplan, A., & Haenlein, M. (2019). Siri, siri, in my hand: Who’s the fairest in the land? On the interpretations, illustrations, and implications of artificial intelligence. Business Horizons, 62, 15–25. https://doi.org/10.1016/j.bushor.2018.08.004
  • Kapp, K. M. (2012). The gamification of learning and instruction. Wiley San Francisco.
  • Kelley, H. H., & Thibaut, J. W. (1978). Interpersonal relations: A theory of interdependence. John Wiley & Sons.
  • Kim, K. J., Park, E., & Shyam Sundar, S. (2013). Caregiving role in human–robot interaction: A study of the mediating effects of perceived benefit and social presence. Computers in Human Behavior, 29, 1799–1806. https://doi.org/10.1016/j.chb.2013.02.009
  • Knijnenburg, B. P., & Willemsen, M. C. (2016). Inferring capabilities of intelligent agents from their external traits. ACM Trans Interact Intell Syst, 6, 28:1–28:25. https://doi.org/10.1145/2963106
  • Krämer, N. C., Eimler, S., von der Pütten, A., & Payr, S. (2011). Theory of companions: What can theoretical models contribute to applications and understanding of human-robot interaction? Applied Artificial Intelligence, 25, 474–502. https://doi.org/10.1080/08839514.2011.587153
  • Krämer, N. C., von der Pütten, A., & Eimler, S. (2012). Human-agent and human-robot interaction theory: Similarities to and differences from human-human interaction. In M. Zacarias & J. V. de Oliveira (Eds.), Human-computer interaction: The agency perspective (pp. 215–240). Springer.
  • Krämer, N. C., Rosenthal-von der Pütten, A. M., & Hoffmann, L. (2015). Social effects of virtual and robot companions. In The handbook of the psychology of communication technology (pp. 137–159). John Wiley & Sons, Ltd.
  • Kuechler, B., & Vaishnavi, V. (2008). On theory development in design science research: Anatomy of a research project. European Journal of Information Systems, 17(5), 489–504. https://doi.org/10.1057/ejis.2008.40
  • Kuechler, W., & Vaishnavi, V. (2012). A framework for theory development in design science research: Multiple perspectives. Journal of the Association for Information Systems, 13(6), 395–423. https://doi.org/10.17705/1jais.00300
  • L’Abbate, M., Thiel, U., & Kamps, T. (2005) Can proactive behavior turn chatterbots into conversational agents? In: IEEE/WIC/ACM international conference on intelligent agent technology. pp 173–179
  • Lankton, N., McKnight, D., & Tripp, J. (2015). Technology, humanness, and trust: Rethinking trust in technology. Journal of the Association for Information Systems, 16. https://doi.org/10.17705/1jais.00411.
  • Lee, S., & Choi, J. (2017). Enhancing user experience with conversational agent for movie recommendation. Int J Hum-Comput Stud, 103, 95–105. https://doi.org/10.1016/j.ijhcs.2017.02.005
  • Lee, S. K., Kavya, P., & Lasser, S. (2021). Social interactions and relationships with an intelligent virtual agent. International Journal of Human-Computer Studies, 102608. https://doi.org/10.1016/j.ijhcs.2021.102608
  • Li, Y., Tee, K. P., Ge, S. S., & Li, H. (2013). Building companionship through human-robot collaboration. In Social Robotics (pp. 1–7). Springer International Publishing.
  • Luger, E., & Sellen, A. (2016) Like having a really bad PA: The gulf between user expectation and experience of conversational agents. In: Proceedings of the 2016 CHI conference on human factors in computing systems. ACM, pp 5286–5297
  • Maedche, A., Morana, S., Schacht, S., Werth, D., & Krumeich, J. (2016). Advanced user assistance systems. Business & Information Systems Engineering, 58(5), 367–370. https://doi.org/10.1007/s12599-016-0444-2
  • March, S. T., & Smith, G. F. (1995). Design and natural science research on information technology. Decision Support Systems, 15(4), 251–266. https://doi.org/10.1016/0167-9236(94)00041-2
  • McCrae, R. R., & John, O. P. (1992). An introduction to the five-factor model and its applications. Journal of Personality, 60(2), 175–215. https://doi.org/10.1111/j.1467-6494.1992.tb00970.x
  • McTear, M., Callejas, Z., & Griol, D. (2016). The conversational interface. Springer.
  • McTear, M. F. (2017). The rise of the conversational interface: A new kid on the block? Lecture Notes in Computer Science, 10341, 38–49. https://doi.org/10.1007/978-3-319-69365-1_3
  • McTear, M. (2018) Conversational modelling for ChatBots: Current approaches and future directions. Technical report, Ulster University
  • Mendelson, M. J., & Aboud, F. (2012). McGill friendship questionnaire-respondent’s affection (MFQ-RA). Measurement Instrument Database for the Social Science.
  • Meth, H., Mueller, B., et al. (2015). Designing a requirement mining system. Journal of the Association for Information Systems, 16, 799–837.
  • Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. SAGE.
  • Möller, F., Guggenberger, T. M., & Otto, B. (2020). Towards a Method for Design Principle Development in Information Systems. In S. Hofmann, O. Müller, & M. Rossi (Eds.), Designing for digital transformation. Co-creating services with citizens and industry (pp. 208–220). Springer International Publishing.
  • Moon, Y. (2000). Intimate exchanges: Using computers to elicit self-disclosure from consumers. Journal of Consumer Research, 26(4), 323–339. https://doi.org/10.1086/209566
  • Morana, S., Friemel, C., Gnewuch, U., Maedche, A., & Pfeiffer, J. (2017). Interaktion mit smarten Systemen — Aktueller Stand und zukünftige Entwicklungen im Bereich der Nutzerassistenz. Wirtschaftsinformatik & Management, 9(5), 42–51. https://doi.org/10.1007/s35764-017-0101-7
  • Mori, M. (2012). The uncanny valley: The original essay by Masahiro Mori. In: IEEE Spectrum: Technology, Engineering, and Science News. Accessed 26 Jun 2019. https://spectrum.ieee.org/automaton/robotics/humanoids/the-uncanny-valley
  • Murphy, M. (2019) This app is trying to replicate you. Quartz. Accessed 25 Jan 2022. https://qz.com/1698337/replika-this-app-is-trying-to-replicate-you/
  • Nass, C., Steuer, J., & Tauber, E. R. (1994) Computers are social actors. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, pp 72–78
  • Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43(2), 223–239. https://doi.org/10.1006/ijhc.1995.1042
  • Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153
  • Nißen, M., Selimi, D., Janssen, A. et al. (2021). See you soon again, chatbot? A design taxonomy to characterize user-chatbot relationships with different time horizons. Computers in Human Behavior, 107043. https://doi.org/10.1016/j.chb.2021.107043
  • Park, E., Jin, D., & Del Pobil, A. P. (2012). The law of attraction in human-robot interaction. International Journal of Advanced Robotic Systems, 9, 35.
  • Pfeuffer, N., Adam, M., Toutaoui, J. et al. (2019a). Mr. and Mrs. Conversational agent - gender stereotyping in judge-advisor systems and the role of egocentric bias. ICIS 2019 proceedings
  • Pfeuffer, N., Benlian, A., Gimpel, H., & Hinz, O. (2019b). Anthropomorphic information systems. Bus Inf Syst Eng, 61, 523–533.
  • Porcheron, M., Fischer, J. E., Reeves, S., & Sharples, S. (2018). Voice interfaces in everyday life. In: Proceedings of the 2018 CHI conference on human factors in computing systems. association for computing machinery, New York, NY, USA, pp 1–12
  • Premack, D., & Woodruff, G. (1978). Does the chimpanzee have a theory of mind? Behavioral and Brain Sciences, 1, 515–526.
  • Purao, S., Chandra Kruse, L., & Maedche, A. (2020). The origins of design principles: Where do … they all come from?
  • Qiu, L., & Benbasat, I. (2009). Evaluating anthropomorphic product recommendation agents: A social relationship perspective to designing information systems. Journal of Management Information Systems, 25(4), 145–182. https://doi.org/10.2753/MIS0742-1222250405
  • Qiu, L., & Benbasat, I. (2010). A study of demographic embodiments of product recommendation agents in electronic commerce. International Journal of Human-Computer Studies, 68, 669–688. https://doi.org/10.1016/j.ijhcs.2010.05.005
  • Randrup, N., Druckenmiller, D., & Briggs, R. O. (2016) Philosophy of collaboration. In: System sciences (HICSS), 2016 49th Hawaii international conference on. IEEE, pp 898–907
  • Ransbotham, S., Gerbert, P., Reeves, M. et al. (2018). Artificial intelligence in business gets real. MIT Sloan Management Review, 60280.
  • Rawlins, W. (2017). Friendship matters. Routledge.
  • Reeves, S., Porcheron, M., & Fischer, J. (2018). “This is not what we wanted”: Designing for conversation with voice interfaces. Interactions, 26, 46–51. https://doi.org/10.1145/3296699
  • Rheu, M., Shin, J. Y., Peng, W., & Huh-Yoo, J. (2021). Systematic review: Trust-building factors and implications for conversational agent design. International Journal of Human–Computer Interaction, 37, 81–96. https://doi.org/10.1080/10447318.2020.1807710
  • Robert, L. P., Pierce, C., Marquis, L., Kim, S., & Alahmad, R. (2020). Designing fair AI for managing employees in organizations: A review, critique, and design agenda. Human–Computer Interaction, 35(5–6), 545–575. https://doi.org/10.1080/07370024.2020.1735391
  • Rook, K. S. (1987). Social support versus companionship: Effects on life stress, loneliness, and evaluations by others. Journal of Personality and Social Psychology, 52(6), 1132–1147. https://doi.org/10.1037/0022-3514.52.6.1132
  • Rotter, J. B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35, 1–7.
  • Rzepka, C., & Berger, B. (2018) User interaction with AI-enabled systems: A systematic review of IS research. ICIS 2018 Proceedings
  • Saffarizadeh, K., Boodraj, M., & Alashoor, T. M. (2017). Conversational assistants: Investigating privacy concerns, trust, and self-disclosure. In: ICIS 2017 Proceedings
  • Saldaña, J. (2015). The coding manual for qualitative researchers. SAGE.
  • Sandberg, J., & Alvesson, M. (2021). Meanings of theory: Clarifying theory through typification. Journal of Management Studies, 58, 487–516. https://doi.org/10.1111/joms.12587
  • Schroeder, J., & Schroeder, M. (2018). Trusting in machines: How mode of interaction affects willingness to share personal information with machines. In: Proceedings of the 51st Hawaii international conference on system sciences
  • Seeber, I., Bittner, E., Briggs, R. O. et al. (2019). Machines as teammates: A research agenda on AI in team collaboration. Information & Management, 103174. https://doi.org/10.1016/j.im.2019.103174
  • Seeger, A.-M., Pfeiffer, J., & Heinzl, A. (2018) Designing anthropomorphic conversational agents: Development and empirical evaluation of a design framework. In: ICIS 2018 Proceedings
  • Seymour, M., Riemer, K., & Kay, J. (2018). Actors, avatars and agents: Potentials and implications of natural face technology for the creation of realistic visual presence. Journal of the Association for Information Systems, 19.
  • Seymour, M., Yuan, L., Dennis, A., & Riemer, K. (2019). Crossing the uncanny valley? Understanding affinity, trustworthiness, and preference for more realistic virtual humans in immersive environments. In: HICSS 2019 Proceedings
  • Seymour, M., Yuan, L., Dennis, A., & Riemer, K. (2021). Have we crossed the uncanny valley? Understanding affinity, trustworthiness, and preference for realistic digital humans in immersive environments. Journal of the Association for Information Systems, 22. https://doi.org/10.17705/1jais.00674.
  • Shah, H., Warwick, K., Vallverdú, J., & Wu, D. (2016). Can machines talk? Comparison of Eliza with modern dialogue systems. Computers in Human Behavior, 58, 278–295. https://doi.org/10.1016/j.chb.2016.01.004
  • Shapiro, S. S., & Wilk, M. B. (1965). An analysis of variance test for normality (complete samples). Biometrika, 52, 591–611.
  • Shawar, B. A., & Atwell, E. (2007). Chatbots: Are they really useful? In: Ldv forum. pp 29–49
  • Siemon, D., Becker, F., Eckardt, L., & Robra-Bissantz, S. (2017). One for all and all for one - towards a framework for collaboration support systems. Education and Information Technologies. https://doi.org/10.1007/s10639-017-9651-9
  • Siemon, D., & Jusmann, S. (2021). Preferred appearance of embodied conversational agents in knowledge management. AMCIS 2021 Proceedings
  • Sin, J., & Munteanu, C. (2020). An empirically grounded sociotechnical perspective on designing virtual agents for older adults. Human–Computer Interaction, 35, 481–510. https://doi.org/10.1080/07370024.2020.1731690
  • Skjuve, M., Følstad, A., Fostervold, K. I., & Brandtzaeg, P. B. (2021). My chatbot companion - A study of human-chatbot relationships. International Journal of Human-Computer Studies, 149, 102601. https://doi.org/10.1016/j.ijhcs.2021.102601
  • Söllner, M., Hoffmann, A., Hoffmann, H., & Leimeister, J. M. (2012). Vertrauensunterstützung für sozio-technische ubiquitäre Systeme. Z Betriebswirtsch, 82(4), 109–140. https://doi.org/10.1007/s11573-012-0584-x
  • Söllner, M., Hoffmann, A., & Leimeister, J. M. (2016). Why different trust relationships matter for information systems users. European Journal of Information Systems, 25, 274–287. https://doi.org/10.1057/ejis.2015.17
  • Solove, D. J. (2004). The digital person: technology and privacy in the information age. NYU Press.
  • Sprecher, S. (1986). The relation between inequity and emotions in close relationships. Social Psychology Quarterly, 49, 309–321. https://doi.org/10.2307/2786770
  • Stever, G. S. (2017). Parasocial theory: Concepts and measures. In The international encyclopedia of media effects (pp. 1–12). Wiley.
  • Strohmann, T., Siemon, D., & Robra-Bissantz, S. (2019). Designing virtual in-vehicle assistants: Design guidelines for creating a convincing user experience. AIS Transactions on Human-Computer Interaction, 11, 54–78. https://doi.org/10.17705/1thci.00113
  • Sunyaev, A., Dehling, T., Taylor, P. L., & Mandl, K. D. (2015). Availability and quality of mobile health app privacy policies. Journal of the American Medical Informatics Association, 22(e1), e28–e33. https://doi.org/10.1136/amiajnl-2013-002605
  • Svennevig, J. (2000). Getting acquainted in conversation: A study of initial interactions. John Benjamins Publishing.
  • Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., DeCero, E., & Loggarakis, A. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research, 22(3), e16235. https://doi.org/10.2196/16235
  • Taddei, S., & Contena, B. (2013). Privacy, trust and control: Which relationships with online self-disclosure? Computers in Human Behavior, 29(3), 821–826. https://doi.org/10.1016/j.chb.2012.11.022
  • Tavanapour, N., Poser, M., & Bittner, E. A. C. (2019) Supporting the idea generation process in citizen participation - toward an interactive system with a conversational agent as facilitator. ECIS 2019 Proceedings 18
  • Thuan, N., Drechsler, A., & Antunes, P. (2019). Construction of design science research questions. Communications of the Association for Information Systems, 44. https://doi.org/10.17705/1CAIS.04420
  • Tsiourti, C. (2018). Artificial agents as social companions: Design guidelines for emotional interactions. University of Geneva.
  • Turilli, M., & Floridi, L. (2009). The ethics of information transparency. Ethics and Information Technology, 11(2), 105–112. https://doi.org/10.1007/s10676-009-9187-9
  • Turkle, S. (2010). In good company? In Close engagements with artificial companions (pp. 3–10). John Benjamins.
  • Vardoulakis, L. P., Ring, L., Barry, B. et al. (2012). Designing relational agents as long term social companions for older adults. Lecture Notes in Computer Science, 7502, 289–302. https://doi.org/10.1007/978-3-642-33197-8_30
  • Vaswani, A., Shazeer, N., Parmar, N. et al. (2017). Attention is all you need. In Advances in neural information processing systems. Curran Associates, Inc.
  • Venable, J., Pries-Heje, J., & Baskerville, R. (2016). FEDS: A framework for evaluation in design science research. European Journal of Information Systems, 25(1), 77–89. https://doi.org/10.1057/ejis.2014.36
  • Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. https://doi.org/10.2307/30036540
  • Venkatesh, V., Brown, S. A., & Bala, H. (2013). Bridging the qualitative-quantitative divide: guidelines for conducting mixed methods research in information systems. MIS Quarterly, 37(1), 21–54. https://doi.org/10.25300/MISQ/2013/37.1.02
  • Veruggio, G., Operto, F., & Bekey, G. (2016). Roboethics: Social and ethical implications. In Springer handbook of robotics (pp. 2135–2160). Springer.
  • Walls, J. G., Widmeyer, G. R., & El Sawy, O. A. (1992). Building an information system design theory for vigilant EIS. Information Systems Research, 3(1), 36–59. https://doi.org/10.1287/isre.3.1.36
  • Walster, E., Walster, G. W., & Berscheid, E. (1978). Equity: Theory and research. https://doi.org/10.1007/BF01542062
  • Wambsganss, T., Janson, A., Soellner, M., & Leimeister, J. M. (2021). AI-based argumentation tutoring – a novel system class to improve learners’ argumentation skills. Academy of Management Proceedings, 2021(1), 10233. https://doi.org/10.5465/AMBPP.2021.10233abstract
  • Webster, J., & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS Quarterly, xiii–xxiii.
  • Weizenbaum, J. (1966). ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), 36–45. https://doi.org/10.1145/365153.365168
  • Wilks, Y. (2005). Artificial companions. Lecture Notes in Computer Science, 36–45.
  • Winkle, K., Melsión, G. I., McMillan, D., & Leite, I. (2021) Boosting robot credibility and challenging gender norms in responding to abusive behaviour: a case for feminist robots. In: Companion of the 2021 ACM/IEEE international conference on human-robot interaction. Association for computing machinery, New York, NY, USA, pp 29–37
  • Woolf, B. P., Arroyo, I., & Muldner, K. (2010). The effect of motivational learning companions on low achieving students and students with disabilities. In V. Aleven, J. Kay, J. Mostow et al (Eds.), Intelligent tutoring systems (pp. 327–337). Springer.
  • Xiao, J., Stasko, J., & Catrambone, R. (2007) The role of choice and customization on users’ interaction with embodied conversational agents: Effects on perception and performance. In: Proceedings of the SIGCHI conference on human factors in computing systems. Association for computing machinery, New York, NY, USA, pp 1293–1302
  • Young, J. E., Hawkins, R., Sharlin, E., & Igarashi, T. (2008). Toward acceptable domestic robots: Applying insights from social psychology. International Journal of Social Robotics, 1(1), 95. https://doi.org/10.1007/s12369-008-0006-y
  • Yu, H., Shen, Z., Miao, C. et al. (2018). Building ethics into artificial intelligence. In: Proceedings of the 27th international joint conference on artificial intelligence
  • Zhou, L., Gao, J., Li, D., & Shum, H.-Y. (2020). The design and implementation of XiaoIce, an empathetic social chatbot. Computational Linguistics, 46(1), 53–93. https://doi.org/10.1162/coli_a_00368