Research Article

“Can you tell me about yourself?” The impacts of chatbot names and communication contexts on users’ willingness to self-disclose information in human-machine conversations
