Research Article

AI companions for lonely individuals and the role of social presence


ABSTRACT

Artificial intelligence (AI) companions (e.g., social machine agents, social robots) are becoming increasingly available. Considering that AI companions can benefit individuals seeking companionship or relationships, the social and relational aspects of an AI companion are important to investigate. To understand people’s perceptions of an AI companion, this study examines the roles of social presence and warmth of an AI companion through an online experiment. Primary findings indicate that the social presence of a disembodied AI companion fosters greater perceived usefulness of the AI companion and willingness to recommend the AI companion for lonely individuals. Collectively, the study highlights the importance of social presence for disembodied AI companions.

Artificial intelligence (AI) agents vary in embodiment, the physical bodily presence of an agent (Lee, Jung, Kim, & Kim, 2006), and embodiment influences how individuals respond to AI. For example, embodied robots are more likely to prompt social interactions from a user (Fong, Nourbakhsh, & Dautenhahn, 2003) and induce more positive effects on user acceptance (Hoffmann, Bock, & Rosenthal-von der Pütten, 2018) than disembodied agents. Although technological features of embodiment can effectively facilitate social interactions with humans (Fong et al., 2003), it is unclear whether this finding holds for AI designed for social purposes, such as AI companions. Since AI companions can benefit individuals seeking companionship (Odekerken-Schröder, Mele, Russo-Spena, Mahr, & Ruggiero, 2020), humans’ social perceptions of an AI companion may be particularly important for creating a socially meaningful interaction. To address the importance of social and relational perceptions toward an AI companion, this study examines social presence and warmth of an AI companion, which are important in building companionship and relationships (Andersen & Guerrero, 1998; Han, Min, & Lee, 2016). Specifically, this study examines whether perceived social presence and warmth of an AI companion interact with the effect that embodiment of an AI companion has on perceptions of the AI companion.

Social presence and warmth

Social presence is conceptualized as “a psychological state in which virtual social actors are experienced as actual social actors” (Lee, 2006, p. 45). This definition is aligned with the idea of social richness, which is primarily concerned with sociable, warm, personal, or intimate perceptions of the other (Lombard & Ditton, 1997). Thus, social presence is not limited to being aware of the other agent; rather, it is a perception of socially and psychologically being involved in the interaction (Biocca, Harms, & Burgoon, 2003). Research highlights the importance of social presence (Oh, Bailenson, & Welch, 2018), particularly in humans’ perceptions about machine agents (e.g., Spence, Westerman, Edwards, & Edwards, 2014). For example, Shin and Choo (2011) note that social presence intensifies the effects of perceived usefulness of robot interactions on favorable attitudes toward the robot.

Warmth, or social warmth, is the feeling of intimacy and friendliness (Fiske, Cuddy, & Glick, 2007). Warmth is a core component of what constitutes human essence (Haslam, Bain, Douge, Lee, & Bastian, 2005), which is important for relationship development and maintenance (Andersen & Guerrero, 1998). People report perceiving warmth when interacting with machine agents (Eyssel & Kuchenbrandt, 2012), and warmth influences humans’ preferences regarding how robots should behave in human-robot interactions (Scheunemann, Cuijpers, & Salge, 2020). Interestingly, people rate humans and humanlike robots with similar degrees of warmth, particularly when robots model human behaviors (Chung-En, 2018). This finding might be because people attribute warmth to robots based on their appearance (Mieczkowski, Liu, Hancock, & Reeves, 2019). Collectively, these findings imply that greater perceived warmth of machine agents could positively influence how humans perceive them (Oliveira, Arriaga, Correia, & Paiva, 2019).

The present study

The present study investigates how social presence and warmth of an AI companion play a role in conjunction with the embodiment of an AI agent. In particular, acknowledging the potential for AI to serve a social or relational role (Odekerken-Schröder et al., 2020), this study examines how people perceive the use of an AI companion for lonely individuals.

Research demonstrates that social presence and warmth can maximize benefits for lonely individuals. Greater perceived social presence of others can intensify lonely individuals’ positive media experiences (Kim, Song, & Lee, 2018) and close relationship perceptions of others connected via technology (Kim, Kim, & Yang, 2019). Further, social warmth can help address loneliness, or social coldness (Bargh & Shalev, 2012). Although these findings are from human-to-human communication contexts, the computers are social actors (CASA) paradigm (Reeves & Nass, 1996) suggests a similar pattern would be observed in human-to-machine communication contexts. According to CASA, people unknowingly apply social scripts present in human-to-human interactions when interacting with computers. This is explained by humans failing to focus on the asocial characteristics of a computer (Nass & Moon, 2000). Thus, individuals are likely to interact with computers, or machines broadly, in a similar manner to how they interact with other people.

Loneliness is a psychological state where individuals feel lost, distressed, and isolated from others (Fromm-Reichmann, 1959). Lonely individuals have the desire to build and maintain social relationships, but they tend to evaluate social interactions negatively (Bellucci, 2020). Because of this tendency, lonely individuals may not actively seek ways to address these issues on their own. Thus, their social networks can consider providing informational support, which provides advice or information (Mitchell & Trickett, 1980), such as suggesting the use of an AI companion. Survey statistics indicate that 92% of individuals trust recommendations from their friends, with 74% of those considering word-of-mouth more influential than advertising (Marinova, 2021). Given that lonely individuals are susceptible to prosocial suggestions from those in their life (Wang, Zhu, & Shiv, 2012), providing informational support might be an effective strategy. Thus, it is necessary to understand how individuals perceive an AI companion for lonely individuals and their willingness to recommend an AI companion to those who need it.

Taken together, this study investigates the role of social presence and warmth of an AI companion in conjunction with the effect that the embodiment of an AI companion has on perceptions of the AI companion for lonely individuals. Considering the lack of empirical evidence, we propose the following research question.

RQ1a-b: How do social presence and warmth interact with the embodiment of an AI companion on (a) perceived usefulness of the AI companion for lonely individuals and (b) willingness to recommend the AI companion to lonely individuals?

Method

An online experiment was conducted using a two-group comparison with a between-subjects design. Participants were randomly assigned to either the embodied condition (n = 46) or the disembodied condition (n = 60). Cell sizes were unbalanced due to the data cleaning processes.

Materials

The study developed stimuli that included two edited video clips from Season 2, Episode 1 of Black Mirror. The edited storyline featured Martha, who is presented as lonely and seeking companionship. Martha’s concerned friend enrolls her in a program to address her loneliness. This program introduces Martha to an AI companion named Ash. In the edited clips, Martha interacts with Ash, who is presented as either a disembodied or embodied AI.

Each clip was approximately five and a half minutes long. The first three minutes of the clips were identical, starting with an introduction to Martha as a lonely individual and her first interaction with Ash, the AI companion, through a text-based chatroom with a photo of Ash. This scene was the participants’ first introduction to Ash. The clips then diverged into different scenes contingent on the condition. In the disembodied condition, Martha interacted with Ash, presented as a voice-based AI companion, through a mobile phone. During these interactions, participants could only hear Ash’s voice without any additional visual cues of Ash. In the embodied condition, Martha interacted with Ash, presented as a hyperrealistic, humanlike AI companion, in a face-to-face context. Participants were able to see and hear Ash. Storyboards for each condition are available at https://osf.io/62anc/?view_only=15a8606d8fca48f5a86592a6cc4650fe.

Sample

Initially, 242 undergraduate students responded to the study. A series of data cleaning processes was conducted to ensure the quality of the data. First, because the study recruitment message was distributed to multiple classes at one institution, some students completed the study more than once; 76 responses were identified as repeat attempts, and these data were removed. Second, 41 individuals indicated that they had previously seen the Black Mirror television series, which was used as stimuli for this study. To avoid any potential bias from previous exposure, the data from these 41 individuals were removed. Third, two participants failed the attention check, which asked “Please click ‘2’ in order to move to the next page” in the middle of the study; their data were removed. Finally, a manipulation check was used to ensure that participants correctly identified which type of AI was present in the clip they watched (disembodied AI or embodied AI). Six individuals in the disembodied condition and 11 in the embodied condition failed to provide a correct response; their responses were removed.

After the data cleaning processes, the final sample included 106 participants. The sample primarily consisted of females (n = 76; 71.7%) and the average age was 24.83 years (SD = 7.43). Participants identified as White/Caucasian (n = 65; 61.3%), Latino/a/x or Hispanic (n = 23; 21.7%), Black/African American (n = 12; 11.3%), and other racial/ethnic groups (n = 5; 4.7%).

Procedure

Participants were recruited from a large university in the United States. Following IRB approval, a recruitment message was distributed. The message explained that the goal of the study was to understand students’ perceptions about technology. To avoid any potential biases, the study description was presented broadly. After acknowledging the informed consent document, participants watched a video clip and answered questions about the clip. A timer was used to prevent participants from proceeding without watching the clip in full. Participation remained voluntary, and all participants received extra credit. Confidentiality was guaranteed.

Measures

Before the stimulus, preexisting attitudes toward new technologies (α = .82) were measured with three items adopted from Nass, Lombard, Henriksen, and Steuer (1995) (e.g., “How comfortable would you be with new technologies (e.g., robots, AI) taking personal roles (e.g., colleagues, bosses)?”). Responses were obtained on a 6-point scale (1 = Very uncomfortable, 6 = Very comfortable).

After viewing the stimulus, participants provided their perceptions of Ash, the AI companion. Perceived usefulness for lonely individuals (α = .95) was measured with four items modified from Davis (1989) (e.g., “AI/robots like Ash would be useful to lonely individuals”). Research using the original measure finds that it is positively associated with self-reported usage and user acceptance of a technology (Davis, 1989). Willingness to recommend to lonely individuals (α = .96) was measured with four items modified from Choi and Ji (2015) (e.g., “If an AI/robot like Ash in the clip is given to me, I might recommend it to lonely individuals”). The original measure focuses on behavioral intentions and is associated with usefulness and ease of use of technology, which are key to technology acceptance (Davis, 1989). Responses for usefulness and willingness to recommend were obtained on a 7-point Likert-type scale (1 = Strongly disagree; 7 = Strongly agree).

Warmth (α = .87) was evaluated with five items modified from Fiske et al. (2007). Participants rated Ash on various traits (e.g., “Polite,” “Sensitive”) on a 7-point scale (1 = Not at all; 7 = Very). The original measure of warmth is related to mind attributions of social robots, the view that entities other than humans can hold mental capabilities (Eyssel & Kuchenbrandt, 2012). Social presence (α = .91) was measured with eight items (e.g., “Unsociable – Sociable,” “Remote – Immediate”) adopted from Lombard, Ditton, and Weinstein (2009). Participants indicated how they felt about Ash on a 7-point semantic differential scale.
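As a concrete illustration of the internal-consistency coefficients reported above (e.g., α = .95, α = .91), the following sketch computes Cronbach’s alpha from a matrix of item responses. The data and function name are hypothetical, not drawn from the study’s dataset:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point responses: five participants, four scale items
scores = np.array([
    [7, 6, 7, 6],
    [5, 5, 6, 5],
    [2, 3, 2, 3],
    [6, 7, 6, 7],
    [3, 2, 3, 2],
])
print(round(cronbach_alpha(scores), 2))  # high alpha: items move together
```

Because every item rises and falls with the others across these hypothetical respondents, alpha is close to 1; uncorrelated items would drive it toward 0.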

Results

Given that AI companions are advanced technologies, participants’ attitudes toward new technologies might influence their responses to the clip. Thus, preexisting attitudes toward new technologies were controlled for in all analyses. Further, we conducted confirmatory factor analyses (CFAs) on three measures to ensure validity. Specifically, CFAs were performed on perceived usefulness for lonely individuals, willingness to recommend to lonely individuals, and warmth, because these measures were slightly modified to fit the study’s context. According to Hu and Bentler (1999), the recommended acceptable fit indices call for a non-significant chi-square test, CFI > .95, SRMR < .08, and RMSEA < .06. Although some indices do not fully meet the suggested cutoffs, the measures are high in construct validity, as the original measures are associated with relevant constructs in the literature, as described in the Measures section. The results of the CFAs are available at https://osf.io/62anc/?view_only=15a8606d8fca48f5a86592a6cc4650fe.
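For reference, the RMSEA and CFI cutoffs cited from Hu and Bentler (1999) can be computed directly from a model’s chi-square statistics using standard formulas. The sketch below uses hypothetical fit values, not the study’s actual CFA output:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation from the model chi-square."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m: float, df_m: int, chi2_b: float, df_b: int) -> float:
    """Comparative fit index: target model vs. baseline (independence) model."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b

# Hypothetical model: chi-square = 2.5 on df = 2 with N = 106;
# hypothetical baseline: chi-square = 400 on df = 6
print(round(rmsea(2.5, 2, 106), 3))      # below the .06 cutoff
print(round(cfi(2.5, 2, 400, 6), 3))     # above the .95 cutoff
```

A chi-square close to its degrees of freedom yields RMSEA near zero, while CFI compares the model’s misfit against the worst-case independence model.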

Prior to conducting the analysis for the proposed RQs, a series of ANCOVAs examined the main effects of embodiment on the participants’ perceptions. Results indicated no significant effects on perceived usefulness for lonely individuals [F(1, 103) = 0.58, p > .05, ηp2 = .006] and willingness to recommend to lonely individuals [F(1, 103) = 0.66, p > .05, ηp2 = .006].

RQ1a-b examined how social presence and warmth interact with the embodiment of an AI companion on perceptions of the AI companion. The PROCESS macro (Hayes, 2017) was used with a double-moderator model (Model 2). The procedure used 5,000 bootstrap samples, and the results were interpreted based on 95% confidence intervals (CIs). The experimental condition was dummy coded (0 = embodied, 1 = disembodied).
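A double-moderator specification such as PROCESS Model 2 is, at its core, a regression with two interaction terms: outcome regressed on the focal predictor (condition), both moderators, the two condition × moderator products, and any covariates. The sketch below fits that specification by ordinary least squares on simulated, mean-centered data; all values are hypothetical, and PROCESS itself additionally supplies bootstrap CIs and conditional-effect probes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated variables; names mirror the study's constructs but no values
# come from the actual dataset.
cond = rng.integers(0, 2, n).astype(float)  # 0 = embodied, 1 = disembodied
presence = rng.normal(0.0, 1.0, n)          # social presence (mean-centered)
warmth = rng.normal(0.0, 1.0, n)            # warmth (mean-centered)
attitude = rng.normal(0.0, 1.0, n)          # covariate: tech attitudes

# Outcome built with a known 0.5 condition-by-presence interaction
y = (1.0 + 0.2 * cond + 0.3 * presence + 0.1 * warmth
     + 0.5 * cond * presence
     + 0.2 * attitude + rng.normal(0.0, 0.3, n))

# Model 2 design matrix: intercept, X, M1, M2, X*M1, X*M2, covariate
X = np.column_stack([
    np.ones(n), cond, presence, warmth,
    cond * presence, cond * warmth, attitude,
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))  # the X*M1 coefficient recovers ~0.5
```

With the 0/1 dummy coding used in the study, a significant X × M1 coefficient means the effect of social presence on the outcome differs between the embodied and disembodied conditions.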

Regarding perceived usefulness for lonely individuals (RQ1a), the interaction between embodiment and social presence was significant (b = .54, t(99) = 2.26, p < .05). Specifically, when the AI companion was disembodied, greater social presence of the AI companion led people to perceive that the AI companion would be useful for lonely individuals. The interaction with warmth was not significant (b = −.34, t(99) = −1.42, p > .05).

Regarding willingness to recommend to lonely individuals (RQ1b), the interaction between embodiment and social presence was significant (b = .59, t(99) = 2.02, p < .05). Specifically, when the AI companion was disembodied, greater social presence of the AI companion led people to recommend the AI companion to lonely individuals. The interaction with warmth was not significant (b = −.09, t(99) = −0.30, p > .05).

Discussion

This study highlights the importance of social presence when considering the use of a disembodied AI companion for lonely individuals. Specifically, the study finds that for a disembodied AI companion, social presence fosters greater perceived usefulness of the AI companion and willingness to recommend it to lonely individuals. Considering that lonely individuals tend to seek more connections in mediated environments (Song et al., 2014), these social aspects may be particularly influential when considering a disembodied AI companion. One potential explanation is people’s tendency to idealize or hyperpersonalize the other based on available cues in a mediated environment (Walther, 1996). In the disembodied condition, participants may have imagined the AI companion more positively when only hearing its voice. Given the lack of physical presence of the AI, participants may have paid more attention to the available social cues to process the interaction. Collectively, these factors may have fostered positive perceptions of a disembodied AI companion. However, social presence did not play a role in the embodied condition. Although more research is necessary to better understand this relationship, the presence of a humanlike, physically embodied AI may have affected how participants evaluated the utility of the AI companion.

Unexpectedly, warmth did not significantly impact the relationships in the current study. This finding is surprising because people often perceive some degree of warmth when engaging with machine agents (Eyssel & Kuchenbrandt, 2012), and warmth is an important factor that influences perceptions of machine agents (Scheunemann et al., 2020). Perhaps warmth is important in relation to other variables, such as mind perceptions (Eyssel & Kuchenbrandt, 2012), but not when considering the utility of an AI companion for lonely individuals.

Overall, this study extends research on social presence and warmth. Although some scholars suggest social presence and warmth function in a similar manner (Song, Hollenbeck, & Zinkhan, 2008), the current study suggests otherwise. As evidenced in this study, social presence fosters positive perceptions of the utility of a disembodied AI companion (e.g., usefulness), but warmth is not a contributing factor.

Although this study reveals interesting findings, several limitations should be acknowledged. First, the stimuli were edited from existing clips. While the content of the first three minutes of the clips in each condition was identical, the interactions in the remaining content varied. Although the primary content of both clips focused on building a friendship, the differences in video content could have affected participants’ responses. Future research should consider creating content that varies only in the degree of embodiment between conditions. Second, participants were indirectly exposed to the AI companion by means of a fictional video clip rather than a direct interaction. Further, the indirect interaction occurred through a medium, as participants watched a video of the interaction rather than observing it in a face-to-face setting. Although indirect interactions can be useful, there are potential differences between direct and indirect interactions (e.g., Bainbridge, Hart, Kim, & Scassellati, 2011) and regarding how the indirect interaction occurs (e.g., virtually vs. face-to-face). As these technologies become more commercially accessible to the public, future research should address these issues. Finally, some of the CFA models did not achieve adequate fit. Thus, future researchers should be cautious in using these exact measures when testing similar relationships.

Conclusion

Overall, the study’s findings highlight the importance of the social aspects of a disembodied AI companion. Specifically, the social presence of a disembodied AI companion leads individuals to perceive the AI companion as useful for lonely individuals and increases their willingness to recommend it to lonely individuals. Based on these findings, future research should explore other aspects to better understand how individuals perceive AI companions.

Disclosure statement

No competing financial interests exist.

Additional information

Notes on contributors

Kelly Merrill

Kelly Merrill Jr. (M.A., University of Central Florida) is a doctoral candidate in the School of Communication at The Ohio State University in the U.S. His primary research interests are at the intersection of health communication and communication technology. In particular, he is interested in health disparities and the use of communication technologies for physical, psychological, and social well-being.

Jihyun Kim

Jihyun Kim (Ph.D., University of Wisconsin-Milwaukee) is an Associate Professor in the Nicholson School of Communication and Media at the University of Central Florida. Her primary research focuses on the effects and implications of new media/communication technologies for meaningful outcomes. Her research also examines human-machine communication in diverse contexts, with a particular interest in the role of machine agents (e.g., AI, robots) and how humans perceive them.

Chad Collins

Chad Collins (M.A., University of Central Florida) is a communication instructor and writer based in Florida. He currently teaches at St. Johns River State College and is a teaching assistant at Arizona State University, while also serving as the Claremont Lincoln University writing coach.

References