
Emotion recognition of faces and emoji in individuals with moderate-severe traumatic brain injury

Pages 596-610 | Received 26 May 2022, Accepted 13 Feb 2023, Published online: 27 Feb 2023

ABSTRACT

Background

Facial emotion recognition deficits are common after moderate-severe traumatic brain injury (TBI) and linked to poor social outcomes. We examine whether emotion recognition deficits extend to facial expressions depicted by emoji.

Methods

Fifty-one individuals with moderate-severe TBI (25 female) and fifty-one neurotypical peers (26 female) viewed photos of human faces and emoji. Participants selected the best-fitting label from a set of basic emotions (anger, disgust, fear, sadness, neutral, surprise, happy) or social emotions (embarrassed, remorseful, anxious, neutral, flirting, confident, proud).

Results

We analyzed the likelihood of correctly labeling an emotion by group (neurotypical, TBI), stimulus condition (basic faces, basic emoji, social emoji), sex (female, male), and their interactions. Participants with TBI did not significantly differ from neurotypical peers in overall emotion labeling accuracy. Both groups had poorer labeling accuracy for emoji compared to faces. Participants with TBI (but not neurotypical peers) had poorer accuracy for labeling social emotions depicted by emoji compared to basic emotions depicted by emoji. There were no effects of participant sex.

Discussion

Because emotion representation is more ambiguous in emoji than human faces, studying emoji use and perception in TBI is an important consideration for understanding functional communication and social participation after brain injury.

Introduction

Social cognition deficits are common after moderate-severe traumatic brain injury (TBI). They can include deficits in emotion recognition, theory of mind, cognitive empathy, and pragmatic inference (Citation1) and have the potential to negatively affect social functioning in everyday communication settings. In particular, deficits in emotion recognition in people with TBI have been linked to poorer social integration outcomes (Citation2) and reduced communication competence (Citation3). Studies of emotion recognition in TBI have provided important insights into the difficulties people with TBI encounter in face-to-face communication. However, not all communication is face-to-face. Increasingly, people engage in social communication via written media such as text messages, e-mail, and posts on social media. People with TBI communicate using a variety of platforms including social networking sites, instant messaging, dating apps, gaming, and e-mail (Citation4) and report that participation in social media serves important social functions such as connecting with others, sharing TBI stories, advocacy, and creating community (Citation4–6). Although people with TBI use social media in ways that are similar to those of their uninjured peers, a subset of participants with TBI report a change in usage following their brain injury (Citation7) as well as significant cognitive and technological barriers in their use of computer-mediated communication (CMC) more generally (Citation4,Citation8). Although CMC contexts lack in-person facial cues, written text is often accompanied by emoji – stylized graphical icons depicting facial expressions as well as other imageable ideas such as hand gestures, weather, food and drink, animals, plants, and activities. In the first investigation of emoji perception after brain injury, we examine whether the documented deficits in facial emotion recognition in TBI extend to recognition of emotions depicted by emoji, providing insights into the communicative competence of people with TBI and potential barriers to social participation in CMC contexts.

Emotion recognition of faces in TBI

Across cultures, people recognize at least six basic human emotions: anger, fear, sadness, disgust, happiness, and surprise (Citation9–11). The universality of emotion perception has been established in cross-cultural studies in which participants match facial expressions to a closed set of emotion labels; however, cultural variation has been detected when using an open response format in which participants freely generated emotion words without cues (Citation12). There is evidence that TBI can disrupt recognition of basic emotions; a meta-analysis of 13 studies found that between 13% and 39% of people with moderate-severe TBI have deficits in facial emotion recognition of static basic emotions, performing on average 1.1 standard deviations below neurotypical peers (Citation13). Facial emotion recognition deficits have been linked to social communication difficulties in TBI (Citation3,Citation14) and a reduction in perceived communication competence by close others (Citation15). However, people with TBI are not uniformly impaired at facial emotion recognition. Research into the risk factors associated with poor emotion recognition after brain injury suggests that injury severity (Citation16–18); location of lesion to gray and/or white matter (Citation19,Citation20); neuropsychological profile (Citation21,Citation22); and age (Citation22,Citation23) may contribute to emotion recognition impairments. In addition, biological sex may play a protective role against emotion recognition deficits (Citation24): females with TBI performed significantly better than males with TBI in a dynamic facial recognition task, and males (but not females) with TBI performed significantly worse than neurotypical peers in dynamic facial emotion recognition (Citation22). Participant age and sex have also been linked to emotion recognition abilities in neurotypical people, with younger participants outperforming older participants and females outperforming males (Citation25–29).

Emotion recognition of emoji

Although decades of research on facial emotion recognition have established the importance of facial expression for communication, research into the recognition of graphical representations of emotion such as emoji is in its early stages. In CMC contexts, emoji provide a nonverbal cue that parallels the role of facial expression in face-to-face communication. Although many emoji are devised to depict facial expressions, the extent to which emoji map onto specific emotions is unclear. Indeed, descriptions of a given emoji often span multiple emotions (Citation30,Citation31), suggesting that their interpretation is flexible. For example, Jaeger et al. (Citation31) categorized participants’ free-text emoji labels into 44 semantic categories and found that although some emoji had strong associations with a particular emotion (e.g., 99% of responses for the Grinning-face emoji were coded as “happy”), other emoji generated a variety of semantic category responses (e.g., 57% of responses for the Confused-face emoji were coded as “sad” whereas 5–28% of its responses were coded as categories such as “confused,” “disappointed,” “neutral,” “depressed,” or “nervous”). One additional challenge to interpreting emoji is the different representations of emoji across platforms. Although emoji have a standard Unicode identifier, each communication platform (e.g., iOS, Android, Facebook, Twitter) has its own style of emoji icons. As a result, a sender using one platform (e.g., iOS) may select a specific emoji that reflects their mood, but the receiver, using a different platform (e.g., Android), sees a different representation. Indeed, although potential for multiple interpretations exists within a platform, inter-platform differences in emoji depiction represent an even greater risk for miscommunication (Citation32,Citation33). Franco and Fugate (Citation34) asked participants to select up to three emoji that most accurately represented a variety of emotions (anger, calm, contempt, disgust, envy, fear, happy, love, sad, and surprise) and to rate the strength of the emoji-emotion relationship. The researchers compared responses within and across three platforms (iOS, Android, and Samsung) and found limited shared agreement on which emoji best represented these emotions, with the exception of the Angry-face emoji that was widely agreed to depict the emotion anger.

Given the lack of clear emoji-emotion relationships, Cherbonnier and Michinov (Citation35) created a novel set of emoji specifically designed to represent the six basic emotions previously identified by Ekman (Citation10) (anger, sad, disgust, surprise, happy, fear). Across three studies, the researchers compared emotion recognition for these novel emoji with Facebook emoji, iOS emoji, and human faces. For each stimulus item, participants selected the best-fitting emotion label from a bank of 14 emotion words (the six basic emotions and eight filler emotions). The novel emoji had superior emotion recognition compared to Facebook emoji, iOS emoji, and human faces. The authors attributed the improved recognition accuracy of the novel emoji to the disgust emotion, which was depicted with a tongue sticking out and had significantly higher recognition rates and perceived intensity than the disgust emotions depicted by the Facebook and iOS emoji. Indeed, current emoji seem to lack a clear parallel for the facial expression disgust; using a corpus of 150 most popular emoji on Twitter (both face and non-face emoji), Shoeb and de Melo (Citation36) asked participants to rate the degree of association of each emoji with eight emotions (anger, anticipation, disgust, fear, joy, sadness, surprise, and trust). They found that few emoji were strongly associated with the basic emotions, except for joy, which had many associations to both face and object emoji. In particular, disgust was not strongly associated with any emoji, although it was moderately associated with a few (e.g., Confounded-face ). The Confounded-face emoji was also found to be most frequently associated with disgust in Franco and Fugate (Citation34). Examining all basic emotions besides disgust, Fischer and Herbert (Citation37) compared the emotionality (i.e., how intensely a stimulus represents an emotion) of human faces and emoji and found no difference in emotionality between the two stimulus types, indicating that they represent associated emotions similarly. However, there were some differences for specific emotions. Fear and neutral were rated as having higher emotionality for faces than emoji, whereas anger and sadness had higher emotionality for emoji than faces. Although these studies provide an emerging picture of the relationship between emoji and emotion perception in neurotypical individuals, perception of emoji by people with TBI has not been examined.

Demographic variables that influence emoji perception

Demographic variables such as sex and age may influence emoji perception. Some studies report that females use emoji more frequently than males (Citation38–40), but males use a larger variety of types (Citation41). However, there is mixed evidence for whether sex differences exist in emoji perception. Jones and colleagues (Citation40) found that compared to males, females rated emoji depicting negative emotions as having significantly lower valence, and in another study, females rated emoji as being more familiar, clear, and meaningful than males did (Citation42). In contrast, males and females did not differ in their types or frequency of responses when selecting emotion labels associated with commonly-used emoji (Citation43) or in their interpretation of emoji functions (Citation39). Thus, although males and females may differ in their patterns of emoji use or familiarity, they may perceive semantic meanings associated with emoji similarly.

In a study of age differences in emoji use, older participants reported using emoji less frequently, had less positive attitudes toward emoji, and identified fewer motives for using them than younger participants (Citation38). Regarding emoji perception, older participants were more likely than younger participants to respond “I don’t know” when asked to interpret emoji functions and interpreted emoji more literally (Citation39). However, others (Citation43) found no influence of age on emoji interpretation. These studies (Citation38,Citation39,Citation43) treated age as a categorical variable, leaving an open question as to the age-related effect size when age is treated as a continuous variable. Given documented sex and age differences in emotion recognition in faces (Citation22,Citation23,Citation25–28), continued work is needed to understand the demographic factors that affect emotion recognition in emoji.

Study aims

The current study had four aims:

  1. Replicate findings of reduced facial emotion recognition accuracy of static basic human emotions in a novel comparably-sized sample of individuals with moderate-severe TBI.

In a sample of 53 people with moderate-severe TBI, Rigon and colleagues (Citation22) found that participants with TBI had significantly reduced accuracy labeling basic emotions depicted by static human faces than their neurotypical peers. Further, female participants showed higher facial recognition accuracy of static faces depicting basic emotions than males. There was no significant sex-by-group interaction for static facial recognition tasks. Given the hallmark heterogeneity of TBI (Citation44), we set out to test whether these findings hold in a novel comparably-sized and sex-balanced sample of people with moderate-severe TBI, recruited from a different testing site. Although not a direct replication due to stimulus presentation differences (in-person versus Zoom), we predicted that like the participants in Rigon et al, this new sample of participants with TBI would also demonstrate poorer facial emotion recognition than neurotypical peers and that males would demonstrate lower emotion label accuracy than females.

  2. Compare emotion recognition accuracy of emotions depicted by emoji and human faces.

Given that some emoji can represent multiple emotions (Citation30,Citation31,Citation34), we predicted that both participants with and without TBI would have reduced emotion labeling accuracy for emoji relative to human faces. We hypothesized that facial emotion recognition deficits in moderate-severe TBI would extend to the emoji modality, and thus, predicted a group-by-stimulus condition interaction where the TBI group would demonstrate a greater reduction in emotion labeling accuracy of emoji than human faces relative to the neurotypical group.

  3. Compare emotion recognition accuracy for emoji depicting basic versus social emotions.

To date, studies examining emoji-emotion relationships have focused on the six basic emotions initially identified by Ekman and colleagues (Citation34–37). There are over 100 facial expression emoji represented by unique Unicode identifiers and depicted across platforms (https://unicode.org/emoji/charts/full-emoji-list.html). Given that senders frequently use facial emoji depicting a variety of emotions and moods, we compared emotion recognition for emoji depicting the six basic emotions (anger, disgust, fear, sadness, surprise, happy) as well as six social emotions (embarrassed, remorseful, anxious, flirting, confident, proud). Social emotions differ from basic emotions in that they require a social context, involving interactions with other people and inferences about others’ mental states (Citation45–49). For example, feelings of embarrassment require a social context where a person becomes self-conscious about another’s real or imagined negative perceptions. We predicted that both groups would have reduced emotion labeling accuracy for emoji depicting social emotions relative to basic emotions. However, given that people with TBI commonly have deficits in social cognition (Citation1), we predicted a group-by-stimulus condition interaction where the TBI group would demonstrate a greater reduction in emotion labeling accuracy of social emotions relative to basic emotions than the neurotypical group.

  4. Compare confidence ratings of emotion labels by stimulus type and group.

If emoji represent emotion more ambiguously than human faces, we would expect all participants to be less confident in matching emotion labels to emoji than human faces. Moderate-severe TBI can result in anosognosia (Citation50), a lack of deficit awareness, and those with reduced self-awareness tend to perform lower on emotion recognition tests (Citation51,Citation52). Thus, although we predicted that participants with TBI would have reduced emotion labeling accuracy relative to neurotypical peers, we did not predict a parallel decrease in emotion labeling confidence ratings in the TBI group.

Methods

Participants

Participants were 51 individuals with moderate-severe TBI (25 female) and 51 neurotypical peers (26 female). TBI and neurotypical groups were matched on sex (X2 [1, N = 102] = 0.04, p = 0.84), age (MTBI = 38.53 years, SDTBI = 10.60 years; MNeurotypical = 37.51 years, SDNeurotypical = 10.73; t = −0.48, p = 0.63, d = 0.10), and education (MTBI = 14.84 years, SDTBI = 2.44 years; MNeurotypical = 15.45 years, SDNeurotypical = 2.19; t = 1.32, p = 0.19, d = 0.26).

Participants with TBI sustained their injuries in adulthood, were in the chronic phase of injury (at least 6 months since onset; mean time post injury = 77.64 months, SD = 78.24 months), and were out of post-traumatic amnesia, exhibiting stable neuropsychological profiles. All participants with TBI met inclusion criteria for moderate-severe TBI, determined using the Mayo Classification System (Citation53), along with health history information collected through medical records and intake interviews. Participants were classified as moderate-severe if at least one of the following criteria was met: (1) Glasgow Coma Scale (GCS) <13 within 24 hours of acute care admission (i.e., moderate or severe according to the GCS), (2) positive neuroimaging findings (acute CT findings or lesions visible on a chronic MRI), (3) loss of consciousness (LOC) >30 minutes, or (4) post-traumatic amnesia (PTA) >24 hours. GCS was available for 41 participants (Median = 12, ranging from 3 to 15); LOC information was available for 40 participants; PTA information was available for 41 participants; acute imaging information was available for 36 participants (35 with positive findings). Causes of injury were motor vehicle accidents (n = 21), falls (n = 9), motorcycle or snowmobile accidents (n = 5), being hit by a car as a pedestrian (n = 3), non-motorized vehicle accidents (n = 6), assault (n = 2), being hit by a moving object (n = 2), or other (e.g., hang gliding accident, fell out of moving car; n = 3). See Table 1 for demographic and injury characteristics of the TBI sample.

Table 1. Demographic and injury information for participants with TBI.

Stimuli

Basic emotion faces

Participants completed the Karolinska Directed Emotional Faces (KDEF) static face recognition test (Citation54) of seven facial expressions, including the six basic emotions and neutral expressions. We used the same 28 facial stimuli (4 different actors per emotion) as in Rigon, Turkstra, Mutlu, and Duff (Citation22). The static facial recognition task was chosen over the dynamic task to serve as a more direct comparison with the emoji stimuli, which are also static images. Participants were presented one colored photo at a time with seven response labels to the right of the photo (fearful, angry, disgusted, happy, neutral, sad, surprised). Response labels were presented in the same order on each trial. Participants selected the response that best described the person’s feeling.

Basic emotion emoji

In the same format as above, participants chose the best-matching emotion label for 28 emoji stimuli depicting basic emotions: anger (Unicode 1f620), disgusted (Unicode 1f616), fearful (Unicode 1f628), happy (Unicode 1f642), sad (Unicode 1f641), surprised (Unicode 1f632), and neutral (Unicode 1f610). In addition to referencing the emotion disgust in its Emojipedia description, the Confounded-face emoji (Unicode 1f616) chosen here has also been associated with and used to represent disgust in previous studies (Citation34–36). As with the facial recognition task, four versions of each emoji emotion were shown, using renderings from the Apple (iOS 14.2), Samsung (Samsung One UI 2.5), Google (Google Android 11.0), and Facebook (Facebook 4.0) platforms.

Social emotion emoji

In the final condition, participants chose the best-matching label for emoji depicting social emotions. A selection of potential social emotions was identified from Hareli and Parkinson’s (Citation47) review of 40 different emotion labels that have been used consistently to describe social emotions in the literature. However, not all of these labels had a clear emoji counterpart. To select the emoji stimuli, we searched keywords for each of the social emotion labels on the Emojipedia website (https://emojipedia.org), identifying 22/40 labels with corresponding emoji. After excluding emotions that included emoji that were used for the basic emotions task and collapsing emotions that shared emoji (e.g., shame and embarrassment were represented by the same emoji), we narrowed the field down to 11 unique emoji representing social emotions. We selected a final set of six social emotion emoji that did not overlap in their Emojipedia emotion keyword descriptions: flirting (Unicode 1f60f), confident (Unicode 1f60e), embarrassed (Unicode 1f633), proud (Unicode 1f601), remorseful (Unicode 1f61e), and anxious (Unicode 1f630). Consistent with the basic emotion conditions, neutral (Unicode 1f610) was also included, and depictions from Apple, Samsung, Google, and Facebook platforms were used. All emoji in this final set explicitly mentioned the social emotion label they were chosen to represent in their Emojipedia description. See Figure 1 for example stimuli used for each of the three conditions described above.

Figure 1. Example stimuli for each of the three emotion recognition conditions. In the face condition, each emotion was depicted by four different actors. In the emoji conditions, each emotion was depicted by four different platforms, with iOS 14.2 renderings pictured here.

Procedure

Study procedures and planned analyses were preregistered on Open Science Framework (https://osf.io/c8kyx) prior to data collection.Footnote1 Participants completed all study tasks remotely using Gorilla Experiment Builder (Citation55; www.gorilla.sc). All participants completed experimental conditions in the following order: (1) basic emotions depicted by faces, (2) basic emotions depicted by emoji, (3) a filler task in which participants labeled basic emotions depicted by faces wearing surgical masks as part of a separate preregistered study, and (4) social emotions depicted by emoji. Presentation of all stimuli within each task was randomized for each participant. All participants completed the study using a laptop or desktop computer. Participants completed the task remotely from their home while supervised on a Zoom conference call with an experimenter who viewed the participant’s screen and gave verbal instructions prior to the task. For each condition, participants were instructed to click on the word that best described what the person/emoji was feeling. Between each condition, participants rated how confident they were at correctly identifying the emotions on a scale of 1 (Not confident at all) to 5 (Extremely confident).

Analysis

Primary analyses

We used mixed-effect regression to predict the likelihood of correctly labeling an emotion as a function of group (Neurotypical, TBI), condition (basic emotion faces, basic emotion emoji, social emotion emoji), sex (female, male), and their interactions using the glmer() function of the lme4 package (Citation56) in R (R Core Team, 2021) with a logit link function for binary data. Participant group was dummy coded such that the neurotypical group served as the reference. Condition was Helmert contrast coded: Condition contrast 1 compared the likelihood of correctly labeling items in the basic emotion faces condition (2/3) to the average of the basic emotion emoji condition (−1/3) and social emotion emoji condition (−1/3), generating a comparison between face stimuli and emoji stimuli. Condition contrast 2 compared the likelihood of correctly labeling emotions in the basic emotion emoji condition (1/2) to the social emotion emoji condition (−1/2), generating a comparison of emotion type within emoji stimuli. Sex was dummy coded such that female participants served as the reference. We initially attempted to fit the model using a maximal random-effects structure (Citation57) with a random intercept for participant and random by-participant slopes for the effect of stimulus condition. This model failed to converge, so the final model included only a random intercept for participant. To interpret significant coefficients for logit-linked binomial regressions, we used odds ratios. We provide descriptive statistics to compare group differences for individual emotions. To compare participants’ confidence in their emotion labels in each stimulus condition, we used ordered regression to predict confidence ratings (Not confident at all, Slightly confident, Somewhat confident, Fairly confident, Extremely confident) as a function of group (Neurotypical, TBI), condition (basic emotion faces, basic emotion emoji, social emotion emoji), and their interactions using the polr() function of the MASS package. Coding for group and condition was identical to that described above.
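
The sketch below illustrates one way this model specification could be set up in R; it is a minimal sketch, not the authors' analysis script, and the data frame and column names (d, conf, correct, group, condition, sex, participant, rating) are hypothetical placeholders.

```r
library(lme4)
library(MASS)

# Helmert-style contrasts for condition: faces vs. emoji, then basic vs. social emoji
d$condition <- factor(d$condition,
                      levels = c("basic_faces", "basic_emoji", "social_emoji"))
contrasts(d$condition) <- cbind(faces_vs_emoji  = c(2/3, -1/3, -1/3),
                                basic_vs_social = c(0, 1/2, -1/2))

# Dummy coding with neurotypical and female as reference levels
d$group <- relevel(factor(d$group), ref = "neurotypical")
d$sex   <- relevel(factor(d$sex), ref = "female")

# Logistic mixed-effects model; random slopes were dropped after convergence failure,
# leaving only a by-participant random intercept
m_accuracy <- glmer(correct ~ group * condition * sex + (1 | participant),
                    data = d, family = binomial(link = "logit"))
summary(m_accuracy)

# Ordered regression on confidence ratings (one rating per participant per condition)
conf$rating <- ordered(conf$rating,
                       levels = c("Not confident at all", "Slightly confident",
                                  "Somewhat confident", "Fairly confident",
                                  "Extremely confident"))
m_conf <- polr(rating ~ group * condition, data = conf, Hess = TRUE)
summary(m_conf)
```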

Exploratory analyses

Note, in our preregistration (https://osf.io/enjd8/) we had originally planned to include emotion valence (positive, negative) as a fixed effect in the mixed-effect model. However, we removed this variable from our primary analysis for the following reasons: (1) to reduce number of statistical tests, thereby decreasing likelihood of type 1 errors; (2) emotions are not always easily categorized into a positive/negative binary. For example, although surprise is often categorized as positive, it does not have a clear valence connotation (Citation58), and the Neutral-face emoji can be associated with negative valence (Citation59), possibly reflecting the fact that although neutral facial expressions are common, neutral emoji are rare in CMC (Citation37; 3) for basic emotions, there are more negative than positive emotions. This unbalanced distribution may make it more difficult to distinguish negative emotions relative to positive emotions, contributing to valence effects (Citation60,Citation61; 4) valence effects may be confounded with difficulty (Citation61); and (5) significant effects of valence can be driven by a single emotion (Citation22). To explore valence effects in the current study, we conducted two-sample tests for equality of proportions using the prop.test() function in R to compare whether the proportion or correct label responses differed for positive and negative emotions in each of the three conditions. In additional exploratory analyses, we examine the effect of age on emotion label accuracy and intercorrelations between emotion recognition performance in each of the three conditions using Pearson’s correlation coefficients. The results for these exploratory analyses are provided in supplementary materials.
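
A brief sketch of these exploratory tests is shown below; it assumes the hypothetical trial-level data frame d from the earlier sketch (with an added, hypothetical valence column) and a hypothetical wide data frame scores with one row per participant.

```r
# Two-sample test for equality of proportions: correct labels for positive vs.
# negative emotions within a single condition
faces <- subset(d, condition == "basic_faces")
prop.test(x = tapply(faces$correct, faces$valence, sum),
          n = tapply(faces$correct, faces$valence, length))

# Pearson correlations between per-participant accuracies across conditions and with age
cor.test(scores$acc_faces, scores$acc_basic_emoji)
cor.test(scores$acc_faces, scores$acc_social_emoji)
cor.test(scores$age, scores$acc_faces)
```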

Results

Effect of stimulus condition on emotion recognition accuracy

First, we examined the effect of participant group and stimulus condition on emotion label accuracy (Figure 2). For labeling basic emotions depicted by faces, neurotypical participants correctly labeled an average of 26.33 items (SD = 1.87), and participants with TBI correctly labeled an average of 26.00 items (SD = 1.60). For labeling basic emotions depicted by emoji, neurotypical participants correctly labeled 22.37 items, on average (SD = 2.11), and participants with TBI correctly labeled 22.65 items, on average (SD = 2.36). For labeling social emotions depicted by emoji, neurotypical participants correctly labeled 21.86 items on average (SD = 4.01), and participants with TBI correctly labeled 19.04 items on average (SD = 4.28).

Figure 2. Proportion of correct emotion labels for each of the three stimulus conditions by group. Points indicate mean performance of individual participants. Bars represent standard error of the mean.

Results of the analysis revealed there was no main effect of group (β = −0.10, z = −0.68, p = 0.50); participants with and without TBI did not significantly differ in their likelihood of correctly labeling emotions. There was a significant effect of condition contrast 1 (faces versus emoji; β = 1.45, z = 8.82, p < 0.001); neurotypical participants were 4.25 times more likely to correctly label emotions depicted by faces than emoji (OR = 4.25, 95% CI [3.08, 5.87]). The effect of condition contrast 2 (basic emoji versus social emoji) was not significant for neurotypical participants (β = 0.19, z = 1.48, p = 0.14); neurotypical participants did not significantly differ in their likelihood of correctly labeling basic emotions compared to social emotions depicted by emoji. There was no two-way interaction between group and condition contrast 1 (β = 0.09, z = 0.37, p = 0.71), indicating that both participants with and without TBI had better emotion recognition for faces than emoji. There was a significant two-way interaction between group and condition contrast 2 (β = 0.45, z = 2.49, p = 0.01). To probe this interaction, we re-ran the model, setting the TBI group as the reference level; this analysis revealed a significant effect of condition contrast 2 for the TBI group (β = 0.64, z = 5.00, p < 0.001). Participants with TBI were 1.90 times more likely to correctly label basic emotions than social emotions depicted by emoji (OR = 1.90, 95% CI [1.48, 2.44]).
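
The reported odds ratios are the exponentiated logit coefficients; for instance, exp(1.45) ≈ 4.26, which matches the reported OR of 4.25 once the unrounded coefficient is used. Assuming the fitted model object from the sketch in the Analysis section, they could be recovered as follows.

```r
exp(1.45)                                   # ~4.26; faces vs. emoji odds ratio
exp(fixef(m_accuracy))                      # odds ratios for all fixed effects
exp(confint(m_accuracy, method = "Wald"))   # Wald 95% CIs on the odds-ratio scale
```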

Effect of sex on emotion recognition accuracy

Next, we examined the effect of participant sex on emotion label accuracy. There was no main effect of sex (β = 0.15, z = 0.96, p = 0.34); neurotypical male and female participants did not significantly differ in their likelihood of correctly labeling emotions. There was no two-way interaction between participant group and sex (β = −0.24, z = −1.14, p = 0.25), indicating that there was no effect of sex in the TBI group either. In addition, sex did not interact with condition contrast 1 (β = 0.03, z = −0.13, p = 0.89) or condition contrast 2 (β = −0.16, z = −0.87, p = 0.39). Three-way interactions between group-by-sex-by-condition contrast 1 (β = −0.10, z = −0.30, p = 0.76) and group-by-sex-by-condition contrast 2 (β = 0.30, z = 1.15, p = 0.25) were also both non-significant. Thus, participant sex did not play a significant role in emotion recognition in this sample. Previous work identifying sex differences in static facial emotion recognition reported that this difference was driven by females demonstrating better recognition of fearful faces than male participants (Citation22). We provide a breakdown of emotion label accuracy by sex, group, and stimulus condition for each individual emotion type in Supplementary Materials A. In the current study, neurotypical females and neurotypical males correctly labeled fearful faces 69% and 77% of the time, respectively, and female participants with TBI and male participants with TBI correctly labeled fearful faces 72% and 61% of the time, respectively.

Differences in emotion recognition accuracy by emotion type

Next, we examined emotion labeling accuracy for each emotion type by stimulus condition to understand whether certain emotions were driving the effects reported above. Although emotions depicted by faces had higher label accuracy than those depicted by emoji, the face emotion fearful was most commonly mislabeled (Figure 3). Faces depicting fearful were correctly labeled 73% of the time by neurotypical participants and 66% of the time by participants with TBI. To explore which emotions fearful faces were most commonly mislabeled as, we generated a confusion matrix (Table 2). The neurotypical group labeled fearful faces as disgusted 12% of the time. The TBI group labeled fearful faces as disgusted 14% of the time and as surprised 11% of the time.

Table 2. Confusion Matrix Demonstrating Proportion of Basic Emotions Depicted by Faces Correctly and Incorrectly Labeled by Neurotypical Participants and Participants with TBI.

Figure 3. Proportion of correct labels for each emotion type for each stimulus condition and participant group.

Basic emotions depicted by emoji had lower labeling accuracy than the same basic emotions depicted by faces. This difference was driven by two emoji emotions in both the neurotypical and TBI group: disgusted and fearful (Figure 3). Emoji depicting disgusted were correctly labeled 39% of the time by neurotypical participants and 34% of the time by participants with TBI. Emoji depicting disgusted were most commonly mislabeled as angry or sad by both neurotypical and TBI groups (Table 3). Emoji depicting fearful were correctly labeled 42% of the time by neurotypical participants and 48% of the time by participants with TBI. Emoji depicting fearful were most commonly mislabeled as surprised by both neurotypical and TBI groups.

Table 3. Confusion Matrix Demonstrating Proportion of Basic Emotions Depicted by Emoji Correctly and Incorrectly Labeled by Neurotypical Participants and Participants with TBI.

Emoji depicting social emotions had reduced labeling accuracy across all emotions (Figure 3), with the least accurate emotion being anxious for both groups. Emoji depicting anxious were correctly labeled 56% of the time by neurotypical participants and 43% of the time by participants with TBI. Emoji depicting anxious were most commonly mislabeled as remorseful for both neurotypical and TBI groups (Table 4). Although social emotions depicted by emoji often generated mislabels, the intended emotions were reflected in the majority response for both groups.

Table 4. Confusion Matrix Demonstrating Proportion of Social Emotions Depicted by Emoji Correctly and Incorrectly Labeled by Neurotypical Participants and Participants with TBI.

Emotion labeling confidence ratings

Next, we examined whether there were group or condition differences in participants’ confidence in their emotion labels using ordered regression. Participants rated confidence on a 5-point scale (1 = Not confident at all, 5 = Extremely confident). There was no main effect of group (β = −0.15, t = −0.56, p = 0.57); participants with and without TBI did not statistically differ on confidence ratings. There was a significant effect of condition contrast 1 (faces versus emoji; β = 1.32, t = 2.81, p = 0.005); neurotypical participants were less confident rating emotions depicted by emoji than faces. Compared to emoji, face stimuli were associated with a 1.32 times increased likelihood of being rated at a higher confidence level (OR = 1.32, 95% CI [0.45, 2.31]). The effect of condition contrast 2 (basic emoji versus social emoji) was not significant for neurotypical participants (β = 0.62, t = 1.61, p = 0.11); neurotypical participants did not significantly differ in their confidence ratings for basic emotions compared to social emotions depicted by emoji. There was no two-way interaction between group and condition contrast 1 (β = 0.11, t = 0.17, p = 0.87), indicating that participants in both groups were more likely to give higher confidence ratings for faces than emoji. There was no significant interaction between group and condition contrast 2 (β = 0.36, t = 0.66, p = 0.51), indicating a lack of difference in confidence ratings for basic and social emotion emoji in the TBI group as well.

Exploratory analyses

Following the primary analysis, we conducted three exploratory analyses, reporting estimated effect sizes and descriptive statistics. First, given the variability in performance in both participant groups, we asked if participants who performed poorly in emotion recognition of faces also had lower emotion recognition scores for emoji. We found a small positive relationship between label accuracy in the face condition and both the emoji conditions (r[100] = 0.25). Second, we examined whether emotion valence was related to label accuracy and found that participants had better label accuracy for positive than negative emotions in the basic emotion faces (Mpos = 0.99, Mneg = 0.90), basic emotion emoji (Mpos = 0.98, Mneg = 0.68), and social emotion emoji conditions (Mpos = 0.71, Mneg = 0.67). Finally, we examined the relationship between emotion label accuracy and participant age for each of the three conditions. The relationship between age and emotion recognition was inconsistent among conditions, ranging from none (r[100] = −0.13) to a small negative relationship (r[100] = −0.24). Full results for these exploratory analyses are presented in Supplementary Materials B. In addition, we have made raw data available for others wishing to conduct additional exploratory analyses (https://osf.io/enjd8/).

Comparison to prior facial emotion recognition findings in TBI

In contrast to our prediction, we did not find reduced emotion labeling accuracy for participants with TBI relative to neurotypical participants or for male participants relative to female participants. These findings are inconsistent with prior work by Rigon and colleagues (Citation22), who used the same basic emotion face stimuli. Both studies had comparable sample sizes (51 participants with TBI in the current study compared to 53 participants with TBI in Rigon et al) and used the same inclusion criteria for diagnosis of moderate-severe TBI. Here, we explore other differences between these two samples that may have contributed to different results.

For labeling basic emotions depicted by faces, neurotypical participants in Rigon et al (Citation22) had an average emotion recognition score of 26.18 (SD = 1.98) and neurotypical participants in the current study had an average emotion recognition score of 26.33 (SD = 1.87). Participants with TBI in the current study performed in line with neurotypical participants, with an average recognition score of 26.00 (SD = 1.60) compared to 24.45 (SD = 3.01) in Rigon et al (Citation22). A two-tailed t-test revealed that participants with TBI in the current study had significantly better facial emotion recognition of basic emotions than the TBI participants in Rigon et al (Citation22), t(102) = 3.26, p = 0.002, d = 0.64.

Participants with TBI in both samples were in the chronic stage of their injury (greater than 6 months post onset) and received a classification of moderate-severe TBI based on the Mayo classification system (Citation53). However, participants with TBI in Rigon et al (Citation22) had a significantly lower mean injury chronicity (M = 14.94 months, SD = 16.33) than participants in the current study (M = 77.64 months, SD = 78.24 months), t(102) = 5.71, p < 0.001, d = 1.11. In exploring other demographic variables between the two samples, participants with TBI in the current study (M = 38.53 years, SD = 10.60) were significantly younger than participants with TBI in Rigon et al (Citation22; M = 44.14 years, SD = 14.08), t(102) = 2.29, p = 0.02, d = 0.45. The two samples did not differ in level of education: participants with TBI in the current study had an average education of 14.84 years (SD = 2.44) compared to 14.81 years (SD = 2.15) in Rigon et al (Citation22), t(102) = 0.07, p = 0.95. Similarly, a chi-square test of independence showed no significant difference in the distribution of participant sex between the two samples of participants with TBI, X2(1, N = 104) = 0.04, p = 0.85.
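
Because only summary statistics were available from Rigon et al (Citation22), these between-sample comparisons can be reproduced from means, standard deviations, and group sizes with a pooled-variance t-test. The helper function below is a sketch (not the authors' code) and reproduces the face-recognition comparison of t(102) = 3.26, d = 0.64.

```r
# Pooled two-sample t-test and Cohen's d computed from summary statistics
t_from_summary <- function(m1, s1, n1, m2, s2, n2) {
  sp2 <- ((n1 - 1) * s1^2 + (n2 - 1) * s2^2) / (n1 + n2 - 2)  # pooled variance
  t   <- (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))
  df  <- n1 + n2 - 2
  c(t = t, df = df, p = 2 * pt(-abs(t), df), d = (m1 - m2) / sqrt(sp2))
}

# Current TBI sample vs. Rigon et al. TBI sample, basic emotions depicted by faces
t_from_summary(26.00, 1.60, 51, 24.45, 3.01, 53)
```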

Discussion

The current study represents a first investigation of whether emotion recognition of emoji parallels deficits in emotion recognition of faces in individuals with moderate-severe TBI. In a controlled design, we examined the ability of participants with and without TBI to match photos of faces and emoji to a closed set of basic or social emotion labels. We found no significant difference in emotion recognition in moderate-severe TBI compared to neurotypical peers. However, participants in both groups had lower accuracy labeling emotions depicted by emoji than emotions depicted by faces. Further, participants with TBI, but not neurotypical participants, had reduced accuracy recognizing social compared to basic emotions depicted by emoji. For basic emotions depicted by faces, participants in both groups had most difficulty recognizing fear, whereas for basic emotions depicted by emoji, participants had most difficulty recognizing fear and disgust. Social emotions depicted by emoji had reduced accuracy overall, with the lowest accuracy for emoji depicting anxious. There was no effect of sex on emotion recognition accuracy. Participants in both groups rated their confidence in their emotion labels higher for emotions depicted by faces than by emoji.

Emotion recognition of faces in TBI

The first aim of this study was to replicate results of Rigon and colleagues (Citation22), who found that participants with TBI had lower facial emotion recognition accuracy of static basic emotions than neurotypical participants and that males had lower facial emotion recognition accuracy than females. We used the same set of 28 stimuli of static human faces depicting basic emotions and compared labeling accuracy in a comparable sample size of participants with and without TBI. In contrast to the prior study, we did not find a deficit in emotion recognition for participants with TBI, and male participants did not have lower emotion recognition scores than females. Although these two studies used the same stimuli with the same stimulus layout, task administration differed between studies and thus, was not a direct replication. Participants in Rigon et al (Citation22) completed the task in-person in the physical presence of an experimenter, on a single computer, whereas participants in the current study completed the task remotely on their own personal computers in the virtual presence of an experimenter, while sharing their screens via Zoom. Average scores for both groups in the present study were highly similar to those of neurotypical participants in Rigon et al (Citation22), which suggests that administration method does not explain the difference in findings.

People with TBI are not uniformly impaired at emotion recognition. In their meta-analysis, Babbage and colleagues (Citation13) estimated that 13–39% of people with moderate-severe TBI have deficits in emotion recognition. Therefore, it is possible that the lack of significant difference in overall emotion recognition between the TBI and neurotypical groups in the current study reflects sampling differences across the heterogeneous TBI population (see Citation43 for a discussion of risks of sample bias in TBI). Although participants with TBI in both samples were in the chronic stage of their injury (greater than six months since onset) and received a classification of moderate-severe TBI based on the Mayo classification system (Citation53), there were some differences in sample demographics. Participants with TBI in the current study had a longer average time post-injury. Although it is possible that longer time post-injury could be linked to recovered emotion recognition ability, there is evidence that emotion recognition deficits in adults with TBI are stable over time (Citation16). Further, people with moderate-severe TBI generally show no change or even a deterioration in functional levels between 5 and 10 years after brain injury (Citation62). This suggests that increased chronicity may not account for fewer errors in the TBI group in the current study.

Another potential explanation for the better performance by current participants than those in Rigon et al (Citation22) is the difference in age: participants in the latter study were significantly older than participants in the current study. Although we found no correlation between age and emotion labeling accuracy of faces in the current sample, other studies using the dynamic Emotion Recognition Test (Citation27), which uses gradually morphing facial expressions, have found a decline in facial emotion recognition with increasing age (Citation27–29). We also did not replicate the sex differences reported by Rigon et al (Citation22). Differences in that study were small, so the lack of a difference in our study might reflect sample size. Studies of sex differences in social functioning have produced inconsistent results in adults with and without TBI (Citation24). Therefore, future studies are needed to determine whether such differences truly exist, are idiosyncratic, or are not reliably captured by the stimuli used. Studies have also rarely included gender, and the intersection of sex, gender, and TBI also merits future study.

Emotion recognition of emoji in TBI

A second aim of this study was to examine whether evidence of lower emotion recognition of faces in TBI extends to difficulties recognizing emotions in the emoji modality. Participants in both groups had lower recognition accuracy for emoji than faces, but there was no significant difference between groups. In the basic emotion emoji condition, errors were primarily in recognizing fear and disgust. In studies of face emotion recognition, fear and surprise are commonly confused, sharing common early signals in dynamic facial expression (e.g., upper lip raising and jaw dropping) (Citation63). Similarly, both disgust and anger are characterized early in facial expression by nose wrinkling (Citation63), and therefore, both facial and emoji stimuli may lead to more recognition errors when depicting less intense representations of these pairs of emotions. Studies of emoji emotionality indicate that current emoji lexicons lack a strong representation of disgust (Citation35,Citation36). Confusion matrices indicated that participants with and without TBI often selected different labels than the ones intended when labeling emotions depicted by emoji and rated their confidence in their emotion labels as lower compared to human faces. This is consistent with prior work demonstrating that a single emoji can represent multiple emotions (Citation30,Citation31). Although facial expressions depicted by emoji are simplified relative to human faces, they may not fall into discrete emotion categories. Indeed, it may be considered a strength of emoji that they represent emotion ambiguously, providing a modality that flexibly and creatively combines with written text to create rich communication contexts.

The final aim of the study was to compare emotion recognition for emoji depicting basic and social emotions. There was no difference in accuracy for basic versus social emotions in neurotypical adults, whereas participants with TBI had significantly lower scores for recognizing social compared to basic emotions. There was no parallel decrease in confidence ratings for labeling of social emotions compared to basic emotions in the TBI group, possibly reflecting a lack of deficit awareness. Social emotions are elicited in social contexts and are dependent on interactions with others and inferences about their mental states (Citation47–49), a context that is often challenging for people with TBI. For example, people with TBI can have difficulty inferring the feelings, intentions, and perspectives of others (Citation1,Citation14). These findings provide new insights into how social cognitive deficits in TBI might manifest as barriers in CMC contexts. In particular, people with TBI may be at increased risk for miscommunication when receiving messages with emoji depicting a range of moods and emotions to convey the sender’s mental state.

Future directions

In the current study, we examined how people with TBI perceived emotion depicted by emoji in isolation, but in CMC contexts, emoji are received in a social context. Thus, an important next direction is to examine how emoji impact message interpretation and the functions of emoji for communication in this population. Emoji serve a variety of communicative functions, including establishing emotional tone; lightening the mood; reducing ambiguity of discourse; softening or strengthening a message; expressing humor; promoting interaction; and maintaining and enhancing social relations (Citation39,Citation64–66). It is unknown if people with TBI use and perceive the communicative functions of emoji in context in similar ways.

Emoji can also change the interpretation of a message, influencing how receivers perceive the sender’s mental state, such as senders shifting the interpretation of neutral messages (e.g., “Hey, please call me”) in a negative or positive direction by adding an upset or happy emoji, respectively (Citation67). In addition, emoji can facilitate the interpretation of non-literal language. When undergraduate students were asked to interpret messages containing indirect meanings (e.g., a person is asked about their grade and responds, “Chemistry is a difficult course”), they were better and quicker at recognizing the indirect meaning when the response contained an emoji (Citation68). Emoji also improved interpretation of sarcastic messages for both younger and older adults (Citation69). Thus, emoji play a communicative role in social interaction. It is an open question whether social cognitive deficits in TBI (e.g., theory of mind, irony, sarcasm) are influenced by emoji. There is evidence from one study that participants with TBI benefit from nonverbal cues from gesture to interpret indirect messages (Citation70). Understanding the extent to which emoji can facilitate the ability of people with TBI to interpret non-literal communication would also provide new insights into the potential of adding emoji to written messages to support social communication in this population.

This study was concerned with how people with TBI perceived emoji, but it is another open question how they use emoji. In addition to serving several communicative functions, emoji play an important role in social participation and even follow certain usage rules. For example, emoji are used more often with friends than strangers and more often in positive than negative contexts (Citation66), as well as being deemed more appropriate in some settings (e.g., text messaging and social media) than others (e.g., e-mail) (Citation65). In fact, in work-related contexts, sending smiley emoji actually led to perceptions of reduced competence and effectiveness (Citation71,Citation72), highlighting the context-dependence of emoji use. Communicating effectively and appropriately in written contexts is an important skill for both social and vocational reintegration after TBI.

Finally, the current study investigated how individuals with TBI perceive emoji through a focused emotion-labeling task; whether any diminished ability to correctly label emoji translates to social perception challenges remains an open question. CMC theories suggest that, when nonverbal cues are unavailable or diminished, communicators adapt what cues they use and how they use these cues for effective communication (Citation73). It is possible that individuals with TBI look for and effectively utilize other social cues to supplement their interpretation of emoji that communicate social emotions. Little is known about the extent to which and ways in which individuals with TBI adapt to communication environments with diminished cues or environments where they experience deficiencies in social perception. Furthermore, emoji use may be construed as a communication convention rather than an expression of emotion, and communicators may use emoji as part of conventions followed by their group (e.g., using the Grinning-face emoji in all their in-group communications to express general affinity), regardless of their ability to correctly interpret them in a reliable fashion. Studies have shown strong cultural differences in emoji use (Citation74), which suggests that group-based differences may result from differences in cultural and communicative conventions. Naturalistic and cross-cultural studies can inform our understanding of whether or not, and the extent to which, emoji are used as communication conventions.

Conclusion

In this study, participants with TBI did not differ from neurotypical peers in their emotion recognition accuracy. Both participants with and without TBI had reduced accuracy labeling emotions depicted by emoji compared to faces, and participants with TBI had even poorer accuracy when labeling social emotions compared to basic emotions depicted by emoji. Although many emoji depict facial expressions, they are much simpler than human faces, and a single emoji may reflect multiple emotions. Thus, emoji represent a flexible modality of communication, and their interpretation depends on the surrounding context. Emoji are a pervasive component of computer-mediated communication, so studying emoji perception and use after brain injury may provide continued insights into communication barriers or supports in this population. Studying how people with TBI use and integrate information from multiple cues in both face-to-face and computer-mediated communication contexts will advance understanding of functional communication and social participation in TBI.


Acknowledgments

This work was supported by NIDCD grant R01 HD071089 to M.C.D., L.T., & B.M.

Disclosure statement

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/02699052.2023.2181401

Additional information

Funding

This work was supported by the National Institute on Deafness and Other Communication Disorders [R01 NIH HD071089].

Notes

1. This preregistration also includes procedures for data collection for a survey task in which participants with TBI generated free-text labels and provided valence and arousal ratings for frequently used emoji, as well as answered questions about their motives and attitudes toward emoji use. This survey was always completed before the current study so that the emotion labels used in the current study did not influence their freely generated perceptions of emoji emotions. The survey shares a common aim of understanding perception of emotions depicted by emoji in TBI, but its results are reported separately.

References

  • McDonald S, Impairments in social cognition following severe traumatic brain injury. J Int Neuropsychol Soc. 2013;19(3):231–46. doi:10.1017/S1355617712001506.
  • Knox L, Douglas J, Long-term ability to interpret facial expression after traumatic brain injury and its relation to social integration. Brain Cogn. 2009;69(2):442–49. doi:10.1016/j.bandc.2008.09.009.
  • Rigon A, Turkstra LS, Mutlu B, Duff MC, Facial-affect recognition deficit as a predictor of different aspects of social-communication impairment in traumatic brain injury. Neuropsychology. 2018;32(4):476–83. doi:10.1037/neu0000368
  • Brunner M, Palmer S, Togher L, Hemsley B, ‘I kind of figured it out’: the views and experiences of people with traumatic brain injury (TBI) in using social media—self-determination for participation and inclusion online. Int J Lang Commun Disord. 2019;54(2):221–33. doi:10.1111/1460-6984.12405.
  • Brunner M, Palmer S, Togher L, Dann S, Hemsley B, “If I knew what I was doing on Twitter then I would use it more”: twitter experiences and networks of people with traumatic brain injury (TBI). Brain Impair. 2020;21(1):1–18. doi:10.1017/BrImp.2019.12.
  • Brunner M, Hemsley B, Dann S, Togher L, Palmer S, Hashtag #TBI: a content and network data analysis of tweets about Traumatic Brain Injury. Brain Inj. 2018;32(1):49–63. doi:10.1080/02699052.2017.1403047.
  • Morrow EL, Zhao F, Turkstra L, Toma C, Mutlu B, Duff MC, Computer-mediated communication in adults with and without moderate-to-severe traumatic brain injury: survey of social media use. JMIR Rehabil Assist Technol. 2021;8(3):e26586. doi:10.2196/26586.
  • Brunner M, Hemsley B, Palmer S, Dann S, Togher L, Review of the literature on the use of social media by people with traumatic brain injury (TBI). Disabil Rehabil. 2015;37(17):1511–21. doi:10.3109/09638288.2015.1045992.
  • Ekman P, Facial expressions of emotion: new findings, new questions. Psychol Sci. 1992;3(1):34–38. doi:10.1111/j.1467-9280.1992.tb00253.x.
  • Ekman P, Are there basic emotions? Psychol Rev. 1992;99(3):550–53. doi:10.1037/0033-295X.99.3.550.
  • Ekman P, Friesen WV, Constants across cultures in the face and emotion. J Pers Soc Psychol. 1971;17(2):124–29. doi:10.1037/h0030377.
  • Gendron M, Roberson D, van der Vyver JM, Barrett LF, Perceptions of emotion from facial expressions are not culturally universal: evidence from a remote culture. Emotion. 2014;14(2):251–62. doi:10.1037/a0036052.
  • Babbage DR, Yim J, Zupan B, Neumann D, Tomita MR, Willer B, Meta-analysis of facial affect recognition difficulties after traumatic brain injury. Neuropsychology. 2011;25(3):277–85. doi:10.1037/a0021908.
  • McDonald S, Flanagan S, Social perception deficits after traumatic brain injury: interaction between emotion recognition, mentalizing ability, and social communication. Neuropsychology. 2004;18(3):572–79. doi:10.1037/0894-4105.18.3.572.
  • Watts A, Douglas J, Interpreting facial expression and communication competence following severe traumatic brain injury. Aphasiology. 2006;20(8):707–22. doi:10.1080/02687030500489953.
  • Ietswaart M, Milders M, Crawford JR, Currie D, Scott CL, Longitudinal aspects of emotion recognition in patients with traumatic brain injury. Neuropsychologia. 2008;46(1):148–59. doi:10.1016/j.neuropsychologia.2007.08.002.
  • Rosenberg H, Dethier M, Kessels RP, Westbrook RF, McDonald S, Emotion perception after moderate-severe traumatic brain injury: the valence effect and the role of working memory, processing speed, and nonverbal reasoning. Neuropsychology. 2015;29(4):509. doi:10.1037/neu0000171.
  • Murphy JM, Bennett JM, de la Piedad Garcia X, Willis ML. Emotion recognition and traumatic brain injury: a systematic review and meta-analysis. Neuropsychol Rev. 2022;520–36. doi:10.1007/s11065-021-09510-7.
  • Adolphs R, Damasio H, Tranel D, Cooper G, Damasio AR, A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J Neurosci. 2000;20(7):2683–90. doi:10.1523/JNEUROSCI.20-07-02683.2000.
  • Rigon A, Voss MW, Turkstra LS, Mutlu B, Duff MC, Relationship between individual differences in functional connectivity and facial-emotion recognition abilities in adults with traumatic brain injury. Neuroimage Clin. 2017;13:370–77. doi:10.1016/j.nicl.2016.12.010.
  • Yim J, Babbage DR, Zupan B, Neumann D, Willer B, The relationship between facial affect recognition and cognitive functioning after traumatic brain injury. Brain Inj. 2013;27(10):1155–61. doi:10.3109/02699052.2013.804203.
  • Rigon A, Turkstra L, Mutlu B, Duff M, The female advantage: sex as a possible protective factor against emotion recognition impairment following traumatic brain injury. Cogn Affect Behav Neurosci. 2016;16(5):866–75. doi:10.3758/s13415-016-0437-0
  • Rigon A, Voss MW, Turkstra LS, Mutlu B, Duff MC, Different aspects of facial affect recognition impairment following traumatic brain injury: the role of perceptual and interpretative abilities. J Clin Exp Neuropsychol. 2018;1–15. doi:10.1080/13803395.2018.1437120.
  • Turkstra LS, Mutlu B, Ryan CW, Despins Stafslien EH, Richmond EK, Hosokawa E et al. Sex and gender differences in emotion recognition and theory of mind after TBI: a narrative review and directions for future research. Front Neurol. 2020;11:59.
  • Kret ME, De Gelder B, A review on sex differences in processing emotional signals. Neuropsychologia. 2012;50(7):1211–21. doi:10.1016/j.neuropsychologia.2011.12.022.
  • Montagne B, Kessels RPC, Frigerio E, de Haan EHF, Perrett DI, Sex differences in the perception of affective facial expressions: do men really lack emotional sensitivity? Cogn Process. 2005;6(2):136–41. doi:10.1007/s10339-005-0050-6.
  • Kessels RPC, Montagne B, Hendriks AW, Perrett DI, de Haan EHF, Assessment of perception of morphed facial expressions using the Emotion Recognition Task: normative data from healthy participants aged 8-75. J Neuropsychol. 2014;8(1):75–93. doi:10.1111/jnp.12009.
  • Montagne B, Kessels RPC, de Haan EHF, Perrett DI, The Emotion Recognition Task: a paradigm to measure the perception of facial emotional expressions at different intensities. Percept Mot Skills. 2007;104(2):589–98. doi:10.2466/pms.104.2.589-598
  • Byom L, Duff M, Mutlu B, Turkstra L, Facial emotion recognition of older adults with traumatic brain injury. Brain Inj. 2019;33(3):322–32. doi:10.1080/02699052.2018.1553066
  • Jaeger SR, Roigard CM, Jin D, Vidal L, Ares G, Valence, arousal and sentiment meanings of 33 facial emoji: insights for the use of emoji in consumer research. Food Res Int. 2019;119:895–907. doi:10.1016/j.foodres.2018.10.074.
  • Jaeger SR, Ares G, Dominant meanings of facial emoji: insights from Chinese consumers and comparison with meanings from internet resources. Food Quality and Preference. 2017;62:275–83. doi:10.1016/j.foodqual.2017.04.009.
  • Miller H, Thebault-Spieker J, Chang S, Johnson I, Terveen L, Hecht B, “blissfully happy” or “ready to fight”: varying interpretations of emoji. Proceedings of the 10th International Conference on Web and Social Media, ICWSM 2016, Cologne, Germany. 2016; 259–68.
  • Tigwell GW, Flatla DR. “Oh that’s what you meant!”: reducing emoji misunderstanding. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, ACM, New York, NY, USA. 2016;859–66. doi:10.1145/2957265.2961844.
  • Franco CL, Fugate JMB, Emoji face renderings: exploring the role emoji platform differences have on emotional interpretation. J Nonverbal Behav. 2020;44(2):301–28. doi:10.1007/s10919-019-00330-1.
  • Cherbonnier A, Michinov N, The recognition of emotions beyond facial expressions: comparing emoticons specifically designed to convey basic emotions with other modes of expression. Comput Human Behav. 2021;118:106689. doi:10.1016/j.chb.2021.106689.
  • Shoeb A, de Melo G, Are emojis emotional? A study to understand the association between emojis and emotions. 2020.
  • Fischer B, Herbert C, Emoji as affective symbols: affective judgments of emoji, emoticons, and human faces varying in emotional content. Front Psychol. 2021;12:645173. doi:10.3389/fpsyg.2021.645173.
  • Prada M, Rodrigues DL, Garrido M, Lopes D, Cavalheiro B, Gaspar R, Motives, frequency and attitudes toward emoji and emoticon use. Telemat Inform. 2018;35(7):1925–34. doi:10.1016/j.tele.2018.06.005
  • Herring SC, Dainas AR, Gender and age influences on interpretation of emoji functions. ACM Transact Soc Comput. 2020;3(2):1–26. doi:10.1145/3375629.
  • Jones LL, Wurm LH, Norville GA, Mullins KL, Sex differences in emoji use, familiarity, and valence. Comput Human Behav. 2020;108:106305. doi:10.1016/j.chb.2020.106305.
  • Tossell CC, Kortum P, Shepard C, Barg-Walkow LH, Rahmati A, Zhong L, A longitudinal study of emoticon use in text messaging from smartphones. Comput Human Behav. 2012;28(2):659–63. doi:10.1016/j.chb.2011.11.012.
  • Rodrigues D, Prada M, Gaspar R, Garrido M, Lopes D, Lisbon Emoji and Emoticon Database (LEED): norms for emoji and emoticons in seven evaluative dimensions. Behav Res Methods. 2018;50(1):392–405. doi:10.3758/s13428-017-0878-6.
  • Jaeger SR, Xia Y, Lee PY, Hunter DC, Beresford MK, Ares G, Emoji questionnaires can be used with a range of population segments: findings relating to age, gender and frequency of emoji/emoticon use. Food Qual Prefer. 2018;68:397–410. doi:10.1016/j.foodqual.2017.12.011.
  • Covington NV, Duff MC, Heterogeneity is a hallmark of traumatic brain injury, not a limitation: a new perspective on study design in rehabilitation research. Am J Speech Lang Pathol. 2021;30(2S):974–85. doi:10.1044/2020_AJSLP-20-00081
  • Leary MR. Affect, cognition, and the social emotions. In: Forgas JP (Ed.) Feeling and thinking: the role of affect in social cognition. New York, NY, US: Cambridge University Press; 2000. p. 331–56.
  • Leary MR, Digging deeper: the fundamental nature of “self-conscious” emotions. Psychol Inq. 2004;15(2):129–31.
  • Hareli S, Parkinson B, What’s social about social emotions? J Theory Soc Behav. 2008;38(2):131–56. doi:10.1111/j.1468-5914.2008.00363.x
  • Adolphs R, Baron-Cohen S, Tranel D, Impaired recognition of social emotions following amygdala damage. J Cogn Neurosci. 2002;14(8):1264–74. doi:10.1162/089892902760807258.
  • Turkstra LS, Kraning SG, Riedeman SK, Mutlu B, Duff M, VanDenHeuvel S, Labelling facial affect in context in adults with and without TBI. Brain Impair. 2017;18(1):49–61. doi:10.1017/BrImp.2016.29.
  • Steward KA, Kretzmer T, Anosognosia in moderate-to-severe traumatic brain injury: a review of prevalence, clinical correlates, and diversity considerations. Clin Neuropsychol. 2022;36(8):2021–40. doi:10.1080/13854046.2021.1967452
  • Spikman JM, Milders MV, Visser-Keizer AC, Westerhof-Evers HJ, Herben-Dekker M, van der Naalt J, Deficits in facial emotion recognition indicate behavioral changes and impaired self-awareness after moderate to severe traumatic brain injury. PLoS One. 2013;8(6):e65581. doi:10.1371/journal.pone.0065581
  • Lamberts KF, Fasotti L, Boelen DHE, Spikman JM, Self-awareness after brain injury: relation with emotion recognition and effects of treatment. Brain Impair. 2017;18(1):130–37. doi:10.1017/BrImp.2016.28.
  • Malec JF, Brown AW, Leibson CL, Flaada JT, Mandrekar JN, et al. The Mayo Classification System for traumatic brain injury severity. J Neurotrauma. 2007;24(9):1417–24. doi:10.1089/neu.2006.0245.
  • Calvo MG, Lundqvist D, Facial expressions of emotion (KDEF): identification under different display-duration conditions. Behav Res Methods. 2008;40(1):109–15. doi:10.3758/BRM.40.1.109.
  • Anwyl-Irvine AL, Massonnié J, Flitton A, Kirkham N, Evershed JK, Gorilla in our midst: an online behavioral experiment builder. Behav Res Methods. 2020;52(1):388–407. doi:10.3758/s13428-019-01237-x
  • Bates D, Mächler M, Bolker BM, Walker SC, Fitting linear mixed-effects models using lme4. J Stat Softw. 2015;67(1). doi:10.18637/jss.v067.i01.
  • Barr DJ, Levy R, Scheepers C, Tily HJ, Random effects structure for confirmatory hypothesis testing: keep it maximal. J Mem Lang. 2013;68(3):255–78. doi:10.1016/j.jml.2012.11.001.
  • Kreibig SD, Autonomic nervous system activity in emotion: a review. Biol Psychol. 2010;84(3):394–421. doi:10.1016/j.biopsycho.2010.03.010.
  • Kralj Novak P, Smailović J, Sluban B, Mozetič I, Sentiment of emojis. PLoS One. 2015;10(12):e0144296. doi:10.1371/journal.pone.0144296.
  • Croker V, McDonald S, Recognition of emotion from facial expression following traumatic brain injury. Brain Inj. 2005;19(10):787–99. doi:10.1080/02699050500110033.
  • Rosenberg H, McDonald S, Dethier M, Kessels RPC, Westbrook RF, Facial Emotion recognition deficits following moderate–severe traumatic brain injury (TBI): re-examining the valence effect and the role of emotion intensity. J Int Neuropsychol Soc. 2014;20(10):994–1003. doi:10.1017/S1355617714000940.
  • Forslund MV, Perrin PB, Røe C, Sigurdardottir S, Hellstrøm T, Berntsen SA et al. Global outcome trajectories up to 10 years after moderate to severe traumatic brain injury. Front Neurol. 2019;10:219. doi:10.3389/fneur.2019.00219.
  • Jack RE, Garrod OGB, Schyns PG, Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Curr Biol. 2014;24(2):187–92. doi:10.1016/j.cub.2013.11.064.
  • Bai Q, Dan Q, Mu Z, Yang M, A systematic review of emoji: current research and future perspectives. Front Psychol. 2019;10:2221. doi:10.3389/fpsyg.2019.02221.
  • Kaye LK, Wall HJ, Malone SA, “Turn that frown upside-down”: a contextual account of emoticon usage on different virtual platforms. Comput Human Behav. 2016;60:463–67. doi:10.1016/j.chb.2016.02.088.
  • Derks D, Bos AER, Von Grumbkow J, Emoticons in computer-mediated communication: social motives and social context. Cyberpsychol Behav. 2008;11(1):99–101. doi:10.1089/cpb.2007.9926.
  • Pfeifer VA, Armstrong EL, Lai VT, Do all facial emojis communicate emotion? The impact of facial emojis on perceived sender emotion and text processing. Comput Human Behav. 2022;126:107016. doi:10.1016/j.chb.2021.107016.
  • Holtgraves T, Robinson C, Kaschak MP, Emoji can facilitate recognition of conveyed indirect meaning. PLoS One. 2020;15(4):e0232361. doi:10.1371/journal.pone.0232361.
  • Garcia C, Țurcan A, Howman H, Filik R, Emoji as a tool to aid the comprehension of written sarcasm: evidence from younger and older adults. 2021.
  • Evans K, Hux K, Comprehension of indirect requests by adults with severe traumatic brain injury: contributions of gestural and verbal information. Brain Inj. 2011;25(7–8):767–76. doi:10.3109/02699052.2011.576307.
  • Glikson E, Cheshin A, van Kleef GA, The dark side of a smiley: effects of smiling emoticons on virtual first impressions. Soc Psychol Personal Sci. 2018;9(5):614–25. doi:10.1177/1948550617720269.
  • Riordan MA, Glikson E, On the hazards of the technology age: how using emojis affects perceptions of leaders. Int J Bus Commun. 2020;232948842097169. doi:10.1177/2329488420971690.
  • Walther JB. Theories of computer-mediated communication and interpersonal relations. In: Knapp ML & Daly JA (Eds.) The SAGE handbook of interpersonal communication. 4th ed. USA: Sage Publications; 2011. p. 443–79.
  • Guntuku SC, Li M, Tay L, Ungar LH, Studying cultural differences in emoji usage across the East and the West. 2019.