Research Article

Emojis vs. facial expressions: An electrical neuroimaging study on perceptual recognition

Pages 46-64 | Received 01 Nov 2022, Published online: 17 Apr 2023
 

ABSTRACT

The aim of this study was to investigate the neural underpinnings and the time course of emoji recognition by recording event-related potentials in 51 participants engaged in a categorization task based on an emotional word priming paradigm. Forty-eight happy, sad, surprised, disgusted, fearful, and angry emojis, and as many facial expressions, were used as stimuli. Behavioral data showed that emojis were recognized faster and more accurately (92.7%) than facial expressions displaying the same emotions (87.35%). Participants were better at recognizing happy, disgusted, and sad emojis, and happy and angry faces. Fear was difficult to recognize in both faces and emojis. The N400 response was larger to incongruently primed emojis and faces, while the opposite was observed for the P300 component. However, both the N400 and the P300 peaked considerably later in response to faces than to emojis. The emoji-related N170 component (150–190 ms) discriminated the affective content of the stimuli, much like the face-related N170, but its neural generators included not the fusiform face area (FFA) but rather the occipital face area (OFA), involved in processing face details, along with object-related areas. Both faces and emojis activated the limbic system and the orbitofrontal cortex, supporting anthropomorphization. The schematic nature of emojis may make their emotional content easier to classify.

Acknowledgements

We are extremely grateful to Alice Cerri and to all participants in the study for their voluntary participation and commitment.

Disclosure statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Author contributions

AMP conceived and planned the experiment and wrote the paper. LDN prepared the stimuli and carried out the data collection. AMP performed statistical analyses and data illustration. AMP and LDN interpreted the data. All authors provided critical feedback and helped shape the research, analysis and manuscript.

Data accessibility

Anonymized data and details of the preprocessing and analyses are available to colleagues upon request.

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.
