
Episodic memory and recognition are influenced by cues’ sensory modality: comparing odours, music and faces using virtual reality

Pages 1113-1133 | Received 10 Nov 2022, Accepted 18 Apr 2023, Published online: 30 Aug 2023

ABSTRACT

Most everyday experiences are multisensory, and all senses can trigger the conscious re-experience of unique personal events embedded in their specific spatio-temporal context. Yet, little is known about how a cue’s sensory modality influences episodic memory, and which step of this process is impacted. This study investigated recognition and episodic memory across olfactory, auditory and visual sensory modalities in a laboratory-ecological task using a non-immersive virtual reality device. At encoding, participants freely and actively explored unique and rich episodes in a three-room house where boxes delivered odours, musical pieces and pictures of faces. At retrieval, participants were presented with modality-specific memory cues and were asked to (1) recognise encoded cues among distractors and (2) go to the room and select the box in which they had encountered them at encoding. Memory performance and response times revealed that music and faces outperformed odours in recognition memory, but that odours and faces outperformed music in evoking the encoding context. Interestingly, correct recognition of music and faces was accompanied by deeper inspirations than correct rejections. By directly comparing memory performance across sensory modalities, our study demonstrated that despite limited recognition, odours are powerful cues for evoking specific episodic memory retrieval.

Open Scholarship

This article has earned the Center for Open Science badge for Open Data. The data are openly accessible on OSF at https://osf.io/d436m/.

Acknowledgments

We thank Prof. Christian Scheiber for agreeing to support our work as principal investigator and Eda Erdal for her help with preliminary data analyses. We are grateful to the technical platform NeuroImmersion for the development of the EpisOdor Virtual Reality software, and to the company EmoSens® for supplying some of the odourants used in this study. This work was supported by the Centre de Recherche en Neurosciences de Lyon (CRNL), the Centre National de la Recherche Scientifique (CNRS), and the LABEX Cortex (ANR-11-LABX-0042) of Université de Lyon within the program “Investissements d’Avenir” (ANR-11-IDEX-0007) operated by the French National Research Agency (ANR). The team Auditory Cognition and Psychoacoustics is part of the LabEx CeLyA (Centre Lyonnais d'Acoustique, ANR-10-LABX-60). Dr L. R. was funded by the Roudnitska Foundation and LabEx CeLyA. A CC-BY 4.0 public copyright license has been applied by the authors to the present document and will be applied to all subsequent versions up to the Author Accepted Manuscript arising from this submission, in accordance with the grant’s open access conditions.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data that support the findings of this study are openly available in the Open Science Framework (OSF) at https://osf.io/d436m/, doi:10.17605/OSF.IO/D436M.

Additional information

Funding

This work was supported by LabEx CeLyA [Grant Number ANR-10-LABX-60]; LabEx Cortex [Grant Number ANR-11-LABX-0042]; and the Roudnitska Foundation (2017-00000006771).
