Aging, Neuropsychology, and Cognition
A Journal on Normal and Dysfunctional Development
Volume 22, 2015 - Issue 4

Self-referencing enhances recollection in both young and older adults

Pages 388-412 | Received 26 Apr 2014, Accepted 18 Aug 2014, Published online: 29 Sep 2014
 

Abstract

Processing information in relation to the self enhances subsequent item recognition in both young and older adults and further enhances recollection at least in the young. Because older adults experience recollection memory deficits, it is unknown whether self-referencing improves recollection in older adults. We examined recollection benefits from self-referential encoding in older and younger adults and further examined the quality and quantity of episodic details facilitated by self-referencing. We further investigated the influence of valence on recollection, given prior findings of age group differences in emotional memory (i.e., “positivity effects”). Across the two experiments, young and older adults processed positive and negative adjectives either for self-relevance or for semantic meaning. We found that self-referencing, relative to semantic encoding, increased recollection memory in both age groups. In Experiment 1, both groups remembered proportionally more negative than positive items when adjectives were processed semantically; however, when adjectives were processed self-referentially, both groups exhibited evidence of better recollection for the positive items, inconsistent with a positivity effect in aging. In Experiment 2, both groups reported more episodic details associated with recollected items, as measured by a memory characteristic questionnaire, for the self-reference relative to the semantic condition. Overall, these data suggest that self-referencing leads to detail-rich memory representations reflected in higher rates of recollection across age.

Acknowledgments

The authors wish to thank Adarsh Shetty, Brent Sattelmeier, An Do, and Yashu Jiang for their invaluable assistance with this project. We would also like to thank Erika Fulton and Terry W. Moore for help with development of the stimuli.

Notes

1. We varied font type and color to enrich the amount of visual details that participants could later recollect.

2. After making a “familiar” response, participants reported retrieving details (e.g., providing a response of two or three for any of the MCQ questions) on fewer than 1% of trials, which we think limits the informativeness of those trials.

3. There were occasions where participants made “remember” judgments without reporting any MCQ details. These types of trials were also rare, but we included them in the analysis, since it is possible that participants were retrieving some episodic detail that did not neatly fit into any of the MCQ categories.

4. The temporal detail rating assessed whether an item appeared at the beginning, middle, or end of the encoding session, rather than the amount of temporal detail participants could retrieve; this is a qualitatively different type of measure from the other four MCQ ratings, and hence these data are not discussed further.

Additional information

Funding

This research was supported by a grant from the American Federation for Aging Research to A. Duarte and by NIA Grant T32 AG00175 to E. D. Leshikar and M. R. Dulas as well as T32 AG000204 to E. D. Leshikar.

