Original Articles

Evidence of a metacognitive benefit to memory?

Pages 317-325 | Received 22 Dec 2015, Accepted 21 Mar 2016, Published online: 06 Apr 2016
 

ABSTRACT

Studies of the memory-control framework have contrasted free-report and forced-report recall, with little regard for the order of these two tests. The present experiment sought to demonstrate that test order is crucial, and that this suggests a potential role for metacognitive monitoring in memory retrieval. Participants undertook tests of episodic and semantic memory in both free- and forced-report format, in one of the two possible test orders. Free-report performance was more accurate when it preceded forced report rather than followed it, with no cost to memory quantity. Additionally, there was a trend towards higher forced-report performance when it was preceded by an initial free-report test, a pattern that a meta-analysis showed to be consistent with previous studies in the literature. These findings suggest a reciprocal relationship between metacognitive monitoring and early retrieval processes in memory that results in higher memory performance when monitoring is encouraged.

Acknowledgement

The author would like to thank Lowenna Wills for her help with data collection. Some of the data reported here were presented at the Metacog2014 workshop in Clermont-Ferrand in September 2014.

Disclosure statement

No potential conflict of interest was reported by the author.

Notes

1. We thank Morris Goldsmith for providing the original data file that enabled us to conduct this analysis. The figure shown reports only performance on standard items, and excludes deceptive items and difficult items for which recall performance was close to floor.

2. We keep Experiments 1a and 1b separate for the meta-analysis because, despite there being no significant difference between them, the former displays one of the smallest effect sizes of the eight studies, and we do not want this to be masked by averaging with Experiment 1b. Using only a single effect size for the combined data produces the same meta-analytic average effect size and only slightly changes the CI: d = 0.26 [0.05, 0.48].

3. The ds were calculated using the MBESS (Lai & Kelley, 2012) package in R, an open-source language and environment for statistical computing (R Core Team, 2013). Cumming's (2012) ESCI software was used to create the figure and for the meta-analytic calculations. A rough sketch of this kind of calculation is given after these notes.

4. There was no evidence of significant heterogeneity, Q(7) = 0.91, p = .996.

5. We thank Maciej Hanczakowski for suggesting this potential account during the review process.

 

Additional information

Funding

The author would like to thank the Flinders University Norman Munn travel fund for partial support of this project.
