
The serial nature of the masked onset priming effect revisited

Pages 2239-2246 | Received 25 Feb 2014, Accepted 27 Feb 2014, Published online: 22 May 2014
Abstract

Reading aloud is faster when target words/nonwords are preceded by masked prime words/nonwords that share their first sound with the target (e.g., save-SINK) compared to when primes and targets are unrelated to each other (e.g., farm-SINK). This empirical phenomenon is the masked onset priming effect (MOPE) and is known to be due to serial left-to-right processing of the prime by a sublexical reading mechanism. However, the literature in this domain lacks a critical experiment. It is possible that when primes are real words their orthographic/phonological representations are activated in parallel and holistically during prime presentation, so any phoneme overlap between primes and targets (and not just initial-phoneme overlap) could facilitate target reading aloud. This is the prediction made by the only computational models of reading aloud that are able to simulate the MOPE, namely the DRC1.2.1, CDP+, and CDP++ models. We tested this prediction in the present study and found that initial-phoneme overlap (blip-BEST), but not end-phoneme overlap (flat-BEST), facilitated target reading aloud compared to no phoneme overlap (junk-BEST). These results provide support for a reading mechanism that operates serially and from left to right, yet are inconsistent with all existing computational models of single-word reading aloud.

This work was supported by a CCD Reading Program Research Support Grant to the first author.

Notes

1. We verify this by simulations, which we report later in this paper.

2. Although the error rate in this study seems to be very high, a close inspection of the data revealed that it is due to a small number of participants (six in total) who mispronounced over 15 words. When these participants were excluded from the analyses, the overall error rate decreased to 3.9%, yet the critical differences between the three conditions remained the same.

3. Following a reviewer's suggestion, in an attempt to attenuate the influence of end-related primes on target reading aloud we parametrically decreased the prime duration in the CDP+ model from 20 cycles to 1. Only when the prime duration was as short as 1 and 2 cycles did the model successfully simulate the human data. However, at these short prime durations it is questionable whether the model would be able to simulate any other masked onset priming effects reported in the literature. In the present paper we only report the results at a prime duration of 25 cycles because this is the prime duration that Perry et al. used to simulate the human data from the Forster and Davis (1991) seminal study on the MOPE.

4. We also ran the simulations with the CDP++ model, which is a disyllabic version of the CDP+ model. The results were identical to those produced by the CDP+ model in terms of the differences between the three conditions, yet naming latencies were overall much faster in the disyllabic model (63.6, 64.5, and 66.2 for the onset-related, end-related, and unrelated conditions, respectively). We only report the simulation results from the monosyllabic model because they are directly comparable to the DRC1.2.1 model, which is also limited to monosyllables.
