
What we have learned: implementing MiniLit as an intervention with young struggling readers

Pages 113-125 | Received 05 Apr 2021, Accepted 27 Sep 2021, Published online: 06 Nov 2021

ABSTRACT

Information about effective interventions for students with early reading difficulties is essential when deciding how best to provide support. MiniLit, a small-group intervention for young struggling readers, was released in 2011 after several years of development and trials. This paper provides a rationale for MiniLit, a brief history of its development and implementation, and a summary of evidence collected from various efficacy studies. Pre- and post-test group data from program trials and experimental research indicate that MiniLit has produced large gains in students’ phonemic awareness, phonic decoding, word reading, and spelling. Experimental studies yielded lower effect sizes than program trials for phonic decoding and word reading, although effect sizes for these dimensions remained in the medium to large range. The program has been revised in response to evidence from these studies, feedback from practitioners, and recent research on early reading, resulting in the development of MiniLit Sage.

Children enter school with varying experiences and knowledge about early literacy. It is a challenge for schools to cater for all students and move those with limited experiences and knowledge onto a trajectory of successful literacy learning. Increasingly schools are recognizing that they need to take action to prevent literacy failure as soon as possible and are seeking information about effective ways to achieve this.

The teaching of reading in the early years has been one of the most heavily researched areas of education, and there is strong evidence that it is critically important that young children get off to a good start in reading because a poor start to literacy learning has negative consequences for later learning and wellbeing (Stanovich, Citation2000). For example, literacy difficulties are associated with increased risk of anxiety (McArthur, Badcock, Castles, & Robidoux, Citation2021), poor self-esteem (Rose, Citation2006), reduced motivation to read (Chapman, Tunmer, & Prochnow, Citation2000; Morgan, Fuchs, Compton, Cordray, & Fuchs, Citation2008), reduced school retention (Bost & Riccomini, Citation2006; Ensminger & Slusarcick, Citation1992), and poorer academic performance (Juel, Citation1988; McArthur, Castles, Kohnen, & Banales, Citation2016). The critical importance of supporting students’ reading skills was recognized by a team of academic researchers and special educators associated with Macquarie University, who undertook an ongoing program of research and development led by Professor Kevin Wheldall. In 2011, after five years of development, MiniLit, which stands for Meeting Initial Needs in Literacy, was launched as an intervention for young struggling readers (MultiLit, Citation2011).

This paper provides a brief introduction to MiniLit and traces its development, early stages of implementation, and subsequent evaluations. Information about the efficacy of the intervention is provided through data and findings from associated trials and research studies in a variety of settings. The paper then describes MiniLit Sage – a new, revised version of the intervention, to be published ten years after MiniLit was first used in schools.

What is MiniLit?

MiniLit is an intervention for young struggling readers that incorporates research findings about what needs to be in an early reading program and components of effective interventions. It consists of 80 structured lessons, each of which has three main components: Sounds and Words Activities, Text Reading and Story Book Reading. Lessons take approximately one hour and, ideally, are delivered daily. MiniLit is designed for use within a Response to Intervention (RtI) framework, whereby students are provided with exemplary instruction according to a tiered system of learning support. In practice, implementing RtI means that exemplary instruction at a whole-class (i.e. Tier 1) level is expected to be sufficient for approximately 80% of students, while 15% need extra support in the form of more intense small-group (i.e. Tier 2) instruction, and the remaining 5% of students need one-to-one (i.e. Tier 3) intervention (Fuchs & Fuchs, Citation2007). As an intervention designed for groups of up to four struggling readers, MiniLit is a Tier 2 intervention (Wheldall, Citation2011).

Rationale for the development of MiniLit

One of the reasons for the development of MiniLit was a concern about the efficacy and utility of the Reading Recovery program that had been widely adopted in Australian schools. This concern led to a review of research on the efficacy of Reading Recovery (Reynolds & Wheldall, Citation2007), which found that the program was relatively expensive and lacked key components of early reading programs that research had shown to be crucial. The review noted that, as a program based on ‘Whole Language’, Reading Recovery focuses on teaching students to predict upcoming words from context and initial letter cues, and uses analogy as a key strategy to teach phonics. Thus, it does not take a structured and systematic approach to the teaching of phonics. In addition, despite its widespread implementation, it had a limited research base at that time. We concluded that there was a need for an alternative intervention that was cheaper to implement and was based on evidence about the most effective ways of teaching students to read and the components of effective interventions. Since then, further research evidence has indicated that Reading Recovery’s effect is limited and short-term (Buckingham, Citation2019), confirming the need for a more effective alternative.

A search of the literature showed a lack of alternative interventions for young struggling readers that were evidence-based. Consequently, the directors of MultiLit made the decision to develop and trial a new intervention for young struggling readers that was based on key desiderata from the reviews of research into early reading (Department of Education, Science and Training (DEST), Citation2005; National Institute of Child Health and Human Development, Citation2000; Rose, Citation2006) and components of effective interventions (Reynolds, Wheldall, & Madelaine, Citation2010b).

These desiderata were, and remain:

  1. Intervention is timely and offered during the second year of formal schooling, as soon as it is identified that the student has ongoing difficulties that cannot be addressed by the regular classroom instruction.

  2. Instruction is delivered in small groups of up to four students.

  3. The program includes phonemic awareness, phonics, fluency, vocabulary and comprehension.

  4. The main activities in phonemic awareness relate to learning to blend and segment and should ideally involve using letters once students are familiar with some letter-sound relationships.

  5. Phonics is taught through a synthetic approach.

  6. There are planned procedures for students to build automaticity in word recognition.

  7. Instruction is explicit and systematic.

  8. A well-trained teacher or a paraprofessional with teacher support delivers instruction.

  9. Sessions are frequent, preferably daily, and involve at least 20–30 minutes of intensive instruction.

  10. Assessment procedures and tools are available to identify struggling students and to monitor their progress (Reynolds et al., Citation2010b).

The desiderata themselves were based on a review of existing research into effective literacy instruction practices (see Reynolds et al., Citation2010b, for full details) and are still supported by more recent research (e.g. Buckingham, Citation2020; Castles, Rastle, & Nation, Citation2018; Rastle, Lally, Davis, & Taylor, Citation2021). Following this review, an intervention incorporating these desiderata was planned, which ultimately resulted in the development of MiniLit.

Development and trialing

Initially, MiniLit comprised a revised version of the MULTILIT Reading Tutor Program (Macquarie University Special Education Centre, Citation1998; Wheldall & Beaman, Citation2000). There were 60 one-hour lessons, each including: phonemic awareness or sight words, word attack activities, text reading using the Fantastic Phonics program or Ginn Reading 360 readers, group listening and oral comprehension activities based on books read to students.

The initial version of MiniLit was implemented as part of the Exodus Schoolwise Program that was run by MultiLit at the time. In this setting, young struggling readers who had been identified and referred by nearby metropolitan schools with high proportions of disadvantaged students attended a literacy centre for a short-term withdrawal intervention. Two tutors delivered the intervention to small groups of students, on 4 days per week over 15-week periods. The results were considered by the MultiLit Research Unit to be very encouraging (Wheldall, Beaman, Madelaine, & McMurtry, Citation2012; Wheldall & Wheldall, Citation2014).

Following several trials of this version, a development team worked on an updated version of the intervention to bring it into line with more recent research. Feedback from staff implementing the intervention also influenced the content of the subsequent version of the program that was published in 2011 (MultiLit, Citation2011).

MiniLit (MultiLit, Citation2011) had more lessons and integrated components of the program wherever practicable. For instance, the sequences for teaching phonemic awareness and phonics were merged. Lessons were scripted, making them suitable for implementation by trained paraprofessionals. The published materials were also supported by a two-day compulsory training course run by MultiLit trainers.

Implementation of MiniLit

Since its release, MiniLit has been implemented in a variety of contexts from one group in a school to a system-wide intervention in a Catholic diocesan school system. It is recommended that schools have at least two MiniLit groups so that students can move between groups, depending on their rates of progress.

From the release date late in 2011 until the end of June 2020, 2,799 schools in Australia purchased the program. Around half of schools with the program are in our home state, New South Wales (NSW). More than 6,600 teachers, paraprofessionals and administrators across all states and territories have attended the two-day MiniLit training course.

Efficacy research

Early research took the form of pilot studies in tutorial settings run by MultiLit (Reynolds, Wheldall, & Madelaine, Citation2007a, Citation2007b, Citation2007c). The intervention sessions for these studies were 60 minutes long and based on components from the MULTILIT Reading Tutor Program. Two experimental studies were subsequently carried out in school settings with school staffing (Buckingham, Wheldall, & Beaman-Wheldall, Citation2012, Citation2013; Reynolds, Wheldall, & Madelaine, Citation2010a). A second unpublished pilot version of MiniLit comprising shorter (i.e. 45-minute) lessons was used in the first experimental study by Reynolds et al. (Citation2010a). All pilot versions were similar in content to the published MiniLit program (MultiLit, Citation2011), though the published version had more cohesion between components, which made for more streamlined lessons. The published MiniLit program was used in studies carried out between 2011 and 2019. In addition, aggregated data were collected over a number of years from the Exodus tutorial centres and from schools in Cape York.

This paper will summarise findings from all published research about the efficacy of MiniLit. The program has been implemented in the following research settings:

  • Tutorial Centres run by the Exodus Foundation in which programs were implemented by MultiLit and, later, by the Foundation itself (Wheldall et al., Citation2012; Wheldall & Wheldall, Citation2014; Wheldall, Wheldall, Madelaine, Reynolds, & Arakelian, Citation2017)

  • An independent school in a high SES area in which MiniLit was implemented by school staff (Reynolds et al., Citation2010a)

  • A state primary school in a disadvantaged area where MiniLit was implemented by school staff (Buckingham et al., Citation2012, Citation2013)

  • Remote schools with high numbers of Indigenous students in Cape York (Wheldall et al., Citation2019)

  • An independent evaluation conducted with 217 students in 20 NSW public schools (Quach et al., Citation2019)

Table 1 provides a summary overview of each study: its setting, the number of students and groups, and the duration of the intervention.

Table 1. Overview of studies and evaluations investigating the efficacy of MiniLit

The effects of MiniLit across a number of dimensions of early literacy have been investigated in each of the studies. The dimensions used in most studies are phonemic awareness, phonic decoding, word reading, and spelling. Pre- and post-test group data are available from trials and experimental studies and have been used to provide information about gains on tests related to the dimensions of early literacy and associated effect sizes.

Results

The results of the field trials and the randomized control studies provide the key reason for our continuing support of MiniLit. Large effect sizes have been found consistently across the studies for all dimensions of early reading.

Table 2 lists effect sizes that show the extent of gains in achievement from pre-test to post-test on the assessments used in each study. The effect sizes (ES) shown for all studies except one were calculated using Cohen’s d (Cohen, Citation1988), where effect sizes of 0.8 and above are generally considered to be large. Note, however, that according to Kraft’s (Citation2020) criteria, based on experimental studies in education, Cohen’s d effect sizes above 0.2 are large. The exception is Buckingham et al. (Citation2012, Citation2013), which reports partial eta squared; with this statistic, a large effect is indicated by a value of .138 or greater.
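To make the pre/post effect sizes in the tables concrete, the sketch below computes Cohen’s d as the mean gain divided by a pooled standard deviation. The data are entirely hypothetical, and the averaged-variance pooling convention shown here is one common choice; the cited studies may have pooled differently.

```python
import statistics

def cohens_d(pre, post):
    """Pre/post effect size: mean gain divided by the pooled SD.

    Illustrative sketch only -- pooling conventions vary between
    studies, and these scores are invented for the example.
    """
    m_pre, m_post = statistics.mean(pre), statistics.mean(post)
    var_pre = statistics.variance(pre)    # sample variances
    var_post = statistics.variance(post)
    pooled_sd = ((var_pre + var_post) / 2) ** 0.5
    return (m_post - m_pre) / pooled_sd

# Hypothetical raw scores on a phonic decoding measure (n = 8)
pre = [4, 6, 5, 7, 3, 6, 5, 4]
post = [9, 11, 8, 12, 7, 10, 9, 8]
print(round(cohens_d(pre, post), 2))  # prints 2.83
```

On these invented scores d = 2.83, which would count as large under either the conventional 0.8 threshold (Cohen, 1988) or Kraft’s (2020) 0.2 benchmark for education interventions.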

Table 2. Dimensions measured in studies and associated Effect Sizes (ES)

Table 3. Components of MiniLit and MiniLit Sage lessons

In general, the data show large to very large effect sizes across most dimensions. For studies where Cohen’s d was used, the effect sizes were between 0.64 and 3.10 for phonemic awareness, between 0.99 and 1.76 for spelling, between 0.43 and 5.90 for phonic decoding, between 1.03 and 1.83 for word reading fluency, between 0.08 and 1.80 for word reading (accuracy), 0.59 for regular word reading, −0.16 for irregular word reading, 1.44 for letter-sound knowledge, 0.67 for reading comprehension, and 0.91 for vocabulary.

Discussion of results

The studies with the lower effect sizes for phonic decoding and word reading were the experimental studies. One of these (Reynolds et al., Citation2010a) was of short duration, had reduced lesson time, and was carried out with students in a high socioeconomic status school. The participants had received a synthetic phonics program before undertaking MiniLit and were placed at relatively high levels at the beginning of the intervention. Different assessments were also used, and it was felt that these had too few items at the lower levels to be highly sensitive to change. The other experimental study was a larger-scale independent evaluation (Quach et al., Citation2019). It should be noted that most of the effect sizes yielded in these studies are still considered to be in the medium (Cohen, Citation1988) or large (Kraft, Citation2020) range.

Instructional fidelity may have played a role in affecting student outcomes. Fidelity checks were carried out for all studies listed in Table 1. In the early studies, instructors were experienced MultiLit personnel and fidelity was naturally high. In the Reynolds et al. (Citation2010a) study, in which the instructors were a trained special educator and an experienced teacher employed by the host primary school, the average treatment fidelity was 95%. The inclusion of scripts in later trials and in the 2011 published version provided support when school-based personnel implemented the program, but this does not guarantee good fidelity. In Buckingham et al. (Citation2012), a target of 80% fidelity took one term to be achieved, and student results improved more quickly once the target was reached. In the Quach et al. (Citation2019) study, fidelity was observed to be relatively high on average, and a strong association between instructional fidelity and student results was reported.

Lessons we have learned

  • We have learned that MiniLit is effective, especially for students who are at risk. The effect sizes provide strong evidence that the intervention improves reading outcomes for young struggling readers, especially in the dimensions of phonemic awareness and phonic decoding. MiniLit has been shown to be effective for Indigenous students and disadvantaged students.

  • MiniLit has potential as a Tier 2 intervention in an RtI approach as shown in the Buckingham et al. (Citation2012, Citation2013) studies. If it is to be used in this way, ideally schools will also implement research-based high-quality Tier 1 classroom literacy programs for all young students.

  • While MiniLit has a placement test to determine a starting point for instruction, there is a need for a way of monitoring progress that can be administered often and in a short period of time. The Wheldall Assessment of Reading Lists (WARL) (Reynolds, Wheldall, & Madelaine, Citation2009) is a set of parallel curriculum-based measures that was used as an assessment tool in several of the studies. Data collected across the studies indicate that it may be a useful means of measuring progress during the implementation of MiniLit, possibly for students on Level 2 of the program as it is a fluency measure. Benchmarks for students in NSW schools have also been established for progress on the WARL and could assist schools to identify students for intervention and determine when they have reached grade expectations (Reynolds, Wheldall, & Madelaine, Citation2011).

  • Non-word reading has been recognised as a sensitive measure of phonic decoding in the research literature (Castles, Polito, Pritchard, Anandakumar, & Coltheart, Citation2018). A non-word fluency assessment called the Wheldall Assessment for Reading Non-words (WARN) has been developed for this purpose (Wheldall, Reynolds, Madelaine, & Bell, Citation2021).

  • Training provided in MiniLit is a valuable component of the program. Training includes information about the evidence base that underpins the program as well as how to implement it effectively. The largest experimental study (Quach et al., Citation2019) found that implementation fidelity was strongly associated with student scores in letter-sound knowledge, phonemic awareness, and nonword reading (phonic decoding) in the first post-test, which in turn were related to reading accuracy, rate and comprehension in the second post-test. Our experiences have highlighted the need for higher quality pre-service training in teaching reading for new teachers and professional development for practising teachers.

  • The need to select assessment measures carefully according to how well they represent and capture targeted skills in the specific population of interest, and to adhere closely to administration protocols, has been reinforced (Wheldall, Wheldall, Bell, & Buckingham, Citation2020). For example, one of the assessments chosen for the independent evaluation (Quach et al., Citation2019) was not appropriate for the cohort and provided unreliable and misleading results.

  • A final lesson we have learned is that trial outcomes are influenced by factors related to the implementation of the program such as the amount of time allocated, the length of the intervention, the quality of the instruction, the support of the school executive, the allocation of resources and the compatibility of the classroom programs. Our studies show differences in outcomes that may relate to these. For example, the timing of the intervention in Semester 2 of Foundation Year or Semester 1 of Year 1 seems ideal and warrants further research to determine if this is the case.
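The progress-monitoring point above, about brief, frequently administered measures such as the WARL, can be illustrated with a generic curriculum-based measurement sketch: score each timed probe as words read correctly per minute (WCPM), then fit a least-squares slope to estimate growth. The scoring conventions, probe schedule, and numbers below are hypothetical illustrations of the general technique, not the WARL’s actual administration rules or benchmarks.

```python
def wcpm(words_correct, seconds):
    """Words correct per minute for one timed reading probe."""
    return words_correct * 60 / seconds

def weekly_growth(weeks, scores):
    """Ordinary least-squares slope: WCPM gained per week."""
    n = len(weeks)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

# Hypothetical fortnightly probes during an intervention:
# words correct on one-minute (60-second) parallel lists
weeks = [0, 2, 4, 6, 8]
scores = [wcpm(c, 60) for c in [18, 23, 27, 33, 38]]
print(round(weekly_growth(weeks, scores), 2))  # prints 2.5
```

A school could compare a slope like this against locally or externally established benchmarks (such as those reported for NSW schools) to decide whether a student is responding to the intervention or needs more intensive support.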

The future for MiniLit: MiniLit Sage

Efficacy studies and program evaluations to date have provided strong evidence about the effectiveness of MiniLit for young struggling readers. Nonetheless, MultiLit is committed to reviewing and, if necessary, revising and refining its programs in a timely way. This is to (a) ensure programs reflect the developing scientific research and evidence base on effective reading instruction and intervention, and (b) incorporate the lessons learned from trials and evaluations and from practitioner feedback from years of implementation in schools.

A revised version of MiniLit will be published in 2021. “MiniLit Sage” focuses more closely on the highly effective elements of the original MiniLit program and aligns with InitiaLit, MultiLit’s Tier 1 program for Foundation to Year 2. InitiaLit was published after the original MiniLit, in response to demand for a whole-class program based on the evidence and instructional principles that have made MiniLit effective and successful. MiniLit Sage uses the same instructional scope and sequence for teaching letter-sound relationships and “tricky” words as InitiaLit, allowing a more seamless implementation of an RtI approach. Another benefit is that the specially developed InitiaLit decodable readers, which are aligned with the InitiaLit Scope and Sequence, can be used for the reading of connected text in MiniLit lessons.

As described in the final dot point of the “lessons we have learned”, pragmatic constraints, such as the time allocated to a lesson, can affect how students respond to an intervention. Hence, the 60-minute lessons in the original MiniLit have been reduced to 45 minutes each in MiniLit Sage. To ensure the content is still adequately covered, MiniLit Sage has an additional 20 lessons (i.e. 100 lessons in total).

The MiniLit Sage lessons predominantly focus on developing and building automatic word recognition processes. While little has changed in the approach to teaching phonemic awareness and phonics, or in our understanding of how the two connect for reading and spelling, additional phonemic awareness activities are now included in Part B of the program. MiniLit Sage also has a stronger focus on articulation of speech sounds and the correct formation of letters, two program components that have been shown to facilitate automatic recognition and recall of letter-sound correspondences. There is also an increased focus on the development of fluency in MiniLit Sage.

The approach to teaching irregular, or “tricky”, words has been updated in MiniLit Sage to reflect the most recent research evidence. It now corresponds with the sequence and approach used in the InitiaLit program, with an added step of drawing children’s attention to the sounds in a word before writing it on the board. The link between the sounds and both regular and irregular spelling patterns is then highlighted.

A major change is that MiniLit Sage does not include a Storybook Reading component. MultiLit has recognised the increasing demand for more explicit and directed lessons for young students with language difficulties and, as a result, has begun developing and trialling a more comprehensive program for young children who require Tier 2 language support. Vocabulary and comprehension, however, continue to feature in MiniLit Sage and are integrated throughout the lesson activities and the connected text reading components: Putting it all Together, Text Reading Accuracy, and Text Reading Fluency.

Conclusion

Evaluations of the original MiniLit program in numerous trials in a variety of contexts over more than ten years have shown it to be an effective small group intervention for young struggling readers, and that its benefits were maximised under optimal instructional conditions. Evidence gathered from research and practice during that time has been incorporated into an updated version of the program – MiniLit Sage.

Disclosure statement

The authors hereby declare a financial interest in the outcomes of this paper. All authors work for MultiLit Pty Ltd, which publishes the MiniLit program. For any study referenced in this paper that was conducted by one or more of the authors, this was made clear in the Ethics application for the project and in the consent forms signed by school principals and parents.

References

  • Bost, L. W., & Riccomini, P. J. (2006). Effective instruction: An inconspicuous strategy for dropout prevention. Remedial and Special Education, 27(5), 301–311.
  • Buckingham, J. (2019, February). Reading recovery: A failed investment. MultiLit Pty Ltd.
  • Buckingham, J., Wheldall, K., & Beaman-Wheldall, R. (2012). A randomized control trial of a tier two small group intervention (‘MiniLit’) for young struggling readers. Australian Journal of Learning Difficulties, 17(2), 79–99.
  • Buckingham, J., Wheldall, K., & Beaman-Wheldall, R. (2013). Evaluation of a two-phase implementation of a tier 2 (small group) reading intervention for young low-progress readers. Australian Journal of Special Education, 38(2), 169–185.
  • Buckingham, J. (2020). Systematic phonics instruction belongs in evidence-based reading programs: A response to Bowers. The Educational and Developmental Psychologist, 37(2), 105–113.
  • Castles, A., Polito, V., Pritchard, S., Anandakumar, T., & Coltheart, M. (2018). Do nonword reading tests for children measure what we want them to? An analysis of year 2 error responses. Australian Journal of Learning Difficulties, 23(2), 153–165.
  • Castles, A., Rastle, K., & Nation, K. (2018). Ending the reading wars: Reading acquisition from novice to expert. Psychological Science in the Public Interest, 19(1), 5–51.
  • Chapman, J. W., Tunmer, W. E., & Prochnow, J. E. (2000). Early reading-related skills and performance, reading self-concept, and the development of academic self-concept: A longitudinal study. Journal of Educational Psychology, 92(4), 703–708.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. New York, NY: Routledge Academic.
  • Department of Education, Science and Training (DEST). (2005). National Inquiry into the Teaching of Literacy (NITL). Teaching reading: Report and recommendations. Accessed 21 December 2007. http://www.dest.gov.au/nitl/documents/report_recommendations.pdf
  • Dunn, L. M., & Dunn, D. M. (2007). Peabody picture vocabulary test (4th ed.). Minneapolis, MN: Pearson Education.
  • Ensminger, M. E., & Slusarcick, A. L. (1992). Paths to high school graduation or dropout: A longitudinal study of a first-grade cohort. Sociology of Education, 65(2), 95–113.
  • Fuchs, L. S., & Fuchs, D. (2007). A model for implementing responsiveness to Intervention. Teaching Exceptional Children, 39(5), 14–20.
  • Gilmore, A., Croft, C., & Reid, N. (1981). Burt word reading test – New Zealand revision. Wellington, NZ: New Zealand Council for Educational Research.
  • Hulme, C., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., & Snowling, M. J. (2010). York assessment of reading for comprehension: Early reading. London, UK: GL Assessment.
  • Juel, C. (1988). Learning to read and write: A longitudinal study of 54 children from first through fourth grades. Journal of Educational Psychology, 80(4), 437–447.
  • Kraft, M. A. (2020). Interpreting effect sizes of education interventions. Educational Researcher, 49(4), 241–253.
  • Macquarie University Special Education Centre. (1998). The MultiLit reading tutor program. Sydney, NSW: Author.
  • Martin, F., & Pratt, C. (2001). Martin and Pratt nonword reading test. Melbourne, VIC: ACER.
  • McArthur, G., Badcock, N., Castles, A., & Robidoux, S. (2021). Tracking the relations between children’s reading and emotional health across time: Evidence from four large longitudinal studies. Reading Research Quarterly. Advance online publication. doi:10.1002/rrq.426
  • McArthur, G., Castles, A., Kohnen, S., & Banales, E. (2016). Low self-concept in poor readers: Prevalence, heterogeneity, and risk. PeerJ, 4, e2669. doi:10.7717/peerj.2669
  • Morgan, P. L., Fuchs, D., Compton, D. L., Cordray, D. S., & Fuchs, L. S. (2008). Does early reading failure decrease children’s reading motivation? Journal of Learning Disabilities, 41(5), 387–404.
  • MultiLit. (2011). MiniLit early literacy intervention program. Sydney, NSW: Author.
  • National Institute of Child Health and Human Development (2000). Report of the national reading panel: Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction (NIH publication no. 00-4769). U.S. Government Printing Office.
  • Neilson, R. (2003a). Astronaut Invented Spelling Test (AIST). Jamberoo, NSW: Language, Speech and Literacy Services.
  • Neilson, R. (2003b). Sutherland phonological awareness test – revised (SPAT-R). Jamberoo, NSW: Language, Speech and Literacy Services.
  • Quach, J., Goldfeld, S., Clinton, J., Serry, T., Smith, L., & Grobler, A. (2019). MiniLit: Learning impact fund evaluation report. Evidence for Learning. Accessed 30 November 2020. https://evidenceforlearning.org.au/lif/our-projects/minilit
  • Rastle, K., Lally, C., Davis, M. H., & Taylor, J. S. H. (2021). The dramatic impact of explicit instruction on learning to read in a new writing system. Psychological Science, 32(4), 471–484.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2007a). Developing a ramp to reading for at-risk year one students: A preliminary pilot study. Special Education Perspectives, 16(1), 39–69.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2007b). ‘Meeting Initial Needs In Literacy’ (MINILIT): A ramp to MULTILIT for younger low-progress readers. Australian Journal of Learning Disabilities, 12(2), 67–72.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2007c). ‘Meeting Initial Needs In Literacy’ (MINILIT): Why we need it, how it works and the results of pilot studies. Australian Journal of Special Education, 31(2), 147–158.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2009). Building the WARL: The development of the wheldall assessment of reading lists, a curriculum-based measure designed to identify young struggling readers and monitor their progress. Australian Journal of Learning Difficulties, 14(1), 89–111.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2010a). An experimental evaluation of the efficacy of an intervention for young struggling readers in year one. Special Education Perspectives, 19(2), 35–57.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2010b). Components of effective early reading interventions for young struggling readers. Australian Journal of Learning Difficulties, 15(2), 171–192.
  • Reynolds, M., Wheldall, K., & Madelaine, A. (2011). Early identification of young struggling readers: Preliminary benchmarks for intervention for students in years one and two in schools in New South Wales. Australian Journal of Learning Difficulties, 16(2), 127–143.
  • Reynolds, M., & Wheldall, K. (2007). Reading recovery twenty years down the track: Looking forward, looking back. International Journal of Disability, Development and Education, 54(2), 199–223.
  • Rose, J. (2006). Independent review of the teaching of early reading: Final report. Accessed 20 September 2006. http://www.standards.dfes.gov.uk/rosereview
  • Stanovich, K. (2000). Progress in understanding reading: Scientific foundations and new frontiers. New York, NY: The Guilford Press.
  • Torgesen, J. K., Wagner, R. K., & Rashotte, C. A. (1999). Test of word reading efficiency (TOWRE). Austin, TX: PRO-ED.
  • Wechsler, D. (2007). Wechsler individual achievement test – 2nd ed, Australian standardised version (WIAT-II Australian). Marrickville, NSW: Pearson.
  • Westwood, P. (1999). Spelling: Approaches to teaching and assessment. Camberwell, VIC: ACER Press.
  • Wheldall, K., Beaman, R., Madelaine, A., & McMurtry, S. (2012). Evaluations of the efficacy of MultiLit and MiniLit programs provided by the Exodus Foundation, 2009-2011 [Unpublished research report].
  • Wheldall, K., & Beaman, R. (2000). An evaluation of MULTILIT: ‘Making Up Lost Time In Literacy’. Canberra, ACT: Department of Education, Training and Youth Affairs.
  • Wheldall, K., Reynolds, M., Madelaine, A., & Bell, N. (2021). The Wheldall assessment of reading nonwords. Sydney, NSW: MultiLit Pty Ltd.
  • Wheldall, K., Reynolds, M., & Madelaine, A. (2015). The Wheldall assessment of reading lists. Sydney, NSW: MultiLit Pty Ltd.
  • Wheldall, K., & Wheldall, R. (2014). Report on the efficacy of the tutorial centres provided by the Exodus Foundation, mid-2010-2013 [Unpublished research report].
  • Wheldall, K., Wheldall, R., Bell, N., & Buckingham, J. (2020). Researching the efficacy of a reading intervention: An object lesson. The Educational and Developmental Psychologist, 37(2), 147–151.
  • Wheldall, K., Wheldall, R., Madelaine, A., Reynolds, M., Arakelian, S., & Kohnen, S. (2019). ‘Just teach our kids to read’: Efficacy of intensive reading interventions for both younger and older low-progress readers in schools serving mainly remote Indigenous communities. In J. Rennie & H. Harper (Eds.), Literacy education and Indigenous Australians (pp. 221–246). Singapore: Springer. doi:10.1007/978-981-13-8629-9_13
  • Wheldall, K., Wheldall, R., Madelaine, A., Reynolds, M., & Arakelian, S. (2017). Further evidence for the efficacy of an evidence based, small group, literacy intervention program for young struggling readers. Australian Journal of Learning Difficulties, 22(1), 3–13.
  • Wheldall, K. (2011). Ensuring that all children learn to read. Learning Difficulties Australia Bulletin, 43(1), 5–8.