Abstract
This study examined the development of paragraph writing skills in 116 English-as-a-second-language university students over 12 weeks, as well as the relationship between the linguistic features of the students’ written texts, measured by Coh-Metrix (a computational system for estimating textual features such as cohesion and coherence), and the scores assigned by human raters. Rater reliability was investigated using many-facet Rasch measurement (MFRM); the growth of students’ paragraph writing skills was explored using a factor-of-curves latent growth model (LGM); and the relationships between changes in linguistic features and writing scores over time were examined through path modelling. The MFRM analysis indicated that, despite several misfits, the students’ and raters’ performances and the functionality of the rating scale conformed to the expectations of MFRM, providing evidence of the psychometric validity of the assessments. The LGM showed that students’ paragraph writing skills developed steadily during the course. The Coh-Metrix indices had greater predictive power before and after the course than during it, suggesting that Coh-Metrix may struggle to discriminate between some ability levels. Whether a Coh-Metrix index gains or loses predictive power over time is argued to be partly a function of whether raters maintain or lose sensitivity to the linguistic feature measured by that index as the course progresses.
Acknowledgements
This study was funded by the Centre for English Language Communication of the National University of Singapore (NUS). I wish to thank Professors Wu Siew Mei, Susan Tan and Richard Seow of NUS for their support of the project and their valuable comments. Advice given by three reviewers of Educational Psychology has been a great help. Assistance given by Professors Irene Tan and Maliga Jeganathan and four raters from NUS is greatly appreciated. All remaining errors are mine.
Notes
1. An online platform that allows students to submit assignments and offers simplified grading and originality checks for instructors. Turnitin also supports chats, messages and forum discussions.
2. In a recent study, I examined the effect of tasks on students’ scores (Aryadoust, in press). The LGM analysis indicated no task effect and, accordingly, task was not parameterised in the current study.