
Differential rapid responding across language and cultural groups

ABSTRACT

Rapid response behaviour, a type of test disengagement, cannot be interpreted as a true indicator of the targeted constructs and may compromise both score accuracy and the validity of score interpretations. Rapid responding may arise from multiple factors across diverse populations. In this study, using Programme for International Student Assessment (PISA) 2018 Science data, we examined the comparability of rapid response behaviours across nine language and cultural groups. Statistical methods were developed to flag rapid responses on different item types and to test for differential rapid-responding rates between groups. Results showed that rapid response rates, and their association with performance, differed among the studied groups. However, regardless of students’ group membership, a rapid response led to a chance-level correct rate and had a negative impact on performance. The results also indicated that differences in test engagement had limited impact on the performance rankings of the studied groups.

Acknowledgements

The authors are grateful to Yue H. Jia, John Mazzeo, Michael Walker, and Claudia Tamassia for their comments on an earlier version, and Frederic Robin for his help with data and related information. Any opinions expressed here are those of the authors and not necessarily those of the Educational Testing Service.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1 Low-stakes assessments are those that bear few consequences for individual test takers, even though they may have a significant impact on educational inferences and decisions at the group level.

2 Process data are those captured in a logging system, such as how long test takers spent on an item, how many actions they executed with the keyboard and mouse, and how many times they visited the item, rather than the item responses used for scoring.

3 PISA tests were administered in two different languages in CAN, ARE, and QAT, and we only considered the selected language groups in the study.

4 B-S-J-Z (China) refers to the four PISA-participating provinces/municipalities of the People’s Republic of China (hereafter “China”): Beijing, Shanghai, Jiangsu, and Zhejiang.

5 The group means differ from operational PISA results because the student weights were not used here. To compute the grand mean PV, we followed the PISA technical procedure (OECD, 2019b).

6 In order to obtain a reliable threshold for each item, we used data from all 36 science forms that contained the studied item, regardless of whether the item blocks were in the pure or mixed science forms.
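The item-level threshold setting described in this note can be sketched roughly as follows. The specific rule used here (a fixed fraction of the median response time, pooled across all forms containing the item) and the function names are illustrative assumptions, not the paper's actual procedure.

```python
import statistics

def rapid_response_threshold(response_times, fraction=0.10):
    """Illustrative item-level threshold: a fixed fraction of the median
    response time, pooled across all forms containing the item.
    (The fraction and the use of the median are assumptions here.)"""
    return fraction * statistics.median(response_times)

def flag_rapid(response_time, threshold):
    """Flag a response as rapid when its time falls below the threshold."""
    return response_time < threshold

# Pooled response times (in seconds) for one item across multiple forms
pooled_times = [12.0, 45.5, 60.2, 33.1, 80.0, 51.7, 4.2, 72.9]
t = rapid_response_threshold(pooled_times)       # 10% of the median time
flags = [flag_rapid(rt, t) for rt in pooled_times]
```

Pooling response times over all 36 forms, as the note describes, simply enlarges `pooled_times` before the threshold is computed, which is what makes the per-item threshold reliable.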

7 The percent correct is the average item score divided by the maximum item score.

8 It is worth noting that the rapid responses for the Chinese student group had a slightly higher AIS than the chance score.

9 Note again that, in the referenced table, the group mean (the third column) was computed using the 10 plausible values for each group, and the standard error (SE, the fourth column) of the group mean also used the 10 plausible values (OECD, 2019b).
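The plausible-value combination mentioned in this note can be sketched along the lines of the standard Rubin-style rules used in PISA technical documentation: average the per-PV group means, then combine the average sampling variance with the between-PV variance. The function name and the illustrative numbers below are assumptions, not the operational code or data.

```python
import statistics

def pv_group_mean_and_se(pv_means, pv_sampling_vars):
    """Combine M plausible-value estimates of a group mean.

    pv_means: the group mean computed from each plausible value
    pv_sampling_vars: the sampling variance of each PV mean (e.g. from
    replicate weights; taken as given here)
    Returns (group mean, standard error) via Rubin-style combination.
    """
    m = len(pv_means)
    grand_mean = statistics.fmean(pv_means)
    within = statistics.fmean(pv_sampling_vars)   # average sampling variance
    between = statistics.variance(pv_means)       # between-PV (imputation) variance
    total_var = within + (1 + 1 / m) * between
    return grand_mean, total_var ** 0.5

# Ten plausible-value means for one group (illustrative numbers only)
pv_means = [489.1, 490.3, 488.7, 491.0, 489.9,
            490.5, 488.2, 489.6, 490.8, 489.4]
pv_vars = [4.0] * 10
mean, se = pv_group_mean_and_se(pv_means, pv_vars)
```

The `(1 + 1/m)` factor inflates the between-PV variance to account for using a finite number of plausible values, which is why the SE in the fourth column depends on all 10 PVs rather than any single one.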

Additional information

Notes on contributors

Hongwen Guo

Hongwen Guo is a principal research scientist in Research & Measurement Sciences (RMS), part of the Research & Development division at ETS. Dr Guo previously worked on large-scale testing programmes at ETS as a psychometrician, leading programme operations and research. Throughout her career, she has also taught college-level courses in mathematics, statistics, and psychometrics at different times and places. Her current research focuses on statistical modelling of test and process data in educational measurement.

Kadriye Ercikan

Kadriye Ercikan is the Vice President of Research & Measurement Sciences (RMS) at ETS. Dr Ercikan’s research addresses validation of score meaning using examinee response processes, assessment of collaborative problem solving, designing and validating assessments of complex thinking, assessment of linguistic minorities, and fairness and validity issues in cross-cultural and international assessments. In her current role, Dr Ercikan is responsible for data analysis and psychometric support of ETS’s major testing products and contracts, ETS’s Research Centers and National Assessment Program Coordination.
