Editorial

Assessment and learning in times of change and uncertainty

The year we left behind, 2021, was a year like no other, with the pandemic impacting students’ learning and assessment globally. While some countries have decided to open up, others have adopted a stricter approach and still use lockdowns in areas of COVID outbreaks. The impact of school lockdowns on students’ learning, and the assessment of their achievements following the pandemic, will be the theme of a forthcoming Special Issue in this journal (O’Leary & Hayward, to be published). What we already know is that we are still in challenging times, where uncertainty is at the centre of our lives.

Even worse, the uncertainty now includes dreadful scenes from the war in Ukraine, where children, instead of going to school and preparing for their exams this Spring, are fleeing their country in fear of bombs. Wars, whether they occur in Afghanistan, Ethiopia, Syria, or Yemen, are taking away the future of a generation of children who should be living together peacefully, connecting through music and art, preparing for adulthood, and developing the skills to solve problems we have not yet discovered.

In challenging times, with a global pandemic, wars and personal losses, it is worth reminding ourselves of the reasons so many of us committed our lives to education. In Europe, the President of the assessment organisation AEA-Europe, Dr Christina Wikström, made the following statement to members in March 2022: ‘We place our trust in all our members, irrespective of nationality, to always stand up for peace, democracy, and equality for all. We must also work together in our scholarly profession since scholarship is necessary for the survival and prosperity of humanity’ (Wikström, Citation2022).

It is in this spirit that we publish the first 2022 issue of Assessment in Education – knowing that education matters, particularly in times of uncertainty and challenge. As an international journal, we hope that researchers will continue to connect, collaborate, and stand up for the values which enhance learning for all students, no matter which continent they live on. We owe it to them to offer aspirations for the future, and quality education for all can provide that hope.

In the first article in this regular issue, Steinmann et al. (this issue) present findings from a study investigating the reliability of questionnaire scales. Student self-report scales are known to have limitations with respect to both validity and reliability (Samuelstuen & Bråten, 2007; Samuelstuen et al., 2007), but the current study specifically investigates the use of mixed-worded items in international large-scale studies. More specifically, students were asked to report their agreement with items such as ‘I usually do well in mathematics’ and ‘I am just not good at mathematics’, mixing positive and negative statements. Steinmann et al. used data from the international IEA studies PIRLS and TIMSS 2011 to investigate further how students responded to these scales, and demonstrated that 2–36% of students across 37 primary education systems gave inconsistent responses. As expected, these students showed lower average achievement scores, a finding previously also observed with PISA scales (Hopfenbeck & Maul, 2011). The study is a healthy reminder of the importance of improving the scales used in international studies, as well as the need to use the data with care and to acknowledge their limitations, particularly when reporting results from student questionnaires.

The next articles in this issue, with data from the US, Greece and the UK, examine high-stakes tests from students’ perspectives in different ways. Ober et al. (2022) present a study from the US in which students predicted their future performance in both low- and high-stakes testing contexts. Although the accuracy of self-assessment has been questioned, the current study demonstrated that students’ predictions were related to their future performance in the subject area. The findings suggest that self-assessment may indeed be accurate in predicting test performance in specific contexts, and stable over time.

Brown and Woods (this issue) investigate students’ perspectives on high-stakes tests, using a systematic literature review methodology and critical appraisal frameworks. The scope of the review was to examine students’ experiences and perceptions of sitting the General Certificate of Secondary Education (GCSE), the qualification by which students’ attainment at age 16 has been measured for the last 30 years in England, Wales and Northern Ireland. Issues such as test anxiety, well-being and student agency are discussed.

Papanastasiou and Stylianou-Georgiou (2022) use structural equation modelling to examine the interrelationships among test-taking strategy instruction (with a focus on metacognition), answer-changing bias and performance on multiple-choice tests among college students. A total of 1,512 students from Greece participated in the study, which demonstrated important aspects of test-taking strategies and how they impact students’ test scores.

Hoffmann et al. (2022) at the Yale Center for Emotional Intelligence describe how students have actively participated in the development of an assessment tool measuring school climate. They argue that to ensure any assessment tool is engaging, and that the reports it generates are meaningful, students must be included as key decision-makers in the process. There are lessons to be learned from how the group built an educational assessment tool that is meaningful and inclusive for all students, and from the processes through which students engaged with its development in authentic discussions. By sharing data back with students in focus groups, the research group witnessed how students not only discussed the tool itself but were also motivated to improve their school.

All articles provide promising results for future work which can enrich students’ learning. As we embrace an uncertain future, it is important that our scholarly work continues to focus on the global need for all students to learn, in open societies, as well as on how we can support those in lockdown and those living in the midst of war.

Changes to the editorial board

After more than 20 years on our Editorial Board, Professor Sally Thomas is now stepping down. We are grateful for her expertise and commitment to this journal over the years, and for her invaluable dedication in making the journal such an impactful and well-established outlet for international research. Having been on the Board since 2001, Sally has not only made sure journal articles have been reviewed to the high standard needed, but she has also edited special issues which are still used globally in the field by new researchers, such as the Special Issue Educational Assessment in Latin America (Swaffield & Thomas, 2016). Further, we would like to thank Professor Gordon Stobart, who has also stepped down from his role as Executive Editor. Gordon has not only served on this Editorial Board for more than two decades but was also the Lead Editor of this journal from 2003 to 2009. With his expertise in assessment internationally, he was able to make the journal relevant across the globe. As he stated in one of his Editorials, ‘assessment is never a neutral process – it always has consequences. The task is to make these as constructive as possible, particularly for those who are assessed’ (Stobart, 2003).

We are pleased to welcome Professor Mary Richardson, UCL, as new Executive Editor of this journal. Mary is Professor of Educational Assessment in the Department for Curriculum, Pedagogy and Assessment at the Institute of Education, London. She leads an MA in Assessment and supervises doctoral students interested in assessment, ethics, and children’s rights. Mary is interested in the intersection of technical and philosophical elements of educational assessment and is currently examining the role of artificial intelligence in the practice of high-stakes testing, as well as leading the reporting for England on the international Trends in International Mathematics and Science Study (TIMSS 2023) for the DfE. She sits on the councils of the Philosophy of Education Society of Great Britain and the Association for Educational Assessment – Europe (AEA-Europe), managing key aspects of online communications for both organisations. Her new book, Rebuilding Public Confidence in Educational Assessment (UCL Press), is due for publication in May 2022.

Finally, we are equally pleased to welcome Dr David Torres Irribarra as new Executive Editor. David is a psychologist from Pontificia Universidad Católica de Chile, where he is currently an assistant professor at the School of Psychology. He obtained his Ph.D. in Quantitative Methods and Evaluation from the University of California, Berkeley. Before his doctoral studies, David worked as a researcher and analyst in Chile’s national teacher assessment projects. His doctoral research focused on the use of statistical models for measurement in the social sciences, particularly item response theory in educational contexts. As part of his educational and social measurement work, David has served as a senior psychometrics consultant to multiple major national assessments in Chile and has participated in research projects and consultancies in Honduras, Peru, Mexico, and the United States. His main research areas are (i) the application of latent variable models to measurement contexts, (ii) the theoretical foundations of measurement in the social sciences, an area in which he recently published the book A Pragmatic Perspective of Measurement (2021), and (iii) the use of digital technologies to improve educational assessment.

Disclosure statement

No potential conflict of interest was reported by the author(s).
