
Which form of assessment provides the best information about student performance in chemistry examinations?

Pages 49-65 | Received 10 Oct 2012, Accepted 06 Jan 2013, Published online: 08 Feb 2013
 

Background

This study developed from observations of apparent achievement differences between male and female chemistry performance in a state university entrance examination. Male students performed more strongly than female students, especially at the higher score levels. Apart from the gender of the students, two other important factors that might influence student performance were question format (short-answer or multiple-choice) and question type (recall or application).

Purpose

The research question addressed in this study was: Is performance in state university entrance chemistry examinations and in school chemistry examinations related to student gender, question format (multiple-choice or short-answer) and conceptual level (recall or application)?

Sample

The two sources of data were: (1) secondary analyses of five consecutive years of data published by the chemistry examining authority, and (2) tests conducted with 192 students, which provided information on all three variables under consideration (question format, question type and gender).

Design and methods

Both sources of data were analysed using ANOVA to compare means for the variables under consideration and to determine the statistical significance of any differences. The data from the tests were also analysed using Rasch analysis to examine differences in gender performance.
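The paper does not publish its analysis code, but as a rough illustration, a comparison of mean scores across the three factors described above could be set up as in the following minimal sketch. The file name and column names (score, gender, q_format, q_type) are assumptions for illustration only; the study itself reports using ANOVA and, for the Rasch analysis, the RUMM2030 software.

# Illustrative sketch only: file and column names are assumed, not taken from the paper.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format data: one row per student per question group,
# with a mean score and the three factors under consideration.
df = pd.read_csv("chemistry_scores.csv")

# Three-way ANOVA of mean scores by gender, question format and question type.
model = ols("score ~ C(gender) * C(q_format) * C(q_type)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))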

Results

When overall mean data are considered, both male and female students performed better on multiple-choice questions and recall questions than on short-answer questions and application questions, respectively.

When overall mean data are considered, male students outperformed female students in both the university entrance and school tests, particularly at the higher score levels. However, when the data were analysed using Rasch analysis, there was no statistically significant difference in performance between males and females of equal ability.

Conclusions

Both male and female students generally perform better on multiple-choice questions than they do on short-answer questions. However, when the questions are matched in terms of difficulty (using Rasch analysis), the differences in performance between multiple-choice and short-answer questions are quite small. Rasch analysis also showed that there was little difference in performance between males and females of equal ability. This study shows that a simple face-value analysis of relative student scores, in this case in chemistry, can be deceptive unless the actual abilities of the students concerned, as measured by a tool such as Rasch analysis, are taken into consideration before any conclusion is reached.

Acknowledgements

The authors wish to acknowledge the assistance of Dr Julia Pallant (University of Melbourne) and Mr Nick Connolly (Australian Council for Educational Research) for providing guidance and assistance in the use and interpretation of the RUMM2030 software and outputs.

Notes

1. The Bonferroni adjustment is calculated against a base probability value of 0.05 as the minimum acceptable value for dependence within a set of test questions that are intended to be unidimensional. The calculation for this set of items is Pr(Bonferroni) = 0.05 / 24 (number of items) = 0.00208 (Pallant 2007).
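For clarity, the adjustment described in the note amounts to dividing the base significance level by the number of items (all values taken from the note above):

\[
\alpha_{\text{Bonferroni}} = \frac{\alpha}{m} = \frac{0.05}{24} \approx 0.00208
\]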
