
Two recent (2003) international surveys of schooling attainments in mathematics: England’s problems

Pages 33-46 | Published online: 29 Jan 2007
 

Abstract

The two recent (2003) international surveys of pupils' attainments were uncoordinated, overlapped considerably, and were costly and wasteful, especially from the point of view of England, where inadequate response rates meant that no reliable comparisons at all could be made with other countries. The surveys were conducted by the OECD (Programme for International Student Assessment, PISA) and by the US‐based International Association for the Evaluation of Educational Achievement (Trends in International Mathematics and Science Study, TIMSS). Sources of the problem are investigated.

Acknowledgements and apologies

This paper has benefited from comments on earlier drafts by Professor G. Howson (Southampton), Professor P. E. Hart (Reading), Professor J. Micklewright (Southampton), Dr Julia Whitburn and many others at the National Institute of Economic and Social Research, London; I am also indebted to the National Institute for the provision of research facilities. Needless to say, I remain solely responsible for all errors and misjudgements.

I take this opportunity also to offer apologies to the individuals who innocently participated in carrying out the underlying inquiries reviewed here; but those who planned those inquiries must fully accept their share of blame for the inadequacies complained of here, and for too often uncritically following what was done in previous inquiries instead of improving on those practices.

Notes

1. Only limited information on costs of these surveys has been released. For England, a total of £0.5m was paid to the international coordinating bodies, but information on locally incurred costs was withheld (in reply to a Parliamentary Question on 7 March 2005) as publication could ‘prejudice commercial interests’ in the government’s negotiating of repeat surveys in 2006–7. It is astonishing that expenditure on further surveys should have been put in hand before there has been adequate opportunity for scientific assessment of the value of the 2003 surveys and of the appropriate frequency of their repetition.

2. The PISA (Programme for International Student Assessment) inquiry of 2003 was organised by OECD and followed their first attempt at this activity in 2000. The report on their first survey was critically reviewed in my article in the Oxford Review of Education, 29(2) (2003); the present paper has benefited from discussion following that earlier paper. The acronym TIMSS was originally short for Third International Mathematics and Science Study; subsequently it became short for Trends in International Mathematics and Science Study. The previous occasion on which it had been carried out was 1999. The larger share of the 2003 co‐ordinating costs (76%) was incurred by PISA, making TIMSS, which covered two age‐groups, the better buy for the British taxpayer.

3. TIMSS, Mathematics Report, p. 351.

4. See, for example, PISA, Annex B, Data Tables, pp. 340 et seq.

5. See my paper with K. Wagner, Schooling Standards in England and Germany: Some summary comparisons bearing on economic performance, in National Institute Economic Review, May 1985 and in Compare: A Journal of Comparative Education, 1986, no 1. More generally, see the series of reprints re‐issued by NIESR in two compendia entitled Productivity, Education and Training (1990 and 1995). Teachers and school inspectors, particularly from the London Borough of Barking and Dagenham, were invaluable in assessing school visits here and abroad.

6. See G. Ruddock et al., 2004, Where England Stands in the Trends in International Mathematics and Science Study (TIMSS) 2003, (NFER), pp. 8–10.

7. I. V. S. Mullis et al., 2004, TIMSS 2003 International Mathematics Report (IEA, Boston), for example, p. 35.

8. I. V. S. Mullis et al., 2004, TIMSS 2003 International Mathematics Report (IEA, Boston), for example, p. 355.

9. I. V. S. Mullis et al., 1997, Mathematics Achievement in the Primary School Years (TIMSS), p. A13.

10. The reader will understand that the gradient of the response‐rate with respect to attainment‐level will be different according to whether it is amongst schools, at the school‐level, or amongst students within schools; but the point is not worth elaboration in view of what is said in the next paragraph.

11. Australia, Hong Kong, Netherlands, Scotland, United States (I.V.S. Mullis et al., 2004, TIMSS 2003 International Mathematics Report (IEA, Boston), p. 359). For the USA a response rate (before replacement) of only 66% was recorded for the primary survey and the same for the TIMSS secondary survey. For England’s secondary survey, the corresponding proportion was a mere 34%!

12. For example, starting from an initial list of schools ordered by geographical area, size, etc., a random starting point is chosen; subsequent schools are then selected at fixed intervals of a given cumulative number of pupils, so that, in effect, schools are sampled with probability proportional to their size. A first reserve list is formed by taking, for each selected school, the school one place above it in the initial list; a second reserve list, by taking the school one place below.
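The systematic probability‐proportional‐to‐size selection with adjacent reserve schools described in this note can be sketched as follows. This is an illustration only, not the surveys' actual software: the function name and data structures are invented, and the frame is assumed to be a list of (school, enrolment) pairs already sorted by stratum.

```python
import random


def pps_systematic_sample(schools, n):
    """Systematic PPS sample of n schools from an ordered frame.

    `schools` is a list of (name, enrolment) pairs, already sorted by
    stratum (area, size, etc.).  A random start is taken within the
    first sampling interval, and selection points are laid down at
    fixed intervals of pupils, so each school's chance of selection
    is proportional to its enrolment.  For each selected school the
    adjacent schools in the frame are kept as first and second
    reserves, as the note describes.
    """
    total = sum(size for _, size in schools)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + k * interval for k in range(n)]

    sample = []
    cumulative, idx = 0, 0
    for p in points:
        # Advance until the cumulative enrolment range covers point p.
        while cumulative + schools[idx][1] <= p:
            cumulative += schools[idx][1]
            idx += 1
        first_reserve = schools[idx - 1] if idx > 0 else None          # one place above
        second_reserve = schools[idx + 1] if idx + 1 < len(schools) else None  # one place below
        sample.append((schools[idx], first_reserve, second_reserve))
    return sample
```

Because the selection points march down the cumulated pupil counts, two neighbouring selections cannot both fall in a small school, while a school larger than the sampling interval is certain to be selected; that is the sense in which the procedure samples "with probability proportional to size".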

13. Quota sampling is used in commercial work, and places greater emphasis on achieving the agreed total of respondents, rather than on their representativeness; it is avoided in scientific work. On the ‘Sampling Referee’, see TIMSS 2003, p. 441. The issue of replacement sampling was questioned in my previous paper on PISA 2000 (Oxford Review of Education, 29(2), pp. 139–163); see also the response by R. J. Adams (Oxford Review of Education, 29(3), pp. 377–389), and my rejoinder to that response (Oxford Review of Education, 30(4), pp. 569–573). The need for representative sampling is so basic to scientific survey procedures that it is astonishing that those responsible for educational surveys, together with the government departments providing taxpayers’ money for such exercises, could accept such an easy‐going (slack) approach to non‐response. But, as it now turns out, this was not the last word—as discussed below in relation to re‐weighting with population weights.

14. G. Ruddock et al., Where England Stands (in TIMSS 2003), National Report for England (NFER, 2004), p. 25. The (previous) view expressed by PISA was very different. ‘A subsequent bias analysis provided no evidence for any significant bias of school‐level performance results but did suggest there was potential non‐response bias at student levels’ (PISA, p. 328, my ital.). To emphasise, this is different from the TIMSS conclusion that it was weaker schools that needed up‐weighting to improve representation (pp. 9, 25).

15. It is difficult to find more than a trace of a reference to this re‐weighting in the international TIMSS report, though it is quite explicit in the English national report; the same average scores for England are published in both reports. The TIMSS Technical Report (ch. 7, by M. Joncas, p. 202, n. 7) sheds the following light: ‘The sampling plan for England included implicit stratification of schools by a measure of school academic performance. Because the school participation rate even after including replacement schools was relatively low (54%), it was decided to apply the school non‐participation adjustment separately for each implicit stratum. Since the measure of academic performance used for stratification was strongly related to average school mathematics and science achievement on TIMSS, this served to reduce the potential for bias introduced by low school participation’. The PISA report does not discuss any such possible improved estimation procedure.
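The stratum‐by‐stratum non‐participation adjustment quoted from the Technical Report amounts to scaling up the weights of participating schools within each implicit stratum so that they carry the weight of that stratum's non‐participants as well. A minimal sketch, with invented field names and no claim to reproduce the actual TIMSS weighting software:

```python
from collections import defaultdict


def adjust_for_nonresponse(schools):
    """Stratum-level school non-response weight adjustment (a sketch).

    `schools` is a list of dicts with keys 'stratum', 'base_weight'
    and 'participated'.  Within each stratum, participating schools'
    base weights are multiplied by the ratio of total eligible weight
    to total participating weight, so that each stratum retains its
    full weight in the national estimate despite non-response.
    """
    eligible = defaultdict(float)
    responding = defaultdict(float)
    for s in schools:
        eligible[s['stratum']] += s['base_weight']
        if s['participated']:
            responding[s['stratum']] += s['base_weight']

    adjusted = []
    for s in schools:
        if s['participated']:
            factor = eligible[s['stratum']] / responding[s['stratum']]
            adjusted.append({**s, 'final_weight': s['base_weight'] * factor})
    return adjusted
```

The point at issue in the text is visible in the arithmetic: if the strata are defined by school academic performance and weaker schools respond less often, adjusting within strata inflates the weight of the responding weaker schools, whereas a single overall adjustment would not.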

16. The above discussion of response rates has been restricted, for the sake of brevity, to the primary school survey. More or less the same applied to both secondary school surveys, as follows. For the TIMSS secondary survey, the participation rate of the 160 sampled schools (before replacements were included) was a pathetic 34% (TIMSS, p. 358); for the PISA inquiry, directed to 450 schools, it was 64% (PISA, p. 327, col. 1). For the US, which deserves special attention because of its greater financial sponsorship, the corresponding secondary school response rates were 66 and 65% (but would their financial contribution have been as great if the true response rates had been published, i.e. after correctly allowing for replacement sampling as explained above?).

The English Department of Education issued Notes of guidance for media‐editors explaining that their ‘failure to persuade enough schools in England to participate occurred despite … various measures including an offer to reimburse schools for their time…’ (National Statistics First Release 47/2004, p. 4, 7 December 2004). Note the term ‘reimburse’; there is no suggestion of motivating a sub‐sample of schools by a substantial net financial incentive.

17. TIMSS, Mathematics Report, p. 266; the same applied also to the primary inquiry at Year 5, p. 267.

18. TIMSS, [International] Technical Report, p. 121 (see also Mathematics Report, p. 349, which is also not very helpful); the English National Report has an Appendix on Sampling (p. 287) but regrettably says nothing on this vital aspect of sampling.

19. Parents in countries with low between‐school variances, we are told, ‘can be confident of high and consistent performance standards across schools in the entire education system’ (PISA, p. 163). ‘Avoiding ability grouping in mathematics classes has an overall positive effect on student performance’ (though it is conceded ‘the effect tends not to be statistically significant at the country level’!) (p. 258). ‘Grade repetition can also be considered as a form of differentiation’ [and therefore to be avoided] (p. 264).

20. PISA (2004), p. 20.

21. See A. P. Carnevale and D. M. Desrochers, The democratization of mathematics, in Quantitative Literacy (Eds B. L. Madison & L. A. Steen, National Council on Education and the Disciplines, Princeton NJ, 2003), esp. p. 24: ‘if the United States is so bad at mathematics and science, how can we be so successful in the new high‐tech global economy? If we are so dumb, why are we so rich?’
