Editorial

Lost in translation? The challenges of educational policy borrowing

Introduction

Few would deny that the intermingling of ideas from different education systems has an important role to play in educational improvement. However, it is reasonable to expect that if we want to draw valid lessons from different countries, we need to ensure that we understand what happens at each step in the transference of a particular policy idea from one context to another. This special issue explores this theme and highlights the complexity surrounding the application of policy decisions in education in different contexts. It also reveals the difficulties that ensue when cultural variations are misunderstood, ignored or underestimated. Taken as a whole, the articles in this collection problematise the complicated relationship between educational research evidence and policy-making, illuminating the dangers that may be inherent in situations where policy formulation becomes more about politics and persuasion than research.

Why, then, do useful lessons from educational systems so often get lost in translation between the origin and destination? The reasons are manifold, and range from naivety of aspirations and a lack of proper theoretical frameworks, through to (some would say) wilful misrepresentation. To understand this, we need to comprehend the complex nature of educational policy-making, how it links to classroom practice and, not least, what is meant by the term policy borrowing.

Policy borrowing, although a phrase that can sometimes take on a pejorative tone, especially when referred to as ‘policy tourism’ (e.g. Oates Citation2015), is not per se a bad thing; it can be very constructive and effective in some circumstances. It would be a short-sighted policy-maker who did not indulge in some horizon scanning and look at other contexts to gain an informed, evaluative perspective on the relationship between policies and educational outcomes. The old adage ‘we should all learn from our mistakes’ may be true, but it is surely wiser to learn from others’ experiences, especially as the socio-economic stakes surrounding education are so high. Applying lessons learnt from other contexts can, and should, be a powerful tool in the field of comparative education and policy-making. Like all tools, though, it is misuse that leads to damage and the fault lies with the users rather than the tool.

There is no ideal solution or blueprint to policy borrowing as, like the policies themselves, the reasons for policy borrowing are highly complex, dynamic and very much embedded in the context within which they exist. This is further complicated by the impetus for educational policy change not always being linked solely to educational reasons and outcomes, but instead being heavily influenced by the strong currents of the surrounding sociopolitical milieu. The introduction of academy schools in England, for example (discussed later), had its roots in the ideological and political, in addition to the educational (Adonis Citation2012). As this issue demonstrates, to be used effectively in policy-making, the evidence on educational performance needs to be correctly and thoroughly interpreted in context. In order to examine the connections between a country’s performance in an international assessment and its curriculum, these elements themselves must be understood within a wider set of societal relationships and norms (e.g. Burdett and Weaving Citation2013).

The problem with complex systems

Education systems are complex. They involve a wide range of players and a diverse range of direct and indirect influencers on the outcomes. They are not only educational systems but sociopolitical systems in their own right. As Kemmis and Heikkinen (Citation2012) rightly characterise them, they are akin to complex ecosystems and, as in ecosystems, transplanting ‘alien’ entities can be very disruptive. This complexity means that establishing causality is difficult. For the policy-maker, it might be comforting to assume that Shanghai’s success in the Programme for International Student Assessment (PISA)Footnote1 is due to a particular aspect of its education policy or a specific teaching approach. However, understanding what is going on is more difficult and, while high assessment scores might be associated with various factors, it is hard to assign causality unambiguously to specific features of the system. There are several reasons for this. First and foremost among these is that education systems are not closed. We cannot say that all learning happens entirely within the classroom and is mediated by school and education policies. The initial problem that we face when trying to understand a policy context is establishing the limits of the system we are investigating. For policy-makers, this is often assumed to be the areas that they have control over: usually, this is the state education system. It should not be forgotten, though, that a large amount of learning happens outside of this, as can be attested to by the importance of a student’s background factors (including the quality of the home learning environment in the early years, mother’s education level, socio-economic status, etc.) in predicting their attainment (Sammons et al. Citation2007; Wheater et al. Citation2014).

This is further complicated by the presence of additional educational systems external to the official systems. Analysis of data collected as part of the Trends in International Mathematics and Science Study (TIMSS)Footnote2 in 1995 suggests, for example, that around 40% of students worldwide are involved in tutoring outside the classroom (Baker and LeTendre Citation2005). Clearly, various forms of ‘shadow education’ (Bray and Lykins Citation2012; Bray Citation2015) can be very significant. For example, in Korea, in 2006, it was estimated that for every one dollar spent by the state, families spent a further 80 cents on their child’s education (Bray and Lykins Citation2012). In such a situation, how feasible is it to ascribe Korea’s success to policy or to private initiative? Can one realistically borrow a policy without also importing the extramural education systems? These shadow education systems can also obscure the way in which the outcomes of a nation’s educational system are achieved. For example, in the case of Korea, the education system might appear very cost-effective and equitable from the state’s perspective (or from the viewpoint of an outside observer only looking at state expenditure) but, for the individual struggling to pay for their child’s tuition, it might seem very different.

There can also be huge variance within these complex and evolving educational systems. Therefore, interpreting and understanding data from these systems may require detailed background knowledge of an educational system’s history and provenance. In England, for example, school types and policies for schools are ‘mushrooming’ and there is nuanced variation within the plethora of terms denoting different types of educational institutions. The ‘academy school programme’ is a case in point. This initiative began in the early 2000s under the UK’s Labour Government, introducing schools in England that are directly funded by and accountable to central government rather than the local authority (House of Commons Library Citation2015). The original programme involved replacing poorly performing inner city secondary schools with an academy, with the aim that new management would increase school performance. The programme has broadened in recent years and all schools can now apply to become academies. The early academies are now known as ‘sponsored’ academies to distinguish them from the newer so-called ‘converter’ academies, but research shows that, even within academy schools, there are differences in attainment between ‘sponsored’ and ‘converter’ adopters (Worth Citation2014). At a macrolevel of data analysis, it is possible to average out these effects. But, for the policy-maker, the detail and understanding of this seemingly slight variance is important – or should be. If a given policy works only for a subset of the system or in specific circumstances, then this is surely vital information that should be taken into account in the policy-making process.

The other issue when dealing with complex systems is that the same inputs are not necessarily implemented in the same way: merely because a policy states a particular way of doing things, it does not automatically follow that this is what happens in the classroom. For example, in the field of early childhood education, researchers in Hong Kong found that when new teaching approaches were introduced as policy, teaching practices did not automatically change as a result (Pearson and Rao Citation2006): due, in part, to avoidance by teachers, and also to the entrenched expectations of parents for their children. Even where policy does lead to change in classroom practice and learning, there can be a considerable time lag between policy implementation and any observable outcomes. Consequently, what is documented policy today might not be the policy that has led to students achieving this year’s assessment results. The students have, after all, in many cases, spent some years in the system and might be influenced by the effects of previous policies. This is especially true of the large-scale international assessment surveys where the publication of results can lag significantly behind rapidly moving educational policy changes. In other words, the outcomes can result from a situation which no longer applies.

It has to be remembered that each nation will have different goals for its education system and, despite increased globalisation and the concerns of some commentators about the homogenisation of education systems (Lingard and Rizvi Citation1998; Gidney Citation2008; Rinne Citation2008; Rizvi and Lingard Citation2009), the aims of education systems differ significantly. They often include a broad set of aims with some specific and measurable outcomes (e.g. levels of literacy and numeracy), as well as more intangible aims such as promoting national identity and inculcating moral and ethical mores. For instance, the Malaysian National Education Philosophy states that education in Malaysia is ‘an on-going effort towards further developing the potential of individuals in a holistic and integrated manner so as to produce individuals who are intellectually, spiritually, emotionally and physically balanced and harmonious, based on a firm belief in and devotion to God’ (Ministry of Education Malaysia Citation2016, 1). The French education system, by contrast, has a long history of specifically stated secular educational aims starting with the Jules Ferry Education Act of 1881 (Brickman Citation1981). Such differences in the aims of education need to be taken into account by those making value judgements: if educational systems have varied aims, on what grounds should we evaluate whether a system or policy is good or bad? How do we know if a policy is successful?

International large-scale assessment data and policy borrowing

Given the complexities inherent in measuring the success of an education policy, it is desirable to take the debate beyond the panegyrics of current policies or the denigration of previous ones. To allow this to happen, there needs to be some means of interpreting the data to make it accessible to policy-makers so that they can make rationally informed decisions. There is, though, a real issue in creating a valid unidimensional construct from the multiscalar complex of policy issues; not least, how to weight the various dimensions. There is a very real tension between the desire to condense rich data down to a simple statement that encapsulates the complexity in a readily digestible format, and the validity of that interpretation. But for policy-makers, the phrase ‘well, that depends’ is not often one that will help them reach a decision and so is not particularly welcomed. One of the possible solutions is to delineate, purposely, what will be considered, to restrict the scope and focus solely to what is measurable, although this carries its own dangers. Converting educational outcomes to a limited set of constructs is a perfectly valid approach in certain situations and can lend useful clarity to the interpretation of the data, but it is one that may limit interpretation.

A theme running through the articles in this special issue is the dangers of a simplistic approach to categorising education systems and policies. There have been many challenges to the validityFootnote3 and use of International Large-Scale Assessments (ILSAs), but their rise as an increasingly influential mediator and rationalisation for policy borrowing seems inexorable. In their paper in this issue, On the Supranational Spell of PISA in Policy, Baird et al. ascribe the rise in popularity of ILSAs in part to a ‘belief in the economic imperative’ that means jurisdictions are compelled to compete globally, with their citizens’ education seen as causal to their economic wellbeing. The challenge for researchers is, therefore, in how to convey the nuances and richness of ILSA data in a format that allows policy-makers to make properly informed decisions for which the public can hold them accountable. This will not, however, prevent inappropriate policy borrowing; researchers will still need to characterise the context and confounding variables that mediate policy outcomes. This has been and will continue to be a challenge, due to the complex nature of the systems under study. Simple, easily communicated policies have an intrinsic power and logic that can be hard to counteract even if misplaced. The use and abuse of ILSA data is clearly an area of some controversy in education and several other articles in this special issue draw attention to these debates. In International Large-Scale Assessments: What Uses, What Consequences? Johansson focuses on the consequential aspects of validity, analysing claims in the literature that the use of large-scale assessments can lead to a range of unintended consequences. In evaluating the arguments, he observes that the valuable research resource afforded by the ILSAs can be easily overlooked.

Certainly, the ILSAs present a very valuable data source but the focus for policy-makers is often on only one dimension – an international ranking. However, this ranking by itself is almost a meaningless construct, as it depends upon an artificial collapsing of a complex set of constructs into a single dimension. For example, the OECD defines the ‘single dimension’ of reading literacy as being made up of

a wide range of cognitive competencies, from basic decoding, to knowledge of words, grammar and larger linguistic and textual structures and features, to knowledge about the world. It also includes metacognitive competencies: the awareness of and ability to use a variety of appropriate strategies when processing texts. Metacognitive competencies are activated when readers think about, monitor and adjust their reading activity for a particular goal. (OECD Citation2013, 9)

In addition, measurement errors often mean that it is only possible to say that countries perform similarly, rather than that they have a definitive ranking (even though it is often misleadingly reported as the latter). Unfortunately, this simplistic interpretation offers a very seductive path. As Kamens (Citation2013) characterises it: ‘world league tables are now a thriving enterprise as the frenzy of evaluation spreads’ (Kamens Citation2013, 118). This need to interpret the data poses a significant problem for research-based policy formulation; as Baird et al. phrase it in this special issue, ‘The supranational spell of PISA in policy draws a veil over the fact that with data you are just another person with an opinion. PISA data are not unambiguous fact; all data must be interpreted’. There needs to be proper interpretation of the evidence and valid use of the outcomes.

Different types of policy borrowing

Policy borrowing is a broad term and one that does not always involve the wholesale uprooting of a policy and its transport to an ‘alien soil’. In their paper in this special issue, Referencing and Borrowing from Other Systems: The Hong Kong Education Reforms, Forestier et al. highlight developments in Hong Kong, where a broader, collaborative and seemingly more research-driven learning approach is argued to inform policy. The authors distinguish between policy borrowing (Phillips and Ochs Citation2003), policy learning (Raffe and Spours Citation2007) and policy referencing (Steiner-Khamsi Citation2002). The difference between policy learning, policy referencing and policy borrowing is that policy learning and referencing imply that no actual policy or practice is transferred, but rather that the data inform local solutions. Such distinctions are also discussed in Harris et al. in this special issue. In their comparative analysis of preparation and development programmes across a range of different educational systems, Qualified to Lead? A Comparative, Contextual and Cultural View of Educational Policy Borrowing, the authors conclude that the adoption of ‘design principles’ that lie behind successful interventions is preferable to the borrowing of policies in their entirety.

Disconnects

One of the big challenges in connecting research and policy formation is that the worlds of education, research and policy-makers only intersect at specific loci and run to different motivations and timescales. Even within these groups, there is also a great deal of variation. Among policy-makers, for example, the civil servants in the finance ministry might be looking to the education system to deliver work-related skills improvements in the medium term, while the minister of education might be under pressure to deliver voter-friendly policies immediately. Researchers, rightly, would want to take a more longitudinal approach to evidence gathering and analysis, but this often fails to engage with policy-makers, who need to balance a wide range of competing factors and are often working to short-term deadlines. These more mundane factors and the ‘realpolitik’ of policy formation can often ‘bleed back’ into the research arena and lead to further disconnects between policy and the evidence base on which it should draw. In other words, political imperatives can end up directing or driving research rather than the other way around. To be effective, evidence should inform policy, rather than evidence being selected to justify policy decisions already made. This has been characterised as research becoming ‘a policy production on a number of different levels: a carefully scripted, directed and managed staging for the purpose of producing particular policy outcomes’ (Rappleye Citation2012, 122).

In addition to being able to direct the research agenda, policy-makers are also increasingly aware that they do not need to act on the outcomes of commissioned research and that they have a certain amount of control over how the findings are disseminated and used. As Kamens (Citation2013) observes: ‘Ministries and politicians can thus support such research where it is convenient to do so, and decouple the activity from actually altering the structures of schools or pedagogical practices where such efforts would cause political problems’ (128). In the sphere of educational comparisons, the large number of suitable policy donor countries can easily lead to a ‘cherry-picking’ of the evidence to align with existing policy in the country. As the articles in this special issue suggest, a deep disconnect between the original research data and newly developed policy may be the result.

Although policy-makers may have a genuine desire to use research evidence to inform policy, there are often temporal disconnects between policy formation and changes in practice that can make the evaluation of policy success difficult. In ‘Educare’ in Australia: Analysing Policy Mobility and Transformation in this special issue, McShane explores this relationship between policy and practice with an example of early childhood education and care. Using an illustrative case study, the paper demonstrates how policies outside the educational arena serve to shape and influence the implementation of educational policy.

We have seen earlier how there can be difficulties in understanding exactly what the policy was at a given time, and also how slow policy can be to have an impact on practice. In connection with this, the evidence suggests that the impacts of contextual factors on the implementation of new policies can easily be underestimated. These factors can also lead to policy-makers citing evidence and policy that have subsequently been challenged in their place of origin, or applying policy whose effectiveness is only partially understood at the time of acquisition. In this special issue, Tan highlights the case of child-centred education in Tensions and Challenges in China’s Education Policy Borrowing. The paper analyses the interplay between borrowed policy ideas and local ideologies and practices. Writing about a very different cultural context in The (Mis)use of the Finnish Teacher Education Model: ‘Policy-based Evidence-Making’?, Chung emphasises the importance of understanding the context in which the Finnish teacher education system is situated. Both studies represent significant examples of where the policy transfer situation has proven to be more complex and elusive than an initial interpretation would suggest.

These discontinuities might, individually, be small or subtle, but the end result can be a situation where the policy being borrowed is almost unrecognisable from the policy lender’s perspective. This is further compounded by the numerous contextual factors that influence the framing and implementation of borrowed policy. This can lead to well-travelled innovations, such as school-based assessment (SBA) (Gardner Citation2006; Fok et al. Citation2007), taking on myriad new forms when diffracted through the lens of existing expectations and culture. This deviation from the intended purposes of the original, borrowed policy can result in the failure of the ‘new’ policy (Kellaghan and Greaney Citation2004; Bajunid Citation2008).

School-based assessment had a very clear and evidence-based rationale in its original incarnation (Wiliam et al. Citation2004), but the borrowed policy variants often only share with the original policy a common name and the fact that the assessment is classroom-based. Much of the careful research that influenced the original development of SBA, and the intrinsic lessons painstakingly learnt about SBA, have become lost in the exigencies of implementing SBA in classrooms that may not have the infrastructure, the accountability systems or the cultural surround necessary for its successful implementation (Kellaghan and Greaney Citation2004; Bajunid Citation2008). Rather than adapt the policy to match the local context in a manner that remains true to the original intentions, the purposes for introducing the policy often get lost in the day-to-day immediacy of ensuring that the policy is rolled out. Policy implementers, at all levels in the system, are often under a great deal of pressure, and as illustrated in the case of SBA, it is often the most expedient, rather than the best, solution that is hastily patched into place, leading to the potential longer-term collapse and discrediting of the policy in its new context. As many of the papers in this special issue illustrate, our understanding of the policy borrowing process needs further development to comprehend, fully, the interplays and ramifications within the borrower and lender systems.

These factors all serve to highlight that policy borrowing is a highly complex undertaking if it is to be pursued diligently and successfully. It is easy to see why policy-makers may prefer a simpler ‘short-cut’, even if this does not guarantee that they will reach the intended destination.

Conclusion

A recurring theme in this collection of papers is the difficulties that arise when policy borrowing takes place with no apparent thought to the lending or borrowing context. An apt analogy would perhaps be to the large-scale hydroelectric projects of the 1970s and 1980s, where the promise of free electricity and examples of successful large-scale hydroelectric generation led to a rash of dams being built in the developing world with little thought to local social or environmental factors. This lack of planning for local conditions often meant that the dams failed to provide the projected returns, due to silting and other unforeseen problems (Deudney Citation1981; World Commission on Dams Citation2000). There was also often significant local social and political outcry, as the dams created some major local problems but delivered only minimal local improvements. In response, the World Bank introduced a moratorium on funding for dam projects. As with many examples of educational policy borrowing, international players or central government imposed a solution of often dubious benefit, which the local systems then had to accommodate to their own needs and customs, at their own cost.

The papers in the special issue shed a fascinating light on the complexity of policy borrowing and indicate the potential benefits and successes that can result from taking proper account of contextual differences. The examples go beyond the simple narrative of Western nations looking to the ‘rich nations club’ (Kamens Citation2013, 124), or the so-called ‘tiger’ economies, to provide a blueprint for the developing nations. They show how policy borrowing is a bilateral process, with the Pacific Rim high achievers studying the Western nations with equal scrutiny. This special issue highlights some of the successes and failures; the benefits and dangers; the nuances and challenges; the good, the bad and the downright ugly when it comes to policy borrowing. The papers show how policy learning, undertaken with rigour and with the data appropriately and fully interpreted, can be a powerful tool. When embarked upon without integrity, though, it seems that the borrowing of policies from elsewhere may constitute a thin veneer of legitimacy for a policy approach already decided, or an expedient means of making policy in the absence of a theoretical framework.

Newman Burdett
Independent Consultant
[email protected]

O’Donnell
National Foundation for Educational Research
[email protected]

Notes

3. That is, that conclusions are drawn logically and are factually sound, based on what is being tested.

References

  • Adonis, A. 2012. Education, Education, Education: Reforming England’s Schools. London: Biteback.
  • Bajunid, I. A. 2008. Malaysia, from Traditional to Smart Schools: The Malaysian Educational Odyssey. Oxford: Fajar.
  • Baker, D. P., and G. K. LeTendre. 2005. National Differences, Global Similarities: World Culture and the Future of Schooling. Stanford, CA: Stanford University Press.
  • Bray, M. 2015. “Exacerbating or Reducing Disparities? The Global Expansion of Shadow Education and Implications for the Teaching Profession.” Keynote address at the International Council on Education for Teaching (ICET), 59th World Assembly, ‘Challenging Disparities in Education’, Japan, Naruto University of Education, June 29.
  • Bray, M., and C. Lykins. 2012. Shadow Education: Private Supplementary Tutoring and Its Implications for Policy Makers in Asia. Mandaluyong: Asian Development Bank [online]. Accessed March 7, 2016. http://adb.org/sites/default/files/pub/2012/shadow-education.pdf
  • Brickman, W. W. 1981. “The Ferry Law of 1881: The Fundamental Law of French Primary Education.” Western European Education 13 (3): 3–5.
  • Burdett, N., and H. Weaving. 2013. Science Education – Have we Overlooked What we are Good at? (NFER Thinks: What the Evidence Tells Us). Slough: NFER [online]. Accessed March 7, 2016. www.nfer.ac.uk/publications/99935/99935.pdf
  • Deudney, D. 1981. Rivers of Energy: The Hydropower Potential (Worldwatch Paper No. 44). Washington, DC: Worldwatch Institute.
  • Fok, P., K. J. Kennedy, J. Chan, and F. W. Yu. 2007. Integrating Assessment of Learning and Assessment for Learning in Hong Kong Public Examinations: Rationales and Realities of Introducing School-based Assessment [online]. Accessed February 12, 2016. http://www.iaea.info/documents/paper_1162a1b7ea.pdf
  • Gardner, J., ed. 2006. Assessment and Learning. London: Sage.
  • Gidney, J. 2008. “Beyond Homogenisation of Global Education.” In Alternative Educational Futures: Pedagogies for an Emergent World, edited by M. Bussey, S. Inayatullah, and I. Milojevic, 253–268. Rotterdam: Sense.
  • House of Commons Library. 2015. Academies under the Labour Government. London: House of Commons Library [online]. Accessed March 7, 2016. http://dera.ioe.ac.uk/22717/1/SN05544.pdf
  • Kamens, D. H. 2013. “Globalization and the Emergence of an Audit Culture: PISA and the Search for ‘Best Practices’ and Magic Bullets.” In PISA, Power and Policy: The Emergence of Global Educational Governance, edited by H.-D. Meyer and A. Benavot, 117–139. Oxford: Symposium Books.
  • Kellaghan, T., and V. Greaney. 2004. Assessing Student Learning in Africa. New York: World Bank.
  • Kemmis, S., and H. L. T. Heikkinen. 2012. “Future Perspectives: Peer-group Mentoring and International Practices for Teacher Development.” In Peer-group Mentoring for Teacher Development, edited by H. L. T. Heikkinen, H. Jokinen, and P. Tynjälä, 144–170. London: Routledge.
  • Lingard, B., and F. Rizvi. 1998. “Globalisation and the Fear of Homogenisation in Education.” Change: Transformation in Education 1 (1): 62–71.
  • Ministry of Education Malaysia. 2016. National Education Philosophy [online]. Accessed February 12, 2016. http://www.moe.gov.my/en/falsafah-pendidikan-kebangsaan
  • Oates, T. 2015. Finnish Fairy Stories. Cambridge: Cambridge Assessment [online]. Accessed March 7, 2016. http://www.cambridgeassessment.org.uk/Images/207376-finnish-fairy-stories-tim-oates.pdf
  • OECD. 2013. PISA 2015: Draft Reading Literacy Framework [online]. Accessed March 7, 2016. http://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Reading%20Framework%20.pdf
  • Pearson, E., and N. Rao. 2006. “Early Childhood Education Policy Reform in Hong Kong: Challenges in Effecting Change in Practices.” Childhood Education 82 (6): 363–368.
  • Phillips, D., and K. Ochs. 2003. “Processes of Policy Borrowing in Education: Some Explanatory and Analytical Devices.” Comparative Education 39 (4): 451–461. doi:10.1080/0305006032000162020
  • Raffe, D., and K. Spours. 2007. “Policy Learning in 14–19 Education: From Accusation to an Agenda for Improvement.” In Policy-making and Policy Learning in 14–19 Education (Bedford Way Papers), edited by D. Raffe and K. Spours, 209–330. London: Institute of Education.
  • Rappleye, J. 2012. “Reimagining Attraction and ‘Borrowing’ in Education. Introducing a Political Production Model.” In World Yearbook of Education 2012: Policy Borrowing and Lending in Education, edited by G. Steiner-Khamsi and F. Waldow, 121–147. London: Routledge.
  • Rinne, R. 2008. “The Growing Supranational Impacts of the OECD and the EU on National Educational Policies, and the Case of Finland.” Policy Futures in Education 6: 665–680. doi:10.2304/pfie
  • Rizvi, F., and B. Lingard. 2009. Globalizing Education Policy. Oxford: Routledge.
  • Sammons, P., K. Sylva, E. Melhuish, I. Siraj-Blatchford, B. Taggart, Y. Grabbe, and S. Barreau. 2007. Effective Pre-school and Primary Education 3–11 Project (EPPE 3–11). Summary Report Influences on Children’s Attainment and Progress in Key Stage 2: Cognitive Outcomes in Year 5. London: Department for Education and Skills [online]. Accessed March 7, 2016. http://webarchive.nationalarchives.gov.uk/20130401151715/http://www.education.gov.uk/publications/eOrderingDownload/RR828.pdf
  • Steiner-Khamsi, G. 2002. “Re-framing Educational Borrowing as a Policy Strategy.” In Internationalisierung – Internationalisation, edited by M. Caruso and H.-E. Tenorth. Frankfurt: Lang [online]. Accessed February 17, 2016. http://www.tc.columbia.edu/faculty/steiner-khamsi/_publications/Gitas%20Professional%20Files/Chapters%20in%20edited%20volumes/Reframing2002.pdf
  • Wheater, R., R. Ager, B. Burge, and J. Sizmur. 2014. Achievement of 15-Year-Olds in England: PISA 2012 National Report (OECD Programme for International Student Assessment). December 2013 – Revised April 2014 [online]. Accessed February 11, 2016. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/299658/programme-for-international-student-assessment-pisa-2012-national-report-for-england.pdf
  • Wiliam, D., C. Lee, C. Harrison, and P. Black. 2004. “Teachers Developing Assessment for Learning: Impact on Student Achievement.” Assessment in Education: Principles, Policy and Practice 11 (1): 49–65. doi:10.1080/0969594042000208994
  • World Commission on Dams. 2000. Dams and Development: A New Framework for Decision-making. London: Earthscan [online]. Accessed February 12, 2016. http://www.unep.org/dams/WCD/report/WCD_DAMS%20report.pdf
  • Worth, J. 2014. Analysis of Academy School Performance in GCSEs 2013. Slough: NFER.
