The clinical skills assessment for international medical graduates in The Netherlands

Pages e533-e538 | Published online: 12 Nov 2009

Abstract

Aim: To improve the quality of the admission and licensing procedures for international medical graduates in The Netherlands.

Method: A clinical skills assessment was designed as part of a new procedure to realize a high-stakes, fair, transparent, and time-limited path of admission for international medical graduates to the Dutch health care system. Additionally, the assessment should provide well-founded advice on the length and content of additional medical training, should its outcome indicate the need for it.

Results: The clinical skills assessment was developed as a Dutch variant of the “Step 2 Clinical Skills” examination of the Educational Commission for Foreign Medical Graduates (ECFMG), run in collaboration with the United States National Board of Medical Examiners, an examination with well-documented validity and reliability. Experience with the new procedure is still limited, but sufficient to warrant a report.

Discussion: Worldwide, a number of countries have developed such high-stakes assessment procedures, but these show little uniformity and transparency. By describing the design and development of our procedure, we do not claim to set a standard, but we hope to contribute to fairer, more accurate and more uniform approaches for doctors moving from one country to another.

Introduction

Many western countries face considerable numbers of foreign doctors seeking certification as independent health care providers. Some countries partly rely on these international medical graduates (IMGs) to alleviate the shortage of locally trained doctors. The influx of foreign doctors into the national health care system, however, is often troubled. The level of competence and education does not always meet national standards. IMGs usually pass one or more tests before being granted certification and the right to work unsupervised. However, only a few countries have developed uniform and transparent tests. Assessment procedures are considered long and expensive, and so is additional medical training. Many IMGs therefore seek other employment, usually below their actual potential. In this way, much of the investment in their education and training, both economic and personal, is lost (Spike Citation2006; Srivastava Citation2008).

The situation in The Netherlands was, until the late nineties, also far from ideal. The Dutch assessment procedure consisted of verification of the diploma and, when the diploma was judged not equivalent to the Dutch medical licensing diploma, additional training in a University Medical Centre (UMC) was prescribed. The length and content of this obligatory training period were determined by the UMC the candidate was allocated to, and usually no assessment of the applicant's actual medical competence was carried out. Between 1999 and 2002 the number of IMGs entering The Netherlands and applying for recognition of their diplomas increased significantly, from 180 to 400 per year, and medical schools as well as the government grew increasingly dissatisfied with the procedure (CIBG-Brochure Citation2004).

The time had come to develop a new procedure. This new procedure was to provide equal treatment for all medical graduates from outside the European Economic Area (EEA; the European Union (EU) member states plus Norway, Iceland and Liechtenstein), as within the EEA there is an open labour market and assessment of professional competence cannot be enforced. The new procedure should lead to a shorter track to employment in the Dutch health care system and should also provide justified and well-founded advice for additional medical training whenever the assessment showed the need for it, taking the applicants' professional background and postgraduate training into account (Splinter et al. Citation2003).

Such a high-stakes assessment procedure would probably not only serve as a screening of incoming doctors’ medical competence, but would most likely also help to improve their ability, as candidates will probably prepare seriously to pass these examinations (Newble & Jaeger Citation1984; Wilkinson et al. 2007).

The USA and Canada have developed robust and validated procedures for the assessment of (foreign) doctors on a national level (Cohen et al. Citation1988; Conn & Cody Citation1989; Sutnick et al. Citation1994; Friedman Ben-David et al. Citation1999). Specifically, the USA procedures, carried out by the Educational Commission for Foreign Medical Graduates (ECFMG), served as the model for the design of procedures that could be used in The Netherlands. The ECFMG has long-standing and well-known experience with the assessment of foreign medical graduates. The validity of the ECFMG assessment procedure is reflected in the predictive properties of its tests for the candidates' future practice (Chambers et al. Citation2000; Boulet et al. Citation2002; Austin et al. Citation2003; Papadakis Citation2004).

The Dutch assessment procedure was designed after careful study of the ECFMG procedures. The reason to match the ECFMG procedures was to contribute to an international standard and to enhance the acceptance of the procedure. This is of particular importance because the EU has an open labour market: a medical diploma recognized by one member state is, in most cases, automatically recognized by all other member states under European regulations. By describing the development of the clinical skills assessment, we hope to encourage other countries to follow. With the formation of a multi-national community of interest around fairness in the testing of IMGs, there will hopefully be more transparency in this field.

The Dutch assessment of medical competence of foreign medical graduates

The new Dutch assessment of medical competence of foreign medical graduates (DAMCFG) procedure consists of (a) a portfolio, in which education and work experience are explained, (b) tests of general skills necessary to work in a Dutch health care environment, including Dutch medical language proficiency, knowledge of the organization of Dutch health care and English reading proficiency and (c) a series of assessments of medical competence, including a computer-based assessment of knowledge of the basic and clinical sciences and a hands-on assessment of clinical skills (Table 1).

Table 1.  Steps in the admission of foreign medical graduates

The general skills tests have been developed by the University Medical Centre of Groningen and are administered at the James Boswell Institute in Utrecht. The computer-based assessments have been developed and are held in Maastricht. Depending on the number of applicants, these tests can be held every month. The clinical skills assessment procedure has been developed by the University Medical Centre of Nijmegen and is the focus of this article. It takes place at least once every 3 months in the clinical skills centre in Nijmegen. Candidates must first pass the general skills tests (Step IIA) before they are allowed to proceed to the clinical tests (Step IIB).

Clinical skills assessment: Development of a blueprint

To assess clinical and communication skills, an Objective Structured Clinical Examination (OSCE) was developed in analogy with the Step 2 Clinical Skills examination of the ECFMG, which has been shown to be valid and reliable. The OSCE consists of 10 stations, nine of which involve standardized patients. Candidates are asked to take a history and perform a physical examination as they would in real practice. In the tenth station, mastery of specific procedures must be demonstrated. The time per station is 20 min, after which the candidate has 10 min to formulate a problem list and a differential diagnosis and to select additional diagnostic procedures and therapy.

A structured procedure was carried out to establish the content validity of the OSCE. First, two established sources were consulted to generate a list of domains representing organ systems, disease categories and elements of professional behaviour. One is the so-called Inter-university Progress Test, originally designed at Maastricht University, now used at five Dutch medical schools and intended to cover the complete range of relevant objectives of undergraduate medical training (Verhoeven et al. Citation2002). The second source was the United States Medical Licensing Examination Step 2 Clinical Knowledge test, as used in ECFMG procedures (Swanson et al. Citation1987; Gary et al. Citation1997; Hallock & Kostis Citation2006; USMLE Clinical Knowledge Test Citation2008). This yielded a list of 20 domains (Table 2).

Table 2.  List of the 20 domains of the clinical skills assessment blueprint

The next step was to constitute an expert panel to rank these domains. Thirty-three practicing physicians from 20 different disciplines, involved in the training of medical students and residents in academic as well as affiliated centres, were selected. These physicians were well equipped to judge the importance of clinical problems for the new assessment because of their experience with the competency level of newly graduated doctors in daily practice. All physicians were invited to select, from the list of 20 domains, the 10 most important ones. In addition, they were asked to identify 10 patient problems or conditions from their own discipline that could be used as material to build cases. This generated a list of problems and conditions that was cross-validated against the so-called Dutch Blueprint of Objectives of undergraduate medical training. This document describes all requirements that graduated doctors should meet in terms of knowledge, skills and attitude, and includes a list of clinical problems. The blueprint was established by the Royal Dutch Medical Association together with organizations of clinical physicians and the University Medical Centers; it was given legal force in The Netherlands in 1994 and revised in 2001 (Schade & Sminia Citation1995; Blueprint of objectives of undergraduate medical education in the Netherlands 2001; Metz et al. Citation2001).

Of the 20 domains on the list, 18 were identified as important to test, although some were rated as more important than others. This information was used to weight the domains: all 18 are tested in the clinical skills assessment, but some more often than others. The 18 domains were linked to medical disciplines and arranged in a multidimensional matrix. This matrix contains the general competencies to be tested, the disciplines in which these have to be demonstrated, and the specific problems or conditions to be assessed within those disciplines. The clinical skills blueprint has since been used to select appropriate case samples for subsequent assessments. Given the high-stakes nature of the OSCE and the need to guard its content, we cannot provide more detailed information about the blueprint here.
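The weighting described above can be sketched as a weighted draw of domains for the nine patient-based stations. This is only a minimal illustration of the idea, not the actual selection procedure: the real blueprint is confidential, so the domain names and weights below are hypothetical.

```python
import random

# Hypothetical domain weights (the real 18-domain blueprint is secured);
# a higher weight means the domain appears more often across assessments.
domain_weights = {
    "cardiovascular": 3,
    "respiratory": 3,
    "gastrointestinal": 2,
    "neurology": 2,
    "endocrine": 1,
    "musculoskeletal": 1,
}

def sample_stations(weights, n_stations, seed=None):
    """Draw n_stations domains, with repetition proportional to weight."""
    rng = random.Random(seed)
    domains = list(weights)
    w = [weights[d] for d in domains]
    return rng.choices(domains, weights=w, k=n_stations)

# Nine standardized-patient stations; the tenth station tests procedures.
stations = sample_stations(domain_weights, n_stations=9, seed=1)
print(stations)
```

In practice, a blueprint-driven test assembly would also enforce coverage constraints (e.g. each highly weighted domain appearing at least once); a plain weighted draw is the simplest starting point.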

Constructing stations and training standardized patients

After designing the OSCE blueprint, a series of actual cases was constructed. Each case consisted of (a) the objectives of the assessment, (b) the patient characteristics and (c) the setting (e.g. family practice, emergency department). Next, the patient's history was meticulously described. Not only the relevant elements of the chief complaint, but all normal and abnormal findings within the various systems were included. The pieces of information to be presented spontaneously, as well as information to be revealed only in answer to specific questions, were defined. Expert clinical faculty members created, reviewed and repeatedly revised the cases. Finally, checklists and rating scales were constructed by clinical and educational experts, based on checklists and rating scales from the literature and examples from the UMC Nijmegen (Jacobs et al. Citation2004). For each case, separate checklists were developed to score history taking, physical examination and professional behaviour. To improve the quality of the items, the checklists were revised several times by various experts; the formulation of the items in particular was the subject of extensive discussion. The standardized patients score the performance of the candidate using the checklists. Every encounter is videotaped, and a random sample of the recordings is scored by clinicians. The written information (problem list, differential diagnosis and additional diagnostic and therapeutic procedures) is scored by a clinician using a purpose-built rating scale.

After the case material and checklists had been described in detail, the next phase was the training of the standardized patients. The approximate training time per standardized patient was 25 h per case. This included, among other things, reading the case together, role playing, and education and training in observing the physical examination. Various practice sessions were organized with medical students. These sessions were videotaped, and the recordings were reviewed by the standardized patients as well as by their trainers. During training, emphasis was put on providing the correct information at the right moment, to ensure the standardization of the standardized patients’ performance. Additional training in observing and rating candidates followed, mostly using the videotapes and new practice sessions. After this training, high levels of inter-rater agreement were reached (>90%) (Pelgrim et al. Citation2008), and formal assessment sessions with a reference group were then organized. The reference groups consisted of recently graduated medical students from the UMCs of Nijmegen, Maastricht and Utrecht, with a maximum of 6 months of clinical experience after graduation.
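The inter-rater agreement reported above can, in its simplest form, be computed as percent agreement on binary checklist items. The sketch below assumes that interpretation (the cited abstract may use a different statistic), and the scores shown are illustrative, not real data.

```python
def percent_agreement(rater_a, rater_b):
    """Simple percent agreement between two raters' checklist scores."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# Illustrative checklist scores (1 = item performed, 0 = not performed)
sp_scores        = [1, 1, 0, 1, 1, 0, 1, 1, 1, 1]  # standardized patient
clinician_scores = [1, 1, 0, 1, 1, 1, 1, 1, 1, 1]  # clinician reviewing tape
print(percent_agreement(sp_scores, clinician_scores))  # 90.0
```

Percent agreement does not correct for chance; chance-corrected coefficients such as Cohen's kappa are often reported alongside it for this kind of data.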

Establishing well-founded advice per IMG

The station scores of the clinical skills assessment are derived from the checklists, with total test scores being an aggregate of the individual station scores. No definitive criterion for evaluating candidates’ test scores has yet been set, because insight into the actual difficulty of the test is still lacking. Until sufficient information has been collected to develop performance criteria, the scores of the IMGs are compared with those of a reference group. Scores more than one standard deviation below the mean of this reference group are considered to indicate deficiencies in the candidate's skills. The clinical knowledge test scores are evaluated in the same way.

The results of the clinical assessment and the information the IMGs provide through their portfolio are weighed by a committee established by the Ministry of Health Care. All members except the chairman are senior faculty from the eight Dutch University Medical Centers, selected by their respective institutions for their clinical and educational expertise. The chairman is a representative of the Ministry of Health Care, a physician with a long-standing career in the legislation and regulations concerning the certification of IMGs. The committee advises the Minister of Health Care on whether a candidate needs additional training and, if so, how long this training should be. The committee communicates each decision of the Minister of Health Care to the IMG.

First results

Due to political pressure and adjusted regulations, the influx of foreigners into the Netherlands, in particular asylum seekers, decreased markedly in the first decade of the twenty-first century. Consequently, the number of IMGs applying for the new assessment procedure also decreased rather unexpectedly, leaving only small numbers available for our first evaluation. We nevertheless decided that communicating our procedure is worthwhile at this moment, and we will report more extensively in the future. After extensive preparation, the new assessment procedure was launched in December 2005. Since then, a total of 200 IMGs have entered the assessment procedure. The first step, the general skills tests, was failed by 161 candidates, mainly because of insufficient mastery of the Dutch language in speaking, reading and writing. As a consequence, only 39 candidates took the assessment of medical competence (Step IIB, see Table 1) between April 2006 and October 2008.

One IMG was granted a license without further demands after a successful assessment. The results of nine candidates were considered equal to those of the reference groups, and they received their license after a 12-week period of supervised work in health care, which was used to assess their professional behaviour in daily practice. The remaining 29 candidates were all told that additional training at a University Medical Centre was necessary to acquire their license. Two candidates had to take half a year of additional medical training, nine had to take 1 year, 14 had to take 2 years, and one had to take 3 years. One candidate had to take 6 years of additional training, equivalent to a complete medical curriculum.

Discussion

A new high-stakes, transparent assessment procedure for the certification of foreign medical graduates in the Netherlands has been developed and introduced. Our description of the development of the clinical assessment procedure provides an example of how such an assessment can be conceived and put into practice.

The new Dutch assessment procedure was developed as a means to identify those IMGs who do not meet the standards of medical practice in the Netherlands. The DAMCFG consists of a series of assessments of different elements of medical competence. The basic and clinical knowledge domains are covered by a computer-based assessment. Clinical skills and professional behaviour are assessed through a 10-station OSCE. However, performance in real practice is not just the sum of knowledge, skills and attitude, but also depends on other factors, in particular time and efficiency (Rethans et al. Citation1991). In the 1970s, Senior and Lloyd already distinguished between the competence and performance of physicians. They defined competence as what a doctor is capable of doing and performance as what he or she actually does in day-to-day practice (Senior Citation1976; Lloyd Citation1979). Naturally, what we really want to know is the level of performance of the IMGs when working in the Netherlands. However, before candidates are allowed to take care of actual patients, those whose competency falls below a minimal threshold should be detected. With this goal in mind, a valid and reliable clinical skills test had to be developed. The most important step in developing such an assessment is the construction of a clinically relevant blueprint, reflecting as closely as possible the actual clinical tasks and problems that candidates must be able to cope with according to national standards. Various studies have shown that the reliability of measurements of clinical competence is hindered by the fact that competence is content specific: good performance in one area is a very poor predictor of performance in another, even when the two are closely related. Wide sampling across problems is required if an adequate level of content validity and reliability is to be achieved (Swanson et al. Citation1987; Van der Vleuten et al. Citation1991; Newble et al. Citation1994; Newble Citation2004; Hallock & Kostis Citation2006). This requires many cases and a very long testing time. The simplest way to reduce the practical difficulties raised by case specificity is to combine the OSCE with other test formats that sample content more efficiently, for example multiple choice questions. When all test components are based on the same blueprint, this seems a justifiable approach. Within the DAMCFG, we follow this approach by combining the OSCE with a multiple choice test and an open-question clinical cases test.

There is a large body of evidence indicating that the use of standardized patients in high-stake assessments yields reliable results (Colliver et al. Citation1989; Van der Vleuten & Swanson Citation1990; Boulet et al. Citation1998; Williams et al. Citation1999; Chambers et al. Citation2000; Whelan et al. Citation2005).

Research on the USMLE clinical skills examinations, on which the DAMCFG test procedure is based, demonstrates that candidates' test scores correlate with their subsequent practice patterns (Chambers et al. Citation2000; Boulet et al. Citation2002). Quebec Licensing Examination scores, which include a clinical skills examination, were shown to predict the future practice patterns of physicians. For example, in contrast to high performers, lower performers tended to prescribe more symptomatic and contra-indicated drugs (Norcini et al. Citation2002; Tamblyn et al. Citation2002). In another study, IMGs who had passed the clinical skills assessment of the ECFMG outperformed those United States medical graduates who had not passed such a test, showing higher levels of physical examination skills (Boulet et al. Citation2002). External validity is of the greatest importance for licensing examinations. It is not only a quality measure for the test, but also the only way to predict the future performance of candidates. By studying the future careers of the IMGs in the Netherlands, we are trying to determine the predictive value of our assessment.

With this new assessment, justified, transparent and adequate advice on additional training can be given, or the absence of a need for additional training can be established. This is of great importance because it gives IMGs the opportunity to start training that will eventually lead to entrance to the health care system as an independent provider. In this way, all their previous education, training and practice are put to use again, from which both society and the IMGs profit.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

Additional information

Notes on contributors

Marye J. Sonderen

MARYE SONDEREN is a resident in Internal Medicine. Before her residency she coordinated the development of the clinical skills assessment.

Eddie Denessen

EDDIE DENESSEN is an assistant professor in the Department of Educational Sciences and member of the Behavioural Science Institute at Radboud University Nijmegen. His research interests include culture differences in education, teacher beliefs and teaching behaviour, assessment of teaching, research methods, and psychometrics.

Olle Th.J. Ten Cate

OLLE TH. TEN CATE is professor of Medical Education and director of the Center of Research & Development of Education, UMC Utrecht School of Medical Sciences.

Ted A.W. Splinter

TED A.W. SPLINTER is a medical oncologist, professor of medical education and chairman of the project group for the assessment of international medical graduates.

Cornelis T. Postma

CORNELIS T. POSTMA is associate professor of medicine in the Department of Medicine of the University of Nijmegen Medical Centre. His main medical education interests are in the training of practical medical education and medical competence and in the field of clinical assessment.

References

  • Austin Z, O’Byrne C, Pugsley J, Munoz L. Development and validation processes for an Objective Structured clinical Examination (OSCE) for entry-to-practice certification in pharmacy: The Canadian experience. Am J Pharm Educ 2003; 67(3)76
  • Blueprint of objectives of undergraduate medical education in the Netherlands (2001) Available from: www.nvmo.nl/portals/0/files/BLUEPRINT-Training-of-doctors.pdf
  • Boulet JR, Friedman BM, Ziv A, Burdick WP, Curtis M, Peitzman S, Gary NE. Using standardized patients to assess the interpersonal skills of Physicians. Acad Med 1998; 73(10)S94–S96
  • Boulet JR, McKinley DW, Whelan GP, Van Zanten M, Hambleton RK. Clinical skills deficiencies among first-year residents: Utility of the ECFMG clinical skills assessment. Acad Med 2002; 77(Suppl 10)S33–S35
  • Chambers KA, Boulet JR, Gary NE. The management of patient encounter time in a high-stakes assessment using standardized patients. Med Educ 2000; 34: 813–817
  • CIBG-Brochure. 2004. Assessment foreign medical graduates: Every provider an equal and fair opportunity to show what he is worth. (Assessment Buitenslands Gediplomeerden Gezondheidszorg: elke beroepsbeoefenaar een gelijke en eerlijke kans om te laten zien wat hij waard is) The Hague: Ministry of Health
  • Cohen R, Rothman AI, Ross J, Domovitch E, Jamieson C, Jewitt M, Keystone J, Kulesha D, Maclnnes M, Shier RM, et al. A comprehensive assessment of graduates of foreign medical schools. Annals RCPSC 1988; 21: 505–509
  • Colliver JA, Verhulst SJ, Williams RG, Norcini JJ. Reliability of performance on standardised patient cases: A comparison of consistency measures based on generalizability theory. Teach Learn Med 1989; 1: 31–37
  • Conn HL, Jr, Cody RP. Results of the second clinical skills assessment Examination of the ECFMG. Acad Med 1989; 64: 448–453
  • Friedman Ben-David M, Klass DJ, Boulet J, De Champlain A, King AM, Pohl HS, Gary NE. The performance of foreign medical graduates on the National Board of Medical Examiners (NBME) standardized patient examination prototype: A collaborative study of the NBME and the Educational Commission for Foreign Medical Graduates (ECFMG). Med Educ 1999; 33: 439–446
  • Gary NE, Sabo MM, Shafron ML, Wald MK, Ben-David MF, Kelly WC. Graduates of foreign medical schools: Progression to certification by the Educational Commission for foreign Medical Graduates. Acad Med 1997; 72: 17–22
  • Hallock JA, Kostis JB. Celebrating 50 years of experience: An ECFMG perspective. Acad Med 2006; 81(Suppl)S7–S16
  • Jacobs JCG, Denessen E, Postma CT. The structure of medical competence and results of an OSCE. Neth J Med 2004; 62: 397–403
  • Lloyd JS. Definitions of competence in specialties of medicine. American Board of Medical Specialties, Chicago 1979
  • Metz JCM, Verbeek-Weel AMM, Huisjes HJ. Blueprint 2001: Training of doctors in The Netherlands. University Medical Center, University Press, NijmegenThe Netherlands 2001
  • Newble D. Techniques for measuring clinical competence: Objective structured clinical examinations. Med Educ 2004; 38: 199–203
  • Newble D, Dawson B, Dauphinee D, Page G, Macdonald M, Swanson D, Mulholland H, Thomson A, van der Vleuten C. Guidelines for assessing clinical competence. Teach Learn Med 1994; 6(3)213–220
  • Newble D, Jaeger K. The effect of assessment and examinations on the learning of medical students. Med Educ 1984; 17: 165–171
  • Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ 2002; 36: 853–859
  • Papadakis MA. The step 2 clinical-skills examination. N Engl J Med 2004; 350: 1703–1705
  • Pelgrim EAMP, Hettinga AM, Postma CT. Interrater reliability of standardized patients in the Dutch Clinical Skills Assessment (DCSA) for foreign medical graduates. Abstract Book AMEE 2008; 7I/SC6, p. 212
  • Rethans JJ, Sturmans F, Drop R, van der Vleuten C, Hobus P. Does competence of general practitioners predict their performance? Comparison between examination setting and actual practice. BMJ 1991; 303: 1377–1380
  • Schade E, Sminia TD. Eindtermen voor de universitaire artsopleiding: 'Raamplan 1994 artsopleiding'. [Final terms for university medical education: 'General Plan 1994 medical education’.]. Ned Tijdschr Geneeskd 1995; 139: 30–35
  • Senior JR. Towards the measurement of competence of medicine. National Board of Medical Examinees, Philadelphia 1976
  • Spike NA. International medical graduates: The Australian perspective. Acad Med 2006; 81: 842–846
  • Splinter TAW, Herfs PGP, Ruijs AJEM, Van Luijk SJ, Wijkhuis NP. Naar een nieuwe stroomlijn voor buitenlandse artsen. Rapport Opleidingcommissie Geneeskunde van het Discipline Overlegorgaan Medische Wetenschappen OCG-DMW. 2003, [Towards new admission procedures for foreign medical graduates. Report of the Educational Council of the Gathered Dutch Medical Disciplines.] Ministry of Health Care
  • Srivastava R. A bridge to nowhere – The troubled trek of foreign medical graduates. N Engl J Med 2008; 358: 216–219
  • Sutnick AI, Stillman PL, Norcini JJ, Friedman M, Williams RG, Trace DA, Schwartz MA, Wang Y, Wilson MP. Pilot study of the use of the ECFMG clinical competence assessment to provide profiles of clinical competencies of graduates of foreign medical schools for residency directors. Educational Commission for Foreign Medical Graduates. Acad Med 1994; 69: 65–67
  • Swanson DB, Norcini JJ, Grosso LJ. Assessment of clinical competence: Written and computer-based simulations. Assess Eval Higher Educ 1987; 12: 220–246
  • Tamblyn R, Abrahamowicz M, Dauphinee WD, Hanley JA, Norcini J, Girard N, Grand Maison P, Brailovsky C. Association between licensure examination scores and practice in primary care. JAMA 2002; 288: 3019–3026
  • USMLE Clinical Knowledge test. 2008. Content outline. Available from: http://www.usmle.org. Accessed January 20, 2008
  • Van der Vleuten CPM, Norman G, De Graaff E. Pitfalls in the pursuit of objectivity: Issues of reliability. Med Educ 1991; 25: 110–118
  • Van der Vleuten CPM, Swanson DB. Assessment of clinical skills with standardized patients: State-of-the-art. Teach Learn Med 1990; 2: 58–76
  • Verhoeven BH, Verwijnen GM, Scherpbier AJ, Van der Vleuten CPM. Growth of medical knowledge. Med Educ 2002; 36: 711–717
  • Whelan GP, Boulet JR, McKinley DW, Norcini JJ, van Zanten M, Hambleton RK, Burdick WP, Peitzman SJ. Scoring standardized patient examinations: Lessons learned from the development and administration of the ECFMG clinical skills assessment (CSA). Med Teach 2005; 27: 200–206
  • Williams RG, McLaughlin MA, Eulenberg B, Hurm M, Nendaz MR. The patient findings questionnaire: One solution to an important standardized patient examination problem. Acad Med 1999; 74: 1118–1124
