
Development of ACLEEM questionnaire, an instrument measuring residents’ educational environment in postgraduate ambulatory setting

Pages e861-e866 | Published online: 03 Sep 2012

Abstract

Background: Students’ perceptions of their educational environment (EE) have been studied in undergraduate and postgraduate curricula. Postgraduate EE has been measured in hospital settings. However, there are no instruments available to measure the EE in postgraduate ambulatory settings.

Aim: The aim of this study was to develop the “Ambulatory Care Learning Education Environment Measure” (ACLEEM).

Methods: A mixed methodology was used including three stages: (1) Grounded theory (focus groups); (2) Delphi technique to identify consensus; and (3) Pilot study.

Results: Three quota samples of approximately 60 stakeholders each were formed, one for the focus groups and two as Delphi panels. Eight focus groups were carried out with 58 residents (Latin American Spanish speakers). The results were analysed and 173 items were offered to a national Delphi panel (61 residents and teachers), which reduced the number of important items to 54 over two rounds. The 54-item questionnaire was then piloted with 63 residents and refined to the final version of the ACLEEM, with 50 items and three domains.

Conclusions: The 50-item inventory is a valid instrument to measure the EE in postgraduate ambulatory settings in Chile. Large-scale administration of the ACLEEM questionnaire to evaluate its construct validity and reliability is the next step in testing the psychometric properties of the instrument.

Introduction

Research on the educational environment (EE) started in the 1930s and was boosted by the work of Pace and Stern (Citation1958). They studied aspects associated with the “atmosphere” of classrooms in primary schools, secondary schools and universities, using qualitative research methods such as interviews and direct observation of lectures to capture the interactions between teachers and students. Research gradually evolved towards quantitative methodologies, and Hutchins (Citation1961) created one of the first instruments developed specifically to evaluate the EE in medical education: the Medical School Environment Index (MSEI). This instrument allowed researchers to identify the US medical schools that students perceived as more aggressive and competitive.

In more recent years, several studies have developed and validated new instruments intended to evaluate the EE in different healthcare professions. Soemantri et al. (Citation2010) conducted a systematic review that found 31 instruments published in the literature and established that the Dundee Ready Education Environment Measure (DREEM) (Roff et al. Citation1997), the Postgraduate Hospital Educational Environment Measure (PHEEM) (Roff et al. Citation2005), the Clinical Learning Environment and Supervision (CLES) scale (Saarikoski & Leino-Kilpi Citation1999) and the Dental Student Learning Environment Survey (DSLES) (Henzi et al. Citation2005) are likely to be the most suitable instruments for undergraduate medicine, postgraduate medicine, nursing and dental education, respectively. With specific reference to postgraduate medical education, Soemantri et al. (Citation2010) found eight instruments in addition to the PHEEM that have been used at this level: the Veteran Affairs Learners’ Perceptions Survey (Keitz et al. Citation2003), the learning environment assessment (Roth et al. Citation2006), the questionnaire from Rotem, Godwin and Du (Citation1995), the Operating Room Educational Environment Measure (Kanashiro et al. Citation2006), the Surgical Theatre Educational Environment Measure (Cassar Citation2004), the Anaesthetic Theatre Educational Environment Measure (Holt & Roff Citation2004), the practice-based educational environment measure (Mulrooney Citation2005) and, despite having been designed for undergraduate medicine, the DREEM. Recently, a new instrument developed in the Netherlands, the Dutch Residency Educational Climate Test (D-RECT), has emerged as a valid and reliable measure of residents’ learning climate (Boor et al. Citation2011).

Having recognized this, and considering the wide spectrum of settings in which medical education is carried out, we noticed that all of these instruments, including the PHEEM, were designed and intended to evaluate the EE in postgraduate hospital settings. Consequently, the ambulatory clinical setting of postgraduate medical education is not specifically addressed by any of the instruments mentioned above.

Ambulatory medical education is mostly conducted in primary care. Since the Alma-Ata Declaration, primary health care (PHC) has been increasingly recognized as the most important level of health systems, as can be seen in documents such as the World Health Organization’s “World Health Report Citation2008”, entitled “Primary Health Care: Now More Than Ever”; the “Bamako call to action on research for health” of the Global Ministerial Forum on Research for Health 2008, which restated PHC as the central priority in health research (Editorial Citation2008a); the series of articles in The Lancet, “Alma-Ata: Rebirth and Revision” (Editorial Citation2008b); and that of the New England Journal of Medicine, “The Future of Primary Care” (Perspective Citation2008). In addition, scientific evidence has demonstrated that health systems with stronger PHC have better health outcomes, reduced inequities and lower healthcare costs (Starfield Citation2002; Macinko Citation2003; Starfield Citation2005). Moreover, international agencies such as the Pan American Health Organization have taken similar steps, recommending that medical schools orientate their curricula more strongly towards components of PHC (Borrell et al. Citation2008). Given the relevance of PHC (and therefore of ambulatory care) and the absence of a specific instrument to evaluate the EE in this setting, the aim of our study was to develop an inventory to measure the EE in postgraduate ambulatory medical education.

Methods

A mixed methodology was used including three stages: (1) Grounded theory; (2) Delphi technique to identify consensus; and (3) Pilot study.

In the first stage, the objective was to identify the aspects related to the EE in ambulatory postgraduate medical education. For this, we used a qualitative approach, carrying out focus groups with residents of specialties that include ambulatory activities. In each focus group, the positive and negative aspects of their ambulatory EE were explored over approximately 90 min and the conversations were recorded. The interviews were analysed following the coding model proposed by grounded theory (Strauss & Corbin Citation1998), using the Atlas.ti® software. For quality control of the analyses, data triangulation was performed and the results were reviewed considering methodological, ecological and explanatory validity.

In the second stage, we created statements addressing the most relevant aspects of the EE in postgraduate ambulatory medical education that emerged from the results of the first stage. We performed a two-round Delphi technique (Hasson et al. Citation2000) with a panel of experts from different medical schools in Chile to prioritize the importance of each item (statement), expecting a response rate >11% in each round, following a methodology similar to that used at this stage in the development of the PHEEM (Roff et al. Citation2005). Respondents were asked to rate the importance of the items identified in the first stage on a Likert-type scale where 0 = without relevance, 1 = some relevance, 2 = indifferent, 3 = relevant and 4 = highly relevant. The survey was administered online. Items with a mean value ≥3 in both Delphi rounds were considered relevant; items below this cut-off were deleted (analyses performed with SPSS software).
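As an illustration only, the cut-off rule applied to the Delphi ratings can be expressed as a minimal sketch in Python (this is not the SPSS procedure used in the study; the item wordings and ratings shown are hypothetical):

import statistics

RELEVANCE_CUTOFF = 3.0  # items with a mean importance rating >= 3 are retained

# hypothetical ratings per item on the 0-4 importance scale
# (0 = without relevance ... 4 = highly relevant)
ratings = {
    "Item A: My tutor gives me feedback after each clinic": [4, 3, 4, 3, 4],
    "Item B: The waiting room is comfortable":              [2, 1, 3, 2, 2],
}

retained = {
    item: statistics.mean(scores)
    for item, scores in ratings.items()
    if statistics.mean(scores) >= RELEVANCE_CUTOFF
}

for item, mean_score in retained.items():
    print(f"retained (mean {mean_score:.2f}): {item}")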

The third stage (pilot study) was designed to test and refine the instrument created with the items selected by the Delphi panel. We piloted the questionnaire with a group of residents, who were asked to report their perceptions of their EE for each item on a Likert-type scale: 0 = completely disagree, 1 = disagree, 2 = uncertain, 3 = agree and 4 = completely agree. We evaluated the quality of the items, identifying imprecise or ambiguous items within the questionnaire and taking into account the opinions of the residents.

The project was supported by the Postgraduate Director and approved by the Ethics Committee of the Pontificia Universidad Católica de Chile Medical School. Residents were randomly invited to participate in the focus groups and informed consent was obtained, assuring the confidentiality of their comments. Delphi panel opinions were confidential. Finally, a group of residents was randomly invited to participate in the pilot study and answered the questionnaire anonymously.

Results

Stage 1: Grounded theory

In this first stage, 16 specialties with ambulatory activities were selected (family medicine, psychiatry, paediatrics, gynaecology & obstetrics, surgery, dermatology, otorhinolaryngology, ophthalmology, neurology and internal medicine with their sub-specialties), and residents were randomly invited to participate. Eight focus groups were conducted with a total of 58 participants, all Spanish speakers from Latin American countries; 5 of the 58 (8.6%) were foreigners, from Argentina, Colombia and Ecuador.

The information provided by the focus groups was open-coded, generating 173 items (119 positive statements and 54 negative statements) grouped into three general domains reflecting relevant aspects of the EE in the ambulatory setting: Support, Clinical Teaching and Clinical Training.

Stage 2: Delphi technique

We performed a two-round Delphi process. The 173 items previously identified were offered in the first round to a national Delphi panel drawn from nine medical schools in Chile. The survey was administered online and 61 of 361 teachers and residents answered it (16.9% response rate); they considered 64 items relevant. In the second round, 58 respondents on the Delphi panel (15.4% response rate) reduced the number of items to 54.

The 54-item questionnaire was first translated from Spanish into English by a Chilean medical doctor proficient in English, reviewed by two experts in EE (S.R. and A.R.), and then back-translated into Spanish by a professional translator to ensure validity of content and meaning.

Stage 3: Pilot study

The 54-item questionnaire was piloted with a representative group of 63 residents from seven specialties. Figure 1 summarizes the methodological process and findings of the three stages. As a result, four items were deleted and seven were rewritten because they contained imprecise or ambiguous concepts. Fifty items were finally retained in the refined version of the instrument, which was translated into English and reviewed by A.R. and S.R. following the same methodology described for Stage 2 (see Table 1). As items 24 and 27 contain negative statements, their scores were reverse coded. The 50-item questionnaire was called the “Ambulatory Care Learning Education Environment Measure” (ACLEEM) and the items were allocated to three domains according to their content: Clinical Teaching (items 1 to 16), Clinical Training (items 17 to 38) and Support (items 39 to 50).
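For illustration, the reverse coding of items 24 and 27 and the allocation of items to the three domains can be sketched in Python (a hypothetical example assuming the 0–4 agreement scale described above; this is not the authors’ analysis code):

MAX_SCORE = 4
REVERSE_CODED = {24, 27}  # negative statements: a raw 0 becomes 4, a raw 4 becomes 0

DOMAINS = {
    "Clinical Teaching": range(1, 17),   # items 1-16
    "Clinical Training": range(17, 39),  # items 17-38
    "Support":           range(39, 51),  # items 39-50
}

def score_item(item_number: int, raw_score: int) -> int:
    """Return the analysable score, reversing negatively worded items."""
    return MAX_SCORE - raw_score if item_number in REVERSE_CODED else raw_score

def domain_of(item_number: int) -> str:
    """Return the ACLEEM domain an item belongs to."""
    for domain, items in DOMAINS.items():
        if item_number in items:
            return domain
    raise ValueError(f"unknown item: {item_number}")

# Example: a resident answers item 24 (a negative statement) with 1 = disagree,
# which is recoded to 3 and counted towards the Clinical Training domain.
print(score_item(24, 1), domain_of(24))  # -> 3 Clinical Training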

Figure 1. Flow chart of the methodology process.


Table 1  Ambulatory Care Learning Education Environment Measure (ACLEEM)

Discussion

The quality assurance of postgraduate educational programmes and residency training is increasingly important (Afrin et al. Citation2006). The EE is one of the aspects used to evaluate the quality of training programmes, providing information about domains such as atmosphere, feedback and supervision in hospital and ambulatory settings (Boor et al. Citation2011). Several questionnaires have been developed to measure the EE, and the ACLEEM is the first developed specifically to measure aspects of the EE in the ambulatory setting. Development and validation of the 50-item inventory was based on grounded theory and a modified Delphi procedure. A strength of this study is that the focus groups included residents from several Latin American countries and 16 residency programmes. Even though residents from Argentina, Colombia and Ecuador participated in the whole process, providing feedback about the content and meaning of the statements, some Spanish words may be interpreted differently in other Spanish-speaking countries: for example, an outpatient clinic in Chile may be called a consultorio or policlínico, whereas in other Latin American countries clínica externa or dispensario are more commonly used. In the future, it will be important to address this issue by revising the meaning of each statement with residents of the programme before administering the ACLEEM.

We need to take into account that the Delphi panel comprised over 300 residents and teachers from several universities, with response rates of 16.9% and 15.4% in the first and second rounds, respectively. This is a potential source of bias; however, we expected a response rate >11% in each round, in line with the methodology used at this stage in the development of the PHEEM (Roff et al. Citation2005). The 50 items were allocated to three domains according to the qualitative analysis of the data: Clinical Teaching, Clinical Training and Support. These domains must be tested using an exploratory factor analysis. However, a sound factor analysis requires about five subjects per item (250 residents) (Streiner Citation1994), and the pilot study (63 residents) was too small to carry this out. In conclusion, the ACLEEM is a valid instrument to measure the EE in postgraduate ambulatory settings in Chile and it can be administered in Spanish-speaking countries. Large-scale administration of the ACLEEM questionnaire to evaluate its construct validity (Field Citation2005), internal consistency and reliability, including generalisability theory (Crossley et al. Citation2002), is the next step in evaluating the psychometric properties of the instrument.
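For illustration, the sample-size rule cited from Streiner (Citation1994) can be sketched in Python (a hypothetical calculation, not part of the study’s analyses):

SUBJECTS_PER_ITEM = 5  # rule of thumb: about five respondents per item

def required_sample(n_items: int) -> int:
    """Minimum number of residents needed for a sound exploratory factor analysis."""
    return SUBJECTS_PER_ITEM * n_items

n_items = 50        # final ACLEEM inventory
pilot_sample = 63   # residents in the pilot study

needed = required_sample(n_items)  # 5 * 50 = 250 residents
print(f"required: {needed}, pilot: {pilot_sample}, "
      f"sufficient: {pilot_sample >= needed}")  # required: 250, pilot: 63, sufficient: False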

Acknowledgements

This work is partially supported by grants of the National Commission for Scientific and Technological Research (CONICYT), FONDECYT no. 1 100 436 to A.R.

Declaration of interest: The authors report no conflicts of interest. The authors are responsible for the content and writing of the article.

References

  • Afrin LB, Arana GW, Medio FJ, Ybarra AF, Clarke HS, Jr. Improving oversight of the graduate medical education enterprise: One institution's strategies and tools. Acad Med 2006; 81(5)419–425
  • Boor K, van der Vleuten CP, Teunissen P, Scherpbier A, Scheele F. Development and analysis of D-RECT, an instrument measuring residents’ learning climate. Med Teach 2011; 33: 820–827
  • Borrell RM, Godue C, García Dieguez M. 2008. Serie: La Renovación de la Atención Primaria de Salud en las Américas. N°2 La Formación en Medicina Orientada hacia la Atención Primaria de Salud. OPS
  • Cassar K. Development of an instrument to measure the surgical operating theatre learning environment as perceived by basic surgical trainees. Med Teach 2004; 26(3)260–264
  • Crossley J, Davies H, Humphris G, Jolly B. Generalisability: A key to unlock professional assessment. Med Educ 2002; 36(10)972–978
  • Global Ministerial Forum on Research for Health. The Bamako call to action: Research for health. Lancet 2008a; 372: 1855
  • Global Ministerial Forum on Research for Health. A renaissance in primary health care. Lancet 2008b; 372: 863
  • Field A. Exploratory factor analysis. Discovering statistics using SPSS, 2nd ed. Sage, London 2005; 619–680
  • Hasson F, Keeney S, McKenna H. Research guidelines for Delphi survey technique. J Adv Nurs 2000; 32(4)1008–1015
  • Henzi D, Davis E, Jasinevicius R, Hendricson W, Cintron L, Isaacs M. Appraisal of the dental school learning environment: The students' view. J Dent Educ 2005; 69(10)1137–1143
  • Holt MC, Roff S. Development and validation of the Anaesthetic Theatre Educational Environment Measure (ATEEM). Med Teach 2004; 26(6)553–558
  • Hutchins EB. The 1960 medical school graduate: His perception of his faculty, peers and environment. J Med Educ 1961; 36: 322–329
  • Kanashiro J, McAleer S, Roff S. Assessing the educational environment in the operating room - a measure of resident perception at one Canadian institution. Surgery 2006; 139: 150–158
  • Keitz SA, Holland GJ, Melander E, Bosworth H, Pincus S. The veteran affairs learners' perceptions survey: The foundation for educational quality improvement. Acad Med 2003; 78(9)910–917
  • Macinko J, Starfield B, Shi L. The Contribution of Primary Care Systems to Health Outcomes within Organization for Economic Cooperation and Development (OECD) Countries, 1970–1998. Health Serv Res 2003; 38(3)831–865
  • Mulrooney A. Development of an instrument to measure the Practice Vocational Training Environment in Ireland. Med Teach 2005; 27(4)338–342
  • Pace CR, Stern GG. “An approach to the measurement of psychological characteristics of college environments”. J Educ Psychol 1958; 49: 269–277
  • Lee T. The Future of Primary Care. N Engl J Med 2008; 359(20)2085
  • Roff S, McAleer S, Harden RM, Al-Qahtani M, Uddin AA, Deza H, Groenen G, Primparyon P. Development and Validation of the Dundee Ready Education Environment Measure (DREEM). Med Teach 1997; 19(4)295–299
  • Roff S, McAleer S, Skinner A. Development and validation of an instrument to measure postgraduate clinical learning and teaching educational environment for hospital-based junior doctors in the UK. Med Teach 2005; 27(4)326–331
  • Rotem A, Godwin P, Du J. Learning in hospital settings. Teach Learn Med 1995; 7(4)211–217
  • Roth LM, Severson RK, Probst JC, Monsur JC, Markova T, Kushner S, Schenk M. Exploring physician and staff perceptions of the learning environment in ambulatory residency clinics. Fam Med 2006; 38(3)177–184
  • Saarikoski M, Leino-Kilpi H. Association between quality of ward nursing care and students' assessment of the ward as a learning environment. NT Research 1999; 4(6)467–474
  • Soemantri D, Herrera C, Riquelme A. Measuring the educational environment in health professions studies: A systematic review. Med Teach 2010; 32: 947–952
  • Starfield B, Shi L, Macinko J. Contribution of Primary Care to Health Systems and Health. Milbank Q 2005; 83(3)457–502
  • Starfield B, Shi L. Policy relevant determinants of health: An international perspective. Health Policy 2002; 60: 201–218
  • Strauss A, Corbin J, 1998. Basics of Qualitative Research: Grounded Theory Procedures and Technique. 2nd ed. Sage
  • Streiner DL. Figuring out factors: The use and misuse of factor analysis. Can J Psychiat 1994; 39(3)135–140
  • WHO Declaration of Alma-Ata, 1978. http://www.who.int/hpr/NPH/docs/declaration_almaata.pdf (accessed on October 3rd, 2011)
  • World Health Organization. 2008. The World Health Report 2008. “Primary Health Care: Now More Than Ever”.
