Web Paper

Educational environment in intensive care medicine—use of Postgraduate Hospital Educational Environment Measure (PHEEM)

Michael Clapham, David Wall & Anna Batchelor
Pages e184-e191 | Published online: 03 Jul 2009

Abstract

Background: The educational climate is an important measure within medical education. This is because there are still accounts of poor teaching, humiliation, bullying and harassment of doctors in training. Deaneries and schools must be able to demonstrate to the Postgraduate Medical Education and Training Board that trainees are working and learning in a good environment.

Methods: This study used the Postgraduate Hospital Educational Environment Measure (PHEEM) to measure the educational climate in nine intensive care training schemes within hospitals in England and Scotland. Of 190 trainees, 134 replied (71% response rate). Neither the identities of the nine units nor those of the trainees were known to the researchers.

Results: The results showed that there was a good overall educational climate in the intensive care units studied, with no racism or sexism, and trainees were happy with their teaching, their support and the work they did. The junior trainees (house officers and senior house officers) perceived a significantly better climate than did the senior trainees (specialist registrars). There were also significant differences in scores for the nine different intensive care units.

Conclusions: PHEEM has proved to be a reliable and consistent tool to assess educational climate with an overall Cronbach's alpha of 0.921.

Introduction

Within the United Kingdom, the Postgraduate Medical Education and Training Board (PMETB) is now established. Its role is to oversee the whole of postgraduate medical education within both primary and secondary care in the United Kingdom. There is now a need for a valid and reliable assessment of the educational environment in postgraduate medical education that can be used at a UK-wide level. At the moment, postgraduate Deaneries and Royal Colleges carry out quality assurance visits, assessing the opinions of trainers and trainees about the educational environment. None of these visits uses a valid and reliable assessment.

There is a need for a practical, valid and reliable way of assessing the educational environment that is reproducible and transferable, and that could be used both to measure the educational environment in a unit and to compare results across different grades of junior doctors, different Deaneries and different specialities within the United Kingdom.

What is the educational environment? Variously referred to as climate, atmosphere or tone, it is the set of factors that describe what it is like to be a learner within an organization. Chambers and Wall (2000) consider the educational climate in three parts: the physical environment (safety, food, shelter, comfort and other facilities), the emotional climate (security, constructive feedback, being supported, and absence of bullying and harassment) and the intellectual climate (learning with patients, relevance to practice, evidence-based, active participation by learners, motivating, and planned education).

The good clinical teaching environment (Spencer 2003) ensures that teaching and learning are relevant to patients, involve active participation by learners, and demonstrate professional thinking and behaviours. There should be good preparation and planning, both of structure and content, reflection on learning, and evaluation of what has happened in the teaching and learning. Spencer also describes some of the common problems with teaching and learning in the clinical environment. These include a lack of clear objectives, a focus on knowledge rather than problem-solving skills, teaching at the wrong level, passive observation, little time for reflection and discussion, and teaching by humiliation.

Teaching by humiliation, bullying and harassment is a significant problem, both in the United Kingdom and in other countries. Much of this relates to teachers' lack of awareness of educational skills and knowledge (Lowry 1992, 1993) and their inability to promote a good, supportive educational climate in which trainees can learn (SCOPME 1992, 1994). Lowry (1992) described disenchantment with medicine in the words of a young doctor: ‘It could have been such a wonderful thing to be a doctor – but it's not. It's just a disaster.’

There are, sadly, many examples in the literature of bullying and harassment of junior doctors and medical students. Such studies show the practice is widespread and, at the individual level, illustrate how destructive to confidence and well-being bullying and harassment can be. Wolf et al. (1991) carried out a questionnaire study of medical students at the Louisiana State University School of Medicine. Of these, 98.9% reported mistreatment, with shouting and humiliation being most frequent. Over half the sample reported sexual harassment, reported mainly by women students. There was a high level of remarks degrading doctors and medicine as a profession. Increased mistreatment was positively associated with a perceived increase in cynicism.

More recently, an editorial in Medical Education (Spencer and Lennard 2005) discussed teaching by humiliation. The authors called for an end to a culture of bullying that sets in place a self-perpetuating cycle of abuse, in which the victims become the perpetrators of the next round. They cited the ‘teaching the teachers’ movement, to encourage teaching based on sound educational principles, monitoring of examiners' performances, and assessment by the whole team, not just the consultants, who appear to be the main perpetrators of abuse.

What about intensive care medicine? This would appear to be a challenging environment in which to place trainees. It can seem overwhelming, with complicated, ill patients and complex machinery, all unfamiliar to the trainee. There are many staff, most of whom are experienced and familiar with the workings of the intensive care unit. Life-or-death situations occur frequently. Patients' conditions change rapidly, and there may be little time to ponder the correct clinical decision. Despite this, the Foundation curriculum (Foundation Programme Committee 2005) includes a need to learn about the care of acutely ill patients, and an increasing number of Foundation year 1 and 2 trainees are attached to ICUs.

Evaluations from trainees at senior house officer level within the West Midlands report good to excellent scores for induction, supervision and clinical experience using the post evaluation tool developed in 1997 and used every six months on all senior house officer posts since then (Wall et al. 2000, 2001). Evidence from our Deanery quality assurance visits to intensive care units has also repeatedly shown that senior house officers value their time in intensive care medicine, feel well supported, and rate the jobs highly.

Can the educational environment be measured using a practical, valid and reliable tool? The Dundee Ready Education Environment Measure (DREEM) was developed in Dundee by Roff et al. (1997). It is a valid and reliable measure of the perceived educational environment and has been widely used in many countries throughout the world, including the Gulf States, Nepal, Nigeria, the West Indies, Thailand, China, Canada and the United Kingdom. It has been used for medical students, nurses, dentists, chiropractors and other professions allied to medicine. A more detailed bibliography has been published by Roff (2005).

From this original environment measure, further postgraduate measurement scales have been developed. The Anaesthetic Theatre Education Environment Measure (ATEEM) (Holt & Roff 2004) showed a statistically significantly better educational climate for first-year senior house officers than for specialist registrars. The Surgical Theatre Educational Environment Measure (STEEM) (Cassar 2004) has been developed and validated for training surgeons. The Dundee group has also developed a 40-item inventory for the various aspects of junior doctor training in the UK and Ireland, the Postgraduate Hospital Educational Environment Measure (PHEEM) (Roff et al. 2005). However, their description of the tool does not contain data to show its reliability or practicality.

This study examined the practicality of using the Postgraduate Hospital Educational Environment Measure (PHEEM) to assess the educational environment within intensive care medicine.

Our research questions were as follows:

  1. Is PHEEM a practical way to evaluate the educational climate within Intensive Care Units?

  2. Is there a good educational climate in intensive care units, as the post evaluation results seem to suggest?

  3. Does the educational climate vary with the different grades of junior doctor?

  4. Does the educational climate vary between intensive care units?

Methods

We set out to assess the reliability and practicality of a new tool to measure the quality of the educational environment in nine intensive care training programmes in England and Scotland. We used the PHEEM questionnaire (Roff et al. 2005) as a self-administered tool (Appendix 1). It has 40 statements with which respondents are asked to indicate their agreement on a 5-point Likert scale, ranging from strongly agree (4), agree (3), unsure (2) and disagree (1) to strongly disagree (0). Agreement with the items indicates a ‘good’ environment and gives high scores. The four negative statements (questions 7, 8, 11 and 13) were scored in reverse, so that the higher the score, the more positive the environment. Information on gender and seniority, in terms of the grade of post, was also requested as part of the questionnaire.
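To make the scoring scheme concrete, the short Python sketch below scores a single completed questionnaire under these rules. Only the item range, the 0–4 scale and the four reverse-scored items come from the description above; the function and variable names are illustrative.

```python
# Minimal sketch of PHEEM scoring as described above: 40 items on a 0-4
# Likert scale, with items 7, 8, 11 and 13 reverse-scored. Names are ours.

REVERSE_ITEMS = {7, 8, 11, 13}  # negative statements, scored in reverse
MAX_SCORE = 4                   # "strongly agree" on the 0-4 scale

def score_pheem(responses):
    """responses: dict mapping item number (1-40) to a raw 0-4 rating.

    Returns the per-item scores (after reverse-coding) and their total,
    so that a higher total always indicates a more positive environment.
    """
    scored = {}
    for item, raw in responses.items():
        if not 0 <= raw <= MAX_SCORE:
            raise ValueError(f"item {item}: rating {raw} outside 0-4 range")
        scored[item] = MAX_SCORE - raw if item in REVERSE_ITEMS else raw
    return scored, sum(scored.values())

# Example: a trainee who agrees (3) with every statement. The four negative
# items reverse-code to 1, giving a total of 36*3 + 4*1 = 112 out of 160.
if __name__ == "__main__":
    responses = {item: 3 for item in range(1, 41)}
    per_item, total = score_pheem(responses)
    print(total)  # 112
```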

The intensive care programmes surveyed varied from individual hospitals to amalgamated units according to local delivery of education. These were selected on a voluntary basis by members of the Intensive Care Society Education Committee. Participation by individuals and units was entirely voluntary. Neither the individuals nor the units were identified to MC who conducted the analysis.

The data were analysed using SPSS 12.0. The reliability of the questionnaire was assessed using Cronbach's alpha, both as a whole and for each item using the ‘alpha if item deleted’ statistic to identify questions whose exclusion would improve the reliability (Turoff 1970; Bowling 1997; Field 2000). Descriptive statistics were reported as median, mean and standard deviation. The comparative statistics used the non-parametric Mann–Whitney U test for two independent samples and the Kruskal–Wallis test for multiple independent samples (Field 2000; Jamieson 2004). Global mean scores for individual respondents were calculated with missing values scored as 2 (the midpoint on this 0–4 scale). For factor analysis, the method used was a varimax rotation, accepting eigenvalues above 1.0 and correlations above 0.5.
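The sketch below shows how these steps might be reproduced outside SPSS, using pandas and SciPy. The data-frame layout and column names (Q1–Q40, grade, unit) are our assumptions, and the items are assumed to be already reverse-coded as above, but the operations (missing values scored as 2, Cronbach's alpha with alpha-if-item-deleted, and Mann–Whitney U and Kruskal–Wallis comparisons) follow the Methods as described.

```python
# Sketch of the analysis steps described above, using pandas/SciPy in place of
# SPSS. Column names ("Q1".."Q40", "grade", "unit") are illustrative.
import pandas as pd
from scipy import stats

ITEM_COLS = [f"Q{i}" for i in range(1, 41)]

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents x items data frame."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_item_deleted(items: pd.DataFrame) -> pd.Series:
    """Alpha recomputed with each item excluded in turn."""
    return pd.Series({col: cronbach_alpha(items.drop(columns=col))
                      for col in items.columns})

def analyse(df: pd.DataFrame):
    # Missing item responses scored as 2, the midpoint of the 0-4 scale.
    items = df[ITEM_COLS].fillna(2)

    overall_alpha = cronbach_alpha(items)
    per_item_alpha = alpha_if_item_deleted(items)

    # Global mean score per respondent (0-4 scale).
    df = df.assign(global_mean=items.mean(axis=1))

    # Two independent samples (e.g. SHO vs SpR): Mann-Whitney U.
    sho = df.loc[df["grade"] == "SHO", "global_mean"]
    spr = df.loc[df["grade"] == "SpR", "global_mean"]
    u_stat, u_p = stats.mannwhitneyu(sho, spr, alternative="two-sided")

    # More than two independent samples (the nine units): Kruskal-Wallis.
    groups = [g["global_mean"].values for _, g in df.groupby("unit")]
    h_stat, kw_p = stats.kruskal(*groups)

    return overall_alpha, per_item_alpha, (u_stat, u_p), (h_stat, kw_p)
```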

Results

Demographics

134 trainees out of 190 (71%) completed the questionnaire. They were drawn from nine intensive care training programmes within England and Scotland, and the number in each programme varied from 4 to 41 trainees. There were 50 female trainees (37.3%), 80 male trainees (59.7%) and 4 missing values (3.0%). There were 3 pre-registration house officers (PRHOs), 60 senior house officers (SHOs), 68 specialist registrars (SpRs) and 3 missing values. The number of years spent at these grades ranged from 0 to 5.

Practicality

The questionnaires took less than 5 minutes to complete. Furthermore, out of a possible 5360 responses to the 40 questions there were only 23 missing values, suggesting that the questionnaire was easy to understand. Coding the questionnaires and calculating the raw scores for each individual was also quick and easy. The scores suggested by Roff et al. (2005) can be calculated by hand in less than five minutes, or entered into a spreadsheet in the same time. Designing the spreadsheet and its calculations is more time-consuming and depends on the operator's skills.

Reliability

Cronbach's alpha was 0.921 for the 40 statements. When each question was excluded in turn, using the ‘alpha if item deleted’ statistic, there was no significant improvement in the score, confirming that all questions were relevant and should be included.

Questionnaire responses

Table 1 summarizes the responses to each question. We have reported the mean and standard deviation of the results because this gives a better overall view than the median values alone. Only one of the questions met the criteria for a normal (parametric) distribution, so all comparative statistics used are non-parametric.

Table 1.  Summary results of PHEEM questionnaire (## – questions with reverse scoring)

Table 2 shows the statements which were highly rated (more than 3) or poorly rated (less than 2). There were only three questions (13, 34 and 40) with statistically significant differences between genders (p < 0.05 on the Mann–Whitney test). On each occasion the male trainees rated the environment higher than did the female trainees. There were no gender differences in the aggregated scores.

Table 2.  Questions with mean scores high (more than 3) or low (less than 2) [## reverse scores high = little racism or sexism]

There were significant differences between training programmes on 21 questions and on all the aggregated scores. There were significant differences between trainees' grades on 17 questions, and in general the Senior House Officers rated the environment more highly than did the Specialist Registrars. The order of ranking was Pre Registration House Officers, then Senior House Officers, then Specialist Registrars for all the aggregated scores except role autonomy, where the Senior House Officers ranked it highest and the Pre Registration House Officers ranked it lowest.

Table 3 summarizes the aggregate scores for individual trainees in the way suggested by Roff et al. (2005) to identify measures of the environment globally, and in terms of teaching, role autonomy and social support.

Table 3.  Summary of aggregate scores from PHEEM questionnaire

The numbers of trainees that fall into each of the three categories are shown in Table 4.

Table 4.  Numbers of trainees in PHEEM categories

For the factor analysis, the questions were grouped according to how different individuals answered them, using principal component factor analysis. The Kaiser–Meyer–Olkin measure of sampling adequacy was 0.82; a value above 0.5 means that the data form a valid matrix (Field 2000), demonstrating that there are significant factors to be derived from these data. The factor analysis produced 10 factors which account for 67% of the variance. The top three factors encompass 18 of the 40 questions: the first relates to ‘the teacher’, the second to ‘learning doctoring skills in a safe environment’ and the third to a ‘happiness index’. Table 5 shows these three main factors and the questions which make up each of them.
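As an illustration of these factor-analytic steps, the NumPy sketch below computes a global KMO statistic, extracts principal components of the item correlation matrix with the eigenvalue-above-1.0 rule, applies a varimax rotation and reports items loading above 0.5. The data layout (respondents by 40 items) and all function names are our assumptions; this is a sketch of the general technique, not the authors' SPSS procedure.

```python
# Sketch of the factor-analytic steps described above, using NumPy only:
# KMO sampling adequacy, principal-component extraction on the correlation
# matrix (eigenvalues > 1.0 retained), varimax rotation, and reporting of
# items loading above 0.5. X is a respondents x 40-items array (assumed).
import numpy as np

def kmo(X):
    """Global Kaiser-Meyer-Olkin measure of sampling adequacy."""
    corr = np.corrcoef(X, rowvar=False)
    inv = np.linalg.inv(corr)
    scale = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / scale                 # partial correlations
    np.fill_diagonal(partial, 0.0)
    off = corr - np.eye(corr.shape[0])     # off-diagonal correlations only
    r2, p2 = (off ** 2).sum(), (partial ** 2).sum()
    return r2 / (r2 + p2)

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a (variables x factors) loading matrix."""
    p, k = loadings.shape
    rotation, d = np.eye(k), 0.0
    for _ in range(max_iter):
        lam = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (lam ** 3
                          - (gamma / p) * lam @ np.diag((lam ** 2).sum(axis=0))))
        rotation = u @ vt
        if s.sum() < d * (1 + tol):        # converged
            break
        d = s.sum()
    return loadings @ rotation

def pca_varimax(X, eigen_cutoff=1.0, loading_cutoff=0.5):
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]      # sort components by eigenvalue
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > eigen_cutoff          # Kaiser criterion
    loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
    rotated = varimax(loadings)
    explained = eigvals[keep].sum() / eigvals.sum()
    # Items (1-based) loading above the cutoff on each rotated factor.
    grouping = {f + 1: [i + 1 for i in
                        np.where(np.abs(rotated[:, f]) > loading_cutoff)[0]]
                for f in range(rotated.shape[1])}
    return kmo(X), explained, grouping
```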

Table 5.  Factors shown in PHEEM data

Discussion

We believe we have shown that PHEEM provides a reliable set of questions for measuring the educational environment within intensive care medicine (Cronbach's alpha 0.92). Furthermore, the high completion rate of the questionnaire implies that the trainees found it a simple and practical proposition. We have shown that PHEEM meets some of the requirements of assessment tools, namely reliability and practicality (Chambers & Wall 2000).

The intensive care areas sampled provided good overall educational environments. In particular, we were heartened to find low levels of perceived racism and sexism. While recent studies have reported that bullying and harassment are common within the NHS (Anonymous 2001; Quine 2002; Musselman et al. 2005), it was reassuring to find that this was not an issue in intensive care medicine. However, this was a small, self-selected sample of the trainees within intensive care medicine, so caution must be taken in generalising too widely from our results. To confirm these findings, the next stage will be to undertake a ‘snapshot’ of all intensive care trainees within the United Kingdom as a whole to set benchmarks.

There were interesting differences between genders, grades of trainee and intensive care areas. In particular, we found that Senior House Officers experienced a better educational climate in many respects than did Specialist Registrars, both when comparing grades of doctor directly and through hierarchical cluster analysis. Our data do not explain why this is the case, so this is an area for further study. It would be valuable to understand what factors contributed to the success of the units achieving the better results.

We believe it is useful to measure the educational climate for four reasons.

  1. So an individual trainee can identify what to expect.

  2. So each unit can inform the teachers and the trainees of the quality of their educational climate.

  3. So the educational climate can be tracked over time.

  4. So that standards can be set at local, regional and national level.

This information is also useful to Deaneries, Royal Colleges and the PMETB in their roles relating to quality assurance of training programmes and posts. In the future, PHEEM could be used as a quick, easy and cost-effective way of measuring the educational environment. On the one hand, it could be used to identify examples of excellence from which good practice could be shared with others; on the other, it could act as a screening tool to identify areas of concern. Such areas could then be visited by Deanery teams, Royal Colleges and the PMETB.

In conclusion, PHEEM is a simple, practical and reliable way to measure the educational environment in intensive care medicine. Furthermore, these results show that an intensive care unit can offer a good, supportive environment for trainees, especially senior house officers, and that attachments of Foundation trainees to ICUs should be encouraged.

Additional information

Notes on contributors

Michael Clapham

DR MICHAEL CLAPHAM is a consultant anaesthetist and intensivist at the Queen Elizabeth Hospital in Birmingham, and associate dean for education in the West Midlands Deanery, United Kingdom.

David Wall

PROFESSOR DAVID WALL is deputy regional postgraduate dean in the West Midlands Deanery and professor of medical education at Staffordshire University.

Anna Batchelor

DR ANNA BATCHELOR is President of the Intensive Care Society and is a consultant anaesthetist and intensivist in the Department of Anaesthesia and Intensive Care Medicine, Newcastle-upon-Tyne National Health Service Trust, and the University of Newcastle-upon-Tyne, Newcastle-upon-Tyne, United Kingdom.

References

  • Anonymous. Bullying in medicine. Br Med J 2001; 323: 1314
  • Bowling A. Research Methods in Health. OUP, Buckingham 1997
  • Cassar K. Development of an instrument to measure the surgical operating theatre learning environment as perceived by basic surgical trainees. Med Teach 2004; 26: 260–264
  • Chambers R, Wall DW. Teaching Made Easy: A Manual for Health Professionals. Radcliffe Medical Press Ltd, Abingdon 2000
  • Field A. Discovering Statistics Using SPSS for Windows. Sage, London 2000
  • Foundation Programme Committee. Curriculum for the Foundation Years in Postgraduate Education and Training. HMSO, London 2005
  • Holt MC, Roff S. Development and validation of the Anaesthetic Theatre Education Environment Measure (ATEEM). Med Teach 2004; 26: 553–558
  • Jamieson S. Likert scales: how to (ab)use them. Med Educ 2004; 38: 1217–1218
  • Lowry S. What's wrong with medical education in Britain? Br Med J 1992; 305: 1277–1280
  • Lowry S. Teaching the Teachers. Br Med J 1993; 306: 127–130
  • Musselman LJ, Macrae HM, Reznick RK, Lingard LA. You better learn under the gun: intimidation and harassment in surgical education. Med Educ 2005; 39: 353–357
  • Quine L. Workplace bullying in junior doctors: questionnaire survey. Br Med J 2002; 324: 878–879
  • Roff S. Education environment: a bibliography. Med Teach 2005; 27: 353–357
  • Roff S, McAleer S, Harden M, Al-Qahtani M, Ahmed A, Deza H, Groenen G, Primparyon P. Development and validation of the Dundee Ready Education Environment Measure (DREEM). Med Teach 1997; 19: 295–299
  • Roff S, McAleer S, Skinner A. Development and validation of an instrument to measure the postgraduate clinical learning and teaching educational environment for hospital-based junior doctors in the UK. Med Teach 2005; 27: 326–331
  • SCOPME. Teaching Hospital Doctors and Dentists to Teach: Its Role in Creating a Better Learning Environment Proposals for Consultation—Full Report. Standing Committee on Postgraduate Medical and Dental Education, London 1992
  • SCOPME. Creating a Better Learning Environment in Hospitals 1: Teaching Hospital Doctors and Dentists to Teach. Standing Committee on Postgraduate Medical and Dental Education, London 1994
  • Spencer J. Learning and teaching in the clinical environment. Br Med J 2003; 326: 591–594
  • Spencer J, Lennard T. Time for gun control? Med Educ 2005; 39: 868–869
  • Turoff M. The design of a policy Delphi. Technol Forecast Soc Change 1970; 2: 149–171
  • Wall D, Whitehouse A, Campbell I, Kelly S, Cook S. Computerized evaluations of their education and training by senior house officers in the West Midlands. Hospital Med 2000; 61: 54–56
  • Wall D, Woodward D, Whitehouse A, Kelly S, O’Regan C, Dykes P, Cook S. The development and uses of a computerized evaluation tool for senior house office posts in the West Midlands Region, UK. Med Teach 2001; 23: 24–28
  • Wolf TM, Randall HM, Von Almen K, Tynes LL. Perceived mistreatment and attitude change by graduating medical students: a retrospective study. Med Educ 1991; 25: 182–190

Appendix 1

PHEEM

(Postgraduate Hospital Educational Environment Measure)

Sex: □ Male □ Female

Training Grade: □ PRHO □ F2 □ SHO □ SpR

Years in present grade: □ 1 □ 2 □ 3 □ 4 □ 5

Specialty: □ Surgical □ Medical □ Paediatric

□ Obs & Gynae □ Anaesthetic □ Critical Care

□ Foundation □ Other

The following items relate to your current experience. Please read each statement and rate it as it applies to your own feelings about your present position in this hospital. It is about your personal perceptions of the current post.

Please tick the appropriate box:
