
Basic life support skills training in a first year medical curriculum: six years’ experience with two cognitive–constructivist designs

Pages e49-e58 | Published online: 03 Jul 2009

Abstract

Rationale: Although Basic Life Support (BLS) ability is a crucial competence for medical students, poor BLS training programs have been documented worldwide and better training designs are needed. This study provides detailed descriptions and test results of two cognitive–constructivist models for BLS skills training in the first year of the medical curriculum.

Method: A BLS skills training module was implemented in the first-year curriculum over 6 years (1997–2003). The content was derived from the European Resuscitation Council guidelines. Initially a competence-based model was used; it was upgraded to a cognitive apprenticeship model in 2000. The expected end-of-course performance was competent application of BLS procedures on manikins and peers at an OSCE, together with at least 60% on a 25-item MCQ test. A retrospective cohort survey design, using exam results and a self-completed anonymous student ratings questionnaire, was employed to test the models.

Results: Training time for individual students varied from 21 to 29 hours. One thousand seven hundred and sixty students were trained. Fail rates were very low (1.0–2.2%), and students were highly satisfied with the module throughout the 6 years.

Conclusion: In the first year of the medical curriculum, using cognitive–constructivist skills training designs (competence-based or cognitive apprenticeship) with 9 hours of theory and 12–20 hours of practical sessions in groups of 12–17 students, medical students reached sufficient competence to perform BLS skills on manikins and peers. These cognitive–constructivist designs are also associated with high student satisfaction. However, the lack of controls limits the extrapolation of this conclusion.

Introduction

The term Basic Life Support (BLS) refers to maintaining an airway and supporting breathing and the circulation. It comprises the following elements: initial assessment of the person, airway maintenance, expired air ventilation (rescue breathing; mouth-to-mouth ventilation) and chest compression. When all are combined, the term cardiopulmonary resuscitation (CPR) is used (Handley et al., 2001). According to the European Resuscitation Council guidelines, almost 60 different skills are required to perform BLS (Handley et al., 2001; Philips et al., 2001). Although effective BLS decreases morbidity and mortality and is a core skill for all healthcare professionals, several authors worldwide have documented that training programs in this area are poor (Kaye & Mancini, 1998; Garcia-Barbero & Caturla-Such, 1999; Perkins et al., 1999; Jordan & Bradley, 2000; Phillips & Nolan, 2001). Schools vary greatly in training time and in the programming of theory and practice. In the surveyed programs, all taught one- and two-rescuer adult BLS, most included management of the obstructed adult airway, and the majority also covered adult BLS with spinal injury; courses concentrated on practical demonstration and practice of BLS, and student competence was assessed in most courses, usually after initial training. Although BLS guidelines are disseminated widely, there is no standard program and there is still great confusion about what the exact content and educational approach should be (Kaye & Mancini, 1998; Garcia-Barbero & Caturla-Such, 1999; Jordan & Bradley, 2000; Phillips & Nolan, 2001). More standardization seems to be called for: taking current educational principles into account, carefully designed BLS skills training programs are needed for health professionals.

Educational perspectives of skills training

To design and test an educational program that places ‘how people learn’ at the center, it is essential to use sound educational principles and to consider the evidence on their implementation under different conditions. In this process, the content and objectives of teaching, learners’ prior knowledge, the teaching and learning environment and methods, assessment of learners’ achievements and program evaluation are the main issues. In terms of skills training, there is a growing trend to introduce clinical skills earlier in the curriculum (Lam et al., 2002). Numerous reports cite examples of cognitive–constructivist curricular reform that include clinical skills training and the creation of clinical skills centers (Dent, 2001), the use of simulation (Issenberg et al., 1999; Issenberg, 2002) and the development of outcome measures, such as the OSCE, that more accurately and reliably assess clinical competence (Whelan, 2000). Moreover, recent experiences in the design of skills training focus on cognitive–constructivist views of learning and emphasize adult learning principles, competence-based instruction, expert modeling and situated learning (Hamo, 1994; Stillman et al., 1997; Wilson & Jennet, 1997; Boulay & Medway, 1999; Issenberg, 2002; Patrick, 2002; Rolfe & Sanson-Fisher, 2002).

According to the cognitive–constructivist view, skills development consists of three definable stages: (1) initial skill acquisition; (2) skill compilation (proceduralization); and (3) skill automaticity (Ford & Kraiger, 1995; Patrick, 2002a). Through practice, feedback, reflection and experience, declarative knowledge is compiled and proceduralized. Sequencing the content, practice, whole and part drill, careful monitoring of students’ performance, prevention or correction of misconceptions, constructive feedback and formative performance assessment are emphasized in the training process (Ford & Kraiger, 1995; Patrick, 2002b). The use of skills guidelines and/or checklists has been shown to be beneficial in exposing medical students to practical procedures (Hunskaar & Seim, 1985; Bruce, 1989; Hamo, 1994; Boulay & Medway, 1999).

Competency-based education, which consists of a functional analysis of professions’ occupational roles, translation of these roles into outcomes, and assessment of learners’ progress in demonstrated whole-task performance of these outcomes, has become dominant at most stages of medical education (Leung, 2002). Its potential advantages include individualized flexible training, transparent standards and increased public accountability (Leung, 2002).

The cognitive apprenticeship model is based on a practical educational approach. It reflects a situated perspective by seeking to contextualize learning (Brown et al., 1989). It has been proposed as an instructional method for imparting expert models to novices and includes six sub-processes to be undertaken by the trainer and students: modeling, coaching, scaffolding, articulation, reflection and exploration (Ford & Kraiger, 1995). The first three sub-processes are under the control of the trainer; the last three are performed by the students.

Assessment validates intended learning outcomes, offers a unique opportunity for program evaluation, and should be congruent with sound educational principles. If the educational principles of the curriculum are not reflected in and reinforced by the assessment program, the ‘hidden curriculum’ of assessment objectives will prevail (Harden & Gleeson, 1979; Hafferty, 1998). The Objective Structured Clinical Examination (OSCE) format emphasizes the assessment of competence in a structured fashion and has widely influenced the assessment programs of many medical schools over the last 30 years (Harden & Gleeson, 1979; Mennin & Kalishman, 1998; Fowel & Bligh, 2001).

Student ratings are one of the data sources with the potential to address the evaluation of educational interventions, namely the materials, staff, and the effectiveness and outcomes of the educational process (Marsh, 1984; McKeachie, 1996; Greenwald, 1997). In our BLS training design we tried to use almost all of the educational principles given above.

Study aims and research questions

Although BLS is a well-known procedure and one of the major skills for healthcare professionals, adequately designed BLS training programs are demonstrably scarce. This prompted us to retrospectively analyse our experience. We aim to share our 6 years of cognitive–constructivist BLS skills training while focusing on two main aspects: (1) a description of the training program's context, educational approach and instructional methods; and (2) an evaluation of students’ achievements and satisfaction.

To test our design we addressed three research questions:

  1. Did students develop sufficient knowledge and skills to competently perform BLS within the instruction provided over the 6 years?

  2. To what extent were the students satisfied with learning BLS skills within this context and educational approach?

  3. Have the students’ competencies and level of satisfaction changed over the years?

Method

Setting

In 1997, the Medical Education Committee (UMEC) of Ege University, İzmir, Turkey, recognized the need for competence-based Basic Life Support (BLS) skills training in the early years of the medical curriculum and delegated the training design and implementation tasks to a multi-professional group from the Department of Anesthesiology and the Medical Education Unit (since 1999, the Department of Medical Education; DME). Before this initiative, BLS and Advanced Trauma Life Support (ATLS) courses were taught only in the 5th-year Anesthesiology clerkship. The BLS skills training module has now run for 6 years (1997–2003) in the first year of the curriculum, while BLS reinforcement and the ATLS course still take place in the 5th year.

Organization and structure of the program

At the beginning, the planning group adopted a modular, competence-based design including lectures, video demonstration and skills tutorials (stage 1). The module's pedagogical background was upgraded to another cognitive–constructivist design, the cognitive apprenticeship model, in 2000 (stage 2). MCQ tests and checklist-based performance assessment on manikins were used to assess students’ achievements. Student evaluation questionnaires (BLS-SEQ) were developed and used in each stage.

Training initially took place in a temporary skills laboratory; in 2000, a permanent skills laboratory with far better conditions was completed and training moved there. The study guide (which includes a yearly schedule, a translated version of the ERC BLS guidelines and skills training checklists) was developed and upgraded yearly. More than 16 skills tutors were involved in the process. Table 1 summarizes the main properties of the modules in the related years.

Table 1.  Basic properties of implemented BLS skills training modules

Content and objectives

Content was derived from the ERC guidelines. Initial assessment of the case, airway management, the recovery position, and adult and pediatric CPR are the main competence domains.

Our objectives at the end of training were that students:

  1. must have in-depth knowledge and understanding of CPR algorithms as a basis for the procedural skills of BLS;

  2. must be able to assess the situation appropriately (initial assessment and field diagnosis) and apply the correct initial procedures with sufficient speed for the emergency case (safety, consciousness check, call for help);

  3. must be able to correctly manage the airway (including mouth control, neck positioning and foreign-body removal from the upper respiratory tract) or place the person in the recovery position;

  4. must be able to apply rescue breathing and ventilation in a timely, sufficient and procedural manner;

  5. must be able to correctly assess vital signs and appropriately apply the pulse-checking procedure;

  6. must be able to apply well-dosed cardiac massage in the correct place, with appropriate rhythm and pressure over time; and

  7. must be able to accurately use an automated external defibrillator (added in 2000) on infant, child and adult manikins and on peers (recovery position).

Instructional methods and their implementation

All skills tutorials took place in the skills laboratory, using Resusci Anne™ and Laerdal CPR trainer manikins, with the students themselves acting as patients for the recovery position.

Figure 1 illustrates the educational approaches employed by the module over the years.

  • Stage 1 (1997–1999)

Figure 1. The BLS Skills Training module's educational approaches.


Students were trained in six skills tutorials of 3 hours each. Before the skills tutorials, two introductory lectures on general BLS concepts and principles and one video session were provided. The skills tutorials were: (1) initial assessment and airway management; (2) foreign-body removal and recovery position; (3) one-rescuer adult CPR; (4) two-rescuer adult CPR; (5) pediatric CPR; and (6) self-study. In the first five tutorials, after reviewing the skills checklists, we demonstrated the skills clearly and allowed students to practice; we provided feedback and encouraged peers to do the same. Students practiced each procedure at least three times in the tutorials and received extensive feedback.

In the following year, the number of tutors was insufficient to provide 20 hours of practical training to 372 students, so practical time was decreased to 14 hours (six 2-hour skills tutorials and 2 hours of self-study). The module was repeated in similar form in 1999.

  • Stage 2 (2000–2003)

The time had come to add the automated external defibrillator and to revise the content under the guidance of the ERC 2000 guidelines. The ERC guidelines sequenced the BLS task into eight meaningful procedures. Based on these sequences and their meaningful procedures, while upgrading our design to the cognitive apprenticeship model, we prepared four main skills tutorials for adult cases as the core of the module and assumed that students could transfer their procedural knowledge and skills to pediatric cases. We therefore set up only one skills tutorial for pediatric cases, placed after the three adult CPR tutorials. The BLS algorithm-based skill clusters are presented in Figure 2.

Figure 2. Basic Life Support Skills and their clustered skills tutorial at the module.


Because many students complained about the quality of the information given via a documentary film, we removed it and instead added one scenario-based reinforcement-repetition session. Students were encouraged to apply their procedural knowledge to various case scenarios, and to articulate and discuss it with their peers. The activities are presented in Table 2.

Table 2.  Types and sessions of the module’s instructional activities by stage

The components of the cognitive apprenticeship model used in stage 2 and their counterparts are presented in Table 3.

Table 3.  Components of the Cognitive Apprenticeship Model and their counterparts in the BLS module

Tutors and tutor training

The tutors were anesthesiologists and trained general practitioners from the DME. All tutors were regular users of BLS and Advanced Trauma Life Support procedures in their work. The Department of Anesthesiology (DA) had the content expertise and the DME had general pedagogical knowledge, but neither department alone had enough qualified staff members to train large student groups with the new instructional methods. Through collaboration between the departments, starting from the first year, we systematically trained each other. In this tutor training process, we developed content-specific pedagogical knowledge as well as BLS tutoring skills.

Subjects

One thousand seven hundred and sixty first-year students of EUFM who were trained in BLS skills between 1997 and 2003 participated in the study. Comparative analysis of students’ ratings and exam performances, by year and by type of instructional design, was used to answer the research questions.

Procedure

A case study format was used to describe the module. The number of students, learning environment, contents, main properties of the educational approaches employed, instructional formats, durations and student pass–fail rates were described for each year, and modifications of the module's elements were elaborated. To address the research questions, a retrospective cohort survey design based on self-completed student evaluation questionnaires and students’ assessment scores was used.

  • Students’ assessment scores

The expected end-of-training performance was competent application of BLS procedures, under various conditions, on manikins and peers at the OSCE station, together with at least 60% on a 25-item MCQ test.

To make our test results precise, we developed an evaluation checklist. Pilot testing was done with at least two tutors observing; after corrections, we trained the observers. We assessed both BLS procedural knowledge (MCQ) and skills (OSCE). For each OSCE, at least two stations were developed: one for pediatric and one for adult life support skills. One station used a conscious-case story, the other an unconscious-case scenario. With this approach we tried to overcome the content-specificity problem of testing medical competence and thus increase content validity; the number of participating students was sufficient to allow conclusions at group level. Each station consisted of one BLS task requiring more than fifteen skills. To increase fidelity, we used peers as ‘manikins’ in the conscious adult case scenario (for recovery positioning skills).

We set our standards with an Angoff procedure as a test-centered method (Wilkinson et al., 2001). At least four tutors took part in each Angoff procedure for the OSCE stations and in a modified Angoff procedure for the MCQ tests, and we defined the minimum pass level (MPL). Students who achieved the just-pass level received 60 points; this recoding procedure was required by the strict university exam legislation. A sample OSCE station and sample assessment MCQ items are presented in Appendices 6.1 and 6.2.
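To make the standard-setting arithmetic concrete, the sketch below shows a test-centered Angoff computation and the score recoding described above. This is an illustrative sketch only: the judge estimates are simulated, and the piecewise-linear rescaling around the 60-point just-pass mark is our assumption, since the paper states only that just-pass students received 60 points.

```python
import numpy as np

# Simulated Angoff judgments: each row is one judge's estimated probability
# that a borderline student answers each of the 25 MCQ items correctly.
judge_estimates = np.random.default_rng(1).uniform(0.4, 0.8, size=(4, 25))

# Minimum pass level (MPL): the judges' mean expected score, as a percentage.
mpl = judge_estimates.mean(axis=0).sum() / judge_estimates.shape[1] * 100

def recode(raw_pct: float, mpl: float, just_pass: float = 60.0) -> float:
    """Map a raw percentage so that a student exactly at the MPL receives
    the 60-point just-pass mark required by the university exam legislation
    (the piecewise-linear form is an assumption, not the authors' rule)."""
    if raw_pct < mpl:
        return raw_pct / mpl * just_pass                      # fail region: 0..60
    return just_pass + (raw_pct - mpl) / (100 - mpl) * (100 - just_pass)

print(f"MPL = {mpl:.1f}%, raw 75% recodes to {recode(75.0, mpl):.1f} points")
```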

  • Students’ Evaluation Questionnaire (SEQ)

For the first 3 years we used SEQa. Students were asked to rate their agreement with seven statements on a five-point Likert scale, ranging from 1 (completely disagree) to 5 (completely agree). We asked students to answer anonymously ‘how they found the module’, addressing the physical setting, module duration, resources and materials, skills tutors, and the opportunity to be actively involved in the sessions.

When we upgraded the module to the cognitive apprenticeship model in 2000, we added a few questions about the new model while maintaining the original questions of SEQa. SEQb consisted of three distinctive subsets: six ‘input’, four ‘process’ and two ‘output’ items, 12 statements in total. The first seven items were exactly the same as in SEQa.

Congruent with the new educational approach, we added five items aimed at evaluating students’ perceptions of their learning motivation, the relevancy of the content, the information load, the tutors’ guidance in the learning process and their enjoyment while learning. Both instruments invited written comments in an open-ended form, to collect other individual perceptions and elaboration of the ratings.

In developing both SEQa and SEQb, we used an item-pooling approach: all the tutors put items into the pool where they believed each item belonged, and we then reviewed the items and selected the most important concepts to be evaluated by students. The second step was a meeting with students, at which we asked for their opinions and remarks on the clarity and understandability of the questionnaire and revised the wording. The last step was piloting: we asked twenty first-year students who had not attended the module, and final corrections were made at a tutors’ meeting based on the pilot results.

Questionnaires were administered at the end of the exams. Response rates were 81.1% for SEQa and 88.4% for SEQb, 85.1% in total. Cronbach's alpha coefficients were 0.78 for SEQa and 0.83 for SEQb. Although SEQa did not reach the 0.80 standard for the reliability coefficient (Bryman & Cramer, 2002), the written comments provided deeper elaboration. The students’ written comments were analyzed by a simple content analysis method (positive/negative/module's elements).
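For reference, Cronbach's alpha — the internal-consistency coefficient reported above — can be computed directly from the respondents × items rating matrix. The sketch below uses simulated ratings, not the study's data:

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores),
    for a respondents x items matrix of Likert ratings."""
    k = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)
    total_variance = responses.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# 300 hypothetical respondents rating the 12 SEQb items on a 1-5 scale.
ratings = np.random.default_rng(7).integers(1, 6, size=(300, 12))
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```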

An absolute standard was used for the Likert-type items. The percentages of low (<3) and high (>3) rating scores and basic item statistics (means, percentiles, standard deviations) were used to identify strong and weak aspects.
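A minimal sketch of this item-level summary, assuming the ratings are held in a respondents × items pandas DataFrame (the data layout and column names are our assumption):

```python
import pandas as pd

def summarize_items(ratings: pd.DataFrame) -> pd.DataFrame:
    """Per-item summary used to flag strong (>3) and weak (<3) aspects
    against the absolute standard of the 5-point scale's midpoint."""
    return pd.DataFrame({
        "pct_low":  (ratings < 3).mean() * 100,   # % of ratings below 3
        "pct_high": (ratings > 3).mean() * 100,   # % of ratings above 3
        "mean":     ratings.mean(),
        "sd":       ratings.std(ddof=1),
        "p25":      ratings.quantile(0.25),
        "median":   ratings.median(),
        "p75":      ratings.quantile(0.75),
    })
```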

The clients of the evaluations were the students (as information providers and customers), the module planning group and the tutors (as information receivers, valuing and reflecting toward improvement of the module) and the dean's office (as curriculum administrator). The module planning group interpreted the assessment and SEQ results in an improvement plan through the years: the instructor and tutors met several times a year, shared their views and collaboratively interpreted the students’ ratings, and instructional changes were made based on these meetings and on the number of students in a given year.

SPSS for Windows (version 11.5) was used for data analysis.

Results

Students’ performances on assessments

Fail rates were very low (1.0–2.2%). In the MCQ exams, almost all students achieved high marks (80.5 ± 12.1), and they performed well on the manikins and their peers at the OSCE stations (81.7 ± 12.5). The views of the tutors support these results.

In terms of students’ scores (MCQ and OSCE), there was no difference between the years. A significant correlation was found between the MCQ and OSCE scores (Pearson r = 0.281, p < 0.001) across all years.

Table 4.  Means and standard deviations (in brackets) of the students’ scores by year
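The association reported above is a plain Pearson correlation between the two score sets. A hedged sketch with simulated scores (the per-student data are not reproduced in the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mcq = rng.normal(80.5, 12.1, size=1760)                # simulated MCQ marks
osce = 0.3 * mcq + rng.normal(57.0, 11.5, size=1760)   # weakly related OSCE marks

r, p = stats.pearsonr(mcq, osce)
print(f"Pearson r = {r:.3f}, p = {p:.3g}")
```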

Students’ evaluation results

Table 5 shows the yearly survey results.

Table 5.  Educational years, number of responses, coverage rates, and means and standard deviations (in brackets) of the students’ ratings

Average ratings of the input variables varied between 3.5 and 4.5, indicating a positive impression, which we also see in the process variables (4.1–4.5). As a control item, we asked about information overload in a negative formulation (item 9); students indicated that they were not overloaded with information (2.4–3.0).

Adults learn if they are motivated and if the training gives them perspective on the relevance of the skills to be learned to their profession or future job (Peyton, 1998). Reflecting these two main adult learning principles, the output items added in stage 2 asked about students’ motivation to learn and their perceptions of the module's relevancy to their future profession. A similarly positive impression was found in these variables (3.9–4.2), and the module was regarded as highly relevant both for their future profession and for the following years of the curriculum.

Mean scores of all positively worded satisfaction items were quite stable, at around 4, and item scores converged over the years. Although we decreased the practical hours, this stable, high satisfaction was still observed in the last years.

One-way ANOVA showed significant differences in students’ satisfaction for nine items. Table 6 indicates the causes of these differences according to the Bonferroni post hoc analysis results; they are elaborated further below. Because there were no significant differences by year for items 3, 11 and 12, we did not include them in the following analysis.
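The paper reports these analyses from SPSS 11.5; the sketch below shows an equivalent computation for one item, assuming a long-format table with hypothetical 'year' and 'rating' columns, and applying Bonferroni as a corrected alpha over pairwise t-tests:

```python
from itertools import combinations

import pandas as pd
from scipy import stats

def anova_with_bonferroni(item: pd.DataFrame, alpha: float = 0.05):
    """One-way ANOVA of one item's ratings across educational years,
    followed by Bonferroni-corrected pairwise comparisons."""
    groups = [g["rating"].to_numpy() for _, g in item.groupby("year")]
    f_stat, p_value = stats.f_oneway(*groups)

    pairs = list(combinations(sorted(item["year"].unique()), 2))
    threshold = alpha / len(pairs)                  # Bonferroni correction
    pairwise_p = {
        (a, b): stats.ttest_ind(item.loc[item["year"] == a, "rating"],
                                item.loc[item["year"] == b, "rating"]).pvalue
        for a, b in pairs
    }
    significant = {k: v for k, v in pairwise_p.items() if v < threshold}
    return f_stat, p_value, significant
```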

  • Item 1. Physical setting

Table 6.  Bonferroni post hoc analysis results for significant differences

As seen in the table, there is no difference within stage 1 or within stage 2; the difference lies between the stages.

  • Item 2. Duration of the module

There is a moderately significant difference, caused by the 3rd year.

  • Item 4. Module notes and study guide

In the first year of each stage, the module notes and study guide received relatively lower scores from students.

  • Item 5. Tutors’ knowledge and skills to teach BLS skills.

  • Item 6. Tutors’ coaching.

  • Item 7. Active involvement in the sessions.

Item 5 did not exist in SEQa, so we compared only the SEQb means. For items 5, 6 and 7, the 4th-year mean scores were slightly better than those of the following years, which caused the moderate significance.

  • Item 8. Level of enjoyment while learning.

The mean difference between the 1st- and 4th-year modules is the cause of the significance; students’ evaluations were relatively lower in the 1st year than in the 4th.

  • Item 9. Load of given information.

  • Item 10. Guidance information.

These items did not exist in SEQa. In the 5th and 6th years, students’ perceptions were moderately more negative than in the 4th year.

  • Items 3, 11 and 12 showed no significant differences by year.

Discussion

The assessment scores were generally high and fail rates were low. The MCQ and OSCE scores showed a significant correlation.

According to the students’ ratings, they were highly satisfied with the module. The questionnaire's response rate was high and its reliability acceptable.

Based on the students’ free comments, we attribute their high level of satisfaction to two main factors:

  1. The pedagogical principles of the module allowed us to align the instruction constructively, and the cognitive–constructivist approach fitted well with the students’ expectations and needs. Supporting the tutorials with scenarios increased the realism of the training, permitting complete practice, exploration and reflection. One important point is that using scenarios requires a careful task analysis: BLS skills should be clustered into sequential procedures that are applicable to individual scenarios. This makes instruction context-bound and outcome-focused, and thus coherent with adult learning principles (Peyton, 1998).

  2. The timing of, and the time devoted to, BLS skills training in the curriculum is another explanation for the high level of student satisfaction. BLS skills training in the first year fostered students’ motivation toward learning: students saw themselves as doctors who save lives, and indeed they wrote these very words in the questionnaires. Our results support the usefulness of introducing clinical skills in the early years of the curriculum (Lam et al., 2002).

In terms of significant differences in student ratings between the years and/or stages: for the physical setting (item 1), we attribute the difference to the change in skills laboratory conditions. We have no objective findings to explain the moderate difference in judgments of the duration of the module (item 2) caused by the 3rd year. We attribute the moderate significance in items 5, 6 and 7 seen in the 4th year (the first year of stage 2) to the high motivation of the tutors. For the level of enjoyment while learning (item 8), we have no attributable findings to explain the moderate difference between the 1st and 4th years; perhaps being novices in small-group teaching caused the lower satisfaction in the 1st year. For item 9 (information overload) and item 10 (satisfaction with guidance information), the amount and type of information did not change over the last three years, yet the best satisfaction scores were found in the first year of the cognitive apprenticeship module (the 4th year); a general tendency to rate the first year of stage 2 higher might explain the differences in these items.

According to the WHO European survey of 1997, BLS training in medical schools varies in content, format (theory and/or practice) and time spent (13.3 ± 9.7 hours) (Garcia-Barbero & Caturla-Such, 1999), and there is still a long way to go toward standardized training (Perkins et al., 1999; Jordan & Bradley, 2000). The European survey points out that more time is devoted to theoretical teaching than to practice: Turkish medical schools provide 6.9 ± 6.6 hours of theory and 6.9 ± 5.7 hours of practice for BLS training. In our designs, we devoted a similar amount of time to theory (∼9 hours) but two to three times more to practice (12–20 hours).

One limitation of this study is that, although we upgraded the design in stage 2, our findings do not give enough specific information to compare the stages: we do not know whether upgrading the educational approach to a cognitive apprenticeship model affected the students’ performance and evaluations. As we moved toward smaller-group instruction over the years, we emphasized the students’ own reflections. Although the number of practical hours decreased, our results show that quality remained stable or even improved, and in the second stage students are likely to have learned more meaningfully.

The module's design and quality-improvement perspectives triggered and fostered several local innovations, including the development and implementation of a structured skills training format, the introduction of a skills laboratory, interdisciplinary cooperation, skills tutors and their training, and skills testing in Objective Structured Clinical Examination format.

As a ‘positive side effect’ of this experience, beyond the module itself, in 2001 we trained nearly 2000 government employees working as primary healthcare workers, in 28 one-day courses, using a compact program with the same pedagogical approach.

Conclusion

The results of this study suggest that, in the first year of the medical curriculum, structured, outcome-focused training in a skills laboratory based on the cognitive apprenticeship model, with 9 hours of theory (lecture, BLS algorithm discussion) and 12–20 hours of practice in tutorial groups of 12–17 students, enables medical students to develop sufficient knowledge and skills to competently perform BLS on manikins and their peers by the end of training. This cognitive–constructivist design for skills training is sufficient for performing BLS competently at an OSCE and is associated with stable, high student satisfaction. However, as a final limitation of the study, the lack of controls limits the extrapolation of this educational design to the acquisition of BLS skills by first-year medical students elsewhere.

Acknowledgement

We are thankful to our colleagues, Dr. S. Elif Törün, Dr. Kevser Vatansever, Dr. Sürel Karabilgin, and Dr. A. Hilal Batı for their tutorship, and to all our students involved in the BLS skills training 1997–2003.

Additional information

Notes on contributors

Halil İbrahim Durak

HALİL İ. DURAK and AYHAN ÇALIŞKAN, of Ege University Faculty of Medicine, Department of Medical Education, İzmir, Turkey, are the module's skills tutors. HİD provided the original idea for this study, SAÇ contributed to data collection and analysis, and HİD wrote the manuscript.

Agah Çertuğ

AGAH ÇERTUĞ, Ege University Faculty of Medicine, Department of Anesthesiology, İzmir, Turkey, is the main instructor of the BLS skills training program.

Jan Van Dalen

JAN VAN DALEN, University of Maastricht, Skillslab, Maastricht, The Netherlands, provided support, guidance and amendments to the manuscript.

References

  • Boulay C, Medway C. The clinical skills resource: a review of current practice. Medical Education 1999; 33: 185–191
  • Brown JS, Collins A, Duguid P. Situated cognition and the culture of learning. Educational Researcher 1989; 18: 32–42
  • Bruce NC. Evaluation of procedural skills of internal medicine residents. Academic Medicine 1989; 64: 213–216
  • Bryman A, Cramer D. Concepts and their measurement. In: Bryman A, Cramer D. Quantitative Data Analysis with SPSS Release 10 for Windows. Routledge, New York 2002; 54–68
  • Dent JA. Current trends and future implications in the developing role of clinical skills center. Medical Teacher 2001; 23: 483–489
  • Ford KJ, Kraiger K. The application of cognitive constructs and principles to the instructional systems model of training: implications for needs assessment, design and transfer. International Review of Industrial and Organizational Psychology 1995; 10: 1–47
  • Fowel S, Bligh J. Assessment of undergraduate medical education in the UK: time to ditch motherhood and apple pie. Medical Education 2001; 35: 1006–1007
  • Garcia-Barbero M, Caturla-Such J. What are we doing in cardiopulmonary resuscitation training in Europe? An analysis of a survey. Resuscitation 1999; 41: 225–236
  • Greenwald AG. Validity concerns and usefulness of students' ratings. American Psychologist 1997; 52: 1182–1186
  • Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Academic Medicine 1998; 73: 403–407
  • Hamo IM. The role of the Skills Laboratory in the integrated curriculum of the Faculty of Medicine and Health Sciences, UAE University. Medical Teacher 1994; 16: 167–178
  • Handley AJ, Monsieurs KG, Bossaert LL. European Resuscitation Council Guidelines 2000 for Adult Basic Life Support. A statement from the Basic Life Support and Automated External Defibrillator Working Group (1) and approved by the Executive Committee of the European Resuscitation Council. Resuscitation 2001; 48: 199–205
  • Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination. Medical Education 1979; 13: 19–22
  • Hunskaar S, Seim SH. Medical students’ experiences in medical emergency procedures upon qualifications. Medical Education 1985; 19: 294–298
  • Issenberg SB. Clinical skills—training makes perfect. Medical Education 2002; 36: 210–211
  • Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. Journal of the American Medical Association 1999; 282: 861–866
  • Jordan T, Bradley P. A survey of basic life support training in various undergraduate health care professions. Resuscitation 2000; 47: 321–323
  • Kaye W, Mancini ME. Teaching adult resuscitation in the United States—time for a rethink. Resuscitation 1998; 37: 177–187
  • Lam TP, Irwin M, Chow LW, Chan P. Early introduction of clinical skills teaching in a medical curriculum—factors affecting students’ learning. Medical Education 2002; 36: 233–240
  • Leung WC. Competency based medical training. British Medical Journal 2002; 325: 693–696
  • Marsh HW. Students’ evaluations of university teaching: dimensionality, reliability, validity, potential biases and utility. Journal of Educational Psychology 1984; 76: 707–754
  • McKeachie WJ. Student ratings of teaching. American Council of Learned Society Occasional Paper 1996; 33: 12–17
  • Mennin SP, Kalishman S. Student assessment. Academic Medicine 1998; 73: s46–s54
  • Patrick J. Learning and skill acquisition. Training Research and Practice. Academic Press, North Yorkshire 2002a; 19–74
  • Patrick J. Task oriented analysis. Training Research and Practice. Academic Press, North Yorkshire 2002b; 131–168
  • Perkins GD, Hulme J, Shore HR, Bion JF. Basic life support training for health care students. Resuscitation 1999; 41: 19–23
  • Peyton JWR (Ed.). Teaching & Learning in Medical Practice. Manticore Europe Ltd, Guildford 1998
  • Philips B, Zideman D, Wyllie J, et al. European Resuscitation Council Guidelines 2000 for Basic Paediatric Life Support. Resuscitation 2001; 48: 223–229
  • Phillips PS, Nolan JP. Training in basic and advanced life support in UK medical schools: questionnaire survey. British Medical Journal 2001; 323: 22–23
  • Rolfe IE, Sanson-Fisher RW. Translating learning principles into practice: a new strategy for learning clinical skills. Medical Education 2002; 36: 345–352
  • Stillman PL, Wang Y, Ouyang O, et al. Teaching and assessing clinical skills: a competency-based programme in China. Medical Education 1997; 31: 33–40
  • Whelan GP. Educational Commission for Foreign Medical Graduates: lessons learned in a high-stakes, high-volume medical performance examination. Medical Teacher 2000; 22: 293–296
  • Wilkinson T, Newble DI, Frampton CM. Standard setting in an objective structured clinical examination. Use of global ratings of borderline performance to determine the passing score. Medical Education 2001; 35: 1043–1049
  • Wilson DB, Jennet PA. The Medical Skills centre at the University of Calgary Medical School. Medical Education 1997; 31: 45–48
