Web Paper

Web-based collaborative training of clinical reasoning: A randomized trial

T. Raupach, MD, C. Muenscher, S. Anders, R. Steinbach, T. Pukrop, I. Hege & M. Tullius
Pages e431-e437 | Published online: 09 Sep 2009

Abstract

Background: Clinical reasoning skills are essential for medical practice. Problem-based collaborative learning via the internet might prove useful in imparting these skills.

Aim: This randomized study assessed whether web-based learning (WBL) is superior to face-to-face problem-based learning (PBL) in the setting of a 6-week cardio-respiratory course.

Methods: During winter term 2007/08, all 148 fourth-year medical students enrolled in the 6-week course consented to be randomized in small groups to diagnose a patient complaining of dyspnoea either in a virtual collaborative online module or in a traditional PBL session. Clinical reasoning skills were assessed by means of a key feature examination at the end of the course.

Results: No significant difference between the mean scores of both study groups was detected (p = 0.843). In virtual learning groups, costs for diagnostic tests were significantly correlated to the number of contributions to online group discussions (r = 0.881; p = 0.002). Evaluation data favored traditional PBL sessions over virtual collaborative learning.

Conclusion: While virtual collaborative learning was as effective as traditional PBL regarding the acquisition of clinical reasoning skills, it was less well accepted than traditional PBL. Future research needs to determine the ideal format and time-point for computer-assisted learning in medical education.

Introduction

Clinical reasoning skills are essential for medical practice (Lee Citation2006). Problem-based learning (PBL) (Norman & Schmidt Citation1992) has been identified as one possible way of fostering these skills since the group setting is particularly effective in imparting problem-solving skills (Margetson Citation1996; Medelez Ortega et al. Citation2003). In recent years, the use of computers has become widely accepted in medical education. It is not yet clear which learning objectives might be best achieved using computers as opposed to attending ‘real’ teaching sessions (Cook Citation2006). Regarding the acquisition of factual knowledge, computer-assisted instruction (CAI) has not been found to be superior to traditional teaching methods (Schwartz & Griffin Citation1993). However, there is evidence of an effect of CAI on the development of problem-solving skills (Weverling et al. Citation1996). Modern learning management systems (Sparacia et al. Citation2007) seek to merge computer technology with interactive features as provided by the world-wide-web. Web-based learning (WBL) is thought to be advantageous due to its flexible scheduling, economies of scale, ease in updating, and the potential for the delivery of individualized teaching (Cook Citation2007).

By combining the PBL approach with modern electronic resources including the internet, thus creating ‘virtual collaborative learning’, a new educational tool was developed (Stromso et al. Citation2004; Taradi et al. Citation2005). While it has been hypothesized that this teaching format might be capable of modeling clinical reasoning (Kamin et al. Citation2002; Hammoud et al. Citation2006), substantial support for this hypothesis is lacking. A number of confounders need to be controlled for in media-comparative studies, and care must be taken to match learning objectives with the tools used to assess learning outcome (Hudson Citation2004). In order to examine the effectiveness of virtual PBL in comparison with traditional PBL regarding the acquisition of clinical reasoning skills, a randomized study design using an appropriate assessment is required.

This study assessed whether students completing a web-based collaborative teaching module on the differential diagnosis of dyspnoea show higher performance in a test aimed at clinical reasoning skills than students discussing the same clinical case in a traditional teaching session.

Methods

The 6-year medical curriculum at Göttingen University comprises two pre-clinical and three clinical years, followed by a practice year after which final exams are held. The 3-year clinical part of the curriculum has a modular structure with the sequence of modules being identical for all students. Thus, all medical students take part in a 6-week, interdisciplinary cardio-respiratory module at the beginning of the 4th year. This course comprises lectures, seminars, practical skills training using simulators, clinical teaching sessions, and case-based learning in small groups (eight students). For the latter, each student is assigned to eight face-to-face meetings, all of which are tutored by the same clinical teacher. During these meetings, case histories are discussed starting with a presenting complaint and gradually unfolding the patient's past medical history, symptoms and findings as well as results of diagnostic tests as students request them. Thus, face-to-face sessions do not represent true ‘PBL’ as defined by Barrows (Barrows & Mitchell Citation1975). However, they cannot be classified as pure case-based learning either since clinical teachers motivate students to work out their own diagnostic strategies. In addition, students are asked to consult text books and current literature between sessions in order to clarify issues arising during face-to-face meetings and to prepare short talks to be presented during the following session.

Description of the online module

During winter term 2007/08, a web-based collaborative teaching module on the differential diagnosis of dyspnoea was added to the 6-week curriculum. A learning management system (Clix®, Information Multimedia Communication AG, Saarbrücken, Germany) facilitating live chats, asynchronous group discussions and the exchange of documents was used to create the online module. It was designed to be completed by groups of eight students each. These groups were tutored by a postgraduate clinical teacher who logged on to the system every day.

During the online module, students were asked to diagnose a 64-year-old patient who presented to his general practitioner with chronic shortness of breath. In order to find the correct diagnosis (chronic thromboembolic pulmonary hypertension, CTPH), students had diagnostic procedures carried out on their virtual patient, the results of which were provided by their online tutor. The initial version of the online module was piloted in summer 2007; course content was adjusted according to feedback obtained from students and tutors. The revised module comprised five distinct steps (Figure 1). Access to each following step (e.g. from history and examination to diagnostic procedures) was only granted after the preceding step had been completed: As a first step, groups were asked to nominate one student who would supervise group discussions and contact the online tutor if necessary (De Wever et al. Citation2006). Students were then informed about the patient's presenting complaint and past medical history. A list of diagnostic procedures and their related costs was provided online, and each student was asked to upload a text file containing information on one specific diagnostic procedure to be viewed by all group members (step 2). Student groups were then asked to choose the three tests they considered most helpful for finding the correct diagnosis (step 3). After receiving the test results from their online tutor, students were given the opportunity to order as many further test results (one at a time) as they needed to reach a final diagnosis. Students were then asked to submit the ICD code corresponding to their tentative diagnosis (step 4). After obtaining the correct diagnosis from their tutor, students were asked to search for a contemporary guideline on the diagnosis and treatment of this disease (Galie et al. Citation2004). A PDF file of this article, freely available on the internet, was to be downloaded and uploaded to the learning management system (step 5).

Figure 1. Timeline of the study. The five distinct steps are described in the text.


According to suggestions from the literature (Soller Citation2001), milestones for the completion of each step were defined. Both course content and the didactic concept underlying the online module were discussed with respiratory specialists and faculty staff (Hudson Citation2004).

Study design

Prior to entering the 6-week cardio-respiratory module, 4th-year students were informed about the aims of this study and given the opportunity to sign up for participation in groups of up to eight students, irrespective of randomization. Preference to enroll as a group was accounted for during randomization in that groups of students who had signed up together were randomized as a group to either the control or the intervention setting. The randomization procedure also controlled for gender and students’ achievements in end-of-course exams during their 3rd year. All 148 students agreed to be randomized, and 84 signed up for participation in small groups prior to randomization. A total of 18 groups (eight students each) was formed, nine of which were assigned to each study arm (Figure 2). Five students withdrew from the 4th year shortly before the course. In the final study population, the proportion of female students was 43.1% in the intervention and 42.3% in the control group (p = 0.868); sum scores of 3rd-year exams were 325.8 ± 2.0 and 325.8 ± 1.2 points, respectively (p = 0.858).

Figure 2. Outline of the study design.


Students in the intervention group were asked to complete the online module while attending the 6-week course (blended learning). In order to compensate for the time spent on the online module, these students were only assigned to seven on-site small group discussions. Control subjects attended the cardio-respiratory module in its traditional format, thus taking part in eight small group sessions; the last 2-hour session was devoted to discussing the same case that was offered to the online group.

Evaluation and assessment tools

Students were asked to complete two online questionnaires (EvaSys®, Electric Paper, Lüneburg, Germany) during the course: The first questionnaire was to be completed on the 1st day of the cardio-respiratory course. It contained statements pertaining to the use of computers (e.g. ‘I am experienced in using computers and the internet’, ‘I like computer-assisted learning’) to be rated on a six-point Likert scale anchored by 1 (‘fully agree’) and 6 (‘completely disagree’). The second questionnaire, completed at the end of the 6-week course, covered student experience and satisfaction with the online module. Responses to scaled questions were coded as follows: Options 1 and 2 were counted as affirmative responses, options 5 and 6 as negative ratings, and options 3 and 4 as indifferent ratings.
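The three-band coding of Likert responses described above can be sketched as a small helper function (a hypothetical illustration of the coding rule; the study itself used the EvaSys® survey system, not custom code):

```python
def code_likert(option: int) -> str:
    """Collapse a six-point Likert rating (1 = 'fully agree',
    6 = 'completely disagree') into the study's three bands."""
    if option in (1, 2):
        return "affirmative"
    if option in (3, 4):
        return "indifferent"
    if option in (5, 6):
        return "negative"
    raise ValueError("Likert option must be between 1 and 6")
```

Proportions of affirmative responses, as reported in the Results, then follow from counting the "affirmative" band per item and group.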

Learning outcome regarding clinical reasoning was assessed using the key feature approach (Page & Bordage Citation1995). A key feature is defined as a critical step in the resolution of a problem. Thus, the term ‘key feature’ might either refer to a step in which examinees are most likely to make errors in exams or to an important aspect of the identification and management of the problem in practice. Using this definition, 11 key feature problems were designed, containing five questions each. An online learning system (CASUS®, Instruct AG, Munich, Germany) was used to facilitate two key feature examinations: On the 1st day of the course, all students were presented with five problems. The final exam conducted during the last week of the course contained all 11 problems. Both assessments were done at Göttingen University Hospital. Key feature problems were reviewed by clinical teachers familiar with the concept (SA and RS) and piloted with 10 students who had taken the cardio-respiratory course 6 months prior to the study. Problems encountered during the pilot phase were addressed before using the key feature problems in this study.

On the last day of the 6-week cardio-respiratory course, all students took a summative examination made up of 68 multiple choice questions mainly assessing factual knowledge. Test results were compared to students’ achievements in the key feature exams.

The local ethics committee approved the study protocol.

Data acquisition and analysis

The primary endpoint of this study was the difference in overall key feature scores between the intervention and the control group at the end of the course. The study was adequately powered to detect a 3.5-point difference at the 5% level (1 − β = 82%; effect size 0.48). Secondary analyses focused on the costs generated by diagnostic procedures in relation to the amount of discussion on the online platform for each group. Statistical analysis was carried out using SPSS 12.0.1 (SPSS Inc., Chicago, Illinois, USA); p values (significance level 5%) were derived from t-tests. Data are presented as mean ± standard deviation.
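The stated power can be checked roughly with the normal approximation to the two-sample t-test (a sketch, not the authors' calculation; group sizes of about 72 per arm are assumed from the enrolment figures, and the approximation ignores the small t-distribution correction):

```python
from statistics import NormalDist

def approx_power(d: float, n_per_group: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided two-sample t-test for
    standardized effect size d, via the normal approximation."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = d * (n_per_group / 2) ** 0.5
    return NormalDist().cdf(noncentrality - z_crit)

# Effect size 0.48 with ~72 students per arm gives roughly 82% power,
# matching the 1 − β = 82% reported above.
print(round(approx_power(0.48, 72), 2))
```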

Results

Baseline questionnaire and key feature assessment

In the online survey conducted on the 1st day of the course, students stated that they were familiar with using computers and the internet (proportion of affirmative responses: 68.8% (control) versus 62.5% (intervention); p = 0.691), used computers daily (80.0% vs. 80.6%; p = 0.392) and liked learning with computers (40.6% vs. 40.3%; p = 0.709).

Mean scores in the initial key feature examination were comparable in both groups (intervention: 10.1 ± 3.4 out of 25 points; control: 10.2 ± 3.7; p = 0.827).

Student activity on the platform

No major technical problems occurred during the online module. A majority of students (94.4%) assigned to the intervention group logged on to the system at least once, and almost 90% completed the second step. Of all students who used the online platform, 82.3% posted threads and comments to the chat and forum discussions. The number of these individual contributions varied greatly between groups (from 24 to 93 entries in total). The number of online contributions was significantly correlated with the costs for diagnostic procedures within each group (r = 0.881; p = 0.002; Figure 3). After requesting a median of eight diagnostic test results, every group submitted a definite diagnosis, which was correct in all but one case. Seven groups downloaded the clinical guideline from the internet and uploaded the document to the learning management system, thus completing the online course.
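The reported correlation can be cross-checked against its test statistic: for a Pearson r of 0.881 over the nine online groups, t = r·√(n − 2)/√(1 − r²) with n − 2 = 7 degrees of freedom, which corresponds to a two-tailed p of about 0.002 (a sketch reproducing the reported figures, not the original SPSS analysis):

```python
from math import sqrt

def pearson_t(r: float, n: int) -> float:
    """t statistic for testing a Pearson correlation r from n pairs
    against zero (n - 2 degrees of freedom)."""
    return r * sqrt(n - 2) / sqrt(1 - r ** 2)

# r = 0.881 across the nine online groups
t = pearson_t(0.881, 9)
print(round(t, 2))  # about 4.93; with 7 df this yields p ≈ 0.002 (two-tailed)
```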

Figure 3. Correlation of online contributions and costs for diagnostic procedures. Each dot represents one online group.


Student evaluation of the online module

At the end of the course, students were asked about their perceptions of the online module. While technical aspects received favorable ratings, 83.8% of students indicated that, in addition to online communication, a substantial amount of group discussions relevant to the case presented in the online module took place during personal meetings at university (e.g. between courses) rather than on the online platform. Almost every third student (29.0%) did not read the documents provided by their fellow students, and 40.6% stated that they had lost their interest in the online module while it was running. Mean working time for the online module amounted to 2.0 ± 0.9 hours per week for individual students (i.e. approximately 10 hours over the 5-week period during which the module was online). On a six-point scale (1 = ‘excellent’, 6 = ‘poor’), the online module achieved an overall evaluation score of 2.9 ± 1.3, and 38% of participants would not recommend the online module to their fellow students while traditional PBL sessions received a mean rating of 1.8 ± 0.5 (n = 143; no significant difference between control and intervention groups).

Achievements in final course exams

Internal consistency of both the final key feature exam and the summative multiple choice test was acceptable (Cronbach's alpha = 0.83 and 0.79, respectively). Students who had been randomized into the intervention arm of the study achieved a mean score of 31.9 ± 7.2 out of 55 points in the final key feature examination. Students who had discussed the same clinical problem during small group meetings achieved 31.7 ± 7.5 points (p = 0.843). There was a significant difference regarding a key feature problem closely related to the clinical case of CTPH, in that students in the intervention group achieved higher scores than control subjects (2.5 ± 1.1 versus 2.0 ± 1.2; p = 0.003; Table 1).
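Cronbach's alpha, as reported here, can be computed from per-item scores with a few lines of standard-library Python (a generic sketch of the coefficient, not the authors' code; the score matrix below is a hypothetical toy example):

```python
from statistics import pvariance

def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for internal consistency.
    items: one list per test item, each holding all examinees' scores
    on that item (examinees in the same order across items)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]       # per-examinee totals
    item_var = sum(pvariance(scores) for scores in items)  # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Two perfectly parallel items: maximal internal consistency (alpha = 1.0)
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```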

Table 1.  Sub-scores in the final key feature examination

In the summative multiple choice test, no significant difference between mean results of the two groups was detected (intervention 50.3 ± 6.8; control 50.3 ± 6.5; p = 0.973). Moreover, scores in the multiple choice test and the key feature examination were positively and significantly correlated (Figure 4).

Figure 4. Scores achieved in final examinations.


Discussion

Learning outcome

In this randomized study comparing traditional PBL to virtual collaborative PBL, the latter was not associated with better performance in a key feature examination assessing clinical reasoning skills for the presenting complaint ‘dyspnoea’. Assuming that clinical reasoning can be adequately assessed using key feature problems, our results indicate that virtual PBL is at least no less effective than traditional PBL. However, students reported spending a total of 10 hours on the clinical case presented in the online module, while students in the control group were allocated no more than 2 hours for the discussion of the same case. This might explain the significant difference in test results between the control and the intervention group on the one key feature problem that was very similar to the ‘experimental case’ (Table 1). This is in line with earlier findings from a non-randomized study (Taradi et al. Citation2005).

A recent meta-analysis reviewing a total of 201 published articles on the effectiveness of internet-based learning for students, postgraduate trainees and practitioners in professions directly related to human or animal health concluded that, when compared to no intervention, internet-based interventions have large and significant effects on the acquisition of knowledge, skills, and behaviors (Cook et al. Citation2008). When WBL is compared with non-internet formats, however, effects are of marginal size and predominantly non-significant. Moreover, the format of the WBL interventions as well as the effect sizes reported in the included studies varied widely, and many studies used assessment tools not matched to the desired learning outcome. These limitations aside, media-comparative research (i.e. studies comparing traditional teaching methods to approaches featuring modern technologies) is currently being heavily criticized for its methodological limitations (Cook Citation2005). Being media-comparative in nature, the present study sought to eliminate common sources of bias: Following Letterie's suggestion (Letterie Citation2003), a randomized, prospective design comparing student achievements in objective examinations was chosen in order to adequately address the research question. The number of drop-outs was small, a pre-test effect regarding key feature questions should have affected both study groups equally, and the online tutor was identical for all intervention groups. Contamination between ‘online’ and ‘offline’ students might have occurred, but this is unlikely to have affected test performance given the complexity of the learning objective.

Despite better control for potential confounders, the results of the present study are comparable to those obtained in previous trials in medical ethics (Fleetwood et al. Citation2000), physiology (Bowdish et al. Citation2003), paediatrics (Kamin et al. Citation2003) as well as in residency training (Bello et al. Citation2005) and nursing education (Jacobsen Citation2006). In addition, our findings are in line with the results of the meta-analysis cited above (Cook et al. Citation2008): The authors concluded that CAI is neither superior nor inferior to traditional methods.

Secondary endpoints

Analysis of student activity on the online platform and the costs created by diagnostic tests revealed that livelier online discussions were associated with more expensive diagnostic work-ups. Although this result was not controlled for the amount of ‘offline’ discussion, it is noteworthy, since thorough discussion might have been expected to lead to a more careful use of expensive methods. However, students were unfamiliar with most diagnostic procedures and not used to choosing cost-effective ways of reaching a diagnosis. It would be interesting to examine whether students learn to employ diagnostics in a more cost-effective way as they advance through the years of medical education.

Student satisfaction

Although online discussions were not impeded by technical difficulties, students tended to prefer real meetings for discussing the case. In addition, many participants did not read the documents prepared by their peers, and despite successful completion of the curriculum by most groups, individual satisfaction with the online module was rather low. Consistent with findings from comparable projects (Baumlin et al. Citation2000; Hahne et al. Citation2005; Haag et al. Citation2007) and from a recent meta-analysis (Cook et al. Citation2008), a significant proportion of students lost interest in the online module while using it, and evaluation data for the online module were much less positive than for traditional PBL. This finding might be explained by the fact that most students had no previous experience with CAI (which is not a mandatory component of medical education at Göttingen University) and may therefore have held unrealistic expectations of the online module; it is nevertheless disappointing.

In this regard, our data confirm McLean's hypothesis that the small group in PBL is ‘more than a cognitive experience’ (McLean et al. Citation2006). As opposed to virtual discussions, communication skills are trained during real group meetings, and by shaping group dynamics, tutors exert more influence in the traditional PBL format than when supervising online groups.

Study limitations

One major limitation of this study is the use of blended learning in the intervention group (Hudson Citation2004). In order to accurately distinguish the impact of virtual PBL on clinical reasoning skills from the effect of traditional PBL sessions, a mutually exclusive use of the two teaching formats would have been necessary. This was not feasible in the present study, since the local ethics committee would not have approved a study design exposing half of the student population to a novel teaching format in place of an established and successful one. Consequently, generalization of results is impeded by the fact that any significant effect cannot solely be ascribed to one specific teaching format (Greenhalgh Citation2001).

Results in key feature examinations were assumed to adequately reflect students’ clinical reasoning skills. While the higher taxonomic level (Krathwohl Citation2002) of this assessment tool is beyond doubt, the observed significant positive correlation between results in the final key feature examination and the summative multiple choice examination might have several implications: Firstly, ‘good’ students might achieve high scores in any examination, regardless of the learning objective being assessed. However, this notion would question the justification for developing assessment tools tailored to specific learning objectives. Secondly, the quality of the key feature problems used in this study might have been low. However, the questions were carefully reviewed, and measures of internal consistency imply that the key feature examination complied with international standards. Taking all this into account, it cannot be excluded that a different assessment tool (e.g. structured interviews or objective structured clinical examinations) might have detected a significant difference between the study groups, if one exists.

Conclusion

Using a randomized controlled design, this study failed to detect a significant effect of a virtual PBL course (as compared to a traditional PBL course) on performance in a key feature examination assessing the clinical reasoning skills of 4th-year medical students. However, results might have been confounded by the use of a blended learning approach. Future research will have to identify learning objectives lending themselves to computer-assisted teaching formats as well as the ideal time-point for these methods to be deployed in medical education.

Acknowledgments

The authors would like to thank Martin Fischer for advice on study design. Furthermore, we would like to thank all tutors and students who devoted their time and effort to this study.

Declaration of conflict of interest: None of the authors has any conflict of interest to declare.

Additional information

Notes on contributors

T. Raupach

TOBIAS RAUPACH works as a Senior House Officer in the Department of Cardiology and Pneumology at Göttingen University and co-ordinates the department's teaching activities. His current research focuses on web-based learning, peer teaching, clinical teaching and evaluation tools.

C. Muenscher

CHRISTIAN MUENSCHER is a scientific co-worker at the computer center of Göttingen University Hospital, mainly engaged in supporting research and teaching by means of applied medical computer science.

S. Anders

SVEN ANDERS works as a consultant in the Department of Legal Medicine at Hamburg University, co-ordinating the department's teaching activities. He is involved in curricular development and has just completed a 2-year study course of Medical Education. Main research areas are forensic pathology, clinical forensic medicine, and medical education.

R. Steinbach

REIKO STEINBACH works as a consultant in the Department of Nephrology at Kiel University Hospital. He is the co-ordinator of the department's teaching activities and his current research focuses on problem-solving skills in clinical scenarios.

T. Pukrop

TOBIAS PUKROP is a fellow in the Department of Hematology and Oncology at Göttingen University Hospital. During his academic studies in Ulm he became interested in PBL and, in conjunction with fellow students, organized a PBL course in pharmacology. He was then involved in establishing computer-based PBL in Göttingen.

I. Hege

INGA HEGE is a scientific co-worker at the Medical Education Unit of the Ludwig-Maximilians-University Munich. She was involved in implementing the key-feature examination with CASUS®.

M. Tullius

MONJA TULLIUS works as psychiatrist and psychotherapist in the Asklepios Fachklinikum Göttingen. Her commitment to medical education stems from her former work as project manager in curricular development and her long-standing responsibility for the modular teaching system at Göttingen University.

References

  • Barrows HS, Mitchell DL. An innovative course in undergraduate neuroscience. Experiment in problem-based learning with ‘problem boxes’. Br J Med Educ 1975; 9: 223–230
  • Baumlin KM, Bessette MJ, Lewis C, Richardson LD. EMCyberSchool: An evaluation of computer-assisted instruction on the Internet. Acad Emerg Med 2000; 7: 959–962
  • Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, Antonelli M. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med 2005; 31: 547–552
  • Bowdish BE, Chauvin SW, Kreisman N, Britt M. Travels towards problem based learning in medical education (VPBL). Instruct Sci 2003; 31: 231–253
  • Cook DA. The research we still are not doing: An agenda for the study of computer-based learning. Acad Med 2005; 80: 541–548
  • Cook DA. Where are we with web-based learning in medical education?. Med Teach 2006; 28: 594–598
  • Cook DA. Web-based learning: Pros, cons and controversies. Clin Med 2007; 7: 37–42
  • Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: A meta-analysis. JAMA 2008; 300: 1181–1196
  • De Wever B, Van Winckel M, Valcke M. Discussing patient management online: The impact of roles on knowledge construction for students interning at the paediatric ward. Adv Health Sci Educ Theory Pract 2006; 13: 25–42
  • Fleetwood J, Vaught W, Feldman D, Gracely E, Kassutto Z, Novack D. MedEthEx online: A computer-based learning program in medical ethics and communication skills. Teach Learn Med 2000; 12: 96–104
  • Galie N, Torbicki A, Barst R, Dartevelle P, Haworth S, Higenbottam T, Olschewski H, Peacock A, Pietra G, Rubin LJ, et al. Guidelines on diagnosis and treatment of pulmonary arterial hypertension. The task force on diagnosis and treatment of pulmonary arterial hypertension of the European society of cardiology. Eur Heart J 2004; 25: 2243–2278
  • Greenhalgh T. Computer assisted learning in undergraduate medical education. BMJ 2001; 322: 40–44
  • Haag M, Singer R, Bauch M, Heid J, Hess F, Leven FJ. Challenges and perspectives of computer-assisted instruction in medical education: Lessons learned from seven years of experience with the CAMPUS system. Methods Inf Med 2007; 46: 67–69
  • Hahne AK, Benndorf R, Frey P, Herzig S. Attitude towards computer-based learning: Determinants as revealed by a controlled interventional study. Med Educ 2005; 39: 935–943
  • Hammoud M, Gruppen L, Erickson SS, Cox SM, Espey E, Goepfert A, Katz NT. To the point: Reviews in medical education online computer assisted instruction materials. Am J Obstet Gynecol 2006; 194: 1064–1069
  • Hudson JN. Computer-aided learning in the real world of medical education: Does the quality of interaction with the computer affect student learning?. Med Educ 2004; 38: 887–895
  • Jacobsen HE. A comparison of on-campus first year undergraduate nursing students’ experiences with face-to-face and on-line discussions. Nurse Educ Today 2006; 26: 494–500
  • Kamin C, Deterding R, Lowry M. Student's perceptions of a virtual PBL experience. Acad Med 2002; 77: 1161–1162
  • Kamin C, O'Sullivan P, Deterding R, Younger M. A comparison of critical thinking in groups of third-year medical students in text, video, and virtual PBL case modalities. Acad Med 2003; 78: 204–211
  • Krathwohl DR. A revision of bloom's taxonomy: An overview. Theor Pract 2002; 41: 212–218
  • Lee WR. Computer-based learning in medical education: A critical view. J Am Coll Radiol 2006; 3: 793–798
  • Letterie GS. Medical education as a science: The quality of evidence for computer-assisted instruction. Am J Obstet Gynecol 2003; 188: 849–853
  • Margetson D. Beginning with the essentials: Why problem-based learning begins with problems. Educ Health 1996; 9: 61–69
  • McLean M, Van Wyk JM, Peters-Futre EM, Higgins-Opitz SB. The small group in problem-based learning: More than a cognitive ‘learning’ experience for first-year medical students in a diverse population. Med Teach 2006; 28: e94–e103
  • Medelez Ortega E, Burgun A, Le Duff F, Le Beux P. Collaborative environment for clinical reasoning and distance learning sessions. Int J Med Inform 2003; 70: 345–351
  • Norman GR, Schmidt HG. The psychological basis of problem-based learning: A review of the evidence. Acad Med 1992; 67: 557–565
  • Page G, Bordage G. The medical council of Canada's key features project: A more valid written examination of clinical decision-making skills. Acad Med 1995; 70: 104–110
  • Schwartz S, Griffin T. Comparing different types of performance feedback and computer-based instruction in teaching medical students how to diagnose acute abdominal pain. Acad Med 1993; 68: 862–864
  • Soller AL. Supporting social interaction in an intelligent collaborative learning system. J Artif Intell Educ 2001; 12: 40–62
  • Sparacia G, Cannizzaro F, D’Alessandro DM, D’Alessandro MP, Caruso G, Lagalla R. Initial experiences in radiology e-learning. Radiographics 2007; 27: 573–581
  • Stromso HI, Grottum P, Hofgaard Lycke K. Changes in student approaches to learning with the introduction of computer-supported problem-based learning. Med Educ 2004; 38: 390–398
  • Taradi SK, Taradi M, Radic K, Pokrajac N. Blending problem-based learning with web technology positively impacts student learning outcomes in acid-base physiology. Adv Physiol Educ 2005; 29: 35–39
  • Weverling GJ, Stam J, ten Cate TJ, van Crevel H. Computer-assisted education in problem-solving in neurology: A randomized educational study. Ned Tijdschr Genees 1996; 140: 440–443
