
Assessment of cognitive biases and biostatistics knowledge of medical residents: a multicenter, cross-sectional questionnaire study

Article: 23646 | Received 22 Dec 2013, Accepted 17 Feb 2014, Published online: 12 Mar 2014

Abstract

Purpose

The aim of this study is to determine the perceived familiarity of medical residents with statistical concepts, assess their ability to integrate these concepts in clinical scenarios, and investigate their susceptibility to the gambler's fallacy and the conjunction fallacy.

Methods

A multi-institutional, cross-sectional survey of Greek medical residents was performed. Participants were asked to indicate their familiarity with basic statistical concepts and answer clinically oriented questions designed to assess their biostatistics knowledge and cognitive biases. Univariate, bivariate, and multivariate statistical models were used for the evaluation of data.

Results

Out of 153 respondents (76.5% response rate), only two participants (1.3%) were able to answer all seven biostatistics knowledge questions correctly, while 29 residents (19%) gave incorrect answers to all questions. The proportion of correct answers to each biostatistics knowledge question ranged from 15 to 51.6%. Residents with greater self-reported familiarity were more likely to perform better on the respective knowledge question (all p<0.01). Multivariate analysis of the effect of individual resident characteristics on questionnaire performance showed that previous education outside Greece, primarily during medical school, was associated with lower biostatistics knowledge scores (p<0.001). A little more than half of the respondents (54.2%) answered the gambler's fallacy quiz correctly. Residents with higher performance on the biostatistics knowledge questions were less prone to the gambler's fallacy (odds ratio 1.38, 95% confidence interval 1.12–1.70, p=0.003). Only 48 residents (31.4%) did not violate the conjunction rule.

Conclusions

A large number of medical residents are unable to correctly interpret crucial statistical concepts that are commonly found in the medical literature. They are also especially prone to the gambler's fallacy bias, which may undermine clinical judgment and medical decision making. Formalized systematic teaching of biostatistics during residency will be required to de-bias residents and ensure that they are proficient in understanding and communicating statistical information.

Introduction

Biostatistical literacy is imperative for physicians in training, who must keep abreast of current medical knowledge and be able to communicate statistical and epidemiological information to patients and colleagues. Furthermore, medical residents may be particularly vulnerable to the suggestions of the published literature and of more questionable sources such as promotional brochures and websites (Citation1, Citation2). The vast majority of published medical research contains at least some elementary statistical analysis, and errors in such analyses have consistently been shown to be frequent, even in well-respected textbooks (Citation3–Citation7). It is therefore crucial for residents to possess the basic biostatistical knowledge needed to critically evaluate and apply original research data.

Previous surveys conducted in the 1980s and early 1990s demonstrated that physicians have poor understanding of simple biostatistical concepts (Citation8–Citation11). Physicians today may have even more pronounced difficulties in comprehending and integrating data derived from the increasingly sophisticated statistical methodologies used in contemporary medical literature (Citation7, Citation12–Citation14). More recent studies have shown that internal medicine, emergency medicine, and obstetrics–gynecology residents have considerable difficulties in interpreting statistical results found in published clinical research (Citation15–Citation17). Therefore, despite the fact that many medical schools include statistical courses in their curricula, it should not be taken for granted that residents can effectively assess biostatistical information and use it to the best advantage of their patients.

A number of predictable biases in human information processing may considerably hinder sound clinical decision making (Citation18–Citation21). Therefore, documenting and elucidating these processes may improve patient outcomes, prevent the inappropriate utilization of medical resources, and reduce health care expenditures. A previous instrument designed to evaluate such cognitive biases among physicians (Citation22) was recently reported to have very limited measurement properties (Citation23). The representativeness heuristic is a known factor that may lead to bias in probability estimation by physicians (Citation24, Citation25). The gambler's fallacy is a classic cognitive bias produced by the representativeness heuristic and arises when a person assumes that a deviation from what occurs on average or in the long term will be corrected in the short term (Citation26). A common example is when tails are repeatedly obtained while tossing a fair coin, leading the gambler to incorrectly expect that heads are ‘due’ on the next toss; in reality, the previous tosses have no bearing on the outcome of future tosses. The gambler's fallacy has been shown to be by far the most frequent bias arising in this probability judgment situation, particularly among highly educated individuals (Citation27, Citation28). The conjunction fallacy is another cognitive illusion of probability judgment, first documented using the well-known ‘Linda’ example: the stereotypical description of Linda as a social activist compels respondents to rank the probability that she is a feminist bank teller as greater than the probability that she is a bank teller. This response is erroneous, as feminist bank tellers cannot be more frequent, or more probable, than the more inclusive class of bank tellers (Citation29).
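The independence of successive coin tosses described above is easy to verify empirically. The following sketch (illustrative only; the streak length and trial count are arbitrary) estimates the probability of heads on the toss immediately following a long run of tails:

```python
import random

def next_toss_heads_probability(streak_len=5, trials=100_000, seed=42):
    """Estimate P(heads) on the toss immediately after a run of >= streak_len tails.

    If tosses are independent, the estimate stays close to 0.5 regardless of the
    preceding streak -- the 'correction' the gambler expects never materializes.
    """
    rng = random.Random(seed)
    heads_after_streak = 0
    streaks_seen = 0
    tails_run = 0
    for _ in range(trials):
        heads = rng.random() < 0.5  # fair coin: True = heads
        if tails_run >= streak_len:
            streaks_seen += 1
            heads_after_streak += heads
        tails_run = 0 if heads else tails_run + 1
    return heads_after_streak / streaks_seen

print(round(next_toss_heads_probability(), 2))
```

The printed estimate remains near 0.5, which is exactly what the gambler's fallacy denies.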

The aim of the present multicenter, cross-sectional study was to determine the perceived familiarity of medical residents with statistical concepts and assess their ability to integrate these concepts in actual clinical scenarios. Furthermore, the incidence of the gambler's fallacy and the conjunction fallacy among medical residents were investigated using clinical vignettes that were designed to tap these two domains with specific reference to medicine. The effects of individual resident characteristics such as age, gender, residency year, and past biostatistics training on the cognitive biases and biostatistics knowledge of medical residents were also evaluated.

Methods

Participants and data collection

The reporting of this cross-sectional study conforms to the STROBE statement (Citation30). The data presented here are part of a large multi-institutional project evaluating the biostatistical competency and cognitive biases of medical residents. Respondents were originally recruited to participate in a cross-sectional randomized experimental study of medical residents' Bayesian reasoning performance, and the sample size was determined according to that study's requirements. The survey was conducted from January to March 2010, and the participants were drawn from eight major Greek hospitals in the city of Athens. The Greek system does not allow official residency training in private hospitals, and therefore all institutions surveyed in this study were public. Every resident was given a number according to the official enrolment lists provided by each of the hospitals. Out of 1,272 eligible residents in all hospitals, 200 (25 from each hospital) were randomly selected by a computerized method, a number that represents approximately 2% of the total number of residents training in Greece (Citation31). Participation was elective; all participants were approached during breaks from work or training, were informed that responses would be anonymous, and were blinded to the scope and purpose of the study. The residents were asked to return the completed questionnaires to a sealed box provided in each hospital. The study received permission from the authorized personnel at each institution (i.e., the Department or Scientific Meeting Chair). The study complied with Greek requirements for survey studies, and formal ethical approval was not required because responses were fully anonymous and the study did not contain questions on sensitive topics.
Participants’ informed consent was indicated by each individual's willingness to complete and return the questionnaire.

Table 1 Resident demographic, educational, and residency profile

Survey measures

The full questionnaire was developed in the Greek language and is available from the corresponding author. The first seven questions queried the residents' socio-demographic profile, specialty choices, previous education outside Greece, current training level, residency year, and past training in biostatistics. We combined residencies according to their conceptual and occupational relations and formed three different medical ‘fields’: 1) internal medicine (n=71; dermatology, pediatrics, neurology, and general practice were also included in this group); 2) surgical specialties (n=67); and 3) diagnostic and laboratory specialties (n=15). The participants were then asked to indicate their understanding of the statistical concepts of standard deviation (SD), standard error of the mean (SEM), p-values, confidence intervals (CI), correlation coefficients, relative risk (RR), sensitivity, and positive predictive value (PPV) by rating their familiarity with each concept on a 7-point Likert scale ranging from ‘none’ (score of 1) to ‘excellent’ (score of 7). The next set of questions was developed by one of the authors and assessed the biostatistics knowledge (Table 2) and cognitive biases (Table 3) of medical residents.

Table 2 Biostatistics knowledge questions

Table 3 Cognitive biases questions

A mathematician from the Section of Statistics and Operations Research of the Department of Mathematics of the University of Athens reviewed the questions for clarity and mathematical appropriateness. In order to assess the intelligibility, interpretation, content validity, adequacy of response options, and clinical relevance of the vignettes, the questionnaire was pretested in a separate group of seven medical residents, including two residents with advanced training in epidemiology and biostatistics. The residents completed the questionnaire and provided oral feedback which resulted in the removal of one question to avoid duplication of similar concepts. Both residents with advanced biostatistics backgrounds answered all questions correctly. Of the remaining five residents, four answered four of seven biostatistics knowledge questions correctly while one resident correctly answered two out of seven questions. Four of these five residents selected the ‘approximately 10%’ choice in the gambler's fallacy scenario and answered that the probability of suffering from coronary artery disease (CAD) and Huntington's disease is greater than the probability of suffering from Huntington's disease. Despite extensive discussion of the conjunction fallacy vignette's probabilistic nature, these residents were very reluctant to accept the strictly mathematical reading of the problem.

A ‘total biostatistics knowledge’ score, which equally weighted each of the seven questions listed in Table 2, was calculated. Each correct response counted as one point, while no points were given for incorrect answers. Furthermore, a ‘total biostatistics familiarity’ score was calculated by adding all familiarity ratings.

Statistical analysis

Data analysis was performed using R (R Foundation for Statistical Computing, Vienna, Austria) (Citation32). Variables were maintained as continuous or categorical according to their original form in the questionnaire. A non-normal distribution was assumed for all continuous variables, as indicated by the Kolmogorov–Smirnov test. Missing values were counted as incorrect responses and added to the ‘I do not know and do not wish to guess’ response option. Median differences between more than two groups were evaluated using the Kruskal–Wallis one-way analysis of variance by ranks. The Spearman rank correlation coefficient was used to determine the correlation between two continuous variables. Median differences between two groups were analyzed using the Mann–Whitney U test for non-paired data. Categorical variables were compared using Pearson's chi-square or Fisher's exact tests where appropriate. The internal consistency of the ‘total biostatistics familiarity’ and the ‘total biostatistics knowledge’ scores was assessed by examining the Cronbach's α coefficient; α > 0.6 was considered acceptable (Citation33). Bivariate analyses were performed to identify individual resident characteristics that may be associated with the total biostatistics knowledge score, the total biostatistics familiarity score, or performance on the cognitive biases questions. Candidate factors included age, gender, hospital, residency field, residency year, residency program type, possession of other advanced degrees, education abroad, past biostatistics training, and total biostatistics familiarity score. Variables with p<0.05 on bivariate comparisons were further included in multivariate regression analyses. Multivariate analysis was performed with linear regression analyses as well as binary logistic regression tests using the dichotomous coding of responses (correct/incorrect) as the dependent variable.
To adjust for multiple pairwise comparisons, p<0.01 was considered statistically significant.
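The internal-consistency check described above can be sketched in plain Python. The formula below is the standard Cronbach's α; the ratings fed to it are made-up illustrative numbers, not the study's data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for k items scored by n respondents.

    `items` is a list of k equal-length lists, one per item/question.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
    The study treated alpha > 0.6 as acceptable internal consistency.
    """
    k = len(items)
    n = len(items[0])
    sum_item_var = sum(pvariance(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_var / pvariance(totals))

# Made-up familiarity ratings for three concepts from four respondents:
ratings = [[2, 4, 5, 7], [1, 4, 6, 7], [2, 5, 5, 6]]
print(round(cronbach_alpha(ratings), 3))  # -> 0.967
```

Because the illustrative items move together across respondents, α comes out high; uncorrelated items would drive it toward zero.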

Results

The survey response rate was 76.5% (153/200). Table 1 details the participants' demographic profile, education, and specialty choices. The majority of residents (81.7%) reported prior attendance of biostatistics courses, mainly during medical school.

Responses to each of the individual biostatistics knowledge and cognitive bias questions are presented in Tables 2 and 3, respectively. The proportion of correct answers to each biostatistics knowledge question ranged from 15% in the scenario concerning correlation coefficients to 51.6% in the diagnostic sensitivity question. The most popular incorrect answer (selected by 63.4% of respondents) in the correlation coefficient question stated that the correlation between the two variables was strong; however, the correlation coefficient provided in the vignette (r=0.29) was actually low. Only two participants (1.3%) answered all biostatistics knowledge questions correctly, while 29 residents (19%) gave incorrect answers to all seven questions. Reliability analysis of the ‘total biostatistics familiarity’ and the ‘total biostatistics knowledge’ scores showed acceptable α values of 0.904 and 0.648, respectively. The mean±SD of the ‘total biostatistics knowledge’ score was 2.1±1.8 (range 0–7). No significant association was found between residency field and total biostatistics knowledge score (p=0.066). On the other hand, bivariate analyses indicated a significant association of the total biostatistics knowledge score with residents' age (r=−0.166, p=0.041), training year (r=−0.205, p=0.011), total biostatistics familiarity score (r=0.333, p<0.001), and education abroad (p=0.004). Multivariate analysis was further performed in order to estimate the independent effects of these factors on biostatistics knowledge (Table 4). Residents' perceived familiarity with biostatistics was associated with higher total knowledge scores (p<0.001), while previous education outside Greece was associated with lower total knowledge scores (p=0.005).
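Why r=0.29 counts as a low correlation can be made concrete with the coefficient of determination, r², which gives the share of variance the two variables share:

```python
# Share of variance explained by the vignette's correlation (r = 0.29)
# versus a conventionally 'strong' correlation (r = 0.80):
for r in (0.29, 0.80):
    print(f"r = {r:.2f} -> r^2 = {r * r:.3f} ({r * r:.1%} of variance shared)")
```

At r=0.29 the two variables share only about 8% of their variance, versus 64% at r=0.80, which is why describing the vignette's correlation as "strong" was scored as incorrect.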

Table 4 Multivariate association (multiple linear regression) between resident variables and total biostatistics knowledge score

Perceived familiarity ratings of biostatistical concepts by medical residents are listed in Table 5. Thirty-one residents (20.3%) gave scores ≤4 on all familiarity ratings. Notably, <25% of residents reported having above-average knowledge (familiarity rating >4) of correlation coefficients. The mean familiarity score for this statistical concept was 2.9±1.9 (mean±SD), which was significantly lower (p<0.01) compared to all other familiarity ratings. Bivariate analyses showed that residents with higher self-reported familiarity with SD (p<0.001), SEM (p=0.001), p-values (p=0.009), CI (p<0.001), and correlation coefficients (p<0.001) performed better on the respective knowledge question. In contrast, there was no significant association between self-reported understanding of the concepts of RR (p=0.134) and sensitivity (p=0.571) and performance on the respective knowledge question. The mean±SD of the ‘total biostatistics familiarity’ score was 31.6±12.4 (range 8–55). In bivariate analyses, total biostatistics familiarity was significantly associated with past training in biostatistics (p=0.006) and possession of other advanced degrees (p<0.001). Conversely, residency field was not significantly associated with total biostatistics familiarity (p=0.236). A linear regression model was fitted with the ‘total biostatistics familiarity’ score as the dependent variable and past biostatistics training and possession of other advanced degrees as the independent variables (Table 6). Residents who had previously attended biostatistics courses tended to report higher familiarity scores, although this effect did not reach statistical significance (p>0.01). Residents with other advanced degrees were significantly more likely to report higher familiarity with statistical concepts (p<0.001).

Table 5 Perceived familiarity rating by medical residents for the biostatistical concepts of SD, SEM, p-values, CI, correlation coefficients, RR, sensitivity, and PPV

Table 6 Multivariate association (multiple linear regression) between resident variables and total biostatistics familiarity score

As shown in Table 3, approximately half of the respondents (54.2%) answered the gambler's fallacy quiz correctly. Furthermore, less than one third of the residents surveyed (31.4%) judged the probability that John has Huntington's disease to be greater than the probability that John has both CAD and Huntington's disease. Preliminary bivariate analysis of factors that may affect resident performance on the gambler's fallacy problem was followed by multivariate logistic regression, which showed that residents with higher total biostatistics knowledge scores were more likely to answer that quiz correctly (odds ratio=1.38, 95% CI 1.12–1.70; p=0.003). None of the investigated individual resident factors were significantly associated with performance on the conjunction fallacy quiz.
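The internal consistency of the reported logistic-regression result can be checked by hand. A Wald confidence interval is symmetric on the log-odds scale, so the published interval implies a standard error for the log odds ratio (a sketch, assuming a standard 95% Wald interval was used):

```python
from math import log, exp

# Reported multivariate result: OR = 1.38, 95% CI 1.12-1.70
# (per additional point of total biostatistics knowledge score).
or_hat, lo, hi = 1.38, 1.12, 1.70

# Recover the implied standard error of log(OR) from the interval width:
se = (log(hi) - log(lo)) / (2 * 1.96)

# Reconstruct the interval from the point estimate and recovered SE:
lo_check = exp(log(or_hat) - 1.96 * se)
hi_check = exp(log(or_hat) + 1.96 * se)
print(f"SE(log OR) ~ {se:.3f}; reconstructed CI: ({lo_check:.2f}, {hi_check:.2f})")
```

The reconstructed bounds land back on (1.12, 1.70), confirming that the reported estimate and interval are mutually consistent on the log scale.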

Discussion

The present multicenter survey of biostatistics performance and biases demonstrated that a considerable proportion of medical residents are prone to errors and biases that undermine judgment and decision making. The fact that, on average, only two out of seven biostatistics knowledge questions were answered correctly raises serious concerns. This significant lack of biostatistics aptitude was observed despite the fact that the majority of participants had attended undergraduate biostatistics training during medical school. It should be noted, however, that most residents did not reinforce these courses during residency training. Indeed, the biostatistics knowledge score tended to decline with progression through the years of residency training, although this observation did not reach statistical significance (Table 4). Furthermore, we did not observe a significant performance difference between residents who had previously received biostatistics education and those who had never received such training at any point in their career. Taken together, these data suggest that biostatistics skills may atrophy with progression through medical training and that further effort should be expended to maintain knowledge in that sphere during residency.

The majority of residents were unable to properly combine significance level information with correlation coefficient data (Table 2). This mistake was observed even in respondents who had selected the correct definition of the p-value in the relevant question. Educators may therefore need to reevaluate how these concepts are taught and represented so that medical residents can understand and use them in rational, evidence-based decisions.

Further analysis of the effects of individual resident variables (Table 4) revealed that respondents (all Greek citizens) who had received medical education outside Greece, predominantly during undergraduate medical education in international medical schools, had lower biostatistics knowledge scores compared to residents who had never trained abroad. This performance difference may be partly due to variability in the medical school experiences obtained by Greek citizens abroad, but it may also reflect differences in ability and effort. Further research will be required to elucidate the potential performance differences between graduates of Greek and international medical schools, particularly with regard to educational quality, clinical outcomes, and quality of care. It should also be noted that other resident characteristics, including age, gender, advanced graduate training, specialty choice, hospital, and residency program type, did not affect the biostatistics knowledge score of residents, indicating that the insufficient biostatistical literacy reported in the present study is similar across a broad range of medical residents in Greece. Furthermore, biostatistical performance was homogeneous among residents training in different specialty programs. To our knowledge, the present study is the first to investigate and directly compare biostatistics knowledge between different resident specialty fields. Recent studies have assessed the biostatistical competence of internal medicine, emergency medicine, and obstetrics–gynecology residents (Citation15–Citation17). Contrary to the present study, Windish et al. reported that prior biostatistics training, gender, and advanced degrees were predictive of internal medicine residents' statistics knowledge (Citation15). Also, a Danish study mainly focusing on junior hospital doctors did not find significantly higher statistics performance in physicians with previous biostatistics education or interest in research (Citation10). These reports used different sets of questions from the present study, thus hindering direct comparisons.

Approximately one out of five medical residents acknowledged below-average familiarity with eight biostatistical concepts that are routinely encountered in the medical literature (Citation12, Citation14). This lack of comfort was substantiated by the similar percentage of residents who answered all biostatistics questions erroneously. Residents considered themselves least familiar with the concept of the correlation coefficient; indeed, their actual performance on the correlation coefficient quiz was clearly lower compared to all other tests. The independent association between self-reported biostatistics familiarity and knowledge score was further verified in a multivariate model (Table 4). It is of note that residents with advanced education through a master's or PhD degree were significantly more likely to report higher familiarity with biostatistical concepts, although this effect did not translate into substantial performance improvement on the knowledge questions. Previous education in biostatistics also tended to independently increase familiarity ratings, although this association did not reach statistical significance. These observations indicate that although residents with advanced training and prior statistical education are more likely to believe that they are comfortable with statistics compared to other medical residents, they experience similar difficulties in interpreting research statistics. Consequently, formalized systematic teaching of biostatistics during residency will be required to ensure that residents are proficient in understanding and communicating statistical information.

To the best of our knowledge, the present study is the first to investigate the susceptibility of physicians to the gambler's fallacy and the conjunction fallacy in medical settings. A recent survey in Germany reported that approximately 60% of the general population selected the correct answer in a coin-toss gambler's fallacy test (Citation27). The pervasiveness of this bias among our study population of medical residents is alarming, especially given that the bias was investigated using a problem within their domain of expertise. Residents who felt that the patient had increased chances of having tuberculosis are likely to overlook other important causes of hemoptysis. Notably, residents with higher biostatistics knowledge scores were less prone to the gambler's fallacy. Therefore, continuing medical education programs focusing on statistical concepts and reasoning may mitigate the probability judgment biases of residents (Citation34, Citation35). Furthermore, specific instruction aimed at increasing awareness of the gambler's fallacy may reduce residents' susceptibility to this bias (Citation35–Citation37).

Approximately 80–90% of respondents generally violate the conjunction rule according to previous reports using various presentations of the ‘Linda’ problem (Citation29, Citation38, Citation39). In the present study, we found a slightly lower percentage (68.6%) of class-inclusion violations among medical residents using a clinically based problem. This percentage was independent of residents' individual characteristics, including age, gender, residency year, past biostatistics training, performance on the biostatistics tests, and possession of other advanced degrees. A number of authors have argued that linguistic and pragmatic features of conjunction fallacy scenarios may lead respondents to alternative evaluations (Citation38, Citation40). For example, if residents interpret the statement ‘John has CAD’ to additionally imply that ‘John does not have Huntington's disease’, then ranking this statement as more probable than ‘John has CAD and Huntington's disease’ would not be considered a conjunction fallacy. In addition, as in English and German, the word ‘probable’ is polysemous in Greek, and residents faced with multiple possible interpretations may infer a non-mathematical meaning of probability.
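The conjunction rule itself, P(A and B) ≤ P(B), holds mechanically in any sample, which a small simulation makes vivid. The disease rates below are illustrative placeholders, not figures from the study:

```python
import random

def conjunction_estimates(p_cad=0.3, p_hd=0.01, trials=200_000, seed=7):
    """Estimate P(HD) and P(CAD and HD) from simulated independent indicators.

    Whatever the underlying rates, the conjunction can never be more probable
    than either constituent event, because every 'both' case is also an HD case.
    """
    rng = random.Random(seed)
    hd_count = both_count = 0
    for _ in range(trials):
        cad = rng.random() < p_cad  # placeholder CAD rate
        hd = rng.random() < p_hd    # placeholder Huntington's rate
        hd_count += hd
        both_count += cad and hd
    return hd_count / trials, both_count / trials

p_hd, p_both = conjunction_estimates()
print(f"P(HD) ~ {p_hd:.4f}, P(CAD and HD) ~ {p_both:.4f}")
```

Every trial counted toward the conjunction is also counted toward the constituent, so the estimated P(CAD and HD) can never exceed the estimated P(HD), regardless of the chosen rates.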

A further interpretation of the high number of conjunction violations found in the present study may be that residents apply different inferential rules and heuristic procedures, resulting in answers that appear mathematically incorrect (Citation41). Physicians use a repertoire of clinically relevant assumptions to arrive at a differential diagnosis based on the patient's history, symptoms, and physical examination. These diagnostic norms tend to focus on the most relevant information and assign that information its appropriate weight, taking into account various clinical goals and expectations. This strategy may indeed disregard syntactical systems such as the conjunction rule. However, medical decision making does not depend solely on the probability of an event (e.g., a disease). More often, clinicians focus on calculating probability times payoff or cost when constructing a differential diagnosis list. Although this approach can lead physicians to rank the constituent as more probable than the conjunction, it may also reflect an effective and intelligent way to tackle clinical uncertainty. During pretesting of our conjunction fallacy vignette, residents were specifically made aware of the mathematical nature of the test and of the class-inclusion violation of ranking the probability of CAD and Huntington's disease as more probable than Huntington's disease. However, they still insisted that paying attention to the gist that the patient had multiple risk factors for CAD was far more reasonable than precise mathematical rankings.

Limitations

The cross-sectional design of the present study prevented determination of causality. Furthermore, in order to fully protect residents' anonymity, we were unable to collect any further data on non-respondents. Although the response rate was considerably higher than what is typical in surveys of physicians (Citation42), we cannot exclude the possibility that residents who felt less comfortable with statistical concepts may have been less willing to complete and return the questionnaire. The present survey was purposely kept brief in order to achieve maximum participation. We thus limited the assessment of residents' biostatistics knowledge to seven biostatistical concepts (Table 2), which nevertheless represent some of the most commonly used statistical methods found in the medical literature today.

Conclusions

The measured biostatistics performance in our survey indicates that an alarmingly large number of medical residents are unable to correctly interpret crucial statistical concepts that are commonly found in the medical literature. These difficulties were pervasive, extending even to residents with prior biostatistics education and advanced graduate training. In addition, residents were found to be especially prone to the gambler's fallacy bias, which may undermine clinical judgment and medical decision making. Frequent violations of the conjunction rule in a clinical scenario were also observed and residents were uncomfortable with the mathematical explanation of the problem. The low performance of medical residents in the biostatistics knowledge tests and the gambler's fallacy question was independent of any prior biostatistics education, which mainly occurred during medical school. Therefore, in order to adequately develop and maintain the biostatistical reasoning of medical residents, educators may need to re-evaluate how such information is taught as well as emphasize and systematize the teaching of statistical concepts during residency.

Ethical approval

The study complied with Greek requirements for survey studies. Ethical approval was not required as responses were fully anonymous, participation was elective, all participants were approached during breaks from work or training and the study did not contain questions on sensitive topics.

Conflict of interest and funding

This research received no financial or other support from any funding agency in the public, commercial, or not-for-profit sectors. The authors have no competing interests to report.

Acknowledgements

The authors thank Dr Antonis Economou, Assistant Professor in the Department of Mathematics of the University of Athens, for reviewing the mathematical appropriateness of questions and response options used in the present study.

References

  • Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003; 326: 319–21.
  • Agrawal S, Saluja I, Kaczorowski J. A prospective before-and-after trial of an educational intervention about pharmaceutical marketing. Acad Med. 2004; 79: 1046–50.
  • Bland JM, Altman DG. Misleading statistics: errors in textbooks, software and manuals. Int J Epidemiol. 1988; 17: 245–7.
  • Bland M, Altman DG. Caveat doctor: a grim tale of medical statistics textbooks. Br Med J (Clin Res Ed). 1987; 295: 979.
  • Gore SM, Jones IG, Rytter EC. Misuse of statistical methods: critical assessment of articles in BMJ from January to March 1976. Br Med J. 1977; 1: 85–7.
  • Ioannidis JP. Why most published research findings are false. PLoS Med. 2005; 2: e124.
  • von Roten FC, de Roten Y. Statistics in science and in society: from a state-of-the-art to a new research agenda. Public Underst Sci. 2013; 22: 768–84.
  • Weiss ST, Samet JM. An assessment of physician knowledge of epidemiology and biostatistics. J Med Educ. 1980; 55: 692–7.
  • Berwick DM, Fineberg HV, Weinstein MC. When doctors meet numbers. Am J Med. 1981; 71: 991–8.
  • Wulff HR, Andersen B, Brandenhoff P, Guttler F. What do doctors know about statistics? Stat Med. 1987; 6: 3–10.
  • Laopaiboon M, Lumbiganon P, Walter SD. Doctors' statistical literacy: a survey at Srinagarind Hospital, Khon Kaen University. J Med Assoc Thai. 1997; 80: 130–7.
  • Levy PS, Stolte K. Statistical methods in public health and epidemiology: a look at the recent past and projections for the next decade. Stat Methods Med Res. 2000; 9: 41–55.
  • Hellems MA, Gurka MJ, Hayden GF. Statistical literacy for readers of Pediatrics: a moving target. Pediatrics. 2007; 119: 1083–8.
  • Horton NJ, Switzer SS. Statistical methods in the journal. N Engl J Med. 2005; 353: 1977–9.
  • Windish DM, Huot SJ, Green ML. Medicine residents' understanding of the biostatistics and results in the medical literature. JAMA. 2007; 298: 1010–22.
  • Hack JB, Bakhtiari P, O'Brien K. Emergency medicine residents and statistics: what is the confidence? J Emerg Med. 2009; 37: 313–18.
  • Anderson BL, Williams S, Schulkin J. Statistical literacy of obstetrics–gynecology residents. J Grad Med Educ. 2013; 5: 272–5.
  • Detmer DE, Fryback DG, Gassner K. Heuristics and biases in medical decision-making. J Med Educ. 1978; 53: 682–3.
  • Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003; 78: 775–80.
  • Brehaut JC, Poses R, Shojania KG, Lott A, Man-Son-Hing M, Bassin E, et al. Do physician outcome judgments and judgment biases contribute to inappropriate use of treatments? Study protocol. Implement Sci. 2007; 2: 18.
  • Phua DH, Tan NC. Cognitive aspect of diagnostic errors. Ann Acad Med Singapore. 2013; 42: 33–41.
  • Hershberger PJ, Part HM, Markert RJ, Cohen SM, Finger WW. Development of a test of cognitive bias in medical decision making. Acad Med. 1994; 69: 839–42.
  • Sladek RM, Phillips PA, Bond MJ. Measurement properties of the Inventory of Cognitive Bias in Medicine (ICBM). BMC Med Inform Decis Mak. 2008; 8: 20.
  • Bakwin H. Pseudodoxia pediatrica. N Engl J Med. 1945; 232: 691–7.
  • Ayanian JZ, Berwick DM. Do physicians have a bias toward action? A classic study revisited. Med Decis Making. 1991; 11: 154–8.
  • Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974; 185: 1124–31.
  • Dohmen T, Falk A, Huffman D, Marklein F, Sunde U. Biased probability judgment: evidence of incidence and relationship to economic outcomes from a representative sample. J Econ Behav Organ. 2009; 72: 903–15.
  • Xue G, He Q, Lei X, Chen C, Liu Y, Lu ZL, et al. The gambler's fallacy is associated with weak affective decision making but strong cognitive ability. PLoS One. 2012; 7: e47019.
  • Tversky A, Kahneman D. Extensional versus intuitive reasoning: the conjunction fallacy in probability judgment. Psychol Rev. 1983; 90: 293–315.
  • von Elm E, Altman DG, Egger M, Pocock SJ, Gotzsche PC, Vandenbroucke JP. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ. 2007; 335: 806–8.
  • AUTH. Aristotle University of Thessaloniki: Career Services Office: position availability for medical specialties in Greece database. 2009. Available from: http://www.cso.auth.gr/Greek/Baseis/Eid/Eidikotites.gr.htm [cited 14 December 2008]. [In Greek].
  • R Development Core Team. R: a language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2012.
  • Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951; 16: 297–334.
  • Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995; 274: 700–5.
  • Bornstein BH, Emler AC. Rationality in medical decision making: a review of the literature on doctors' decision-making biases. J Eval Clin Pract. 2001; 7: 97–107.
  • Arnoult LH, Anderson CA. Identifying and reducing causal reasoning biases in clinical practice. Reasoning, inference, and judgment in clinical psychology. New York: Free Press; 1988, pp. 209–32.
  • Gruppen LD, Margolin J, Wisdom K, Grum CM. Outcome bias and cognitive dissonance in evaluating treatment decisions. Acad Med. 1994; 69: S57–9.
  • Hertwig R, Gigerenzer G. The "conjunction fallacy" revisited: how intelligent inferences look like reasoning errors. J Behav Decis Making. 1999; 12: 275–305.
  • Hertwig R, Chase VM. Many reasons or just one: how response mode affects reasoning in the conjunction problem. Think Reas. 1998; 4: 319–52.
  • Stanovich KE, West RF. Individual differences in reasoning: implications for the rationality debate? Behav Brain Sci. 2000; 23: 645–65; discussion 665–726.
  • Reyna V, Brainerd C. Numeracy, ratio bias, and denominator neglect in judgments of risk and probability. Learn Indiv Differ. 2008; 18: 89–107.
  • Kellerman SE, Herold J. Physician response to surveys: a review of the literature. Am J Prev Med. 2001; 20: 61–7.