
A KSA system for competency-based assessment of clinicians’ professional development in China and quality gap analysis

Article: 2037401 | Received 26 Oct 2020, Accepted 31 Jan 2022, Published online: 09 Feb 2022

ABSTRACT

Background

We aim to create a holistic competency-based assessment system to measure competency evolution over time – one of the first such systems in China.

Method

Two rounds of self-reported surveys were fielded among graduates of the Shantou University Medical College: June through December 2017, and May through August 2018. Responses from three cohorts of graduates specializing in clinical medicine – new graduates, resident physicians, and senior physicians – were analyzed. Gaps between respondents’ expected and existing levels of competencies were examined using a modified service quality model, SERVQUAL.

Results

A total of 605 questionnaires were collected in 2017 for the construction of competency indicators and a 5-level proficiency rating scale, and 407 in 2018 for confirmatory factor analysis and competency gap analysis. Reliability coefficients for all 36 competency indicators were greater than 0.9. Three competency domains were identified through exploratory factor analysis: knowledge (K), skills (S), and attitude (A). The confirmatory factor analysis confirmed the fit of the scale (CMIN/DF < 4; CFI > 0.9; IFI > 0.9; RMSEA ≤ 0.08). Within the cohorts of resident and senior physicians, the largest competency gap was seen in the domain of knowledge (K): −1.84 and −1.41, respectively. Among new graduates, the largest gap was found in the domain of skills (S) (−1.92), with the gap in knowledge (−1.91) trailing closely behind.

Conclusions

A competency-based assessment system is proposed to evaluate clinicians’ competency development in three domains: knowledge (K), skills (S), and attitude (A). The system consists of 36 competency indicators, a rating scale of 5 proficiency levels, and a gap analysis to measure competency evolution through 3 key milestones in a clinician’s professional career: new graduate, resident physician, and senior physician. The competency gaps identified can provide an evidence-based guide for clinicians’ own continuous development as well as for future medical curriculum improvements.

Introduction

Epstein and Hundert [1,2] defined systems-based competencies for health professionals as follows: ‘the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community being served.’ The Lancet Commissions’ 2010 report, ‘Health Professionals for a New Century: Transforming Education to Strengthen Health Systems in an Interdependent World’, further advocated that ‘a 3rd generation [of medical education] is now needed that should be systems based to improve the performance of health systems by adapting core professional competencies to specific contexts while drawing on global knowledge’ [3].

The earliest literature on physician core competencies can be traced back to the 1970s. Government agencies and organizations worldwide have since been continuously updating these competencies. In 1998, the Accreditation Council for Graduate Medical Education (ACGME) in the U.S. defined core competencies in 6 areas for health practitioners [4]. In 2005, the Royal College of Physicians and Surgeons of Canada published the CanMEDS 2005 Physician Competency Framework as an update to the previous version (published in 1996 and entitled ‘Skills for the New Millennium’), outlining 7 physician roles [5]. In 2013, the General Medical Council in the UK issued a guidance document entitled ‘Good Medical Practice’ to delineate the duties of doctors [6].

In order to transform medical education for the twenty-first century, it is essential that educational institutes (medical colleges and schools, teaching hospitals, etc.) strengthen their faculty teams and promote curriculum reforms to elevate a broad range of capabilities of medical personnel [3,7]. Graduate surveys that collect feedback from the recipients of medical education have been relied upon as one of the effective tools to gauge teaching quality at medical institutes, and can provide valuable input to help direct plans to improve medical curricula [8,9].

Since 1998, China has implemented the largest reform of medical education in the world by incorporating professional training into college education. This has significantly boosted the enrollment of health-care professionals at medical institutes [7]. In 2015, standardized resident training was also introduced in China [10]. There are three main tracks of formal medical education in China, which aspiring high-school graduates can pursue: the 5-year, the 5 + 3, and the 8-year programs. For the 5-year track, high-school graduates enroll in the undergraduate medical program and receive a bachelor’s degree at the end of their 5 years of study (‘new graduates’). These new graduates are eligible for standardized resident training, which lasts another 3 years. For the 5 + 3 track, after students complete their initial 5-year training (equivalent to that of the 5-year track), they attend a 3-year standardized resident training and are awarded a master’s degree together with a standardized resident training certificate (‘resident physicians’) upon completing the program. For the 8-year track, high-school graduates attend a broader training program that spans basic and clinical medicine as well as the liberal arts, and receive an MD (Doctor of Medicine) degree at the end of their 8 years of study. Like new graduates, MDs can take up additional standardized resident training lasting 2 to 3 years. The bachelor’s degree prepares new graduates for a career in clinical medicine, if they so choose, or related professions. The goal of the 5 + 3 training program is to cultivate a pipeline of clinical physicians, while the 8-year program aims to incubate medical talents with more versatility.

From 2012 to 2014, Dr. Baozhi Sun, former Vice President of the China Medical University, joined efforts with a team of scholars to conduct a large-scale, cross-sectional survey among clinicians in 31 provinces and cities across the country. They constructed the ‘Chinese Doctors’ Common Competency Model’, which consists of 76 indicators and covers 3 key dimensions of competency: knowledge, skills, and attitude (KSA). The model encompasses the following aspects of medicine: clinical skills and patient care, disease prevention and health promotion, information and management, medical knowledge and life-long learning, interpersonal communication, teamwork and scientific research, core values, and professionalism. However, the model developed by Sun et al. mainly targets senior physicians with more extensive clinical experience as practitioners.

Nevertheless, holistic systems to assess medical graduates’ professional development as they progress through different phases of their career remain few and far between in China. What is also lacking is a keen appreciation of professional development as a continuous and dynamic process, as well as reproducible investigations to assess this process. Therefore, the objective of the current study is to create a holistic competency-based assessment system comprising 3 components: competency indicators suitable for clinicians in different phases of their career, a rating scale aligned with the progression of skill acquisition, and an analytical tool to measure competency evolution over time – one of the first such systems in China.

Method

A competency-based KSA assessment system was designed by drawing from the conceptual framework of Norcini, who espoused that an effective assessment system should include three segments: competency (defined by indicators), level of assessment (degree of mastery), and assessment of progression (skill acquisition through stages) [2,11,12]. The Dreyfus model, which holds that any skill acquisition spans 5 stages – novice, advanced beginner, competent, proficient, and expert [13] – was also consulted to create a more nuanced scheme for assessing the mastery level of competency. This study was approved by the Ethics Committee of the Shantou University Medical College (SUMC).

KSA-based competency indicators and a rating scale

Thirty-six indicators (Figure 1) were derived by combining and simplifying closely related indicators from the model created by Sun et al. [2], so that the scale would be applicable for surveying a more diverse group of clinicians – that is, new graduates, resident physicians, and senior physicians – who were selected to represent 3 key milestones in a clinician’s professional career. A more succinct scale also rendered the survey less cumbersome to administer and more enticing for respondents to complete, thereby allowing the collection of more meaningful data.

Figure 1. KSA Model of Core Competencies (36 indicators).


The questionnaire developed based on this scale includes two sections: basic information, and self-assessment of competencies. The self-assessment is based on a Likert-type scale covering 5 proficiency levels plus a ‘0’ option, defined as follows: ‘0’ represents ‘do not know’; ‘1’ represents ‘beginner’ (having acquired cognitive understanding of the relevant basics); ‘2’ represents ‘application’ (being able to practice or simulate under the guidance of others); ‘3’ represents ‘competent’ (being able to practice independently in the real world according to standards); ‘4’ represents ‘proficient’ (being able to practice independently and deliver top-quality outcomes); and ‘5’ represents ‘expert’ (being able to serve as an example for peers and in an advisory capacity, as well as to participate in the development of standards). Respondents were asked to rate both their existing and expected levels of competency.
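For reference, the rating scale can be encoded directly. Below is a minimal Python sketch; the constant name and comments are illustrative, not part of the original instrument:

# Proficiency rating scale used for the self-assessment. A rating of 0
# ("do not know") is later treated as missing data (see the analysis below).
PROFICIENCY_LEVELS = {
    0: "do not know",
    1: "beginner",     # cognitive understanding of the relevant basics
    2: "application",  # can practice or simulate under guidance
    3: "competent",    # practices independently according to standards
    4: "proficient",   # practices independently, delivers top-quality outcomes
    5: "expert",       # an example and advisor for peers; helps develop standards
}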

The anonymous questionnaire was made available on the graduate survey platform (http://bysczdc.med.stu.edu.cn/) at the SUMC from June through December 2017. All participants had earned a degree in clinical medicine from the SUMC or partnering hospitals. Responses from those enrolled in 2012 (‘new graduates’), 2008/2009 (‘resident physicians’), or 2005/2006 (‘senior physicians’) were analyzed for the construction of competency indicators. Respondents were informed that their answers would be kept strictly confidential and that they could withdraw from the survey at will. All participants completed and submitted the questionnaires electronically or on paper.

Questionnaires collected were excluded from analysis if they met any of the following criteria: from graduates who earned their degrees outside the 3 time points specified; from respondents who no longer worked in the field of clinical medicine; from those who populated the answers mechanically (e.g., filled every question with an identical answer); or from respondents who submitted multiple questionnaires using the same Internet Protocol (IP) address (in this case, the last questionnaire submitted was treated as valid input and the rest were discarded).
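These exclusion rules amount to a short data-cleaning step. A minimal sketch in Python/pandas follows; column names such as ip_address and submitted_at, and the precomputed cohort/occupation flags, are assumptions for illustration rather than the authors’ actual pipeline:

import pandas as pd

def clean_responses(df: pd.DataFrame, item_cols: list) -> pd.DataFrame:
    """Apply the study's exclusion criteria to raw questionnaire rows."""
    # Keep respondents from the specified cohorts who still work in
    # clinical medicine (flags assumed to be precomputed).
    df = df[df["eligible_cohort"] & df["in_clinical_medicine"]]
    # Drop mechanically populated questionnaires: an identical answer on
    # every item means only one unique value per row.
    df = df[df[item_cols].nunique(axis=1) > 1]
    # For multiple submissions from one IP address, keep only the last.
    df = df.sort_values("submitted_at")
    df = df.drop_duplicates(subset="ip_address", keep="last")
    return df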

SPSS Statistics 21.0 for Windows (IBM Corp., Armonk, New York) was used to analyze reliability and validity. A competency level of ‘0’ was treated as missing data and substituted with the mean score (‘mean imputation’) [14]. The Cronbach’s alpha value was used to evaluate internal consistency. A Kaiser–Meyer–Olkin (KMO) measure greater than 0.9 and a significance level of Bartlett’s test of sphericity less than 0.05 would indicate that the data were suitable for exploratory factor analysis (EFA) [15]. Factors with eigenvalues greater than 1 and factor loadings greater than 0.45 would be extracted after orthogonal rotation with Kaiser normalization. If an item had multiple factor loadings greater than 0.45, the factor with the highest loading would be selected [16].
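The same reliability and EFA steps can be reproduced outside SPSS. The sketch below is a substitution using pandas/NumPy and the open-source factor_analyzer package, not the authors’ actual SPSS procedure; ratings is assumed to be a DataFrame with one column per indicator (36 columns):

import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency of a set of indicator columns."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Treat level 0 ("do not know") as missing, then mean-impute per item.
ratings = ratings.replace(0, np.nan)
ratings = ratings.fillna(ratings.mean())

print("Cronbach's alpha:", cronbach_alpha(ratings))  # retained if > 0.9
chi2, p = calculate_bartlett_sphericity(ratings)     # suitable if p < 0.05
_, kmo_overall = calculate_kmo(ratings)              # suitable if > 0.9

# EFA with orthogonal (varimax) rotation: keep factors with eigenvalue > 1,
# then assign each item to its highest factor loading (loadings > 0.45).
probe = FactorAnalyzer(n_factors=ratings.shape[1], rotation=None).fit(ratings)
eigenvalues, _ = probe.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())
efa = FactorAnalyzer(n_factors=n_factors, rotation="varimax").fit(ratings)
loadings = pd.DataFrame(efa.loadings_, index=ratings.columns)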

For the confirmatory factor analysis, a separate random survey (using the same questionnaire) was fielded from May through August 2018 among graduates who had enrolled in the clinical medicine department at the SUMC in 2013 (‘new graduates’), 2010 (‘resident physicians’), or 2007 (‘senior physicians’). Confirmatory factor analysis using the software Amos 21.0 for Windows (IBM Corp., Armonk, New York) was carried out to test the fit of the scale. A reasonable fit of the scale would be determined based on the following: chi-square to degrees of freedom ratio (CMIN/DF) < 4; comparative fit index (CFI) > 0.9; incremental fit index (IFI) > 0.9; and root mean square error of approximation (RMSEA) ≤ 0.08 [17].
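Whichever SEM tool computes the indices (the authors used Amos; open-source alternatives exist), the decision rule reduces to a simple threshold check. A minimal sketch, assuming the indices have already been estimated:

def fit_is_reasonable(cmin_df: float, cfi: float, ifi: float, rmsea: float) -> bool:
    """Apply the study's cut-offs for a reasonable model fit."""
    return cmin_df < 4 and cfi > 0.9 and ifi > 0.9 and rmsea <= 0.08

# The values reported in the Results section satisfy all four criteria:
print(fit_is_reasonable(cmin_df=3.596, cfi=0.905, ifi=0.905, rmsea=0.080))  # True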

Gap analysis of competencies and perceived quality of medical education by graduates

A revised service quality model, SERVQUAL – which was originally designed for commercial applications to business services [17] – was employed for the competency gap analysis based on the same survey responses collected in 2018. The quality of medical education (as measured by the gap between the existing and expected competency levels) for the $i$th indicator is represented by $Q_i = \bar{P}_i - \bar{E}_i$, where $\bar{P}_i$ denotes the mean perceived existing level of competency for the $i$th indicator, and $\bar{E}_i$ the mean expected level [18]. The quality of medical education for each of the KSA domains is $Q = \frac{1}{m} \sum_{i=1}^{m} (\bar{P}_i - \bar{E}_i)$, where $m$ represents the number of indicators in the domain. When $m = 36$, $Q$ indicates the overall quality of medical education. The Kruskal–Wallis test was used to analyze the differences in perceived quality among the three groups of respondents.
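A minimal sketch of the gap computation and the between-cohort comparison in Python; scipy stands in for the authors’ statistical software, and the array layout and variable names (perceived, expected, the per-cohort gap arrays) are assumptions:

import numpy as np
from scipy.stats import kruskal

def indicator_gaps(perceived: np.ndarray, expected: np.ndarray) -> np.ndarray:
    """Q_i = mean perceived level - mean expected level, per indicator.

    perceived and expected are (n_respondents, n_indicators) rating matrices.
    """
    return perceived.mean(axis=0) - expected.mean(axis=0)

def domain_quality(q: np.ndarray, domain_indicators: list) -> float:
    """Q for one KSA domain: the average gap over its m indicators."""
    return float(np.mean(q[domain_indicators]))

q = indicator_gaps(perceived, expected)  # one Q_i per indicator
overall_q = float(q.mean())              # m = 36: overall quality of education

# Kruskal-Wallis test across the three cohorts' per-respondent gap scores.
stat, p_value = kruskal(gaps_new_graduates, gaps_residents, gaps_seniors)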

Results

The KSA-based competency indicators

Reliability and validity. There were 226, 193, and 186 questionnaires collected from new graduates, resident physicians, and senior physicians, respectively, which were included according to the established criteria (Table 1). The Cronbach’s alpha values (reliability coefficients) for each item in the questionnaire and for the questionnaire as a whole were both greater than 0.9. Therefore, all 36 core competency indicators were retained. The KMO values associated with the 3 groups of respondents were 0.967, 0.964, and 0.943, respectively. The p values of Bartlett’s sphericity test were less than 0.001. The indicators were thus suitable for factor analysis.

Table 1. A Summary of Questionnaire Responses

Based on the exploratory factor analysis, 3, 3, and 5 factors were extracted from the groups of new graduates, resident physicians, and senior physicians, respectively (Table 2). Three of the five factors extracted from senior physicians shared the same constructs and were combined into one single factor (i.e., ‘knowledge’). The Cronbach’s alpha values for all factors associated with each group were greater than 0.9, indicating high internal consistency. The factors extracted were analyzed further, and three domains emerged with which the competency indicators measured were aligned: knowledge (K), skills (S), and attitude (A).

Table 2. Rotated Component Matrix of Exploratory Factor Analysis

Confirmatory factor analysis. There were 159, 126, and 122 questionnaires collected from the 3 cohorts of respondents, respectively, that were included according to the established criteria (Table 1). A reasonable fit of the scale was confirmed based on the following: CMIN/DF = 3.596; CFI = 0.905; IFI = 0.905; RMSEA = 0.080.

Gap analysis of competencies and perceived quality of medical education by graduates

As shown in Table 3, the Q values represent the gaps between the existing and expected levels of competency as perceived by the 3 groups of participating graduates. Q values are negative for all 36 core competency indicators. Based on the total Q values, the largest overall competency gap is found among new graduates (−1.81), followed by resident physicians (−1.70) and senior physicians (−1.29), in that order. Within individual cohorts, the largest gap among resident physicians (−1.84) and senior physicians (−1.41) is seen in the domain of knowledge (K). Among new graduates, the largest gap (−1.92) is associated with the domain of skills (S), with the gap in knowledge (−1.91) trailing closely behind. For the domain of skills (S), both resident and senior physicians perceive their existing competency to be merely at the ‘application’ level (2.46 and 2.77, respectively), which contrasts starkly with their expected level of ‘proficient’ (4.13 and 4.02, respectively). New graduates, on the other hand, view their existing level as ‘application’ (2.00), while expecting their competency to reach the level of ‘competent’ (3.92).

Table 3. Competency Gap Analysis Based on the Modified SERVQUAL Model

Discussion

Unlike previous research that focused on such parameters as the tangibility, reliability, responsiveness, and assurance of services, as well as the empathy of faculty and staff [19,20], our study aimed to evaluate the evolution of medical graduates’ core competencies in 3 domains: knowledge (K), skills (S), and attitude (A). We designed a competency-based assessment system that is holistic and implementable to examine how clinicians’ competencies have evolved from when they were new medical graduates, through residency, to becoming seasoned practicing physicians. The gap analysis, another component of our system, yielded uniquely valuable insights into the quality of medical education as perceived by the participating graduates.

In Table 3, the negative Q values for all 36 competency indicators among the 3 cohorts of graduates suggest a higher expected level of competency than participants’ perceived existing level. Based on the total Q values, the largest overall competency gap is seen among new graduates, followed by residents and senior physicians, in that order. In terms of domains, distinct gaps are found in the domains of skills (S) and knowledge (K) in all 3 cohorts. Hence, there appear to be cohort-specific and domain-specific contributors to these gaps, and targeted remedial measures will be needed to bridge them. For example, at the indicator level, the biggest gap among new graduates is associated with ‘conducting emergency rescue’, followed by ‘formulating the treatment plan’ – both indicators fall within the domain of skills (S). To bridge the gap, additional class hours – as part of the clinical skill training series at the SUMC – can be devoted to scenario-based simulation training. At the domain level, the biggest gap is found in the domain of skills (S) among new graduates. As required by laws and clinical practice standards in China, all medical activities shall be conducted under the supervision of senior physicians to ensure the safety of patients and the learning environment of medical students. New graduates can thus only reach the level where they can ‘apply’ the knowledge learned, but cannot reach the ‘competent’ level where they follow standard guidelines and practice independently. The ‘competent’ level is now a requirement for standardized resident training in China. Therefore, there is a more urgent need to ramp up new graduates’ clinical skills, so they can be better prepared as they transition to the residency phase, where more emphasis is placed on clinical practice. Methods such as simulation techniques, standardized patients, and enhanced clinical exposure can all help elevate new graduates’ clinical skills [21,22].

Different levels of expectation were also found between new graduates and residents/senior physicians. While new graduates hoped to reach the level of ‘competent’ (competency level = 3) for the great majority of indicators when they graduated, resident and senior physicians aspired to become ‘proficient’ (competency level = 4) for more indicators. This difference is not a total surprise, given the different professional development phases that these graduates find themselves in. However, upon closer examination, the expectation of ‘being proficient’ appeared predominantly associated with the domain of skills (S) among residents (9 out of 10 skill-related indicators) and senior physicians (8 out of 10 skill-related indicators), and, to a lesser degree, among new graduates (2 out of 10 skill-related indicators). Interestingly, this strong correlation was not seen with the domains of attitude (A) or knowledge (K). In other words, the study participants did not demonstrate a similar degree of expectation for attitude- and knowledge-related competencies. This gravitation toward skill-defined competencies may reflect a paradigmatic orientation among medical graduates from the SUMC as a whole – one that places higher emphasis on ‘skill acquisition’ than on the development of competencies in areas such as attitude and knowledge. This finding highlights the need to drive home not only the ultimate goal of nurturing well-rounded health-care professionals but also the importance of operationalizing this aim, so that professional expectations can be raised accordingly and training courses/programs fit to deliver on this goal will be created and propagated. Fulfilling this objective also underscores the value of a multi-component assessment system, as proposed by this study, for measuring the multiple dimensions of medical competency.

Implications

The competency-based assessment system that we propose can be completed not only by ‘receivers’ of medical education/training (e.g., medical graduates, as in the current study), but also by ‘administrators’ (e.g., instructors, supervisors) and ‘beneficiaries’ (e.g., patients) of this education/training (although some modification of the indicators may be needed for the survey to be more meaningful to the latter group of stakeholders). This broad set of potential applications can facilitate the creation of a 360-degree survey of clinicians’ core competencies, which echoes the systems-oriented characterization of the competencies that physicians need to demonstrate in order to serve the health-care needs of a society – a view elucidated by Epstein and Hundert [1,2] and referenced in the Introduction of this report.

Unlike assessment systems that simply rate clinicians’ competency levels at particular time points, the gap analysis incorporated into our system compares existing with expected levels of competencies and empowers the receivers of medical education by acknowledging the value of their feedback. Insights culled from this group of stakeholders can inform policy-makers and administrators of medical education as these decision-makers endeavor to instigate on-target improvements to bridge pedagogical gaps. On the other hand, gap analysis facilitates the establishment of personal benchmarks for clinicians, allowing them to take stock of the progress they have made and titrate their goals and expectations as they continue evolving professionally [23–25].

Additionally, the competency-based KSA scale proposed in our study can serve as a reference to guide the reform of medical licensing examinations. In the past, these examinations focused on knowledge. Today, more emphasis is placed on physicians’ professionalism and clinical skills. Pending additional feasibility investigations, the 36 indicators contained in the scale can be developed into an expanded set of criteria to assist the redesign of medical licensing examinations.

Limitations of the study

As constrained by time and resources, the assessment rated by the 3 cohorts of graduates from the SUMC – new graduates, resident physicians, and senior physicians – was used as a proxy to gauge clinicians’ professional development over time. Hence, the development trends found in this research may diverge from those in a longitudinal study that monitors the competency evolution of one single group of graduates. Secondly, the analysis of competency gaps in the study was based on participants’ self-assessment, which might not fully align with assessments based on more objective measures or furnished by other key stakeholders such as patients, supervising physicians, peers, and nurses. Interpretations and extrapolations of the study findings thus need to be pursued with caution. Last but not least, the KSA-based assessment system proposed in our study was tested only among graduates from one medical university, and needs to be further validated at additional medical institutes and in different parts of the country.

Conclusion

A competency-based assessment system is proposed to evaluate clinicians’ competency development in 3 domains: knowledge (K), skills (S), and attitude (A). The system consists of 36 competency indicators, a rating scale of 5 proficiency levels, and a gap analysis to measure competency evolution through 3 key milestones in a clinician’s professional career: new graduate, resident physician, and senior physician. The competency gaps identified can provide an evidence-based guide for clinicians’ own continuous development as well as for future medical curriculum improvements.

Data availability

Data are available from the corresponding author upon reasonable request.

Acknowledgments

The authors wish to acknowledge the support of all the respondents, the National Medical Examination Center in China, and the late Dr. Ting Long (Shantou University Medical College), who was involved in the study design as well as the creation of the questionnaire and the 36 core competency indicators. The authors also appreciate the guidance and suggestions furnished by Prof. Junhui Bian (former Dean of the Shantou University Medical College) and Mianhua Yang (Shantou University Medical College).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was sponsored, guided, and assisted in its implementation by the National Medical Examination Center, the Ministry of Education Humanities and Social Science Project in the People’s Republic of China [17YJA880107], and the 2017 College Students’ Innovative Entrepreneurial Training Plan Program in Guangdong Province [No. 201710560108 & No. 201710560114] in the People’s Republic of China.

References

  • Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–235.
  • Sun B, Li J, Wang Q. Zhong guo lin chuang yi sheng gang wei sheng ren li mo xing gou jian yu ying yong [Construction and application of Chinese doctors’ common competency model]. Beijing: Ren min wei sheng chu ban she [People’s Medical Publishing House Co., Ltd]; 2015.
  • Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376(9756):1923–1958.
  • Ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542–547.
  • Frank JR, ed. The CanMEDS 2005 Physician Competency Framework: Better Standards, Better Physicians, Better Care. 2nd ed. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2005.
  • General Medical Council (GMC). Good medical practice. https://www.gmc-uk.org/-/media/documents/good-medical-practice—english-1215_pdf-51527435.pdf. Published 2013 Mar 25. Updated 2014 Apr 29. Accessed 2019 Sep 17.
  • Hou J, Michaud C, Li Z, et al. Transformation of the education of health professionals in China: progress and challenges. Lancet. 2014;384(9945):819–827.
  • Distlehorst LH. What graduate follow-up studies can tell us. Med Educ. 2000;34(12):976–977.
  • Lockwood JH, Sabharwal RK, Danoff D, et al. Quality improvement in medical students’ education: the AAMC medical school graduation questionnaire. Med Educ. 2004 Mar;38(3):234–236.
  • National Health Commission of the People’s Republic of China, Department of Science, Technology and Education. Guidance of 7 departments including the National Health and Family Planning Commission on establishing a standardized training system for resident physicians. http://www.nhc.gov.cn/qjjys/s3593/201401/032c8cdf2eb64a369cca4f9b76e8b059.shtml. Published 2014 Jan 17. Accessed 2020 May 17.
  • Norcini JJ, Holmboe ES, Hawkins RE. Evaluation challenges in the era of outcomes-based education. In: Holmboe ES, Hawkins RE, editors. Practical guide to the evaluation of clinical competence. Philadelphia (PA): Mosby; 2008. p. 1–9.
  • Hawkins RE, Welcher CM, Holmboe ES, et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ. 2015;49(11):1086–1102.
  • Peña A. The Dreyfus model of clinical problem-solving skills acquisition: a critical perspective. Med Educ Online. 2010;15(1):109–117.
  • Zhou XH, Eckert GJ, Tierney WM. Multiple imputation in public health research. Stat Med. 2001;20(9–10):1541–1549.
  • Hair JF, Black B, Babin B, et al. Multivariate data analysis: international edition. J Rheumatol Suppl. 2007;35(2):357–358.
  • Hancock GR, Mueller RO. The Reviewer’s Guide to Quantitative Methods in the Social Sciences. New York NY: Routledge; 2010.
  • Shi C, Xu J, Chen K, et al. Chuang xin lin chuang ji ben ji neng jiao xue, ti gao yi xue sheng lin chuang zong he neng li [Innovation of Basic Clinical Skills Teaching and Improvement of Medical Students’ Clinical Comprehensive Ability]. Zhong hua yi xue jiao yu za zhi [Chinese Journal of Medical Education]. 2010;30(5):742–743.
  • Parasuraman A, Zeithaml VA, Berry LL. A conceptual model of service quality and its implications for future research. J Mark. 1985;49(4):41–50.
  • Aghamolaei T, Zare S. Quality gap of educational services in viewpoints of students in Hormozgan University of medical sciences. BMC Med Educ. 2008;8(1):34.
  • Kebriaei A, Akbari F. Quality gap of educational services at Zahedan University of Medical Sciences, Iran. Bangladesh Med Res Counc Bull. 2008;34(3):76.
  • Marel GM, Lyon PM, Barnsley L, et al. Clinical skills in early postgraduate medical trainees: patterns of acquisition of confidence and experience among junior doctors in a university teaching hospital. Med Educ. 2000;34(12):1013–1015.
  • Swamy M, Sawdon M, Chaytor A, et al. A study to investigate the effectiveness of SimMan® as an adjunct in teaching preclinical skills to medical students. BMC Med Educ. 2014;14(1):231.
  • Cook DA, Andriole DA, Durning SJ, et al. Longitudinal research databases in medical education: facilitating the study of educational outcomes over time and across institutions. Acad Med. 2010;85(8):1340–1346.
  • Santonja-Medina F, García-Sanz MP, Martínez F, et al. Portfolio as a tool to evaluate clinical competences of traumatology in medical students. Adv Med Educ Pract. 2016;2016(7):57–61.
  • O’Brien CL, Sanguino SM, Thomas JX, et al. Feasibility and Outcomes of Implementing a Portfolio Assessment System Alongside a Traditional Grading System. Acad Med. 2016;91(11):1554–1560.