Review Article

Self-directed learning assessment practices in undergraduate health professions education: a systematic review

Article: 2189553 | Received 17 Nov 2022, Accepted 07 Mar 2023, Published online: 15 Mar 2023

ABSTRACT

Purpose

The goal of this systematic review was to examine self-directed learning (SDL) assessment practices in undergraduate health professions education.

Methods

Seven electronic databases were searched (PubMed, Embase, PsycINFO, ERIC, CINAHL, Scopus, and Web of Science) to retrieve English-language articles published between 2015 and July of 2022, investigating assessment of SDL learning outcomes. Extracted data included the sample size, field of study, study design, SDL activity type, SDL assessment method, number of SDL assessments used, study quality, number of SDL components present utilising the framework the authors developed, and SDL activity outcomes. We also assessed relationships between SDL assessment method and number of SDL components, study quality, field of study, and study outcomes.

Results

Of the 141 studies included, the majority of study participants were medical (51.8%) or nursing (34.8%) students. The most common SDL assessment method was an internally-developed perception survey (49.6%). When evaluating outcomes of SDL activities, most studies reported a positive or mixed/neutral outcome (58.2% and 34.8%, respectively). There was a statistically significant relationship between study quality and both the number and type of assessments used, with knowledge assessments (median MERSQI score 11.5) associated with higher study quality (p < 0.001). Less than half (48.9%) of the studies used more than one assessment method to evaluate the effectiveness of SDL activities. Using more than one assessment was associated with higher study quality than using a single assessment (mean MERSQI score 9.49 for single-assessment studies; p < 0.001).

Conclusions

The results of our systematic review suggest that SDL assessment practices within undergraduate health professions education vary greatly, as different aspects of SDL were leveraged and implemented by diverse groups of learners to meet different learning needs and professional accreditation requirements. Evidence-based best practices for the assessment of SDL across undergraduate healthcare professions education should include the use of multiple assessments, with direct and indirect measures, to more accurately assess student performance.

Introduction

Health professionals across a wide variety of fields are charged with using evidence-based medicine to evaluate, diagnose, and treat patients. In addition, health professions fields are constantly evolving. Indeed, Densen estimated that the doubling time of medical knowledge was 50 years in 1950 and was projected to be just 73 days by 2020 [Citation1]. Compounding the difficulty, professionals’ knowledge declines over time [Citation2–4]. This loss of knowledge can have detrimental effects on patients, so it is critical that education promotes the skills associated with life-long learning. Because self-directed learning (SDL) is crucial for cultivating life-long learners [Citation5], it is a key component of health professions education [Citation6]. Self-directed learning allows health professionals to continue to grow and refine their knowledge in order to improve patient care and well-being [Citation7].

The definition of SDL, coined by Knowles in 1975, is ‘a process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes.’ [Citation8] In SDL, the learning responsibility shifts to students, and they take on an active role in developing and initiating actions necessary to achieve their learning goals. Another concept known as ‘self-regulated learning’ shares some similarities with SDL and is sometimes used interchangeably with SDL. Both concepts require active engagement of the learner, choice and decisions regarding the learning strategies, and evaluation [Citation9]. However, self-regulated learning commonly takes place in the classroom, originates from cognitive psychology, and focuses on the learning processes involved with a task [Citation9,Citation10]. Moreover, it relies on metacognitive and cognitive operations such as self-efficacy and self-awareness [Citation9]. Interestingly, SDL requires self-regulated learning [Citation9]. Studies have shown SDL is associated with increased confidence, self-efficacy, critical thinking, self-awareness, and autonomy [Citation11]. Learners who practise SDL become more aware of their deficiencies and how to address them; health professionals who use SDL continue to grow and refine their knowledge in order to improve patient care and well-being [Citation8,Citation12].

Indeed, accreditation bodies of many health professions education programs require a form of self-directed or life-long learning in the curriculum [Citation8,Citation13–22]. For example, the Liaison Committee on Medical Education (LCME, 2021) and the Commission on Dental Accreditation (CODA, 2020) accreditation standards directly require SDL curricular components [Citation20,Citation23]. However, what can be defined as SDL varies across different health professions [Citation8,Citation24,Citation25], and while some programs do not specify criteria or definitions for SDL, they still require that it be included in the curriculum. Moreover, the way in which SDL is assessed varies amongst and even within disciplines. As a result, a general consensus as to what activities constitute SDL and how SDL should be uniformly assessed amongst health professions is lacking [Citation26]. SDL is frequently assessed in a variety of manners, including, but not limited to, changes in behaviour, knowledge-based assessments, and qualitative thematic analysis of student reflections [Citation27–30]. All of these methods have the potential to be valid means to assess SDL. However, to our knowledge there is no comprehensive review of the assessment practices related to SDL, including the relationship with learning outcomes [Citation26]. This has led to a call for research into SDL outcome measures and methodology in order to advance SDL in medical education [Citation31].

This systematic review was initiated to further the understanding of the assessment of SDL in undergraduate health professions education. The primary goal of this systematic review was to answer the research question: ‘how is SDL assessed in undergraduate health professions education?’ The primary objectives of this review were to:

  1. determine the type of assessment methods used to evaluate reported SDL activities.

  2. report the outcomes of SDL activities on student learning based on the assessments identified as well as the components of SDL identified in a framework.

  3. determine if there is a relationship between the identified SDL assessments and the number of SDL components, the study quality, the field of study, and the study outcomes.

Materials and methods

Data sources and searches

This systematic review examines the types of assessment used to evaluate learning outcomes from SDL activities implemented in undergraduate health professions education programs. The review was prepared in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [Citation32]. Due to the lack of a standard definition of SDL and the diversity of nomenclature related to assessment and health professions students, we worked with a health information specialist (MM) to develop search strategies that incorporated multiple keywords and index terms related to key concepts (e.g., self-directed learning, health profession students, assessment). We conducted comprehensive searches for literature published between January 2015 and July 2022 in seven online databases: PubMed, Embase, PsycINFO, ERIC, CINAHL, Scopus, and Web of Science. A preliminary literature search showed a more consistent upward trend in publications related to SDL beginning around 2015; to ensure the most current literature was incorporated, the search was limited to articles published in or after 2015. Unique index terms specific to each database were identified where available, and search strategies tailored to each database were developed to optimise retrieval (see Appendix 1 for a comprehensive search strategy with all search terms). Citation review (hand searching of bibliographies) was performed by two investigators (SL, TT) to identify any additional studies. Literature search strategies were constructed based on the PRISMA-S and PRESS (Peer Review of Electronic Search Strategies) checklists [Citation32–34].

Development of framework

While various health professions agree that SDL is of great importance, there is no agreed-upon standard of what defines SDL, how SDL should be taught, or how SDL should be assessed. We created an SDL framework drawing on a literature review of available accreditation standards and guidelines related to SDL requirements across different health profession education programs (Figure 1) [Citation8,Citation13–22]. We used the framework to help develop the research objectives and to aid in identifying the different SDL components and the number of steps in the SDL interventions reported in included studies. The framework comprises seven steps, each of which involves various components or activities (Figure 1) [Citation8,Citation13–22]. The number of steps used during the described SDL activity was compared to the assessment(s) used as well as the reported outcomes of the assessment(s). Criteria to define SDL were not developed in this review; studies were included if the article authors stated that SDL was incorporated in a teaching or learning method.

Figure 1. Expanded self-directed learning (SDL) framework. These steps may serve as a framework for identifying SDL activities and/or SDL interventions discussed in research articles and assessment methods employed for studying outcomes of SDL interventions [Citation8,Citation13–22].

Study selection

Search results retrieved from all databases were downloaded and imported into Covidence systematic review software (www.covidence.org), a web-based program for managing and streamlining a systematic review (MM). Since the review focused primarily on determining the types of assessment used to evaluate the core elements of SDL in health professions education and the outcomes of those SDL activities, studies were included if the authors reported use of SDL activities and assessed student learning outcomes related to those activities. Additionally, the students evaluated had to be enrolled in undergraduate health professions educational programs. The authors elected to include health profession programs whose curricula commonly cover evidence-based patient care, in which students learn about the diagnosis, evaluation, and treatment of different disease states. For this reason, undergraduate students were drawn from health professions educational programs including medicine, nursing, physical therapy, occupational therapy, pharmacy, dentistry, physician assistant studies, optometry, chiropractic, and podiatry. Studies were only included if they were published original research in English with full text available. Articles were excluded if they were in languages other than English, investigated SDL assessment only in graduate medical education or other non-health professions education, or were publication types other than original research with empirical data.

Due to the wide variations in the definition and required components of SDL, all studies that reported the use of an SDL activity were included. Studies selected for the review contained at least one measure of assessing effectiveness associated with SDL, defined based on author declaration that SDL was utilised, and guided by the SDL framework created by the authors (see ) [Citation8,Citation13–22]. Due to the differences in SDL definition, activities, and assessments used, the findings and outcomes of the SDL intervention described were solely based on the authors’ reported data.

Two investigators (SL, TT) screened titles and abstracts, and then full-text articles, of candidate studies against the inclusion and exclusion criteria, independently and in duplicate. A third investigator (MM) served as a tie-breaker to resolve any discrepancies by consensus; a tie-breaker vote was required only a handful of times.

Data extraction

We developed a standard data extraction form that was pilot-tested on five articles and extensively refined for the extraction process. Two investigators (SL, TT) worked independently to extract data from included studies, with any disagreements resolved through consensus. A third investigator (MM) was required to resolve disagreements only a handful of times. Extracted data included the country in which each study was conducted, field of study, sample size, study design, description of SDL, SDL activities, assessment methods used to evaluate the effectiveness of SDL (student perceptions including SDL readiness, knowledge examinations, etc.), outcomes or findings of SDL activities, and number of SDL components. Psychometric information on the reliability and validity of any SDL assessment used in selected studies was also extracted when reported. Data on outcomes or findings of SDL activities were coded based on the outcomes reported by the article authors. For example, if article authors measured the effect of an SDL activity using an internal knowledge exam, we reported the outcome as positive if the article reported a knowledge gain due to SDL, and negative if it reported a knowledge decrease. If an article reported that student reactions to or perceptions of an SDL activity were low, we reported a negative outcome for SDL. A ‘mixed/neutral’ outcome was recorded if one assessment measure reported positive SDL outcomes while a second measure reported negative outcomes, or if the outcomes were neither positive nor negative. Studies that measured students’ reactions and perceptions, including readiness scales and internally or externally developed surveys, were categorised as ‘perception’; studies that used internal or external/standardised assessments to measure knowledge were categorised as ‘knowledge’; and studies that utilised other approaches (for example, a qualitative approach or peer review) were categorised as ‘other.’
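As an illustration only, the positive/negative/mixed-neutral coding rule described above can be expressed as a small decision function. The minimal Python sketch below assumes a hypothetical per-measure label for each assessment; the function name and data shape are ours, not part of the authors’ extraction form.

def code_sdl_outcome(measure_results):
    # measure_results: list of 'positive', 'negative', or 'neutral', one label per assessment measure
    has_positive = "positive" in measure_results
    has_negative = "negative" in measure_results
    if has_positive and not has_negative:
        return "positive"
    if has_negative and not has_positive:
        return "negative"
    # conflicting measures, or results that are neither positive nor negative
    return "mixed/neutral"

print(code_sdl_outcome(["positive", "positive"]))  # -> positive
print(code_sdl_outcome(["positive", "negative"]))  # -> mixed/neutral
print(code_sdl_outcome(["neutral"]))               # -> mixed/neutral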

Two investigators (KK, SL) examined and evaluated the SDL activities described in selected studies against the SDL framework (Figure 1) and extracted data on the SDL components present in these studies. A third reviewer (TT) reviewed the extracted data and resolved any disagreements by consensus, which was needed only a handful of times. Given that the selected studies describe different components of the SDL steps, we extracted information on the number of steps included in reported SDL interventions.

Quality assessment

The authors used the Medical Education Research Study Quality Instrument (MERSQI) to assess the methodological quality of included studies [Citation35]. This tool has been widely used to assess study quality, and the resulting score has been shown to be predictive of the likelihood of publication [Citation35–39]. The MERSQI is a 10-item instrument that evaluates quality in six domains: study design, sampling, type of data, data analysis, validity of the evaluation instrument, and outcome measures. Each domain has a maximum score of 3, and total scores range from 5 to 18, with higher scores signifying higher quality. We used this instrument to assess validity (content, relationship to other variables, and internal structure), sample response rate, number of institutions, complexity and appropriateness of the statistical analysis, the study outcomes assessed (perceptions, knowledge, behaviour, and patient outcomes), and study design (single group, non-randomised, randomised, etc.). If an assessment measure was validated by means of experts, guidelines, theory, or existing validated surveys, the measure was considered to have content validity. A measure was considered to have evidence of reliability if factor analysis or a measure of reliability (internal consistency, interrater, test-retest, etc.) was reported. Two reviewers (KK, TT) performed the quality assessment in duplicate, with a third investigator (SL) resolving any disagreements, which occurred only a handful of times, to ensure consistency and rigour.
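As a purely illustrative sketch of how a MERSQI total is tallied, the Python snippet below uses hypothetical domain scores; the six domains and the 3-point cap per domain follow the structure of the published instrument, but the example values are not drawn from any included study.

MAX_PER_DOMAIN = 3.0  # each MERSQI domain contributes at most 3 points

example_domain_scores = {
    "study_design": 2.0,            # e.g., non-randomised two-group design
    "sampling": 1.5,                # number of institutions plus response rate
    "type_of_data": 3.0,            # objective measurement rather than self-report
    "validity_of_instrument": 1.0,  # e.g., content validity evidence only
    "data_analysis": 2.0,           # appropriateness plus sophistication of analysis
    "outcomes": 1.5,                # e.g., knowledge or skills outcomes
}

assert all(0 <= score <= MAX_PER_DOMAIN for score in example_domain_scores.values())
total = sum(example_domain_scores.values())
print(f"MERSQI total: {total} (possible range 5-18; higher scores indicate higher quality)")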

Data analysis

Descriptive statistics (mean, median, mode, minimum, maximum, kurtosis, and skewness) were calculated for all quantitative variables (sample size, number of assessments, number of SDL components, and MERSQI scores) using https://www.socscistatistics.com/. Publication dates were examined for trends, if any.

The number of assessments used in each study was analysed to determine whether there was evidence of differences in study outcomes or characteristics based on whether each study used one, two, or three assessments of SDL. To determine whether studies had different distributions of SDL components based on the number of assessments, a Kruskal-Wallis test was used (to account for the ordinal nature of the number of SDL components). To test whether mean MERSQI scores differed by number of assessments, a one-way independent-measures ANOVA was employed. Lastly, chi-square analyses were performed separately to test for an association between the number of assessments and field of study (nursing/medical fields) and between the number of assessments and study outcome (positive, neutral/mixed, or negative).

As a second variable of interest, the type of assessment (perception, knowledge, or other) was investigated. This part of the analysis included only studies that used a single assessment type, not two or more. In a similar approach to the above, a Kruskal-Wallis test was used to determine whether the distribution of SDL components differed between assessment types (perception, knowledge, or other). Again, a one-way independent-measures ANOVA was used to test whether mean MERSQI scores differed between assessment types. Lastly, a chi-square test was run to test for an association between type of assessment and field of study.

All analyses were run using https://www.socscistatistics.com/. The Kolmogorov-Smirnov test for normality was run prior to the analyses. Two-sided tests were performed and an alpha of 0.05 was used. Pairwise comparisons were performed when applicable, and necessary steps were taken to account for multiple testing (Bonferroni/Tukey’s HSD). Statistical analysis was performed by KK and reviewed by a biostatistician. All statistical tests performed are summarised in Table 1.
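For readers who prefer to reproduce this family of analyses programmatically rather than through the web calculator, a minimal Python/SciPy sketch is shown below. All data values are hypothetical placeholders rather than the extracted study data, and the post hoc Bonferroni/Tukey’s HSD comparisons described above are omitted for brevity.

from scipy import stats

# Hypothetical numbers of SDL components, grouped by studies using 1, 2, or 3 assessments
components_one   = [4, 3, 5, 4, 2, 4]
components_two   = [4, 5, 3, 4, 3]
components_three = [3, 3, 4]
h_stat, p_kw = stats.kruskal(components_one, components_two, components_three)

# Hypothetical MERSQI scores for the same three groups
mersqi_one   = [9.0, 8.5, 10.0, 9.5, 9.0]
mersqi_two   = [11.0, 11.5, 12.0, 10.5]
mersqi_three = [10.5, 11.5, 12.0]
f_stat, p_anova = stats.f_oneway(mersqi_one, mersqi_two, mersqi_three)

# Hypothetical 3x2 contingency table: rows = 1/2/3 assessments, columns = medicine/nursing
table = [[41, 23], [30, 22], [6, 7]]
chi2, p_chi, dof, _expected = stats.chi2_contingency(table)

print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")
print(f"One-way ANOVA:  F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"Chi-square:     chi2 = {chi2:.2f}, df = {dof}, p = {p_chi:.3f}")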

Table 1. Statistical tests performed in this systematic review.

Results

Of the 1,809 papers that met the initial inclusion criteria, 141 were included for review (Figure 2). Table 2 summarises the extracted variables: SDL assessment methods, a brief description of each SDL activity, the number of SDL components, learning outcomes, and the MERSQI score for all included papers. For a full listing of participant fields of study, sample sizes, countries, and study design types, see Appendix 2. Of note, only five (3.5%) studies contained all seven SDL components (see Table 2). Most authors did not mention the validity of the described instruments (see MERSQI, Table 2), but of those that did, 47 (33.3%) of the instruments were validated and five (3.5%) were not. Similar results were seen for instrument reliability, with 64 studies (45.4%) reporting use of reliable instruments, seven (5.0%) reporting that reliability was not determined, and 70 (49.6%) failing to report reliability. No standardised assessment method was used. The most common type of SDL assessment was a student perception survey developed by the researcher(s) (31.4% perception-internal; see Table 2). The second most common assessment method was an internally-developed knowledge exam (23.3%). Most studies (58.2%) reported positive SDL learning outcomes, 49 studies (34.8%) reported mixed or neutral SDL outcomes, and only 10 studies (7.1%) reported negative outcomes of SDL activities (Table 2).

Figure 2. PRISMA flow diagram of SDL assessment systematic review [Citation32].

Table 2. Findings of systematic review for assessment of self-directed learning in healthcare education.

Table 3. Descriptive Statistical Analyses for Quantitative Variables in the Systematic Review (N = 141).

Basic characteristics of the studies analysed

The studies that met the inclusion criteria were conducted across the spectrum of health professions education (Appendix 2). Most study participants were in medical (N = 73, 51.8%) or nursing (N = 49, 34.8%) programs. The numbers of studies that focused on pharmacy (N = 11, 7.8%), dental (N = 4, 2.8%), physical/physio therapy (N = 3, 2.1%), and optometry students (N = 1, 0.7%) were low. When we examined the publication dates of the included studies, SDL assessment studies did not appear to increase in popularity between 2015 and 2020, remaining stable at 11–13 per year. However, the number of studies rose sharply to 43 in 2021 (presumably due to the COVID-19 pandemic and the resulting shift to online learning) and remained high, with 22 studies published between January and July of 2022.

Included studies came from a variety of countries around the world (Appendix 2). Of the 141 included studies, 28 were from the United States, with South Korea the second-most represented (27 studies) and India the third-most represented (21 studies). The two most common study designs were short case study designs and non-equivalent control group designs, with 40 and 25 studies, respectively. Descriptive data for all quantitative variables collected (sample size, number of assessments, number of SDL components, and MERSQI scores) are summarised in Table 3.

Relationships between key variables

Number of assessments

There were 69 studies with one assessment, 59 with two assessments, and 13 with three assessments. Studies with one assessment had a median of 4 SDL components (interquartile range [IQR] 3 to 5). Studies with two assessments had a median (IQR) of 4 (3 to 5), and studies with three assessments had a median (IQR) of 3 (3 to 4). There was no evidence of a significant difference in the distribution of the number of components based on the number of assessments [H(2) = 0.9402; p = 0.625]. Similarly, there was no evidence of a statistically significant association between the number of assessments and field of study being medicine or nursing (χ² = 1.57; df = 2; p = 0.457), or between the number of assessments (single vs. multiple) and study outcome (χ² = 4.20; df = 2; p = 0.122). This is evidenced by the relatively similar percentages of studies originating from medicine and nursing across assessment numbers: one (64.1% medicine vs. 35.9% nursing), two (57.7% medicine vs. 42.2% nursing), and three (46.2% medicine vs. 53.8% nursing), and by the similar percentages of outcomes between multiple (54.1% positive, 4.2% negative, 41.7% mixed) and single (62.3% positive, 10.1% negative, 27.5% mixed) assessments.

There was evidence of a significant difference in study quality (MERSQI) based on the number of assessments [F(2,138) = 13.71, p < 0.001]. The mean scores for studies with one, two, and three assessments were 9.49 (2.07), 11.25 (1.86), and 11.27 (2.22), respectively. Post hoc testing using Tukey’s HSD showed differences between the one- and two-assessment groups and between the one- and three-assessment groups (p = 0.003 and 0.004, respectively). Testing between the two- and three-assessment groups did not show evidence of a significant difference (p = 1.0).

Type of assessment

Of the studies that utilised a single assessment type, 42 (60.9%) were classified as ‘perception,’ 13 (18.8%) as ‘knowledge,’ and 14 (20.3%) as ‘other.’ When looking at SDL components, there was a statistically significant difference in distribution between assessment types [H(2) = 10.74; p = 0.005]. Studies classified as perception had a median (IQR) of 3.5 (2 to 4), whereas studies classified as knowledge had a median (IQR) of 4 (3 to 4), and those classified as other had a median (IQR) of 5 (4.25 to 5). Pairwise comparisons using a Bonferroni correction showed evidence of a difference between perception and other (U = 122; p = 0.001), but not between perception and knowledge (U = 225; p = 0.347) or knowledge and other (U = 52.5; p = 0.066).

There was a statistically significant difference in mean MERSQI quality scores between assessment types (p < 0.001). Studies classified as perception had a median (IQR) of 9.5 (7.6 to 10.5), whereas studies classified as knowledge had a median (IQR) of 11.5 (10.5 to 12.5) and those classified as other had a median (IQR) of 8 (7 to 10). Post hoc testing using Tukey’s HSD showed evidence of statistically significant differences between knowledge and perception, and between knowledge and other (p = 0.004 and p < 0.001, respectively). However, there was no statistically significant difference between perception and other (p = 0.467). Moreover, there was no statistically significant evidence of an association between type of assessment and field of study (medicine, nursing; p = 0.209).

Discussion

This systematic review was initiated to begin to address the need for a uniform understanding of assessment practices related to SDL in health professions education. Included studies revealed that different aspects of SDL were implemented to meet different learning needs. While the authors found wide variation in the reported SDL activities, the assessment methods used to evaluate the outcomes of these activities were more consistent and were primarily student perception surveys or knowledge exams (see Table 2). We also found a statistically significant relationship between the number of assessments used and study quality as measured with the MERSQI [Citation35], with the use of more than one assessment being significantly associated with higher study quality. However, we did not see a significant difference in study quality scores between studies that used two versus three assessments, likely due to the smaller number of studies with three assessments. Additionally, while there was no statistically significant relationship between the number of assessments used and study outcomes, or the number of SDL components identified in the study, the type of assessment used to measure the impact of SDL activities had a statistically significant relationship with study quality. The highest quality studies were statistically associated with the use of knowledge assessments (mean 11.3), while lower quality scores were associated with the use of student perception assessments (mean 9.25) and those assessments we categorised as ‘other’ (mean 8.5). Student perception assessments in our categorisation included SDL readiness scales, student surveys developed by the researchers (perception-internal), and surveys that were previously published (perception-external). When looking further at the assessments categorised as ‘other,’ many involved student opinions or perceptions; examples included focus groups, interviews, reflective essays, peer or self-evaluation, skill confidence, and course evaluations (Table 2). The relationships between these variables were then examined further.

This statistically significant relationship between the number of assessments used and study quality is an important finding, as a blend of assessment measures, including both direct and indirect assessments, is recommended to more accurately evaluate student achievement of learning outcomes [Citation177]. Direct measures, such as objective exams, presentations, or papers, provide direct evidence of student learning in regards to knowledge and skills [Citation177,Citation178]. According to Suskie [Citation177], these measures offer tangible, self-explanatory, and compelling evidence of what students have or have not learned. Indirect measures, such as reflections, course evaluations, student surveys or focus groups, provide information about students’ perceptions of their abilities [Citation177,Citation178]. While indirect measures may seem less credible since they do not provide a direct measurement of student performance, these assessments can provide important information, beyond the achievement of learning outcomes [Citation177]. For example, focus groups or reflections can provide knowledge about the learning process and experience itself, potentially providing actionable information for improvement, beyond what knowledge-based outcomes may offer [Citation179].

Less than half of the studies in this review used more than one assessment method to evaluate the effectiveness of SDL activities. It may therefore be important to strongly recommend the use of multiple modes of assessment, including direct and indirect measures, when developing an effective approach to evaluating SDL outcomes. The use of multiple assessments is often considered essential for an accurate evaluation of health professions student learning because each type of assessment has its own strengths and shortcomings. Using multiple assessment methods at different time points can help provide a more accurate picture of student performance [Citation180].

Many of the studies included in the review used student perception surveys, either alone or in combination with other assessment methods, to assess SDL effectiveness (109 studies, 77.3%; Table 2). According to Gabbard [Citation181], this type of research often focuses on self-perceived improvements in knowledge or confidence in a certain area. There may be numerous reasons why self-perceptions are used in educational research, such as the relative ease with which the data can be collected and the importance of student satisfaction during their educational experiences [Citation181]. Unfortunately, self-assessment is not always accurate, as students may not have a true awareness of their learning capabilities and understanding of the material [Citation182,Citation183]. Moreover, a recent study published by Kemp and colleagues [Citation24] found that while medical students claimed to have used and to be familiar with SDL, they had not utilised aspects of SDL as defined by that study. This led the authors to speculate that students may not be able to identify components of SDL and that explicit education regarding SDL may be required [Citation29]. Other studies in the analysis commented on the potential limitations of student perceptions. For example, Kastenmeier and colleagues [Citation122] noted that a major limitation of their study was that many conclusions were based on survey data, which measured only student perceptions and not actual acquisition of medical knowledge and SDL skills. Kershaw and colleagues [Citation102] commented that although they evaluated student perceptions, it would be helpful to also use formative feedback from the teaching faculty to improve the learning process, assessment design, and criteria used for evaluations. According to Ehrlinger and colleagues [Citation183], students’ perceptions of their abilities and characteristics can be unreliable; there is no clearly established relationship between a student’s self-confidence and their actual knowledge or skills. Dunning and Kruger [Citation184] noted that students with less understanding of the material are actually more likely to misjudge and overestimate their own abilities [Citation181]. Oftentimes, stronger students have more accurate perceptions of their learning, whereas struggling students can be falsely overconfident in their abilities [Citation183–185]. For these reasons, relying solely on self-perception data to assess SDL effectiveness might not provide accurate information about a student’s true ability to use SDL or the effectiveness of the SDL instructional strategy.

Several of the studies used knowledge-based exams that were created internally (developed by the researchers) to assess the effectiveness of SDL (53 studies, 37.6%; Table 2). Many were administered shortly after the activity and did not test for long-term retention of knowledge. This is not a surprising finding: many accrediting bodies identify required components of instruction and assessment throughout a curriculum, and institutions develop their own assessment methods and standards to meet these requirements. While it is common for academic institutions to develop their own assessment material, this often makes it more challenging to compare students across different programs [Citation180].

One of the most common SDL readiness assessments was Guglielmino’s SDL readiness scale, developed and validated by Lucy M. Guglielmino in 1977 [Citation186]. This instrument measures students’ self-reported attitudes, abilities, and characteristics related to their readiness to engage in self-directed learning and is a type of student perception assessment [Citation101,Citation186]. Despite its popularity, there have been many potential problems associated with the use of this instrument, including concerns about cost and validity [Citation187–190]. There have also been concerns about the reliability of the instrument when used with more diverse populations [Citation189,Citation191,Citation192]. Considering that Guglielmino’s scale [Citation186] is over 40 years old and may have issues related to construct validity and reliability when evaluating diverse populations, it may be time to develop a new instrument. Additionally, using older versions of scales may mean that researchers are not assessing elements related to recent innovations in education or healthcare, which may weaken the results and make them less accurate and/or reproducible. For example, given advances in technology and changes in health professions education, including required competencies, a more up-to-date evaluation tool may be more effective for measuring SDL readiness and effectiveness.

Overall, the components of SDL assessments are often different, as various health professions programs have diverse contexts of practice. While core competencies like professionalism may be similar among health professions, the expectations for skills in knowledge, application, and critical thinking are unique based on each program. It is likely that the instruments used are different because of the unique requirements of each profession.

Strengths and limitations

There were several strengths associated with this systematic review. The comprehensive literature searches were conducted by an experienced librarian searcher across multiple sources of data. In addition, an SDL framework (Figure 1) [Citation8,Citation13–22] was developed to help screen studies investigating SDL activities and outcome measures. Finally, this research reviewed SDL in multiple health professions programs, allowing for a more comprehensive review of the literature related to assessment of SDL.

In terms of limitations, this systematic review was limited to English-language articles with full text available, published since 2015; therefore, potential publication selection bias may have been introduced into the review. Articles published in other languages or grey literature (unpublished research studies) may have used different definitions of SDL and different assessment tools to evaluate learning outcomes from SDL activities or interventions. Additionally, it is important to keep in mind that these results may not be generalisable to healthcare professionals, since, compared to students, professionals most likely have an increased readiness for SDL; they are more aware of their learning needs and have stronger decision-making and critical-thinking skills [Citation193].

To our knowledge, this is the first systematic review that focuses on assessment of learning outcomes from SDL interventions implemented in health professions education programs. The review shows that different components of the SDL steps were adopted in different health professions programs to meet different learning needs and professional accreditation requirements. We anticipate that an updated systematic review should be conducted in 3–5 years, as an increasing number of research studies with strong research designs are being published and as the definition of SDL comes to incorporate agreed-upon, salient components for achieving optimal learning outcomes through SDL activities in health professions education. The review also uncovered a large number of studies with single-group designs, which limits the generalisability of their results and conclusions. While the review identified a wide range of assessment tools used to measure learning outcomes targeting different components of the SDL framework, it indicates the need for further research with control-group designs and random assignment, using validated assessments administered to large samples, and in interprofessional education contexts. The mean study quality score placed the included studies in a range similar to that found for previously published medical education studies assessed with this measure [Citation38,Citation194,Citation195].

Conclusions and recommendations

Since effective SDL is important for lifelong learning and considered a necessary skill by many accrediting bodies for health professions education programs, it must be assessed appropriately. To the authors’ knowledge, this systematic review is the first to thoroughly describe the types of assessment methods used to determine the effectiveness of SDL activities in health professions education programs. While the majority of included studies evaluated SDL using student perception surveys and internally-created knowledge-based exams, this review found that less than half of the studies used more than one assessment method to evaluate the effectiveness of SDL activities. Since SDL is considered such an important factor in healthcare professions education, we recommend establishing evidence-based best practices for its assessment, including the use of multiple assessments with a blend of direct and indirect measures, to ensure the most accurate evaluation of the learning process and student performance [Citation177,Citation178]. This is especially important because many board certifying exams measure a student’s knowledge and clinical skills but do not evaluate SDL skills to ensure the student is prepared for the lifelong learning required for effective patient care throughout a career. Future research should determine the most accurate methods for assessing SDL activities. This may also include the development of new instruments that focus primarily on the SDL process, including SDL readiness.

Acknowledgments

The authors would like to acknowledge Jacob Keeley for his help with the development and application of statistical methods. We are also grateful to Audrey Bell, CMI for her professional assistance with illustrations.

Disclosure statement

No potential conflict of interest was reported by the authors.

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/10872981.2023.2189553

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Densen P. Challenges and opportunities facing medical education. Trans Am Clin Climatol Assoc. 2011;122:48–26.
  • Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142(4):260–273.
  • Custers EJFM. Long-term retention of basic science knowledge: a review study. Adv Health Sci Educ Theory Pract. 2010;15(1):109–128.
  • Hołda MK, Stefura T, Koziej M, et al. Alarming decline in recognition of anatomical structures amongst medical students and physicians. Ann Anat. 2019;221:48–56.
  • Merriam SB. Andragogy and self-directed learning: pillars of adult learning theory. In: Merriam SB, editor. The new update on adult learning theory. New directions for adult and continuing education, no. 89. Jossey-Bass; 2001. pp. 73–81.
  • ABIM Foundation, American Board of Internal Medicine; ACP-ASIM Foundation, American College of Physicians-American Society of Internal Medicine; European Federation of Internal Medicine. Medical professionalism in the new millennium: a physician charter. Ann Intern Med. 2002;136(3):243–246.
  • Murad MH, Coto-Yglesias F, Varkey P, et al. The effectiveness of self-directed learning in health professions education: a systematic review. Med Educ. 2010;44(11):1057–1068.
  • Knowles M. Self-directed learning: a guide for learners and teachers: Associated Press; 1975.
  • Gandomkar R, Sandars J. Clearing the confusion about self-directed learning and self-regulated learning. Med Teach. 2018;40(8):862–863.
  • Saks K, Leijen L. Distinguishing self-directed and self-regulated learning and measuring them in the E-learning context. Procedia Soc Behav Sci. 2014;112:190–198.
  • Hwang Y, Oh J. The relationship between self-directed learning and problem-solving ability: the mediating role of academic self-efficacy and self-regulated learning among nursing students. Int J Environ Res Public Health. 2021;18(4):1–9.
  • Tekkol İA, Demirel M. An Investigation of self-directed learning skills of undergraduate students. Front Psychol. 2018;9:2324.
  • MedBiquitous Curriculum Inventory Working Group Standardized Vocabulary Subcommittee. Curriculum Inventory standardized instructional and assessment methods and resource types: Association of American Medical Colleges. Published; 2016 March. Accessed December 9, 2021: https://medbiq.org/curriculum/vocabularies.pdf
  • Holmboe ER, Edgar L, Hamstra S The Milestone Guidebook. Accreditation Council for Graduate Medical Education. Published 2016. Accessed October 14, 2022. https://www.utrgv.edu/som/gme/_files/documents/milestones_guidebook_2016.pdf
  • Molloy E, Boud D. Changing conceptions of feedback. In: Boud D Molly E, editors. Feedback in higher and professional education: Routledge; 2013. pp. 11–33.
  • Obied HK, Abo Gad RA. Applying self-directed learning strategy to enhance nursing students’ critical thinking skill. IOSR J Nurs Health Sci. 2017;06(02):67–77.
  • Montin L, Koivisto JM. Effectiveness of self-directed learning methods compared with other learning methods in nursing education related to nursing students’ or registered nurses’ learning outcomes: a systematic review protocol. JBI Database System Rev Implement Rep. 2014 February;12(2):1–8. DOI:10.11124/jbisrir-2014-532.
  • Neal JH, Neal LDM. Self-directed learning in physician assistant education: learning portfolios in physician assistant programs. The J Physician Assist Educ. 2016 December;27(4):162–169. DOI:10.1097/JPA.0000000000000091.
  • Accreditation council for graduate medical education and the American board of surgery. The General Surgery Milestone Project. Published July 2015. Accessed October 14, 2022. https://www.acgme.org/globalassets/PDFs/Milestones/SurgeryMilestones.pdf.
  • Commission on Dental Accreditation. Accreditation standards for dental education programs. Published 2016. Accessed October 14, 2022. https://coda.ada.org/~/media/CODA/Files/predoc_standards.pdf?la=en.
  • Haden NK, Andrieu SC, Chadwick DG, et al. The dental education environment. J Dent Educ. 2006;70(12):1265–1270. DOI:10.1002/j.0022-0337.2006.70.12.tb04228.x
  • Guidance on Continuing Professional Development (CPD) for the profession of pharmacy. accreditation council for pharmacy education (ACPE) website. Published 2015. Accessed April 24, 2020. www.acpe-accredit.org/pdf/CPDGuidance%20ProfessionPharmacyJan2015.pdf.
  • Liaison Committee on Medical Education. Functions and structure of a medical school. Published 2020. Accessed February 12. http://www.lcme.org/publications.htm.
  • Tough A. Focusing on highly deliberate efforts to learn. Tough A, editor: Learning Concepts; 1971. pp. 1–6.
  • O’shea E. Self-directed learning in nurse education: a review of the literature. J Adv Nurs. 2003;43(1):62–70.
  • Ricotta DN, Richards JB, Atkins KM, et al. Self-directed learning in medical education: training for a lifetime of discovery. Teach Learn Med. 2021:1–11. doi:10.1080/10401334.2021.1938074.
  • Cadorin L, Rei A, Dante A, et al. Enhancing self-directed learning among Italian nursing students: a pre- and post-intervention study. Nurse Educ Today. 2015;35(6):746–753.
  • Jeon J, Park S. Self-directed learning versus problem-based learning in Korean nurse education: a quasi-experimental study. Healthcare (Basel). 2021;9(12):1763.
  • Kemp K, Baxa D, Cortes C. Exploration of a collaborative self-directed learning model in medical education. Med Sci Educ. 2022;32(1):195–207.
  • Pai KM, Rao KR, Punja D, et al. The effectiveness of self-directed learning (SDL) for teaching physiology to first-year medical students. Australas Med J. 2014;7(11):448–453.
  • Ginzburg SB, Santen SA, Schwartzstein RM. Self-directed learning: a new look at an old concept. Med Sci Educ. 2020;31(1):229–230.
  • Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339(jul21 1):b2535. DOI:10.1136/bmj.b2535.
  • Rethlefsen ML, Kirtley S, Waffenschmidt S, et al. PRISMA-S: an extension to the PRISMA statement for reporting literature searches in systematic reviews. Syst Rev. 2021;10(1):39.
  • McGowan J, Sampson M, Salzwedel DM, et al. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–46.
  • Cook DA, Reed DA. Appraising the quality of medical education research methods: the medical education research study quality instrument and the Newcastle-Ottawa scale-education. Acad Med. 2015;90(8):1067–1076.
  • Smith RP, Learman LA. A plea for MERSQI: the medical education research study quality instrument. Obstet Gynecol. 2017;130(4):686–690.
  • Warrier V, Pradhan A. A narrative review of interventions to teach medical students how to break bad news. Med Sci Educ. 2020;30(3):1299–1312.
  • Reed DA, Beckman TJ, Wright SM, et al. Predictive validity evidence for medical education research study quality instrument scores: quality of submissions to JGIM’s medical education special issue. J Gen Intern Med. 2008;23(7):903–907.
  • Sawatsky AP, Beckman TJ, Varayil JE, et al. Association between study quality and publication rates of medical education abstracts presented at the society of general internal medicine annual meeting. J Gen Intern Med. 2015;30(8):1172–1177.
  • Ramamurthy S, Er HM, Devi Nadarajah V, et al. Medical students’ orientation toward lifelong learning in an outcome-based curriculum and the lessons learnt. Med Teach. 2021;43(sup1):S6–11.
  • Chae SJ. Medical students’ satisfaction on online flipped learning by learning styles. Korean J Med Educ. 2021;33(4):405–409.
  • Choi AS, Chun YE. The effect of education using scratch on problem solving ability and self-directed learning ability. RIGEO. 2020;11(2):274–279.
  • Eun K, Hye Young K. Effects of simulation-based education combined team-based learning on self-directed learning, communication skills, nursing performance confidence and team efficacy in nursing students. J Korean Acad Nurs. 2017;24(1):39–50.
  • Roh YS, Kim SS. Integrating problem-based learning and simulation: effects on student motivation and life skills. Comput Inform Nurs. 2015;33(7):278–284.
  • Ho CJ, Chiu WH, Li MZ, et al. The effectiveness of the iLearning application on chest tube care education in nursing students. Nurse Educ Today. 2021;101:104870.
  • Kim HJ, Park D. Effects of applying flipped learning to simulation training in nursing students. Medico Legal Update. 2020 Jan;20(1):1679–1684. DOI:10.37506/v20/il/2020/mlu/194543.
  • Lee JY, Lee DY. The effects of team-based learning program on the self-directed learning capability, learning transfer and learning satisfaction of nursing students. Indian J Public Health Res Dev. 2018;9(11):908–914.
  • Min S, Yun S. Effect of prior learning method on nursing students’ practical capacity. Int J Recent Technol Eng. 2019;8(2S6):322–325.
  • Arizo-Luque V, Ramirez-Baena L, Pujalte-Jesús MJ, et al. Does self-directed learning with simulation improve critical thinking and motivation of nursing students? A pre-post intervention study with the MAES© methodology. Healthcare (Basel). 2022;10(5):927. DOI:10.3390/healthcare10050927
  • Arora K, Hashilkar NK. Effectiveness of student-led objective tutorials in pharmacology teaching to medical students. Indian J Pharmacol. 2016;48(Suppl 1):S78.
  • Ji E, Lee EK. Effects of integrating PBL and simulation in senior nursing students in Korea. Indian J Public Health Res Dev. 2019;10(12):1687–1691.
  • Kang SJ, Hong CM, Lee H. The impact of virtual simulation on critical thinking and self-directed learning ability of nursing students. Clini Simul Nursing. 2020 Dec 1;49:66–72. DOI:10.1016/j.ecns.2020.05.008.
  • Si J. Medical students’ self-directed learning skills during online learning amid the COVID-19 pandemic in a Korean medical school. Korean J Med Educ. 2022;34(2):145–154.
  • Burm E, Choi AS, Gu JA, et al. The effects of applying havruta learning method in nursing classes. Indian J Public Health Res Dev. 2019;10(11):4475–4480.
  • Yang Z, Zhou Y, Chung JWY, et al. Challenge based learning nurtures creative thinking: an evaluative study. Nurse Educ Today. 2018;71:40–47.
  • Oh JW, Huh B, Kim MR. Effect of learning contracts in clinical pediatric nursing education on students’ outcomes: a research article. Nurse Educ Today. 2019;83:104191.
  • Behar-Horenstein LS, Beck DE, Su Y. An initial validation study of the self-rating scale of self-directed learning for pharmacy education. Am J Pharm Educ. 2018;82(3). DOI:10.5688/ajpe6251
  • Kim M, Kim HJ. The effects of simulation-based learning on nursing students (focusing on self-directed learning and problem-solving competency). Int J Pharm Biol Sci. 2017;6(1):431–436.
  • Qamata-Mtshali N, Bruce JC. Self-directed learning readiness is independent of teaching and learning approach in undergraduate nursing education. Nurse Educ. 2018;43(5):277–281.
  • Millanzi WC, Herman PZ, Hussein MR. The impact of facilitation in a problem-based pedagogy on self-directed learning readiness among nursing students: a quasi-experimental study in Tanzania. BMC Nurs. 2021;20(1):242.
  • Cheng X, Ma XY, Luo C, et al. Examining the relationships between medical students’ preferred online instructional strategies, course difficulty level, learning performance, and effectiveness. Adv Physiol Educ. 2021;45(4):661–669.
  • Mahsood N, Mehboob A, Alam AN, et al. Medical college student’s perception regarding currently adopted teaching methodologies and their effectiveness; a cross sectional study from Rawalpindi. J Med Sci. 2022 January;30(1):57–61. DOI:10.52764/jms.22.30.1.12.
  • Khalid AM, Sohail M, Naiyar I, et al. Perceptions of medical students in Pakistan, KSA, and the US regarding the significance of case-based learning. J Taibah Univ Med Sci. 2021;16(3):344–349.
  • Muraleedharan A, Ragavan S, Nalini Bage N, et al. Perceptions of medical undergraduate students on curricular changes in anatomy: an embedded design mixed method study. J Adv Med Educ Prof. 2022;10(1):22–29.
  • Ireson M, Warring S, Medina-Inojosa JR, et al. First year medical students, personal handheld ultrasound devices, and introduction of insonation in medical education. Ann Glob Health. 2019;85(1):123.
  • Jose J, Ali I, Palappallil DS. Project-based learning in pharmacology during COVID-19 lockdown for second phase medical undergraduates. J Clin Diagn Res. 2021 May 1;15(5). DOI:10.7860/JCDR/2021/48919.14819
  • Lehl SS, Gupta M, D’cruz S. Enhanced learning strategies of undergraduate medical students with a structured case presentation format. J Educ Health Promot. 2021;10:424.
  • Mills G, Kelly S, Crittendon DR, et al. Evaluation of a quality improvement experience for family medicine clerkship students. Fam Med. 2021;53(10):882–885.
  • Prabhath S, DSouza A, Pandey AK, et al. Changing paradigms in anatomy teaching-learning during a pandemic: modification of curricular delivery based on student perspectives. J Taibah Univ Med Sci. 2021;17(3):488–497.
  • Al-Drees AA, Khalil MS, Irshad M, et al. Students’ perception towards the problem based learning tutorial session in a system-based hybrid curriculum. Saudi Med J. 2015;36(3):341–348.
  • Dudrey EF, Manglik N, Velasco VY, et al. Innate and adaptive immune response to a virus: an integrated learning module. MedEdPORTAL. 2018;14:10757.
  • Ransom AL, Nieto LE, Korndorffer ML. Impact of redesigned anatomy problem-based learning curriculum on student satisfaction. Med Sci Educ. 2017;27(4):895–901.
  • Rowland KC, Joy A. The gross anatomy laboratory: a novel venue for critical thinking and interdisciplinary teaching in dental education. J Dent Educ. 2015;79(3):295–300.
  • Sarkar S, Sharma S, Raheja S. Implementation of blended learning approach for improving anatomy lectures of phase I MBBS students - learner satisfaction survey. Adv Med Educ Pract. 2021;12:413–420.
  • Smith E, Boscak A. A virtual emergency: learning lessons from remote medical student education during the COVID-19 pandemic. Emerg Radiol. 2021;28(3):445–452.
  • Bhandari B, Agarwal P, Chopra D, et al. Implementation of self-directed learning in physiology for phase 1 undergraduate medical students. Med Sci Educ. 2022;32(4):899–906.
  • Asad M, Iqbal K, Sabir M. Effectiveness of problem based learning as a strategy to foster problem solving and critical reasoning skills among medical students. J Ayub Med Coll Abbottabad. 2015;27(3):604–607.
  • Maradi R, Bhargavi C, Prabhuvi K, et al. Comparison of perceptions towards modified PBL and didactic lecture among first year medical students in India. Indian J Public Health Res Dev. 2019;10(9):165–169.
  • Shah KP, Goyal S, Ramachandran V, et al. Efficacy of quality improvement and patient safety workshops for students: a pilot study. BMC Med Educ. 2020;20(1):126.
  • Hill M, Peters M, Salvaggio M, et al. Implementation and evaluation of a self-directed learning activity for first-year medical students. Med Educ Online. 2020;25(1):1717780.
  • Kim S, Jeong SH, Kim HS, et al. Academic success of online learning in undergraduate nursing education programs in the COVID-19 pandemic era. J Prof Nurs. 2022;38:6–16.
  • Li S, Gong H, Pan J, et al. Relationship between undergraduate nursing students’ self-directed learning and training demands for nursing information systems: a cross-sectional study. Comput Inform Nurs. 2021;39(12):908–915.
  • Noh GO, Kim DH. Effectiveness of a self-directed learning program using blended coaching among nursing students in clinical practice: a quasi-experimental research design. BMC Med Educ. 2019;19(1):225.
  • Cho MK, Kim MY. Outcomes and influential factors applying flipped learning methods in a clinical adult nursing practicum. Int J Nurs Pract. 2019;25(2):e12724.
  • Badiyepeymaie Jahromi Z, Mosalanejad L. Integrated method of teaching in web quest activity and its impact on undergraduate students’ cognition and learning behaviors: a future trend in medical education. Glob J Health Sci. 2015;7(4):249–259.
  • Wong FMF, Kan CWY. Online problem-based learning intervention on self-directed learning and problem-solving through group work: a waitlist controlled trial. Int J Environ Res Public Health. 2022;19(2):720.
  • Rezaee R, Mosalanejad L. The effects of case-based team learning on students’ learning, self regulation and self direction. Glob J Health Sci. 2015;7(4):295–306.
  • Sahoo S. Finding self-directed learning readiness and fostering self-directed learning through weekly assessment of self-directed learning topics during undergraduate clinical training in ophthalmology. Int J Appl Basic Med Res. 2016;6(3):166–169.
  • Fan JY, Tseng YJ, Chao LF, et al. Learning outcomes of a flipped classroom teaching approach in an adult-health nursing course: a quasi-experimental study. BMC Med Educ. 2020;20(1):317.
  • Díaz Agea JL, Megías Nicolás A, García Méndez JA, et al. Improving simulation performance through self-learning methodology in simulated environments (MAES©). Nurse Educ Today. 2019;76:62–67.
  • Mackay FD, Zhou F, Lewis D, et al. Can you teach yourself point-of-care ultrasound to a level of clinical competency? Evaluation of a self-directed simulation-based training program. Cureus. 2018;10(9):e3320.
  • Lim KH, Loo ZY, Goldie SJ, et al. Use of 3D printed models in medical education: a randomized control trial comparing 3D prints versus cadaveric materials for learning external cardiac anatomy. Anat Sci Educ. 2016;9(3):213–221.
  • MacArthur-Beadle I, Nair DVK, Cook NJ, et al. Master and apprentice or a slave to technology? A randomized controlled trial of minimal access surgery simulation-based training techniques. J Laparoendosc Adv Surg Tech A. 2020;30(12):1263–1271.
  • Palve S, Palve S. Comparative study of self-directed learning and traditional teaching method in understanding cardio-respiratory physiology among medical undergraduates. Biomedicine (Taipei). 2022 Mar;42(1):138–142.
  • Raupach T, Harendza S, Anders S, et al. How can we improve teaching of ECG interpretation skills? Findings from a prospective randomised trial. J Electrocardiol. 2016;49(1):7–12.
  • Zia S, Jabeen F, Atta K, et al. Self-directed learning (SDL), an effective method for teaching physiology to medical students. PJMHS. 2016;10(3):699–702.
  • Lian A, Rippey JCR, Carr PJ. Teaching medical students ultrasound-guided vascular access - which learning method is best? J Vasc Access. 2017;18(3):255–258.
  • Hilmes MA, Hyatt E, Penrod CH, et al. Radiology in medical education: a pediatric radiology elective as a template for other radiology courses. J Am Coll Radiol. 2016;13(3):320–325.
  • Atta IS, Alghamdi AH. The efficacy of self-directed learning versus problem-based learning for teaching and learning ophthalmology: a comparative study. Adv Med Educ Pract. 2018;9:623–630.
  • Palve S, Palve S. Analytical study of self directed and problem oriented learning sessions in cardio respiratory physiology. JPRI. 2021;33(46A):30–37.
  • Ahmed WAM, Alostaz ZMU, AL-Lateef Sammouri GA. Effect of self-directed learning on knowledge acquisition of undergraduate nursing students in Albaha university, Saudi Arabia. AIMS Medical Science. 2016;3(3):237–247.
  • Kershaw G, Grivna M, Elbarazi I, et al. Integrating public health and health promotion practice in the medical curriculum: a self-directed team-based project approach. Front Public Health. 2017;5:193.
  • Park HR, Park E. Nursing students’ perception of class immersion facilitators in psychiatric nursing: team-based learning combined with flipped learning. Nurse Educ Today. 2021;98:104653.
  • Chen L, Lin T, Tang S. A qualitative exploration of nursing undergraduates’ perceptions towards scaffolding in the flipped classroom of the fundamental nursing practice course: a qualitative study. BMC Fam Pract. 2021;22(1):245.
  • Imran M, Kalantan SA, Alkorbi MS, et al. Perceptions of Saudi medical students regarding self-directed learning: a qualitative study. J Pak Med Assoc. 2021;71(5):1403–1408.
  • Rosenberg E, Truong HA, Hsu SY, et al. Implementation and lessons learned from a mock trial as a teaching-learning and assessment activity. Curr Pharm Teach Learn. 2018;10(8):1076–1086.
  • Al-Moteri M. Implementing active clinical training approach (ACTA) in clinical practice. Nurse Educ Pract. 2020;49:102893.
  • Briceland LL, Caimano CR, Rosa SW, et al. Exploring the impact of engaging student pharmacists in developing individualized experiential success plans. J Am Coll Clin Pharm. 2021 Feb;4(2):154–161. DOI:10.1002/jac5.1342.
  • Chang YT, Lai CS. Nontechnical skills for the surgery clerkship in the operating room based on adult learning principles in Taiwan. Kaohsiung J Med Sci. 2022;38(9):907–913.
  • Darst EC, Makhlouf TK, Brannick EC, et al. A student-led elective provides quality improvement feedback for a required compounding course. Am J Pharm Educ. 2020;84(8):ajpe7394.
  • Howard V. Undergraduate mental health nursing students’ reflections in gaining understanding and skills in the critical appraisal of research papers - an exploration of barriers and enablers. Nurse Educ Pract. 2021;55:103143.
  • Röcker N, Lottspeich C, Braun LT, et al. Implementation of self-directed learning within clinical clerkships. GMS J Med Educ. 2021;38(2):Doc43.
  • Si J. An analysis of medical students’ reflective essays in problem-based learning. Korean J Med Educ. 2018;30(1):57–64.
  • van Woezik TE, Koksma JJ, Reuzel RPB, Jaarsma DC, van der Wilt GJ, et al. There is more than ‘I’ in self-directed learning: an exploration of self-directed learning in teams of undergraduate students. Med Teach. 2021;43(5):590–598.
  • Liu TH, Sullivan AM. A story half told: a qualitative study of medical students’ self-directed learning in the clinical setting. BMC Med Educ. 2021;21(1):494.
  • Patra S, Khan AM, Upadhyay MK, et al. Module to facilitate self-directed learning among medical undergraduates: development and implementation. J Educ Health Promot. 2020;9(1):231.
  • Beasley DR. An online educational intervention to influence medical and nurse practitioner students’ knowledge, self-efficacy, and motivation for antepartum depression screening and education. Nurs Womens Health. 2021;25(1):43–53.
  • Alharbi F, Alwadei SH, Alwadei A, et al. Comparison between two asynchronous teaching methods in an undergraduate dental course: a pilot study. BMC Med Educ. 2022;22(1):488.
  • Jaffar MS, Marei H, Rathan R, et al. Influence of students’ self-directed learning on situational interest: a prospective randomised study [published online ahead of print, 2022 May 11]. Eur J Dent Educ. 2022. DOI:10.1111/eje.12818
  • Lee S, Park HJ. The effects of team-based learning on nursing students’ learning performance with a focus on high-risk pregnancy in Korea: a quasi-experimental study. Korean J Women Health Nurs. 2021;27(4):388–404.
  • Zhang J, Zhou Y, Li Y. Effects of an interaction and cognitive engagement-based blended teaching on obstetric and gynecology nursing course. Int J Environ Res Public Health. 2022;19(12):7472.
  • Kastenmeier AS, Redlich PN, Fihn C, et al. Individual learning plans foster self-directed learning skills and contribute to improved educational outcomes in the surgery clerkship. Am J Surg. 2018;216(1):160–166.
  • Brown CC, Arrington SD, Olson JF, et al. Musculoskeletal ultrasound training encourages self-directed learning and increases confidence for clinical and anatomical appreciation of first-year medical students. Anat Sci Educ. 2022;15(3):508–521.
  • Devi S, Bhat KS, Ramya SR, et al. Self-directed learning to enhance active learning among the 2nd-year undergraduate medical students in Microbiology: an experimental study. J Curr Res Sci Med. 2016;2(2):80–83.
  • El-Ashkar A, Aboregela A, Abdelazim A, et al. Flipped classroom as a novel teaching tool for practical Parasitology. Parasitol United J. 2022 Apr 1;15(1):110–116.
  • Kwok J, Liao W, Baxter S. Evaluation of an online peer fundus photograph matching program in teaching direct ophthalmoscopy to medical students. Canadian Journal of Ophthalmology. 2017;52(5):441–446.
  • Mookerji N, El-Haddad J, Vo TX, et al. Evaluating the efficacy of self-study videos for the surgery clerkship rotation: an innovative project in undergraduate surgical education. Can J Surg. 2021;64(4):E428–434.
  • Padugupati S, Joshi KP, Chacko TV, et al. Designing flipped classroom using Kemp’s instructional model to enhance deep learning and self-directed collaborative learning of basic science concepts. J Educ Health Promot. 2021;10(1):187.
  • Saiboon IM, Musni N, Daud N, et al. Effectiveness of self-directed small-group-learning against self-directed individual-learning using self-instructional-video in performing critical emergency procedures among medical students in Malaysia: a single-blinded randomized controlled study. Clin Simul Nurs. 2021 Jul;56:46–56.
  • Zhang Y, Zerafa Simler MA, Stabile I. Supported self-directed learning of clinical anatomy: a pilot study of doughnut rounds. Eur J Anat. 2017;21(4):319–324.
  • Annadani RR, Undi M. A study to assess the effectiveness and perception of students regarding case based learning over traditional teaching method in community medicine. Indian J Community Health. 2021 Jan 1;33(1):41–46.
  • Ariana A, Amin M, Pakneshan S, et al. Integration of traditional and e-learning methods to improve learning outcomes for dental students in histopathology. J Dent Educ. 2016;80(9):1140–1148.
  • Chaudhuri A, Paul S, Mondal T, et al. Online teaching-learning experience among medical students in a developing country during the coronavirus disease-19 pandemic: a pilot study. National J Physiol Pharm Pharmacol. 2021 Jan 1;11(1):62–67. DOI:10.5455/ijmsph.2020.09244202017092020.
  • Gárate IB, Berrotarán GB. Self-directed pharmacotherapy learning to fifth-year pharmacy students in Spain. Indian J Pharm Educ Res. 2015;49(1):10–17.
  • Kim KJ, Lee YJ, Lee MJ, et al. E-Learning for enhancement of medical student performance at the objective structured clinical examination (OSCE). PLoS ONE. 2021;16(7):e0253860.
  • Lisenby KM, Garza KB, Andrus MR. Development of self-directed activities and a validated exam for primary care advanced pharmacy practice experiences. Curr Pharm Teach Learn. 2021;13(3):261–265.
  • Lull ME, Slevinski CU, Traina AN. Self-directed learning through journal use in an elective pharmacy course. Pharm Educ. 2015;15(1):27–30.
  • Peine A, Kabino K, Spreckelsen C. Self-directed learning can outperform direct instruction in the course of a modern German medical curriculum - results of a mixed methods trial. BMC Med Educ. 2016;16(1):158.
  • Shenoy PJ, Rao R. Crossword puzzles versus student-led objective tutorials (SLOT) as innovative pedagogies in undergraduate medical education. Scientia Medica. 2021;31(1):24.
  • Clay AS, Ming DY, Knudsen NW, et al. CaPOW! Using problem sets in a capstone course to improve fourth-year medical students’ confidence in self-directed learning. Acad Med. 2017;92(3):380–384.
  • Diwan JS, Sanghavi SJ, Shah CJ, et al. Comparison of case-based learning and traditional lectures in physiology among first year undergraduate medical students. National J Physiol Pharm Pharmacol. 2017;7(7):744–748.
  • Kumar A, Vandana, Aslami A, et al. Introduction of ‘case-based learning’ for teaching pharmacology in a rural medical college in Bihar. Natl J Physiol Pharm Pharmacol. 2016;6(5):427–430.
  • Ma X, Luo Y, Zhang L, et al. A trial and perceptions assessment of app-based flipped classroom teaching model for medical students in learning immunology in China. Education Sciences. 2018;8(2):45. DOI:10.3390/educsci8020045
  • van Lankveld W, Maas M, van Wijchen J, et al. Self-regulated learning in physical therapy education: a non-randomized experimental study comparing self-directed and instruction-based learning. BMC Med Educ. 2019;19(1):50.
  • Sajadi M, Fayazi N, Fournier A, et al. The impact of the learning contract on self-directed learning and satisfaction in nursing students in a clinical setting. Med J Islam Repub Iran. 2017;31(1):72.
  • Silitonga LL, Harahap RF. Comparison of effectiveness between PBL and LBL in improving student learning outcomes. Int J Nurs Educ. 2020 Oct 10;12(4):296–302. DOI:10.37506/ijone.v12i4.11598.
  • Kim SH, Lee BG. The effects of a maternal nursing competency reinforcement program on nursing students’ problem-solving ability, emotional intelligence, self-directed learning ability, and maternal nursing performance in Korea: a randomized controlled trial. Korean J Women Health Nurs. 2021 Sep 30;27(3):230–242. DOI:10.4069/kjwhn.2021.09.13.
  • Tao Y, Li L, Xu Q, et al. Development of a nursing education program for improving Chinese undergraduates’ self-directed learning: a mixed-method study. Nurse Educ Today. 2015;35(11):1119–1124.
  • Anantharaman LT, Shankar N, Rao M, et al. Effect of two-month problem-based learning course on self-directed and conceptual learning among second year students in an Indian medical college. J Clin Diagn Res. 2019;13(5):JC05–10.
  • Khodaei S, Hasanvand S, Gholami M, et al. The effect of the online flipped classroom on self-directed learning readiness and metacognitive awareness in nursing students during the COVID-19 pandemic. BMC Nurs. 2022;21(1):22.
  • Cho MK, Kim MY. Factors influencing SDL readiness and self-esteem in a clinical adult nursing practicum after flipped learning education: comparison of the contact and untact models. Int J Environ Res Public Health. 2021;18(4):1521.
  • Bouw JW, Gupta V, Hincapie AL. Assessment of students’ satisfaction with a student-led team-based learning course. J Educ Eval Health Prof. 2015;12:23.
  • Canniford LJ, Fox-Young S. Learning and assessing competence in reflective practice: student evaluation of the relative value of aspects of an integrated, interactive reflective practice syllabus. Collegian. 2015;22(3):291–297.
  • Teli SS, Senthilvelou M, Soundariya K, et al. Design, implementation and evaluation of student-centric learning in physiology. Res Dev Med Educ. 2021 Jul 6;10(1):12. DOI:10.34172/rdme.2021.012.
  • Park H, Cho H. Effects of a self-directed clinical practicum on self-confidence and satisfaction with clinical practicum among South Korean nursing students: a mixed-methods study. Int J Environ Res Public Health. 2022;19(9):5231.
  • Yeh YC. Student satisfaction with audio-visual flipped classroom learning: a mixed-methods study. Int J Environ Res Public Health. 2022;19(3):1053.
  • Crilly P, Kayyali R. The use of social media as a tool to educate United Kingdom undergraduate pharmacy students about public health. Curr Pharm Teach Learn. 2020;12(2):181–188.
  • Kawaguchi-Suzuki M, Fuentes DG, Gibbard RS, et al. Integration of mentored self-directed learning (MSDL) through both group and individual presentations in an accelerated modified block program. Curr Pharm Teach Learn. 2018;10(7):946–954.
  • Wolff M, Stojan J, Buckler S, et al. Coaching to improve self-directed learning. Clin Teach. 2020;17(4):408–412.
  • Wondie A, Yigzaw T, Worku S. Effectiveness and key success factors for implementation of problem-based learning in Debre Tabor University: a mixed methods study. Ethiop J Health Sci. 2020;30(5):803–816.
  • Khan HK, Jiskani AR, Kirmani F, et al. Comparison of different teaching styles in student of optometry related to ocular anatomy on the basis of grading. Rawal Medical Journal. 2020;45(1):206–210.
  • Powell BD, Oxley MS, Chen K, et al. A concept mapping activity to enhance pharmacy students’ metacognition and comprehension of fundamental disease state knowledge. Am J Pharm Educ. 2021;85(5):8266.
  • Rogan S, Taeymans JJ, Zuber S, et al. Planning and implementation of guided self-study in an undergraduate physiotherapy curriculum in Switzerland - a feasibility study. J Med Educ Curric Dev. 2020;7:2382120520944921.
  • Rogan S, Taeymans J, Zuber S, et al. Impact of guided self-study on learning success in undergraduate physiotherapy students in Switzerland - a feasibility study of a higher education intervention. BMC Med Educ. 2021;21(1):362.
  • Serdà BC, Alsina Á. Knowledge-transfer and self-directed methodologies in university students’ learning. Reflective Pract. 2018;19(5):573–585.
  • Gu M, Sok S. Factors Affecting the academic achievement of nursing college students in a flipped learning simulation practice. Int J Environ Res Public Health. 2021;18(11):5970.
  • Ko Y, Issenberg SB, Roh YS. Effects of peer learning on nursing students’ learning outcomes in electrocardiogram education. Nurse Educ Today. 2022;108:105182.
  • Wang Y, Ma J, Gu Y, et al. How does group cooperation help improve self-directed learning ability in nursing students? A trial of one semester intervention. Nurse Educ Today. 2021;98:104750.
  • Zeng J, Liu L, Tong X, et al. Application of blended teaching model based on SPOC and TBL in dermatology and venereology. BMC Med Educ. 2021;21(1):606.
  • Yh S, Choi J, Storey JM, et al. Effectiveness of self-directed learning on competency in physical assessment, academic self-confidence and learning satisfaction of nursing students. J Korean Acad Nurs. 2017;24(3):181–188.
  • Zhang JF, Zilundu PLM, Fu R, et al. Medical students’ perceptions and performance in an online regional anatomy course during the Covid-19 pandemic. Anat Sci Educ. 2022;15(5):928–942.
  • Chitkara MB, Satnick D, Lu WH, et al. Can individualized learning plans in an advanced clinical experience course for fourth year medical students foster self-directed learning? BMC Med Educ. 2016;16(1):232.
  • McGrath D, Crowley L, Rao S, et al. Outcomes of Irish graduate entry medical student engagement with self-directed learning of clinical skills. BMC Med Educ. 2015;15(1):21. DOI:10.1186/s12909-015-0301-x
  • Sachdeva K, Mahajan A. Introduction of SDL in department of anatomy: evaluation of learning and SDL readiness. Int J Anat Res. 2022;10(1):8301–8311.
  • Chen SL, Liu CC. Development and evaluation of a physical examination and health assessment course. Nurse Educ Today. 2021 Dec 1;107:105116. DOI:10.1016/j.nedt.2021.105116.
  • Doane KJ, Boyd P. A symposium-based self-directed learning approach to teaching medical cell biology to medical students. Med Sci Educ. 2016;26(2):229–237.
  • Suskie L. How can student learning be assessed? In: Suskie L, editor. Assessing student learning: a common sense guide. 2nd ed. Anker Publishing; 2004. p. 19–35.
  • Indiana University–Purdue University Indianapolis Center for Teaching and Learning. Assessing Student Outcomes. Published March 2009. Accessed October 18, 2022. https://ctl.iupui.edu/Resources/Assessing-Student-Learning/Assessing-Student-Learning-Outcomes
  • Hundley SP, Kahn S. Trends in assessment: ideas, opportunities, and issues for higher education. Stylus; 2019.
  • Epstein RM, Cox M, Irby DM. Assessment in medical education. N Engl J Med. 2007;356(4):387–396.
  • Gabbard T, Romanelli F. The accuracy of health professions students’ self-assessments compared to objective measures of competence. Am J Pharm Educ. 2021;85(4):8405.
  • Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(10 Suppl):S46–54.
  • Ehrlinger J, Shain EA. How accuracy in students’ self perceptions relates to success in learning. In: Benassi V, Overson C, Hakala C, editors. Applying science of learning in education: infusing psychological science into the curriculum. Society for the Teaching of Psychology; 2014. p. 142–151.
  • Dunning D, Johnson K, Ehrlinger J, et al. Why people fail to recognize their own incompetence. Curr Dir Psychol Sci. 2003;12(3):83–87.
  • Ehrlinger J, Johnson K, Banner M, et al. Why the unskilled are unaware: further explorations of (absent) self-insight among the incompetent. Organ Behav Hum Decis Process. 2008;105(1):98–121.
  • Guglielmino LM. Development of the Self-Directed Learning Readiness Scale [dissertation]. University of Georgia; 1977.
  • Field L. An investigation into the structure, validity, and reliability of Guglielmino’s self-directed learning readiness scale. Adult Educ Q. 1989;39(3):125–139.
  • Field L. Guglielmino’s self-directed learning readiness scale: should it continue to be used? Adult Educ Q. 1991;41(2):100–103.
  • Fisher M, King J, Tague G. Development of a self-directed learning readiness scale for nursing education. Nurse Educ Today. 2001;21(7):516–525.
  • Bonham LA. Guglielmino’s self-directed learning readiness scale: what does it measure? Adult Educ Q. 1991;41(2):92–99.
  • Long HB, Agyekum SK. Teacher ratings in the validation of Guglielmino’s self-directed learning readiness scale. Higher Educ. 1984;13(6):709–715.
  • Long HB, Agyekum SK. Guglielmino’s self-directed learning readiness scale: a validation study. Higher Educ. 1983;12(1):77–87.
  • Chakkaravarthy K, Ibrahim N, Mahmud M, et al. Determinants of readiness towards self-directed learning among nurses and midwives: results from national survey. Nurse Educ Pract. 2020;47:102824.
  • Stephenson CR, Vaa BE, Wang AT, et al. Conference presentation to publication: a retrospective study evaluating quality of abstracts and journal articles in medical education research. BMC Med Educ. 2017;17(1).
  • Reed DA, Cook DA, Beckman TJ, et al. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002–1009.

Appendix 1:

Comprehensive Search Strategies for a Systematic Review on Self-Directed Learning Concepts in Health Professions Students

Filters: English, human, past 5 years

PubMed (self-assessment [mh] OR program evaluation [mh] OR educational measurement [mh] OR evaluation studies as topic [mh:noexp] OR evaluation study [pt] OR psychometrics [mh] OR formative feedback[mh] OR feedback, psychological [mh] OR self-assessment OR “self assessment” OR self-evaluation OR “self evaluation” OR “educational assessments” OR “educational assessment” OR self-criticism OR “self criticism” OR “program evaluation” OR “program evaluations” OR “programme evaluations” OR “programme evaluation” OR “program sustainability” OR “program effectiveness” OR “program appropriateness” OR “educational measurements” OR “educational measurement” OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND (“self-directed learning as Topic”[Mesh:NoExp] OR “self-directed learning” OR “self directed learning” OR “self-directed learners” OR “self-directed learner”) AND (students, medical [mh] OR education, medical, undergraduate [mh] OR clinical clerkship [mh] OR schools, medical [mh] OR students, pharmacy [mh] OR schools, pharmacy [mh] OR students, dental [mh] OR schools, dental [mh] OR students, nursing [mh] OR education, nursing, associate [mh] OR education, nursing, baccalaureate [mh] OR education, nursing, graduate [mh] OR schools, nursing [mh] OR (occupational therapy [mh] AND students[mh]) OR (optometry [mh] AND students[mh]) OR (chiropractic[mh] AND students [mh]) OR (podiatry [mh] AND students [mh]) OR ((health occupations [mh] OR allied health occupations [mh]) AND students [mh]) OR “medical students” OR “medical student” OR “medical undergraduates” OR “medical undergraduate” OR “undergraduate medical education” OR “medical schools” OR “medical school” OR “medical internship” OR clerkship* OR “clinical clerkship” OR “clinical clerkships” OR “pharmacy students” OR “pharmacy student” OR “pharmacy schools” OR “pharmacy school” OR “dentistry students” OR “dentistry student” OR “dental students” OR “dental student” OR “dental schools” OR “dental school” OR “dentistry schools” OR “dentistry school” OR “dental hygiene students” OR “dental hygiene student” OR “dental hygiene schools” OR “dental hygiene school” OR “nursing students” OR “nursing student” OR “nursing schools” OR “nursing school” OR “nursing education programs” OR “nursing education program” OR “physical therapy students” OR “physical therapy student” OR “physiotherapy students” OR “physiotherapy student” OR (“occupation therapy” AND student*) OR “physician assistant students” OR “physician assistant student” OR “optometry students” OR “optometry student” OR “optometry schools” OR “optometry school” OR “chiropractic students” OR “chiropractic student” OR “chiropractic schools” OR “chiropractic school” OR “podiatry students” OR “podiatry student” OR “health profession education” OR “health professions education” OR “health professions programs” OR “health professions program” OR “health profession programs” OR “health profession program” OR “health professions students” OR “health professions student” OR “allied health students” OR “allied health student” OR “allied health education” OR “allied health education programs” OR “allied health education program” OR “allied health schools” OR “allied health school” OR ((“allied health” OR “health professions” OR “health profession”) AND programmes))
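
If desired, the PubMed strategy above can also be re-run programmatically through NCBI’s E-utilities. The short Python sketch below is illustrative only and is not part of the review’s search protocol; it assumes the Biopython package is installed, uses a placeholder contact address, and shows only a small fragment of the query, which would need to be replaced with the full strategy reproduced above.

from Bio import Entrez

# Illustrative sketch only; not part of the published search protocol.
Entrez.email = "your.name@example.org"  # placeholder; NCBI asks for a real contact address

# Replace this fragment with the full PubMed strategy reproduced above.
query = '("self-directed learning" OR "self directed learning") AND "medical students"'

# Restrict to English-language human studies within the review's 2015-2022 window.
handle = Entrez.esearch(
    db="pubmed",
    term=query + " AND english[lang] AND humans[mh]",
    datetype="pdat",
    mindate="2015",
    maxdate="2022",
    retmax=500,
)
record = Entrez.read(handle)
handle.close()

print(record["Count"], "records retrieved; first PMIDs:", record["IdList"][:5])

Equivalent exports can of course be produced directly from each database’s web interface; the sketch is offered only as a reproducibility convenience.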

CINAHL (mh “evaluation+”) OR (mh “outcome assessment”) OR (mh “educational measurement+”) OR (mh “research measurement+”) OR (mh “feedback”) OR self-assessment OR “self assessment” OR self-evaluation OR “self evaluation” OR “educational assessments” OR “educational assessment” OR self-criticism OR “self criticism” OR “program evaluation” OR “program evaluations” OR “programme evaluations” OR “programme evaluation” OR “program sustainability” OR “program effectiveness” OR “program appropriateness” OR “educational measurements” OR “educational measurement+” OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND ((MH “Self Directed Learning”) OR “self directed learning” OR “self-directed learning” OR “self-directed learners” OR “self-directed learner”) AND ((MH “Students, Medical”) OR (MH “Schools, Medical”) OR (MH “Students, Pharmacy”) OR (MH “Education, Pharmacy Technicians”) OR (MH “Students, Dental”) OR (MH “Dental Health Education”) OR (MH “Schools, Dental”) OR (MH “Education, Dental Hygiene”) OR (MH “Students, Health Occupations”) OR (MH “Schools, Nursing”) OR (MH “Students, Nursing+”) OR (MH “Education, Health Sciences+”) OR (MH “Schools, Allied Health+”) OR (MH “Schools, Health Occupations+”) OR (MH “Students, Allied Health+”) OR “medical students” OR “medical student” OR “medical undergraduates” OR “medical undergraduate” OR “undergraduate medical education” OR “medical schools” OR “medical school” OR “medical internship” OR clerkship* OR “clinical clerkship” OR “clinical clerkships” OR “pharmacy students” OR “pharmacy student” OR “pharmacy schools” OR “pharmacy school” OR “dentistry students” OR “dentistry student” OR “dental students” OR “dental student” OR “dental schools” OR “dental school” OR “dentistry schools” OR “dentistry school” OR “dental hygiene students” OR “dental hygiene student” OR “dental hygiene schools” OR “dental hygiene school” OR “nursing students” OR “nursing student” OR “nursing schools” OR “nursing school” OR “nursing education programs” OR “nursing education program” OR “physical therapy students” OR “physical therapy student” OR “physiotherapy students” OR “physiotherapy student” OR (“occupation therapy” AND student*) OR “physician assistant students” OR “physician assistant student” OR “optometry students” OR “optometry student” OR “optometry schools” OR “optometry school” OR “chiropractic students” OR “chiropractic student” OR “chiropractic schools” OR “chiropractic school” OR “podiatry students” OR “podiatry student” OR “health profession education” OR “health professions education” OR “health professions programs” OR “health professions program” OR “health profession programs” OR “health profession program” OR “health professions students” OR “health professions student” OR “allied health students” OR “allied health student” OR “allied health education” OR “allied health education programs” OR “allied health education program” OR “allied health schools” OR “allied health school” OR ((“allied health” OR “health professions” OR “health profession”) AND programmes))

PsycINFO (MAINSUBJECT.EXACT.EXPLODE(“Evaluation”) OR MAINSUBJECT.EXACT.EXPLODE(“Measurement”) OR self-assessment OR “self assessment” OR self-evaluation OR “self evaluation” OR “educational assessments” OR “educational assessment” OR self-criticism OR “self criticism” OR “program evaluation” OR “program evaluations” OR “programme evaluations” OR “programme evaluation” OR “program sustainability” OR “program effectiveness” OR “program appropriateness” OR “educational measurements” OR “educational measurement” OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND
(“Self-directed learning” OR “self directed learning” OR “self-directed learners” OR “self-directed learner”) AND (SU.EXACT(“Medical Students”) OR SU.EXACT(“Medical Internship”) OR SU.EXACT.EXPLODE(“Nursing Students”) OR (SU.EXACT.EXPLODE(“Nursing Education”) OR SU.EXACT.EXPLODE(“Dental Education”) OR SU.EXACT.EXPLODE(“Dental Students”) OR “medical students” OR “medical student” OR “medical undergraduates” OR “medical undergraduate” OR “undergraduate medical education” OR “medical schools” OR “medical school” OR “medical internship” OR clerkship* OR “clinical clerkship” OR “clinical clerkships” OR “pharmacy students” OR “pharmacy student” OR “pharmacy schools” OR “pharmacy school” OR “dentistry students” OR “dentistry student” OR “dental students” OR “dental student” OR “dental schools” OR “dental school” OR “dentistry schools” OR “dentistry school” OR “dental hygiene students” OR “dental hygiene student” OR “dental hygiene schools” OR “dental hygiene school” OR “nursing students” OR “nursing student” OR “nursing schools” OR “nursing school” OR “nursing education programs” OR “nursing education program” OR “physical therapy students” OR “physical therapy student” OR “physiotherapy students” OR “physiotherapy student” OR (“occupation therapy” AND student*) OR “physician assistant students” OR “physician assistant student” OR “optometry students” OR “optometry student” OR “optometry schools” OR “optometry school” OR “chiropractic students” OR “chiropractic student” OR “chiropractic schools” OR “chiropractic school” OR “podiatry students” OR “podiatry student” OR “health profession education” OR “health professions education” OR “health professions programs” OR “health professions program” OR “health profession programs” OR “health profession program” OR “health professions students” OR “health professions student” OR “allied health students” OR “allied health student” OR “allied health education” OR “allied health education programs” OR “allied health education program” OR “allied health schools” OR “allied health school” OR ((“allied health” OR “health professions” OR “health profession”) AND programmes))

Embase ((‘evaluation study’/exp) OR (‘self-evaluation’/exp) OR (‘constructive feedback’/exp) OR (‘measurement’/exp) OR (‘psychometry/exp’) OR (‘assessment of humans’/exp) OR self-assessment OR ‘self assessment’ OR self-evaluation OR “self evaluation” OR ‘educational assessments’ OR ‘educational assessment’ OR self-criticism OR ‘self criticism’ OR ‘program evaluation’ OR ‘program evaluations’ OR ‘programme evaluations’ OR ‘programme evaluation’ OR ‘program sustainability’ OR ‘program effectiveness’ OR ‘program appropriateness’ OR ‘educational measurements’ OR ‘educational measurement’ OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND (‘Self-directed learning’/exp OR ‘self-directed learning’ OR ‘self directed learning’ OR ‘Self-directed learners’ OR ‘self-directed learner’) AND (‘medical student’/exp OR ‘medical school’/exp OR ‘pharmacy student’/exp OR ‘dental student’/exp OR ‘dental education’/exp OR ‘pharmacy school’/exp OR ‘nursing student’/exp OR ‘paramedical education’/exp OR ‘medical students’ OR ‘medical student’ OR ‘medical undergraduates’ OR ‘medical undergraduate’ OR ‘undergraduate medical education’ OR ‘medical schools’ OR ‘medical school’ OR ‘medical internship’ OR clerkship* OR ‘clinical clerkship’ OR ‘clinical clerkships’ OR ‘pharmacy students’ OR ‘pharmacy student’ OR ‘pharmacy schools’ OR ‘pharmacy school’ OR ‘dentistry students’ OR ‘dentistry student’ OR ‘dental students’ OR ‘dental student’ OR ‘dental schools’ OR ‘dental school’ OR ‘dentistry schools’ OR ‘dentistry school’ OR ‘dental hygiene students’ OR ‘dental hygiene student’ OR ‘dental hygiene schools’ OR ‘dental hygiene school’ OR ‘nursing students’ OR ‘nursing student’ OR ‘nursing schools’ OR ‘nursing school’ OR ‘nursing education programs’ OR ‘nursing education program’ OR ‘physical therapy students’ OR ‘physical therapy student’ OR ‘physiotherapy students’ OR ‘physiotherapy student’ OR (‘occupation therapy’ AND student*) OR ‘physician assistant students’ OR ‘physician assistant student’ OR ‘optometry students’ OR ‘optometry student’ OR ‘optometry schools’ OR ‘optometry school’ OR ‘chiropractic students’ OR ‘chiropractic student’ OR ‘chiropractic schools’ OR ‘chiropractic school’ OR ‘podiatry students’ OR ‘podiatry student’ OR ‘health profession education’ OR ‘health professions education’ OR ‘health professions programs’ OR ‘health professions program’ OR ‘health profession programs’ OR ‘health profession program’ OR ‘health professions students’ OR ‘health professions student’ OR ‘allied health students’ OR ‘allied health student’ OR ‘allied health education’ OR ‘allied health education programs’ OR ‘allied health education program’ OR ‘allied health schools’ OR ‘allied health school’ OR ((‘allied health’ OR ‘health professions’ OR ‘health profession’) AND programmes))

ERIC MAINSUBJECT.EXACT(“Self Evaluation”) OR MAINSUBJECT.EXACT.EXPLODE(“Psychometrics”) OR MAINSUBJECT.EXACT.EXPLODE(“measurement”) OR MAINSUBJECT.EXACT.EXPLODE(“evaluation methods”) OR MAINSUBJECT.EXACT.EXPLODE(“tests”) OR self-assessment OR “self assessment” OR self-evaluation OR “Self Evaluation” OR “educational assessments” OR “educational assessment” OR self-criticism OR “self criticism” OR “program evaluation” OR “program evaluations” OR “programme evaluations” OR “programme evaluation” OR “program sustainability” OR “program effectiveness” OR “program appropriateness” OR “educational measurements” OR “educational measurement” OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND (“self-directed learning” OR “self directed learning” OR “Self-directed learners” OR “self-directed learner”) AND (MAINSUBJECT.EXACT.EXPLODE(“Medical Students”) OR MAINSUBJECT.EXACT.EXPLODE(“Medical Schools”) OR MAINSUBJECT.EXACT.EXPLODE(“Dental Schools”) OR MAINSUBJECT.EXACT.EXPLODE(“Allied Health Occupations Education”) OR “medical students” OR “medical student” OR “medical undergraduates” OR “medical undergraduate” OR “undergraduate medical education” OR “medical schools” OR “medical school” OR “medical internship” OR clerkship* OR “clinical clerkship” OR “clinical clerkships” OR “pharmacy students” OR “pharmacy student” OR “pharmacy schools” OR “pharmacy school” OR “dentistry students” OR “dentistry student” OR “dental students” OR “dental student” OR “dental schools” OR “dental school” OR “dentistry schools” OR “dentistry school” OR “dental hygiene students” OR “dental hygiene student” OR “dental hygiene schools” OR “dental hygiene school” OR “nursing students” OR “nursing student” OR “nursing schools” OR “nursing school” OR “nursing education programs” OR “nursing education program” OR “physical therapy students” OR “physical therapy student” OR “physiotherapy students” OR “physiotherapy student” OR (“occupation therapy” AND student*) OR “physician assistant students” OR “physician assistant student” OR “optometry students” OR “optometry student” OR “optometry schools” OR “optometry school” OR “chiropractic students” OR “chiropractic student” OR “chiropractic schools” OR “chiropractic school” OR “podiatry students” OR “podiatry student” OR “health profession education” OR “health professions education” OR “health professions programs” OR “health professions program” OR “health profession programs” OR “health profession program” OR “health professions students” OR “health professions student” OR “allied health students” OR “allied health student” OR “allied health education” OR “allied health education programs” OR “allied health education program” OR “allied health schools” OR “allied health school” OR ((“allied health” OR “health professions” OR “health profession”) AND programmes)

Web of Science (self-assessment OR “self assessment” OR self-evaluation OR “self evaluation” OR “educational assessments” OR “educational assessment” OR self-criticism OR “self criticism” OR “program evaluation” OR “program evaluations” OR “programme evaluations” OR “programme evaluation” OR “program sustainability” OR “program effectiveness” OR “program appropriateness” OR “educational measurements” OR “educational measurement” OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND (“self-directed learning” OR “self directed learning” OR “Self-directed learners” OR “self-directed learner”) AND (“medical students” OR “medical student” OR “medical undergraduates” OR “medical undergraduate” OR “undergraduate medical education” OR “medical schools” OR “medical school” OR “medical internship” OR clerkship* OR “clinical clerkship” OR “clinical clerkships” OR “pharmacy students” OR “pharmacy student” OR “pharmacy schools” OR “pharmacy school” OR “dentistry students” OR “dentistry student” OR “dental students” OR “dental student” OR “dental schools” OR “dental school” OR “dentistry schools” OR “dentistry school” OR “dental hygiene students” OR “dental hygiene student” OR “dental hygiene schools” OR “dental hygiene school” OR “nursing students” OR “nursing student” OR “nursing schools” OR “nursing school” OR “nursing education programs” OR “nursing education program” OR “physical therapy students” OR “physical therapy student” OR “physiotherapy students” OR “physiotherapy student” OR (“occupation therapy” AND student*) OR “physician assistant students” OR “physician assistant student” OR “optometry students” OR “optometry student” OR “optometry schools” OR “optometry school” OR “chiropractic students” OR “chiropractic student” OR “chiropractic schools” OR “chiropractic school” OR “podiatry students” OR “podiatry student” OR “health profession education” OR “health professions education” OR “health professions programs” OR “health professions program” OR “health profession programs” OR “health profession program” OR “health professions students” OR “health professions student” OR “allied health students” OR “allied health student” OR “allied health education” OR “allied health education programs” OR “allied health education program” OR “allied health schools” OR “allied health school” OR ((“allied health” OR “health professions” OR “health profession”) AND programmes))

SCOPUS TITLE-ABS-KEY ((self-assessment OR “self assessment” OR self-evaluation OR “self evaluation” OR “educational assessments” OR “educational assessment” OR self-criticism OR “self criticism” OR “program evaluation” OR “program evaluations” OR “programme evaluations” OR “programme evaluation” OR “program sustainability” OR “program effectiveness” OR “program appropriateness” OR “educational measurements” OR “educational measurement” OR assess* OR evaluat* OR measur* OR effect* OR test* OR feedback OR psychometr* OR scoring OR scale OR scales OR instrument* OR feedback) AND (“self-directed learning” OR “self directed learning” OR “Self-directed learners” OR “self-directed learner”) AND (“medical students” OR “medical student” OR “medical undergraduates” OR “medical undergraduate” OR “undergraduate medical education” OR “medical schools” OR “medical school” OR “medical internship” OR clerkship* OR “clinical clerkship” OR “clinical clerkships” OR “pharmacy students” OR “pharmacy student” OR “pharmacy schools” OR “pharmacy school” OR “dentistry students” OR “dentistry student” OR “dental students” OR “dental student” OR “dental schools” OR “dental school” OR “dentistry schools” OR “dentistry school” OR “dental hygiene students” OR “dental hygiene student” OR “dental hygiene schools” OR “dental hygiene school” OR “nursing students” OR “nursing student” OR “nursing schools” OR “nursing school” OR “nursing education programs” OR “nursing education program” OR “physical therapy students” OR “physical therapy student” OR “physiotherapy students” OR “physiotherapy student” OR (“occupation therapy” AND student*) OR “physician assistant students” OR “physician assistant student” OR “optometry students” OR “optometry student” OR “optometry schools” OR “optometry school” OR “chiropractic students” OR “chiropractic student” OR “chiropractic schools” OR “chiropractic school” OR “podiatry students” OR “podiatry student” OR “health profession education” OR “health professions education” OR “health professions programs” OR “health professions program” OR “health profession programs” OR “health profession program” OR “health professions students” OR “health professions student” OR “allied health students” OR “allied health student” OR “allied health education” OR “allied health education programs” OR “allied health education program” OR “allied health schools” OR “allied health school” OR ((“allied health” OR “health professions” OR “health profession”) AND programmes))

Appendix 2:

Basic characteristics of systematic review findings for assessment of self-directed learning in healthcare education