
Exploring medical students’ metacognitive and regulatory dimensions of diagnostic problem solving

Article: 2210804 | Received 01 Nov 2022, Accepted 02 May 2023, Published online: 17 May 2023

ABSTRACT

Solving clinical problems requires an individual not only to apply domain-specific medical knowledge and cognitive skills for reasoning, but also to be consciously aware of, monitor, and evaluate their own thinking processes (i.e., metacognition). The purpose of this study was to map critical metacognitive dimensions of clinical problem solving and to explore the structural relationships among them, which may inform a conceptual framework and better pedagogy for effective intervention. A context-specific inventory was adapted and modified from a domain-general instrument to capture essential metacognitive skills for learning and solving clinical problems. This inventory was administered to 72 undergraduate medical students to survey their capabilities in five dimensions: knowledge of cognition, objectives, problem representation, monitoring, and evaluation. The interplay among these dimensions was further examined using partial least squares structural equation modeling.

Our findings revealed that the medical students fell short of some expert-like metacognitive and regulatory competence, even after years of medical education and on-site training. In particular, they did not know when a holistic understanding of a problem had been reached. Many of them often did not have a set of clear diagnostic procedures in mind, nor did they concurrently monitor their thinking during diagnostic reasoning. Moreover, their lack of self-improving approaches seemed to hinder their learning. Finally, the structural equation model indicated that knowledge of cognition and objectives significantly predicted problem representation, suggesting that medical learners’ knowledge of and goals for learning are influential in framing the clinical problems at hand. A significant linear prediction path was observed from problem representation through monitoring to evaluation, signifying a possible sequenced process of clinical problem solving. Metacognition-based instruction can help improve clinical problem-solving skills and awareness of potential biases or errors.

Introduction

This study proposed emphasizing the metacognitive aspects of self-regulated learning (SRL) to enhance medical students’ diagnostic reasoning in the face of the challenges posed by new diseases in a rapidly changing, globalized society. Performing an expert diagnosis involves generating hypotheses to guide clinical data gathering, articulating an inner representation of the patient’s problem, synthesizing a prioritized differential diagnosis, and evaluating outcomes [Citation1], all of which rely heavily on metacognition. In medical education research, a consensus has gradually formed that metacognition is essential in diagnostic instruction [Citation2–4]. Reviews of empirical studies on expert coaching and effective interventions for increasing diagnostic competence [Citation1,Citation5,Citation6,Citation7] have revealed the crucial role of metacognition.

A few pioneer studies have shown that learning and solving diagnostic problems is a multi-faceted, weakly sequenced cyclic self-regulatory process [Citation8]. During this SRL process, various metacognitive strategies are activated [Citation9] accompanied by conceptual knowledge and other higher level cognitive actions (e.g., knowledge about tasks and self-knowledge) [Citation10]. These metacognitive aspects of SRL include knowledge of cognition [Citation8,Citation11], task analysis (including goal setting and strategic planning) [Citation8,Citation9,Citation11], monitoring [Citation8,Citation9], evaluation [Citation8], and adaptation [Citation8,Citation11].

In this study, a self-reported inventory was developed and implemented to assess these metacognitive dimensions of SRL during diagnostic problem solving. A recent review study [Citation12] suggested that self-reported surveys have the potential to provide relatively accurate insights into students’ global levels of self-regulation; its review of related studies found that students’ self-reported use of strategies was generally consistent with their behavioral data. Prior studies used interview and think-aloud methods (e.g., [Citation8,Citation9]), which require trained raters to appropriately interpret the verbal data [Citation13] and demand far more time and labor when applied to larger samples [Citation14]. In contrast, we attempted to develop an easy-to-use assessment tool for clinical teachers, students, and researchers to understand the multifaceted characteristics of diagnostic problem solving. To design effective metacognitive interventions, it is important to consider students’ perceptions of their own abilities [Citation5,Citation15]; developing a self-reported instrument can be beneficial in this regard. Our aim was not to replace behavioral measures such as think-aloud protocols with self-report surveys. Rather, we sought to supplement these measures with a contextualized tool for medical researchers who want to combine both approaches to gain a more comprehensive understanding of medical students’ SRL, as recommended by Gandomkar et al. [Citation16].

To achieve this goal, Winne and Hadwin’s [Citation11] model of self-regulated learning was adapted to build and validate a structured model to understand the interplay among metacognitive dimensions of diagnostic problem solving. Finally, novice medical learners’ related dispositions were used as examples to illustrate their needs. The following research questions were addressed:

  1. How do novice medical students perceive their competence in different metacognitive dimensions of SRL during diagnostic problem solving?

  2. What patterns of interplay between these dimensions can be found, situated in diagnostic problem solving?

Literature review

Diagnostic problem solving and metacognition

Diagnostic problem solving depicts the thinking process of systematically solving clinical problems when doctors encounter a patient or a clinical case [Citation17]. It requires the activation of a set of conceptual and strategic knowledge while purposefully attending to and overseeing the problem-solving process (in other words, metacognition) [Citation10]. Diagnostic problems are ill-structured, meaning that multiple approaches and paths to solutions are possible [Citation18]. There have been extensive efforts to improve diagnostic competence (e.g., [Citation19]) and to reduce errors (e.g., [Citation20,Citation21]). Previous research suggests that general medical knowledge is not necessarily linked to a medical student’s diagnostic performance [Citation10,Citation22], and that cognitive strategies such as analyzing and information gathering do not directly contribute to correct diagnoses. However, superior metacognitive actions, such as forming inner representations and evaluation, are significantly associated with correct diagnoses [Citation22]. Novice medical students’ metacognitive competence may explain why some learners solve clinical problems more effectively than others, even when they possess similar conceptual knowledge [Citation8].

Metacognition is the awareness of one’s own thinking, enabling self-regulation in learning and self-assessment in decision-making [Citation1,Citation2,Citation10,Citation21]. Metacognitive theory explains how much a learner knows about learning and how they can regulate their own thinking and behaviors to achieve specific goals [Citation23,Citation24]. Metacognitive theories emphasize the importance of metacognitive dimensions in learning, whereas SRL models capture learners’ dynamic use of metacognitive and cognitive activities to achieve learning goals. Both metacognition and SRL models (e.g., [Citation11,Citation25]) suggest that individuals make efforts to monitor their thoughts and actions to gain control over the learning process. In our model of mastering diagnostic problem solving, we emphasize learners’ metacognitive aspects of self-regulatory processes.

SRL processes, as described by Winne and Hadwin [Citation11] and Zimmerman [Citation25], involve learners in (1) forming an internal representation of the task, (2) setting learning goals and proposing a strategic plan, and (3) following the plan while (4) monitoring and evaluating outcomes. According to Winne and Hadwin [Citation11], learners compare their outcomes to their internal standards of desirable learning outcomes (i.e., monitoring), and discrepancies trigger adjustments to the strategies and processes used in stages (1) to (3). The accuracy of monitoring and adjustment is also influenced by the learner’s knowledge of cognition, which shapes their internal standards [Citation11]. The SRL process of diagnostic problem solving begins with a physician’s identification of early clues from a patient’s symptoms. At the forethought phase, the doctor activates medical knowledge [Citation8,Citation9,Citation26], forms an inner representation of the problem [Citation11], sets goals, and plans further inspections to generate differential diagnoses [Citation22]. During differential diagnosis, the doctor constantly monitors and examines their cognitive state and controls their use of strategies [Citation9,Citation26], evaluates the various solutions, and communicates them to the patient [Citation22]. With iterative clinical practice, doctors can improve their beliefs, dispositions, and knowledge of the current task and study tactics (knowledge of cognition), leading to better goal setting and learning outcomes [Citation11].

The metacognitive aspects of SRL are crucial in clinical practice, yet medical students have been found to lack these skills. Artino et al. [Citation9] found that while all second-year medical students demonstrated some metacognitive skills, only about half used them effectively. One-third of the students did not report any goals, and only one-third set goals for diagnostic tactics or made a strategic plan. Although monitoring was commonly observed, its targets varied: about half of the students monitored their identification or integration/synthesis of symptoms, while fewer than 20% focused on prioritizing relevant symptoms or comparing/contrasting diagnoses. Similarly, Kiesewetter et al. [Citation22] found that only 16% of 4th- and 5th-year medical students formed inner representations and evaluated their thinking. These studies showed that diagnostic problem solving involves the efficient use of both metacognitive strategies and other regulatory actions [Citation10,Citation22].

A model of metacognitive and regulatory diagnostic problem solving

This study proposes a model (see Figure 1) that links metacognitive and regulatory processes to diagnostic problem solving, based primarily on Winne and Hadwin’s [Citation11] model. Two key dimensions, knowledge of cognition and objectives, predict the other dimensions. Knowledge of cognition, including learners’ self-perceptions of learning and task-related knowledge, influences goal setting and learning approaches (objectives). Hypothesis H1 therefore predicts that knowledge of cognition predicts objectives.

Figure 1. The hypothesized model that illustrates the interplay among the five dimensions of diagnostic problem solving.

Note: Dotted line: no significant relation is hypothesized.

When solving a clinical problem, problem representation is crucial and is influenced by knowledge and tactics for solving the task [Citation27] (H2a) as well as personal learning goals for diagnostic mastery (H2b). Together, hypothesis H2 predicts that problem representation is significantly predicted by both knowledge of cognition and objectives.

We also hypothesize that a direct relation exists among the dimensions in the order of problem representation, monitoring, and evaluation. According to Zimmerman [Citation25] and Winne and Hadwin [Citation11], understanding of the problem influences task planning and execution, which involves assembling the necessary steps, approaches, and strategies for diagnosis (hypothesis H3b). During task execution, the individual retrieves prior experiences and knowledge (hypothesis H3a) while monitoring progress against personal goals (H3c). Therefore, we suggest that monitoring could be predicted by knowledge of cognition (H3a), problem representation (H3b), and objectives (H3c).

Hypothesis H4 predicts that evaluation, including checking the quality of the diagnostic process and the reasonableness of conclusions, can be predicted by monitoring (H4b) and objectives (H4c), but is less influenced by knowledge of cognition (H4a). Knowledge of cognition may indirectly influence appraisal by shaping personal goals and internal standards [Citation11].
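To make the hypothesized paths explicit, the model can be summarized as a set of linear structural equations in conventional PLS-SEM form (a sketch only; the path coefficients β and residuals ζ are symbolic placeholders rather than estimates, and KC abbreviates knowledge of cognition):

\begin{aligned}
\text{Objectives} &= \beta_{1}\,\text{KC} + \zeta_{1} && \text{(H1)}\\
\text{Problem representation} &= \beta_{2a}\,\text{KC} + \beta_{2b}\,\text{Objectives} + \zeta_{2} && \text{(H2a, H2b)}\\
\text{Monitoring} &= \beta_{3a}\,\text{KC} + \beta_{3b}\,\text{Problem representation} + \beta_{3c}\,\text{Objectives} + \zeta_{3} && \text{(H3a, H3b, H3c)}\\
\text{Evaluation} &= \beta_{4b}\,\text{Monitoring} + \beta_{4c}\,\text{Objectives} + \zeta_{4} && \text{(H4b, H4c; no direct KC path per H4a)}
\end{aligned}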

In brief, a structured model of diagnostic competence with five metacognitive and regulatory dimensions was developed. By examining the relationships among the key dimensions, one can gain insight into the essential elements necessary for the development of mature diagnostic decision-making. While Cleary et al. [Citation8] proposed a framework for analyzing the essential dimensions involved in self-regulated diagnostic problem solving based on Zimmerman’s [Citation25] model, the relationships among the dimensions were not established. Tsai et al. [Citation28] constructed and validated a model of computational thinking based on theories of information processing and models of computer programming; although their model comprises different dimensions than ours owing to the different context, it is in line with our structural model of problem solving. Despite these efforts, there is currently no validated, contextualized inventory or structural model of multifaceted diagnostic competence in the medical education literature.

Research methods

A revised questionnaire was administered to understand medical students’ level of metacognitive and self-regulatory competence regarding diagnostic problem solving. The interrelations among the metacognitive and regulatory dimensions of diagnostic competence were then explored using partial least squares structural equation modeling (PLS-SEM). PLS-SEM was chosen because of the exploratory nature and the small sample size of this study [Citation29].

Participants

All 6th-year medical students at MacKay Medical College who were on their Emergency Medicine (EM) rotation were invited to participate in the study. The EM rotation lasted for 2 weeks, and prior to this, the students were required to complete a year-long rotation in internal medicine, surgery, obstetrics/gynecology, and pediatrics. Students were required to complete eight educational modules during their EM rotation to develop their competency in managing emergencies and to enhance their subsequent clinical skills. One of the modules was specifically designed to improve students’ diagnostic skills. For this study, all 72 medical students completing their EM rotation between 2021 and 2022 were invited to complete a 20-min survey session, with a 100% participation rate.

Research instrument

Instrument development and validation

To develop an instrument that captures metacognitive and regulatory competence in diagnostic problem solving, we began with an existing inventory, the Inventory of Metacognitive Self-Regulation (IMSR) [Citation30]. The original IMSR was domain-general and designed to capture metacognitive regulatory skills in science and mathematics problem solving. The original instrument consisted of 35 items covering five factors: knowledge of cognition, objectives, problem representation, monitoring, and evaluation. The IMSR was adapted because the context of the original study by Howard et al. [Citation30] was close to that of the present study. Also, the metacognitive skills assessed by the IMSR showed good validity in a previous study [Citation31], that is, significant, moderate positive correlations with a concurrent measure of metacognitive skills and with problem-solving performance.

For item modification, we concentrated on capturing what it means to be a metacognitively active clinician in the process of solving diagnostic problems. We specifically focused on measuring the extent of the metacognitive knowledge undergraduate students are aware of, and the regulatory skills they perform, in relation to diagnostic reasoning and their learning. A medical professor and two science education professors formed a panel to revise the items. All of them had experience of conducting research projects investigating metacognition in problem solving or of designing and delivering a metacognition-infused workshop in medical education. The panel members worked on a list of definitions and question items for the five dimensions of metacognitive competence.

The panel revised the original 35 items to reflect specific metacognitive competence situated in diagnostic problem solving on the one hand, while keeping the items aligned with the definitions of the dimensions on the other. Two items from the original IMSR were excluded because the strategies they describe (i.e., ‘I can make myself memorize something’ (Knowledge of Cognition) and ‘I check to see if my calculations are correct’ (Evaluation)) do not fit well into clinical diagnosis. In addition, two reverse items (‘I ask myself what is the easiest way to conduct a clinical interview’ (Objectives) and ‘I use learning strategies for diagnostic reasoning without thinking’ (Knowledge of Cognition)) were eliminated due to their low consistency with the other items in the same dimension after a Cronbach’s alpha reliability analysis. One item was created and added by the panel members (‘I try to get the information and filter it down to what is important for diagnosing.’ (Monitoring)). Through several rounds of discussion and revision, the panel reached a consensus on the final set of 32 items.

The items of the IMSR in diagnostic problem solving (IMSR-D) asked the medical participants to self-report how often they perform a particular metacognitive skill while trying to learn and solve a diagnostic problem in a real setting. The wording of the IMSR-D items was substantially revised to place them in the context of diagnostic problem solving, so they differ considerably from the original items. After model validation, which is reported in the Results section, 22 of the 32 items were retained; all of the retained IMSR-D items can be found in Table 1. Participants were instructed to read and respond to each item on a 5-point scale (1 = never, 2 = seldom/rarely, 3 = sometimes, 4 = often/frequently, 5 = always) by circling the answer that best described them. Higher scores indicated superior metacognitive competence related to diagnostic problem solving.

Table 1. Results of the confirmatory factor analyses and reliabilities of the IMSR-D (N = 72).

Implementation procedures

The IMSR-D was administered at the beginning of the emergency medicine rotation, in rotation groups (4–5 medical students), via a Google form in a quiet meeting room in the emergency department. Participants were guided to recall recent clinical interview experiences and to rate each item based on their actual experience. They were encouraged to use the survey items to identify areas for self-improvement, and the instructor could provide support based on their responses. The survey took approximately 10 min to complete. Participants were assured that this was not an evaluation and that their survey responses would not be linked to their onsite training scores. Ethical approval (20MMHIS484e) for this study was obtained from the Institutional Review Board of MacKay Memorial Hospital, Taipei, Taiwan.

Data analysis

To answer the first research question, composite scores for the dimensions were examined, and weaknesses in the medical students’ metacognitive and regulatory competence corresponding to the different metacognitive dimensions were highlighted. To address the second research question, PLS-SEM was performed to examine the validity of the hypothesized model of the relationships between the latent variables of metacognitive competence in diagnostic problem solving, including knowledge of cognition, objectives, problem representation, monitoring, and evaluation. The commercial software SmartPLS (version 3.3.7) was used to perform the PLS-SEM.
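As an illustration of how the composite scores for the first research question can be obtained, the short sketch below averages item responses within each dimension. The item-to-dimension mapping shown here is abbreviated and hypothetical (the validated item list appears in Table 1), and the code assumes a response table with one row per participant and one 1–5 rating per item.

import pandas as pd

# Hypothetical item-to-dimension mapping; the actual IMSR-D items are listed in Table 1.
DIMENSIONS = {
    "Knowledge of cognition": ["Kn1", "Kn2", "Kn3", "Kn4"],
    "Objectives": ["Ob1", "Ob2", "Ob3", "Ob4", "Ob5"],
    "Problem representation": ["Pr1", "Pr2", "Pr3"],
    "Monitoring": ["Mo1", "Mo2", "Mo3", "Mo4", "Mo5"],
    "Evaluation": ["Ev1", "Ev2", "Ev3", "Ev4", "Ev5"],
}

def composite_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Mean 1-5 rating per dimension for each participant (higher = stronger self-reported competence)."""
    return pd.DataFrame(
        {dim: responses[items].mean(axis=1) for dim, items in DIMENSIONS.items()}
    )

# Usage: scores = composite_scores(survey_df); scores.agg(["mean", "std"]).round(2)
# yields the kind of dimension-level means and SDs summarized in Table 1.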

In our study, the evaluation of the outer (measurement) model and the inner (structural) model followed the guidelines suggested by Lin et al. [Citation29]. Item scores were used for structural modeling. To examine reliability, factor loadings were computed; loadings greater than 0.70 are recommended, and items with loadings below 0.4 were eliminated [Citation32]. Factor loadings indicate the degree to which an item represents the construct of interest [Citation33]. We used composite reliability (CR) to estimate internal consistency, with a value greater than 0.70 indicating that the items assess the same latent variable [Citation33]. Convergent validity reflects the extent to which the items of a dimension converge to represent that dimension; it was estimated using the average variance extracted (AVE), calculated as the mean of the squared loadings of the items within a dimension [Citation33]. An AVE value greater than 0.50 is recommended [Citation33]. The Fornell-Larcker criterion was used as the decision rule to establish discriminant validity: the square root of each dimension’s AVE should be greater than its correlations with all other dimensions [Citation34]. The results of the validity and reliability of the structural model are reported in the following sections.
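For reference, these criteria correspond to the conventional PLS-SEM formulas for a dimension measured by $k$ items with standardized loadings $\lambda_1,\dots,\lambda_k$ (standard definitions from the PLS-SEM literature rather than computations specific to this study):

$$\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\left(1-\lambda_i^{2}\right)}, \qquad \mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2},$$

and the Fornell-Larcker criterion requires $\sqrt{\mathrm{AVE}_j} > |r_{jl}|$ for every other dimension $l \neq j$, where $r_{jl}$ is the correlation between dimensions $j$ and $l$.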

Results

Below, we first report the results of the validation and hypothesis testing for the structural model of diagnostic problem-solving competence. Some dispositions of the medical students are then delineated to identify their specific needs.

Validity and reliability of the structural model

During model validation, one, three, four, and two items were removed from the objectives, problem representation, monitoring, and evaluation dimensions, respectively, due to their low factor loadings. For the participants of this study, 22 of the 32 IMSR-D items were thus validated and retained in the proposed model. The means and standard deviations of the 22 items are summarized in Table 1.

Most items had factor loadings higher than 0.7, except for a few between 0.56 and 0.70. Chin and Marcoulides [Citation35] suggested that, if there are other indicators in the same construct, factor loadings of 0.5 or 0.6 are acceptable. For internal consistency reliability, the CR value of each dimension exceeded the minimum required value of 0.7 (0.79–0.85). For convergent validity, the AVE value of each dimension was greater than the recommended value of 0.5 (0.50–0.56). The internal consistency of the overall inventory was high (Cronbach’s α = .92), and that of the dimensions was acceptable (Knowledge of Cognition = .69, Objectives = .79, Problem Representation = .60, Monitoring = .76, Evaluation = .75) [Citation32].

To establish discriminant validity, the square root of the AVE value of each dimension and the correlations among the dimensions are summarized in Table 2. According to the Fornell-Larcker criterion, the square root of the AVE value for each dimension (0.71–0.75, the diagonal values in Table 2) was greater than the Pearson correlation coefficients between that dimension and the others (the off-diagonal values in Table 2). All dimensions showed acceptable discriminant validity, and all dimensions were significantly correlated at moderate levels (r = 0.55–0.70). In summary, the internal consistency reliability, convergent validity, and discriminant validity all met Hair et al.’s [Citation33] recommendations.

Table 2. The correlations and discriminant validity among the dimensions of diagnostic problem solving.

Modeling the relationships among metacognitive and regulatory dimensions of diagnostic problem solving

The structural relationships among the five dimensions of diagnostic problem solving were examined through a bootstrapping procedure with 5,000 subsamples to determine the significance level of each theoretical path. The statistically significant paths are drawn in Figure 2, along with the R2 values and the outer loadings of each item.
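The bootstrapping itself was run in SmartPLS. Purely as an illustration of the resampling logic (participants drawn with replacement, the path re-estimated in each subsample, and significance judged from the resulting distribution), a minimal sketch using an ordinary least-squares stand-in for a single path, rather than the actual PLS-SEM estimator, might look as follows.

import numpy as np

def bootstrap_path(x: np.ndarray, y: np.ndarray, n_boot: int = 5000, seed: int = 0):
    """Bootstrap a standardized regression (path) coefficient of y on x."""
    rng = np.random.default_rng(seed)
    n = len(x)

    def path(xs, ys):
        xs = (xs - xs.mean()) / xs.std()
        ys = (ys - ys.mean()) / ys.std()
        return float(np.polyfit(xs, ys, 1)[0])  # slope of the standardized simple regression

    estimate = path(x, y)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)     # resample participants with replacement
        boot[b] = path(x[idx], y[idx])
    se = boot.std(ddof=1)               # bootstrap standard error
    return estimate, se, estimate / se  # t-like statistic used to judge significance

With 72 participants and 5,000 subsamples, a path whose t-value exceeds roughly 1.96 in absolute terms would be flagged as significant at the 0.05 level.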

Figure 2. The results of the structural relationships examination.

Note: ***p < 0.001.

As shown in Figure 2, the medical students’ ‘Knowledge of cognition’ significantly and positively predicted their ‘Objectives’ (path coefficient = 0.60, p < 0.001; adjusted R2 = 0.35). Together, these two dimensions predicted ‘Problem representation’ (path coefficients = 0.44, p < 0.001 and 0.31, p < 0.05; adjusted R2 = 0.43). Next, the students’ ‘Monitoring’ was significantly and positively predicted by both their ‘Knowledge of cognition’ and ‘Problem representation’ (path coefficients = 0.44, p < 0.001 and 0.28, p < 0.05; adjusted R2 = 0.52), yet no influence of ‘Objectives’ on ‘Monitoring’ was observed (path coefficient = 0.12, p = 0.31). Lastly, the medical learners’ ‘Evaluation’ was significantly and positively predicted by both their ‘Objectives’ and ‘Monitoring’ (path coefficients = 0.41 and 0.40, p < 0.001; adjusted R2 = 0.62), whereas the relation between ‘Knowledge of cognition’ and ‘Evaluation’ was not significant (path coefficient = 0.12, p = 0.31), which supports our hypothesis H4a. All of our hypotheses were thus confirmed, except for the path from ‘Objectives’ to ‘Monitoring’ (hypothesis H3c).

Revealing medical students’ diagnostic problem-solving weaknesses

As reported in Table 1, on the 5-point Likert scale, the average scores of most items were below 4.0. In particular, knowledge of cognition and monitoring received lower ratings on average. Some dispositions rated close to or below 3.0 are highlighted below to delineate the medical learners’ specific needs.

Although the participants had been immersed in years of medical courses and training, their lower ratings on the Knowledge of Cognition dimension indicated that they seemed to fall short of effective knowledge and awareness of their own learning when mastering diagnostic competence. Few participants reported activating different learning strategies (Kn1), recognizing their best approaches to learning (Kn3), or recognizing their weaknesses in learning to master diagnostic skills (Kn4). As for the Objectives dimension, our participants were more mindful of objectives related to diagnostic performance, yet less mindful of setting personal goals for divergent diagnostic approaches (Ob5) or for learning for future improvement (Ob3). Problem Representation items received relatively high ratings among the five dimensions. While the majority of participants reported that they always/often put extra effort into what was important in differential diagnosis (Pr3), fewer observed or collected the patient’s information more than once to fully understand the patient’s problem (Pr1).

Challenges also appeared in consistently activating monitoring across the various subtasks and in evaluating the quality of the entire problem-solving process and its outcomes. While a fair (but not satisfactory) proportion of students reported consistently activating monitoring in various diagnostic subtasks, only one-fifth reported always/often forming clear steps (Mo3) and constantly thinking about all steps of differential diagnosis (Mo5). Other important monitoring skills, such as overseeing whether all important aspects of the problem have been clarified (Mo2), were rarely reported. Only about one-third reported that they frequently and carefully evaluated the quality of their diagnostic process (Ev4), their history-taking procedure (Ev1), or the correctness of their diagnostic reasoning (Ev3) to ensure that it was complete and thorough (Ev2).

Although these medical students may be equipped with some medical knowledge and diagnostic strategies, they showed weaknesses centering in particular on knowing different ways of learning and solving medical problems, knowing when a holistic understanding of the problem has been reached, monitoring systematically, and evaluating diagnoses thoroughly.

Discussion

While other studies have suggested conceptual frameworks of diagnostic decision-making by exploring crucial components of cognitive reasoning [Citation5], little is known about the fundamental metacognitive elements and the relationships among them. Moreover, previous studies that incorporated metacognitive components into the training of clinical reasoning mostly focused on monitoring or evaluation, such as prompting reflective practice at the end of diagnoses [Citation36] (Ely et al., 2011) or self-checking cognitive errors using a metacognitive checklist [Citation37]. The present study involved all major dimensions and elicited their structure.

We proposed the model based on the conceptual models of self-regulated learning and of problem solving. All hypotheses have been examined based on data from the IMSR-D survey situated in clinical diagnosis. Our findings have revealed and supported five critical dimensions of diagnostic problem solving as well as the interplay among them. Knowledge of cognition and objectives appear to be the two fundamental dimensions concerning thinking about learning which can predict the other three dimensions of actually solving a diagnostic problem. The undergraduate medical students’ self-conceptions of themselves as learners and their best approaches to learning (knowledge of cognition) significantly relate to the goals they set for learning to solve diagnostic problems (objectives). These two superior dimensions also affect how an individual frames the diagnostic problems he or she encounters. Medical experts and novices have different knowledge of cognition, set different objectives, and therefore form different inner representations when tackling a clinical case [Citation38,Citation39,Citation40].

Moreover, problem representation predicts the monitoring of the diagnostic process, which in turn predicts the evaluation of diagnostic decisions. The interplay within the structural model implies that the growth of these dimensions is not parallel but developmental. This finding aligns with the process of diagnostic decision-making [Citation8,Citation9,Citation26] and the steps of teaching ill-structured problem solving [Citation27]. Lacking the skills to analyze the key points of a problem affects the subsequent stages of diagnostic problem solving, as it does in other problem-solving contexts such as computational thinking [Citation28]. In combination with the low self-ratings of the students’ diagnostic problem solving observed in this study, these young medical learners may need further metacognitive and regulatory support to enhance their diagnostic competence. Merely prompting them to reflect on the diagnostic process and results may not supply sufficient information for competence improvement. We suggest that effective implementation of reflective practice (evaluation) needs to build upon leveraging learners’ capability in the prior dimensions. These include helping learners become better at analyzing and framing appropriate problem representations, as well as at establishing a set of clear and reasonable steps in mind while consciously monitoring throughout the diagnosis.

All in all, the structural model of diagnostic problem solving situated in a medical context echoes features and mechanisms of contemporary SRL models [Citation11,Citation25], including the superior influence of knowledge of cognition and objectives as well as the direct relations among problem representation, monitoring, and evaluation. All of our hypotheses were supported, except for the link from objectives to monitoring whereby learners would monitor the status and progress of their diagnosis and learning against their goals (hypothesis H3c). We attribute this missing link to the large amount of information processed during on-site diagnostic problem solving, which may have exhausted these novices’ cognitive resources; monitoring the execution of learning goals was therefore not observed in this overloaded situation. Instead, the effect of objectives feeds into problem representation and subsequently influences monitoring.

A limitation of this study is that the structural model was verified using PLS-SEM to compensate for the relatively small sample size caused by the limited medical student enrollment. Future studies should implement this survey with a larger group of participants and conduct a confirmatory factor analysis to confirm the constructs of the measurement. In this model, data were collected from medical novices who had some experience of clinical problem solving at the undergraduate level. In the future, it may be worth conducting a novice-expert comparison to examine this conceptual model with medical students who have just begun to immerse themselves in scenarios of solving clinical cases and with medical experts (e.g., doctors in practice) who have gained mastery of diagnostic problem solving. The main purpose of the study was to investigate the relationships within the model of diagnostic problem solving; we have not yet explored the relationships of diagnostic problem solving with other variables such as actual performance in clinical practice. The relationship of diagnostic problem solving with prior experiences or training (e.g., the number of courses taken involving problem-based learning) is also worth exploring, since the influence of prior experience was not taken into consideration in this study. Finally, we assessed dispositions using a self-reported survey, and its correspondence with learners’ actual performance of metacognition (e.g., through think-aloud tasks) has not yet been examined. Some studies have investigated the dynamic process of knowledge activation (conceptual, procedural, and metacognitive knowledge) (e.g., [Citation9,Citation10]). Future studies may explore the correlations between dispositions elicited from a self-reported survey and the deployment of actual metacognitive strategies in real time.

Implications for clinical teaching

Metacognition and cognitive regulation processes are key to improving the teaching of clinical diagnosis. This study can enhance clinical educators’ understanding and their ability to implement effective teaching strategies through two key approaches. First, by examining the interrelationships among the five metacognitive dimensions of diagnostic problem solving, clinical educators can design more effective and engaging lessons for their students in clinical diagnostic education. Second, by engaging in discussions about students’ thinking processes and learning strategies for clinical problems using the concepts and terms outlined in the IMSR-D inventory of this study, learners can improve their ability to identify and assess the type of thinking they are engaged in and make more informed decisions regarding the risk of cognitive errors. Based on these two points, the following suggestions are offered for clinical educators teaching diagnostic reasoning.

Our model revealed a direct relationship between the learners’ objectives and their self-evaluation during reflection. Their learning objectives, including their goals, plans, and strategies, served as the basis for their self-standards and provided reference points for their future self-reflection. We suggest that educators provide metacognitive support prior to a clinical task. This can be accomplished by asking the students questions such as ‘What do you want to achieve in this interview?’ to promote goal setting and ‘How will you approach this patient? Are there different ways to approach the interview?’ to encourage strategic planning. In addition to learning objectives, it is equally crucial for teachers to address and support students’ self-regulation during the diagnostic process. Educators can also use questions based on the IMSR-D as prompts to guide students’ awareness of problem representation, monitoring, and evaluation during diagnostic tasks. These metacognitive strategies will enhance learners’ abilities by directing their focus, raising their awareness, and promoting mastery.

Finally, knowledge of cognition, or the understanding and strategies for controlling one’s own thinking, is crucial. Despite its importance, this aspect of education is often overlooked in the clinical context. A critical part of knowledge of cognition is understanding the distinction between intuitive and analytical modes of thinking. By differentiating these modes, students can better examine their own cognitive processes during diagnostic tasks to overcome cognitive biases and improve their clinical reasoning skills. It is important for educators to emphasize these metacognitive aspects of diagnostic problem solving in their teaching, as it provides the students with a deeper understanding of their own thinking processes and the ability to regulate and improve them.

Conclusion

In this study, we developed a survey with five metacognitive and regulatory factors of clinical problem solving in medical contexts. The results showed that knowledge of cognition and objectives are crucial factors, and that there is a direct relationship among problem representation, monitoring, and evaluation. Metacognitive and regulatory competence in clinical diagnosis is multidimensional with an interconnected structure, and so its dimensions should not be taught in isolation. Incorporating the metacognitive aspects of SRL into the teaching of clinical diagnosis offers a comprehensive framework for education and will equip novice students with the skills to become lifelong learners in response to the highly variable clinical problems of their future workplace.

Acknowledgments

This work was supported by the Ministry of Science and Technology, Taiwan under Grant [MOST108-2511-H-011-008-MY4] and [MOST 110-2511-H-715-002-MY2].

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

The work was supported by the Ministry of Science and Technology, Taiwan [MOST108-2511-H-011-008-MY4]; Ministry of Science and Technology [MOST 110-2511-H-715-002-MY2].

References

  • Thammasitboon S, Rencic JJ, Trowbridge RL, et al. The assessment of reasoning tool (ART): structuring the conversation between teachers and learners. Diagnosis. 2018;5(4):197–10.
  • Colbert CY, Graham L, West C, et al. Teaching metacognitive skills: helping your physician trainees in the quest to ‘know what they don’t know’. Am J Med. 2015;128(3):318–324.
  • Jusue AV, Alonso AT, Gonzalez AB, et al. Learning how to order imaging tests and make subsequent clinical decisions: a randomized study of the effectiveness of a virtual learning environment for medical students. Med Sci Educator. 2021;31(2):469–477.
  • Leeds FS, Atwa KM, Cook AM, et al. Teaching heuristics and mnemonics to improve generation of differential diagnoses. Med Educ Online. 2020;25(1):1742967.
  • Huang GC, Lindell D, Jaffe LE, et al. A multi-site study of strategies to teach critical thinking: ‘Why do you think that?’. Med Educ. 2016;50(2):236–249.
  • Trowbridge RL, Dhaliwal G, Cosby KS. Educational agenda for diagnostic error reduction. BMJ Qual Saf. 2013;22:ii28–32.
  • Boon M, van Baalen S, Groenier M, et al. Interdisciplinary expertise in medical practice: Challenges of using and producing knowledge in complex problem-solving. Med Teach. 2019;41(6):668–677. DOI:10.1080/0142159X.2018.1544417
  • Cleary TJ, Durning SJ, Artino AR. Microanalytic assessment of self-regulated learning during clinical reasoning tasks: recent developments and next steps. Acad Med. 2016;91(11):1516–1521.
  • Artino AR, Cleary TJ, Dong T, et al. Exploring clinical reasoning in novices: a self-regulated learning microanalytic assessment approach. Med Educ. 2014;48(3):280–291.
  • Kiesewetter J, Ebersbach R, Tsalas N, et al. Knowledge is not enough to solve the problems - the role of diagnostic knowledge in clinical reasoning activities. BMC Med Educ. 2016;16:303.
  • Winne PH. Students’ calibration of knowledge and learning processes: implications for designing powerful software learning environments. Int J Educ Res. 2004;41(6):466–488. DOI:10.1016/j.ijer.2005.08.012
  • Rovers SFE, Clarebout G, Savelberg HHCM, et al. Granularity matters: comparing different ways of measuring self-regulated learning. Metacogn Learn. 2019;14:1–19.
  • Vandevelde S, Van Keer H, Schellings G, et al. Using think-aloud protocol analysis to gain in-depth insights into upper primary school children’s self-regulated learning. Learn Individual Differences. 2015;43:11–30.
  • Veenman MVJ, van Hout-Wolters BHAM, Afflerbach P. Metacognition and learning: conceptual and methodological considerations. Metacogn Learn. 2006;1(1):3–14.
  • Perry NE, Rahim A. Studying self-regulated learning in classrooms. In: Zimmerman B, Schunk D, editors. Handbook of self-regulation of learning and performance. New York: Routledge; 2011. pp. 122–136.
  • Gandomkar R, Yazdani K, Fata L, et al. Using multiple self-regulated learning measures to understand medical students’ biomedical science learning. Med Educ. 2020;54(8):727–737.
  • Yazdani S, Abardeh MH. Five decades of research and theorization on clinical reasoning: a critical review. Adv Med Educ Pract. 2019;10:703–716.
  • Poitras EG, Doleck T, Lajoie SP. Towards detection of learner misconceptions in a medical learning environment: a subgroup discovery approach. Educ Technol Res Dev. 2018;66(1):129–145.
  • Croskerry P. Narrowing the mindware gap in medicine. Diagnosis. 2021;9(2):176–183.
  • Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012;21(7):535–557.
  • Royce CS, Hayes MM, Schwartzstein RM. Teaching critical thinking: a case for instruction in cognitive biases to reduce diagnostic errors and improve patient safety. Acad Med. 2019;94(2):187–194.
  • Kiesewetter J, Ebersbach R, Gorlitz A, et al. Cognitive problem solving patterns of medical students correlate with success in diagnostic case solutions. PLoS ONE. 2013;8(8):e71486.
  • Flavell JH. Metacognition and cognitive monitoring: a new area of cognitive–developmental inquiry. American Psychologist. 1979;34(10):906–911.
  • Schraw G, Dennison RS. Assessing metacognitive awareness. Contemp Educ Psychol. 1994;19(4):460–475.
  • Zimmerman BJ. Attaining self-regulation: a social cognitive perspective. In: Boekaerts M, Pintrich P, Zeidner M, editors. Handbook of self-regulation. Academic Press; 2000. pp. 13–39. DOI:10.1016/B978-012109890-2/50031-7.
  • Lajoie SP, Li S, Zheng J. The functional roles of metacognitive judgement and emotion in predicting clinical reasoning performance with a computer simulated environment. Interact Learn Environ. 2021;12. DOI:10.1080/10494820.2021.1931347
  • Ge X, Land SM. A conceptual framework for scaffolding ill-structured problem-solving processes using question prompts and peer interactions. Educ Technol Res Dev. 2004;52(2):5–22.
  • Tsai MJ, Liang JC, Lee SWY, et al. Structural validation for the developmental model of computational thinking. J Educ Comput Res. 2022;60(1):56–73. DOI:10.1177/07356331211017794
  • Lin HM, Lee MH, Liang JC, et al. A review of using partial least square structural equation modeling in e-learning research. Br J Educ Technol. 2020;51(4):1354–1372.
  • Howard BC, McGee S, Shia R, et al. Metacognitive self-regulation and problem-solving: expanding the theory base through factor analysis. Paper presented at: Annual Meeting of the American Educational Research Association; 2000 Apr 24–28; New Orleans, LA.
  • Wang CY. Exploring general versus task-specific assessments of metacognition in university chemistry students: a multitrait-multimethod analysis. Res Sci Educ. 2015;45(4):555–579.
  • Hair JF, Ringle CM, Sarstedt M. PLS-SEM: indeed a silver bullet. J Marketing Theory Pract. 2011;19:139–151.
  • Hair JF, Black WC, Babin BJ, et al. Multivariate data analysis: a global perspective. 7th ed. Upper Saddle River, NJ: Pearson Prentice Hall; 2010.
  • Hair JF, Hult GTM, Ringle CM, et al. A primer on partial least squares structural equation modelling (PLS-SEM). Los Angeles: SAGE Publications; 2014.
  • Chin W, Marcoulides G. The partial least squares approach to structural equation modeling. In: Marcoulides G, editor. Modern methods for business research. Mahwah, NJ: Lawrence Erlbaum Associates; 1998. pp. 295–336.
  • Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775–780.
  • Chew KS, van Merrienboer JJG, Durning SJ. Perception of the usability and implementation of a metacognitive mnemonic to check cognitive errors in clinical setting. BMC Med Educ. 2019;19:18.
  • Salkowski LR, Russ R. Cognitive processing differences of experts and novices when correlating anatomy and cross-sectional imaging. J Med Imaging. 2018;5(3):031411.
  • St Pierre M, Nyce JM. How novice and expert anaesthetists understand expertise in anaesthesia: a qualitative study. BMC Med Educ. 2020;20(262). DOI:10.1186/s12909-020-02180-8
  • Yuruk N, Beeth ME, Andersen C. Analyzing the effect of metaconceptual teaching practices on students’ understanding of force and motion concepts. Res Sci Educ. 2009;39(4):449–475.