
Where do we go from here? Moving from systems-based practice process measures to true competency via developmental milestones

Article: 24441 | Received 25 Mar 2014, Accepted 28 May 2014, Published online: 27 Jun 2014

Abstract

For many educators, it has been challenging to meet the Accreditation Council for Graduate Medical Education's requirements for teaching systems-based practice (SBP). An additional layer of complexity is evaluating competency in SBP, even with the advent of milestones and entrustable professional activities (EPAs). To address this challenge, the authors present the results of a literature review of how SBP is currently being taught, along with recommendations on how to achieve competency in SBP for graduate medical trainees with the use of milestones. The literature review included 29 articles and demonstrated that only 28% of the articles taught more than one of the six core principles of SBP in a meaningful way. Only 7% of the articles received the highest grade of A. The authors summarize four guiding principles for creating a competency-based curriculum that is in alignment with the Next Accreditation System (NAS): 1) the curriculum needs to include all of the core principles of that competency, 2) the objectives of the curriculum should be driven by clinical outcomes, 3) the teaching modalities need to be interactive and clinically relevant, and 4) the evaluation process should be able to measure competency and directly reflect pertinent milestones and/or EPAs. This literature review and the guiding principles provided can help other residency educators develop competency-based curricula that meet the standards of the NAS.

For many educators, it has been challenging to meet the Accreditation Council for Graduate Medical Education's (ACGME) requirements for teaching the competency of systems-based practice (SBP) (Citation1). Unveiled in 1999, the ACGME's Outcomes Project aimed to move from basic knowledge and process measures to an accreditation system focused on competencies. The multi-phase project began by inviting programs to define specific behaviors that would reflect each competency as it pertained to their specialty. Phase II (2002–2006) of the project concentrated on refining the definitions of the competencies and their respective assessment tools. The goal of Phase III (2006–2011) was full integration of the competencies and their assessments into learning and clinical care. The Outcomes Project's goal was clear: ‘Residency programs are expected to phase in assessment tools that provide useful and increasingly valid, reliable evidence that residents achieve competency-based educational objectives’ (Citation2). Most recently, to assist educators in reaching the Outcomes Project's goal, the concepts of developmental milestones (Citation3) and entrustable professional activities (EPAs) (Citation4) have been established.

The educational outcomes are grounded in the six core competencies: 1) patient care; 2) medical knowledge; 3) practice-based learning and improvement (PBLI); 4) SBP; 5) professionalism; and 6) interpersonal skills and communication. Of these competencies, PBLI and SBP are the newest additions. SBP has six core principles and is defined as an ‘awareness of and responsiveness to the larger context and system of health care and the ability to effectively call on system resources to provide care that is of optimal value’ (Citation1). Specifically, all residents need to demonstrate the ability to:

1) work effectively in various health care delivery settings and systems relevant to their clinical specialty; 2) coordinate care within the health care system relevant to their clinical specialty; 3) incorporate considerations of cost awareness and risk benefit analysis in patient and/or population-based care, as appropriate; 4) advocate for quality patient care and optimal patient care systems; 5) work in interprofessional teams to enhance patient safety and improve patient care quality; and 6) participate in identifying system errors and implementing potential solutions.

The goals of the ACGME, as evidenced by Phase II of the Outcomes Project, are for residency programs to use dependable measures to assess residents’ competencies and, subsequently, to use those individual assessments to evaluate the educational effectiveness of the residency as a whole. As clearly defined, these abilities are expected to be specific to each specialty (Citation2) – and each competency should be sufficiently unique to merit its own category. Yet, data have shown that residencies are having difficulty measuring the individual competencies as independent constructs, and assessment tools often measure ‘behaviors that consistently map onto three or more of the general competencies’ (Citation5). As a result, educators continue to struggle with the creation, definition, and categorization of six separate competencies. Furthermore, uncertainty persists as to which assessment tools are most valid. Because true competency extends far beyond simple knowledge, educators are looking for measures that can assess ‘authentic’ behaviors in real-life clinical contexts. To date, only a ‘few models offer clear and direct routes’ for competency assessment without measurement pitfalls (Citation1).

This challenge is especially acute for SBP and PBLI, which are often among educators’ lowest priorities. Indeed, in a survey of family medicine program directors, SBP was rated the least important of all the competencies, and 70% of directors reported having no method to evaluate it (Citation6). A major challenge is that SBP is truly a skill-based activity that cannot be validly captured with simple knowledge-based assessments. In response to these and other similar educational challenges, the ACGME developed 142 discrete curricular milestones that map to the six core competencies (Citation3).

Even with the development of various educational constructs (e.g., competencies, milestones, EPAs), residency programs continue to struggle to incorporate milestones and/or EPAs into their curricular goals, objectives, and observational evaluative tools (Citation4) – and little related information has been reported specific to SBP. Hence, we explored how SBP is being integrated into residency education and whether SBP measures are in alignment with the Next Accreditation System (NAS). We began by reviewing our own internal medicine residency program and found that our SBP curriculum and its assessments measured associated knowledge and attitudes – but probably not true competency (Citation7). It was obvious, then, that we needed to evaluate in far greater detail the six components of SBP as they apply to internal medicine residency programs. To move from process measures to hard outcomes such as SBP-related developmental milestones or EPAs, we conducted an extensive review of the literature on SBP in internal medicine residency programs. The results of that literature review are presented here, along with recommendations on how to achieve competency and best use NAS evaluative measures in SBP for graduate medical trainees.

Methods

Search strategy

We reviewed literature published from 1999 to April 2012 using PubMed, Google Scholar, and ERIC (Fig. 1), using the following key search terms: 1) core competencies; 2) system-based practice; and 3) graduate medical education. This search generated 1,158 titles. Of the original 1,158 titles reviewed, 179 remained after duplicates, books, and titles naming a specialty other than internal medicine were eliminated. The 179 corresponding abstracts were reviewed, and 37 full articles were chosen after applying our inclusion criteria. To these 37 articles, 6 more, gleaned from a bibliographic review, were added. We also supplemented our pool with 7 articles from the ACGME's Outcomes Project web-based reference guide. After carefully reviewing these 50 articles, we excluded 21 that, despite their titles, did not meet our predetermined inclusion criteria.

Fig. 1.  Search strategy for articles on systems-based practice.


Inclusion and exclusion criteria

Abstracts and articles were included if they contained a description of the curriculum and/or the associated evaluation methods. The described curriculum had to focus on the core competency of SBP within US internal medicine residencies. Articles were excluded if they: 1) described foreign medical programs; 2) were primarily intended for medical students or fellows; 3) focused on more than two core competencies; or 4) involved learners in other (non-internal medicine) specialties. Articles discussing SBP and one other core competency (patient care, medical knowledge, PBLI, professionalism, or interpersonal skills and communication) were included. However, opinion pieces, review articles, and consensus conferences were excluded – as were papers from non-peer-reviewed journals.

Articles

As stated, the initial search resulted in 1,158 references (Fig. 1). After the above search strategies, removal of duplicates, and a thorough screening using the inclusion and exclusion criteria, 29 articles were included in the systematic review (Citation8–Citation36). Given the exploratory nature of this study, we had little basis for organizing these studies (Citation5). A subsequent review of the literature helped define SBP and its principles and implied abilities, from which six categories were developed based on Graham's taxonomy of SBP (Citation1) (Table 1). The primary aim of the reviewers, then, was to categorize each article into one of the six groups; however, several articles contained no single primary theme but instead meaningfully addressed more than one of the six SBP principles, and these were grouped under more than one category. This was done only when all three authors agreed.

Table 1 Categories from SBP definition

A quality index, based on definitions from the ACGME's requirements for achieving competency, was established to ‘grade’ articles. This index ranged from A to D (with A being the highest) and was applied to descriptions of the curriculum and evaluation methods. Because our literature review reached back to 1999, the grading system did not include criteria for milestones or EPAs, which emerged much later (Citation3, Citation4). The definition of each grade is listed below and centers primarily on the measured focus of the curriculum and evaluation:

  A. Patient outcome(s);

  B. Behavioral change by the provider (most closely linked to milestones or EPAs);

  C. Perceptions, knowledge, or attitudes of providers and/or patients; and

  D. Description of content only – no evaluation was completed.

Initial classification of category and quality was done independently by two of the authors (JM and EP), with a third reviewer (CH) reconciling disagreements. All reviewers are faculty members involved in residency education. One (JM) is a core faculty member who co-directs two residency educational units, whereas the other (EP) leads the primary care research block. The third reviewer (CH) is the primary care program director.

Results

After systematically culling the larger sample, 29 articles were included in the systematic review (Fig. 1). The two reviewers agreed on 66% (19/29) of category assignments and 90% (26/29) of grade assignments.

Only one internal medicine graduate curriculum, represented in 2 of the 29 articles (7%), incorporated all six principles of SBP into its described curriculum; eight articles (28%) included curricula covering more than one aspect of SBP in a meaningful way. The largest share of articles (45%, or 13/29) was classified as ‘system evaluator’, concentrating on the principle of ‘identifying system errors and implementing potential solutions’, or what is most readily labeled quality improvement (Table 2). Few articles formally addressed coordination of care or patient advocacy.

Table 2 SBP article categorization

Only 7% (2/29) of articles were graded A, the highest quality. These articles represented the ACGME's ‘gold standard’ of assessing mastery of a competency via patient outcomes. These two articles exemplify what most educators set out to achieve when developing curricula, going beyond the typical assessment of learners’ knowledge of content. Interestingly, none of the articles, even those published after 2009, stated specifically how milestones and/or EPAs were addressed in their curricular design and/or evaluation process, if at all. However, both ‘A’ articles were among the more recently published, suggesting that educators may be refining curricula to mirror more closely the goals of the Outcomes Project and NAS (Table 3).

Table 3 Quality index grade of reviewed literature

Study and search limitations

Our search included only published, peer-reviewed articles, each of which contained its own set of limitations and possible biases. Also, the articles did not use uniform definitions or terms, and our review focused exclusively on internal medicine residency programs. Lastly, because existing rating criteria had not previously been applied to the medical education literature, our development and use of the present coding scheme is somewhat novel and, as such, of unknown validity.

Discussion

This comprehensive literature review suggests that efforts are underway to incorporate SBP into internal medicine residency education, with isolated examples geared toward evaluating competency in this arena. Yet, the data also indicate that residency programs are lagging behind the Outcomes Project's goal of full integration by 2011, the largest barrier seemingly being the design of curricula and operational measures that can independently assess each of the six competencies, even with the inclusion of milestones. This review shows that despite much curricular development, there appears to be little agreement and few data regarding best practices in assessing learner competence, especially with regard to SBP and its six core principles. Furthermore, areas that require behavior- or performance-based assessment (e.g., milestones, EPAs) are, in contrast to knowledge and written assessments (Citation37), still among the least developed.

Before moving ahead to Phase IV (2011 and beyond) of the Outcomes Project, educators must verify that the first three phases have been successfully completed. This review of the literature has shown that, in the ACGME's attempt to make definitions widely applicable to all programs and institutions, there exists a ‘lack of enough specificity to move our understanding towards what exactly it is residents need to demonstrate in order to indicate proficiency in SBP’ (Citation38). Each article described different methods and approaches, and the data show that most curricula concentrate efforts on only one of the six principles of SBP. As a result, there is much progress to make before SBP can be comprehensively taught and assessed as a core competency.

Lessons learned and guiding principles

Despite the abovementioned challenges, educators should focus on the practical value of proposed educational changes. As next steps, we as an educational community must agree upon a basic model of teaching and assessing each competency that also leaves room for divergence and innovation. Keeping in mind the ‘gold standard’ of improved patient care, we must move towards hard outcome measures of competency – for example, documenting milestones and completion of EPAs (Citation39, Citation40).

We offer four guiding principles that may inform educators when creating or restructuring SBP curricula. These principles were applied to our own SBP program efforts (Citation7), but they can just as easily be applied to a curriculum addressing any of the six core competencies: patient care, medical knowledge, PBLI, SBP, professionalism, and interpersonal skills and communication.

  1. The curriculum must include all core principles of the specified competency.

    In a hypothetical example of reformatting an SBP curriculum, the knowledge content should first be reviewed and grouped into the six core, defining principles (i.e., system consultant, care coordinator, resource manager, patient advocate, team collaborator, and system evaluator). If any core principle is not represented, additional material needs to be created to fill the gap.

  2. The curricular objectives should be driven by clinical outcomes.

    After categorizing the content, examples of clinical behaviors or patient outcomes can be linked accordingly. The categorization should align not just with the six basic principles of SBP but also with the 21 milestones (Citation3) that map to the core competency of SBP (Table 4). For example, being a competent system consultant means being able to navigate a patient through a complete Medicaid application or to help an elderly patient choose a prescription drug plan under Medicare Part D. Assistance with this real clinical experience could map to the milestone of ‘reflect awareness of common socioeconomic barriers that affect patient care’ or, if a social worker is involved to complete the task, to the milestone of ‘appreciate role of various health care providers … including social workers’. Under team collaborator, a resident may consult with a nutritionist to create a culturally tailored diet that helps improve a diabetic patient's glucose control. The examples would then be listed for each category of SBP and its corresponding milestone (Table 4).

  3. The teaching modalities should be interactive and clinically relevant.

    The concrete examples outlined above should be incorporated into the curriculum via a teaching strategy that is interactive and clinically based. Some didactic sessions may be warranted, but other pedagogies should be considered. The degree of innovation will depend on institutional resources, but educational technology such as podcasts (if available) can often overcome scheduling or time barriers. Other examples include case series, peer-assisted learning, simulations, and direct observation. The teaching and learning modality should also inform the evaluation process.

  4. The assessment of competency should utilize milestones and/or EPAs.

    The focus of course evaluation should shift from content to assessment, and evaluative tools should focus on patient outcomes or physician behaviors – such as chart reviews, patient satisfaction, or Plan, Do, Study, Act (PDSA) cycles. For example, in addressing the ‘resource manager’ component, one could conduct chart reviews to see how often residents ask about patients’ ability to manage medication costs and, when a problem is identified, how the resident addressed it. For the ‘team collaborator’ aspect, a 360-degree evaluation of a resident involving ancillary staff, social workers, and nurses could measure specific physician behaviors toward members of the interprofessional team. Other assessment tools to consider include direct observation of SBP skills – such as observing, scoring, and documenting resident–patient encounters around the use of interpreters. There are many other examples; the emphasis should remain on curricular objectives driven by clinical outcomes and tied to a milestone (Table 4) or EPA.

Table 4 Linking SBP category to milestones

Conclusion

Many of the lessons learned in reformatting our own SBP curriculum centered on the best delivery of teaching methods and assessment tools, both of which should align with current changes in health care and health professions education. Although we cannot yet share outcomes from our reformatted curriculum (Citation7), the lessons learned may assist other programs in enhancing or creating their own SBP curricula as mandated by the ACGME. We believe our experience, this literature review, and the guiding principles provided can assist in the development of competency-based curricula.

Conflict of interest and funding

All three authors report no conflict of interest.

Acknowledgement

This publication was made possible by Grant Number 1P60MD003421 from the National Institute on Minority Health and Health Disparities. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH/NIMHD.

References

  • Graham MJ, Naqvi Z, Encandela J, Harding KJ, Chatterji M. Systems-based practice defined: taxonomy development and role identification for competency assessment of residents. J Grad Med Educ. 2009; 1: 49–60.
  • Accreditation Council for Graduate Medical Education. Outcomes project. Available from: http://www.acgme.org/outcome/comp/compHome.asp [cited December 2011].
  • Green ML, Aagaard EM, Caverzagie KJ, Chick DA, Holmboe E, Kane G, et al. Charting the road to competence: developmental milestones for internal medicine residency training. J Grad Med Educ. 2009; 1: 5–20.
  • Alliance for Academic Internal Medicine. Internal medicine end of training EPAs. Available from: http://www.im.org/ACADEMICAFFAIRS/MILESTONES/Pages/EndofTrainingEPAs.aspx [cited 7 November 2013].
  • Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the accreditation council for graduate medical education: a systematic review. Acad Med. 2009; 84: 301–9.
  • Delzell JE Jr, Ringdahl EN, Kruse RL. The ACGME core competencies: a national survey of family medicine program directors. Fam Med. 2005; 37: 576–80.
  • Martinez J, Phillips E, Fein O. Perspectives on the changing healthcare system (POCHS): teaching systems-based practice to medical residents. Med Educ Online. 2013; 18: 20746.
  • Allen E, Zerzan J, Choo C, Shenson D, Saha S. Teaching systems-based practice to residents by using independent study projects. Acad Med. 2005; 82: 125–8.
  • Amin AN, Rucker L. A systems-based practice curriculum. Med Educ. 2004; 38: 568.
  • Crites GE, Schuster RJ. A preliminary report of an educational intervention in practice management. BMC Med Educ. 2004; 4: 15.
  • Daniel DM, Casey DE Jr, Levine JL, Kaye ST, Dardik RB, Varkey P, et al. Taking a unified approach to teaching and implementing quality improvements across multiple residency programs: the Atlantic health experience. Acad Med. 2009; 84: 1788–95.
  • David RA, Reich LM. The creation and evaluation of a system-based practice/managed care curriculum in a primary care internal medicine residency program. Mt Sinai J Med. 2005; 72: 296–9.
  • Eiser AR, Connaughton-Storey J. Experiential learning of systems-based practice: a hands-on experience for first-year medical residents. Acad Med. 2008; 83: 916–23.
  • Englander R, Agostinucci W, Zalneraiti E, Carraccio CL. Teaching residents systems-based practice through a hospital cost-reduction program: a ‘win-win’ situation. Teach Learn Med. 2006; 18: 150–2.
  • Eskildsen MA. Review of web-based module to train and assess competency in systems-based practice. J Am Geriatr Soc. 2010; 58: 2412–13.
  • Gakhar B, Spencer AL. Using direct observation, formal evaluation, and an interactive curriculum to improve the sign-out practices of internal medicine interns. Acad Med. 2010; 85: 1182–8.
  • Hingle S, Rosher RB, Robinson S, McCann-Stone N, Todd C, Clark M. Development of the objective structured system-interaction examination. J Grad Med Educ. 2009; 1: 82–8.
  • Hingle ST, Robinson S, Colliver JA, Rosher RB, McCann-Stone N. Systems-based practice assessed with a performance-based examination simulated and scored by standardized participants in the health care system: feasibility and psychometric properties. Teach Learn Med. 2011; 23: 148–54.
  • Kirsh SR, Aron DC. Integrating the chronic-care model and the ACGME competencies: using shared medical appointments to focus on systems-based practice. Qual Saf Health Care. 2008; 17: 15–19.
  • Korn LM, Reichert S, Simon T, Halm EA. Improving physicians’ knowledge of the costs of common medications and willingness to consider costs when prescribing. J Gen Intern Med. 2003; 18: 31–7.
  • Kravet SJ, Wright SM, Carrese JA. Teaching resource and information management using an innovative case-based conference. J Gen Intern Med. 2001; 16: 399–403.
  • Leenstra JL, Beckman TJ, Reed DA, Mundell WC, Thomas KG, Krajicek BJ, et al. Validation of a method for assessing resident physicians’ quality improvement proposals. J Gen Intern Med. 2007; 22: 1330–4.
  • Nabors C, Peterson SJ, Weems R, Forman L, Mumtaz A, Goldberg R, et al. A multidisciplinary approach for teaching systems-based practice to internal medicine residents. J Grad Med Educ. 2011; 3: 75–80.
  • Nagler A, Andolsek K, Dossary K, Schlueter J, Schulman K. Addressing the systems-based practice requirement with health policy content and educational technology. Med Teach. 2010; 32: 559–65.
  • Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM's practice improvement modules. J Gen Intern Med. 2008; 23: 927–30.
  • Perez JA Jr, Faust C, Kenyon A. The virtual practice: using the residents’ continuity clinic to teach practice management and systems-based practice. J Grad Med Educ. 2009; 1: 104–8.
  • Peters AS, Kimura J, Ladden MD, March E, Moore GT. A self-instructional model to teach systems-based practice and practice-based learning and improvement. J Gen Intern Med. 2008; 23: 931–6.
  • Reznek MA, DiGiovine B, Kromrei H, Levine D, Wiese-Rometsch W, Schreiber M. Quality Education and Safe Systems Training (QuESST): development and assessment of a comprehensive cross-disciplinary resident quality and patient safety curriculum. J Grad Med Educ. 2010; 2: 222–7.
  • Sehgal NL, Fox M, Vidyarthi AR, Sharpe BA, Gearhart S, Bookwalter T, et al. A multidisciplinary teamwork training program: the Triad for Optimal Patient Safety (TOPS) experience. J Gen Intern Med. 2008; 23: 2053–7.
  • Sherman SE, Fotiades J, Rubenstein LV, Gilman SC, Vivell S, Chaney E, et al. Teaching systems-based practice to primary care physicians to foster routine implementation of evidence-based depression care. Acad Med. 2007; 82: 168–75.
  • Tartaglia KM, Press VG, Freed BH, Baker T, Tang JW, Cohen JC, et al. The neighborhood health exchange: developing a community partnership in residency. J Grad Med Educ. 2010; 2: 456–61.
  • Tomolo A, Caron A, Perz ML, Fultz T, Aron DC. The outcomes card. Development of a systems-based practice educational tool. J Gen Intern Med. 2005; 20: 769–71.
  • Turley CB, Roach R, Marx M. Systems survivor: a program for house staff in systems-based practice. Teach Learn Med. 2007; 19: 128–38.
  • Wittich CM, Reed DA, McDonald FS, Varkey P, Beckman TJ. Perspective: transformative learning: a framework using critical reflection to link the improvement competencies in graduate medical education. Acad Med. 2010; 85: 1790–3.
  • Ziegelstein RC, Fiebach NH. ‘The mirror’ and ‘the village’: a new method for teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004; 79: 83–8.
  • Zupancic M, Yu S, Kandukuri R, Singh S, Tumyan A. Practice-based learning and systems-based practice: detection and treatment monitoring of generalized anxiety and depression in primary care. J Grad Med Educ. 2010; 2: 474–7.
  • Frohna JG, Kalet A, Kachur E, Zabar S, Cox M, Halpern R, et al. Assessing residents’ competency in care management: report of a consensus conference. Teach Learn Med. 2004; 16: 77–84.
  • Graham MJ, Naqvi Z, Encandela JA, Bylund CL, Dean R, Calero-Breckheimer A, et al. What indicates competency in systems based practice? An analysis of perspective consistency among healthcare team members. Adv Health Sci Educ Theory Pract. 2009; 14: 187–203.
  • Tamblyn R, Abrahamowicz M, Dauphinee WD, Hanley JA, Norcini J, Girard N, et al. Association between licensure examination scores and practice in primary care. JAMA. 2002; 288: 3019–26.
  • Tamblyn R, Abrahamowicz M, Dauphinee D, Wenghofer E, Jacques A, Klass D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA. 2007; 298: 993–1001.