
Social, Emotional, and Behavioral Assessment Within Tiered Decision-Making Frameworks: Advancing Research Through Reflections on the Past Decade

Pages 1-5 | Received 08 Mar 2021, Accepted 18 Mar 2021, Published online: 07 Jul 2021

Abstract

Schools play a critical role in service delivery related to students’ social, emotional, and behavioral (SEB) health. A foundation for effective SEB service delivery includes the capacity to identify students in need of support, differentiate which supports are needed, and monitor students’ responsiveness to intervention. Assessment is critical for each of these goals, providing the data needed to make effective decisions across the continuum of service delivery. This special issue provides an opportunity to reflect on the past decade of research that has strengthened the psychometric evidence for measures used in SEB assessment, as well as to address several themes that characterize recent SEB assessment research. In this introductory article, we introduce a series of original articles and commentaries addressing these critical issues, which are timely and relevant to the future path of SEB assessment. Taken together, the current issue brings existing evidence on multitiered problem-solving frameworks to bear on the SEB domain and proposes future directions for research and practice.

Impact Statements

The paper describes key themes related to the current status and future directions of social, emotional, and behavioral assessment. Those themes are organized around usability, rater effects, a balanced focus on assets and deficits, and equity considerations.

A decade ago, School Psychology Review featured a special issue on behavioral assessment within problem-solving models (Chafouleas et al., Citation2010). The focus was on possibilities for extending the insights of prevention-oriented multitiered systems of support (MTSS) in academics to social, emotional, and behavioral (SEB) domains. Collectively, the articles in that issue highlighted cutting-edge advances in SEB assessment along with outstanding questions and directions for building SEB-focused MTSS. In the decade since the publication of that special issue, many of the issues posed have been taken up by researchers. Methodologies that were just emerging in 2010 have now become well established (e.g., direct behavior ratings; Chafouleas et al., Citation2010), and the technological sophistication of SEB measurement in general has grown significantly (e.g., Anthony et al., Citation2016; Pendergast et al., Citation2017).

Despite progress, however, other issues persist, with a critical need to dive deeper into understanding whether the shift to a prevention-oriented framework in SEB assessment has fulfilled its promise. That is, work is needed to advance beyond traditional psychometric considerations in assessment development to a comprehensive evaluation of the intended and unintended consequences of use. Such evaluation must engage a social justice lens as its foundation. Relatedly, despite a decade of research to develop viable measures for SEB-focused MTSS, implementation of recommended school-based SEB assessment procedures continues to lag far behind similar academic-focused systems (Briesch et al., Citation2018). Limited effective uptake and sustainment threaten the promise of SEB-focused MTSS to better the lives of all school children, calling for work to study implementation features of integrated MTSS models that engage a whole-child picture to fully inform decisions about supports. With this context in mind, the current special issue affords the opportunity to consider the progress made in the past decade of research on SEB assessment. We highlight continued technical, conceptual, and applied issues, and identify signposts for the next decade of SEB assessment work necessary to realize the full potential of SEB-focused MTSS.

ISSUES IN SEB ASSESSMENT: QUESTIONS (UN)RESOLVED OR (UN)ADDRESSED

The issues discussed in the articles comprising this issue are by no means an exhaustive account of the work done over the past decade, nor of the gaps that should drive future research. Together, the issues raised provide an opportunity to reflect on four critical issues representative of the progress, promise, and future path of SEB assessment.

USABILITY

The first issue relates to usability, which has been defined as including multiple dimensions such as user knowledge and acceptability, as well as feasibility of and support for use (Briesch et al., Citation2013). Each dimension can serve as a barrier or facilitator to effective use. With regard to SEB assessment, a perennial issue facing scientific measurement across all disciplines is the balance between the technical sophistication and the usability of assessment instrumentation and procedures. Over the past decade, there have been several advances in improving the technical properties of SEB assessments in school psychology (e.g., Pendergast et al., Citation2017), and numerous assessment-related investigations utilizing sophisticated analytic techniques (e.g., Miller et al., Citation2018). Yet several authors have questioned whether technical advances in school psychological assessments have ultimately improved the utility of assessments (e.g., Kazdin, Citation2005; von der Embse & Kilgus, Citation2018). Although there has been a recent increase in interest in SEB competencies from the general public and policy makers, such as the inclusion of a “nonacademic” indicator of school success in the Every Student Succeeds Act (P.L. 114-95), the current landscape suggests SEB assessments are underused (Briesch et al., Citation2018). Relatedly, the emphasis on advanced psychometric techniques may have resulted in some neglect of more context-focused assessment approaches, such as functional assessment, which have an established record of utility. As such, this special issue highlights the need for renewed efforts to promote the usability of SEB assessments, furthering the integration of psychometrically advanced assessments within existing, context-focused problem-solving paradigms.

RATER EFFECTS

A second area of tension within SEB assessment research involves increasing concern regarding the indirect nature of many SEB assessment methods. Indeed, one of the major goals for the next generation of SEB assessment identified by Chafouleas et al. (Citation2010) was to establish and improve the utility and technical rigor of existing approaches to assessment, such as traditional behavior rating scales. This challenge has been met with several investigations (e.g., Anthony et al., Citation2016; Gresham et al., Citation2010; Moulton et al., Citation2019) and with methods for increasing the efficiency of traditional rating scales, aimed at making these measures feasible for widescale use in MTSS. As a result, there is a growing number of brief, high-quality rating scales for use in MTSS-type systems.

Despite this progress, fundamental limitations of indirect measurement approaches such as rating scales persist. Indeed, because of the realities of large-scale or repeated assessment, limitations of indirect measurement—especially the error-prone and bias-laden nature of rater-mediated assessment—are perhaps more problematic now than ever before. For example, in contrast to traditional paradigms, in which outlier scores from individual raters on behavior rating scales could often be counterbalanced by scores from other informants, universal screening and progress monitoring assessments often rely on scores generated by single raters. As such, there is a risk of conflating students’ competencies or problems with raters’ systematic rating patterns, idiosyncrasies, and biases. Recent investigations (e.g., Splett et al., Citation2018; Tanner et al., Citation2018) have highlighted these issues, but few solutions are yet forthcoming.

DUAL-FACTOR SEB ASSESSMENT FOCUS

A third issue relates to the competencies targeted by SEB assessments. Traditionally, SEB well-being has been defined as the absence of illness or deficits (Suldo & Shaffer, Citation2008), and thus SEB assessments in schools have primarily aimed to identify and reduce risk among youth. Although one would be hard-pressed to argue against the benefits of assessing and intervening on symptoms, the absence of symptoms does not guarantee the presence of psychological assets. Research supports a renewed focus on psychological assets and their relationship to youth’s well-being and success in schools (e.g., Furlong et al., Citation2014). Assessment approaches that focus on student resilience can direct classroom- and school-level resources toward building student strengths and assets that promote well-being in schools. The integration of risk and assets in assessment can contribute to a more comprehensive picture of student functioning (Huebner et al., Citation2007), which can better inform data-based decision-making in school service delivery.

EQUITY IN SEB ASSESSMENT

The fourth issue brings attention to equity in SEB assessment, highlighting the need for intense attention to work in areas such as disproportionality and appropriate cultural adaptations within assessment practices. As the previous special issue promoted, we continue to advocate for incorporating SEB assessments within a framework that considers children’s contextual factors (e.g., classroom, school, and community) as key to equitable and effective data-based decision making. Assessment approaches that are reactive, that is, those engaged after the emergence of a concern and often used to confirm a problem within a child, lead to a host of serious issues in exclusion, disparity, and inequity that counter a commitment to social justice. Universal screening has been advocated as a promising solution, yet work is needed to fully understand that promise. Universal screening research, for example, could identify how best to help practitioners understand strengths and needs in broader contexts such as the classroom and school levels rather than focusing solely on deficits of the individual child. In this approach, assessment data can be used to strengthen the environments in which students learn SEB competencies. Furthermore, identifying appropriate raters and examining rater effects on SEB assessments may reduce disproportionalities. For example, although behavior rating scales have been considered a potential method for addressing disproportionalities (Raines et al., Citation2012), if rater-mediated assessments are susceptible to rater effects or, worse, rater biases, such promise would seem to be limited, if not eliminated.

OVERVIEW OF THE CONTRIBUTING ARTICLES

The articles included in this special issue represent directions forward for strengthening the assessment foundations of SEB-focused MTSS. Taking their cues from prior work, these articles raise important issues in current SEB assessment research. The first article, by Brann et al. (Citation2022), provides a systematic review of the usability of SEB assessment tools within MTSS. Over the past decade, substantial work has been conducted to develop technically adequate assessments for use in screening and progress monitoring, yet the extent to which usability has been investigated is unknown. Results suggest a dearth of attention to the evaluation of multiple dimensions of usability, as most existing studies focused on appropriateness for the intended use and technical adequacy. Among the articles that did evaluate usability, the majority examined teacher perceptions of feasibility and acceptability. Overall, the results support the need for future research evaluating the usability of SEB screening and progress monitoring measures.

The second article, by Anthony et al. (Citation2022), offers a unique evaluation of rater effects on SEB assessments using Many-Facet Rasch Measurement. Many, if not most, assessments of SEB functioning used for screening, diagnostic, and intervention decisions are rater-mediated assessments (e.g., behavior rating scales; Engelhard & Wind, Citation2018). Rater-mediated assessments measure students’ SEB functioning indirectly, through the lens of a rater. Therefore, scores from SEB assessments represent both student SEB functioning and rater severity/leniency. Although prior research has established the presence and scope of rater effects (e.g., Splett et al., Citation2018), the typically employed methodology does not allow for evaluation of how rater effects affect score utility and validity. Using a novel methodological approach, Anthony et al. (Citation2022) statistically correct for rater effects in a sample of students rated on a short form of the Academic Competence Evaluation Scales. Their results indicate that rater effects have little impact on group-level score relations (i.e., convergent validity correlations) but more substantial effects at the individual score level. Implications and future research directions are discussed.

Next, Kim and Choe (Citation2022) examine the integration of risk and strength indicators in universal SEB screening in South Korea. The nationwide SEB screening in K–12 schools primarily focuses on identifying SEB risk, with little emphasis on SEB strengths and resources. The authors suggest that identifying both SEB risks and strengths may expand the goal of school-based mental health support and benefit all students rather than only the smaller number of students at high risk. Specifically, students who are not exhibiting urgent needs but are just “getting by” (i.e., students low in both symptoms and strengths; Suldo & Shaffer, Citation2008) can have the opportunity to receive strength-based, early preventive care. This study provides additional evidence supporting the technical adequacy of the screening tools, as well as the incremental utility of an SEB strength screener in predicting students’ subjective well-being in schools. Kim and Choe discuss the use of SEB screening tools for problem-solving at each tier of services, the benefits and limitations of using self-report measures in SEB screening, and directions for considering cultural factors in future screening research.

Three commentaries provided by leading experts in SEB assessment highlight the progress made over the past decade in school- and community-based SEB assessment practice and research, and offer their recommendations for future directions. Specifically, the first commentary provided by Cook, as one of the editors from the first SPR special issue on SEB assessment, reflects on the progress made in school-based SEB assessments within problem-solving models. Next, the commentary by Lane ties together the special issue by integrating the past and future of SEB assessment, leading to the next steps for researchers and practitioners engaged in SEB assessment intended to prevent the incidence of emotional and behavioral disorders. Finally, Pickens offers commentary on the intersection of community and school, advocating for culturally responsive assessment practices that promote equity and reduce disproportionality in education.

In conclusion, great progress has been made over the past decade of SEB assessment research, yet the themes presented in this issue reflect tensions that should propel the next generation of SEB assessment research toward fulfilling the promise of integrated MTSS in supporting whole-child success. As Ken Merrell (Citation2010) so aptly noted in the commentary provided for the first special issue, “It is undeniable that we have to do something to improve our services to students with behavioral and mental health problems. Assessment is one of those things we have to do” (p. 425). Successful SEB support in schools includes efficient and effective data-based decision making to identify students in need of support, inform which supports are needed, and monitor responsiveness to those supports. Assessment drives the capacity to meet these goals; thus, it is indeed one of those things that demands continued inquiry to advance our knowledge about how to do it well.

Guest Editors’ Note

This proposal was developed in collaboration with the 2019 SPRCC group on behavior assessment, with members including Drs. Christopher Anthony, Kristy Brann, Sandra Chafouleas, Brian Daniels, Thomas Gross, and Eui Kyung Kim.

Additional information

Notes on contributors

Eui Kyung Kim

Dr. Eui Kyung Kim (She/Her) is an assistant professor in the School Psychology program at the University of California, Riverside. Her research interests focus on understanding the pathways to risk and resilience among children and adolescents. She has been conducting research on universal mental health screening, early identification, and prevention services for children’s social and emotional health. She also conducts research internationally, examining the social and emotional development of children and adolescents from diverse cultural and linguistic backgrounds.

Christopher J. Anthony

Christopher J. Anthony, PhD, is an assistant professor in the School of Special Education, School Psychology, and Early Childhood Studies in the College of Education at the University of Florida. His research focuses broadly on improving the assessment of positive student competencies, especially academic enablers and social and emotional learning.

Sandra M. Chafouleas

Sandra M. Chafouleas, PhD, is a Board of Trustees Distinguished Professor in the Department of Educational Psychology within the Neag School of Education at the University of Connecticut. Dr. Chafouleas served as the lead editor of the 2010 SPR special issue on school-based behavioral assessment within problem-solving models and served as the 2019 SPRCC catalyst scholar for the authors and editors of the proposed special issue. Dr. Chafouleas has authored over 150 publications primarily related to school mental health and behavior assessment. She is a fellow of both the American Psychological Association and the Association for Psychological Science, and is a past president of the Society for the Study of School Psychology.

