
Data to Inform Academic Instruction for Students with Extensive Support Needs: Availability, Use, and Perceptions

ABSTRACT

Students with extensive support needs (ESN), most of whom are taught in separate settings by special educators without extensive academic preparation, have difficulty making progress in the general education curriculum. Although there is evidence that data-based decision-making improves achievement for a wide range of students, there is little research on teachers’ data use for academic instruction of students with ESN. In this study, based on 403 survey responses from teachers in nine states, we examined the types of data that were available, how teachers perceived and used data, and how their perceptions were related to data use. Respondents more often reported having access to teacher-developed data sources than to external data (e.g., published assessments, local assessments). Even when external data sources were available, respondents more often used teacher-developed data. Respondents most often reported using data to set goals and tailor instruction, and least often to group students. Attitudes toward data and perceptions of personal data competence were positively associated with data use. School leadership and organizational supports were associated only with decisions about grouping students. Data use also differed across groups of respondents depending on the type of data they perceived as useful.

Teachers are expected to use assessment data to improve their instructional practice (Fuchs, 2004) and, in turn, improve student outcomes (e.g., Every Student Succeeds Act, 2015; Jung et al., 2018). Special education teachers use data for a variety of purposes, including monitoring student progress in the general education setting, identifying students who need additional supports or services, tracking progress toward Individualized Education Program (IEP) goals, and evaluating interventions (Filderman et al., 2020). Using student data to make decisions is a key component in response to intervention frameworks (Filderman et al., 2020) and multitier systems of support models (Espin et al., 2017). However, students with extensive support needs (ESN) are typically excluded from the research base for whole-class instruction and tiered support models for students with disabilities (Thurlow et al., 2020).

Students with ESN typically have ongoing, pervasive needs in academics and daily-living settings and can have disabilities such as intellectual disability, autism, and multiple disabilities; they often participate in their state’s alternate assessments (Kurth et al., 2019). Despite the 1997 Individuals with Disabilities Education Act (IDEA) requirement to include students with ESN in the general curriculum, as well as more recent guidance (U.S. Department of Education, 2017) confirming IEPs should be ‘appropriately ambitious’ (p. 7), students with ESN have historically faced barriers to accessing grade-level academic content (Agran et al., 2020) and demonstrating academic progress (e.g., Kurth & Mastergeorge, 2010). Many students with ESN do not have access to the same grade-appropriate instruction as their general education peers, and most receive their academic instruction in self-contained or separate settings (Kurth et al., 2019) from special education teachers with minimal preparation to teach academics (Sindelar et al., 2014). There is limited research exploring teachers’ use of instructional data to support academic progress for students with ESN (Ruhter & Karvonen, 2023).

Data-based decision-making

Data-based decision-making (DBDM) is an instructional practice shown to improve student performance (e.g., Filderman et al., 2018; Stecker et al., 2005). DBDM is the process of continuously collecting student data and using the data to inform next steps in instruction to improve student outcomes (Filderman et al., 2018). This general approach is also called data-based instruction (Lembke et al., 2018), data-driven decision-making (Datnow & Hubbard, 2016), or data-based individualization (Jung et al., 2018). Eight steps are common to the DBDM literature (Ruhter & Karvonen, 2023):

  1. Determine present levels of performance.

  2. Set instructional goal.

  3. Deliver instruction.

  4. Use data to monitor student progress toward the goal.

  5. Use decision rules to evaluate student progress and instructional effectiveness.

  6. Hypothesize about the student’s progress and instructional needs.

  7. Implement changes to instruction.

  8. Repeat the cycle.

Data sources that are typically available to teachers to guide academic instructional decisions may be classified in several ways. Wayman et al. (2016) identified four categories: state, periodic, local, and personal (teacher) data. In special education, teachers gather additional data to document student progress toward IEP goals and objectives. Progress monitoring may include formative assessments and curriculum-based measures (CBMs), both of which are key to multitier systems of support models (Wakeman et al., 2021).

Teachers have access to large amounts of data (Datnow & Hubbard, 2016). However, collecting and analyzing data, and then using results to guide next steps in instruction, is a complex and nuanced process that requires skills in data interpretation and analysis, content knowledge, and pedagogical content knowledge (Mandinach & Gummer, 2016) to translate formal and informal assessment data into ‘actionable knowledge and practices’ (Gummer & Mandinach, 2015, p. 2).

The general education literature identifies several factors that influence data use and decision-making, including teacher collaboration time, coaching, structured training, university partnerships, school leadership, and teacher beliefs (Datnow & Hubbard, 2016). Training and professional development (PD) in DBDM may be limited to data-information systems rather than deep engagement with data (Datnow & Hubbard, 2016). DBDM may also be difficult for teachers because of the demands of data literacy (e.g., Espin et al., 2017; Marsh et al., 2015). For example, teachers may have difficulty interpreting graphed data, they may not know which instructional changes to make according to the data, or they may not believe the data are accurate (Espin et al., 2017).

Teachers also have preexisting beliefs about the value of data and evidence that shape their DBDM processes (Coburn & Turner, 2011). Mental models of data use (Jimerson, 2014) are influenced by training, collaboration, and personal experience; those models can be rigid and not easily challenged by new ideas. Teacher self-efficacy, combined with anxiety about the data-driven decision-making process, can affect teachers’ ability to use data effectively to inform their instruction (Dunn et al., 2013).

Special educator decision-making

Studies on instructional decision-making with teachers of students with ESN are often set in the context of broad curricular decision-making rather than academic DBDM models. For example, Ruppar et al. (2015) conducted a qualitative study to determine what influences special educators’ curricular decision-making in literacy for students with severe disabilities. The authors identified four factors influencing decision-making: (a) contexts (e.g., staff and materials, school culture); (b) beliefs about students, teaching, and learning (e.g., individualization of lessons for students); (c) expectations (e.g., assumptions about causes of student learning and students’ capabilities to learn); and (d) self-efficacy (e.g., belief that they can teach the content effectively). Teachers report considering students’ characteristics and perceived capabilities when making instructional and curricular decisions (Ruppar et al., 2015; Timberlake, 2014).

There is some evidence that teachers of students with ESN will use a data collection system if one is made available. Siuty et al. (2018) examined the influence of curriculum on middle school teachers’ decisions about reading instruction for students with severe disabilities. Some teachers in the study used a prescribed literacy curriculum that included a computer-based assessment and progress-monitoring component. Despite concerns about the prescribed curriculum (e.g., decreased teacher autonomy, low expectations), teachers did report increased self-efficacy. They also made more data-informed decisions about their students’ instruction than those in the control group, who were given no curriculum. The control group’s decision-making processes were based more on their beliefs about student learning and their prior experiences than on student instructional data.

Although use of assessment data is considered a high-leverage practice in special education (McLeskey et al., 2017) and is the subject of teacher PD (Datnow & Hubbard, 2016), there are few studies examining teacher PD on DBDM for teachers of students with ESN. A recent systematic literature review of DBDM PD for teachers of students with ESN found few studies, most of which were grounded in decision-making models designed in the 1980s (Ruhter & Karvonen, 2023), when the curricular focus was primarily functional skills (Dymond & Orelove, 2001). The studies focused on teachers’ decisions about discrete skills (e.g., sight-word identification), raising questions about whether these approaches can support teachers in evaluating student progress on the full breadth of academic expectations each year, or combining their understanding of data with academic content and pedagogical content knowledge to make sound decisions (Mandinach & Gummer, 2016).

Compared to literature on DBDM in general education and in interventions for students with high-incidence disabilities, there are gaps in knowledge about the kinds of academic data on students with ESN that are available to teachers, how teachers use data they have, and the factors that may influence their data use. The purpose of this study is to explore how teachers of students with ESN use data in the context of academic instructional decision-making. Specifically, we addressed the following research questions (RQ):

  1. What kinds of academic data about students with ESN do special educators have access to?

  2. How often do special educators use data for academic instructional purposes?

  3. What are special educators’ perceptions of data and data use?

  4. How are special educators’ perceptions related to their use of data to make academic instructional decisions for students with ESN?

Method

Participants

The theoretical (target) population is licensed teachers in the United States who provide instruction in English language arts, mathematics, or science to students with ESN (i.e., students with significant cognitive disabilities who participate in their state’s alternate assessment) in grades 3–12. We relied on methods we have used in many previous studies to gain access to the population. To identify the sampling frame, we shared study information with state education agency staff in the accessible population: states that used the Dynamic Learning Maps (DLM) Alternate Assessment System during the 2021–2022 school year. We chose these states for two reasons. First, we have long-standing relationships with these state agencies, and they have been partners in past research projects. Second, unlike many alternate assessments, DLM summative results include fine-grained information about students’ academic skills. There is evidence that teachers can use those results to guide instructional decisions in the subsequent year (Clark et al., 2023). About one-third of states that use the DLM system follow an assessment model in which they administer teacher-selected assessments throughout the year, adjacent to instruction, and results are available on demand to guide instructional decisions during the year (Clark & Karvonen, 2021). Thus, teachers in these states theoretically have access to state assessment data that they might find relevant for within-year instructional decision-making.

States decided whether to promote awareness of the study to their educators. The sampling frame was delimited to teachers in the nine states that chose to participate. Those states have a range of population sizes and urban densities and represent geographic locations primarily in the Midwest, South, and Northeast. Based on DLM assessment data, the smallest participating state had fewer than 1,200 students taking DLM assessments, and the largest had more than 30,000. Teacher recruitment methods are described in the Data Collection section.

Instrumentation

We collected data using a researcher-developed survey adapted from Wayman et al.’s (2016) Teacher Data Use Survey (see Appendix A), which was designed for districts to understand how their teachers use data and what supportive conditions are in place for effective data use. The original survey had been piloted previously. Our modified survey contained 37 items in three sections: Data Availability, Data Use, and Attitudes About Data and Data Use. Eight items solicited teacher background and demographic information.

In Wayman et al.’s approach, district planners identify the data sources that are most relevant to the local context and decide how to label them in the survey. For the Data Availability section of our survey, we collapsed Wayman et al.’s (2016) four categories into two: internal (teacher-driven) and external. Teachers indicated the types of academic data available to them in each category (e.g., student work products and data sheets in the internal category; curriculum-based assessments in the external category). The internal item included an option to describe other teacher-driven sources. Parenthetical examples of data sources were chosen for their relevance to academic instruction for students with ESN, authenticity, and interpretability. Study authors generated examples from the literature and their knowledge of instructional models across states; an expert in academic instruction for students with ESN reviewed the examples for authenticity.

The Data Use section included 11 items to identify the frequency with which teachers used data in their teaching practice for various purposes in the DBDM cycle. The first six items assessed how often they used each of the available data sources (as indicated in the Data Availability section) for common DBDM cycle steps (Filderman et al., 2018), such as identifying what to teach and evaluating the effectiveness of instruction. Five items were added to reflect steps in the DBDM process as derived from the DBDM literature, such as ‘How often do you use these forms of data to set academic instructional goals?’ and ‘How often do you use these forms of data to evaluate if your instruction was effective?’ We also included four other potential data uses from Wayman et al. (2016) that fall outside the DBDM cycle, including grouping students for instruction, discussing student performance with the student or caregiver, and collaborating with colleagues. Respondents reported frequency of use on a 4-point Likert scale ranging from less than once per month to a few times per week. The last item was new and asked teachers to evaluate the perceived usefulness of each available data source to their teaching practice (4-point Likert scale ranging from not useful to very useful).

The Attitudes About Data and Data Use section included 24 items used in or adapted from Wayman et al.’s (2016) survey (e.g., the item ‘I have the proper technology to efficiently examine data’ was adapted to ‘I have access to the technology I need to support my use of data to guide instruction’). This set included four items (e.g., My school leadership creates protected time for using data) from the Principal Leadership and Computer Data Systems scales, combined into a School Leadership scale, and seven items on Organizational Supports (e.g., There is someone who answers my questions about using data). The remaining sections comprised items from Wayman et al.’s Data’s Effectiveness for Pedagogy scale (five items, e.g., Data help teachers plan instruction), Attitudes Toward Data scale (four items, e.g., Using data helps me be a better teacher), and Data Competence scale (four items, e.g., I am good at using data to plan lessons). Response options for all sections were on a 4-point scale (strongly disagree to strongly agree).

Data collection

Nine states distributed recruitment materials to their teachers using common, studywide recruitment language and their own preferred communication channels (e.g., e-mail newsletters, announcements), using distribution lists they created and managed. The recruitment message invited interested individuals to review informed consent information and respond to an anonymous Qualtrics survey. The first survey item screened participants for eligibility: respondents reported the number of students with ESN they currently taught. A total of 415 teachers accessed the survey and responded to the screener question. Twelve educators reported teaching no students with ESN and were not eligible to participate. The remaining 403 respondents, who taught one or more students, continued as study participants; 286 participants completed all sections of the survey. The survey was open for 5 weeks in April and May 2022. Because of the individualized state approach to recruitment, the lack of direct study team access to teacher e-mails, and the anonymous nature of the survey, no reminder or follow-up messages were sent to partial respondents.

Data analysis procedures

We summarized participant demographic and experience information using descriptive statistics. For RQ1, we calculated the percentage of respondents with access to each type of data. For RQ2, we reported item-level frequency distributions and created aggregated data-use variables for each of the nine academic instructional purposes (e.g., setting goals, evaluating whether instruction was effective, talking with parents). Because the type and number of available data sources varied by teacher, we calculated each data-use variable as the proportion of available data sources that the teacher reported using frequently (i.e., almost weekly or a few times per week). Larger values indicate respondents used a wider variety of their available data for that purpose. We answered RQ3 using frequency distributions for items related to Wayman et al.’s (2016) scales (School Leadership, Organizational Supports, Attitudes Toward Data, Data Competence) and analyzed respondents’ perceptions of available data by calculating the percentage of respondents who perceived each data source as useful or very useful. Because some participants did not complete all sections of the survey, percentages for RQ1–3 are based on the total number of responses to sections or individual items where appropriate.
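To illustrate how an aggregated data-use variable of this kind can be computed, the short Python sketch below derives the proportion of available sources used frequently for one purpose. It is not the authors’ code; the column names (avail_<source>, use_<purpose>_<source>), the source labels, and the 1–4 frequency coding (3 = weekly or almost weekly, 4 = a few times a week) are assumptions for the example.

    import pandas as pd

    # Hypothetical data-source labels; the survey's actual source list differs.
    SOURCES = ["observation", "work_products", "data_sheets",
               "cbm", "progress_monitoring", "state_assessment"]

    def frequent_use_proportion(row: pd.Series, purpose: str) -> float:
        """Proportion of a respondent's *available* data sources reported as used
        frequently (rating >= 3 on the assumed 1-4 scale) for one purpose."""
        available = [s for s in SOURCES if row.get(f"avail_{s}", 0) == 1]
        if not available:
            return float("nan")  # proportion is undefined with no available sources
        frequent = sum(row.get(f"use_{purpose}_{s}", 0) >= 3 for s in available)
        return frequent / len(available)

    # Example usage on a survey data frame `df` (one row per respondent):
    # df["prop_set_goals"] = df.apply(frequent_use_proportion, axis=1, purpose="set_goals")

Because the denominator is each teacher’s own set of available sources, the resulting proportions are comparable across respondents even when availability differs.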

We then used RQ3 items to recreate Wayman et al.’s (2016) scales to use as perception variables for RQ4. Wayman et al. originally reported internal consistency greater than .85 across scales using samples of teachers, administrators, and instructional support staff. Given the very different population for our study and modifications to some items, we evaluated the factor structure of the perception variables in our sample. We conducted principal component analysis with varimax rotation and found four factors that explained 70.8% of the variance. Items loaded as expected for the Organizational Supports (α = .856), School Leadership (α = .864), and Data Competence (α = .913) variables. Attitudes Toward Data and Data’s Effectiveness for Pedagogy loaded onto a single factor, so we combined them into a single scale using Wayman et al.’s original umbrella term, Attitudes Toward Data (α = .950).
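For readers who want to run a similar scale check, the sketch below shows one way to estimate Cronbach’s alpha and obtain a varimax-rotated solution in Python. It is illustrative only: the data frame perception_items and the item grouping DATA_COMPETENCE_ITEMS are hypothetical, and the third-party factor_analyzer package fits a rotated factor model, which approximates but is not identical to the rotated principal component analysis reported above.

    import pandas as pd
    from factor_analyzer import FactorAnalyzer  # third-party: pip install factor_analyzer

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a set of Likert items (rows = respondents, columns = items)."""
        items = items.dropna()
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    # Varimax-rotated extraction of four factors from the perception items
    # (perception_items is a hypothetical respondents-by-items data frame).
    # fa = FactorAnalyzer(n_factors=4, rotation="varimax")
    # fa.fit(perception_items.dropna())
    # loadings = pd.DataFrame(fa.loadings_, index=perception_items.columns)
    # alpha_competence = cronbach_alpha(perception_items[DATA_COMPETENCE_ITEMS])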

While Wayman et al. included perceived usefulness items in their survey, they did not include them in a scale. Yet perceived usefulness can be a precursor to adopting a new practice (e.g., Abdullah et al., 2016), so we wanted to include this variable in the RQ4 analysis. Because usefulness was dependent on availability (i.e., a respondent could not rate the usefulness of data sources that were not available to them), there were intentionally missing data and it was not possible to create a scale like the others. We instead created a Perceived Usefulness variable from participants’ responses to items about the perceived usefulness of the data sources available to them. We categorized teachers (N = 361) into three groups according to the sources they rated very useful: those who perceived data from only one or more teacher-driven assessments (e.g., student work products, informal observation) as very useful (n = 188, 52%), those who perceived data from at least one teacher-driven assessment and one external assessment as very useful (n = 79, 22%), and those who perceived data from none of the assessments as very useful (n = 94, 26%).
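A simple way to derive such a grouping from item-level responses is sketched below. The source lists, column names, and the coding 4 = ‘very useful’ are assumptions for illustration, not the authors’ implementation.

    import pandas as pd

    # Hypothetical groupings of usefulness items by data-source type.
    TEACHER_SOURCES = ["observation", "work_products", "data_sheets"]
    EXTERNAL_SOURCES = ["cbm", "progress_monitoring", "state_assessment"]
    VERY_USEFUL = 4  # top of the assumed 4-point usefulness scale

    def usefulness_group(row: pd.Series) -> str:
        """Classify a respondent by which kinds of available data they rated 'very useful'."""
        teacher = any(row.get(f"useful_{s}") == VERY_USEFUL for s in TEACHER_SOURCES)
        external = any(row.get(f"useful_{s}") == VERY_USEFUL for s in EXTERNAL_SOURCES)
        if teacher and external:
            return "teacher_and_external"
        if teacher:
            return "teacher_only"
        if external:
            return "external_only"  # did not emerge as a separate group in this sample
        return "neither"

    # Example usage: df["usefulness_group"] = df.apply(usefulness_group, axis=1)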

To answer RQ4, we explored the relationships between the data-use variables from RQ2 (the proportion of available data sources used with high frequency for a specific purpose, such as tailoring instruction) and the five perception variables (e.g., Organizational Supports, Attitudes Toward Data). We limited this analysis to the five steps in the DBDM cycle plus grouping students, because grouping is also an instructional decision. We excluded uses outside the DBDM cycle: talking with parents, talking with students, and collaborating with colleagues. The data did not meet assumptions for multivariate tests or univariate parametric tests. We therefore examined bivariate correlations between perceptions and the data-use variables using Spearman’s rho (ρ) and evaluated differences in the distribution of each data-use variable across the perceived usefulness groups (neither, teacher only, either/both) using the Kruskal-Wallis H test with post-hoc pairwise comparisons and Bonferroni corrections.
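The nonparametric analysis could be reproduced along these lines with SciPy. This is a sketch under assumed column names; it uses pairwise Mann-Whitney U tests with a Bonferroni-adjusted alpha as the post-hoc step, which is one common approach but not necessarily the exact procedure used in this study.

    from itertools import combinations

    import pandas as pd
    from scipy import stats

    def spearman_table(df: pd.DataFrame, use_cols, perception_cols) -> pd.DataFrame:
        """Spearman's rho and p-value between each data-use variable and each perception scale."""
        table = {}
        for u in use_cols:
            table[u] = {}
            for p in perception_cols:
                pairs = df[[u, p]].dropna()
                rho, pval = stats.spearmanr(pairs[u], pairs[p])
                table[u][p] = (round(rho, 2), round(pval, 3))
        return pd.DataFrame(table).T  # rows = data-use variables, columns = perception scales

    def kruskal_with_posthoc(df: pd.DataFrame, use_col: str, group_col: str, alpha: float = 0.05):
        """Kruskal-Wallis H across usefulness groups, then Bonferroni-corrected
        pairwise Mann-Whitney U comparisons as a simple post-hoc step."""
        groups = {name: sub[use_col].dropna() for name, sub in df.groupby(group_col)}
        h_stat, p_value = stats.kruskal(*groups.values())
        pairs = list(combinations(groups, 2))
        adjusted_alpha = alpha / len(pairs)
        posthoc = {}
        for a, b in pairs:
            _, p_ab = stats.mannwhitneyu(groups[a], groups[b], alternative="two-sided")
            posthoc[(a, b)] = {"p": p_ab, "significant": p_ab < adjusted_alpha}
        return h_stat, p_value, posthoc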

Results

Two hundred eighty-six (286) of the 403 respondents answered questions about their backgrounds and demographic characteristics. Respondents were from nine states: four in the Midwest, two in the Northeast, two in the Upper Midwest, and one in the South. As seen in Table 1, most participants were female (93.4%) and White (88.4%). Most participants had over 11 years of experience teaching academic subjects (53.5%) and worked in self-contained classrooms (69.6%). Respondents’ teaching assignments were relatively evenly split across elementary, middle, and high school grades.

Table 1. Respondent demographic characteristics and background experience (N = 286).

Available academic data sources

Teacher-driven data sources such as informal observation (97.5%) and quantifiable observational data (91.3%) were available more often than externally driven sources such as periodic assessment results (55.7%; see Table 2). Data from state large-scale assessments, such as DLM assessments (90.1%), were readily available to teachers. About one-third of teachers (34.9%) reported that other academic data sources were available. Those who described the types of other data listed specific examples of general categories on the survey (e.g., a specific progress-monitoring tool) or data sources that were not clearly academic (e.g., applied behavior-analysis data).

Table 2. Availability and perceived usefulness of data sources.

Frequency of data use for academic instructional purposes

Teachers reported how often they used each available data source for academic instructional purposes. Respondents reported using informal observation most often (i.e., almost weekly to a few times a week) for setting instructional goals (82.8%), and student work products most often for identifying what to teach (79.1%), tailoring instruction to students (i.e., using data for initial lesson planning; 87.5%), evaluating if instruction was effective (80.9%), and informing next steps or changes to instruction (79.4%; see Table 3). Teachers used progress-monitoring data less often than teacher-developed data sources for each of the five purposes aligned to steps in the DBDM cycle. For example, 33.7% frequently used curriculum-based progress monitoring data and 25.4% frequently used non-curriculum-based progress monitoring data to evaluate instructional effectiveness.

Table 3. Frequency of data use by purpose and data source.

Respondents reported how often they used data sources for purposes beyond the DBDM cycle, including grouping students for instruction, discussing student progress, and collaborating. Respondents used informal observation (54.1%) and student work products (50.0%) most often to group students for instruction. Similar to other data-use patterns, respondents used progress-monitoring data less often than other data sources in discussing progress with students or parents. Respondents used data to collaborate with other educators less frequently than for other purposes. According to the aggregated data-use variables (i.e., the proportion of available data sources used almost weekly or more often for each instructional purpose), respondents reported most often using their available data to set goals and tailor instruction, and least often to group students (Figure 1).

Figure 1. Proportion of available data sources used frequently, by purpose.

Perceptions of data and data use

Respondents perceived the teacher-driven data sources available to them, such as informal observation (79.3% useful or very useful) and student work products (79.2%), to be more useful than externally available data, such as progress monitoring data from a curriculum package (46.5%; see Table 2). Respondents agreed or strongly agreed (87.8%) that they had support from school leaders to use data, but there was less agreement that leaders took specific actions such as creating protected time to use data (48.4% agreed or strongly agreed; see Table 4). Teachers responded very positively to items about their attitudes toward data (88.4% to 98% agreed or strongly agreed). Respondents also perceived themselves to be competent in data use, particularly in determining student learning needs (91.7% agreed or strongly agreed) and adjusting instruction (91.4% agreed or strongly agreed).

Table 4. Perceptions of data and data use.

Factors associated with data use for academic instructional decisions

Attitudes Toward Data and Data Competence were moderately correlated with all instructional uses except grouping students (see Table 5). The strongest correlations were between Attitudes Toward Data and setting goals (ρ = .26, p < .001), and between Data Competence and guiding instructional decisions (ρ = .23, p < .001). Only Organizational Supports (ρ = .16, p < .001) and School Leadership (ρ = .12, p < .001) were statistically significantly correlated with instructional decisions about grouping students. There were statistically significant differences in the distributions of all six types of data use among the three perceived usefulness groups (Table 6). Not surprisingly, teachers who perceived neither teacher-created nor external data sources to be useful tended to report less data use than either of the other groups (see example in Figure 2). Teachers in the either/both group tended to report higher-frequency data use than those in the teacher-only group, although differences were statistically significant only for identifying what to teach, evaluating effectiveness, and changing instruction.

Figure 2. High-frequency data use to identify what to teach, by type of data perceived as useful.

Table 5. Correlations between perceptions and data use for instructional decisions (N = 359).

Table 6. Differences in data use by types of data perceived as useful (N = 359).

Discussion

The purpose of this study was to explore how teachers of students with ESN use data in the context of academic instructional decision-making. We expanded on the available literature on general factors influencing instructional and curricular decision-making (Ruppar et al., 2015) and on the adoption and use of packaged progress-monitoring tools (Siuty et al., 2018) to explore data-use patterns across the DBDM cycle for students with ESN. Overall, most respondents reported they had access to a variety of data sources to make academic instructional decisions, although more respondents reported having access to teacher-driven products than to external data sources. Even after accounting for variability in which sources were available for use, teacher-driven products were used more often than external sources. This study did not address reasons for high-frequency use of some forms of data. Respondents might use certain sources more frequently because they are comfortable with the data, they have a greater need for that type of data, they believe the data to be of higher quality or more relevant, or they believe the data are helping them achieve results.

Not all data sources are equally useful for all purposes. Informal observation and student work products are useful evidence of student learning in formative assessment practices (Wylie & Lyon, 2012), while some progress-monitoring tools are designed for less frequent use. Parent communication may be intentionally periodic (e.g., at the end of a marking period). In this study there was some evidence of mismatch between data sources and uses. For example, the literature recommends using more objective data sources to determine students’ present levels of performance (Bailey & Weingarten, 2019) and set initial instructional goals in a DBDM cycle (Austin & Filderman, 2020). Yet respondents reported mostly using informal observation and student work products for initial goal setting. It is possible teachers value their own data more than external sources, or have few options beyond data systems they create themselves. But if teachers lack the necessary assessment literacy to design quality methods to gather these data, or rely on data that are misaligned to grade-level academic expectations, there is a risk that data use does not translate into the desired student progress (Mandinach & Gummer, 2016; Petersen, 2016).

DLM assessment results were reported to be widely available but infrequently used, yet nearly one-third of respondents perceived these data as useful or very useful. This unexpected pattern might be due to the mix of respondents from states that use DLM assessments only for spring summative purposes and those from states that use the instructionally embedded model throughout the year.

Regardless of data type, within the DBDM cycle teachers reported most often using data to set goals and tailor instruction. They less frequently used data to evaluate instructional effectiveness or guide instructional change. This pattern may reflect teachers’ preparation and comfort level with data use at different points in the cycle. For example, teacher-education programs in special education prepare candidates to tailor instruction to individualized student needs (Lignugaris/Kraft & Harris, 2014), and recommended IEP-development practices (e.g., Lynch & Adams, 2008) include using data to determine present levels of performance and then setting goals accordingly. In contrast, DBDM PD interventions to support academic instruction for students with ESN focus more on later steps in the DBDM cycle: progress monitoring, evaluating data patterns to determine progress, and changing instruction (Ruhter & Karvonen, 2023). Teachers who are more competent with data may be more comfortable with the later DBDM steps. In this study, respondents who found both teacher-driven data and external data to be useful reported more high-frequency data use to evaluate instructional effectiveness and change instruction than those who found only teacher data useful.

Teachers who had positive attitudes about data and perceived themselves to be competent in using data reported more frequent use of available data. Those who did not perceive either type of data (teacher or external) to be useful tended to use data less often. Yet teachers’ perceptions of School Leadership and Organizational Supports were not correlated with data use. This finding runs counter to other studies that found evidence of external influences on special educators’ curricular decisions (e.g., Lawson & Jones, 2018) and on general educators’ use of DBDM (Schildkamp et al., 2017). It is also surprising given Ruppar et al.’s (2015) finding on the importance of school context (e.g., available materials and supports, similar staff curricular philosophies) in shaping teacher decision-making.

Decisions about grouping students did not follow the pattern of other instructional uses. This use was less frequently reported than others (Figure 1), and it was the only instructional use associated with perceptions of School Leadership and Organizational Supports but not with Attitudes Toward Data or Data Competence. These patterns may be explained by the constraints of instructional settings. Almost 90% of respondents indicated they taught in separate classes or schools. These teachers may have caseloads and staffing that support only a limited range of groupings or one-on-one instruction. Where teachers do use data to group students, external influences such as building schedules created by school leaders may allow different groupings. The dominant instructional settings in this study may also explain the low rates of data use for collaboration with other educators. Collaboration rates may be higher among teachers in inclusive models.

Limitations and delimitations

This study was constrained to teachers in nine states. Respondents’ demographic information and teaching experience suggest they are representative of teachers in this population (e.g., primarily female and White; Billingsley et al., 2017). However, some respondents did not provide demographic information and we do not know the demographic characteristics of the full population, so findings may not be applicable to the full target population. The constraints of our recruitment methods, which relied on state-specific communication channels with no opportunity for reminder messages, likely led to a sample that was highly motivated to share their opinions on the topic. The sample likely includes teachers with very positive opinions, as evidenced by high rates of perceived Data Competence, and those with negative views (e.g., those who found neither type of assessment data to be very useful). Findings for RQ4 are based on responses from a slightly smaller group (about 89% of respondents) who completed the entire survey.

There may also be limitations based on teachers’ interpretation of the data sources listed. We did not have an opportunity to pilot this adapted survey with members of the specific target population. Although we based this section of the survey on existing items in Wayman et al.’s (2016) survey, some parenthetical examples were adapted to reflect data sources likely used by teachers of students with ESN. Although these examples were reviewed by a special education expert, it is possible that some teachers did not recognize a data source, did not mark it as available to them, and therefore could not report how they used that data source for different purposes. However, offering participants an open-ended ‘other’ data source option gave them a chance to report using a data source even if they did not recognize it on the list. We reviewed each instance of teachers reporting an open-ended ‘other’ data source and found that those who reported a specific data source also marked the corresponding general category as available to them.

Finally, this survey provides some insight into teachers’ data use and attitudes. However, it did not capture the complex decision-making processes described in the DBDM literature (Mandinach & Gummer, Citation2016) or the nature of the academic instruction they provide.

Implications for research

Several findings point to areas where future research could be useful. To address the delimitations of the current study, qualitative data could shed light on how teachers think about data, how they reach instructional decisions, and how they weigh data against other factors that are important in decision-making, such as beliefs about students’ capabilities to learn, school culture, and content knowledge (e.g., Ruppar et al., 2015; Timberlake, 2014). Qualitative research could also illuminate how local context plays a role in academic instructional decision-making. For example, qualitative research could explain the finding that school leadership and organizational supports were essentially unrelated to data use. Qualitative or quantitative studies could more closely examine how attitudes and DBDM practices are related to the content of academic instruction and to student characteristics and support needs. For example, under what conditions do teachers talk with their students about data, a practice that helps students develop self-regulation skills (Wiliam, 2011)? Case studies of expert special educators making data-based academic decisions could lead to a contemporary DBDM model that could be taught to other teachers, with supports for developing that expertise over time. Alternatively, if new studies reveal that teachers’ DBDM models are constrained and academic decisions are being made about discrete skills rather than conceptually coherent academics, there may be an opportunity to work with expert general and special educators to develop a more robust DBDM model adapted from the literature on data-based decisions for students without ESN (Filderman et al., 2018; Mandinach & Gummer, 2016; Wayman et al., 2016).

Implications for practice

This study provides a snapshot of current practices and attitudes among teachers who primarily teach in self-contained settings. Because we found that instructional data use was unrelated to School Leadership or Organizational Supports, these and similar teachers may be developing their own approaches to data collection and data-based decisions. A lack of support and training may lead teachers to inadvertently use practices that do not have the desired impact on student learning, or to rely heavily on less rigorous forms of data or data not best suited to the purpose. External support such as professional development may prepare teachers to collect high-quality data, interpret data correctly, and reason about data patterns in the context of academic instruction. Ideally, PD would encompass the entire DBDM cycle, rather than only one or a few steps, to support teachers’ understanding of how to adjust their instruction and complete the entire cycle (Datnow & Hubbard, 2016). Organizational supports such as coaching and professional learning communities (PLCs; Huguet et al., 2014; Marsh et al., 2015) have also been effective in supporting teacher data use.

Our finding that some groups of teachers find no data, or only certain types of data, to be useful indicates a need for PD differentiated by teachers’ attitudes toward data. PD would need to be grounded in an understanding of the ‘why’ behind teachers’ data use and attitudes about data use so that PD designers can consider which factors to target to support teacher change (Killion, 2017). For example, teachers like the group in our study who found neither type of data useful may benefit from starting with PD that addresses their beliefs and knowledge, so they become more open to change. Those who perceive only teacher-driven data as useful may benefit from learning how to incorporate external data or how to improve their own assessment practices. Teachers who are already comfortable with a range of data may be ready for more complex DBDM grounded in specific academic content and pedagogical content knowledge. These teachers may also be ready to extend their data use to include richer formative practices that engage their students in understanding the data and how their results relate to their learning goals (Wiliam, 2011). Those adopting technology-based solutions may need additional support to perceive the system as useful and easy to use before they intend to adopt it (Abdullah et al., 2016). In any of these scenarios, DBDM should be taught in the context of academic instruction so teachers are prepared to help students with ESN make meaningful academic progress throughout their school careers.

Acknowledgments

The authors thank Lucas Cooper and Diane Guthrie for their assistance with data collection for this study.

Disclosure statement

The authors report no conflict of interest and received no financial support for this study.

References

  • Abdullah, F., Ward, R., & Ahmed, E. (2016). Investigating the influence of the most commonly used external variables of TAM on students’ perceived ease of use (PEOU) and perceived usefulness (PU) of e-portfolios. Computers in Human Behavior, 63, 75–90. https://doi.org/10.1016/j.chb.2016.05.014
  • Agran, M., Jackson, L., Kurth, J. A., Ryndak, D., Burnette, K., Jameson, M., Zagona, A., Fitzpatrick, H., & Wehmeyer, M. (2020). Why aren’t students with severe disabilities being placed in general education classrooms: Examining the relations among classroom placement, learner outcomes, and other factors. Research and Practice for Persons with Severe Disabilities, 45(1), 4–13. https://doi.org/10.1177/1540796919878134
  • Austin, C. R., & Filderman, M. J. (2020). Selecting and designing measurements to track the reading progress of students with disabilities. Intervention in School and Clinic, 56(1), 13–21. https://doi.org/10.1177/1053451220910736
  • Bailey, T. R., & Weingarten, Z. (2019). Strategies for setting high-quality academic individualized education program goals (ED599697). ERIC. https://files.eric.ed.gov/fulltext/ED599697.pdf
  • Billingsley, B. S., Bettini, E. A., & Williams, T. O. (2017). Teacher racial/ethnic diversity: Distribution of special and general educators of color across schools. Remedial and Special Education, 40(4), 199–212. https://doi.org/10.1177/0741932517733047
  • Clark, A. K., & Karvonen, M. (2021). Instructionally embedded assessment: A theory of action for an innovative system. Frontiers in Education, 6, Article 724938. https://doi.org/10.3389/feduc.2021.724938
  • Clark, A., Kobrin, J., Karvonen, M., & Hirt, A. (2023). Teacher use of diagnostic score reports for instructional decision-making in the subsequent academic year. Practical Assessment, Research & Evaluation, 28(1), Article 6. https://scholarworks.umass.edu/pare/vol28/iss1/6
  • Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research & Perspective, 9(4), 173–206. https://doi.org/10.1080/15366367.2011.626729
  • Datnow, A., & Hubbard, L. (2016). Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. Journal of Educational Change, 17(1), 7–28. https://doi.org/10.1007/s10833-015-9264-2
  • Dunn, K. E., Airola, D. T., Lo, W., & Garrison, M. (2013). What teachers think about what they can do with data: Development and validation of the data-driven decision making efficacy and anxiety inventory. Contemporary Educational Psychology, 38(1), 87–98. https://doi.org/10.1016/j.cedpsych.2012.11.002
  • Dymond, S. K., & Orelove, F. P. (2001). What constitutes effective curricula for students with severe disabilities? Exceptionality, 9(3), 109–122. https://doi.org/10.1207/S15327035EX0903_2
  • Espin, C. A., Wayman, M. M., Deno, S. L., McMaster, K. L., & de Rooij, M. (2017). Data‐based decision‐making: Developing a method for capturing teachers’ understanding of CBM graphs. Learning Disabilities Research & Practice, 32(1), 8–21. https://doi.org/10.1111/ldrp.12123
  • Every Student Succeeds Act, 20 U.S.C § 6301. (2015). https://www.congress.gov/114/plaws/publ95/PLAW-114publ95.pdf
  • Filderman, M. J., Toste, J. R., & Cooc, N. (2020). Does training predict second-grade teachers’ use of student data for decision-making in reading and mathematics? Assessment for Effective Intervention, 46(4), 247–258. https://doi.org/10.1177/1534508420902523
  • Filderman, M. J., Toste, J. R., Didion, L. A., Peng, P., & Clemens, N. H. (2018). Data-based decision making in reading interventions: A synthesis and meta-analysis of the effects for struggling readers. The Journal of Special Education, 52(3), 174–187. https://doi.org/10.1177/0022466918790001
  • Fuchs, L. S. (2004). The past, present, and future of curriculum-based measurement research. School Psychology Review, 33(2), 188–192. https://doi.org/10.1080/02796015.2004.12086241
  • Gummer, E. S., & Mandinach, E. B. (2015). Building a conceptual framework for data literacy. Teachers College Record, 117(4), 1–22. https://doi.org/10.1177/016146811511700401
  • Huguet, A., Marsh, J. A., & Farrell, C. C. (2014). Building teachers’ data-use capacity: Insights from strong and developing coaches. Education Policy Analysis Archives, 22(52), 52. https://doi.org/10.14507/epaa.v22n52.2014
  • Jimerson, J. B. (2014). Thinking about data: Exploring the development of mental models for “data use” among teachers and school leaders. Studies in Educational Evaluation, 42, 5–14. https://doi.org/10.1016/j.stueduc.2013.10.010
  • Jung, P.-G., McMaster, K. L., Kunkel, A. K., Shin, J., & Stecker, P. M. (2018). Effects of data-based individualization for students with intensive learning needs: A meta-analysis. Learning Disabilities Research & Practice, 33(3), 144–155. https://doi.org/10.1111/ldrp.12172
  • Killion, J. (2017). Why evaluations fail. The Learning Professional, 38(2), 26–30. https://learningforward.org/wp-content/uploads/2017/08/why-evaluations-fail.pdf
  • Kurth, J., & Mastergeorge, A. M. (2010). Individual education plan goals and services for adolescents with autism: Impact of age and educational setting. The Journal of Special Education, 44(3), 146–160. https://doi.org/10.1177/0022466908329825
  • Kurth, J. A., Ruppar, A. L., Toews, S. G., McCabe, K. M., McQueston, J. A., & Johnston, R. (2019). Considerations in placement decisions for students with extensive support needs: An analysis of LRE statements. Research and Practice for Persons with Severe Disabilities, 44(1), 3–19. https://doi.org/10.1177/1540796918825479
  • Lawson, H., & Jones, P. (2018). Teachers’ pedagogical decision‐making and influences on this when teaching students with severe intellectual disabilities. Journal of Research in Special Educational Needs, 18(3), 196–210. https://doi.org/10.1111/1471-3802.12405
  • Lembke, E. S., McMaster, K. L., Smith, R. A., Allen, A., Brandes, D., & Wagner, K. (2018). Professional development for data-based instruction in early writing: Tools, learning, and collaborative support. Teacher Education and Special Education: The Journal of the Teacher Education Division of the Council for Exceptional Children, 41(2), 106–120. https://doi.org/10.1177/0888406417730112
  • Lignugaris/Kraft, B., & Harris, S. (2014). Teacher preparation: Principles of effective pedagogy. In P. T. Sindelar, E. D. McCray, M. T. Brownell, & B. Lignugaris/Kraft (Eds.), Handbook of research on special education teacher preparation (pp. 233–254). Routledge.
  • Lynch, S., & Adams, P. (2008). Developing standards-based individualized education program objectives for students with significant needs. Teaching Exceptional Children, 40(3), 36–39. https://doi.org/10.1177/004005990804000303
  • Mandinach, E. B., & Gummer, E. S. (2016). What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions. Teaching and Teacher Education, 60, 366–376. https://doi.org/10.1016/j.tate.2016.07.011
  • Marsh, J. A., Bertrand, M., & Huguet, A. (2015). Using data to alter instructional practice: The mediating role of coaches and professional learning communities. Teachers College Record, 117(4), 1–40. https://doi.org/10.1177/016146811511700411
  • McLeskey, J., Council for Exceptional Children, & Collaboration for Effective Educator Development, Accountability and Reform. (2017). High-leverage practices in special education. Council for Exceptional Children.
  • Petersen, A. (2016). Perspectives of special education teachers on general education curriculum access: Preliminary results. Research and Practice for Persons with Severe Disabilities, 41(1), 19–35. https://doi.org/10.1177/1540796915604835
  • Ruhter, L., & Karvonen, M. (2023). The impact of professional development on data-based decision-making for students with extensive support needs. Remedial and Special Education, 45(1), 44–57. https://doi.org/10.1177/07419325231164636
  • Ruppar, A. L., Gaffney, J. S., & Dymond, S. K. (2015). Influences on teachers’ decisions about literacy for secondary students with severe disabilities. Exceptional Children, 81(2), 209–226. https://doi.org/10.1177/0014402914551739
  • Schildkamp, K., Poortman, C., Luyten, H., & Ebbeler, J. (2017). Factors promoting and hindering data-based decision making in schools. School Effectiveness and School Improvement, 28(2), 242–258. https://doi.org/10.1080/09243453.2016.1256901
  • Sindelar, P. T., Wasburn-Moses, L., Thomas, R. A., & Leko, C. D. (2014). The policy and economic contexts of teacher education. In P. T. Sindelar, E. D. McCray, M. T. Brownell, & B. Lignugaris/Kraft (Eds.), Handbook of research on special education teacher preparation (pp. 31–44). Routledge.
  • Siuty, M. B., Leko, M. M., & Knackstedt, K. M. (2018). Unraveling the role of curriculum in teacher decision making. Teacher Education and Special Education: The Journal of the Teacher Education Division of the Council for Exceptional Children, 41(1), 39–57. https://doi.org/10.1177/0888406416683230
  • Stecker, P. M., Fuchs, L. S., & Fuchs, D. (2005). Using curriculum-based measurement to improve student achievement: Review of research. Psychology in the Schools, 42(8), 795–819. https://doi.org/10.1002/pits.20113
  • Thurlow, M. L., Ghere, G., Lazarus, S. S., & Liu, K. K. (2020). MTSS for all: Including students with the most significant cognitive disabilities. University of Minnesota, National Center on Educational Outcomes, TIES Center. https://nceo.umn.edu/docs/OnlinePubs/NCEOBriefMTSS.pdf
  • Timberlake, M. T. (2014). Weighing costs and benefits: Teacher interpretation and implementation of access to the general education curriculum. Research and Practice for Persons with Severe Disabilities, 39(2), 83–99. https://doi.org/10.1177/1540796914544547
  • U.S. Department of Education. (2017, December 7). Questions and answers (Q&A) on U.S. Supreme Court case decision Endrew F. v. Douglas County School District Re-1. https://sites.ed.gov/idea/files/qa-endrewcase-12-07-2017.pdf
  • Wakeman, S. Y., Karvonen, M., Flowers, C., & Ruhter, L. (2021). Alternate assessments and monitoring student progress in inclusive classrooms. In J. McLeskey, F. Spooner, B. Algozzine, & N. L. Waldron (Eds.), Handbook of effective inclusive elementary schools: Research and practice (2nd ed., pp. 302–321). Routledge.
  • Wayman, J. C., Wilkerson, S. B., Cho, V., Mandinach, E. B., & Supovitz, J. A. (2016). Guide to using the Teacher Data Use Survey (REL 2017-166). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia. https://ies.ed.gov/ncee/rel/regions/appalachia/pdf/REL_2017166.pdf
  • Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3–14. https://doi.org/10.1016/j.stueduc.2011.03.001
  • Wylie, C., & Lyon, C. (2012). Formative assessment—supporting students’ learning. R & D Connections, 19, 1–12. https://www.ets.org/Media/Research/pdf/RD_Connections_19.pdf

Appendix A

Instructional decision-making for students with extensive support needs survey

Screener question

How many students with extensive support needs (students with significant and sometimes multiple disabilities who participate in their state’s alternate academic assessments) in grades 3-12 do you teach this year?

Instructions

We are interested in how you think about academic instructional decisions for students with extensive support needs. Students with extensive support needs include students with significant and sometimes multiple disabilities who participate in their state’s alternate academic assessments. Please answer questions with these students in mind.

This survey has four sections.

Data availability

The following questions ask about various forms of data that you may use to support you in making academic instructional decisions. Academics include English language arts, mathematics, science, and social studies. Data comes from many sources, so please think broadly about data and your experience with any of these academic subjects.

A1. Which of the following forms of data about academics do you collect for your students with extensive support needs to determine what and how to teach?

[Response options: yes, no]

Data use

The next set of questions ask you about your use of data for specific instructional purposes.

In this current school year, did you use these sources to determine your students’ baseline academic knowledge at the beginning of the year? (Y/N)

[Response options: less than once a month, once or twice a month, weekly or almost weekly, a few times a week]

*For each item, forms of data populate based on ‘yes’ answers in A1 and A2.

B1. How often do you use these forms of data to set academic instructional goals?

B2. How often do you use these forms of data to identify what to teach?

B3. How often do you use these forms of data to tailor instruction to individual students’ needs?

B4. How often do you use these forms of data to evaluate if your instruction was effective?

B5. How often do you use these forms of data to guide your decisions about changes in instructional strategies?

B6. How often do you use these forms of data to group students for instruction?

B7. How often do you use these forms of data to discuss a student’s performance with a parent or guardian?

B8. How often do you use these forms of data to discuss a student’s performance with the student?

B9. Overall, how useful are the following forms of data to your teaching practice?

B10. How often do you collaborate with other educators, such as instructional coaches or other teachers, to interpret and use each of the following data sources?

Attitudes about data and data use

The next several questions ask about your attitudes and experiences with data use for instructional decision making, as well as supports you have for using data. Please answer these questions based on your experience with teaching academic subjects to students with extensive support needs.

C1. This block asks about supports for using data. Please indicate how much you agree or disagree with the following statements:

[Response options: strongly disagree, disagree, agree, strongly agree]

C5. The previous sections of this survey asked you about using data to make academic instructional decisions. Would you have answered any questions differently if we asked you about using data for other aspects of instruction (e.g., behavior, functional skills, social-emotional learning) for students with ESN?

[Response options: yes, no]

*If yes, display C5a. If no, skip to C6.

C5a. Please briefly explain how your use of data is different for non-academic areas.

C6. Is there anything else you want to share about data use to support academic instruction?

Teacher Background

The following questions ask about your background and teaching experience. Your responses are confidential.

D1. In which state do you teach? ______________

D2. Which of the following best describes the service model you use to teach academics to students with extensive support needs?

  • General education classroom/inclusive classroom (co-teaching, consultation, or other model)

  • Resource

  • Self-contained class

  • Separate school

  • Homebound/hospital

  • Other: ______________

D3. Which grade(s) are you currently teaching? Select all that apply.

  • Grade 3

  • Grade 4

  • Grade 5

  • Grade 6

  • Grade 7

  • Grade 8

  • Grades 9-12

D4. How many years of experience do you have teaching academic subjects to students with extensive support needs?

[Response options: None, Less than 1 year, 1–5 years, 6–10 years, 11+ years]

  1. ELA

  2. Mathematics

  3. Science

  4. Social studies

D5. What is your gender?

  • Male

  • Female

  • Non-binary

  • Prefer to self-identify______________

  • Choose not to disclose gender

D6. What is your ethnicity?

  • Hispanic/Latino

  • Non-Hispanic/Latino

  • Choose not to disclose ethnicity

D7. What is your race? (choose one or more)

  • White

  • Black/African-American

  • Asian

  • American Indian/Alaska Native

  • Native Hawaiian/Other Pacific Islander

  • Choose not to disclose race