Intervention, Evaluation, and Policy Studies

Assessing the Effect of Corequisite English Instruction Using a Randomized Controlled Trial

Pages 78-102 | Received 11 Jun 2020, Accepted 30 Apr 2021, Published online: 11 Aug 2021

Abstract

This is the first study to provide experimental evidence of the impact of corequisite remediation for students underprepared in reading and writing. We examine the short-term impacts of three different approaches to corequisite remediation that were implemented at five large urban community colleges in Texas, and we explore whether corequisites have differential impacts on students with different characteristics. Results from three first-time-in-college cohorts indicate that corequisite remediation increased the probability of completing a first college-level English course within one year by 24 percentage points and within two years by 18 percentage points. The impacts were positive for all three of the corequisite models examined and for traditionally underrepresented groups, including Hispanic students, first-generation college students, and students whose first language is not English. We saw modest positive impacts on the accumulation of college credits but no effect on persistence in college.

Introduction

Many students who enter community colleges are tested in reading, writing, and/or mathematics and are designated “not college ready.” Colleges typically require students who are deemed not college ready in one or more subjects to enroll in developmental education (DE), which has traditionally consisted of a series of noncredit, subject-based courses for students to complete prior to entering college-level classes. Data from 2010 suggest that 68% of community college students enrolled in at least one DE course, at an estimated cost of approximately seven billion dollars (Community College Research Center, 2014; Scott-Clayton et al., 2014). However, evidence indicates that traditional approaches to DE have not been working for many students, with few ever completing DE course sequences and moving into college-level coursework (Bailey et al., 2010; Community College Research Center, 2014).

Faced with troubling evidence on the success of students who take traditional DE courses, states and higher education institutions across the United States are rethinking the way they address college readiness. Studies indicate that student “momentum” through foundational reading, writing, and mathematics courses and early accumulation of college credits is a critical predictor of long-term college success (Jenkins & Bailey, 2017). For this reason, states and institutions have focused their attention on strategies that seek to accelerate students through DE and move them through college courses more quickly, including corequisite remediation.

Under corequisite remediation, students skip the traditional DE course(s) and move immediately into a foundational college-level course, while also being required to enroll in concurrent DE support in that same semester. In addition to these structural changes that accelerate students immediately into college coursework, corequisite models typically call for changes to instruction to better align content in DE with college-level coursework and may build in opportunities for more personalized support and/or peer support through various design features such as smaller class sizes and the mixing of college-ready and DE students (Daugherty et al., 2018).

Corequisite remediation has emerged as one of the most common DE reforms being adopted by colleges across the country to support student success, with results from a 2016 national survey indicating that more than one-third of community colleges offered corequisites in reading and writing (Rutschow et al., 2019). A report from the Education Commission of the States finds that at least 20 states have policies in place that encourage innovative models of DE such as corequisite remediation (Whinnery & Pompelia, 2018). Although findings from several studies suggest that corequisite remediation is effective in improving college success outcomes (Cho et al., 2012; Logue et al., 2016; Ran & Lin, 2019), the evidence base remains limited.

This paper details findings from the first randomized controlled trial (RCT) study of corequisite remediation in reading and writing. We recruited and consented 1,276[1] newly enrolling students over three semesters (fall 2016, spring 2017, and fall 2017) from five Texas community colleges.[2] Students were randomly assigned to either corequisite remediation—a college-level English Composition course paired with a concurrent reading and writing DE support—or the traditional semester-long integrated reading and writing DE course that was required prior to entering college-level English. We examined the impacts of corequisite remediation on the following outcomes measured over one- and two-year intervals: passing a first college-level English Composition course; passing other college-level courses, including a second college-level English Composition course and a college-level reading course; total accumulation of college-level credits; and persistence. These short-term outcomes have been shown to be critical indicators of degree completion (Jenkins & Bailey, 2017). Findings can inform the efforts of Texas and other states and institutions to scale corequisites as a key DE reform strategy.

The remainder of this paper is structured as follows. We begin by describing the existing research on corequisite remediation and the context for corequisite remediation in Texas. We then discuss the participating institutions, the specific features of the corequisite models we examined, and findings on the fidelity of implementation. After a description of the study approach and RCT compliance, we detail our findings on the short-term impacts of corequisite remediation. Finally, we reflect on the findings and their implications for the national efforts to scale corequisite remediation.

Background

Developmental Education Reforms and Corequisites

In recent years, there have been increasing concerns about whether the system of DE courses that was developed to support underprepared students was achieving its intended goals. For example, one study suggested that only 20% of students assigned to traditional course-based mathematics DE and 37% of students assigned to course-based reading DE completed a first college-level course within three years of entering school (Bailey et al., 2010). Evidence also suggested that a number of students were being “misplaced,” or wrongly assigned into (and out of) DE (Scott-Clayton et al., 2014). States and colleges began to explore a wide range of reforms to DE to address concerns that the traditional approaches had not been successful.

Many of the DE reforms focused on accelerating students through DE and into college coursework. For example, colleges experimented with ways to change the structure of courses and sequences, including cutting the number of courses in a sequence, modularizing coursework, and accelerating coursework into intensive half-semester courses (Gardenhire et al., 2016; Weisburst et al., 2017). Colleges also incorporated adaptive instructional software and “emporium models” to allow students to move through coursework at varying paces (Bickerstaff et al., 2016; Bonham & Boylan, 2011; Gardenhire et al., 2016). Other initiatives achieved acceleration through curricular reforms, such as the integration of separate reading and writing course sequences into a single-course sequence (Edgecombe et al., 2014) and the alignment and streamlining of mathematics courses to align with majors through mathematics “pathways” (Hoang et al., 2017; Rutschow & Diamond, 2015). Also, colleges made reforms to assessment and placement that helped to accelerate students, such as considering other factors that might qualify students as college ready and place them out of DE, or eliminating requirements that students test at all (Barnett & Reddy, 2017; Rodriguez et al., 2018).

Of the many approaches to acceleration, corequisite remediation has emerged as one of the most commonly adopted reforms. A recent national survey found that, as of 2016, 35% of two-year colleges were offering corequisites in reading and writing, and 16% of two-year colleges were offering corequisites in mathematics (Rutschow et al., 2019). Between 2015 and 2017, states such as California, Tennessee, and Texas passed legislation scaling corequisite remediation to most of the students in need of academic assistance. A number of other states have passed more general policies that encourage the adoption of new approaches to DE such as corequisite remediation (Whinnery & Pompelia, 2018).

The most well-known corequisite model (and one of the models examined in this study) is the Accelerated Learning Program (ALP), developed by the Community College of Baltimore County. The ALP model required students who were not college ready in writing to enroll simultaneously in a three-credit-hour college-level writing course and a three-credit-hour DE course, with the same instructor teaching both courses and strong alignment in instruction across the two components. The model mixed 10 DE students with 10 college-ready students in the college course; in the DE support course, the group of 10 DE students received an additional three hours of reading and writing instruction each week. Research found that students who enrolled in ALP were 36 percentage points more likely than students in traditional DE course sequences to successfully complete a college-level course within one year and six percentage points more likely to persist into the second year of college (Cho et al., 2012).

A second rigorous study, of mathematics corequisites, was conducted at the City University of New York (Logue et al., 2016). In this RCT, students were assigned to either a DE algebra course or a college-level statistics course with an accompanying workshop, combining the DE reforms of mathematics pathways (placement into statistics rather than algebra) and corequisite remediation (direct placement into a college course with concurrent DE support). The study found that students assigned to the college-level statistics course with the workshop were 17 percentage points more likely to pass a college-level course within one year, accumulated an average of four additional credits, and graduated at rates that were 8 percentage points higher than students assigned to the traditional algebra course (Logue et al., 2016, 2019).

Other studies examining corequisite remediation are emerging from states that were early to scale corequisites. For example, Ran and Lin (2019) used a regression discontinuity approach to examine impacts using Tennessee postsecondary data. The study found that students at the margin of college readiness who went into corequisite remediation saw a 13 percentage point increase in the rate of passing gateway English within one year, but it found no impacts on persistence or completion. Descriptive studies have examined the adoption of corequisite remediation in California (Rodriguez et al., 2018) and the implementation of reading and writing corequisite models in Texas (Daugherty et al., 2018).

Our study builds on this earlier corequisite remediation research in several ways. As the first experimental study of corequisite remediation in reading and writing, it provides critical causal evidence needed by policymakers to inform decisions about scaling. Because we examine three different models across five colleges, we are able to determine whether positive impacts can be found when states roll out policies that allow for variation in the delivery and structure of corequisite models across many institutions. In addition, all but one of the colleges implemented a corequisite model that offered fewer hours of DE support than the ALP model, allowing us to examine the impacts of less intensive corequisite remediation.

The Context for Corequisite Remediation in Texas

Texas offers the second largest public postsecondary education system in the country (after California), with 50 public two-year college systems and 37 public four-year colleges enrolling more than 1.5 million students as of 2017. In 2011, the Texas Legislature passed Senate Bill 162, which required the Texas Higher Education Coordinating Board (THECB) to develop a statewide plan for DE that encouraged the adoption and scaling of evidence-based best practices to serve underprepared college students. The plan established by THECB required all public institutions to implement at least one accelerated strategy by 2015, with corequisites being one of the focal acceleration strategies they could pursue. In June 2017, the Texas governor signed House Bill (HB) 2223, requiring institutions across the state to scale corequisite models. The law mandated a three-year progressive scale-up of participation in corequisites: 25% of student enrollments in DE in fall 2018 had to be in corequisites, and this increased to 50% in fall 2019 and 75% in fall 2020. Some groups of students were exempt from the HB 2223 requirements, including students assessed with academic skills below the ninth-grade level and students in English courses for speakers of other languages. Although our study took place in the midst of these policy changes, all of the study participants entered college before statewide scaling under HB 2223 began.

State policy on corequisite remediation required that students be co-enrolled in a credit-bearing course and a DE support in the same subject area and in the same semester. The DE support in corequisite models could be offered as a traditional course, or it could be offered as a non-course-based option (NCBO). NCBOs were first introduced in Texas in 2009 as a means to allow for state funding and tuition funding for DE support outside of the traditional classroom instruction model (e.g., mandatory attendance at the writing center, labs with modularized computer-adaptive instruction). Colleges were able to develop a wider range of corequisite models with the option of using NCBOs to provide DE support. The learning objectives and credit allowances for the college course and the DE support were also set at the state level for all public colleges in Texas. Beyond these few state-level requirements, colleges had considerable flexibility over the design and implementation of corequisites in terms of structure, content, pedagogy, and student population.

Corequisite models in Texas were typically built around English Composition I and College Algebra (the most common entry-level reading/writing and mathematics courses), although colleges also developed corequisite models around other early college courses such as History, Government, Psychology, Statistics, and Contemporary Mathematics. Given their popularity, our study focused on corequisite models built around English Composition I. Across the state, the DE support portion of the corequisite model ranged from one to four credit hours, meaning that students would receive up to four hours of additional weekly instructional time in reading and writing that was tied to their three-hour college-level English course. Four of the five colleges included in this study implemented the least intensive corequisite models with one-hour weekly DE supports, while one college implemented a model that required three additional hours of weekly instruction in the DE support.

In an earlier report on corequisite remediation that drew on interview and survey data from 36 community colleges across the state (Daugherty et al., 2018), we identified five types of English Composition I corequisite models implemented in Texas community colleges:

  1. Paired-course models: In these models, the DE support and college course remain relatively similar to what was offered outside of corequisite models. There may have been some efforts to strengthen connections between the two courses, but typically they retained separate instructors and largely focused on separate coursework. Students also typically enrolled in separate sections of the college course from college-ready students and did not attend the course and support as a learning community. Approximately 27% of surveyed colleges reported that their English Composition I corequisites were structured as paired-course models.[3]

  2. Extended instructional time models: In these models, the DE support was built in as an extension of the college course, so that the two components were typically indistinguishable to students, with scaffolding embedded throughout the course. The college course and DE support were always taught by the same instructor and focused on the same coursework, and sections of the corequisite were typically populated entirely by DE students (i.e., there were no efforts to intentionally mix DE students in sections with college-ready students). Approximately 23% of surveyed colleges reported that their English Composition I corequisites were structured as extended instructional time models.

  3. ALP models: As described previously, in these models, DE students were co-enrolled with college-ready students in the college course, and then the smaller group of DE students was enrolled together in the DE support as a learning community. The same instructor taught both portions of the corequisite model, and the focus of the DE support was to provide additional support around the college coursework, typically utilizing the same textbook but often supplementing college coursework with some additional assignments. Approximately 18% of surveyed colleges reported that their English Composition I corequisites were structured as ALP models.[4]

  4. Academic support service models: In these models, DE students were co-enrolled with college-ready students in the college course. The DE support, typically structured as an NCBO, involved weekly use of an existing college support service, typically tutoring in the writing center or participation in instructor office hours. Tutoring models often used a different instructor to oversee the corequisite DE support, while office-hour models relied on the same instructor for both parts of the corequisite model. The DE support focused almost exclusively on providing students with additional support around the college coursework, although occasionally instructors assigned some small amounts of additional coursework to provide targeted support. Approximately 14% of surveyed colleges reported that their English Composition I corequisites were structured as academic support service models.

  5. Technology-mediated models: In these models, DE students were typically enrolled in separate sections of the college course that were not mixed with college-ready students, and the DE support was structured as a lab in which students worked independently with computer-adaptive software to receive support with basic concepts. The instructor overseeing the DE support was often different from the instructor of the college-level course. Approximately 11% of surveyed colleges reported that their English Composition I corequisites were structured as technology-mediated models.

Our study focuses on the models implemented at the five colleges that volunteered to participate in the study. Accordingly, we examined implementation and impact for three of the five model types: extended instructional time, ALP, and academic support service.

Materials and Methods

Setting and Population for the Study

The experiment was conducted at five community colleges in Texas. The participating systems were all large, located in urban and suburban regions of the state, and served substantial populations of low-income and minority students. All of the colleges volunteered for the study and had established their own approaches to corequisite remediation prior to participating.

The study focused on first-time-in-college students. All first-time students in Texas colleges were required by state policy to be placed into DE or college-level courses according to scores on the state’s placement exam, the Texas Success Initiative Assessment (TSIA).[5] Students were tested separately in reading and writing, and the state set a common college-ready cut score for each subject that all public colleges were required to use for placement into DE. At the time of the study, colleges typically created a “bubble range” of scores right below the college-ready cut score in which students were eligible for voluntary participation in corequisite remediation (or traditional DE). The study team worked with participating colleges to agree on a common eligibility score range for writing across the five colleges. Within this common eligibility range, students were recruited and randomized into either corequisite remediation or traditional DE. The use of reading scores to determine study participation varied by college: Two colleges required students in the eligible writing range to be college ready in reading to qualify for the study, two colleges set a bubble range of scores for college reading (i.e., students had to fall in both the writing and reading ranges to qualify), and one college established no reading score requirements for participation.
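To make the eligibility logic concrete, the minimal Python sketch below encodes the three reading-score rules as a function. The cut-score values and the mapping of rules to specific colleges are purely hypothetical assumptions: the passage above does not report the exact TSIA ranges or identify which colleges used which rule.

```python
def study_eligible(writing: int, reading: int, college: str, cuts: dict) -> bool:
    """Illustrative check against the common writing bubble range.

    `cuts` holds hypothetical score bounds; the actual TSIA ranges used in
    the study are not reported here, and the college-to-rule mapping below
    is likewise made up for illustration.
    """
    in_writing_bubble = cuts["w_lo"] <= writing < cuts["w_ready"]
    if not in_writing_bubble:
        return False
    if college in {"A", "B"}:    # rule 1: must be college ready in reading
        return reading >= cuts["r_ready"]
    if college in {"C", "D"}:    # rule 2: reading must fall in a bubble range
        return cuts["r_lo"] <= reading < cuts["r_ready"]
    return True                  # rule 3: no reading score requirement

cuts = {"w_lo": 350, "w_ready": 363, "r_lo": 342, "r_ready": 351}  # made-up values
print(study_eligible(writing=355, reading=352, college="A", cuts=cuts))  # True
```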

Prior to the introduction of the intervention, students in our eligible score ranges would have been placed into the highest level of stand-alone DE courses, which consisted of an Integrated Reading and Writing (IRW) course that ranged from three to five credit hours depending on the college. The IRW course was mandated as the highest level reading and writing DE offering for all public colleges in Texas as of spring 2015. Therefore, the traditional DE course in which control students enrolled was the same across the participating colleges.[6] Once students had successfully completed the IRW course or tested as college ready on the placement test, they were permitted to enroll in college-level reading and writing courses, including English Composition I (i.e., enrollment in the college English course was delayed by at least one semester).

The Intervention and Implementation

Colleges participating in the RCT designed their corequisite models around the eight key components listed in Table 1. The central and required component of corequisite remediation was co-enrollment: all study colleges required students to enroll in both English Composition I and an attached IRW DE course (the support course). In addition, all five colleges required that the content of the DE support be closely aligned with the content of the college course, with common coursework and learning objectives across the two components.

Table 1. Key components.

With regard to the other key components, each college tailored its corequisite model in unique ways (as allowed under state policy), meaning that categories of key components were the same but the specific component requirements and fidelity thresholds varied by college. For example, the predetermined numbers of instructional hours and weeks of instruction varied across the five colleges. Instructional time for the college course was common across colleges (48 instructional hours), and the course ran for 16 weeks for all colleges except College E. College E used eight-week terms for its corequisite model, meaning that students attended the college course for six hours per week over eight weeks. With regard to the DE support portion, College A’s corequisite model required students to attend 48 hours of additional instruction in the DE support over the course of the semester (i.e., three hours per week in addition to the three-hour course sessions), while the other colleges required just 16 hours of DE support. Colleges A, C, and D required that the college course and DE support run the full 16 weeks of the semester. College B and College E both shortened the DE support portion to offer two hours of DE support time over eight weeks.

The ALP and extended instructional time models at Colleges A, B, and C called for the DE support to be offered as classroom instruction, with students attending as a group. The academic support services models at Colleges D and E provided instruction in the DE support through tutoring, with students primarily receiving one-on-one support outside of a classroom setting (e.g., the instructor’s office, writing center). Four of the five colleges required a common instructor for the college course and DE support, while College E’s model required different instructors for the two parts of the corequisite model.

Finally, the colleges set requirements for the student population with regard to the college course and the DE support, including student-to-instructor ratios and the mix of student abilities in the classroom for the college course. Colleges A, B, and E all set a cap of 10 DE students per college course section, with the remaining students in the course being those deemed to be college ready. College A set a cap of 20 students overall in the college course, while Colleges B and E set a cap of 25 students. College D targeted enrolling five DE students per section and set an overall cap of 30 students in the college course. College C’s corequisite model blended the college course and DE support as one and capped student enrollment at 22 students (primarily DE students, but college-ready students needing an English Composition I section were not prevented from enrolling in these courses).

To assess fidelity of implementation, we relied on measures that were drawn from various data sources, including administrative data, faculty survey data, and faculty focus group data. Overall, we found that most aspects of corequisites were implemented with fidelity across colleges (Table 2).[7] College E struggled the most with implementation, with three of the eight key components not implemented with fidelity. For example, some of the corequisite sections at College E required students to receive just one hour of weekly support rather than the two hours initially intended, and some DE support instructors delivered the support as a course rather than tutoring. Most of the colleges faced challenges in mixing DE and college-ready students in the desired ratios for the college course; only College A was able to surpass the threshold for implementation with fidelity.

Table 2. Fidelity findings.

Within each model, corequisite instructors were given considerable freedom over what took place in the classroom. Instructors reported (and we observed) wide variation in the instructional practices and coursework across sections of the corequisite within each college. Institutions provided general guidance for instructors that the primary focus of the DE support was to provide additional scaffolding and support around the content of the college course, and instructors were not required to use additional textbooks or coursework for the DE support.

In the initial years of the study, preparation and training for instructors teaching the corequisite was limited; only one of the five colleges (College D) offered and required formal training for all corequisite instructors. Some instructors at the other colleges were able to attend training sessions on corequisite instruction at national or state conferences using discretionary professional development funds, but there were no institutional requirements for training prior to teaching corequisites.

Finally, it is important to note that instructors for corequisite sections were not chosen at random, and the selection process for identifying corequisite instructors varied across sites. In most cases, institutions began developing their corequisite approaches prior to our initial contact. These efforts were often led by a primary English faculty member, in conjunction with stakeholders from relevant parts of the institution, including developmental reading and writing instructors, advisors, and student success personnel. In many cases, the corequisite model was primarily designed by a lead English faculty member. For example, in two colleges, a lead faculty member taught the corequisite sections, while in one college, the lead instructor developed the curriculum for the corequisite course and then assigned adjunct instructors to teach it. These dynamics have implications for the interpretation of our estimates. In particular, it is impossible to disentangle the effects of the corequisite courses from the effects of the individual instructors teaching them (Weiss, 2010). Nonetheless, we maintain that our estimates are useful for understanding the likely impact of corequisites as they are adopted across institutions that are likely to take similar approaches to developing curricula and assigning instructors.

Research Design

Recruitment and Randomization Processes

To investigate the causal impact of corequisite remediation on student outcomes, we conducted a student-level RCT. All first-time-in-college students scoring within a predetermined range on the state’s college readiness exam at each participating institution were recruited to participate in the study during orientation or initial advising sessions. At two of the five colleges, where advising took place at group orientations, students were pre-randomized from lists of orientation participants and were consented and surveyed during orientation. At these two colleges, consented students did not learn about their course placement until after they completed the baseline survey for the study. At the other three study colleges, students saw advisors one-on-one without advance notice and were therefore randomized through the survey platform after consenting. Students who opted to participate were randomized to corequisite remediation or to the highest level stand-alone IRW DE course. We refer to the students randomized to corequisite remediation as the “treatment group” and those randomized to the stand-alone IRW course as the “control group.” In most cases, students had a 50% chance of being assigned to the treatment condition. However, because College A wanted to scale up corequisite remediation more quickly, we used a higher treatment probability of 75% for Cohort Three at that college.
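To illustrate the assignment mechanics, here is a minimal Python sketch of the student-level randomization described above. The data frame columns, the seed, and the `assign_treatment` helper are illustrative assumptions rather than the study’s actual software; the only details taken from the text are the 50% baseline assignment probability and the 75% probability used for Cohort Three at College A.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(20160901)  # illustrative seed

def assign_treatment(students: pd.DataFrame) -> pd.DataFrame:
    """Randomize consented students to corequisite (1) or stand-alone IRW (0).

    Assumes a 50% treatment probability everywhere except College A's
    third cohort, which used 75% to support faster scale-up.
    """
    p = np.where(
        (students["college"] == "A") & (students["cohort"] == 3), 0.75, 0.50
    )
    out = students.copy()
    out["treated"] = (rng.random(len(out)) < p).astype(int)
    return out

# Hypothetical roster; actual assignment happened at orientation or advising.
roster = pd.DataFrame(
    {"student_id": [101, 102, 103], "college": ["A", "B", "A"], "cohort": [3, 1, 2]}
)
print(assign_treatment(roster))
```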

Students met with advisors individually and were advised to enroll in the course corresponding to their treatment status. However, students typically self-registered outside of these advising sessions. This gave students some freedom in determining their final schedule, including opportunities either to not enroll in any reading and writing course or to enroll in a course that did not correspond with their treatment status. To keep students from circumventing advisor recommendations, some institutions built programming into the enrollment software that prevented DE students from self-enrolling in college-level coursework until their DE requirements were satisfied, but other colleges did not have these blocks in place. In addition, colleges often allowed and encouraged all students testing at DE levels to retake the placement test in an effort to ensure accurate placement, providing additional opportunities for noncompliance. Despite these potential sources of noncompliance, our compliance rate was high overall, as we describe in the next section.

Table 3. Consent to participate.

Statistical Approach to Estimation of Treatment Effects

To estimate the intent-to-treat (ITT) effect of randomizing students into either the corequisite course or IRW at each institution, we estimated the following regression model:

$$Y_{ci} = \delta_c + \psi X_{ci} + \eta R_{ci} + u_{ci} \qquad (1)$$

where $Y_{ci}$ is the outcome for student $i$ at college $c$ (e.g., persistence to the third semester or passing the first college-level writing course), $X_{ci}$ is a vector of student demographic characteristics, and $\delta_c$ is a college-by-cohort fixed effect.[8] The variable of primary interest in this study is $R_{ci}$, the random assignment indicator, which is equal to 1 if the student was assigned to treatment and 0 otherwise. The coefficient associated with this variable, $\eta$, captures the ITT effect of corequisite English instruction.[9] Because of the randomized controlled design, controlling for covariates such as demographics and assessment scores was not necessary to obtain unbiased estimates. However, including these controls allowed us to improve statistical precision. We ran models with and without these controls and obtained similar results.[10]
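As a concrete illustration, the following Python sketch estimates Equation (1) with statsmodels on simulated data. The variable names, the simulated records, and the built-in effect size are all assumptions for demonstration; only the structure (treatment indicator, baseline covariates, college-by-cohort fixed effects, robust standard errors) mirrors the specification described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; in the study, outcomes come from THECB records.
rng = np.random.default_rng(0)
n = 1276
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),          # R_ci
    "college": rng.choice(list("ABCDE"), n),
    "cohort": rng.integers(1, 4, n),
    "female": rng.integers(0, 2, n),           # stand-ins for X_ci
    "first_gen": rng.integers(0, 2, n),
})
# Outcome with a built-in 24-point treatment effect, for illustration only.
df["passed_eng1_y1"] = (rng.random(n) < 0.35 + 0.24 * df["treated"]).astype(int)

# College-by-cohort cells implement the delta_c fixed effects.
df["cell"] = df["college"] + "_" + df["cohort"].astype(str)
fit = smf.ols(
    "passed_eng1_y1 ~ treated + female + first_gen + C(cell)", data=df
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(fit.params["treated"], fit.bse["treated"])  # ITT estimate (eta-hat) and SE
```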

Data Collected and Sample Characteristics

Students younger than 18 years of age were automatically removed. Consent rates varied from 67% to 91% across colleges (Table 3). Variation in consent may have been driven by differences in randomization procedures, differences in advisor marketing of the study (and the nature of the corequisite remediation), and differences in student populations and their willingness to take the time to complete the survey in return for the incentive (a $25 Amazon gift card). For example, we heard from advisors at College B that low consent rates were due to the fact that many students were last-minute enrollees who needed to rush to complete registration activities, so they may not have wanted to take the extra time to participate. Across all institutions, the sample size from the three cohorts entering between fall 2016 and fall 2017 was 1,482 participants after removing underage students and students who did not consent to participation (College A: 466, College B: 181, College C: 449, College D: 279, College E: 107).

Baseline Survey Information

The baseline survey captured a broad set of information, including high school/GED completion, high school curriculum opportunities and performance, student opinions about how their high school prepared them for college, intentions for college enrollment and achievement, student expectations about their own performance in college, motivations for attending college, reading habits, financial aid, transportation challenges, the student’s work and family situation, language, and a battery of questions gauging study skills, attitudes, and approaches to schoolwork. We used this information to assess the composition of our treatment and control samples to ensure that the comparison we form in estimating the treatment effect draws upon populations with similar characteristics. We also used the data to control for differences between groups in the regression analysis and to examine subgroup effects for particular student populations of interest.

Postsecondary Achievement Data

To assess student achievement in postsecondary education, we utilized statewide administrative data held by THECB. These administrative data contain information on course enrollment and grades for each student at all higher education institutions in Texas. The data also contain information on placement exam scores and additional demographic measures.

Our key outcomes of interest for this paper included passing English Composition I by the end of the first and second academic years and one- and two-year persistence. Passing “gateway” mathematics, reading, and writing courses and persistence have been viewed as important predictors of degree completion (Jenkins & Bailey, 2017) and have been frequently examined in the postsecondary literature, including prior studies of corequisite remediation (Cho et al., 2012; Logue et al., 2016). We also examined passing other key college-level courses, including English Composition II and a first college-level reading course other than English Composition I, as well as total accumulation of college-level credits by the end of the first and second academic years.

To assess one- and two-year persistence, we examined the effect of corequisite remediation on enrollment one and three semesters after the semester in which the randomization and initial course enrollment took place (e.g., for one-year persistence for the fall 2016 study cohort, we examined enrollment in the spring 2017 semester). Students who had completed any academic credential, had transferred upward to a four-year college, or who remained enrolled at any postsecondary institution in the state were coded as having persisted.

Grades were recorded on a standard A–F basis in the THECB data, and we defined passing a course as receiving a grade of C or better. We focus on the completion of English Composition I because it is a key gateway course required for all academic degree programs and an important prerequisite for higher level writing and reading coursework.
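For example, the pass indicator could be constructed from transcript records along the following lines. The column names, the sample records, the `ENGL1301` course code, and the treatment of an academic year as two primary terms are simplifying assumptions; the actual THECB data layout may differ.

```python
import pandas as pd

# Hypothetical transcript records; the actual THECB layout may differ.
records = pd.DataFrame({
    "student_id": [101, 101, 102, 103],
    "course":     ["ENGL1301", "ENGL1302", "ENGL1301", "ENGL1301"],
    "grade":      ["B", "C", "D", "A"],
    "term_index": [1, 2, 1, 3],  # terms elapsed since random assignment
})

PASSING = {"A", "B", "C"}  # a grade of C or better counts as passing

def passed_within(records: pd.DataFrame, course: str, n_terms: int) -> pd.Series:
    """Per-student indicator: passed `course` (C or better) within n_terms terms."""
    ok = (
        (records["course"] == course)
        & (records["grade"].isin(PASSING))
        & (records["term_index"] <= n_terms)
    )
    return ok.groupby(records["student_id"]).any().astype(int)

# Passed English Composition I by the end of the first academic year,
# assuming two primary terms per year for illustration.
print(passed_within(records, "ENGL1301", n_terms=2))
```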

Given that some stakeholders had expressed concerns that instructors may “water down” content or artificially inflate grades in English Composition I under a corequisite model in which many or all students had lower test scores, we wanted to assess whether corequisites improved pass rates in courses that build upon the content of English Composition I. We focused on English Composition II because it builds directly upon the content of English Composition I and is a course that many students take.

Some stakeholders also expressed concern that because English Composition I traditionally focuses primarily on writing, students may not receive as much reading instruction and support as they would in an IRW course that covers both reading and writing. To assess this concern, we also examine the completion of a college-level reading course other than English Composition I.

Finally, to assess whether students moved further along in the college sequence under the corequisite model, we examined the effect of corequisites on total accumulation of college-level credits by the end of the second and fourth terms.

Results

Compliance

Table 4 shows the percentage of students with different course enrollment outcomes by treatment status. Compliance with the experimental design was strong, especially considering the complexity of getting students to enroll in prescribed course sequences. In the treatment group, 74% of students enrolled in English Composition I with the concurrent DE support, compared with only 3% of the control group. However, a sizable share of students in both the control and treatment groups enrolled in the English Composition I course without the attached DE support (14% and 8%, respectively), either because of retesting and up-placement into this course or because they circumvented the recommendations of their advisor to enroll in DE coursework. With regard to enrollment in a stand-alone IRW course, 73% of the control group enrolled compared with only 10% of the treatment group. The shares of students not enrolling in any college course, or enrolling in college but not in IRW or English Composition I, were both very similar for the treatment and control groups. These patterns indicate that the experiment affected the likelihood of enrollment in IRW versus corequisite remediation (as intended) but also the likelihood of enrolling in English Composition I without the DE support.[11]

Table 4. Compliance with random assignment.

Covariate Balance and Inclusion of Non-Enrollees

Table 5 presents findings on the balance of predetermined covariates across treatment and control students. These variables were obtained from the baseline survey and were thus available for all students, including those who did not enroll in college and therefore do not appear in the state’s postsecondary administrative data. The first set of columns provides covariate means and standard deviations for all students in the experiment. The means are quite similar for treatment and control students, consistent with successful randomization. Moreover, none of the differences are statistically significant at the 5% level.

Table 5. Covariate means by assignment status with mean equality test.

Students who did not enroll in college did not have the opportunity to receive the treatment, so including them in the analysis would dampen the extent to which randomization to treatment could affect student outcomes. At the same time, excluding them could introduce bias because students can self-select into enrollment, and this self-selection may differ by treatment status. However, enrollment rates were very similar for the two groups, with 86.6% of the treatment group and 85.4% of the control group enrolling (p = .51). Furthermore, when we restrict the sample to students who enrolled (the second set of columns in Table 5), the differences in baseline covariates by treatment status remain small in magnitude, and none are statistically different from zero. Because restricting the sample to enrolled students does not introduce differences in baseline covariates between treatment and control students, we show results using only students who enrolled in college.
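A balance check of this kind can be run with a per-covariate Welch t-test. The sketch below, using a simulated data frame with hypothetical covariate names, illustrates the kind of comparison summarized in Table 5; it is not the study's actual analysis code.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated stand-in for the baseline survey data.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "treated":   rng.integers(0, 2, 1276),
    "female":    rng.integers(0, 2, 1276),
    "first_gen": rng.integers(0, 2, 1276),
})

def balance_table(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    """Compare covariate means across arms with Welch two-sample t-tests."""
    rows = []
    for cov in covariates:
        t = df.loc[df["treated"] == 1, cov]
        c = df.loc[df["treated"] == 0, cov]
        _, p = stats.ttest_ind(t, c, equal_var=False)  # Welch's t-test
        rows.append({"covariate": cov, "treat_mean": t.mean(),
                     "control_mean": c.mean(), "p_value": p})
    return pd.DataFrame(rows)

print(balance_table(df, ["female", "first_gen"]))
```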

Intent-to-Treat Effect Estimates

Table 6 reports ITT estimates for the sample restricted to enrolled students.[12] Tables 7 and 8 report the ITT estimates for taking and passing college-level English courses for targeted student subgroups and by model. Tables A1 and A2 in the Appendix report the ITT estimates for our other outcomes for targeted student subgroups and by model, and Tables A3 and A4 report estimates for all outcomes by college. Overall, we found strong evidence of an increased likelihood of passing English Composition I. Randomization to treatment resulted in a 24.2 percentage point increase in English Composition I passing rates among enrollees by the end of the first academic year. Although some students in the control group caught up to their peers in the treatment group by the end of the second academic year, students assigned to the treatment group were still 18.4 percentage points more likely to have passed the course. The estimates were relatively consistent across different student subgroups (Table 7) and across the three different types of corequisite models (Table 8), with tests of equivalence across subgroups and models indicating a lack of statistically significant differences. However, there were some differences across institutions (Table A3). The largest estimate of the impact on passing English Composition I was found for College D (30.5 percentage points), while the smallest was found for College E (16.6 percentage points and statistically insignificant). We can reject the hypothesis that the effects are constant across institutions.

Table 6. Intent-to-treat effect of corequisite placement.

Table 7. Intent-to-treat effect of corequisite placement on success in college-level English courses, targeted subgroups.

Table 8. Intent-to-treat effect of corequisite placement on success in college-level English courses, by model.

Students assigned to treatment were also 6.4 percentage points more likely to complete English Composition II by the end of the second academic year and 6.7 percentage points more likely to pass a college-level reading course other than English Composition I by the end of the first academic year. We did not find a statistically significant effect on passing a college-level reading course by the end of the second academic year. Students assigned to treatment had also completed more college-level credits by the end of the first (2.3 credits) and second (1.5 credits) academic years. The increase of 2.3 college credits in the first year is statistically indistinguishable from 3, the number of college credits associated with the English Composition I course that the treatment students took instead of the integrated reading and writing developmental course.

Despite positive effects on passing college-level courses and accumulation of college-level credits, we did not find evidence that the treatment affected persistence to the end of the first or second academic year. In particular, the estimated effect on persistence to the end of the second academic year is −2.2 percentage points, and the 95% confidence interval includes 0, suggesting that it is possible that there is no difference between groups. For this outcome, we did find some suggestive evidence of treatment effect heterogeneity across institutions (Table A4). For example, the estimates for College E suggested large positive effects on persistence, while the estimates for the other institutions were close to zero or negative (and none were statistically significant). We did not find evidence of heterogeneity in the impacts on persistence across key demographic subgroups (Table A1) or across different types of corequisite models (Table A2).

Table A1. Intent-to-treat effect of corequisite placement on other college-level courses, persistence, and credit accumulation, targeted subgroups.

Table A2. Intent-to-treat effect of corequisite placement on other college-level courses, persistence, and credit accumulation, by model.

Table A3. Intent-to-treat effect of corequisite placement on success in college-level English courses, by college.

Table A4. Intent-to-treat effect of corequisite placement on other college-level courses, persistence, and credit accumulation, by college.

Discussion

Overall, short-term estimates from three cohorts of students in our study provide positive evidence of the impacts of corequisite remediation on course outcomes. Specifically, we find that being assigned to corequisite remediation increased the probability of passing English Composition I within one academic year by 24.2 percentage points. This estimate is smaller than the 36 percentage point estimate found in the initial quasi-experimental research on English corequisite remediation (Cho et al., 2012) and larger than both the 17 percentage point estimate found in a prior experimental study of mathematics corequisite remediation (Logue et al., 2016) and the 13 percentage point estimate found in Tennessee (Ran & Lin, 2019). We also found that students assigned to treatment were more likely to pass English Composition II and a college-level reading course and had higher overall credit accumulation. With the average control student across our colleges accumulating 11.6 credits in the first academic year, the additional 2.3 credits associated with placement into corequisite remediation represent a meaningful increase in postsecondary achievement. While it is impossible for us to disentangle the effects of corequisite models from those of the instructors teaching the corequisite courses, we maintain that these estimates are useful for understanding the likely impact of corequisites as they are more widely scaled and adopted across institutions that are likely to take similar approaches to developing curricula and assigning instructors.

On the other hand, we did not find any impacts of corequisite remediation on short-term persistence. These results are consistent with Ran and Lin’s (2019) mixed findings but differ from other studies that have shown short-term impacts on persistence (Cho et al., 2012; Logue et al., 2019). Substantial proportions of students from both the treatment and control groups left college within the first two semesters, and qualitative and survey evidence suggested that students were facing a wide range of economic and life challenges that may have been more critical factors in driving dropout. It may be that DE reforms need to be combined with other advising and wraparound supports, similar to what was done for the Accelerated Study in Associate Programs (ASAP), to generate persistence impacts for vulnerable student populations (Scrivener et al., 2015). However, we will continue to examine longer-term persistence and completion outcomes to determine whether the additional course momentum we observe translates into improved persistence and completion after two and three years.

Results for course progress were positive for all three of the corequisite approaches we examined, indicating that a range of corequisite models have the potential to improve student course outcomes. However, we did find variation in impacts by school, suggesting that model components, implementation fidelity, and/or school context may affect the effectiveness of corequisite implementation. Our study design does not allow us to quantitatively explain the variation in impacts by school, but qualitatively we can identify several factors that may have contributed to the smaller impacts we observed for Colleges B and E. These two colleges both shortened the length of the courses and/or DE support in their corequisite models—College B concentrated the DE support in the first eight weeks and College E accelerated both the course and the DE support into eight weeks—and it may be that this compression is less effective than spreading the corequisite model over a full 16-week term. Also, College E’s corequisite model relied on separate instructors for the college course and the DE support who did not engage in ongoing communication and collaboration on curriculum and student progress, so this decreased focus on alignment of instruction may have been problematic. In addition, instructor selection and training may have played a role. Colleges B and E largely relied on adjunct instructors who were assigned to teach the corequisite sections and had little knowledge or training prior to stepping into the classroom. In contrast, instructors at the other three colleges tended to have been hand-picked or to have volunteered, and many of them engaged in some sort of professional development on corequisite remediation.

When considering the merits of a particular intervention such as corequisite remediation, it is important to consider the costs of implementation, particularly relative to the benefits. In a forthcoming companion paper, we estimate the costs of the five corequisite models examined here and find considerable differences in costs across approaches (Cunha et al., forthcoming). The cost differences are driven primarily by differences in the number of credit and contact hours, class sizes, and the type of instructor utilized across the approaches. Two of the approaches were less costly than prerequisite remediation, while two were more costly. In future work, we will compare the costs of the corequisite models to their long-term benefits. For now, however, the results from this work indicate that the positive short-term impacts of corequisites on course progression can be achieved with a variety of approaches that vary considerably in their costs of implementation. We thus urge policymakers and practitioners to consider factors such as class size, contact hours, and instructor credentials when they implement corequisites.

It is also important to note that, like the majority of research on corequisite remediation, our study examines the impact of participating in a corequisite model relative to the formerly predominant prerequisite developmental education model. Existing research does not examine the impact of participating in a corequisite relative to enrolling directly in a college-level course without a corequisite support. Given emerging evidence pointing to the potential for positive effects of enrolling directly in a college course (e.g., Kane et al., 2018), one might hypothesize that enrolling students directly into a college-level course without support could yield effects similar to those of the corequisite model at lower cost to students and taxpayers. In future work, we plan to explore this question using a regression discontinuity design, exploiting the fact that recent cohorts of near-college-ready students in Texas were mandated to enroll in corequisites, while those scoring just above the college-ready cut score would have enrolled in a college course without support.

Finally, we found that corequisite remediation generally led to benefits for all types of students, although the relatively small sample sizes limited our ability to examine heterogeneous effects of corequisite remediation for some groups. Future analysis will include an additional cohort of incoming college students that will help to increase statistical power. Using this larger study sample, we will examine differential impacts of corequisites for a broader range of student factors, including high school GPA and course-taking patterns, parental income and education, life circumstances and hardships, and noncognitive factors, among others. It is important to note that, similar to prior studies of corequisite remediation, our study largely focused on a population of students who tested at the higher end of the DE range (i.e., close to college-ready levels). Therefore, our findings cannot be generalized to students scoring at lower levels of readiness. Our findings do suggest that, for students at higher levels of readiness, less intensive corequisite models with one-hour DE support portions may be sufficient to drive positive impacts on student course outcomes.

Acknowledgments

The authors would like to thank Christina LiCalsi, Rebecca Medway, and Courtney Tanenbaum of the American Institutes for Research (AIR) for their contributions to the implementation of the randomized controlled trial and the student surveys conducted as part of this research, and Diana Gehlhaus, Celia Gomez, and Alexandra Mendoza-Graf for their contributions to the implementation analysis. We would also like to thank Kathy Hughes and Krissy Zeiser of AIR, as well as numerous conference discussants and anonymous reviewers, for providing helpful feedback to improve this paper. We thank policymakers at the Texas Higher Education Coordinating Board for their support in designing this study, providing data access, and interpreting and communicating findings to different audiences. David Gardner, Jerel Booker, Julie Eklund, Suzanne Morales-Vale, Keylan Morgan, Linda Hargrove, Josie Brunner, Melissa Henderson, Ginger Gossman, and Holly Kosiewicz were particularly helpful. We also thank practitioners at the five community colleges that participated in the study for working with us to implement the randomized controlled trial, providing data, and coordinating student focus groups. Finally, we thank the students and faculty members who participated in interviews, focus groups, and/or surveys.

Disclosure Statement

We do not have a financial interest in the research presented here.

Additional information

Funding

The research reported here was supported, in whole or in part, by the Institute of Education Sciences, U.S. Department of Education, through grants [R305H170085 and R305N170003] to the American Institutes for Research and [R305H150094] to the RAND Corporation. The opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education.

Notes

[1] Although we successfully randomized 1,782 students, 1,492 students consented to be surveyed; 1,276 of those students subsequently enrolled in college. Our primary results are based on these 1,276 students.

[2] We also recruited another 165 students in fall 2018, but we do not yet have two-year outcome data for those students.

[3] Under some definitions of corequisite remediation, these models would not qualify because of the lack of alignment in instruction across the two components of the corequisite model.

[4] It is important to note that the designers of the original ALP model may have much stricter requirements for what is classified as an ALP corequisite.

[5] Students can be exempted from placement testing by demonstrating readiness through other test scores (e.g., SAT, end-of-course exams) or falling into special categories (e.g., veterans).

[6] Prior to 2015, many colleges had offered separate reading and writing DE courses at the highest level. To facilitate implementation, the state and colleges had engaged in a range of IRW-focused professional development activities between 2013 and 2015.

[7] We did not report fidelity measures for the co-enrollment component in Table 2, given that this measure is used to assess RCT compliance.

[8] The student covariates are indicators for race (Black, Hispanic, White), female, limited English proficiency, part-time enrollment, college degree intention (AA, BA, certificate), high school diploma versus GED degree, first language spoken at home other than English, and first generation in college. We also ran models that included student scores on the placement exam. We were missing placement exam scores for 254 students, requiring us to drop those observations in models that included score data as a covariate. The results were qualitatively similar but slightly less precise. We do not view the lack of score data as problematic, given that students were randomized and entered the study based upon scoring within a common range. We present the models without scores in the report, but the results from models that include scores are available upon request.

9 In addition to the ITT effect, it would be of interest to estimate the effect of treatment on the treated (TOT), defined here as the effect of corequisite remediation for students who actually enrolled in the designated college course and the required DE support, relative to taking traditional DE. However, as we show next, some students ended up taking college-level English without enrolling in the concurrent DE support (i.e., partial treatment), and this was more common in the control group. This means that to estimate the effect of the college course plus the corequisite, we would need an additional experimental manipulation. In technical terms, we have two endogenous variables (college-level course + corequisite and college-level course alone) but only one instrument. We could estimate a TOT effect of the college-level course + corequisite relative to all other conditions by scaling up the ITT estimates by about 1.3, which is the inverse of the difference between the treatment and control groups in the fraction taking the college-level course + corequisite: 1/(.804 − .033) = 1.30. However, we view this as conceptually inappropriate, since it counts students who took the college-level course alone as part of the counterfactual condition even though placement into college-level courses is arguably the central component of the intervention. This approach would overstate the first stage, leading to underestimated effects of treatment on the treated.
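For concreteness, the back-of-the-envelope Wald scaling described in this footnote works out as follows. This is purely an arithmetic sketch of the calculation the footnote cautions against, not an endorsed TOT estimate.

```python
# Illustrative arithmetic for the Wald-style scaling discussed above.
# The footnote explains why this TOT calculation is conceptually flawed here.
share_treat_full = 0.804  # treatment group taking college course + corequisite
share_ctrl_full = 0.033   # control group taking college course + corequisite

first_stage = share_treat_full - share_ctrl_full  # ~0.771
wald_scale = 1 / first_stage                      # ~1.30

def naive_tot(itt: float) -> float:
    """Scale an ITT estimate by the inverse first stage (Wald estimator)."""
    return itt * wald_scale

print(round(wald_scale, 2))  # 1.3
```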

10 The results for models without controls are available upon request.

11 As explained in a previous footnote, this prevents us from being able to estimate the TOT effects, and we instead focus on estimating the ITT effects.

12 As noted, although we report results for the sample of enrolled students, the estimates for the full sample were slightly smaller but qualitatively similar. We also report results from the model that accounted for student demographics and institution-by-cohort fixed effects. The estimates were not sensitive to the inclusion of student demographics or test scores as would be expected given successful randomization and balance across covariates. The results from alternative specifications are available on request.

References

  • Bailey, T., Jeong, D. W., & Cho, S. W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29(2), 255–270. https://doi.org/10.1016/j.econedurev.2009.09.002
  • Barnett, E., & Reddy, V. (2017). College placement strategies: Evolving considerations and practices. In K. L. McClarty, K. Mattern, & M. Gaertner (Eds.), Preparing students for college and careers: Theory, measurement, and educational practice (pp. 82–93). Routledge.
  • Bickerstaff, S., Fay, M., & Trimble, M. (2016). Modularization in developmental mathematics in two states: Implementation and early outcomes (CCRC Working Paper No. 87). Columbia University.
  • Bonham, B., & Boylan, H. (2011). Developmental mathematics: Challenges, promising practices, and recent initiatives. Journal of Developmental Education, 34(3), 2–4, 6, 8–10.
  • Cho, S. W., Kopko, E., Jenkins, D., & Jaggars, S. S. (2012). New evidence of success of community college remedial English students: Tracking the outcomes of students in the Accelerated Learning Program (ALP) (CCRC Working Paper No. 53). Columbia University, Teachers College, Community College Research Center.
  • Cunha, J., Daugherty, L., Miller, T., Gerber, R., & Martorell, T. (Forthcoming). A cost-benefit analysis of co-requisite English developmental education: Evidence from a randomized controlled trial in Texas Community Colleges. RAND Corporation.
  • Community College Research Center (2014). What we know about developmental education outcomes. Community College Research Center.
  • Daugherty, L., Gomez, C. J., Carew, D. G., Mendoza-Graf, A. C., & Miller, T. (2018). Designing and implementing corequisite models of developmental education: Findings from Texas community colleges (RR-2337-IES). RAND Corporation.
  • Edgecombe, N., Jaggars, S. S., Xu, D., & Barragan, M. (2014). Accelerating the integrated instruction of developmental reading and writing at Chabot College (CCRC Working Paper No. 71). Columbia University, Teachers College, Community College Research Center.
  • Gardenhire, A., Diamond, J., Headlam, C., & Weiss, M. J. (2016). At their own pace: Interim findings from an evaluation of a computer-assisted, modular approach to developmental math. MDRC.
  • Hoang, H., Huang, M., Sulcer, B., & Yesilyurt, S. (2017). Carnegie Math Pathways 2015–2016 impact report: A five-year review. Carnegie Foundation for the Advancement of Teaching.
  • Jenkins, D., & Bailey, T. (2017). Early momentum metrics: Why they matter for higher education reform. Columbia University.
  • Kane, T., Boatman, A., Kozakowski, W., Bennett, C., Hitch, R., & Weisenfeld, D. (2018). Remedial math goes to high school: An evaluation of the Tennessee SAILS program. Research Report. Center for Education Policy Research, Harvard University. https://cepr.harvard.edu/files/cepr/files/sails_research_report_final.pdf
  • Logue, A. W., Douglas, D., & Watanabe-Rose, M. (2019). Corequisite mathematics remediation: Results over time and in different contexts. Educational Evaluation and Policy Analysis, 41(3), 294–315. https://doi.org/10.3102/0162373719848777
  • Logue, A. W., Watanabe-Rose, M., & Douglas, D. (2016). Should students assessed as needing remedial mathematics take college-level quantitative courses instead? A randomized controlled trial. Educational Evaluation and Policy Analysis, 38(3), 578–598. https://doi.org/10.3102/0162373716649056
  • Ran, F. X., & Lin, Y. (2019). The effects of corequisite remediation: Evidence from a statewide reform in Tennessee. Columbia University.
  • Rodriguez, O., Cuellar Mejia, M., & Johnson, H. (2018). Remedial education reforms at California’s community colleges: Early evidence on placement and curricular reforms. Public Policy Institute of California.
  • Rutschow, E. Z., & Diamond, J. (2015). Laying the foundations: Early findings from the New Mathways Project. MDRC.
  • Rutschow, E. Z., Cormier, M. S., Dukes, D., & Zamora, D. E. C. (2019). The changing landscape of developmental education practices: Findings from a national survey and interviews with postsecondary institutions. Center for the Analysis of Postsecondary Readiness.
  • Scott-Clayton, J., Crosta, P., & Belfield, C. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371–393. https://doi.org/10.3102/0162373713517935
  • Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for developmental education students. MDRC.
  • Weisburst, E., Daugherty, L., Miller, T., Martorell, P., & Cossairt, J. (2017). Innovative pathways through developmental education and post-secondary success: An examination of developmental math interventions across Texas. The Journal of Higher Education, 88(2), 183–209. https://doi.org/10.1080/00221546.2016.1243956
  • Weiss, M. (2010). The implications of teacher selection and the teacher effect in individually randomized group treatment trials. Journal of Research on Educational Effectiveness, 3(4), 381–405. https://doi.org/10.1080/19345747.2010.504289
  • Whinnery, E., & Pompelia, S. (2018). Governors’ top education priorities in 2018 State of the State addresses. Education Commission of the States.