
A meta-analysis of the efficacy of self-regulated learning interventions on academic achievement in online and blended environments in K-12 and higher education

Pages 2911-2931 | Received 15 Sep 2021, Accepted 21 Nov 2022, Published online: 16 Dec 2022

ABSTRACT

Numerous empirical studies, including meta-analyses, have confirmed the impact of self-regulated learning (SRL) on learners’ academic achievement in traditional or face-to-face learning environments. However, prior meta-analyses rarely examined the efficacy of SRL interventions on academic achievement in online or blended education across elementary education, secondary education, higher education, and adult education. Therefore, this meta-analysis addresses this research gap by focusing on the effect of SRL interventions on students’ academic test performance in online and blended learning environments in elementary, secondary, and higher education settings as well as informal settings. The present meta-analysis compares SRL phase, SRL scaffolds, and SRL strategies between treatment and control groups. We also investigated possible differential effectiveness due to substantive features of the included studies, such as different educational levels of learners (e.g. elementary, secondary, and higher education), academic subjects (STEM vs. non-STEM), and learning contexts (e.g. online learning, blended learning, web-based learning, mobile learning). Consistent with previously published meta-analyses, the present meta-analysis confirmed a positive and moderate effect of SRL intervention (ES = 0.69) on learners’ academic achievement in online and blended environments for learners in elementary, secondary, and higher education as well as informal adult education settings.

1. Introduction

Self-regulation is needed whenever we learn something novel or complex. The more novel or complex something is to learn or to master, the more self-regulation we need to stay motivated, overcome frustrations or disappointments, stay focused, and persist (e.g. Baumeister and Vohs Citation2016; Pelletier et al. Citation2001). Given that learning and mastery of novel or complex skills is something that everyone struggles with at one time or another, there has been long-standing interest in identifying the factors that contribute to learning and achievement or performance. Such interest can be traced back more than two thousand years to the writings of classical Greek philosophers such as Socrates, Plato, and Aristotle on impulse and self-control, the building blocks of self-regulation.

The term self-regulated learning (SRL) refers to self-regulation processes and self-regulatory strategies applied in learning contexts such as informal education, schools, or classrooms. A large body of research already documents the importance of SRL in academic persistence, academic performance, and educational attainment in schools or classrooms for students in elementary, secondary, and higher education settings as well as informal settings (e.g. León, Núñez, and Liew Citation2015; Muenks et al. Citation2017). Significantly less research has examined the efficacy of SRL interventions in online learning environments, even though extant studies show that students with poor SRL skills and strategies are at increased risk for dropout and failure in online coursework (Lee and Choi Citation2011; Puzziferro Citation2008). Because the COVID-19 pandemic made online or blended education a necessity for K-12 and higher education in many parts of the world, there is an urgent need for a clearer understanding of the effect that SRL interventions have on academic outcomes in online or blended education for elementary, secondary, college, and adult students.

Previously published meta-analyses on SRL and academic achievement have not always included the range of learners or students from K-12 and higher education in online learning. The present meta-analysis addresses this research gap and examines the research literature from the pre-COVID era in order to inform future research directions, instructional design, and teaching practices related to SRL and online or blended education. Furthermore, the educational levels of learners (e.g. elementary, secondary, and higher education), school or academic subjects (STEM vs. non-STEM), and online versus blended learning environments will be explored as moderating factors that could produce differential impacts of SRL interventions on academic achievement.

2. Literature review

2.1. Self-regulated learning

According to Zimmerman and Schunk (Citation2011), SRL describes a process for controlling and guiding efforts to complete complicated learning activities. Through the interactive SRL process, learners set milestones and try to monitor their metacognitive awareness, cognition, motivation, and behaviour (Delen and Liew Citation2016; Pintrich Citation2000). Multiple theoretical models and frameworks of SRL have been proposed (Panadero Citation2017; Pintrich Citation2000; Zimmerman Citation1990), but there are some commonalities among these models (Dent and Koenka Citation2016; Puustinen and Pulkkinen Citation2001). Despite differences in terminology, most SRL models can be conceptualised around three phases: a preparatory phase, a performance phase, and an appraisal/evaluation phase (Panadero Citation2017; Puustinen and Pulkkinen Citation2001; Zimmerman Citation1986).

Through these phases, learners utilise various SRL strategies to regulate their learning. SRL strategies in the current study were categorised mainly based on Pintrich’s (Citation1991, Citation2004) conceptual and operational framework, which is one of the most influential models in this field and comprises the most comprehensive set of SRL strategies (Panadero Citation2017; Richardson, Abraham, and Bond Citation2012). We first identified three types of SRL strategies: cognitive, metacognitive, and resource management strategies (Pintrich Citation1991). Cognitive strategies are defined as students’ cognitive processes while performing academic tasks, such as rehearsal, elaboration, and critical thinking. Metacognitive strategies focus on the learner’s awareness and control of cognitive processes and include planning, monitoring, and regulating. Resource management strategies refer to managing and regulating resources, including time and environmental management, effort or motivational regulation, help seeking, and peer learning. Because emotional regulation is also an important component of SRL processes (Hadwin, Järvelä, and Miller Citation2011; Panadero Citation2017), we added it as a fourth type of SRL strategy.

The positive relationship between SRL and academic achievement has been demonstrated in numerous empirical studies (e.g. Chen and Huang Citation2014; Sadati and Simin Citation2016). High-achieving students use SRL strategies more effectively than low-achieving students (Zimmerman Citation2002). When actively self-regulating their learning, students tend to engage in successful cognitive strategies, which results in learning improvement (Jansen et al. Citation2019). Students’ use of metacognitive strategies is also significantly correlated with academic performance (Dent and Koenka Citation2016).

2.2. SRL interventions in online or blended learning environments

Due to the effectiveness of SRL for learning outcomes, interventions to support learners’ SRL, especially in online learning environments, have received increasing attention. In online learning environments, students have access to more learning resources and teamwork opportunities than in face-to-face environments (U.S. Department of Education Citation2009). For students who find it challenging to attend in-person classes, online learning increases access by providing flexibility (Waschull Citation2001). On the other hand, online learning tends to shift the control of learning from the educational institution to the isolated learner (Fournier, Kop, and Durand Citation2014). Independent and engaged learners are more likely to be successful in online learning (Broadbent and Poon Citation2015). The need to autonomously engage in the online learning process makes self-regulation increasingly necessary in online learning.

Research has shown that an online learning environment has the potential to foster student SRL. The use of online resources and tools can help support SRL in various ways. For example, Bernacki, Vosicka, and Utz (Citation2020) developed a digital training module to help students apply SRL strategies effectively in their STEM classes. In the Science of Learning to Learn training embedded in the course learning system site, students were asked to complete readings and activities to learn and practice SRL strategies in the context of STEM topics.

Computer-based scaffolds can provide students with on-going diagnosis as well as calibrated, dynamic support for their SRL behaviours (Azevedo Citation2005). More specifically, in Azevedo and Cromley’s (Citation2004) study, students performed SRL in a hypermedia environment with adaptive scaffolding provided by human tutors, who assisted learners with planning and monitoring their learning activities and using different strategies to learn. In a fixed scaffolding condition, students were presented with 10 domain-specific learning objectives to guide their learning. SRL interventions can also come in the form of prompts, which stimulate or suggest appropriate metacognitive and cognitive strategies in online learning (e.g. Bednall and Kehoe Citation2011; Sonnenberg and Bannert Citation2015).

Not all SRL interventions are equally effective in promoting student achievement (Dignath and Büttner Citation2008). Although researchers have identified several SRL intervention characteristics that moderate their effectiveness (e.g. Jansen et al. Citation2019; Zheng Citation2016), there has been little discussion about what makes SRL interventions effective in online or blended learning environments. Therefore, investigating the characteristics of SRL interventions in these learning contexts is another focus of the current meta-analysis.

2.3. Previous meta-analyses

Multiple meta-analyses have examined the effectiveness of SRL interventions on students’ academic performance. Most of them focused on populations of adult learners or students in higher education. This might be due to the convenience of conducting SRL interventions with adult learners, especially in online learning environments.

Broadbent and Poon (Citation2015) investigated whether SRL strategies correlated with academic achievement in online higher education settings by examining articles published from 2004 to 2014. They identified 12 studies and found that the strategies of time management (r = .14), metacognition (r = .06), effort regulation (r = .11), and critical thinking (r = .07) were positively correlated with academic outcomes, whereas rehearsal (r = −.03), elaboration (r = .00), and organisation (r = .00) had the least empirical support. Furthermore, they found that the contribution of SRL strategies to achievement appeared weaker in online settings and suggested that currently unexplored factors may be more important in online contexts.

Examining studies between 1997 and 2010, Richardson, Abraham, and Bond (Citation2012) investigated whether university students’ grade point average (GPA) correlated with constructs within and across five research domains – personality traits, motivational factors, self-regulatory learning strategies, students’ approaches to learning, and psychosocial contextual influences. Their results demonstrated that metacognition (r = .18), critical thinking (r = .15), elaboration (r = .18), and concentration (r = .16) had small, significant, positive correlations with GPA. By contrast, measures of organisation and rehearsal were not significantly associated with GPA. As for behavioural self-regulation, they found that time/study management (r = .22), help seeking (r = .15), and peer learning (r = .13) had small positive correlations with GPA, while effort regulation (r = .32) showed a medium, positive correlation with GPA.

Sitzmann and Ely (Citation2011) investigated how adults regulate their learning of work-related knowledge and skills. Self-regulation theory was used as a conceptual lens for deriving a heuristic framework of 16 fundamental constructs that constitute self-regulated learning. Their findings support theoretical propositions that self-regulation constructs are interrelated – 30% of the corrected correlations among constructs were .50 or greater. Goal level, persistence, effort, and self-efficacy were the self-regulation constructs with the strongest effects on learning. However, four self-regulatory processes – planning, monitoring, help seeking, and emotion control – did not exhibit significant relationships with learning.

In Jansen et al. (Citation2019), meta-analytic structural equation modelling was used to examine whether SRL activity mediates the effect of SRL interventions on students’ academic achievement in higher education. The study complicated the popular belief that SRL facilitates achievement by finding evidence of only partial mediation: part of the effect of SRL interventions on achievement was mediated by SRL activity, although the indirect effect was small (β = 0.05, p < .05).

Two meta-analyses focused on elementary and secondary students in real classrooms or within a school context. Dignath and Büttner (Citation2008) conducted two meta-analyses, one for primary and one for secondary school, to investigate the impact of various training characteristics on training outcomes in terms of academic performance, strategy use, and motivation. Analysing 49 studies conducted with primary school students and 35 studies conducted with secondary school students, the meta-analyses generated 357 effect sizes. They identified an average effect size of 0.69, indicating that SRL can be fostered effectively at both the elementary and secondary levels. In another study focusing on elementary students, Dignath, Buettner, and Langfeldt (Citation2008) analysed the effects of SRL on academic achievement. The results confirmed the effectiveness of SRL training programs at the elementary level: the aggregated effect sizes evaluating the use of cognitive and metacognitive strategies produced a mean effect size of d = .73.

Only one meta-analysis covered learners across all levels (Zheng Citation2016). It investigated research on the effects of SRL scaffolds on academic achievement in computer-based learning environments from 2004 to 2015. The results confirmed that SRL scaffolds generally yielded a significantly positive effect on academic performance (ES = 0.438). The results also indicated that both domain-general and domain-specific scaffolds can support the entire SRL process.

While there have been several meta-analyses on the effectiveness of SRL for students’ academic performance, few included all levels of learners or focused on online and blended learning environments. Furthermore, the findings from previous meta-analyses of SRL are inconsistent. While most studies demonstrated that SRL yielded a positive effect on academic performance (Dignath, Buettner, and Langfeldt Citation2008; Zheng Citation2016), some indicated that SRL only partially mediated effects on achievement (Jansen et al. Citation2019) or that only some SRL constructs were correlated with students’ learning outcomes (Sitzmann and Ely Citation2011). In addition, the majority of studies included in previous meta-analyses used correlational designs and measured SRL with self-report data, which can introduce bias and may not accurately reflect students’ SRL activities (Zimmerman Citation2008; Winne Citation2020). These limitations make it important to revisit the causal influence of SRL.

The present meta-analysis aims to examine the effectiveness of SRL interventions on learners’ academic achievement in online or blended learning environments across all education levels. We confined SRL intervention studies to pre–post designs with a control group, which strengthens the causal interpretation of the interventions’ impact (Thiese Citation2014). We compared SRL phase, SRL scaffolds, and SRL strategies between treatment and control groups. We also investigated possible differential effectiveness due to substantive features of the included studies, such as different educational levels of learners (e.g. elementary, secondary, higher education, informal education), subjects (STEM vs. non-STEM), and learning contexts (e.g. online learning, blended learning, web-based learning, mobile learning).

3. Research questions

This meta-analysis was guided by three research questions:

  1. What is the overall effect of SRL interventions on learners’ academic achievement in online and blended learning environments?

  2. How are the effects of the SRL interventions moderated by substantive features of the included studies, such as learners’ educational level and learning context?

  3. How are the effects of the SRL interventions moderated by the characteristics of SRL interventions, such as SRL strategies, SRL phase, and SRL scaffolds?

4. Method

4.1. Study search

We searched for and identified studies for inclusion in three phases. Database searching was the first phase. An education librarian developed a comprehensive search in ERIC (EBSCO). The ERIC search was then modified for four additional databases: APA PsycInfo (EBSCO), Education Source (EBSCO), Academic Search Ultimate (EBSCO), and Computer Source (EBSCO). Self-regulation, online and blended environments, and academic achievement were the three core concepts of the search. Synonyms and related terms for self-regulation included self-regulation, cognitive strategy, metacognitive technique, motivation strategy, rehearsal, elaboration, monitoring, time management, and self-evaluation. Online and blended environment terms included online, internet, virtual, computer-based, web-based, e-learning, digital environment, and distance education. Academic achievement search terms included achievement, academic ability, grade, test, indicator, academic performance, exam, GPA, score, and learning outcome. To develop search strings for each core concept, the synonyms and related terms were combined with ‘or’ and searched in the title and abstract fields. Additionally, relevant database subject terms for each concept were combined with ‘or’ and searched in the subject field. Then, the search strings for the three core concepts were combined with ‘and.’ After limiting each database search to January 1, 2011 through March 16, 2022, the combined searches retrieved 7,171 results. After deduplication, 5,458 unique references were screened for eligibility.

In abstract screening, the second phase, abstracts were assigned to two authors. Each used the inclusion and exclusion criteria to screen the titles and abstracts of the references. At the end of this stage, 512 articles remained.

The final phase was full-text screening. During the full-text screening, 462 articles and theses were removed. The 50 included articles contain 92 unique studies. Figure 1 shows the three-phase process.

Figure 1. Flow chart of search and identification procedures.


4.2. Criteria for inclusion

Included studies met the following seven criteria.

  1. Examined the effectiveness of SRL interventions on students’ academic performance as measured by standardised tests and/or researcher-designed tests. We excluded articles if they were not about SRL or if they did not include learners’ academic achievement.

  2. Conducted in online or blended learning environments. We excluded articles if the SRL interventions were conducted in face-to-face classrooms or used traditional instructional methods.

  3. Published in a peer-reviewed journal or reported in a thesis from January 1, 2011 to March 16, 2022 and available in English. This time frame includes the most recent studies conducted after the publication of the Handbook of Self-Regulation of Learning and Performance (Zimmerman and Schunk Citation2011). We excluded secondary data analyses, literature reviews, conference papers, and book chapters.

  4. Employed randomised experimental or quasi-experimental designs.

  5. Included an independent control group. Comparison conditions did not involve any form of self-regulated learning strategies but could take place either in traditional teacher-led classroom instruction or in online or blended learning environments.

  6. Included any type of learners in the sample. We included samples from kindergarten level to adult level, from typically developing learners to learners with special needs, and from the formal education system to any type of informal education.

  7. Provided the necessary quantitative information for the calculation or estimation of effect size.

4.3. Study coding

In line with the objectives of this meta-analysis, we limited our analysis to substantive features of the included studies and characteristics of SRL interventions and examined their contributions to between-group heterogeneity. The variables analysed were publication type, educational level, academic subjects, sample size, intervention duration, learning context, and key characteristics of SRL interventions, such as SRL strategies, SRL scaffolds, and SRL phase.

4.3.1. Substantive features of the studies

Substantive features of the studies included publication type, learning environment, and duration and intensity of the intervention. Characteristics about the participants included education level and academic subject area.

The publication type was coded as journal article or thesis. We aimed to investigate whether there is any difference in effect size between published studies and unpublished dissertations.

Previous studies have demonstrated varied effects of technology interventions’ intensity and duration on students’ academic performance (e.g. Cheung and Slavin Citation2013; Xu et al. Citation2019). For this meta-analysis, we simultaneously explored the impact of the duration and intensity of different SRL interventions to determine how best to maximise their effect on students’ academic performance. We coded intensity as minutes per week and duration as length in weeks, and treated both as continuous variables when examining the effect of SRL interventions on students’ academic performance.

Education levels were categorised by elementary (K-6), secondary (7-12), higher education (undergraduate and graduate), and informal education. Studies that had participants from multiple levels were coded as mixed. Learners’ academic subjects were categorised as STEM (e.g. science, technology, engineering, and mathematics) and non-STEM (e.g. language, history, education) (Vo, Zhu, and Diep Citation2017). We coded studies as mixed if the studies included both STEM and non-STEM subjects.

Studies were categorised into four groups based on the learning context: blended, online course/learning system, mobile learning, and web-based. Studies were coded as blended if they were conducted in both an online context and a face-to-face context.

4.3.2. Categorisation of SRL types and SRL measures

The primary purpose of the meta-analysis is to investigate the effect of SRL interventions on learners’ academic achievement. We coded the characteristics of SRL interventions as SRL strategies, SRL scaffolds, and SRL phase.

As stated above, the SRL strategies targeted by interventions were categorised into cognitive, metacognitive, resource management, and emotional strategies. We coded studies as mixed if they employed more than one strategy. Preparatory, performance, and appraisal were used as the SRL phases. Learners prepare for learning in the preparatory phase by examining tasks, making plans, and defining objectives. During the performance phase, learners engage in SRL strategies, assessment of learning, and control. In the appraisal phase, learners reflect on and evaluate their learning (Puustinen and Pulkkinen Citation2001; Pintrich Citation2000). Studies investigating multiple phases were coded as mixed.

We coded SRL scaffolds to indicate the delivery methods or forms of SRL intervention. SRL scaffolds were coded as prompts or hints, concept maps, integrated SRL tools, worked examples, videos, and notes, based on Zheng’s (Citation2016) coding scheme (also see Delen, Liew, and Willson Citation2014).

Two independent coders coded each study and then met to check for agreement. When a coding dispute occurred, the two coders reevaluated the study and discussed which code was most appropriate. They reached 95% agreement across all features and characteristics. When agreement could not be reached, the fourth author met with the two coders to resolve the issue, resulting in 100% agreement.

4.4. Effect size calculation

Based on reported means and standard deviations, we used procedures described by Lipsey and Wilson (Citation2001) to calculate an unbiased effect size (Cohen’s d) for the SRL intervention on learners’ academic performance under online and blended learning environments for each study. In the absence of reported means or standard deviations, we employed other procedures prescribed by Lipsey and Wilson (Citation2001) using p-values, test statistics (t or F), and/or confidence intervals to estimate an effect size. For example, we used F-statistics to calculate the effect size for Lam (Citation2014). For several of the included studies, we contacted the authors of the original study to obtain the data to compute effect size (e.g. Gu and Lee Citation2019; Long and Aleven Citation2017).
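For readers who wish to reproduce these conversions, the two cases above can be sketched in a few lines of Python. This is a minimal illustration of the standard formulas, not the exact routines used in the study; the Hedges small-sample correction is what makes the estimate “unbiased”:

```python
import math

def cohens_d(m_t, sd_t, n_t, m_c, sd_c, n_c):
    """Standardised mean difference from group summary statistics,
    with Hedges' small-sample correction applied."""
    # Pooled standard deviation across treatment and control groups
    sp = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                   / (n_t + n_c - 2))
    d = (m_t - m_c) / sp
    j = 1 - 3 / (4 * (n_t + n_c) - 9)  # Hedges' correction factor
    return d * j

def d_from_f(f_stat, n_t, n_c, sign=1):
    """Effect size recovered from a two-group F statistic
    (t = sqrt(F)); the sign comes from the reported direction."""
    return sign * math.sqrt(f_stat * (n_t + n_c) / (n_t * n_c))
```

For example, two groups of 30 with means 12 and 10 and a common standard deviation of 4 give a corrected effect size of about 0.49, slightly below the raw d of 0.50.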

4.5. Statistical analysis

A traditional random-effects meta-analysis was used to estimate the overall effect size. Next, we conducted a series of moderator/subgroup analyses, one moderator at a time. Effect size heterogeneity was tested using the Q formulas provided by Borenstein et al. (Citation2009).
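As an illustration of this procedure, the following Python sketch implements the DerSimonian-Laird random-effects estimator together with Cochran's Q. This is one common random-effects method, offered as an assumption for illustration; the text does not name the specific estimator used:

```python
import math
import numpy as np

def dersimonian_laird(d, v):
    """Random-effects pooled effect via the DerSimonian-Laird
    estimator, plus Cochran's Q heterogeneity statistic."""
    d, v = np.asarray(d, dtype=float), np.asarray(v, dtype=float)
    w = 1.0 / v                              # fixed-effect weights
    fixed = np.sum(w * d) / np.sum(w)        # fixed-effect estimate
    q = np.sum(w * (d - fixed) ** 2)         # Cochran's Q, df = k - 1
    k = len(d)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                  # random-effects weights
    pooled = np.sum(w_re * d) / np.sum(w_re)
    se = math.sqrt(1.0 / np.sum(w_re))
    return pooled, se, q, tau2
```

A Q value well above its degrees of freedom, as reported in the Results (Q = 513.22, df = 91), is what motivates the random-effects model and the moderator analyses that follow.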

To see how study-level covariates affect the effect sizes, we used the meta regress command in Stata to investigate whether and how between-study heterogeneity can be explained by these moderators. Meta-regression was run for each categorical covariate, such as subject type, education level, learning context, and the four measures of SRL. We also examined the relation of the overall effect size to the two continuous covariates, intensity and duration of the intervention.

To explore potential publication bias, we conducted two additional analyses. One was the moderator analysis that tested the difference in effect sizes between published and unpublished studies (Polanin, Tanner-Smith, and Hennessy Citation2016). The other was a funnel plot asymmetry assessment using Duval’s (Citation2005) trim-and-fill analysis.
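Trim-and-fill itself is usually run through packaged routines (e.g. Stata's meta trimfill). As a simpler, related funnel-asymmetry check, shown here purely as an illustration rather than the method used in the study, Egger's regression test regresses the standardised effect on precision and inspects the intercept:

```python
import numpy as np

def egger_test(d, se):
    """Egger's regression test for funnel-plot asymmetry:
    regress the standardised effect (d/se) on precision (1/se);
    an intercept far from zero suggests small-study effects."""
    d, se = np.asarray(d, dtype=float), np.asarray(se, dtype=float)
    y = d / se                       # standardised effects
    x = 1.0 / se                     # precisions
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    k = len(d)
    s2 = resid @ resid / (k - 2)     # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)
    t_intercept = beta[0] / np.sqrt(cov[0, 0])
    return beta[0], t_intercept
```

In a symmetric funnel the intercept sits near zero; a large positive intercept would suggest that small, imprecise studies report inflated effects.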

5. Results and discussion

5.1. Overall results

Our screening resulted in 50 articles that met our eligibility criteria. The studies were published between January 1, 2011 and March 16, 2022. Some of the articles included more than one study; in total, there were 92 independent studies from the 50 articles. Among these studies, 51 were published between 2011 and 2015, and 41 were published between 2016 and 2019. Seventy-one studies were quasi-experimental, whereas the other 21 used randomised experimental designs. The intensity of the SRL interventions varied from 6 to 330 min per week. The durations of the SRL interventions extended from one week to 36 weeks. The sample sizes of the studies ranged from 20 to 2,536. The selected studies and their features and characteristics are compiled in Table 1.

Table 1. Substantive features and characteristics of SRL features of the included studies.

We performed both fixed-effect and random-effects meta-analyses. The fixed-effect model with the inverse-variance method gives an overall effect size of .55 (95% CI [.52, .59]). The random-effects model gives a larger overall effect size of .63 (95% CI [.47, .79]). This implies that SRL interventions had a positive, moderate effect on students’ academic performance in online and blended learning environments. The Q value (Q = 513.22, df = 91, p < .001) indicated significant variation in the effect sizes across the observed studies. In the following analyses, random-effects models are used to explore different sets of moderators (both categorical and continuous) to explain this variation.

5.2. Subgroup analysis and regression analysis

5.2.1. Article type

The effect size for published studies from refereed journals (n = 86) was .634, slightly larger than the effect size for studies from unpublished theses (.491, n = 6). However, no significant difference was found between studies from refereed journals and unpublished studies (QB = .08, df = 1, p > .05). Meta-regression showed that studies from journals significantly predicted the overall effect size (coefficient = .64, p < .001), while studies from unpublished theses did not (coefficient = .45, p > .05). Thus, although no significant difference was found between published and unpublished studies, caution is needed in interpreting these results due to the limited number of studies categorised as unpublished (n = 6).

5.2.2. Subject type

Studies from STEM subjects (n = 51) exhibited a larger effect size (.88), while the effect size for studies from non-STEM subjects (n = 40) was .40. The single mixed study had a negative effect size (−.095). A statistically significant difference was detected between these subject types (QB = 18.83, df = 1, p < .05). Meta-regression results showed that both STEM and non-STEM studies significantly (p < .05) predicted the overall effect size, with a larger coefficient for STEM studies (.83) than for non-STEM studies (.39).

Our findings were consistent with previous meta-analyses. Dignath and Büttner (Citation2008) found that approximately 44% of the variation in effect sizes for mathematics performance at primary school can be explained by SRL interventions, while only 19% of the variation in effect sizes for reading/writing performance can be explained by SRL interventions. Li et al. (Citation2018) found different effect sizes for the different academic domains, indicating that SRL strategies were more effective for improving science performance than for language learning.

Previous studies also demonstrated that STEM subjects such as science, engineering, and mathematics require students to invest more effort, resources, and time into learning (Wilson et al. Citation2012). For instance, learning mathematics demands more cognitive skills than learning social sciences and linguistics (Loong Citation2012). Also, Wolters and Pintrich (Citation1998) examined SRL of junior high school students in mathematics, language, and social sciences, and they reported that both male and female students perceived mathematics as more important and interesting than English or social studies. In recent years, learning activities within STEM disciplines have increased in popularity among students (Miller, Sonnert, and Sadler Citation2018). This could lead to higher motivation of learners to follow the SRL interventions in STEM subjects and partly explain our results that SRL interventions produced better learning outcomes for students in STEM subjects compared to non-STEM subjects.

Our findings, together with previous reviews showing that SRL interventions produce larger effect sizes in STEM subjects, suggest that future research should explore why SRL interventions are more effective in STEM than in non-STEM subjects: for example, which characteristics of STEM subjects call for more SRL support, and which kinds of SRL interventions work best there.

5.2.3. Learning context

Studies were categorised into four groups based on the learning context: blended, online course/learning system, mobile learning, and web-based. The online course/online learning system context (n = 55) produced the largest effect size (.72); the blended learning context (n = 9) came next (.55). The effect size was .52 for the web-based context (n = 23), and the smallest effect size (.41) was found for the mobile learning context (n = 5). The QB value indicated that the difference among groups was not statistically significant (QB = 1.90, df = 3, p > .05). When learning context was used to linearly predict the effect size, the online learning, web-based, and blended contexts were significant predictors (p < .01, with coefficients .70, .56, and .55, respectively), while the mobile context was not (p > .05). This could be due to the small number of mobile studies; we had only five.

We had only nine studies in the blended learning context and five in the mobile learning context, so these results should be interpreted cautiously. However, the fact that the online learning context produced the largest effect size with the largest number of studies reflects the trend for learning to move online. That online, web-based, and blended learning were significant predictors of the effect suggests that researchers should focus more on these learning contexts. Previous studies demonstrated that online learning and traditional instruction have their own advantages, and technology-based learning cannot always replace traditional instruction completely (e.g. Jeffries, Woolf, and Linde 2003; Vilardi 2013; Wagner, Garippo, and Lovaas 2011). SRL interventions might be needed even more in blended learning contexts. For instance, Hadwin, Järvelä, and Miller's (2011) theory explains the role of SRL in collaborative learning. Their socially shared regulation of learning (SSRL) model contends that adapting to interactions with others is part of SRL. In addition to regulating their own learning, learners need to share regulation, for example by articulating their task perceptions and goals with others, to learn effectively (Hadwin et al. 2010). This need is more easily met in traditional instruction, where students can readily engage in social interactions. Future studies could investigate whether the effect of SRL interventions differs between online and blended learning.

5.2.4. Education level

Studies from the four education levels yielded varied effect sizes. Elementary education (n = 7) produced the largest effect size (2.21), followed by informal education (n = 2) with .74. Higher education (n = 60) and secondary education (n = 23) had effect sizes of .58 and .43, respectively. The QB value indicated that the difference among groups was not statistically significant (QB = 5.19, df = 3, p > .05). In the meta-regression, the elementary, secondary, and higher education levels significantly predicted the overall effect size (p < .05), with coefficients of 1.76, .42, and .59, respectively. For informal/adult learners, the coefficient was .72 (p = .18).

Most previous reviews of the effect of SRL on students' learning outcomes focused on the higher education level (e.g. Broadbent and Poon 2015; Jansen et al. 2019). Considering the convenience of conducting SRL interventions with students in online and blended learning environments, it is understandable that the majority of research falls into the higher education category.

Our findings showed that elementary education produced the largest effect size, reinforcing that primary school students can already benefit from SRL interventions (Stoeger, Fleischmann, and Obergriesser 2015; Dignath, Buettner, and Langfeldt 2008). Hattie et al. (1996) found that learning skills interventions are more effective at primary school than at the secondary and college levels. Similarly, Dignath and Büttner (2008) identified a larger effect size for math achievement in primary school than in secondary school.

However, previous research reached different conclusions about the elementary school level. Zheng (2016) included 29 studies and compared the effect sizes of SRL scaffolds across primary school, secondary school, and university levels, finding no significant difference among these three stages. Another inconsistent finding comes from Li et al.'s (2018) meta-analysis, which found that the effect size of SRL on academic performance in primary school was smaller than in junior and senior high school. However, that meta-analysis was limited to Chinese students, and the majority (93%) of the studies it reviewed were cross-sectional.

The difference between previous findings and ours suggests that researchers should compare specific features of their interventions to identify what drives the differential effects of SRL interventions across educational levels. Despite these mixed findings, our results suggest that future researchers in the SRL field should put more emphasis on early-age learners. We hypothesise that early SRL interventions might have larger beneficial effects on students' learning outcomes than interventions delivered to older age groups.

5.2.5. SRL strategy

The effect size for interventions using mixed self-regulation strategies (n = 54) was .92. Using cognitive strategies (n = 4) led to a smaller effect size (.78). The other three SRL strategies were resource management (n = 2, effect size .46), metacognitive strategies (n = 31, effect size .19), and emotional strategies (n = 1, effect size .19). The QB value (QB = 27.36, df = 4, p < .001) indicates a statistically significant difference among strategy types. In the meta-regression, mixed strategies significantly predicted the effect size, with the largest coefficient (.87, p < .001); cognitive strategies were also a significant predictor (.78, p < .05). Metacognitive strategies, resource management, and emotional strategies did not significantly predict the effect size (p = .21, .45, and .19, respectively).

Our results demonstrated that the effect of a mixed SRL strategy on achievement is larger than that of any strategy used individually. This finding is consistent with one of the few studies that also compared general SRL activity with single aspects of SRL (Jansen et al. 2019). Their results indicated that measured SRL activity, including resource management and metacognition, was more strongly correlated with achievement than a resource management strategy alone. Based on this finding, we encourage the use of multiple SRL strategies in SRL interventions to promote learning outcomes.

5.2.6. SRL phase

When SRL was applied in the performance phase (n = 17), the effect size was .36, compared to .15 for studies applied in the appraisal phase (n = 8). The largest effect size (.78) was observed when SRL was implemented across mixed phases (n = 66). One study fell into the preparatory phase category, with an effect size of .13. Group differences were statistically significant (QB = 20.05, df = 3, p < .001). In the meta-regression, applying SRL in mixed phases significantly increased the overall effect (coefficient .75, p < .001), while the appraisal phase did not (coefficient .23, p > .05). The coefficient for the performance phase was .37 (p < .05), and for the preparatory phase .13 (p > .05).

In a previous meta-analysis, Li et al. (2018) identified the performance phase and the appraisal phase as critical and found no significant difference in effect sizes between them. Only one of our included studies fell into the preparatory phase category. We suggest that future studies integrate SRL interventions from the start; we believe early interventions beginning in the preparatory phase will have a positive impact on students' learning performance.

Our present findings showed significant differences among the four phases (preparatory, performance, appraisal, and mixed), with mixed phases producing the largest effect size. This is consistent with previous studies: for instance, DiBenedetto and Zimmerman (2010) found that high school students with higher science performance used more SRL subprocesses. This suggests that SRL interventions should support SRL across multiple phases rather than focusing on a single phase.

5.2.7. SRL scaffolds

Effect sizes varied with the scaffolds used to implement the SRL intervention. The four studies that used notes as scaffolds had an effect size of 1.25, and the three studies that showed videos had an effect size of 1.20. The estimated mean effect size for studies utilising mixed scaffolds (n = 25) was .96. The group that provided integrated SRL tools had the largest sample size (n = 32), with an effect size of .63. The effect sizes for utilising worked examples (n = 1), providing concept maps (n = 3), and giving only prompts or hints (n = 24) were .42, .36, and .24, respectively. The QB value (QB = 19.89, df = 6, p < .01) indicated statistically significant differences among these types of SRL scaffolds. In the meta-regression, notes, videos, mixed scaffolds, and integrated SRL tools significantly predicted the effect of SRL (p < .05), with coefficients of 1.25, 1.08, .84, and .64, respectively; worked examples and concept maps did not significantly change the effect size (p > .05).

Our results showed that using notes and videos produced the largest effect sizes. However, this result should be interpreted with caution, as the sample sizes are very small: four studies used notes as scaffolds and three used videos. The result for the integrated SRL tool, the category with the largest sample size, corroborates to some degree Zheng's (2016) finding that the integrated SRL tool was associated with the largest effect size. The integrated SRL tool can provide support throughout the whole learning process, so this finding is also in line with the need to spread interventions across multiple SRL phases. This suggests that SRL interventions relying on only one scaffold are not sufficient. Our study also confirmed the effectiveness of mixed scaffolds, which is unsurprising, as these SRL scaffolds generally involve more functions than single scaffolds (see Tables 2 and 3).

Table 2. Subgroup analysis.

Table 3. Effect size regression analysis.

5.2.8. Intensity and duration of intervention

We also performed two separate meta-regressions, each using one of the two continuous covariates, intervention duration and intervention intensity, as the predictor. A simple linear regression indicated that the overall effect size was not significantly predicted by either the intervention's duration or its intensity (p > .05). When we added a quadratic term for duration to the regression model, neither the linear nor the quadratic term significantly predicted the effect size.
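
A meta-regression with a continuous moderator such as duration can be sketched as a weighted least-squares fit in which each study is weighted by its inverse variance. The data below are illustrative, not taken from the included studies, and this minimal fixed-effect sketch omits the random-effects variance component a full analysis would add:

```python
import numpy as np

def meta_regression(effects, variances, moderator):
    """Fixed-effect meta-regression of effect size on one continuous moderator.

    Each study is weighted by its inverse variance; multiplying the design
    matrix and outcome by sqrt(weight) turns the weighted fit into an
    ordinary least-squares problem. Returns (intercept, slope).
    """
    y = np.asarray(effects, dtype=float)
    x = np.asarray(moderator, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    X = np.column_stack([np.ones_like(x), x])  # intercept + moderator
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0], beta[1]

# Hypothetical studies whose effect sizes do not vary systematically with
# intervention duration (in weeks): the fitted slope should be near zero.
durations = [2, 4, 6, 8, 10]
effects = [0.61, 0.65, 0.58, 0.66, 0.60]
variances = [0.05, 0.04, 0.06, 0.05, 0.04]
intercept, slope = meta_regression(effects, variances, durations)
```

A slope indistinguishable from zero, as reported above for both duration and intensity, means the moderator carries no detectable linear signal about intervention effectiveness.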

When interpreting the effect of SRL interventions' intensity and duration on students' academic performance, much information is missing from the included studies: 22 studies did not report the intensity of the intervention, and 13 did not report its duration. Since previous studies identified varied effects of technology's intensity and duration on students' learning outcomes (e.g. Cheung and Slavin 2013; Xu et al. 2019), we urge scholars in the field of SRL to provide more detailed information about the intensity and duration of their interventions.

5.3. Publication bias analysis

The significant regression-based Egger test (Egger, Smith, and Phillips 1997) for small-study effects (z = 6.85, p < .001) necessitated further exploration of possible publication bias. The funnel plot (see Figure 2) appeared asymmetric, indicating potential publication bias. Imputation results showed that 13 additional studies might be needed, which would reduce the effect size from .63 (based on the observed studies only) to .38. A nonparametric trim-and-fill analysis of publication bias is also provided (see Figure 3).
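
The Egger test used here regresses the standardized effect against precision; an intercept far from zero signals funnel-plot asymmetry. A minimal sketch with made-up numbers (not the study data, and without the standard-error-based z-statistic a full implementation would report):

```python
import numpy as np

def egger_test(effects, std_errors):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses the standardized effect (effect / SE) on precision (1 / SE).
    The intercept estimates asymmetry (0 = symmetric funnel); the slope
    estimates the underlying pooled effect. Returns (intercept, slope).
    """
    se = np.asarray(std_errors, dtype=float)
    y = np.asarray(effects, dtype=float) / se   # standardized effects
    x = 1.0 / se                                # precisions
    slope, intercept = np.polyfit(x, y, 1)      # highest degree first
    return intercept, slope

# A perfectly symmetric funnel: every study estimates the same effect (0.5),
# so the intercept is near zero regardless of study precision.
ses = [0.1, 0.2, 0.3, 0.4]
intercept, slope = egger_test([0.5] * 4, ses)
```

When small, imprecise studies systematically show larger effects, the standardized effects of low-precision studies sit above the regression line through the origin, pushing the intercept away from zero, which is the pattern the significant test above flags.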

Figure 2. Funnel Plot.

Figure 3. Nonparametric trim-and-fill analysis of publication bias.

6. Conclusion and future directions

Steady advances in research on SRL across diverse educational contexts have occurred since the 1980s (Panadero 2017; Pintrich et al. 1993; Zimmerman 1986), and SRL has become one of the major areas of research in educational psychology. The present research analysed empirical studies on the effects of SRL interventions on learners' academic achievement in online and blended learning environments from 2011 onwards, when a new handbook presenting a variety of established methods to evaluate SRL opened a new era for this field of research. Online education became common, and often necessary, practice for K-12 and higher education during the COVID-19 global pandemic in 2020; therefore, we investigated empirical studies on SRL interventions in both online and blended environments.

Consistent with prior meta-analyses (e.g. Broadbent and Poon 2015; Zheng 2016), the present meta-analysis confirmed a positive and moderate effect of SRL interventions (ES = 0.63) on learners' academic achievement in online and blended environments. This demonstrates that SRL interventions in online and blended learning environments are promising for improving learners' academic achievement. With the development of diversified educational technology applications, additional research on the specific components of SRL interventions in online and blended environments is warranted, particularly in the post-COVID-19 era.

The present meta-analysis contributes to the literature and is unique from prior meta-analyses on SRL in several ways. First, it covers learners from early childhood to adulthood, in both formal school settings and informal educational settings. Second, it connects SRL interventions with learners' academic achievement and limits the included studies to online and blended learning environments. Third, it covers all core academic subjects, including language, arts, science, and mathematics. We aimed to find out whether the effects of SRL interventions differ across core academic subjects, educational levels, and other substantive features of the studies. We also attempted to identify the differential effects of SRL intervention features on learners' academic achievement in online and blended learning.

The present research also contributes to understanding how well online or blended education works for students across various school or academic subjects. We found that SRL interventions are more effective for STEM subjects and produce the largest effect size for early-age learners in online and blended learning environments. The results also showed that most SRL intervention studies focus on higher education, even though elementary education produced the largest effect size; future researchers in the SRL field might therefore put more emphasis on early-age learners. We also found that the online learning context, which contained the largest number of studies, produced the largest effect size, illustrating a trend for learning to move online that warrants further research attention. Unsurprisingly, the present study revealed that mixed SRL strategies and interventions spanning multiple phases generate the largest effects on students' learning outcomes. These findings can inform the development and implementation of instructional design, curriculum, and SRL interventions in online and blended learning environments. However, they also need to be replicated by additional empirical studies, particularly in the post-COVID-19 era.

The primary limitation of this meta-analysis is the small number of studies in certain categories. Between 2011 and 2022, 92 independent studies from 50 articles were included. No differences were found between published studies from refereed journal articles and unpublished theses, although results also showed that only journal articles significantly predicted the overall effect size of SRL interventions on learners' academic performance in online and blended learning environments. However, there were only 6 thesis studies compared to 86 journal article studies; when additional thesis studies become available, these results should be reexamined. Similarly, the finding that notes and videos used as scaffolds produced the largest effect sizes needs to be interpreted with caution, because only four studies used notes and three used videos. Future research should further investigate the effectiveness of note-taking and video scaffolds in online and blended learning environments. Additional studies in these areas will allow further meta-analyses to deepen our understanding of the effectiveness of these scaffolds, which is of particular importance given how timely SRL interventions and online or blended learning are in the post-COVID era.

In conclusion, online and blended learning has become increasingly prevalent in the twenty-first century, and in many parts of the world, the COVID-19 pandemic has only accelerated the adoption of virtual or remote learning environments for formal and informal learning with learners ranging from childhood to adulthood. While online or blended learning offers learners great flexibility and autonomy to learn at a time, in a space, and at a pace of their choice, it also requires learners of all ages to develop and utilise some level of SRL to benefit from their virtual or remote learning environments. The results from this meta-analysis highlight key features that contribute to the efficacy of SRL interventions in online or blended environments for students in elementary education, secondary education, higher education, and adult learning settings. Findings from this meta-analysis offer valuable guidance for the design and implementation of SRL interventions as well as online or blended curriculum and learning environments.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • References marked with an asterisk indicate studies included in the meta-analysis.
  • Azevedo, R. 2005. “Using Hypermedia as a Metacognitive Tool for Enhancing Student Learning? The Role of Self-Regulated Learning.” Educational Psychologist 40 (4): 199–209. doi:10.1207/s15326985ep4004_2.
  • Azevedo, R., and J. G. Cromley. 2004. “Does Training on Self-Regulated Learning Facilitate Students’ Learning with Hypermedia?” Journal of Educational Psychology 96: 523–535. doi:10.1037/0022-0663.96.3.523.
  • *Bannert, M., and P. Reimann. 2012. “Supporting Self-Regulated Hypermedia Learning Through Prompts.” Instructional Science 40 (1): 193–211. doi:10.1007/s11251-011-9167-4.
  • *Bannert, M., C. Sonnenberg, C. Mengelkamp, and E. Pieger. 2015. “Short-and Long-Term Effects of Students’ Self-Directed Metacognitive Prompts on Navigation Behavior and Learning Performance.” Computers in Human Behavior 52: 293–306. doi:10.1016/j.chb.2015.05.038.
  • Baumeister, R. F., and K. D. Vohs. 2016. “Strength Model of Self-Regulation as Limited Resource: Assessment, Controversies, Update.” In Advances in Experimental Social Psychology, edited by J. M. Olson, and M. P. Zanna, Vol. 54, 67–127. San Diego, CA: Academic Press. doi:10.1016/bs.aesp.2016.04.001.
  • *Bednall, T. C., and E. J. Kehoe. 2011. “Effects of Self-Regulatory Instructional Aids on Self-Directed Study.” Instructional Science 39 (2): 205–226. doi:10.1007/s11251-009-9125-6.
  • *Bernacki, M. L., L. Vosicka, and J. C. Utz. 2020. “Can a Brief, Digital Skill Training Intervention Help Undergraduates “Learn to Learn” and Improve Their STEM Achievement?” Journal of Educational Psychology 112 (4): 765–781. doi:10.1037/edu0000405.
  • Borenstein, M., L. V. Hedges, J. P. T. Higgins, and H. R. Rothstein. 2009. Introduction to Meta-Analysis. Chichester: John Wiley & Sons, Ltd.
  • *Botero, G. G., M. A. Botero Restrepo, C. Zhu, and F. Questier. 2021. “Complementing in-Class Language Learning with Voluntary Out-of-Class MALL. Does Training in Self-Regulation and Scaffolding Make a Difference?” Computer Assisted Language Learning 34 (8): 1013–1039. doi:10.1080/09588221.2019.1650780.
  • *Boykin, A., A. S. Evmenova, K. Regan, and M. Mastropieri. 2019. “The Impact of a Computer-Based Graphic Organizer with Embedded Self-Regulated Learning Strategies on the Argumentative Writing of Students in Inclusive Cross-Curricula Settings.” Computers & Education 137: 78–90. doi:10.1016/j.compedu.2019.03.008.
  • Broadbent, J., and W. L. Poon. 2015. “Self-regulated Learning Strategies & Academic Achievement in Online Higher Education Learning Environments: A Systematic Review.” The Internet and Higher Education 27: 1–13. doi:10.1016/j.iheduc.2015.04.007.
  • *Cai, R., Q. Wang, J. Xu, and L. Zhou. 2020. “Effectiveness of Students’ Self-Regulated Learning During the COVID-19 Pandemic.” Science INSIGHTS 34 (1): 175–182. doi:10.15354/si.20.ar011.
  • *Chang, C.-Y., P. Panjaburee, H.-C. Lin, C.-L. Lai, and G.-H. Hwang. 2022. “Effects of Online Strategies on Students’ Learning Performance, Self-Efficacy, Self-Regulation and Critical Thinking in University Online Courses.” Educational Technology Research & Development 70: 185–204. doi:10.1007/s11423-021-10071-y.
  • *Chase, J. A., R. Houmanfar, S. C. Hayes, T. A. Ward, J. P. Vilardaga, and V. Follette. 2013. “Values Are Not Just Goals: Online ACT-Based Values Training Adds to Goal Setting in Improving Undergraduate College Student Performance.” Journal of Contextual Behavioral Science 2 (3–4): 79–84. doi:10.1016/j.jcbs.2013.08.002.
  • *Chen, C. M., L. C. Chen, and S. M. Yang. 2019. “An English Vocabulary Learning app with Self-Regulated Learning Mechanism to Improve Learning Performance and Motivation.” Computer Assisted Language Learning 32 (3): 237–260. doi:10.1080/09588221.2018.1485708.
  • *Chen, C. M., and S. H. Huang. 2014. “Web-Based Reading Annotation System with an Attention Based Self-Regulated Learning Mechanism for Promoting Reading Performance.” British Journal of Educational Technology 45 (5): 959–980. doi:10.1111/bjet.12119.
  • *Chen, C. M., J. Y. Wang, and Y. C. Chen. 2014. “Facilitating English-Language Reading Performance by a Digital Reading Annotation System with Self-Regulated Learning Mechanisms.” Journal of Educational Technology & Society 17 (1): 102–114.
  • Cheung, A. C., and R. E. Slavin. 2013. “The Effectiveness of Educational Technology Applications for Enhancing Mathematics Achievement in K-12 Classrooms: A Meta-Analysis.” Educational Research Review 9: 88–113. doi:10.1016/j.edurev.2013.01.001.
  • Delen, E., and J. Liew. 2016. “The Use of Interactive Environments to Promote Self-Regulation in Online Learning: A Literature Review.” European Journal of Contemporary Education 15: 24–33. doi:10.13187/ejced.2016.15.24.
  • *Delen, E., J. Liew, and V. Willson. 2014. “Effects of Interactivity and Instructional Scaffolding on Learning: Self-Regulation in Online Video-Based Environments.” Computers & Education 78: 312–320. doi:10.1016/j.compedu.2014.06.018.
  • Dent, A. L., and A. C. Koenka. 2016. “The Relation between Self-Regulated Learning and Academic Achievement across Childhood and Adolescence: A Meta-Analysis.” Educational Psychology Review 28 (3): 425–474.
  • DiBenedetto, M. K., and B. J. Zimmerman. 2010. “Differences in Self-Regulatory Processes among Students Studying Science: A Microanalytic Investigation.” The International Journal of Educational and Psychological 5: 2–24.
  • Dignath, C., G. Buettner, and H.-P. Langfeldt. 2008. “How Can Primary School Students Learn Self-Regulated Learning Strategies Most Effectively?” Educational Research Review 3 (2): 101–129. doi:10.1016/j.edurev.2008.02.003.
  • Dignath, C., and G. Büttner. 2008. “Components of Fostering Self-Regulated Learning among Students: A Meta-Analysis on Intervention Studies at Primary and Secondary School Level.” Metacognition and Learning 3 (3): 231–264. doi:10.1007/s11409-008-9029-x.
  • Duval, S. 2005. “The Trim and Fill Method.” In Publication Bias in Meta Analysis: Prevention, Assessment and Adjustments, edited by H. Rothstein, A. J. Sutton, and M. Borenstein, 127–144. Chichester, UK: Wiley.
  • Egger, M., G. D. Smith, and A. N. Phillips. 1997. “Meta-analysis: Principles and Procedures.” British Medical Journal 315: 1533–1537. doi:10.1136/bmj.315.7121.1533.
  • *Eslami, M., and R. Sahragard. 2021. “Investigating the Effect of Self-Regulatory Strategy Development on Iranian EFL Learners’ Metadiscoursal Writing Skill.” Language Teaching Research Quarterly 21: 54–65. doi:10.32038/ltrq.2021.21.04.
  • Fournier, H., R. Kop, and G. Durand. 2014. “Challenges to Research in MOOCs.” Journal of Online Learning and Teaching 10 (1): 1.
  • *Gu, P., and Y. Lee. 2019. “Promoting Students’ Motivation and Use of SRL Strategies in the Web-Based Mathematics Learning Environment.” Journal of Educational Technology Systems 47 (3): 391–410. doi:10.1177/0047239518808522.
  • Hadwin, A. F., S. Järvelä, and M. Miller. 2011. “Self-regulated, Co-regulated, and Socially Shared Regulation of Learning.” In Handbook of Self-Regulation of Learning and Performance, edited by B. J. Zimmerman, and D. H. Schunk, 65–84. New York, NY: Routledge.
  • Hadwin, A. F., M. Oshige, C. L. Z. Gress, and P. H. Winne. 2010. “Innovative Ways for Using GStudy to Orchestrate and Research Social Aspects of Self-Regulated Learning.” Computers in Human Behavior 26 (5): 794–805. doi:10.1016/j.chb.2007.06.007.
  • Hattie, J., J. Biggs, and N. Purdie. 1996. “Effects of Learning Skills Interventions on Student Learning: A Meta-Analysis.” Review of Educational Research 66 (2): 99–136.
  • *Hsiao, H. S., C. C. Tsai, C. Y. Lin, and C. C. Lin. 2012. “Implementing a Self-Regulated WebQuest Learning System for Chinese Elementary Schools.” Australasian Journal of Educational Technology 28 (2): 315–340. doi:10.14742/ajet.876.
  • *Hu, H., and M. P. Driscoll. 2013. “Self-regulation in e-Learning Environments: A Remedy for Community College?” Journal of Educational Technology & Society 16 (4): 171–184. https://www.jstor.org/stable/jeductechsoci.16.4.171.
  • Jansen, R. S., A. Van Leeuwen, J. Janssen, S. Jak, and L. Kester. 2019. “Self-regulated Learning Partially Mediates the Effect of Self-Regulated Learning Interventions on Achievement in Higher Education: A Meta-Analysis.” Educational Research Review 28: 100292. doi:10.1016/j.edurev.2019.100292.
  • Jeffries, P. R., S. Woolf, and B. Linde. 2003. “Technology-Based vs. Traditional Instruction: A Comparison of Two Methods for Teaching the Skill of Performing a 12-Lead ECG.” Nursing Education Perspectives 24 (2): 70–74.
  • *Kauffman, D. F., R. Zhao, and Y. S. Yang. 2011. “Effects of Online Note Taking Formats and Self-Monitoring Prompts on Learning from Online Text: Using Technology to Enhance Self-Regulated Learning.” Contemporary Educational Psychology 36 (4): 313–322. doi:10.1016/j.cedpsych.2011.04.001.
  • *Kereluik, K. M. 2013. “Scaffolding Self-Regulated Learning Online: A Study in High School Mathematics Classrooms.” Doctoral diss., Michigan State University. Educational Psychology and Educational Technology. Available from ProQuest Dissertations and Theses database. (UMI No. 3604541).
  • *Kim, C., and C. B. Hodges. 2012. “Effects of an Emotion Control Treatment on Academic Emotions, Motivation and Achievement in an Online Mathematics Course.” Instructional Science 40 (1): 173–192. doi:10.1007/s11251-011-9165-6.
  • *Kim, H. J., and S. Pedersen. 2011. “Advancing Young Adolescents’ Hypothesis-Development Performance in a Computer-Supported and Problem-Based Learning Environment.” Computers & Education 57 (2): 1780–1789. doi:10.1016/j.compedu.2011.03.014.
  • *Lai, C. L., G. J. Hwang, and Y. H. Tu. 2018. “The Effects of Computer-Supported Self-Regulation in Science Inquiry on Learning Outcomes, Learning Processes, and Self-Efficacy.” Educational Technology Research and Development 66 (4): 863–892. doi:10.1007/s11423-018-9585-y.
  • *Lam, C. M. 2014. “The Roles of Instruction and Metacognition in Enhancing Self-Regulated Learning in a Computer-Based Learning Environment: An Intervention Programme for High School Chemistry”. Doctoral diss., The Chinese University of Hong Kong). Available from ProQuest Dissertations and Theses database. (UMI No. 3707434).
  • Lee, Y., and J. Choi. 2011. “A Review of Online Course Dropout Research: Implications for Practice and Future Research.” Educational Technology Research and Development 59: 593–618. doi:10.1007/s11423-010-9177-y.
  • *Lee, S., T. Barker, and V. S. Kumar. 2016. “Effectiveness of a Learner-Directed Model for e-Learning.” Journal of Educational Technology & Society 19 (3): 221–233. https://www.jstor.org/stable/10.2307jeductechsoci.19.3.221.
  • León, J., J. L. Núñez, and J. Liew. 2015. “Self-determination and STEM Education: Effects of Autonomy, Motivation, and Self-Regulated Learning on High School Math Achievement.” Learning and Individual Differences 43: 156–163. doi:10.1016/j.lindif.2015.08.017.
  • Li, J., H. Ye, Y. Tang, Z. Zhou, and X. Hu. 2018. “What are the Effects of Self-Regulation Phases and Strategies for Chinese Students? A Meta-Analysis of Two Decades Research of the Association Between Self-Regulation and Academic Performance.” Frontiers in Psychology 9: 2434. doi:10.3389/fpsyg.2018.02434.
  • *Li, M., Y. Wang, H. N. Stone, and N. Turki. 2021. “Teaching Introductory Chemistry Online: The Application of Socio-Cognitive Theories to Improve Students’ Learning Outcomes.” Education Sciences 11 (3): 95. doi:10.3390/educsci11030095.
  • Lipsey, M., and D. Wilson. 2001. Practical Meta-Analysis. Thousand Oaks, CA: Sage.
  • *Long, Y., and V. Aleven. 2017. “Enhancing Learning Outcomes Through Self-Regulated Learning Support with an Open Learner Model.” User Modeling and User-Adapted Interaction 27 (1): 55–88. doi:10.1007/s11257-016-9186-6.
  • Loong, E. T. 2012. “Self-regulated Learning Strategies and Their Effects on Math Performance of Pre-University International Students in Malaysia.” Journal of Education and Vocational Research 3: 89–97. doi:10.22610/JEVR.V3I3.54.
  • *Maree, T. J., J. M. van Bruggen, and W. M. Jochems. 2013. “Effective Self-Regulated Science Learning Through Multimedia-Enriched Skeleton Concept Maps.” Research in Science & Technological Education 31 (1): 16–30. doi:10.1080/02635143.2013.782283.
  • Miller, K., G. Sonnert, and P. Sadler. 2018. “The Influence of Students’ Participation in STEM Competitions on Their Interest in STEM Careers.” International Journal of Science Education, Part B 8 (2): 95–114. doi:10.1080/21548455.2017.1397298.
  • Muenks, K., A. Wigfield, J. S. Yang, and C. R. O’Neal. 2017. “How True is Grit? Assessing its Relations to High School and College Students’ Personality Characteristics, Self-Regulation, Engagement, and Achievement.” Journal of Educational Psychology 109: 599–620. doi:10.1037/edu0000153.
  • Panadero, E. 2017. “A Review of Self-Regulated Learning: Six Models and Four Directions for Research.” Frontiers in Psychology 8: 422. doi:10.3389/fpsyg.2017.00422.
  • Pelletier, L. G., M. S. Fortier, R. J. Vallerand, and N. M. Brière. 2001. “Associations among Perceived Autonomy Support, Forms of Self-Regulation, and Persistence: A Prospective Study.” Motivation and Emotion 25: 279–306. doi:10.1023/A:1014805132406.
  • Pintrich, P. R. 2000. “The Role of Goal Orientation in Self-Regulated Learning.” In Handbook of Self Regulation, edited by M. Boekaerts, P. Pintrich, and M. Zeidner, 452–502. New York: Academic Press.
  • Pintrich, P. R. 2004. “A Conceptual Framework for Assessing Motivation and Self-Regulated Learning in College Students.” Educational Psychology Review 16 (4): 385–407.
  • Pintrich, P. R., D. A. F. Smith, T. Garcia, and W. J. McKeachie. 1991. A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: National Center for Research to Improve Postsecondary Teaching and Learning, University of Michigan.
  • Pintrich, P. R., D. A. F. Smith, T. Garcia, and W. J. McKeachie. 1993. “Reliability and Predictive Validity of the Motivated Strategies for Learning Questionnaire (MSLQ).” Educational and Psychological Measurement 53 (3): 801–813. doi:10.1177/0013164493053003024.
  • Polanin, J. R., E. E. Tanner-Smith, and E. A. Hennessy. 2016. “Estimating the Difference Between Published and Unpublished Effect Sizes: A Meta-Review.” Review of Educational Research 86 (1): 207–236.
  • Puustinen, M., and L. Pulkkinen. 2001. “Models of Self-Regulated Learning: A Review.” Scandinavian Journal of Educational Research 45 (3): 269–286. doi:10.1080/00313830120074206.
  • Puzziferro, M. 2008. “Online Technologies Self-Efficacy and Self-Regulated Learning as Predictors of Final Grade and Satisfaction in College-Level Online Courses.” American Journal of Distance Education 22 (2): 72–89. doi:10.1080/08923640802039024.
  • *Randall, J. G. 2015. “Mind Wandering and Self-Directed Learning: Testing the Efficacy of Self-Regulation Interventions to Reduce Mind Wandering and Enhance Online Training.” Doctoral diss., Rice University. https://scholarship.rice.edu/handle/1911/88440.
  • Richardson, M., C. Abraham, and R. Bond. 2012. “Psychological Correlates of University Students’ Academic Performance: A Systematic Review and Meta-Analysis.” Psychological Bulletin 138 (2): 353. doi:10.1037/a0026838.
  • Sadati, S., and S. Simin. 2016. “The Relationship Between Metacognitive and Self-Regulated Learning Strategies with Learners’ L2 Learning Achievement.” International Journal of Research Studies in Language Learning 5 (2): 97–106. doi:10.5861/ijrsll.2015.1267.
  • *Schumacher, C., and D. Ifenthaler. 2021. “Investigating Prompts for Supporting Students’ Self-Regulation – A Remaining Challenge for Learning Analytics Approaches?” Internet & Higher Education 49 (2021): 100791. doi:10.1016/j.iheduc.2020.100791.
  • *Schworm, S., and H. Gruber. 2012. “E-Learning in Universities: Supporting Help-Seeking Processes by Instructional Prompts.” British Journal of Educational Technology 43 (2): 272–281. doi:10.1111/j.1467-8535.2011.01176.x.
  • *Shen, P. D., T. H. Lee, and C. W. Tsai. 2011. “Applying Blended Learning with Web-Mediated Self-Regulated Learning to Enhance Vocational Students’ Computing Skills and Attention to Learn.” Interactive Learning Environments 19 (2): 193–209. doi:10.1080/1049482090280895.
  • *Shin, S., and H. D. Song. 2016. “Finding the Optimal Scaffoldings for Learners’ Epistemological Beliefs During ill-Structured Problem Solving.” Interactive Learning Environments 24 (8): 2032–2047. doi:10.1080/10494820.2015.1073749.
  • Sitzmann, T., and K. Ely. 2011. “A Meta-Analysis of Self-Regulated Learning in Work-Related Training and Educational Attainment: What We Know and Where We Need to Go.” Psychological Bulletin 137 (3): 421–442. doi:10.1037/a0022777.
  • *Sonnenberg, C., and M. Bannert. 2015. “Discovering the Effects of Metacognitive Prompts on the Sequential Structure of SRL-Processes Using Process Mining Techniques.” Journal of Learning Analytics 2 (1): 72–100. doi:10.18608/jla.2015.21.5.
  • Stoeger, H., S. Fleischmann, and S. Obergriesser. 2015. “Self-Regulated Learning (SRL) and the Gifted Learner in Primary School: The Theoretical Basis and Empirical Findings on a Research Program Dedicated to Ensuring That All Students Learn to Regulate Their Own Learning.” Asia Pacific Education Review 16 (2): 257–267. doi:10.1007/s12564-015-9376-7.
  • Thiese, M. S. 2014. “Observational and Interventional Study Design Types; an Overview.” Biochemia Medica 24 (2): 199–210.
  • *Tsai, C. W. 2011. “Achieving Effective Learning Effects in the Blended Course: A Combined Approach of Online Self-Regulated Learning and Collaborative Learning with Initiation.” Cyberpsychology, Behavior, and Social Networking 14 (9): 505–510. doi:10.1089/cyber.2010.0388.
  • *Tsai, C. W. 2013. “An Effective Online Teaching Method: The Combination of Collaborative Learning with Initiation and Self-Regulation Learning with Feedback.” Behaviour & Information Technology 32 (7): 712–723. doi:10.1080/0144929X.2012.667441.
  • *Tsai, C. W., P. F. Hsu, and H. J. Tseng. 2013a. “Exploring the Effects of Web-Mediated Game-Based Learning and Self-Regulated Learning on Students’ Learning.” International Journal of Information and Communication Technology Education (IJICTE) 9 (2): 39–51. doi:10.4018/jicte.2013040104.
  • *Tsai, C. W., and T. H. Lee. 2012. “Developing an Appropriate Design for e-Learning with Web-Mediated Teaching Methods to Enhance Low-Achieving Students’ Computing Skills: Five Studies in e-Learning Implementation.” International Journal of Distance Education Technologies (IJDET) 10 (1): 1–30. doi:10.4018/jdet.2012010101.
  • *Tsai, C. W., T. H. Lee, and P. D. Shen. 2013b. “Developing Long-Term Computing Skills among Low-Achieving Students via Web-Enabled Problem-Based Learning and Self-Regulated Learning.” Innovations in Education and Teaching International 50 (2): 121–132. doi:10.1080/14703297.2012.760873.
  • *Tsai, C. W., and P. D. Shen. 2011. “The Application of Web and Educational Technologies in Supporting Web-Enabled Self-Regulated Learning in Different Computing Course Orientations.” International Journal of Information and Communication Technology Education (IJICTE) 7 (1): 70–79. doi:10.4018/jicte.2011010107.
  • *Tsai, C. W., P. D. Shen, and M. C. Tsai. 2011. “Developing an Appropriate Design of Blended Learning with Web-Enabled Self-Regulated Learning to Enhance Students’ Learning and Thoughts Regarding Online Learning.” Behaviour & Information Technology 30 (2): 261–271. doi:10.1080/0144929X.2010.514359.
  • U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. 2009. Evaluation of Evidence-Based Best Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf.
  • *van Alten, D. C. D., C. Phielix, J. Janssen, and L. Kester. 2020a. “Effects of Self-Regulated Learning Prompts in a Flipped History Classroom.” Computers in Human Behavior 108 (2020): 106318. doi:10.1016/j.chb.2020.106318.
  • *van Alten, D. C. D., C. Phielix, J. Janssen, and L. Kester. 2020b. “Self-regulated Learning Support in Flipped Learning Videos Enhances Learning Outcomes.” Computers & Education 158 (2020): 104000. doi:10.1016/j.compedu.2020.104000.
  • *Van den Broeck, L., T. De Laet, M. Lacante, M. Pinxten, C. Van Soom, and G. Langie. 2020. “The Effectiveness of a MOOC in Basic Mathematics and Time Management Training for Transfer Students in Engineering.” European Journal of Engineering Education 45 (4): 534–549. doi:10.1080/03043797.2019.1641692.
  • Vilardi, R. P. 2013. “Mathematics Achievement: Traditional Instruction and Technology-Assisted Course Delivery Methods.” Doctoral diss., University of Alabama.
  • Vo, H. M., C. Zhu, and N. A. Diep. 2017. “The Effect of Blended Learning on Student Performance at Course-Level in Higher Education: A Meta-Analysis.” Studies in Educational Evaluation 53: 17–28. doi:10.1016/j.stueduc.2017.01.002.
  • Wagner, S. C., S. J. Garippo, and P. Lovaas. 2011. “A Longitudinal Comparison of Online Versus Traditional Instruction.” MERLOT Journal of Online Learning and Teaching 7 (1): 68–73.
  • *Waldron, A. C. 2020. “Time Management in Online Higher Education Courses (Publication No. 28378623).” Doctoral diss., Johns Hopkins University. ProQuest Dissertations and Theses Global.
  • *Wang, T. H. 2011. “Developing Web-Based Assessment Strategies for Facilitating Junior High School Students to Perform Self-Regulated Learning in an e-Learning Environment.” Computers & Education 57 (2): 1801–1812. doi:10.1016/j.compedu.2011.01.003.
  • Waschull, S. B. 2001. “The Online Delivery of Psychology Courses: Attrition, Performance, and Evaluation.” Teaching of Psychology 28: 143–147.
  • *Weaver, S. O. 2012. “The Effects of Metacognitive Strategies on Academic Achievement, Metacognitive Awareness, and Satisfaction in an Undergraduate Online Education Course.” Doctoral diss., University of South Alabama. Available from ProQuest Dissertations and Theses database. (UMI No. 3544329).
  • Wilson, Z. S., L. Holmes, K. Degravelles, M. R. Sylvain, L. Batiste, M. Johnson, … I. M. Warner. 2012. “Hierarchical Mentoring: A Transformative Strategy for Improving Diversity and Retention in Undergraduate STEM Disciplines.” Journal of Science Education and Technology 21 (1): 148–156. doi:10.1007/s10956-011-9292-5.
  • Winne, P. H. 2020. “Commentary: A Proposed Remedy for Grievances about Self-Report Methodologies.” Frontline Learning Research 8 (3): 164–173.
  • *Winkler, C. 2011. “Measuring Self-Regulation in a Computer-Based Open Online Inquiry Learning Environment Using Google.” Doctoral diss., City University of New York. Available from ProQuest Dissertations and Theses database. (UMI No. 3469872).
  • Wolters, C. A., and P. R. Pintrich. 1998. “Contextual Differences in Student Motivation and Self-Regulated Learning in Mathematics, English, and Social Studies Classrooms.” Instructional Science 26: 27–47. doi:10.1023/A:1003035929216.
  • *Wong, J., M. Baars, M. He, B. B. de Koning, and F. Paas. 2021. “Facilitating Goal Setting and Planning to Enhance Online Self-Regulation of Learning.” Computers in Human Behavior 124 (2021): 106913. doi:10.1016/j.chb.2021.106913.
  • Xu, Z., M. Banerjee, G. Ramirez, G. Zhu, and K. Wijekumar. 2019. “The Effectiveness of Educational Technology Applications on Adult English Language Learners’ Writing Quality: A Meta-Analysis.” Computer Assisted Language Learning 32 (1-2): 132–162. doi:10.1080/09588221.2018.1501069.
  • *Yoon, M., J. Hill, and D. Kim. 2021. “Designing Supports for Promoting Self-Regulated Learning in the Flipped Classroom.” Journal of Computing in Higher Education 33 (2): 398–418. doi:10.1007/s12528-021-09269-z.
  • Zheng, L. 2016. “The Effectiveness of Self-Regulated Learning Scaffolds on Academic Performance in Computer-Based Learning Environments: A Meta-Analysis.” Asia Pacific Education Review 17 (2): 187–202. doi:10.1007/s12564-016-9426-9.
  • *Zheng, L., X. Li, and F. Chen. 2018. “Effects of a Mobile Self-Regulated Learning Approach on Students’ Learning Achievements and Self-Regulated Learning Skills.” Innovations in Education and Teaching International 55 (6): 616–624. doi:10.1080/14703297.2016.1259080.
  • Zimmerman, B. J. 1986. “Becoming a Self-Regulated Learner: Which are the Key Subprocesses?” Contemporary Educational Psychology 11: 307–313. doi:10.1016/0361-476x(86)90027-5.
  • Zimmerman, B. J. 1990. “Self-regulated Learning and Academic Achievement: An Overview.” Educational Psychologist 25: 3–17. doi:10.1207/s15326985ep2501_2.
  • Zimmerman, B. J. 2002. “Becoming a Self-Regulated Learner: An Overview.” Theory into Practice 41 (2): 64–72.
  • Zimmerman, B. J. 2008. “Investigating Self-Regulation and Motivation: Historical Background, Methodological Developments, and Future Prospects.” American Educational Research Journal 45 (1): 166–183.
  • Zimmerman, B. J., and D. H. Schunk. 2011. Handbook of Self-Regulation of Learning and Performance. New York, NY: Routledge.