
The impact of disadvantage on higher education engagement during different delivery modes: a pre- versus peri-pandemic comparison of learning analytics data


Abstract

The pandemic forced many education providers to rapidly pivot their models of education towards increased online provision, raising concerns that this may accentuate the effects of digital poverty on education. Digital footprints created by learning analytics systems contain a wealth of information about student engagement. Combining these data with student demographics can provide significant insights into the behaviours of different groups. Here we present a comparison of data from students from disadvantaged versus non-disadvantaged backgrounds on four different engagement measures. Our results showed some indications of effects of disadvantage on student engagement in a UK university, but with differential effects for asynchronously versus synchronously delivered digital material. Pre-pandemic, students from disadvantaged backgrounds attended more live teaching, watched more pre-recorded lectures, and checked out more library books than students from non-disadvantaged backgrounds. Peri-pandemic, when teaching was almost entirely online, these differences either disappeared (attendance and library book checkouts) or even reversed, such that disadvantaged students viewed significantly fewer pre-recorded lectures. These findings have important implications for future research on student engagement and for institutions wishing to provide equitable opportunities to their students, both peri- and post-pandemic.

Introduction

In 2020, the response from many governments to the COVID-19 pandemic was to ‘lock down’ their country, preventing their populations from leaving home except for various essential activities. In the UK, like much of the world, this meant that higher education institutions quickly had to pivot their learning and teaching activities to online—or mainly online—provision, with all large group lectures provided virtually. In some cases, and for some periods, no in-person teaching was allowed at all, although universities remained officially ‘open’, with students able to use facilities such as the library and study areas. The reality of ‘digital poverty’—exclusion from aspects of daily life through not having appropriate devices, software or internet connectivity—predated the pandemic, with effects far broader than the domain of education. However, the pandemic arguably intensified and more fully exposed such effects, causing concern that it would deepen inequalities, especially in educational settings (Higson, Moores, and Summers Citation2020; Kizilcec, Makridis, and Sadowski Citation2021). Thus far, the impact of the pandemic on student engagement has not received much attention (Senior et al. Citation2021), although evidence is now starting to emerge (e.g. Bashir et al. Citation2021; Xu and Wilson Citation2021; Zhang, Taub, and Chen Citation2021).

Learning analytics and prediction of student success

A plethora of research suggests a significant correlation between attendance and attainment at university (for a review see Credé, Roch, and Kieszczynka Citation2010), although the causal nature of this relationship is debated, with some researchers contending that poor attainment can cause low attendance as well as vice versa (e.g. Jones Citation1984; Kahu Citation2013). Nevertheless, learning analytics (LA) systems are increasingly being used to collect and report on student engagement data more broadly, using a variety of measures in addition to attendance, including library use and interaction with virtual learning environments (VLEs). A substantial body of research has found correlations between VLE activity and academic performance (e.g. Macfadyen and Dawson Citation2010; Mogus, Djurdjevic, and Suvak Citation2012; You Citation2016; Chen and Cui Citation2020; Waheed et al. Citation2020; Summers, Higson, and Moores Citation2021). It has been reported that such activity can account for between 8% and 36% of the variance in end-of-year mark in online courses (Morris, Finnegan, and Wu Citation2005; Ramos and Yudko Citation2008; Macfadyen and Dawson Citation2010; Agudo-Peregrina et al. Citation2014) and up to 23% of the variance in end-of-year mark for in-person courses (Summers, Higson, and Moores Citation2021). A number of studies have also revealed relationships between library use and attainment, although the correlations are generally quite low (Allison Citation2015; Renaud et al. Citation2015; Robertshaw and Asher Citation2019). Whilst the data feeds for LA systems are typically tailored to the particular institution in question, the digital footprint created by these systems contains potentially valuable information about learners, learning, courses and the university itself; it also provides the potential to observe some of the effects of the pivot to online learning on student engagement.

Success of students from disadvantaged backgrounds

Historically, many universities have focussed efforts on equality of access to Higher Education for students from diverse backgrounds, rather than equality of success and progression to employment after enrolment. Data from England show that, on average, students from more disadvantaged backgrounds—including disabled students, students from an ethnic minority background and students from lower Index of Multiple Deprivation quintiles—have a lower likelihood of continuing their studies after their first year, lower attainment in their degrees, and a lower chance of progression to highly skilled employment or higher-level study (Office for Students Citation2021). It is therefore increasingly recognised that ‘getting on’ as well as ‘getting in’ matters (Higson Citation2018).

There are numerous ethical issues surrounding the use of LA, including the potential for labelling bias (Sclater Citation2016). The ethos of many LA systems is therefore to allow students to view a record of their own behaviour and to trigger interventions based on this behaviour (rather than on any pre-existing data on prior attainment or demographics; for counterexamples see Arnold and Pistilli Citation2012; Jayaprakash et al. Citation2014). Foster and Siddle (Citation2020) demonstrated that LA can potentially be used to reduce disparities in attainment between different populations without using students’ demographic data. Similarly, Summers, Higson, and Moores (Citation2021) concluded that targeting interventions arising from LA systems based on behaviour, rather than demographics, should be a successful strategy. Nevertheless, this digital footprint can be combined with demographic data outside of the LA systems to allow us to understand more about student behaviour, challenges and patterns at a cohort level (Arnold and Pistilli Citation2012; Jayaprakash et al. Citation2014). Indeed, Williamson and Kizilcec (Citation2021) argued that LA research has thus far mostly neglected diversity, equality and inclusion issues, and that LA dashboards provide a potential opportunity to improve equity outcomes at scale, but that more research is needed first (but see Hlosta et al. Citation2021).

The present study

Here, we investigate three years of LA data from first-year undergraduates at a research-active, medium-sized UK university with a highly diverse student population, comparing students from higher versus lower quintiles of the Index of Multiple Deprivation (IMD) in pre- and peri-pandemic times. We analysed results from four of our six possible LA feeds: (i) generic VLE course access, (ii) watching of asynchronously delivered material, (iii) watching of synchronously delivered material (‘attendance’), and (iv) borrowing of library books. This allowed a comparison of digital and physical provision, and was important to elucidate the possible impact of the pandemic on the ability of students to engage with their studies. It should be noted that we do not necessarily consider these data feeds to be the best possible data that could be collected to answer our research question. Neither was the configuration of the LA system optimal for an online learning environment. Instead, these were the feeds that constituted the LA system at the time and therefore those available to us to analyse; ideally, data on library e-book and e-journal use would have additionally been available. The optimal learning feeds for LA systems are the subject of some debate and will be unique to individual institutions and teaching methods (Agudo-Peregrina et al. Citation2014). Two additional feeds were available to us—logins to the VLE and access to the online quiz system—but were not analysed here. Previous work (Summers, Higson, and Moores Citation2021) found that logins were highly correlated with access to course materials and that access to online quizzes was highly course-dependent, and we therefore excluded them from analyses.

Materials and methods

Sample data/participants

Data from three cohorts of undergraduate students at Aston University were used for analysis. The university is a medium-sized, research-active UK university with an ethnically diverse student population relative to other UK institutions. Approximately 53% of the sample of students read a STEM subject (science, technology, engineering or mathematics) and the remainder, 47%, were in the business school.

Undergraduate records were obtained for first-year full-time home students who began their studies in the 2018/19, 2019/20 or 2020/21 academic years. Students who did not complete their first two years (2018/19 and 2019/20 cohorts) or were not listed as current as of June 2021 were removed from the sample. For the remaining students we attempted to match their home postcode to a UK-wide adjusted IMD quintile. IMD quintiles are not normally comparable between the countries of the UK, but Abel, Barclay, and Payne (Citation2016) derived an adjustment such that indices from the constituent countries of the UK can be compared with one another. This adjustment has been updated for the most recent 2020 indices by Parsons (Citation2021) and was used here; indices for Scotland, Wales and Northern Ireland were adjusted to be comparable to those from England. After removing students whose IMD quintile could not be identified, due to unrecognised postcodes, we were left with a sample of 6486 students from the three cohorts (see Table 1). The UK Office for Students considers students from IMD quintiles 1 and 2 as meeting widening participation criteria (most disadvantaged), whilst students from quintiles 3–5 are not considered disadvantaged (Office for Students Citation2018). We have divided our students into two categories (Q12 and Q345) that align with this distinction.
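For illustration, a minimal R sketch of this postcode-to-quintile mapping is given below. The file names and column names (postcode, imd_quintile_uk) are hypothetical, and the UK-wide adjusted quintiles would be derived from the composite index published by Parsons (Citation2021); this is not the authors' production code.

library(dplyr)
library(readr)

students   <- read_csv("students.csv")        # hypothetical: one row per student, including home postcode
imd_lookup <- read_csv("uk_imd_lookup.csv")   # hypothetical: postcode mapped to UK-wide adjusted IMD quintile

students_imd <- students %>%
  left_join(imd_lookup, by = "postcode") %>%
  filter(!is.na(imd_quintile_uk)) %>%                                   # drop unrecognised postcodes
  mutate(imd_group = if_else(imd_quintile_uk %in% 1:2, "Q12", "Q345"))  # OfS widening participation split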

Table 1. Breakdown of Sex and IMD (Index of Multiple Deprivation) quintile (most disadvantaged, Q12 = quintiles 1 and 2; least disadvantaged, Q345 = quintiles 3–5) for the three cohorts.

Measures

All undergraduate modules at Aston are managed through the university VLE, where university announcements, timetables, online live lectures and course materials can be accessed. Since 2018, attendance at lectures and seminars has been electronically recorded by students swiping their identity card, though neither attendance nor the act of recording attendance is compulsory for home students. Additionally, all lectures are recorded and available through the VLE via a lecture capture system (LCS). Aston’s learning analytics system, provided by Solutionpath, aggregates the log data from the VLE, attendance recording system and lecture recordings on a daily basis. Four data feeds comprise the digital footprint and, between them, represent proxies for access to online and in-person learning: (i) VLE course access: number of times the student accessed course materials, (ii) Attendance: total number of in-person classes and live online classes that the student attended, (iii) LCS: number of times the student viewed recorded lectures, and (iv) Library: number of printed materials checked out of the library by the student. Note that during 2020/21 the vast majority of teaching was conducted online, with some exceptions in STEM subjects. To facilitate comparisons across academic years, total attendance was reported as the combination of in-person attendance and viewing of online live teaching.

Analyses

For each student, the daily data for the four feeds were aggregated on a weekly basis for the 21 teaching weeks of the 2018/19, 2019/20 and 2020/21 academic years. For the 2018/19 and 2019/20 academic years, live teaching was conducted entirely on campus, whereas for 2020/21 teaching was conducted almost entirely online. These weekly data were then averaged over the whole academic year for each student.
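As an illustration of this aggregation step, a minimal R sketch is given below; the data frame daily_feeds and its column names (student_id, course, feed, teaching_week, count) are hypothetical stand-ins for the daily Solutionpath output.

library(dplyr)

# Weekly totals per student, course and feed across the 21 teaching weeks
weekly <- daily_feeds %>%
  group_by(student_id, course, feed, teaching_week) %>%
  summarise(weekly_count = sum(count), .groups = "drop")

# Mean weekly activity per student, averaged over the whole academic year
student_year_means <- weekly %>%
  group_by(student_id, course, feed) %>%
  summarise(mean_weekly = mean(weekly_count), .groups = "drop")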

All the statistical analyses were computed using R 4.1.0 (R Core Team Citation2021). Linear mixed models were computed using lmer from the package lme4 (Bates et al. Citation2015). The significance of the effects of the main factors was evaluated following the approach of Luke (Citation2017) using the package lmerTest (Kuznetsova, Brockhoff, and Christensen Citation2017), which implements the Satterthwaite approximation to estimate the denominator degrees of freedom of the F statistic. Estimated marginal means from the models were computed using emmeans (Lenth Citation2021).

For each of the four data feeds, a linear mixed model was computed that evaluated the interaction between teaching mode (predominantly online versus entirely in-person) and IMD (Q12 versus Q345), with course added as a random effect to account for possible course-dependent levels of activity on each data feed, especially given that some STEM courses required in-person attendance for laboratory classes even when almost all other teaching was delivered online.
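A minimal R sketch of this modelling pipeline, using the packages named above, is shown below; the data frame lcs_data and the variable names mean_weekly, teaching_mode, imd_group and course are hypothetical stand-ins for the merged analysis data.

library(lme4)
library(lmerTest)   # masks lme4::lmer so that anova() uses the Satterthwaite approximation
library(emmeans)

# Example model for one feed (LCS views); the other three feeds are modelled analogously
m_lcs <- lmer(mean_weekly ~ teaching_mode * imd_group + (1 | course), data = lcs_data)

anova(m_lcs)                                           # F-tests for teaching mode, IMD and their interaction
emmeans(m_lcs, pairwise ~ imd_group | teaching_mode)   # post-hoc Q12 vs. Q345 comparisons within each mode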

Results

Overall, in comparison with in-person teaching, there was an increase in asynchronous interactions (LCS views and VLE course materials) for online teaching and a small decrease in synchronous interactions (attendance); library book checkouts dipped to near zero. Pre-pandemic, when teaching was entirely in-person, students from lower IMD quintiles tended to attend more lectures than their counterparts from IMD Q345 (M = 4.2 vs. M = 3.8 classes/week), access course materials more frequently (M = 32.7 vs. M = 30.6 accesses/week), view recorded lectures more frequently (M = 1.3 vs. M = 1.1 views/week) and check out more library books (M = 0.54 vs. M = 0.33 books/week). During the pandemic, when teaching was almost entirely online, these gaps narrowed for attendance (M = 3.35 vs. M = 3.27 classes/week), dropped to near zero for library book checkouts, and reversed entirely for recorded lecture views (M = 4.8 vs. M = 5.3 views/week). Interaction plots (teaching mode × IMD) for the four data feeds are shown in Figure 1 and summary data are in Table 2.

Figure 1. Two-way interaction plots between teaching mode (x-axis; in-person vs. online) and IMD [shape; filled downward triangles = Q12 (most disadvantaged), open upward triangles = Q345 (least disadvantaged)] for the mean weekly values of the four data feeds—Total attendance, VLE Course Accesses, LCS Views and Library book checkouts. Error bars are 95% confidence intervals of the mean.

Table 2. Mean and 95% CI for in-person and online teaching and by IMD for the four data feeds.

These differences in behaviour between students from different IMD quintiles, and the interactions between IMD and teaching mode, were explored further in a series of linear mixed models. The results of these models (see Table 3) revealed that teaching mode was a significant factor for all four data feeds, that IMD was a significant factor only for library book checkouts—with disadvantaged students borrowing more books than their more advantaged counterparts—and that there were significant interactions between IMD and teaching mode for attendance, LCS views and library book checkouts. Post-hoc pairwise comparisons on the estimated marginal means of the models (Table 4) revealed that, during in-person teaching, students from more disadvantaged backgrounds attended significantly more ‘live’ classes (0.12 extra classes/week) than students from less disadvantaged backgrounds, but this difference was eliminated during online teaching. Furthermore, during in-person teaching, students from more disadvantaged backgrounds viewed significantly more recorded lectures (0.25 extra recorded lectures/week) than those from less disadvantaged backgrounds. This situation reversed entirely during online teaching, with students from the most disadvantaged backgrounds watching significantly fewer recorded lectures (0.29 fewer views/week) than those from less disadvantaged backgrounds. Finally, students from IMD Q12 backgrounds checked out a significantly greater number of books during in-person teaching (0.20 extra books per week) than those students from less disadvantaged backgrounds.

Table 3. Results of ANOVA computed on the outputs of the linear mixed models which express the linear relationship between teaching mode (in-person vs. online), IMD (Q12 vs. Q345) and their interaction.

Table 4. Pairwise comparisons (IMD Q12 vs. IMD Q345 students) of estimated marginal means from the four linear mixed models.

Whilst individually these effects are small, given the size of the 2020/21 cohort (1604 IMD Q12 students) their overall effect is potentially large. Using the difference data from Table 4, under in-person teaching conditions IMD Q12 students from the 2020/21 cohort would have been expected to check out ∼6700 more books (1604 students × 0.198 extra books/week × 21 teaching weeks), watch ∼8400 more pre-recorded lectures, and attend ∼4400 more live classes than those students from Q345.
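As a worked example of this extrapolation, the book-checkout figure can be reproduced directly from the values above; the lecture-view and attendance figures follow analogously from their respective weekly differences in Table 4.

n_q12 <- 1604           # IMD Q12 students in the 2020/21 cohort
teaching_weeks <- 21    # teaching weeks in the academic year
n_q12 * 0.198 * teaching_weeks   # ≈ 6669 additional library book checkouts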

Discussion

Overall, on three of the four measures, the results showed a differential change in engagement of our disadvantaged students versus our non-disadvantaged students, to the relative detriment of disadvantaged students. However, the measures which showed this change were not all digital measures, instead also including library book borrowing. It should also be noted that, when considering online teaching only, the engagement levels of the disadvantaged students differed significantly from their counterparts only for the viewing of recorded lectures; whilst pre-pandemic the disadvantaged students viewed significantly more lectures than their counterparts, peri-pandemic they viewed significantly fewer.

That the number of course access events was similar between the two groups, and increased for both groups, suggests that access to the VLE per se was not a particular issue for most students. We are unable to determine the reason for the differences between synchronous and asynchronous delivery methods from our data. Of our measures of engagement, interactive synchronous provision would—in principle—require the most internet data and the best internet connection, with adequate upload and download speeds being required to fully participate. In addition, this synchronous provision provides the least flexibility of access, requiring a digital device and internet connection at a precise time (problematic if, for example, devices are being shared in the household or if students need to be on campus to access the Internet). Instead, it was only for pre-recorded lectures that a significant difference was found between groups peri-pandemic, with disadvantaged students watching these recordings less often. It may be that students from both groups valued and enjoyed synchronous provision more, at least peri-pandemic, and were more motivated to attend. Whilst asynchronous material has the advantage of being available ‘any time’, this may lead to complacency or procrastination (see, e.g. Baker et al. Citation2019, who showed how scheduling when students should watch material affected early attainment). Alternatively, the reduction in engagement with synchronous provision in both groups may have reflected fewer of these types of interactions being available, whilst—in contrast—pre-recorded material may have been ‘over-provided’.

Overall, these results suggest that the effects of disadvantage on student engagement were potentially wider reaching, yet also more nuanced, than a simplistic or all-encompassing view of ‘digital poverty’ would imply. The concept of digital poverty risks downplaying differential effects of different methods of digital delivery, as well as other important aspects of the educational experience. Hodges et al. (Citation2020, p. 5) argue that emergency remote teaching and online learning are very different, that true online learning requires an ‘ecosystem of learner supports’ such as is present for in-person learning, and that ‘Face-to-face education isn’t successful because lecturing is good’. We therefore reiterate the need for a ‘multi-pronged’ approach to supporting students both peri- and post-pandemic, considering academic, experiential and pastoral issues (see also Higson, Moores, and Summers Citation2020).

As outlined in the introduction, there has been a relative dearth of research surrounding equality issues in LA (although see Hlosta et al. Citation2021). Although Foster and Siddle (Citation2020) concluded that student demographics were unnecessary for the successful implementation of LA systems, they found that low-engagement alerts were 43% more likely to be sent to disadvantaged students, supporting their argument that targeting can be based on behaviour alone. Summers, Higson, and Moores (Citation2021) reported that socio-economic status explained small, but statistically significant, amounts of variance in attainment, with students whose parents had never worked or were long-term unemployed tending to have the poorest attainment in comparison with those from other socio-economic backgrounds. These findings therefore offer some initial insight into potential equality issues both pre- and peri-pandemic, and as universities prepare themselves for a ‘post-covid future’.

Whilst studies have suggested that relationships between library use and attainment are generally quite weak (Allison Citation2015; Renaud et al. Citation2015), it seems reasonable to speculate that this is because many more affluent students purchase key textbooks instead of borrowing library copies, although the increased use of e-books may also be a factor; unfortunately, data on this were not available. The impact on attainment of not feeling able to either purchase or borrow physical textbooks has yet to be determined. In contrast, as outlined in the introduction, a large amount of research has found correlations between VLE activity and attainment, and between attendance and attainment. For this study, we did not have access to levels of attainment, but future research could investigate the impact of the changes in engagement on subsequent attainment. Future research should also consider the potential differential effects of engagement on attainment for different groups; for example, the act of borrowing library resources may be more important for some groups than for others.

There are several limitations with this research. First, in terms of digital engagement, the LA system counts only individual login events, excluding other factors which may influence students’ experience, such as what type of device they are using (e.g. mobile phone, tablet or PC), or whether the internet connection allowed them to watch, listen or contribute fully. Anecdotal evidence from academic staff suggests that many students were ‘participating’ in some interactive sessions on mobile phones with their cameras switched off. Second, we have not been able to track the use of other physical resources or space, such as use of the library without checking out books or use of other study areas. It seems reasonable to assume, for example, that the working environment of disadvantaged students is more likely to be less than optimal and that use of e-books and journals may have increased during the pandemic.

Third, we only have measures of absolute counts of engagement, without information on the proportion of possible activities engaged with; raw counts may therefore over-estimate the proportion of lectures watched, especially because many lecturers were encouraged to provide their pre-recorded lectures in multiple smaller 15 to 20 min chunks, rather than a single 50 min continuous recording. Thus, it would not be reasonable to assume that the total amount of lecture time experienced is directly related to a count of pre-recorded material engaged with. This issue, however, would be equal for both groups and so does not impact on any conclusions relating to differences and interactions between groups, although it does affect the interpretation of increases or decreases in engagement on these measures overall.

Fourth, in terms of our ‘attendance’ measure, we may not be comparing like with like. Although in both modes of delivery our attendance measure is a measure of engagement with ‘synchronous’ learning, it is likely that pre-pandemic many of these sessions were lectures with relatively limited amounts of interaction (replaced peri-pandemic with recordings), whilst peri-pandemic synchronous sessions were more likely to have been designed to elicit a greater level of interactivity. This may be important because attendance at interactive sessions arguably requires a greater level of commitment, sense of belonging and confidence, which may differ amongst different groups (Oldfield et al. Citation2018); it is possible that the observed interaction here may reflect change of format, rather than an effect of digital poverty per se. In addition, these sessions may also have been held with different cohort sizes, which is known to influence attendance (Friedman, Rodriguez, and McComb Citation2001).

Finally, we should note that significant efforts and resources were employed to try to ensure that digital poverty did not impact on this cohort’s student experience, e.g. via the purchase and deployment of laptops; therefore, some of the worst effects of digital poverty may have been mitigated.

The ability to detect the effects of disadvantage on student engagement, despite the university’s many efforts to mitigate them, would not be possible without the large amount of data available from learning analytics systems. This research illustrates the effects of disadvantage on student engagement, shows that the effects of the pandemic on student engagement are likely to go beyond the digital realm, and shows that effects of disadvantage on engagement may be more easily observed for some specific types of education delivery than others. Given that students from disadvantaged backgrounds are three times more likely to live at home than their more advantaged peers (Donnelly and Gamsu Citation2018), the shift to online learning may have disproportionately affected such students who, when on campus, make use of non-contact time for further study at the library. Universities should seek to mitigate the broader effects of the pandemic on their students.

Acknowledgements

We gratefully acknowledge the assistance of Solutionpath Ltd and Tai Luong with the data output.

Disclosure statement

We have no conflicts of interest or financial interests relating to this work.

Data availability statement

Due to difficulties in properly anonymising the dataset we are unable to share the data associated with this article.

References

  • Abel, G. A., M. E. Barclay, and R. A. Payne. 2016. “Adjusted Indices of Multiple Deprivation to Enable Comparisons within and between Constituent Countries of the UK Including an Illustration Using Mortality Rates.” BMJ Open 6 (11): e012750. doi:10.1136/bmjopen-2016-012750.
  • Agudo-Peregrina, A. F., S. Iglesias-Pradas, M. Conde-Gonzalez, and A. Hernandez-Garcia. 2014. “Can We Predict Success from Log Data in VLEs? Classification of Interactions for Learning Analytics and Their Relation with Performance in VLE-Supported F2F and Online Learning.” Computers in Human Behavior 31 (1): 542–550. doi:10.1016/j.chb.2013.05.031.
  • Allison, D. 2015. “Measuring the Academic Impact of Libraries.” Portal: Libraries and the Academy 15 (1): 29–40. doi:10.1353/pla.2015.0001.
  • Arnold, K. E., and M. D. Pistilli. 2012. “Course Signals at Purdue.” Proceedings of the 2nd International Conference on Learning Analytics and Knowledge - LAK12, 267. Vancouver, BC, Canada. doi:10.1145/2330601.2330666.
  • Baker, R., B. Evans, Q. Li, and B. Cung. 2019. “Does Inducing Students to Schedule Lecture Watching in Online Classes Improve Their Academic Performance? An Experimental Analysis of a Time Management Intervention.” Research in Higher Education 60 (4): 521–552. doi:10.1007/s11162-018-9521-3.
  • Bashir, A., S. Bashir, K. Rana, P. Lambert, and A. Vernallis. 2021. “Post-COVID-19 Adaptations; the Shifts towards Online Learning, Hybrid Course Delivery and the Implications for Biosciences Courses in the Higher Education Setting.” Frontiers in Education 6: 1–13. doi:10.3389/feduc.2021.711619.
  • Bates, D., M. Mächler, B. Bolker, and S. Walker. 2015. “Fitting Linear Mixed-Effects Models Using lme4.” Journal of Statistical Software 67 (1): 1–48. doi:10.18637/jss.v067.i01.
  • Chen, F., and Y. Cui. 2020. “Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance.” Journal of Learning Analytics 7 (2): 1–17. doi:10.18608/jla.2020.72.1.
  • Credé, M., S. G. Roch, and U. M. Kieszczynka. 2010. “A Meta-Analytic Review of the Relationship of Class Attendance with Grades and Student Characteristics.” Review of Educational Research 80 (2): 272–295. doi:10.3102/0034654310362998.
  • Donnelly, M., and S. Gamsu. 2018. “Home and Away: Social, Ethnic and Spatial Inequalities in Student Mobility.” The Sutton Trust. https://www.suttontrust.com/wp-content/uploads/2019/12/Home_and_away_FINAL.pdf.
  • Foster, E., and R. Siddle. 2020. “The Effectiveness of Learning Analytics for Identifying At-Risk Students in Higher Education.” Assessment & Evaluation in Higher Education 45 (6): 842–854. doi:10.1080/02602938.2019.1682118.
  • Friedman, P., F. Rodriguez, and J. McComb. 2001. “Why Students Do and Do Not Attend Classes: Myths and Realities.” College Teaching 49 (4): 124–133. doi:10.1080/87567555.2001.10844593.
  • Higson, H. E. 2018. Getting on, Not Just Getting in, Is What Matters. WonkHE Blog. Accessed 2 December 2021. https://wonkhe.com/blogs/getting-on-not-just-getting-in-is-what-matters-with-bme-success/.
  • Higson, H. E., E. Moores, and R. Summers. 2020. Covid Underlines the Case for Inclusive Education. RSA Blog. Accessed 03 August 2021. https://www.thersa.org/comment/2020/11/covid-19-underlines-case-for-inclusive-higher-education.
  • Hlosta, M., C. Herodotou, V. Bayer, and M. Fernandez. 2021. “Impact of Predictive Learning Analytics on Course Awarding Gap of Disadvantaged Students in STEM.” In Artificial Intelligence in Education. AIED 2021. Lecture Notes in Computer Science, edited by I. Roll, D. McNamara, S. Sosnovsky, R. Luckin, and V. Dimitrova, vol. 12749. Cham: Springer. doi:10.1007/978-3-030-78270-2_34.
  • Hodges, C., S. Moore, B. Lockee, T. Trust, and A. Bond. 2020. “The Difference between Emergency Remote Teaching and Online Learning.” Educause Review, 27, 1–12. Accessed 23 December 2021. https://vtechworks.lib.vt.edu/bitstream/handle/10919/104648/facdev-article.pdf?sequence=1&isAllowed=y.
  • Jayaprakash, S. M., E. W. Moody, E. J. M. Lauría, J. R. Regan, and J. D. Baron. 2014. “Early Alert of Academically at Risk Students: An Open Source Analytics Initiative.” Journal of Learning Analytics 1 (1): 6–47. doi:10.18608/jla.2014.11.3.
  • Jones, C. H. 1984. “Interaction of Absences and Grades in a College Course.” The Journal of Psychology 116 (1): 133–136. doi:10.1080/00223980.1984.9923627.
  • Kahu, E. R. 2013. “Framing Student Engagement in Higher Education.” Studies in Higher Education 38 (5): 758–773. doi:10.1080/03075079.2011.598505.
  • Kizilcec, R. F., C. Makridis, and K. C. Sadowski. 2021. “Pandemic Response Policies’ Democratizing Effects on Online Learning.” Proceedings of the National Academy of Sciences 118 (11): e2026725118. doi:10.1073/pnas.2026725118.
  • Kuznetsova, A., P. B. Brockhoff, and R. H. B. Christensen. 2017. “lmerTest Package: Tests in Linear Mixed Effects Models.” Journal of Statistical Software 82 (13): 1–26. doi:10.18637/jss.v082.i13.
  • Lenth, R. V. 2021. “emmeans: Estimated Marginal Means, aka Least-Squares Means.” R Package Version 1.6.2-1. https://CRAN.R-project.org/package=emmeans.
  • Luke, S. G. 2017. “Evaluating Significance in Linear Mixed-Effects Models in R.” Behavior Research Methods 49 (4): 1494–1502. doi:10.3758/s13428-016-0809-y.
  • Macfadyen, L. P., and S. Dawson. 2010. “Mining LMS Data to Develop an ‘Early Warning System’ for Educators: A Proof of Concept.” Computers & Education 54 (2): 588–599. doi:10.1016/j.compedu.2009.09.008.
  • Mogus, A. M., I. Djurdjevic, and N. Suvak. 2012. “The Impact of Student Activity in a Virtual Learning Environment on Their Final Mark.” Active Learning in Higher Education 13 (3): 177–189. doi:10.1177/1469787412452985.
  • Morris, L. V., C. Finnegan, and S. S. Wu. 2005. “Tracking Student Behavior, Persistence, and Achievement in Online Courses.” The Internet and Higher Education 8 (3): 221–231. doi:10.1016/j.iheduc.2005.06.009.
  • Office for Students. 2018. Equality Impact Statement: Regulatory Framework for Higher Education. Accessed 23 December 2021. https://www.officeforstudents.org.uk/media/1446/ofs2018_09.pdf.
  • Office for Students. 2021. Access and Participation Data Dashboard. Accessed 02 December 2021. https://www.officeforstudents.org.uk/data-and-analysis/access-and-participation-data-dashboard/.
  • Oldfield, J., J. Rodwell, L. Curry, and G. Marks. 2018. “Psychological and Demographic Predictors of Undergraduate Non-Attendance at University Lectures and Seminars.” Journal of Further and Higher Education 42 (4): 509–523. doi:10.1080/0309877X.2017.1301404.
  • Parsons, A. 2021. UK 2020 Composite Index of Multiple Deprivation. https://github.com/mysociety/composite_uk_imd.
  • R Core Team. 2021. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org/.
  • Ramos, C., and E. Yudko. 2008. “‘Hits’ (Not ‘Discussion Posts’) Predict Student Success in Online Courses: A Double Cross-Validation Study.” Computers & Education 50 (4): 1174–1182. doi:10.1016/j.compedu.2006.11.003.
  • Renaud, J., S. Britton, D. Wang, M. Ogihara, C. Mader, L. Maristany, and J. Zysman. 2015. “Mining Library and University Data to Understand Library Use Patterns.” The Electronic Library 33 (3): 355–372. doi:10.1108/EL-07-2013-0136.
  • Robertshaw, M. B., and A. Asher. 2019. “Unethical Numbers? A Meta-Analysis of Library Learning Analytics Studies.” Library Trends 68 (1): 76–101. doi:10.1353/lib.2019.0031.
  • Sclater, N. 2016. “Developing a Code of Practice for Learning Analytics.” Journal of Learning Analytics 3 (1): 16–42. doi:10.18608/jla.2016.31.3.
  • Senior, C., C. Howard, E. J. N. Stupple, and R. Senior. 2021. “Student Primacy and the Post Pandemic University.” Frontiers in Education 6: 712767. doi:10.3389/feduc.2021.712767.
  • Summers, R. J., H. E. Higson, and E. Moores. 2021. “Measures of Engagement in the First Three Weeks of Higher Education Predict Subsequent Activity and Attainment in First Year Undergraduate Students: A UK Case Study.” Assessment & Evaluation in Higher Education 46 (5): 821–836. doi:10.1080/02602938.2020.1822282.
  • Waheed, H., S. Hassan, N. R. Aljohani, J. Hardman, S. Alelyani, and R. Nawaz. 2020. “Predicting Academic Performance of Students from VLE Big Data Using Deep Learning Models.” Computers in Human Behavior 104: 106189. doi:10.1016/j.chb.2019.106189.
  • Williamson, K., and R. F. Kizilcec. 2021. “Learning Analytics Dashboard Research Has Neglected Diversity, Equity, and Inclusion.” Proceedings of the ACM Conference on Learning at Scale (L@S).
  • Xu, Y., and K. Wilson. 2021. “Early Alert Systems during a Pandemic: A Simulation Study on the Impact of Concept Drift.” LAK21: 11th International Learning Analytics and Knowledge Conference, April 2021, 504–510. doi:10.1145/3448139.3448190.
  • You, J. W. 2016. “Identifying Significant Indicators Using LMS Data to Predict Course Achievement in Online Learning.” The Internet and Higher Education 29: 23–30. doi:10.1016/j.iheduc.2015.11.003.
  • Zhang, T., M. Taub, and A. Chen. 2021. “Measuring the Impact of COVID-19 Induced Campus Closure on Student Self-Regulated Learning in Physics Online Learning Modules.” LAK21: 11th International Learning Analytics and Knowledge Conference, April 2021, 110–120. doi:10.1145/3448139.3448150.