
Thinking critically about learning analytics, student outcomes, and equity of attainment

Abstract

This is the introduction to the special edition on learning analytics.

Over the last decade the deployment and use of learning analytics has become routine in many universities worldwide, especially in the United States, Europe, and East and South Asia. Its deployment in higher education institutions has been enabled by technological developments originally designed to exploit the growth in the collection of multiple data sources, usually aiming to deliver customer and/or employee insight, and competitive advantage (Beer 2017, 21). The formation of learning analytics can be located in the disciplines of mathematics, education and computer science, and in the emergence of educational data mining, predictive modelling, machine learning, and social network analysis. Its potential to contribute new insight and performance data is significant, and it is likely to exert considerable impact on the student learning experience and university practices for years to come.

Learning analytics can be defined as the collection and analysis of the demographic, behavioural and digital trace data of students in order to improve their experiences and outcomes, by enabling targeted real-time interventions with particular cohorts and individuals based on profiles derived through machine learning and algorithmic processing. The theory of learning analytics in higher education is that the collection and analysis of student data can enable higher education institutions to optimise personalised learning environments, resulting in an enhanced total student experience and improved student outcomes. As a consequence of its implementation in higher education, more responsive, evidence-based, data-driven and transparent decision-making (by institutions and students) is deemed possible.
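
To make the data-to-intervention loop implied by this definition concrete, the sketch below shows, in schematic Python, how demographic and digital trace features might feed a predictive model whose risk scores trigger targeted support. It is a minimal sketch under our own assumptions: the feature names, model choice and threshold are illustrative inventions, not a description of any particular institutional system.

```python
# Minimal sketch of the data-to-intervention loop described above.
# Feature names, model choice and threshold are illustrative only.
from dataclasses import dataclass
from sklearn.linear_model import LogisticRegression

@dataclass
class StudentRecord:
    student_id: str
    vle_logins_per_week: float    # digital trace data
    assessments_submitted: int    # behavioural data
    first_generation: int         # demographic data (1 = yes)

def features(r: StudentRecord):
    return [r.vle_logins_per_week, r.assessments_submitted, r.first_generation]

def fit_risk_model(past_records, withdrew):
    """Fit a simple withdrawal-risk model on a previous cohort (0/1 labels)."""
    X = [features(r) for r in past_records]
    return LogisticRegression().fit(X, withdrew)

def flag_for_intervention(model, records, threshold=0.7):
    """Return IDs of students whose predicted risk warrants an intervention."""
    risks = model.predict_proba([features(r) for r in records])[:, 1]
    return [r.student_id for r, p in zip(records, risks) if p >= threshold]
```

In practice the 'intervention' step is human (a tutor call, a support referral), which is one reason the later themes of this editorial dwell on what such models cannot see.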

As the deployment and use of learning analytics has grown, its scope, breadth, depth and complexity have evolved. It is no longer a sideline activity in a higher education sector which is notoriously poor at capitalising on data opportunities; it is now taking centre stage in many higher education institutions. Learning analytics has evolved from a focus on learning and the student experience to one that now includes research, knowledge exchange and praxis (Lang et al. 2017). As a field of inquiry it includes an emphasis on a wide range of outcome measures, including student retention, progression, attainment, performance, ‘mastery’, employability and engagement (Francis and Foster 2018).

As the first two decades of the twenty-first century draw to a close, higher education remains on the cusp of significant upheaval and transformation. Competition, technology, globalisation and demographics continue to shape the sector, as does heightened concern for the mental health and wellbeing of students, academic and professional services staff. At the same time, there is growing acknowledgement of, and attention devoted to, the fact that higher education does not benefit everyone equally. Indeed, learning analytics has contributed to identifying that the extent to which students are retained, make progress, attain and achieve graduate employment can be very different when student data is examined according to demographics such as income and race (Richardson 2012; Tempelaar et al. 2015).

Given its transformative potential, there is no better time than the present to explore the nature, deployment and effectiveness of learning analytics in higher education. This is particularly the case when, within the space of a decade, learning analytics has gone from being a specialist and bespoke activity of a few academics working to support student learning in specific disciplines, to one that is being adopted across the sector, with the intention of improving multiple student outcomes, from retention and progression through to attainment and engagement. Hence the focus of this special edition.

We have drawn together perspectives from across the globe on the role of learning analytics in achieving equity of attainment. As universities move towards a more automated system of identifying and prioritising students who need support, we wonder whether this will lead to a more inclusive and equitable learning environment. Will centralised, data-driven support systems be able to shed light on systemic issues within the institution that result in advantaging some students over others? As we began to curate the authors we wanted to include in this special edition, whose research was starting to address such questions, four key themes emerged. These are:

  • efficacy and effectiveness of learning analytics

  • data driven decision-making

  • analytics and the pedagogy of learning

  • power, culture and control

Throughout the discussion that follows, these four themes offer a critical framework within which learning analytics and its impact are assessed, drawing upon a range of contemporary research and inquiry.

Efficacy and effectiveness of learning analytics

Despite its continuing evolution and expansion, research on the efficacy and the effectiveness of learning analytics in higher education generally remains equivocal. There are systematic reviews that report positive results (Bienkowski et al. 2012; Ferguson 2012; Romero and Ventura 2013; Papamitsiou and Economides 2014; Ahern 2018; Vieira et al. 2018; Robertshaw and Asher 2019). There are also systematic reviews that are more circumspect (Sonderlund et al. 2018). What remains unclear from the research to date is whether learning analytics can help address differential student outcomes for disadvantaged students.

The first theme concerns the efficacy and effectiveness of learning analytics. The first three articles, by Foster and Francis, Foster and Siddle, and Nguyen et al., each report on the findings of empirical research to assess the efficacy and effectiveness of learning analytics in higher education. Foster and Francis report that learning analytics and academic analytics are primarily designed for the improvement of academic performance (although some do focus on retention), whilst learner analytics is more aligned to improving student engagement. Foster and Francis report that few published outputs deliver robust, empirically based research findings, but of those that do, three-quarters provide (mostly quantitative) evidence that the use of educational data analytics is effective in improving student outcomes. The article posits that the efficacy of learning analytics and its relationship with student outcomes require further and more robust investigation and assessment.

For Foster and Francis, learning analytics research must interpret and communicate effectiveness qualitatively by including the student voice in assessments of impact. They report that few studies actually test the hypothesis that an analytically informed personalised learning environment is causally related to improved student outcomes. They make the point that the research designs are limited to quantifying impact as opposed to qualifying it. Finally, they highlight that few studies report on the efficacy of applying analytics methods to address differential student outcomes for disadvantaged students.

This point is particularly interesting given that structural disadvantage does impact on student outcomes, particularly for disadvantaged students such as those defined as ‘widening participation’ students (Evans et al. 2018). There is evidence in UK higher education institutions to support the argument that the relationship between student agency (the inputs individuals bring to their learning) and the outputs of that learning (student achievement as measured by the grades they acquire) is shaped by more than just a student’s intellect and behaviour (Richardson 2018). BAME (Black, Asian and Minority Ethnic) students, older students and male students, for example, are less likely to access higher education, progress and graduate with the best grades. There is increasing evidence linking student outcomes to financial circumstances (Crawford 2014) and demographic factors (Soria and Stebleton 2012), as well as to a sense of community and engagement (Morrow and Ackerman 2012).

These socio-economic factors are compounded by the impact of academic performance on retention (Yorke 2001). White students are more likely than Black and Asian students to achieve higher degree outcomes (Thiele et al. 2016). Young people from lower socio-economic backgrounds are, on average, more likely than those from higher socio-economic backgrounds to withdraw from their studies, and less likely either to complete their degree or to graduate from university with an upper second or first class degree (Crawford 2014). Students entering university with ABB grades or above at A-level are also increasingly likely to graduate with an upper second or first class degree (Smith and White 2015). Moreover, the intersections between social categories of class, race, age, gender, sexuality and other personal characteristics can further exacerbate the differential outcomes for particular groups and individuals.

Universities have invested significantly in initiatives aimed at closing the ‘attainment gap’ and achieving equity of achievement by making changes, notably to curricula and to personal tutoring. Yet crucial to the potential success of such interventions is the identification of students most ‘at-risk’ in order to target intervention resource effectively. Improving individual student performance and student outcomes can be achieved by delivering more personalised learning opportunities, in which student data is analysed to target interventions at specific students, whether through, for example, gamified content delivery in the classroom (Minović et al. 2015), blended and online environments (Sharma et al. 2016; Chen et al. 2017), innovations in assessment and feedback (Khan and Pardo 2016), or the visualisation of student performance (Wise et al. 2013). From these perspectives, learning analytics is benign; data collection and use serve the student, delivering specificity of intervention, greater personalisation of experience, and opportunities to address the attainment gap and differential experiences.

It is within this broad context that the research by Foster and Siddle is particularly apposite. They explore the effectiveness of learning analytics in identifying ‘at risk’ students using an assessment of their engagement measured over 14 consecutive days. Drawing upon the evidence generated from one of the first learning analytics projects in the UK, in which Nottingham Trent University generates ‘no-engagement’ early warning alerts, Foster and Siddle examined two cohorts of first year undergraduate students and found that these alerts are more effective at identifying students at risk of poorer outcomes than alerts derived from demographic data, using widening participation status as a case study example. Their research shows that, as the number of non-engagement alerts increases, the progression and achievement of the students decreases. They found the system was able to allow for the targeting of support at disadvantaged student groups based on their widening participation status, contributing to an ambition to reduce differential student outcomes for disadvantaged groups.
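
As a concrete illustration, a no-engagement rule of the kind Foster and Siddle describe can be expressed as a simple scan over daily engagement events. The sketch below is our own schematic reconstruction, assuming one set of engagement dates per student; it is not Nottingham Trent University's actual implementation, though the 14-day window follows the article.

```python
# Schematic reconstruction of a 'no-engagement' early-warning rule:
# an alert fires when a student records no engagement events (VLE use,
# attendance, library visits, etc.) for 14 consecutive days.
# Illustrative only; not NTU's actual system.
from datetime import date, timedelta

WINDOW_DAYS = 14

def term_dates(start: date, end: date):
    """Consecutive calendar dates from start to end inclusive."""
    return [start + timedelta(days=i) for i in range((end - start).days + 1)]

def alert_count(engagement_dates: set, term: list) -> int:
    """Count non-overlapping 14-day spells with no engagement at all."""
    alerts, gap = 0, 0
    for day in term:
        gap = 0 if day in engagement_dates else gap + 1
        if gap == WINDOW_DAYS:
            alerts += 1
            gap = 0  # restart the window once an alert has fired
    return alerts
```

Cumulative alert counts per student can then be set against progression and attainment data, which is how the article relates rising alert numbers to poorer outcomes.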

Technological innovation alone cannot deliver change to the achievement and performance of students. Nor can it explain the reasons for students’ non-engagement over any particular period of time. Yet what Foster and Siddle’s article demonstrates is that the harnessing of data and its analysis can help the creation of a more inclusive and equitable learning environment by identifying the students who would benefit most from an intervention. What is interesting in Foster and Siddle’s work is their argument not to use demographic data, because such data can, for them, potentially lead to individual students being treated unfairly.

With a similar focus on differential student attainment and learning analytics, Nguyen et al. utilise data drawn from student academic performance, demographic data and online ‘digital’ traces to investigate the usefulness of learning analytics in providing insight into the attainment gap between BAME and white students on distance learning programmes. Underpinned by a review of the research literature on the attainment gap, Nguyen et al. assess the range of factors put forward to explain the differences in a study carried out in the UK’s largest university, the Open University. The proxy for behavioural engagement in this study is students’ trace data in a virtual learning environment (VLE) across 401 modules. Whilst other studies have focused on student attainment at programme level, this study drills down to module attainment, in order to examine the specificity of factors that might impact on student performance.

Nguyen et al.’s research highlights that learning analytics is able to identify that, even when taking into account prior qualification and engagement, BAME students are less likely to complete, pass or achieve an excellent outcome (measured by the achievement of a 2:1 or first class degree in the UK awarding system) than their white counterparts. Moreover, they find that BAME students spent between 4% and 12% more time studying, suggesting that they put in more effort than white students to achieve the same level of outcome. Despite being the most disadvantaged, Black students had the highest level of engagement of any ethnic group. Nguyen et al. also report that, when controlling for academic outcome, white students spent the least time on the VLE compared to other ethnicities. The reasons for this engagement could be many and varied. It could reflect effort; it could reflect students’ difficulty in understanding the material; or it could reflect the nature of the material and the way it is interpreted by students of different backgrounds.

Data driven decision-making

A second theme of this special edition concerns whether learning analytics can encourage improved individual agency (among students, and academic and professional support staff) and better data driven decision-making. Visualising complex multi-layered data to students in accessible ways that allow them to understand their performance against other students and other variables, such as engagement (VLE usage, attendance), can, it is argued (Jayaprakash et al. 2014; Mejia et al. 2017), improve their awareness, reflexivity, participation, involvement and motivation, and thus their ability to deliver actionable attitudinal and behavioural changes. Charleer et al. (2013), for example, report that students presented with their formative feedback visually are more aware of what is required to succeed in the task. Empowering students to self-regulate through greater understanding of their data is core to the aim of learning analytics.
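
One simple example of such a visualisation is a percentile comparison against the cohort. The sketch below, under our own illustrative assumptions about the data held (weekly VLE hours per student), shows the kind of calculation a student-facing dashboard might make before rendering it graphically.

```python
# Sketch of a dashboard-style comparison: a student's weekly VLE hours
# expressed as a cohort percentile, of the kind used to prompt
# reflection and self-regulation. Data and names are illustrative.
from bisect import bisect_right

def cohort_percentile(student_hours: float, cohort_hours: list) -> float:
    """Share of the cohort (as a percentage) at or below the student's hours."""
    ordered = sorted(cohort_hours)
    return 100.0 * bisect_right(ordered, student_hours) / len(ordered)

cohort = [2.0, 3.5, 5.0, 6.5, 8.0, 9.0, 12.0, 15.0]
print(f"You were more active this week than "
      f"{cohort_percentile(6.0, cohort):.0f}% of your cohort")
```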

The articles by Broos et al. and Khatri et al. examine the opportunities that learning analytics offers to enhance student agency, whether by improving student decision-making or by developing their understanding of feedback and engagement. Broos et al. explore the impact that learning analytics can have on student decision-making via an assessment of two self-service dashboards. The context is the transition of students into higher education and the opportunities that learning analytics offers for supporting student transition to university. Transition is a recognised challenge for many students, who move from the relative comfort of one learning environment (secondary or further education) to another (higher education). With different approaches and expectations relating to pedagogy, engagement, independence and feedback, it is not easy for many students to transition to and navigate university culture, systems and processes. However, the nature and extent of the challenge differs amongst students and can be related to their individual background circumstances, demographics, qualifications and prior knowledge of higher education (such as whether they are first generation students). The nature and quality of early student engagement can thus be an important element in transitioning, and once at university, feedback can help support the process.

For Broos et al., student facing dashboards can represent data about the learning process visually to students, whilst also enabling more actionable decision-making. They report using a case study methodology in nine participating STEM (science, technology, engineering and mathematics) study programmes, and present evidence from a) the tracking of student usage of two dashboards, and b) a before-and-after survey of students in their second year. The dashboards provide feedback on learning and study skills, and on academic achievement, to first year undergraduate students in several STEM study programmes in Belgium. Carried out at the Catholic University of Leuven, the study reports that learning analytics enhances the decision-making of students through the visualisation of their data, in effect helping support the development of feedback literacy amongst the student cohort.

The enablement of better decision-making amongst students is also the focus of the article by Khatri et al. The authors report on work exploring the impact that learning analytics, combined with serious gaming, can have on student motivations for career planning. The purpose of the game is to engage the student; the learning analytics is used as a means to analyse activities and subject preferences in order to support personalised career counselling. The study was carried out with first year MBA students, with the intention of analysing the impact of the tool on the learning effectiveness of the students. The study identifies that learning analytics is a significant intervention in enhancing the learning effectiveness of students.

It has been reported that allowing academic staff to view complex student data visually can also deliver more effective tailored interventions better aligned to the needs of the student (Liu et al. 2016). Shimada et al. (2017), for example, in a study of the use of student data visualised in real time to academic staff whilst teaching, reported that it enabled the tutors to adjust their delivery speed and style according to the cohort in front of them, aligning delivery to the ambition of improving student outcomes.

Gray et al. pick up this theme and explore the potential of learning analytics to enable consistency in marking and moderation of assessment. The article examines the use of learning analytics to give feedback directly to academic staff in order to inform their practice. The authors introduce the concept of Degree Pictures, coupled with contemporary visualisation techniques enabled through learning analytics, with the aim of providing a symbolic overview of a student’s achievement in 16 representations. Gray et al. argue that visualising student achievement in this way provides ‘a wide-angle view of a student’s entire university career’ that can extract a set of common patterns and generate appropriate objective responses from examiners when making adjustment decisions to final classifications conferred. Drawing upon the authors’ reflections on the development of the tool, and an evaluation of the tool by educationalists involved in its piloting, the authors suggest that Degree Pictures can address the subjective nature of assessment. This is because the tool is able to develop greater emphasis on data driven decision-making and also impact positively on the way in which interventions can be developed and delivered.

Research strategies, learning analytics and the pedagogy of learning

A third theme of this special edition concerns the relationship between student data, its analysis for interventions, and the pedagogy of learning and teaching. A decade or so ago, learning analytics appeared mostly as a technological solution, reacting and responding to the emergence and availability of big data, rather than to the specific needs of learners and educationalists, or indeed of the institutions within which they studied and worked. Context was lacking, as was the voice and participation of students and staff. Particularly lacking in the formative years was empirical, conceptual and theoretical insight into the ways in which learning analytics was able to improve student outcomes. A detailed theory of change that describes comprehensively the link between inputs (data, analysis), outputs (interventions) and appropriate outcomes (improved student performance) is still lacking within much that has been written on learning analytics.

Sadly, today the student voice is not as front and centre, nor as involved in the development and conduct of research into learning analytics, as it should be. This is surprising given that a large proportion of studies (see Papamitsiou and Economides 2014; Sonderlund et al. 2018) report learning analytics as an effective enabler of targeted interventions to students. What is often lacking is evidence regarding why the interventions work; the student perspective on what it was about the intervention or nudge that impacted positively upon them and their outcomes. Central to many of these criticisms is a predominance of quantitative research methodologies, aimed principally at reporting the significance of correlations rather than the explanatory reasons behind them. Certainly mixed methodologies are noticeable by their absence in much of the research literature. It should also be noted that learning analytics remains a field of inquiry influenced by the technological companies involved in its implementation, and there remains too much advertorial reporting of efficacy and effectiveness that, in our view, often lacks robust peer review and external assessment.

The weakness of learning analytics research is a point forcefully picked up by Tormey et al. in their contribution, which focuses on self-regulated student learning in the context of distance learning. It is their contention that weaknesses—both empirical and conceptual—in learning analytics research mean that there is limited understanding of how interventions work, and importantly why. Tormey et al. note the lack of research on whether learning analytics is impactful both on student learning, and on the practice of learners and teachers. They note the lack of pedagogical sophistication, and the limited theoretical frame of reference that underpins the design of many learning analytics projects. For Tormey et al., the issue is that whilst learning analytics is presented as enabling student self-regulated learning, much of it is developed without clear pedagogical reference points, and without an understanding of the context within which it is used. As they state, it often appears like so many other computer aided learning tools which ‘are often little more than a solution in search of a problem’. Many such tools are, they suggest, merely data driven, ‘often developed by those with an interest in algorithms, data analytics and data visualisation, but who hold naïve views of how learning and teaching happens.’

For learning analytics to be effective, Tormey et al. argue, it needs to be informed by the concepts and ideas which emerge from learning (pedagogical) research. The article identifies the need for greater theorisation in the design of self-regulated learner analytics tools. The authors outline a series of design issues to be considered when developing self-regulated learner approaches to analytics; and they report on their development of one such tool, the Learning Companion, designed to improve students’ skills in problem solving in scientific and mathematical exercises.

In a similar vein, Tempelaar reports on the development of a theory of change in relation to the use of assessment for learning and feedback in a blended learning environment. He begins by noting how formative feedback can be generated from student trace learning activity data (in this case, the number of problems the student has attempted to solve, how many scaffolds the student has used, and the time taken on each task). He describes these as ‘the digital footprints in the learning process’, and argues these can help support interventions both instructor led and student initiated.

Tempelaar also notes that demographics, prior schooling, learning dispositions, internationalisation and differing educational backgrounds can all lead to differential opportunities for students to succeed. It is within this context that he draws upon sophisticated data collection and analysis (dispositional learning analytics) as part of a research study carried out on a large and diverse group of students undertaking their first university module. He details the use of trace and disposition data—student trace data and self-response survey data—to understand how students find their own learning paths in technology enhanced learning environments, and what learning approaches can be seen as antecedents of such choices. He argues that digital learning platforms based on the mastery learning concept, which integrate assessment as, for and of learning and teaching, can play a key role in solving the issue of unequal chances. He demonstrates this by researching a challenging quantitative methods course for first year students and by contrasting the learning approaches of two groups of students. The findings suggest that the intensity of use of interventions varies between student clusters, and thus the time and effort required differs, although the study does not report module performance differences.
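
Tempelaar's contrast between groups rests on clustering students by their trace features. The sketch below illustrates that style of analysis under our own assumptions: the three features echo the trace data he lists (problems attempted, scaffolds used, time on task), but the clustering method, values and cluster count are invented for illustration rather than taken from the study.

```python
# Illustrative clustering of students by trace features of the kind
# Tempelaar describes. Method, feature values and cluster count are
# assumptions for this sketch, not the study's own procedure.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# rows: students; columns: problems attempted, scaffolds used, minutes on task
traces = np.array([
    [40,  5, 300],
    [38, 30, 420],
    [12,  2,  90],
    [45, 28, 390],
    [10,  1, 110],
])

X = StandardScaler().fit_transform(traces)      # put features on one scale
labels = KMeans(n_clusters=2, n_init=10).fit_predict(X)

# Compare the clusters' average intensity of tool use, mirroring how the
# study contrasts the learning approaches of the two student groups.
for k in range(2):
    print(f"cluster {k}: mean trace profile {traces[labels == k].mean(axis=0)}")
```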

Power, culture and control

As a number of the articles in this special edition confirm, learning analytics has the potential to improve student agency and enable greater personalisation and transparency, supporting whole university or institutional approaches to student success. It can also enable institutional responses to student under-performance, creating environments that locate the learner centre stage, and which offer innovative ways of intervening in order to improve student attainment. This should not be a surprise. First, learning analytics is both emerging and evolving, and the last ten years have witnessed the development of many new and dynamic approaches to supporting students as a consequence of its implementation. Second, learning analytics is acknowledged as enabling the personalisation of services, allowing learners to receive an experience based on their own preferences, needs and profiles. Third, learning analytics often involves technology start-up companies that spend considerable time, effort and resource on testing their products within the sector, alongside evaluating and promoting the potential benefits deriving from them. Fourth, institutions are constantly on the lookout for new solutions, and learning analytics is clearly one such opportunity to strengthen student support and engagement.

Nevertheless, tensions do co-exist between such opportunities and the risks that the deployment and use of learning analytics can pose—risks to the individual student, as well as broader risks regarding privacy and ethics that can impact negatively on institutions. Often such risks are couched within wider debates concerning surveillance, regulation and control. Indeed, some commentators have begun to highlight the potential problems associated with the deployment and use of learning analytics (such as privacy, ethics and consent), or with the use of big data to widen the net and thin the mesh of student regulation and control. The most vociferous commentaries about learning analytics have arisen in the context of debates about the ‘platform society’—one in which our use of platforms (such as Facebook, Twitter, and digital learning environments) is routine—the implications of which, for some, are not always as benign as expected. Platforms enable individuals to interact for the purpose of coordination, collaboration and exchange. However, they also allow for the extraction and control of data, providing proprietary access to record traces of activity and use data to generate new forms of value. Probing questions arise, such as: in whose interests does learning analytics operate, why, how and with what impact? After all, learning analytics is able to capture data available as a result of the use of platforms, with engagement, behavioural and demographic data all available to be collected, collated and analysed. For us, what is required is a critical assessment of learning analytics in the context of the platform university (see Robertson 2018).

The final theme running throughout a number of the articles in this special edition is that of power, culture and control. The three articles by Tsai et al., Broughan and Prinsloo, and Archer and Prinsloo bring to the fore debates about the challenges that learning analytics poses, especially in the context of empowering students to be the best they can be.

Although not necessarily polarised, significant tensions do exist, as Tsai et al. articulate. Rather than assuming that learning analytics enables student agency in a highly regulated environment, Tsai et al. explore the impact that the collection and use of trace data can have in ‘datafying’ students within a context and culture of asymmetrical power relations between staff, students and institutions, as well as complex institutional arrangements involving humans and machines. Drawing upon primary research findings collected through focus groups with staff and students, their analysis offers insight into the very real tensions that exist between student empowerment through learning analytics, and the diminishment of student autonomy through their datafication. Highlighting individual and institutional power, culture and control as key to these tensions, they argue that it is important that empowerment is not assumed to have arisen merely from the implementation of learning analytics. Rather, they argue that power, and the nature of the relationship between machines, people and their data, must be accounted for to secure equity and the agency of the learner. Moreover, they highlight the need for engagement between students, staff and institutions as part of learning analytics development and interventions.

In a similar vein, Broughan and Prinsloo highlight the need for critical reflection on the development and implementation of forms of learning analytics that are becoming ever more commonplace within higher education institutions. Not only is there more data than ever before, but much of it is behavioural and demographic, and whilst offering huge potential, it also presents huge challenges to individual institutions and the sector more generally. They make the case that ‘While the notion of “student-centred” is well established in the discourses and practices surrounding assessment and evaluation, the concept of student-centred learning analytics is yet to be fully realised by the sector’. Drawing upon the teachings of Freire as a framework to understand and address the inherent disadvantages that current systems and approaches place on certain groups of students, Broughan and Prinsloo argue that higher education institutions would benefit from a Freirean approach to learning analytics that is sympathetic to individualised learner pathways, capable of responding ethically and equitably to help all students succeed, and empowering to all involved.

The potential for learning analytics to impact negatively is a key theme running throughout this special edition. This is particularly the case as learning environments and university campuses become increasingly digital and technologically rich, and learning is delivered in blended forms or at a distance, leaving greater opportunities for data traces to be available for collection, collation and manipulation. Power, culture, inequality, social context and ethics are all key elements that need addressing when institutions develop and implement learning analytics, otherwise considerable negative impact may result. Situating their critical discussion in the context of student success and failure in higher education, Archer and Prinsloo ‘speak the unspoken’ about learning analytics in order to open up to critical debate and investigation what they view as five default positions on learning analytics. These are: 1) the deficit thinking informing algorithms and nudges; 2) conflating ‘what’ is happening with why; 3) failure as the inverse of success; 4) the unequal cost of false positives and negatives; and 5) the paradox of objective science. They argue that if these default positions are not identified and interrogated, they may very well negatively impact on student learning, the student experience and the reputation of learning analytics within the sector.

Conclusion

It is clear that higher education has made advances in the deployment of learning analytics, due both to technological advances in the field and to their potential to respond to demand from stakeholders for evidence that equity, diversity and inclusivity issues are being addressed successfully. However, while institutional leaders have been keen to enter the race to engage learning analytics, they have evidenced limited understanding of the ontological lens through which it should be viewed and/or the theory of change to be deployed. Importantly, they have failed to engage with perhaps the most important of the stakeholders in its use—students. This special edition suggests that whilst there is evidence of impact, it is limited and confined to specific research studies, and more generally, robust evidence regarding the efficacy of learning analytics is limited. The special edition also highlights that, deployed without caution, learning analytics can lead to the replication of disadvantage within the sector.

This special edition offers a salutary caution that, as we head towards a future higher education sector that is governed, measured and rewarded by the data it produces, we need to assure ourselves categorically that the data we gather, and the subsequent interpretation of that data, are based on sound principles. The authors of the articles in this special edition do, however, suggest that learning analytics can have a positive impact on the outcomes of all students irrespective of demographic background or personal circumstances. What they offer is a set of guidelines which, if taken into consideration, can lead to learning analytics adding value to the learning experience for the student, tutor, institution and policy maker/regulator. A recent development has been the potential use of learning analytics to support and promote mental health and wellbeing for students. The guidelines offered here are, one might argue, even more significant for this very personalised student experience.

As we enter a more challenging period for higher education, technology, alongside competition and globalisation, will continue to shape the future success of the sector. It is our view that learning analytics will continue to be deployed across higher education, aided by the impetus of technological companies promoting their worth on the back of narratives about efficiency, effectiveness and value for money, and that it will, to an extent, come to shape the nature of the sector and the student experience for years to come. In this context, it is our view that a critical realist research agenda is required, one that can begin to provide reflective, critical and insightful assessments of learning analytics, utilising mixed methodologies and empowering students and their voice as co-creators of future learning analytics practices and evaluations.

In this context, we would hope that any future research agenda encapsulates the following questions:

  • How can learning analytics be applied for success? How can we devise research that not only tells us ‘what’ but ‘why’, and develops theories of change rather than just quantitative correlations?

  • How can learning analytics be inclusive? How can students be more involved and have truly informed consent about the way their data is used, as well as a voice in the design, deployment, implementation and evaluation of learning analytics?

  • How can learning analytics support educational success? How can equity of attainment become a significant area of intervention and evaluation, as well as moving beyond traditional areas of focus to new areas such as employability and mental health and wellbeing?

  • How can biases be avoided: for instance, by avoiding using data that perpetuates biases of the past?

  • In whose interest does learning analytics operate, how and why?

In an era where higher education institutions have access to more data than ever before, we respectfully suggest that institutions pause to consider the value that learning analytics brings to: the learning process, institutional processes, teaching pedagogies and student learning.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Peter Francis

Peter Francis is Deputy Vice Chancellor and Professor of Criminology at Northumbria University, UK. With a track record in higher education leadership, he is currently leading a major government funded project in student mental health and wellbeing through advanced educational data analytics, relationship management, and effective models of support.

Christine Broughan

Christine Broughan is Professor of Higher Education and Director of Curriculum 2025 at Coventry University. She has a proven track record in higher education leadership, teaching and research. With a key role in equity and social inclusion, she has been instrumental in the transformation of Coventry’s teaching and learning provision.

Carly Foster

Carly Foster is a data specialist who has worked in various sectors including private and public transport and most recently higher education. She has designed, delivered and evaluated several significant projects which have seen businesses transition to data and analytics informed practice.

Caroline Wilson

Caroline Wilson is an Associate Professor and Curriculum Change Lead at Coventry University, embedding issues such as sustainability and inclusivity into the learning environment. Research interests include learning from other disciplines how to invoke positive change.

References

  • Ahern, S. J. 2018. “The Potential and Pitfalls of Learning Analytics as a Tool for Supporting Student Wellbeing.” Journal of Learning and Teaching in Higher Education 1(2): 165.
  • Beer, D. G. 2017. “The Data Analytics Industry and the Promises of Real-Time Knowing: Perpetuating and Deploying a Rationality of Speed.” Journal of Cultural Economy 10(1): 21–33.
  • Bienkowski, M., M. Feng, and B. Means. 2012. Enhancing Teaching and Learning through Educational Data Mining and Learning Analytics: An Issue Brief. Available at https://tech.ed.gov/wp-content/uploads/2014/03/edm-la-brief.pdf
  • Charleer, S., J. Klerkx, J. L. Santos, and E. Duval. 2013. “Improving Awareness and Reflection through Collaborative, Interactive Visualizations of Badges.” CEUR Workshop Proceedings 1103: 69–81.
  • Chen, J., J. Xu, T. Tang, and R. Chen. 2017. “WebIntera-Classroom: An Interaction-Aware Virtual Learning Environment for Augmenting Learning Interactions.” Interactive Learning Environments 25(6): 792–807. doi:10.1080/10494820.2016.1188829.
  • Crawford, C. 2014. Socio-Economic Differences in University Outcomes in the UK: Drop-Out, Degree Completion and Degree Class. IFS Working Papers, No. W14/31. doi:10.1920/wp.ifs.2014.1431.
  • Evans, C., G. Rees, C. Taylor, and C. Wright. 2018. “Widening Access’ to Higher Education: The Reproduction of University Hierarchies through Policy Enactment.” Journal of Education Policy 34(1):101-116. doi:10.1080/02680939.2017.1390165.
  • Ferguson, R. 2012. “Learning Analytics: Drivers, Developments and Challenges.” International Journal of Technology Enhanced Learning 4(5/6): 304–317.
  • Francis, P., and C. Foster. 2018. “Educational Analytics: A Systematic Review of Empirical Studies.” Paper presented to Advance HE Surveys Conference, Leeds, 9th May 2018. Available at https://s3.eu-west-2.amazonaws.com/assets.creode.advancehe-document-manager/documents/hea/private/hub/download/Advance%20HE%20Surveys%20Conference%202018%20-%20Learning%20analytics%20-%20Peter%20Francis%20and%20Carly%20Foster_1568037595.pdf
  • Jayaprakash, S. M., E. W. Moody, E. J. M. Lauría, J. R. Regan, and J. D. Baron. 2014. “Early Alert of Academically at‐Risk Students: An Open Source Analytics Initiative.” Journal of Learning Analytics 1(1): 6–47.
  • Khan, I., and A. Pardo. 2016. “Data2U: Scalable Real Time Student Feedback in Active Learning Environments.” In Proceedings of the Sixth International Conference on Learning Analytics and Knowledge (LAK 2016), 25–29 April 2016, Edinburgh, UK, 249–253. New York: ACM. doi:10.1145/2883851.2883911.
  • Lang, C., G. Siemens, A. Wise, and D. Gasevic, eds. 2017. Handbook of Learning Analytics. Beaumont, Canada: Society for Learning Analytics Research (SoLAR). doi:10.18608/hla17.
  • Liu, D. Y. T., C. E. Taylor, A. J. Bridgeman, K. Bartimote-Aufflick, and A. Pardo. 2016. “Empowering Instructors through Customizable Collection and Analyses of Actionable Information.” CEUR Workshop Proceedings 1590: 3–9.
  • Mejia, C., B. Florian, R. Vatrapu, S. Bull, S. Gomez, and R. Fabregat. 2017. “A Novel Web-Based Approach for Visualization and Inspection of Reading Difficulties on University Students.” IEEE Transactions on Learning Technologies 10(1): 53–67.
  • Minović, M., M. Milovanović, U. Šošević, and M. A. Conde-González. 2015. “Visualisation of Student Learning Model in Serious Games.” Computers in Human Behavior 47(C): 98–107.
  • Morrow, J., and M. Ackerman. 2012. “Intention to Persist and Retention of First-Year Students: The Importance of Motivation and Sense of Belonging.” College Student Journal 46(3): 483–491.
  • Papamitsiou, Z., and A. A. Economides. 2014. “Learning Analytics and Educational Data Mining in Practice: A Systematic Literature Review of Empirical Evidence.” Educational Technology & Society 17(4): 49–64.
  • Richardson, J. T. E. 2012. “The Attainment of White and Ethnic Minority Students in Distance Education.” Assessment & Evaluation in Higher Education 37(4): 393–408. doi:10.1080/02602938.2010.534767.
  • Richardson, J. T. E. 2018. “Understanding the Under-Attainment of Ethnic Minority Students in UK Higher Education: The Known Knowns and the Known Unknowns.” In Dismantling Race in Higher Education: Racism, Whiteness and Decolonising the Academy, edited by J. Arday and H. S. Mirza, 87–102. London: Palgrave Macmillan.
  • Robertshaw, M. B., and A. Asher. 2019. “Unethical Numbers? A Meta-Analysis of Library Learning Analytics Studies.” Library Trends 68(1): 76.
  • Robertson, S. 2018. “Platform Capitalism and the New Value Economy in the Academy.” In World Yearbook of Education 2019: Methodology in an Era of Big Data and Global Networks, edited by R. Gorur, S. Sellar, and G. Steiner-Khamsi. London: Routledge.
  • Romero, C., and S. Ventura. 2013. “Data Mining in Education.” Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 3(1): 12–27.
  • Sharma, K., H. S. Alavi, P. Jermann, and P. Dillenbourg. 2016. “A Gaze-Based Learning Analytics Model: In-Video Visual Feedback to Improve Learner’s Attention in MOOCs.” In Proceedings of the 6th International Conference on Learning Analytics and Knowledge (LAK 2016), Edinburgh, Scotland, April 25-29.
  • Shimada, A., K. Mouri, and H. Ogata. 2017. “Real-Time Learning Analytics of e-Book Operation Logs for on-Site Lecture Support.” In Proceedings - IEEE 17th International Conference on Advanced Learning Technologies, ICALT 2017, edited by R. Huang, R. Vasiu, Kinshuk, D. G. Sampson, N-S. Chen, & M. Chang, 274–275. Piscataway, NJ: Institute of Electrical and Electronics Engineers Inc.
  • Smith, E., and P. White. 2015. “What Makes a Successful Undergraduate? The Relationship between Student Characteristics, Degree Subject and Academic Success at University.” British Educational Research Journal 41(4): 686–708. doi:10.1002/berj.3158.
  • Sonderlund, A., E. Hughes, and J. R. Smith. 2018. “The Efficacy of Learning Analytics Interventions in Higher Education: A Systematic Review.” British Journal of Educational Technology.
  • Soria, K., and M. Stebleton. 2012. “First-Generation Students' Academic Engagement and Retention.” Teaching in Higher Education 17(6): 673–685. doi:10.1080/13562517.2012.666735
  • Tempelaar, D. T., B. Rienties, and B. Giesbers. 2015. “In Search for the Most Informative Data for Feedback Generation: Learning Analytics in a Data-Rich Context.” Computers in Human Behavior 47: 157–167. doi:10.1016/j.chb.2014.05.038.
  • Thiele, T., A. Singleton, D. Pope, and D. Stanistreet. 2016. “Predicting Students’ Academic Performance Based on School and Socio-Demographic Characteristics.” Studies in Higher Education 41(8): 1424–1446. doi:10.1080/03075079.2014.974528.
  • Vieira, C., P. Parsons, and V. Byrd. 2018. “Visual Learning Analytics of Educational Data: A Systematic Literature Review and Research Agenda.” Computers & Education 122: 119–135. doi:10.1016/j.compedu.2018.03.018.
  • Wise, A. F., Y. Zhao, and S. N. Hausknecht. 2013. “Learning Analytics for Online Discussions: A Pedagogical Model for Intervention with Embedded and Extracted Analytics.” In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK 2013), Leuven, Belgium, 48–56. New York, NY: ACM.
  • Yorke, M. 2001. “Formative Assessment and Its Relevance to Retention.” Higher Education Research & Development 20(2): 115–126. doi:10.1080/758483462.
