Editorial

The long-standing tension between accountability and improvement has been a focus of Quality in Higher Education since the inception of the journal in 1995 and it continues to be a perennial debate internationally (Williams, Citation2016). In this issue, it is the focus of a discussion of contemporary Namibian higher education. Ngepathimo Kadhila and Nangula Iipumbu describe the implementation of new quality assurance processes at national and institutional levels: they argue that there is much work to be done by the sector as a whole to instil a sense of ownership of quality assurance process and practice amongst academics.

Similarly, there has been much debate about how students, as a key stakeholder in higher education, can take ownership of quality processes. In this issue, two articles explore various facets of the role of students in contemporary quality processes. In their article on Italian higher education, Serafina Pastore, Amelia Manuti, Fausta Scardigno, Antonella Curci and Monica Pentassuglia argue that despite the wide acknowledgement of the potential for student feedback, there is little evidence that Italian students actually take part in the quality process, for a range of reasons including traditional notions of deference. In contrast, in their study of Erasmus Mundus programmes, Rediet Adebe and Krisztina Ford find that students are much more willing to be involved in quality processes. However, they argue that organisational culture, national policies and regulations are the main barriers to such participation.

At the core of many of the debates about quality in higher education is a concern with the efficacy of standards and performance indicators. In this issue of Quality in Higher Education, two articles contribute to this debate. Khaled Alzafari and Jani Ursin assess the implementation of the European Standards and Guidelines (ESG) across different European countries and highlight the impact of national differences on the implementation of the ESG. In the context of German higher education, Theodore Leiber identifies a comprehensive set of performance indicators for learning and teaching and argues that learning and teaching quality must be approached in a holistic way across the four subdomains of learning and teaching environment, teaching processes, learning processes, and learning outcomes and their assessment. Leiber’s work is unusual because he addresses one of the fundamental conundrums of the quality debate: how learning and teaching are actually measured.

Occasionally, however, new issues emerge and one of the most prominent has been digitalisation and its potential impact on quality management. Digitalisation has become an increasingly important part of the life of staff and students in higher education and is now an issue that is coming under the purview of quality assurance agencies and institutions. In this issue, Cathrine Edelhard Tømte, Trine Fossland, Per Olaf Aamodt and Lise Degn provide a comparative analysis of the use of technology in learning and teaching. They highlight how processes of digitalisation can influence the development of learning and teaching strategies that are more effective.

Learning analytics and its potential for quality assurance

The last two articles in this issue highlight the important and influential role played by ‘learning analytics’ in contemporary higher education, which has implications for quality management. The term ‘learning analytics’ has been most recently defined as ‘the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs’ (Sclater, Citation2017, p. 13). The data collected as part of this process include what is often called the ‘digital footprint’ left by students, that is, data collected through admissions systems, attendance monitoring, and the use of virtual learning environments and other learning resources. As a tool for monitoring, learning analytics provides an ideal source of data for measuring performance and adds hugely to the potential of institutional research. Digitalisation, as hinted at by Tømte and colleagues in this issue, provides a seemingly endless amount of data about almost every aspect of students’ lives, which can be used to evaluate institutional performance against various standards and performance indicators.

Concomitantly, learning analytics can provide a valuable tool for improvement of the student learning experience. Teaching staff can use such data to improve their own practice and institutions can use learning analytics proactively as a diagnostic tool. Potentially, learning analytics can be useful for academic staff in introducing and developing a more personalised approach to learning and teaching because it provides information about how students actually engage with learning material through their use of virtual learning environments, libraries and other facilities. As such, learning analytics can help institutions to use analytics to identify students who are at risk of failing and put in place interventions to advise and support such individuals at an earlier stage than would otherwise be possible (Sclater et al., Citation2016).
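The early-warning use of learning analytics described above can be illustrated with a minimal sketch. Everything here is hypothetical: the engagement metrics (attendance rate, virtual learning environment logins, library use), their equal weighting, and the risk threshold are illustrative assumptions, not a description of any institution's actual system.

```python
# Illustrative sketch only: a simple at-risk flagging rule over a student's
# 'digital footprint'. Field names, the equal-weight average and the 0.5
# threshold are all hypothetical choices for the purpose of illustration.

def flag_at_risk(students, min_score=0.5):
    """Return the IDs of students whose mean engagement score, averaged
    over three metrics already normalised to the range 0..1, falls below
    min_score."""
    at_risk = []
    for s in students:
        score = (s["attendance_rate"]
                 + s["vle_logins_norm"]
                 + s["library_use_norm"]) / 3
        if score < min_score:
            at_risk.append(s["id"])
    return at_risk

cohort = [
    {"id": "A1", "attendance_rate": 0.9, "vle_logins_norm": 0.8, "library_use_norm": 0.7},
    {"id": "B2", "attendance_rate": 0.4, "vle_logins_norm": 0.2, "library_use_norm": 0.3},
]
print(flag_at_risk(cohort))  # B2's mean engagement (0.3) falls below 0.5
```

In practice such rules would need far more care over metric choice, weighting and validation, but the sketch captures the basic mechanism: engagement data are combined into an indicator that triggers earlier advisory intervention.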

Learning analytics is still relatively under-researched and much of the work currently highlights its potential value rather than demonstrating its efficacy. There is some evidence, both published and anecdotal, of examples of teaching practice that has been informed by lecturers’ use of learning analytics (see for example, Rutherford et al., Citation2015). However, there is little published research showing that learners’ practice has actually improved as a result of teachers’ use of learning analytics (Viberg et al., Citation2018).

The success of learning analytics faces five major challenges. First, it depends on a sensitive use of the data collected as part of the learning analytics process: a crude use of digital data can lead to unhelpful labelling of groups of students. Second, it is vital that academics can gain access to the data that they need and use it effectively; institutional responses to developments such as the introduction of the General Data Protection Regulation (GDPR) may lead to restrictions on staff access to student data. Third, there needs to be a shared institutional understanding of the purpose of collecting data, as different people have different understandings of what data is collected and why. Fourth, data collection and management needs to be joined-up between the institution and the different departments and individuals within it. Fifth, all of this assumes that the data that is collected is accurate and useful.

Existing data collection often, it seems, fails to adequately address current needs. Data is collected but not always for a clear purpose other than, often, simply to comply with external requirements. Clearly, care is needed to ensure that the potential of learning analytics is realised. In the United Kingdom, for example, the Higher Education Commission (HEC) recommended that analytics systems be driven by the improvement of learning and teaching processes and student engagement. The Commission also argued that such systems should be designed in consultation with students and supported by an ethical framework or policy; they should also be tailored to the particular needs of each institution and embedded in institutional strategic plans (HEC, Citation2016).
