Business and Management Education in HE
An International Journal
Volume 1, 2014 - Issue 1

Signals from the Silent: Online Predictors of Non-success in Business Undergraduate Students


Abstract

Changes to student funding, in parallel with the introduction of new technologies into teaching and learning and a blurring of traditional boundaries between full- and part-time study, have brought an increased focus on retention in higher education. As more providers look towards offering online portfolio options, this focus is likely to intensify, with some evidence that drop-out on online-led modules can be higher.

As many classroom-based behavioural indicators of student retention are reduced or absent in the online environment, institutions can be slower to understand that students are considering dropping out, with the first indication of problems being when withdrawal takes place.

This paper draws on data from online interactions among 3000 students studying an introductory Business Studies module at the Open University to identify early online indicators of potential drop-out. Findings suggest that erratic interaction patterns, and marked reductions in activity among previously active students, are predictors of subsequent withdrawal.

Existing demographic data (on age, gender, etc.) used by the institution has historically allowed broad ‘at risk’ categories of student to be identified; coupling this information with some of the online behavioural indicators revealed through new tracking technology allows individuals to be targeted more precisely and more promptly than before. This may potentially change a student support model from reactive to proactive, allowing the institution to offer additional support as soon as such signs emerge and thereby improve student retention.

Universities seeking to increase student retention may find early warning of vulnerable students useful in targeting appropriate interventions.

Introduction and literature review

Online learning has been part of Higher Education Institution (HEI) portfolios since the 1990s, with many business schools investing heavily in technical infrastructure and in the underpinning learning design (CitationOpen University Learn Design Initiative, OULDI 2012). This investment has brought rapid growth in this area, both from individual institutions and in partnership; many Open University (OU) modules are now offered online, or with an online-only option. Developments such as Massive Open Online Courses (MOOCs) are widely cited as potentially significant for the higher education environment. Commentators such as CitationCoughlan (2012) have referred to ‘a first step towards a major shift in higher education – with implications for the current constraints on time, capacity and funding’. Early adopters show significant student interest in this model; Anant Agarwal of edX is aiming to reach a billion students over a decade, saying: ‘We’ve got 400,000 in four months with no marketing, so I don’t think it’s unrealistic’, as quoted by CitationCadwalladr (2012).

The 2008 survey of technology-enhanced learning for higher education in the UK by the Universities and Colleges Information Systems Association (UCISA) found that enhancing the quality of teaching and learning, and meeting student expectations, are seen as the two most significant drivers for institutions to invest in new technologies (CitationBrowne et al. 2008: 2, Executive Summary). Improving access to learning for students off campus and for part-time students was also cited as being important, although there are many differing perceptions of the implications of the investment in learning technologies. CitationKirkwood and Price (2006: 6) state that:

When considering how ICT [information and communication technology] can be used to support higher education, some teachers think primarily about content or materials. They see ICT in terms of its capacity to store and deliver teaching materials, or its potential role in finding and retrieving dispersed resources. Others think of ICT primarily in terms of the communication that it can facilitate and the dialogue that can be enabled ….

In addition to the educational drivers for ICT investment, there are also significant economic factors (see CitationBramble & Panda 2008). CitationRumble (2001: 83) cites six areas of cost to an institution of a fully developed e-learning system, indicating that e-administration may be an area where savings occur as ‘institutions invert traditional processes, such as student services, to focus more on web-based, self-service models’, despite an acknowledgement of little information on this area at the time of writing.

As online learning options increase, however, it is important to question the impact on both learners and institutions in terms of retention. CitationSimpson (2008: 11) cites both US figures, ‘where the consultants Corporate Xchange found dropout rates from e-learning of around 71%’, and OU figures where one online-only module has poorer retention than any other module offered by the institution, illustrating that this issue is of greater importance here than at face-to-face institutions.

Similar reasons have been suggested for student drop-out on online courses as on their face-to-face counterparts. CitationArbaugh et al. (2010: 45) suggest that: ‘failure rates for online business communication courses suggest similar causes to those associated with failure in classroom-based courses’.

There are other useful comparisons to be made with studies done in the traditional ‘on-site’ environment. An early study on retention management at face-to-face institutions, ‘Catching the Early Walker’ (CitationBennett et al. 2007), used data on students’ movements captured via an ‘electronic turnstile system’. The captured data alerted staff to those students who had passively withdrawn in the first few weeks of commencing study. Staff then attempted to contact these students to establish the reason for withdrawal, offering a way of managing the key factors behind early withdrawal.

CitationBennett et al. (2007: 112) highlighted in a section entitled ‘Commitment to being a Student’ that students with specific career-related goals were much more likely to be retained than those undecided about a career or not committed to being a student. Other factors identified in the paper (pp. 112–115), however, gave the institution an opportunity to ‘intervene’; examples included ‘service failures’ around late marking by tutors and ‘moments of truth’ where students acknowledged that ‘getting a degree would be difficult’. Johnson (2001) and Thomas (2002), as cited in CitationLaing et al. (2005: 172–173), suggest that retention is generally affected by factors such as age, tutorial attendance and other commitments.

American studies of relevance here include the Education Advisory Board’s 2009 report (CitationVenit 2009) on early warning strategies employed by a number of HEIs. Practices such as pre-enrolment risk models for students and usage analyses of learning systems offer some insight into the possible development of future student retention management systems in UK institutions. Research in this area is likely to grow with the online market. Purdue University’s Signals project (see CitationPistilli et al. 2012), which reports proactive use of demographic data, performance data and student engagement to increase retention, produced just under a 20-percentage-point improvement in retention for students engaged on the programme: ‘Students experiencing Signals at least once in their academic career at Purdue are retained at a rate of 87.42%, while their peers who did not are retained at 69.40%’. The Purdue University press release (CitationTally 2009) describes the project methodology as the use of combined data from sophisticated business intelligence systems that track whether students have read material, done assignments and participated in class to provide an indicator of success, together with test scores and personalised texts and emails from faculty staff to support the student in a personal way.

Whilst most of the emphasis on online interaction by students remains on pedagogical design in terms of online teaching and learning, there are related questions that focus on how technology and the online environment can contribute to our understanding of student behaviour. How students engage with the online spaces provided on a course can significantly add to institutional understanding of likely retention and could influence how an institution chooses to interact with its students. CitationKirkwood (2007: 372–382) provides a useful discussion of the issues, highlighting the importance of the wider social context of engagement with technologies as well as the rationale behind student choices online. In this project, student engagement or disengagement with their studies was often signalled by changing patterns of engagement with the online environment. The team found examples where students had been slow in accessing their forum at module start and then not submitted the first assignment, but also where tutor or institutional intervention then made an impact. There were also students who were highly active online early on, who then ceased to be so and consequently ran into difficulties. These are discussed later in this paper. It is possible that the patterns identified could be used to predict subsequent disengagement, and consistent early institutional intervention at this point could potentially reduce the ‘drop-out’ rate.

CitationSimpson (2012, particularly pp. 88–91) explicitly refers to ‘proactive motivational support’, which acknowledges the value of an early warning system and its potential to be cost-effective for both the institution and the students, given that it is cheaper to retain existing students than to attract new ones. This is particularly pertinent under new UK funding structures, as institution-led interventions will shape the flow of demand through student queries to the organisation, allowing more appropriate allocation of scarcer resources. CitationKeegan (2009: 80) refers to this issue in relation to studies at Thistle College:

… College has its own objectives and these can cause conflict for students ….

… The need to manage resources means that the institution will consider its needs before those of the student.

Use of tracking tools to manage proactive support will better handle this area of potential conflict as well as aid retention.

The aim of this paper is, therefore, to contribute to work in this particular area of student retention by presenting a retrospective study of activity patterns displayed by certain students online on one introductory OU Faculty of Business & Law (FBL) module, An Introduction to Business Studies. The study is an initial exploration of whether the use of technology to track student behaviour online can offer institutions an early warning system which produces more ‘live’ retention data than the historic reactive information gained when a student tells the institution they plan to withdraw, or passively disappears from their course. This would allow the institution to focus its intervention resources where most needed.

The project initially focused on identifying whether students who regularly post to forums were more likely to be retained and whether disengagement was evidenced by either a reduction in postings or erratic online activity; and, secondly, whether this information could be added to existing profile data, as suggested in other studies, to strengthen retention activity.

The OU has historically collected data from student profiling criteria using factors such as age, gender, previous successful study, etc. Where project funding has allowed (e.g. government-funded initiatives that target lower income areas), it has coupled these with measures such as tutorial attendance and assignment submission to map progression and plan institutional interventions for students. This raises the question of whether existing knowledge about student profiling can help retention in the online environment, and whether it may usefully be coupled with information on student online activity.

Methodology

One cohort of students on the introductory module of the OU FBL BA Business Studies programme was selected for study. This is a 30 credit point module, presented over six months, twice yearly, attracting approximately 7000 students per annum. The primary reason for selection was that the module is a compulsory element for the majority of Business Undergraduate awards, so retention is paramount to future student progression and success.

An Introduction to Business Studies (B120) achieves both high levels of student satisfaction and retention (73% for this sample), with an average pass rate of over 85% for those who complete the module. However, each 1% increase in retention represents 70 students, and so is worth significant funding to the University.

The data was drawn from the 2009–2010 intake of 3102 students. To pass the module students are required to complete four assignments on business understanding, human resources, finance and marketing, as well as a larger End of Module Assessment (EMA) based around a case study, drawing on material covered throughout the module to provide an integrated piece of work.

Two main types of online forum exist for students on this module, a small Tutor Group Forum (TGF) limited to around 20 local students and their personal tutor, and a module-wide student café. Students have to contribute to a number of online discussion activities in their TGF and then provide a summary of them as one part of every assignment. This part is worth 10% of the overall mark for each assignment, and is widely perceived to be an easy way to gain a few marks and build confidence at the introductory level, ahead of greater online collaborative work further on in the degree.

The study set out to explore whether there was any evidence that a systems approach could identify vulnerable students prior to withdrawal, passively or actively, based on their interactions (or not) with the virtual learning environment. Historically, the University has held face-to-face tutorials in the first week of the module which have been well attended; tutors have kept a record of attendees, and matched it to students who have contacted them directly in early weeks of the module, and in recent years to those who have been active on their forum. The OU then asks tutors to refer any students who have not made any contact at all to Student Services. However, this system is highly dependent upon individual tutor experience and record keeping, as well as being time consuming for tutors. Additionally, each tutor will only see a small segment of 20 students per presentation of the module. As the University makes plans to move to more online provision in the portfolio, there is an emerging wish to explore what the technology might enable, with several other studies of online retention pilots in the institution available (e.g. CitationSlade & Mullett 2010, also see CitationSimpson 2012, for a wider discussion).

The initial stages of the study looked at assignment scores, module results and personal data (age, gender, disability declaration), which were then mapped to student tutor group online postings (approximately 54,000) in order to identify any interesting trends that could then be subjected to further analysis. (Due to the optional nature of engagement there, the 13,000+ café postings were excluded from the study, although noted as potentially relevant to any future project.)

The six month module was divided into 3 × 2 month sections. Student postings in each of these sections were analysed and batched by student identity number and matched to their existing profile and assignment scores. Student engagement was categorised for each period as follows:

  • ‘Active’ – with number of postings recorded

  • ‘Lurking’ – reading others’ postings but not contributing to any discussion

  • ‘Nothing’ – students not present in the forum at all (Note: this group included prisoners who do not have access to a forum but complete an alternative activity)

Categories were chosen from the informal terminology used within the OU to describe student behaviour online, although the term ‘lurking’ to define students who read, but do not post to forums seems to be fairly widely understood in the distance learning community (see CitationPreece et al. 2004, for an early example).
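The batching and categorisation step described above can be sketched in a few lines of Python. This is an illustrative reconstruction only: the field names (student identity, period, post and read counts) are assumptions, not the project's actual data schema.

```python
# Sketch of the per-period engagement categorisation described above.
# Field names (student_id, period, posts, reads) are illustrative assumptions,
# not the actual OU data schema.
from collections import defaultdict

def categorise(posts: int, reads: int) -> str:
    """Label one student's engagement in one two-month period."""
    if posts > 0:
        return "Active"    # contributed postings to the forum
    if reads > 0:
        return "Lurking"   # read others' postings but did not contribute
    return "Nothing"       # never present in the forum at all

def categorise_cohort(records):
    """records: iterable of (student_id, period, posts, reads) tuples.
    Returns {student_id: {period: category}}, i.e. postings batched by
    student identity number as in the study."""
    out = defaultdict(dict)
    for student_id, period, posts, reads in records:
        out[student_id][period] = categorise(posts, reads)
    return dict(out)

records = [("S001", 1, 7, 30), ("S001", 2, 0, 12), ("S002", 1, 0, 0)]
print(categorise_cohort(records))
```

Each student then carries one of the three labels per period, which can be joined to profile and assignment data for the analyses that follow.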

Where relevant, to explore individual student scenarios, small random samples of data were also taken from student records of interactions with the University.

Results and discussion

The team found that in the majority of individual cases examined in detail, where students either withdrew or needed counselling from a member of the University staff, active tracking of online engagement would have offered the institution an earlier alert to the student’s problem, particularly when coupled with existing profile data.

Profiling and results

Whilst historically, many traditional universities have had their largest population of undergraduate students aged 18–25, recent changes in technology and growth in areas such as MOOCs may mean that wider demographic information as an indicator of student success becomes more important to other providers.

General profiling data kept by the OU tells us, for example, that female students aged 36–45 are statistically the most successful group, and young men aged under 25 are the least successful. We also know that those who come to us with some previous attempt at higher education study, or A levels, tend to be more successful than those without any prior qualifications, and that some subjects, such as Arts, retain students more successfully than, say, Accounting. CitationSimpson (2012: 103) cites an 83% probability of success for a middle-aged woman studying arts, compared with as little as 9% for an unemployed man under 25 with no previous qualifications studying maths.

The sample studied was broadly in line with institutional profiling expected for business studies, with nearly 79% of women aged 36–45 being successful on the module (58% of men of the same age gained a pass), compared with 50% of men aged under 25.

Younger students in general were less likely to gain a pass grade on the module. Students aged 18–25 accounted for 31% of the cohort but 28% of the passes (545), in contrast to 26–35 year olds, who accounted for 36% of the cohort and 37% of the passes (702), and 36–45 year olds, who were 23% of the cohort and 29% of the passes (556).

The two younger student groups were also the most likely to be in need of proactive institutional support. It is not necessarily that their problems and issues were more likely to hinder their studying, but rather that they were significantly less likely than other age groups to advise the institution (either the tutor or support staff) that they needed help.

An indication of this can be evidenced by looking at the group of ‘passive withdrawals’ from the module. These are categorised as the group of students who received a fail result and had no recorded interaction with the tutor or the University to ask for support, or where recorded institutional attempts to make contact failed. These accounted for 172 students in total.

It is difficult to draw any firm conclusions, as no detailed analysis of the individual cases was undertaken and we have no information on why these people passively withdrew, as they chose not to engage with the institution after making their decision not to continue. However, it does indicate that the two younger age groups were less likely to talk to the institution about issues with study and as such were more in need of the institution proactively working with them. Students aged 26–35, for example, accounted for 36% of the cohort but 40% of the passive withdrawals, whereas the 36–45 age group, which accounted for 23% of students, accounted for only 18% of passive withdrawals, suggesting that students become more confident in seeking assistance as they get older (see Table 1).

Table 1 Passive withdrawals by age range and gender.

The passive withdrawals by gender show that as men got older they became less likely than women to passively withdraw, given that they accounted for 46% of students in total on the module. However, women were more likely to withdraw than men overall, accounting for 175 (58%) of the 297 active withdrawals, where students did engage with the University prior to ending their studies. The greater number of resit grades sustained by men (49% of total ‘resit’ results obtained) suggests they were more likely to keep going in adversity.

Early findings, therefore, suggested that some students who might benefit from institutional proactive support could be identified: the two younger groups of students, particularly men, as well as other ‘pockets’ such as the statistically most successful group of women aged 36–45, who, although highly likely to succeed as a group, are less likely to ask for help when they need it. The project did not look at previous educational qualifications, but this dataset might also inform future studies.

Integrating online engagement with module results

Posting relevance during first period of the module

Previous retention studies in the OU have focused on the initial few weeks of the module as being critical, and tutors are already asked to report to the University any non-submission of the first assignment or students they have failed to make contact with.

Those who failed to connect to the online forum at all (even after institutional attempts at contact) numbered 156, after removing 23 cases where students were either prisoners or armed forces personnel stationed in places without internet access, such as Afghanistan. Every one of these 156 students failed to submit the first assignment, and all failed the module. This illustrates a significant portion of students (around 5% of the cohort) who could be identified by tracking tools prior to the submission date of the first assignment, and/or by a manual note from the tutor that the student had not engaged at all. Of these students, 35% (55) were under 25 years old and already identified as potentially less likely to succeed, although there were no real differences by gender in this group. It is unclear whether intervention by the University would have made a difference to retention with this group; however, even a 10% return of 15 students would represent a reasonable financial boost to the institution, and earlier information, rather than reactively waiting for a tutor to report a missing assignment, should mean a greater opportunity to retain the student.

A second group of students, who did not initially engage actively online, were identified as ‘lurkers’: they read messages by tutors and other students but did not contribute themselves. Of an initial 177 students who ‘lurked’ during the first period of the module (November and December), 28 subsequently passed the module (although none secured a distinction grade), 54 actively withdrew (largely through University/tutor-led prompting) and 84 passively withdrew and so failed the module (see Figure 1).

Figure 1 Percentages of student lurkers.

The University recorded engagement and attempted intervention with 48 of the passive withdrawal students, but the remaining 36 students might have been engaged with more extensive tracking. (There were 11 anomalous results, predominantly students who were in debt to the University and so did not receive any module score.)

Men were more likely to ‘lurk’ than women, and comprised 54% of those students in this category. Young students of both sexes were less likely to contribute, with 61 of the lurkers being under 25 (34%) also perhaps reflecting less confidence in participation from those in this age group.

It is difficult to generalise from the single sample studied whether students who ‘lurk’ are more likely to fail than actively posting students, as it depends on their motivation for non-contribution. Nonnecke (as cited in CitationTakahashi et al. 2007: 3) states that ‘lurking’ is a form of participation and the individual may usefully take information gained elsewhere; however, statistically, they are a significant enough group for institutional intervention, based on their results.

Posting relevance from engaged students

For those (c. 2800) students who did participate online in some way, as can be seen in Table 2, the highest number of postings came in the early stages of the module. Many of these were social in nature, such as ‘Hello, I’m Steve, I work as an account manager for …’, or asked induction-type questions: ‘Where’s the link to the assignment question again?’ During the latter parts of the module postings became more subject-orientated.

Table 2 Posting relevance – number of postings to Tutor Group Forum (TGF) by student per period, B120 November 2009.

The majority of students then appeared to take quite a mechanistic approach throughout the later stages of the module, with postings significantly dropping in period 3, once marks had been achieved and forum contribution was no longer linked to assessment. This correlates with findings elsewhere: ‘Usually after the assignments are handed in they stop asking questions anyway … becomes related to assessment … they put more effort in … if it’s knowledge acquirement, forget it’ (CitationFry & Love 2011: 53). It also suggests that looking at engagement through student postings may be intrinsically of less value to the institution once assessed contributions are complete, or for modules where there are no marks formally ascribed to online contributions.

The majority of tutor groups then only seemed to have a small number of students who wished to keep social interaction in the group going in the latter stages of the module, although many inactive students did continue to ‘lurk’ by reading forum postings during this period. Entwistle, McCune and Hounsell (2002) as quoted in CitationFry and Love (2011: 53) state that some students adopt a ‘smash and grab’ or surface approach; ‘where they simply login and download the information they require at a particular moment’ (CitationFry & Love 2011: 53) rather than spending time making virtual learning environment contributions.

For those students who were more actively engaged with the forum, there was typically a correlation with success in their studies: 75% of those gaining a pass grade made more than 10 posts to the forum throughout the module, rising to 86% for those who made over 30 posts (although it should be acknowledged that this may also be a reflection of the general effort put in by more conscientious students to their studies).

As all types of postings were given equal weighting in this initial project, there may be further benefit in codifying the posting types in any further studies to establish whether there are any firm links to drop-out. This could be done by asking certain types of questions: for example, whether students request forum help just prior to assignment submission, whether displacement activity such as posting social messages occurs immediately before an examination, or whether some posting types, such as the debates students were engaging in, were more useful to the organisation than others in terms of future module design.

Posting indicators further on through the module (period 2 onward)

For those students who withdrew further along the module (having completed at least one assignment) statistics were harder to categorise and more obviously aligned to individuals’ circumstances, although where students actively withdrew there was a clear disengagement from the forum often a few weeks prior to contact with the University.

Disengagement was often evidenced by significant reduction in TGF postings or erratic postings from formerly active students. As this was a retrospective study, decisions by students and advisers were made without this information, but its addition to existing profiles would help future retention via proactive intervention.
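The two disengagement signals described here (a marked reduction in TGF postings, or erratic posting from a formerly active student) lend themselves to a simple automated check over per-period posting counts. The sketch below is an assumption-laden illustration; the thresholds are invented for the example and were not part of the study.

```python
# Illustrative detector for the two disengagement signals described above:
# a sharp drop from a previously active posting level, or an erratic
# stop-start pattern. The thresholds (drop_ratio, min_active) are
# assumptions for illustration, not values from the study.
def flags_disengagement(counts, drop_ratio=0.3, min_active=5):
    """counts: chronological list of TGF postings per period for one student.
    Returns True if the pattern suggests disengagement."""
    for earlier, later in zip(counts, counts[1:]):
        # Marked reduction: a previously active student falls to a small
        # fraction of their earlier posting level.
        if earlier >= min_active and later <= earlier * drop_ratio:
            return True
    # Erratic pattern: activity that repeatedly stops and restarts.
    active = [c > 0 for c in counts]
    transitions = sum(a != b for a, b in zip(active, active[1:]))
    return transitions >= 2

print(flags_disengagement([7, 2, 0]))   # active in period 1, then sharp drop
print(flags_disengagement([3, 3, 2]))   # steady low-level activity
```

A flag from such a check would not itself trigger contact; as the surrounding text argues, it would be added to the student's existing profile to prioritise proactive intervention.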

It was also more difficult to quantify patterns for student withdrawal later in the module, given that, with passive withdrawals at least, it often cannot be said accurately when the student actually made the decision to withdraw. In fact, it can be argued that the student may never consciously confront the possibility of withdrawal; this is borne out by other examples from the dataset in which students who did ‘drop out’ (i.e. passively withdraw) continued to ‘lurk’, and occasionally contribute in their TGF, many weeks after their last missed TMA submission.

This ‘student drift’ (CitationDwyer et al. 1998: 10) suggests that in some cases, the institution will understand better than the student that, with key assignment milestones missed, the student has probably already failed the module without intervention. This builds a case for proactive targeting based on online tracking to catch some students who may still perceive that they are studying a module, but will need help if they are to succeed.

One example that illustrates this point is a student contacting student services at the very end of the module, the day before the final examinable component. The student had started off well with good scores and had drifted downwards in both marks and online engagement. That the student was still hoping to salvage her studies can be seen from her email requesting an extension to student services on 22 April 2010 (submission date 23 April 2010):

… the reason why I need extension for my ECA [End of Course Assessment] cut-off date is because I don’t have enough time to do my course work. I am busy with work, family … I know I’m not being fair with myself because I can’t manage my time well in order to meet the cut-off date … .

Active tracking of this student’s online behaviours, coupled with assignment scores and intervention by the University at an earlier point, may have produced a different result and a pass grade: this student was an active participant throughout period 1, with seven TGF postings, which then fell sharply in period 2 to two postings, corresponding with the drop in assignment performance.

Given that 242 students did not submit the second assignment after being both active online and submitting the first, investigation of patterns here could be significant. For those 39 who withdrew actively during the period, the usual type of reasons for withdrawal were recorded by the University (trouble with fees, too busy at work, family commitments, etc.), leaving 203 that withdrew passively during this stage of the module, either temporarily or permanently. These people were, in the main, identifiable as being active during the first period of the course, but their online postings dropped during the second period, or they became ‘lurkers’, or even ceased posting abruptly.

Of these individuals, 71 were recorded as having been in contact with the University during the period (largely prompted by the tutor), of whom 47 then went on to submit the following assignments and pass the module (albeit with a lower grade than would have been possible). It can then be suggested that there is a correlation between online behaviour, assignment non-submission and institutional actions, and that interventions made through online tracking may have offered an easier way for the University to reach these students than relying on the tutor (see Figure 2). (Statistically, only a few students dropped out by the third and fourth assignment; many of these had significant personal issues but were seen to be engaging with the University in the majority of cases.)

Figure 2 Correlation between institutional intervention and student progression.

Simpson (2012: 102) argues that relying on tutors alone has drawbacks in at least two areas. Firstly, some tutors may not actively monitor their students in this way; secondly, extra support arrives only after the student has missed an assignment, which may be too late to make a difference, particularly where time is an issue for the student. The early indicator of a reduction in online collaboration, monitored through technology and paired with other profiling information, is quicker and easier, and does not trouble the student unless there is good reason to do so. This enables scarce institutional resources to be placed more accurately where they are needed.

The change these tools could potentially engineer is significant: at present the institution does not (apart from in a few funded projects) proactively contact students to identify whether they have issues, but takes a reactive approach. Online tracking tools, coupled with existing profile data, offer a resource to target vulnerable students specifically and respond proactively, probably increasing retention, and certainly shaping the flow of support demand to the organisation. Anderson (2006), as cited in Simpson (2012: 88), promotes the proactive nature of successful retention through tools, commenting 'Student self referral does not work as a mode of promoting persistence. Students who need services the most refer themselves the least'. A system set up to analyse student behaviour at key module milestones (e.g. early forum activity, submission or non-submission of assignments), coupled with flags for vulnerable groups and markers indicating whether the student has otherwise contacted the University, could therefore be quite powerful in supporting retention practices.
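Such a milestone analysis could be sketched as a scoring rule that combines forum activity, assignment submission, the existing demographic 'at risk' flag and contact status. The field names, weights and function below are purely illustrative assumptions, not the University's implementation:

```python
from dataclasses import dataclass

@dataclass
class StudentSnapshot:
    """State of one student at a module milestone (illustrative fields)."""
    forum_posts_recent: int      # postings since the last milestone
    forum_posts_previous: int    # postings in the preceding period
    submitted_assignment: bool   # did the latest assignment arrive?
    at_risk_profile: bool        # existing demographic 'at risk' flag
    contacted_university: bool   # any recent contact with the University

def support_priority(s: StudentSnapshot) -> int:
    """Score a student for proactive contact; higher means contact sooner.

    Weights are chosen purely for illustration.
    """
    score = 0
    if not s.submitted_assignment:
        score += 3                   # a missed hand-in is the clearest signal
    if s.forum_posts_previous > 0 and s.forum_posts_recent == 0:
        score += 2                   # previously active, now silent
    elif s.forum_posts_recent < s.forum_posts_previous // 2:
        score += 1                   # marked decline in activity
    if s.at_risk_profile:
        score += 1                   # existing demographic flag
    if s.contacted_university:
        score -= 1                   # already in dialogue; less urgent
    return score
```

Sorting a cohort by this score at each milestone would yield exactly the kind of prioritised contact list the text envisages, with students already in dialogue with the University ranked below those who have fallen silent.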

Anomalies to patterns of student behaviour

As discussed briefly above, a number of student groups cannot be effectively studied through online engagement. Prisoners, who do not have internet access, complete an alternative assessment. Armed forces students are another such set: they often have sporadic or no internet access on duty tours but can be highly active when on leave, so their interaction patterns are of little value to the institution. There were around 80 students in this category in the cohort studied.

However, students studying two modules simultaneously comprised a large group (406, or just over 13% of the cohort) whose online interactions were erratic and difficult to analyse compared with those on the 'one module at a time' pattern, as these students regularly diverted their attention between the two modules. Perraton (2009: 275), in her book review of Bramble and Panda (2008), refers to the 'fuzzy dividing line' between a student registered as full time with a job and someone with a job who is a part-time student. For those studying two modules (or potentially more) simultaneously, a third category may become appropriate, in which the student is potentially both a full-time employee and a full-time student (in terms of the time that must be allocated to studying). The University has historically considered these students 'at risk' and attempts to dissuade them from such a course of action.

In terms of progression, based on existing institutional assumptions, it was expected that these students would generally achieve poorer results than those who had studied B120 alone, indicating that erratic behaviour could be a good predictor of interventions being required through tracking. Surprisingly, however, the investigation showed that the overall results were comparable, with little difference between pass and fail rates: 56% of students studying only B120 gained a pass first time around (leaving out resit and deferral students), compared with 57% of those studying another module simultaneously. Higher pass grades, however, were much rarer among those studying more than one module, so overall success could still be affected: over 5% of students studying B120 alone gained a distinction in this cohort, against only 3% of those taking another module simultaneously. Fail (with no resit) rates were comparable, at 23% for each group.

Contrary to institutional expectations, the results provide an early indicator to suggest that studying two modules simultaneously does not necessarily impact on success and progression, but this would bear further analysis with other cohorts. The study did not consider the limited number of students taking more than two modules simultaneously, although they would also be worth including in any future study.

Conclusions

The coupling of online retention tracking tools with student profiling provides factual data that can highlight risk factors for student success, and offers HEIs a potentially rich and developing picture of student engagement. The benefit is that this information can be presented in real time and act as an 'early warning or monitoring system', providing information ahead of possible retention and completion issues and thus buying the institution more time to engage with a student before withdrawal.

These tools could be of particular use in supporting students on online-only modules. Online study may be particularly attractive to some groups of students because it offers the potential for greater flexibility: Yorke et al. (2008: 2) report that 'the strongly predominant reason given for opting for part-time study over full-time study was that it allowed study to be undertaken alongside other commitments'. This is also true of online study, as it removes the need to travel and be in attendance at a specific time. The need for students to fit study around other commitments, however, can make online retention a greater challenge for the hosting institution, and it is here that non-invasive tools such as online tracking may provide the strongest indicators of retention risk.

The results of the various pilots are being used to develop organisational practice during the current splintering of UK fee structures, and provide a reference point for subsequent institutional research on how changing fees and further growth to online offerings may alter student behaviour.

As a result of internal studies, OU funding has now been allocated to the next stage of tracking student behaviour online, with the implementation of proactive student support through investment in tracking software. Investment in adapting existing software in the light of these early studies should substantially speed up future scholarship in this area, given that many of the counts of student contributions were made semi-manually. Automated email and text interventions are being designed to contact students whose tracking profile indicates they need support. The team hope to study patterns of student engagement with these interventions, and their potential effects on retention and continued study.

Online delivery is expected to increase year on year as HEIs compete for global markets and endeavour to make cost savings against greater competition from private providers. These factors will, inevitably, increase the importance of student retention and graduation rates in a competitive marketplace and providers will be anxious to seek ‘tools’ to support their endeavours in retaining students.

Acknowledgements

With thanks to Erica Youngman and Helen Bacon, Sheila Cameron and Jackie Price, Faculty of Business and Law, Open University.

