
E‐learning in higher education: some key aspects and their relationship to approaches to study

Pages 303-318 | Received 13 Sep 2007, Accepted 07 Nov 2008, Published online: 20 Apr 2009

Abstract

While there has been systematic and ongoing research into e-learning in universities for over two decades, there has been comparatively little evidence-based research into how key aspects of e-learning are internally constituted from a student perspective and how these aspects might be related to university students' learning experiences. The purpose of this paper is to explore key aspects of e-learning that might be related to university student approaches to study, so that a better understanding of the internal structure of these aspects is achieved. Student responses to surveys are analysed at the level of each item to identify which items made the most sense to over 200 third-year economics students. Data were also analysed both at the variable level, to identify which items coalesce to determine the structure of e-learning variables, and at the student level, to see if there were groups of students in the sample who shared similar experiences of e-learning when it was used to support a predominantly campus-based learning experience. The results suggest several implications for improving particular aspects of the student experience of e-learning when it is used to support a campus-based experience.

Introduction

E-learning is being introduced as a fundamental part of the student learning experience in higher education. It is no longer core business only for those universities with a mission for distance education; its affordances are being systematically integrated into the student learning experience by predominantly campus-based universities. Evidence of this widespread uptake can be seen in reputable research journals and on the websites of national bodies responsible for leading learning and teaching in higher education, such as the Higher Education Academy in the UK, Educause in the USA and the Australian Learning and Teaching Council in Australia.

While we can recognize sustained research interest in e-learning and the student experience in higher education over the last two decades (Goodyear, 1984, 1991; Goodyear, Jones, Asensio, Hodgson, & Steeples, 2005; Laurillard, 1993, 2002; Salmon, 2002a, 2004), more focused explorations into how key aspects of e-learning are associated with students' face-to-face experience of learning are relatively sparse. There is comparatively little research into how online and face-to-face contexts together play a relational role in helping students achieve their learning outcomes. A growing use of e-learning to support face-to-face experiences presupposes an understanding of what the key aspects of e-learning are, how they are internally constituted and externally associated with each other, and how they are related to key aspects of the face-to-face experience. Without these fundamental understandings, the quality of a student experience of learning that comprises online and face-to-face experiences is likely to be put at risk. More evidence-based research is needed to inform the ways we think about creating and designing such experiences so that the quality of learning is enhanced.

This study investigates how e-learning is used to support the face-to-face experience of third-year business students at a university. In this paper, e-learning is defined as information and communication technologies used to help students improve their learning (Higher Education Funding Council for England, 2005). Students studying Government foreign and defence policy in their business degree experienced tutorials and lectures as part of their weekly schedule. They were also expected to lead and run a tutorial by engaging in pre- and post-class online contributions, in the form of discussions and related submissions, to structure and inform the debate. This study explores how the students perceived key aspects of the online experience, such as the design of their course website and the submissions made by others and by themselves. The study also investigated how these perceptions are related to their approaches to study, measured by the revised Study Process Questionnaire (R-SPQ; Biggs, Kember, & Leung, 2001), and to their academic achievement. The term 'explore' is used intentionally: while we have some idea of the broad focus of these aspects from previous research, we seek to understand how they are internally structured and which parts might coalesce within the online environment and with study approaches. The research questions of the study are:

  • What are some of the key aspects of the student experience of e‐learning when it supports a face‐to‐face experience?

  • How are the parts constituted?

  • How do these parts relate to student approaches to study?

  • Is there any relationship between variations in the student experience of e‐learning, approaches to study and their achievement?

Theory of learning and prior research

Seminal research into student learning in higher education over the last few decades has focused on key aspects of the student experience of learning, as shown in Figure 1.

Figure 1. Presage-Process-Product Model of Student Learning (adapted from Prosser & Trigwell, 1999).


Research into the student experience of learning in higher education has focused on: student characteristics, such as the conceptions of learning with which they enter courses; course context, such as teaching methods; learning context, such as student perceptions of the quality of teaching and quantity of work; student approaches to learning, what they do and why they approach learning in particular ways; and the quality of their learning outcomes (Prosser & Trigwell, 1999; Ramsden, 2002). This research has shown that variation in the way students approach their learning is related to how they perceive their context, what they think they are learning and the quality of their learning outcomes. This study adds to this research by considering associations between student approaches to learning and their experience of e-learning. In particular, the focus is on four aspects: (1) interactivity, (2) approaches to e-moderating, (3) issues related to course design, and (4) workload awareness.

Interactivity

Definitions of interactivity in the higher education literature include those that refer to action and activity amongst people and those that refer to action and/or activity between a student and a computer program. Barretto, Piazzalunga, Ribeiro, Dalla and Filho (2003) provide a general definition of the term 'interactivity' as 'activity and/or action between individuals and/or machines' (p. 272). Educational software, for example, is considered interactive; in this case interactivity occurs between the computer program and the user. More specifically, it is possible to describe four types of interaction: learner-content, learner-instructor, learner-learner and learner-interface. Interactivity for the purposes of learning is recognized as one of the key ways to capture the affordances of e-learning to increase the learner's knowledge (Laurillard, 2002; Sabry & Baldwin, 2003). To do this purposefully, the use of the interaction needs to be closely tied to the objectives of the course (Martens, Valcke, & Portier, 1997). It has also been suggested that interaction needs to focus on student control of the learning, so as to encourage active engagement that is meaningful (Sims, 1997), and that in-depth case studies may be required to understand how technologies support the development of interactivity and comprehension (Godwin, Thorpe, & Richardson, 2008). In this study, we refer to interaction as activity and action amongst the students in the learning process, i.e. student-student interaction. Interaction between student and teacher is treated as part of the concept of e-teaching and moderating.

Approaches to e‐moderating

A key aspect of teaching related to e-learning is e-moderating. This can be broadly described as a process of facilitating the development of small and large groups through conferencing (Brace-Govan, 2003; Salmon, 2002b). Key teaching strategies in e-moderating include summarizing and weaving together the ideas of students who have contributed to a topic of discussion. The role of the e-moderator is particularly important in helping learners to reflect on the issues under discussion. E-moderation of online discussions has been shown to be beneficial for promoting group work and collaborative learning, and it is considered an effective method of achieving structured group interaction (Dewiyanti, Brand-Gruwel, & Jochems, 2005). In this study, the concept of e-moderation is broadened to include online communication between student and teacher that is not necessarily part of a structured discussion in a small group. It includes online feedback from the teacher about class activities or submitted written work, as well as online communication to keep students informed about matters relevant to their learning. In this study, we refer to all these activities as e-teaching.

Issues related to course design

The incorporation of e-learning into the student experience of learning typically requires consideration of the interaction between students, teachers and the technology within a design framework. Often, cohesion of the design elements is found through alignment to intended learning outcomes (Biggs, 2005). Some studies have highlighted the importance of using an outcomes-based approach to course design, so that learners' outcomes, rather than the provision of content, are one of the key rationales for design decisions (Littlejohn, 2002; O'Toole & Absalom, 2003). Some research has itemized aspects of design from an instructional perspective, such as the use of the technology, instructional objectives, testing, multimedia materials and the learning activities that arise through combinations of these aspects (Hashim, 1999). There has been significant research into approaches to learning that are moderated by design approaches, such as problem-based learning; this approach has been particularly influential in international medical education and is readily adapted for online learning (Oliver & Omari, 1999; Pearson, 2006). The concept of design in this study focuses on associations between the face-to-face and online contexts of the students' learning experience and on how the design of the online materials and activities helps the students to learn and understand the whole experience.

Workload awareness

Students' perceptions of workload have been found to be related to the quality of their learning experiences. The research is often quantitative and seeks to identify associations between different aspects of the student experience. Research into the relationships between learning approaches, study motivation, hours of study and perceived workload suggests that workload and motivation are significantly associated with the quality of the experience (Kember, Ng, Tse, Wong, & Pomfret, 1996). Other research has used self-report inventories and closed-ended questionnaires to investigate associations between learning and workload. For example, Richardson (2003) administered the Approaches to Study Inventory and the Course Experience Questionnaire to students enrolled in an online course and found that students' final results were most strongly associated with perceptions of appropriate workload. In the present study, workload is conceptualised as the amount of online work additional to the work the students are expected to do in class.

The above areas of research are a way of unpacking key aspects of e-learning. This study explores how these issues are internally structured from a student perspective, and how they are related to the students' whole experience of learning, both at the variable and student levels. To investigate key aspects of the whole student experience of learning, the revised Study Process Questionnaire (R-SPQ; Biggs et al., 2001) is used. We report results from a new questionnaire, labelled 'Questionnaire 1', which draws on some of the above research in the development of its items. This questionnaire is designed to explore how e-learning helps students learn in a context which also involves significant face-to-face learning. Analyses are then performed to investigate associations between the experiences tapped by the two questionnaires.

Learning context

Third‐year students studying Government as part of their undergraduate business degree were supported with significant online resources designed to integrate with, and extend, their face‐to‐face learning.

The course aimed to help students understand the factors leading to the direction taken and decisions made in Australian Government foreign and defence policy over the last one hundred years. Particular attention was paid to links between foreign policy-making and international relations. To engage with these aims in a meaningful way, students were expected not only to attend lectures and tutorials on the key issues, but also to lead discussions on one of the issues, largely through the online environment. A typical approach adopted by students was to contribute to both pre- and post-class discussions. They first researched the topic, then posted key issues and summaries on the online discussion board before the class, fielded any queries leading up to the face-to-face tutorial and, finally, posted key questions for debate online after all the students had had a chance to engage in the debate in the tutorial. Their online role post-tutorial required them to moderate the debate, for which they had earlier received guidelines and training. Their leadership of a topic online was worth 20% of their final mark and their contributions to the postings on other students' topics were worth another 20%, making the online part of their course experience worth 40% in total. To post and moderate the online discussions, students used Blackboard 6.1.

Methods

We believe that relatively little is known about the internal structure of the online experience and how its key aspects relate to the face-to-face experience. Consequently, a semi-exploratory approach was adopted (Goodyear et al., 2005). It is semi-exploratory in the sense that the content of the e-learning items in the new questionnaire has been informed by prior research and, while we have some idea about how these items might coalesce, we seek to explore the students' experience of e-learning as it links to, integrates with and extends their face-to-face experience. In analysing the data we strive to maintain a balance between pre-conceived and emerging patterns (Prosser & Trigwell, 1999). In other words, our aim is to use our understanding and experience of the phenomenon being researched to inform the way we interrogate the data, and to let patterns within the data reveal new insights that improve the types of research questions being asked.

Three types of statistical analyses are used. Factor analyses are used to look at the structural relationships amongst the items of the questionnaire, as there were expectations that a smaller number of underlying constructs may explain students’ responses to individual items. Pearson correlation coefficients are used to investigate the strength of the relationships between pairs of constructs that were identified through grouping the items. The cluster analyses are at the level of the student. These look for subgroups identified on the basis of similarities of the variables being investigated.

Data

Data gathered included student responses to Questionnaire 1, a twenty-item, five-point Likert scale questionnaire investigating the student experience of e-learning when it is designed to support a face-to-face experience; student responses to the R-SPQ (Biggs et al., 2001); an 'Overall Satisfaction with Quality' item, to check on the validity of scale scores used in some of the analyses; and the students' achievement, measured by the mark they received for their online participation.

Results

The results are offered in four parts. Firstly, student perceptions of e-learning within their course experience are considered using Questionnaire 1. Descriptive and frequency analyses of the items that offer the most interesting differences in students' ratings are discussed. We then examine how the items relate to each other at a variable level using factor analysis. Based on these results, scale scores were constructed. To investigate how key aspects of the e-learning experience identified by the factor analysis are associated with student approaches to study, we calculated deep-approach and surface-approach scale scores for the R-SPQ, and we offer correlation analyses of the scores from the scales in Questionnaire 1, the R-SPQ scales, the 'overall satisfaction with quality' item and student achievement. Finally, to investigate if there were any similar experiences of learning at the level of the student across the sample, a cluster analysis was used to look for groupings of students across the variables.
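The deep-approach and surface-approach scale scores referred to here can be computed directly from the questionnaire responses. Below is a minimal sketch; the DataFrame `rspq` and its column names are hypothetical stand-ins for the study's data, and the deep/surface item split follows our reading of the published R-SPQ-2F scoring key (Biggs et al., 2001), which should be checked against the instrument itself.

```python
# Sketch: deep- and surface-approach scale scores from the R-SPQ-2F.
# Assumes a hypothetical DataFrame `rspq` with columns rspq1 ... rspq20
# holding each student's 1-5 ratings (one row per student).
import pandas as pd

DEEP = [1, 2, 5, 6, 9, 10, 13, 14, 17, 18]      # deep motive + strategy items
SURFACE = [3, 4, 7, 8, 11, 12, 15, 16, 19, 20]  # surface motive + strategy items

def scale_scores(rspq: pd.DataFrame) -> pd.DataFrame:
    """Sum item ratings into a deep and a surface score per student."""
    return pd.DataFrame({
        "deep": rspq[[f"rspq{i}" for i in DEEP]].sum(axis=1),
        "surface": rspq[[f"rspq{i}" for i in SURFACE]].sum(axis=1),
    })
```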

Student perceptions of e‐learning

The main source of data on how the students experienced e-learning is Questionnaire 1. Student ratings ranged from 1 (strongly disagree) to 5 (strongly agree). In explaining the analysis, observations are offered about the student responses to the individual items. These are categorized according to similarities amongst the foci of the items, with the frequency of student responses to the 5-point Likert scale summarized in three groupings as percentages: agree, disagree and neutral. We then present an analysis of the students' responses to the items around groupings or themes using a principal components factor analysis. The methodology for this section is adapted from Goodyear et al. (2005) and is particularly useful for exploring which items have the most face-validity to students at a discrete level and how patterning in the data then coalesces at the variable level.
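The three-way grouping of Likert responses just described can be expressed compactly. The sketch below is illustrative only; the DataFrame `q1` and its item columns are hypothetical stand-ins for the Questionnaire 1 data.

```python
# Sketch: collapse 5-point Likert responses into disagree/neutral/agree
# percentages per item, as in the frequency analyses reported here.
# Rows are students; columns are items coded 1 (strongly disagree)
# to 5 (strongly agree).
import pandas as pd

def likert_summary(responses: pd.DataFrame) -> pd.DataFrame:
    """Return % disagree (1-2), neutral (3) and agree (4-5) per item."""
    bins = {
        "disagree": responses.isin([1, 2]),
        "neutral": responses.eq(3),
        "agree": responses.isin([4, 5]),
    }
    summary = pd.DataFrame({k: v.mean() * 100 for k, v in bins.items()})
    summary["mean"] = responses.mean()  # item means, as reported in the tables
    return summary.round(1)

# Toy usage with made-up ratings for two items:
q1 = pd.DataFrame({"item12": [5, 4, 4, 3, 5], "item2": [2, 3, 4, 2, 3]})
print(likert_summary(q1))
```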

Items 12, 13 and 2 in Table 1 identify student perceptions of the submissions made by themselves and by others in the online part of their learning experience.

Table 1. Items focusing on submissions made by students.

Items 12 and 13 attracted two of the highest agreement response frequencies from students, as well as the highest means. The affordance of e-learning to facilitate the sharing of alternative perspectives on key issues seems to be captured here. While self-submissions also seemed to be an important perception, the corresponding item, item 2, did not attract the same level of agreement amongst the students.

Some items, given in Table 2, focused on the design of the online materials that students were using to help them achieve their learning outcomes.

Table 2. Items related to design issues.

There was a mixed response by students to the items related to the design of their course websites. Student responses to items 11 and 13 suggest that the design of online activities is a relatively important issue for their learning. Student perceptions of the website design as a whole, and of the way online materials explain things, seem to be less important to students according to their responses to items 19 and 4.

Some items focused on workload, informed by previous research (Ramsden, 1991). These are presented in Table 3.

Table 3. Items related to student workload.

The volume of workload was rated as a significant issue by students, according to their responses to items 6 and 16. This may capture student concern about significant e-learning activities being added to existing face-to-face activities without allowance for the additional time involved.

Table 4 includes some items focused on how the teacher engaged with students online.

Table 4. Items related to online responses made by the teachers.

Student perceptions of teacher responses, as indicated by the mean scores and Likert scale responses, seemed to vary. Responses to item 9 suggested that the majority of the students felt that the teacher’s online responses were helpful. Responses to item 1 suggest that the teacher’s online responses did not necessarily motivate the majority to engage in deeper learning.

Some items, focused on the students' experience of the link between the online activities and the face-to-face context, are presented in Table 5.

Table 5. Items relating online resources with face‐to‐face context.

Student responses to items 17 and 18 about how the online activities may have helped learning in the face‐to‐face context did not attract widespread agreement. A significant percentage of students were unsure about the relationships between the online activities and the face‐to‐face context. The design of the materials may have contributed to this. Attention to the design, especially how the activities are supposed to link learning across the online and face‐to‐face contexts, may be a way to remove the students’ uncertainty.

To investigate the patterns in the student data in greater detail (i.e. to see if there were any coherent groupings of items), factor analyses including all items were conducted. Using principal components estimation with varimax rotation to simple structure, and applying the Kaiser-Guttman rule (Kaiser & Caffrey, 1965), which retains as many factors as there are eigenvalues greater than 1, we identified a subset of seventeen items that provides a clear factor structure across four factors when low loadings of less than 0.4 are omitted. (Before the analysis, negatively worded items were reversed to ensure consistent interpretation of the data.) Table 6 shows the loadings of the four independent factors, which suggest the existence of four coherent subsets of variables.

Table 6. Exploratory factor analysis structure for Questionnaire 1 (student experience of e‐learning supporting a face‐to‐face experience).

Factor 1 in Table 6 shows five coherent items underlying aspects related to the way the teacher taught using the course website. From a student perspective, the way the teacher supported online discussions by guiding and focusing them (items 14 and 3), the way the teacher responded and interacted with students online (items 1 and 5) and the way the teacher provided feedback online to students (item 9) were items that were coherent at the level of a variable. For the remainder of this study, this variable will be referred to as student perceptions of e-teaching.

Factor 2 shows five items that loaded onto one factor focusing on student perceptions of online resources. From a student perspective, items reflecting the design of activities and the website (items 7, 11 and 19) and the links between design and the face‐to‐face experience (items 17 and 18) are coherent at the level of the variable. For the remainder of this study, this variable will be referred to as student perceptions of online design.

Factor 3 shows three items that loaded onto one factor (items 6, 16 and 8). These items capture workload issues related to e‐learning. As negative items were reversed to ensure consistency of analysis, this variable is referred to as student perceptions of appropriate workload online.

Factor 4 shows four items (12, 10, 13 and 2) that loaded onto one factor reflecting the ways in which submissions made by students were perceived as helpful for learning. In this study, this variable is referred to as student perceptions of online interactivity.

The factor analysis suggests that at the level of variables, three items were not part of the structure of any of the factors identified: item 4, ‘The online learning materials in this course are extremely good at explaining things’; item 20, ‘The online learning materials in this course are designed to really try to make topics interesting to students’; and item 15, ‘The workload online didn’t stop me from developing my understanding of the key issues of this course’.
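For readers wishing to replicate this kind of analysis, the following sketch reproduces the procedure described above: reverse-coding of negatively worded items, the Kaiser-Guttman rule, principal components extraction with varimax rotation, and suppression of loadings below 0.4. The paper does not name its software; the third-party Python package factor_analyzer is assumed here, and the DataFrame `q1` and the list of reverse-coded items are placeholders.

```python
# Sketch of the exploratory factor analysis described above.
# `q1` holds one column per Likert item and one row per student;
# `reverse_items` lists the hypothetical negatively worded items.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package (assumed)

def efa(q1: pd.DataFrame, reverse_items: list[str]) -> pd.DataFrame:
    data = q1.copy()
    data[reverse_items] = 6 - data[reverse_items]  # reverse 1-5 scoring

    # Kaiser-Guttman rule: retain factors with eigenvalue > 1.
    eigvals = np.linalg.eigvalsh(data.corr().to_numpy())
    n_factors = int((eigvals > 1).sum())

    # Principal components with varimax rotation to simple structure.
    fa = FactorAnalyzer(n_factors=n_factors, method="principal",
                        rotation="varimax")
    fa.fit(data)
    loadings = pd.DataFrame(
        fa.loadings_,
        index=data.columns,
        columns=[f"F{i + 1}" for i in range(n_factors)],
    )
    return loadings.where(loadings.abs() >= 0.4)  # omit low loadings
```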

Associations amongst student perceptions of e‐learning and the quality of the course, student approaches to study and achievement

Next, we investigated patterns of student perceptions of e-learning by analysing the scores of the subscales reflected by the four factors (e-teaching, design, workload and interactivity), as well as the scores of subscales for student approaches to study (deep and surface) based on the student responses to the R-SPQ. Table 7 shows correlations amongst these variables and two outcome indicators: the item measuring student satisfaction with the quality of the online materials and activities, and the student online grade.

Table 7. Correlations between student perceptions of e‐learning in their face‐to‐face experience, perception of overall quality, R‐SPQ scores and online grade.

Table 7 also shows the reliability of the scales using Cronbach's (1951) estimate of internal consistency. All scales included in the analysis showed acceptable levels of internal consistency (Schmitt, 1996). Pearson product-moment correlations were then calculated between these scale scores, students' overall ratings of the quality of the online learning materials and activities, and the mark awarded for the online component of the course.

Inspection of the correlation matrix indicated several results of note. Firstly, using a Type I error rate (α) of 0.05 for statistical significance, students' responses on each of the proposed scales correlated with ratings of the overall quality of the online materials and activities. These correlations ranged from .71 for Good Design to .46 for Appropriate Workload, indicating moderate to strong levels of correlation. As is standard practice with the Course Experience Questionnaire (e.g. Lawless & Richardson, 2002; Wilson, Lizzio, & Ramsden, 1997), the overall rating of the quality of the online materials and activities was included as a check on the validity of the identified scales. The results of the current study indicate that the constructs underlying the proposed scales were perceived by students as important facets of e-learning supporting a face-to-face experience.

Secondly, two of the four e‐learning experience scales were significantly correlated with a deep approach to study. There was a positive correlation between ratings of Good e‐teaching and self‐reports of a deep approach, r = .22; and between ratings of Student interactivity and self‐reports of a deep approach, r = .22. There was also a statistically significant correlation between students’ overall ratings of the quality of the online materials and activities and a deep approach, r = .19. Finally, students who tended to report a surface approach to learning in the course were more likely to receive a lower grade for the online component of the unit, r = −.29.
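The reliability and correlation analyses reported in Table 7 follow standard formulas. A minimal sketch is given below, assuming a hypothetical DataFrame `scores` whose columns hold the scale scores and outcome measures; column names are illustrative only.

```python
# Sketch: Cronbach's alpha for a scale's items, and Pearson correlations
# between scale scores and outcomes tested at alpha = 0.05, mirroring
# the analyses summarised in Table 7.
import pandas as pd
from scipy.stats import pearsonr

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's (1951) alpha: k/(k-1) * (1 - sum of item variances /
    variance of the total score), computed over a scale's item columns."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def correlate(scores: pd.DataFrame, x: str, y: str) -> str:
    """Pearson r between two columns, starred if p < .05."""
    r, p = pearsonr(scores[x], scores[y])
    flag = "*" if p < 0.05 else ""  # Type I error rate of 0.05
    return f"r({x}, {y}) = {r:.2f}{flag}"

# Hypothetical usage:
# correlate(scores, "deep_approach", "online_grade")
```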

The final analysis undertaken was a cluster analysis. Cluster analyses were conducted to look for distributions of qualitatively different experiences in the student population. The methodology used for this analysis is based on related studies by Seifert (1995), Prosser, Ramsden, Trigwell and Martin (2003) and Ellis and Calvo (2006). We used a hierarchical cluster analysis on standardised variable scores for this purpose, followed by a between-groups contrast analysis on individual variables. Using Ward's minimum variance method, we found that a two-cluster solution was the most parsimonious explanation of the relations between variables and group membership. The means and standard deviations of the two groups on the variables are given in Table 8.

Table 8. Summary statistics for student perceptions of e‐learning, approaches to study and achievement.

We interpret the cluster analysis results as follows. The first cluster of students may be characterised as a group that, on average, gave relatively low ratings to the items on e-teaching, design of resources, student interaction and appropriate workload, and tended to score higher on items reflecting a surface approach to learning. They also received relatively low grades for the online component of the course. The second cluster, in comparison, gave relatively high ratings to items reflecting e-teaching, design of resources, student interaction and workload, and had relatively low scores on the items reflecting a surface approach to learning. They also received relatively high grades for the online component of the course. The difference between the clusters on the deep approach variable, while in the expected direction, was not statistically significant.

The cluster analysis results indicate that students who have different perceptions of the e‐learning environment take different approaches to study and achieve at different levels on an assessment task associated with online learning.
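The clustering procedure described above can be sketched as follows. The scipy implementation of Ward's method is used here as a stand-in for whatever software the authors employed, and the DataFrame `scores` (one row per student, one column per variable) is hypothetical.

```python
# Sketch: hierarchical agglomerative clustering with Ward's minimum
# variance method on standardised (z-scored) variable scores, cut at
# two clusters, as in the analysis described above.
import pandas as pd
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import zscore

def two_cluster_solution(scores: pd.DataFrame) -> pd.Series:
    """Assign each student to one of two clusters."""
    z = scores.apply(zscore)                     # standardise each variable
    tree = linkage(z.to_numpy(), method="ward")  # Ward's minimum variance
    labels = fcluster(tree, t=2, criterion="maxclust")
    return pd.Series(labels, index=scores.index, name="cluster")

# Cluster profiles (per-cluster means, as summarised in Table 8):
# scores.groupby(two_cluster_solution(scores)).mean()
```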

Discussion

This study began by recognizing that the experience of campus-based students is being systematically supported through the use of e-learning materials and resources. It was noted that, for campus-based experiences supported by e-learning, we have little research-based evidence about how key aspects of e-learning might be constituted and how these relate to key aspects of the whole student experience (Bliuc, Goodyear, & Ellis, 2007; Sharpe & Benfield, 2005; Sharpe, Benfield, Roberts, & Francis, 2006). A semi-exploratory approach was adopted, as previous research had suggested that aspects such as e-moderating, online communication, the design of materials and websites, workload issues and submissions made by students online might be some of the areas worth investigating (Prosser & Trigwell, 1999; Ramsden, 1991).

The frequency analyses gave the researchers a feel for levels of agreement amongst the student population as to which items were perceived to be the most relevant. In terms of further exploration in this area, some items that did not load are almost as important as those that did. For example, item 15, 'The workload online didn't stop me from developing my understanding of the key issues in this course', did not load significantly onto the workload subscale, but item 8, 'I generally had enough time to understand the things I had to learn online', did. To understand which items have the most meaning for students, and how these might be related to their subscales and to other aspects of the experience, more work on the constituent structure of all the subscales is needed.

The identification of the four underlying factors described as e-teaching, design, workload and interactivity is an important contribution to research into the most meaningful aspects of e-learning when it is used to support students in a predominantly face-to-face experience. We do not suggest that these are the only possible aspects, nor that they are necessarily pitched at a level of abstraction that will not change. For example, e-teaching as a subscale in the factor analysis is made up of items addressing online discussions; in other studies it may include other things. The main outcome of this study is that we identified four independent factors accounting for different aspects of the learning experience. The ways they related to each other and to the outcome variables (satisfaction and mark) provide an improved understanding of how e-learning is related to other key aspects of the student experience.

Specifically, at the variable level, significant correlations were identified amongst the e-learning, approach and outcome variables. Significant strong positive correlations were found between all the e-learning variables and students' perceptions of the quality of the e-learning experience. Significant, though more modest, positive correlations were found between a deep approach to study, some of the e-learning variables, perceptions of the quality of e-learning and achievement. We interpret these results as evidence of the importance of careful structuring and design of e-learning activities and resources, especially in relation to the broader student experience.

At the level of groups of students, significant differences were found amongst students in terms of their perceptions, approaches to study and achievement. The differences were consistent with the results suggested by the analyses at the variable level. It is perhaps these results that have the clearest implications for practice. The analyses suggest that students who had negative perceptions of the quality of teaching, design, interactivity and workload tended to approach their studies in the course in comparatively poor ways and tended to achieve relatively poorly online. These associations suggest that if we wish to improve the quality of the student experience online, it is essential to address student perceptions of what the e-learning experience involves and how it can be useful for learning. For example, about a third of the students (cluster 1, n = 43) did not perceive the value of the submissions made by other students, of the interaction with the teacher online, or of the learning process facilitated by online activities. This suggests that some awareness-raising about the nature and purpose of submissions and online feedback would be a useful teaching strategy if we wish to improve the quality of e-learning. We cannot assume that the mere existence of e-learning activities and materials supporting a face-to-face experience of learning will improve the quality of the experience. How students perceive and use the activities and materials represents one of the keys to unlocking the full value of e-learning in the student learning experience at university.

References

  • Barretto, S.F.A., Piazzalunga, R., Ribeiro, V.G., Dalla, M.B.C., & Filho, R.M.L. (2003). Combining interactivity and improved layout while creating educational software for the Web. Computers and Education, 40, 271–284.
  • Biggs, J.B. (2005). Aligning teaching for constructing learning. Retrieved August 20, 2008, from http://www.heacademy.ac.uk/embedded_object.asp?id=21686&filename=Biggs
  • Biggs, J., Kember, D., & Leung, D.Y.P. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71, 133–149.
  • Bliuc, A., Goodyear, P., & Ellis, R.A. (2007). Research focus and methodological choices in studies into students' experiences of blended learning in higher education. Internet and Higher Education, 15, 231–244.
  • Brace-Govan, J. (2003). A method to track discussion forum activity: The Moderators' Assessment Matrix. Internet and Higher Education, 6, 303–325.
  • Cronbach, L.J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297–334.
  • Dewiyanti, S., Brand-Gruwel, S., & Jochems, W. (2005). Applying reflection and moderation in an asynchronous computer-supported collaborative learning environment in campus-based higher education. British Journal of Educational Technology, 36, 673–676.
  • Ellis, R.A., & Calvo, R.A. (2006). Discontinuities in university student experiences of learning through discussions. British Journal of Educational Technology, 37, 55–68.
  • Godwin, S.J., Thorpe, M.S., & Richardson, J.T.E. (2008). The impact of computer-mediated interaction on distance learning. British Journal of Educational Technology, 39, 52–70.
  • Goodyear, P. (1984). LOGO: A guide to learning through programming. London: Heinemann.
  • Goodyear, P. (1991). Teaching knowledge and intelligent tutoring. Norwood, NJ: Ablex.
  • Goodyear, P., Jones, C., Asensio, M., Hodgson, V., & Steeples, C. (2005). Networked learning in higher education: Students' expectations and experiences. Higher Education, 50, 473–508.
  • Hashim, Y. (1999). Are instructional design elements being used in module writing? British Journal of Educational Technology, 30, 341–358.
  • Higher Education Funding Council for England. (2005). HEFCE strategy for e-learning. Retrieved January 14, 2006, from http://www.hefce.ac.uk/pubs/hefce/2005/05_12/
  • Kaiser, H.F., & Caffrey, J. (1965). Alpha factor analysis. Psychometrika, 30, 1–14.
  • Kember, D., Ng, S., Tse, H., Wong, E.T.T., & Pomfret, M. (1996). An examination of the interrelationships between workload, study time, learning approaches and academic outcomes. Studies in Higher Education, 21, 347–358.
  • Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational technology (1st ed.). London: Routledge.
  • Laurillard, D. (2002). Rethinking university teaching: A framework for the effective use of educational technology (2nd ed.). London: Routledge.
  • Lawless, C.J., & Richardson, J.T.E. (2002). Approaches to study and perceptions of academic quality in distance education. Higher Education, 44, 257–282.
  • Littlejohn, A.H. (2002). Improving continuing professional development in the use of ICT. Journal of Computer Assisted Learning, 18, 166–174.
  • Martens, R.L., Valcke, M.M.A., & Portier, S.J. (1997). Interactive learning environments to support independent learning: The impact of discernability of embedded support devices. Computers and Education, 28, 187–197.
  • Oliver, R., & Omari, A. (1999). Using online technologies to support problem-based learning: Learners' responses and perceptions. Australian Journal of Educational Technology, 15, 58–79.
  • O'Toole, J.M., & Absalom, D.J. (2003). The impact of blended learning on student outcomes: Is there room on the horse for two? Journal of Educational Media, 28, 179–189.
  • Pearson, J. (2006). Investigating ICT using problem-based learning in face-to-face and online learning environments. Computers & Education, 47, 56–73.
  • Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. Milton Keynes, UK: Society for Research into Higher Education & Open University Press.
  • Prosser, M., Ramsden, P., Trigwell, K., & Martin, E. (2003). Dissonance in experience of teaching and its relation to the quality of student learning. Studies in Higher Education, 28, 37–48.
  • Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The Course Experience Questionnaire. Studies in Higher Education, 16, 129–150.
  • Ramsden, P. (2002). Learning to teach in higher education (2nd ed.). London: Routledge.
  • Richardson, J.T.E. (2003). Approaches to studying and perceptions of academic quality in a short web-based course. British Journal of Educational Technology, 34, 433–442.
  • Sabry, K., & Baldwin, L. (2003). Web-based learning interaction and learning styles. British Journal of Educational Technology, 34, 443–454.
  • Salmon, G. (2002a). E-tivities: The key to active online learning. London: Taylor & Francis.
  • Salmon, G. (2002b). Mirror, mirror on my screen: Exploring online reflections. British Journal of Educational Technology, 33, 379–391.
  • Salmon, G. (2004). E-moderating: The key to teaching and learning online (2nd ed.). London: Taylor & Francis.
  • Schmitt, N. (1996). Uses and abuses of coefficient alpha. Psychological Assessment, 8, 350–353.
  • Seifert, T. (1995). Characteristics of ego- and task-orientated students: A comparison of two methodologies. British Journal of Educational Psychology, 65, 125–138.
  • Sharpe, R., & Benfield, G. (2005). The student experience of e-learning in higher education: A review of the literature. Brookes eJournal of Learning and Teaching, 3, 1–10.
  • Sharpe, R., Benfield, G., Roberts, G., & Francis, R. (2006). The undergraduate experience of blended e-learning: A review of UK literature and practice. Retrieved January 14, 2006, from www.heacademy.ac.uk
  • Sims, R. (1997). Interactivity: A forgotten art. Computers in Human Behavior, 13, 157–180.
  • Trigwell, K., & Prosser, M. (1997). Towards an understanding of individual acts of teaching and learning. Higher Education Research & Development, 16, 241–252.
  • Wilson, K.L., Lizzio, A., & Ramsden, P. (1997). The development, validation and application of the Course Experience Questionnaire. Studies in Higher Education, 22, 33–53.
