Research Article

The profiles of Chinese university students’ learning experience in flipped classrooms: combining the self-reported and process data

Received 29 Aug 2022, Accepted 11 May 2023, Published online: 24 May 2023

ABSTRACT

Drawing on student approaches to learning research, this study combined self-reported and process data to examine: (1) the extent of alignment between the self-reported and process data on the profiles of 179 Chinese university students’ learning experience in flipped classrooms and (2) the contributions of the self-reported and process data on students’ learning experience to their academic performance. Two groups of students with contrasting learning experiences were identified. One group reported a more desirable learning experience (cohesive conceptions of learning theoretical mechanics, deep approaches in both face-to-face and online learning, and positive perceptions of both the human and non-human elements of the learning environments in the course). The other group had a poorer learning experience (fragmented conceptions, surface approaches, perceiving a lack of peer online interaction, and not valuing the online design of the course site). Of the process measures, however, the students self-reporting better and poorer learning experiences only differed in the frequency of participation in the pre-lecture quizzes, demonstrating weak alignment between the self-reported and process data. The hierarchical regression analyses revealed that including both self-reported and process data significantly increased the variance explained in students’ academic performance.

Introduction

Over the last few decades, learning in higher education has undergone significant transformation (Cho et al., 2021). China has implemented systematic reforms of its higher education in order to meet international standards. One of the key objectives of this transformation is to re-design large lecture-focused courses by adopting flipped classroom principles (Teaching Committee of Ministry of Education, 2018). As a specific type of blended course design, flipped classrooms require students to engage in “interactive content focusing on key concepts prior to class, thus allowing class time for collaborative activities that clarify concepts and contextualise knowledge through application, analysis, and planning and producing solutions” (Karanicolas et al., 2018, p. 1). Accordingly, university students’ experiences of learning in flipped classrooms are becoming increasingly complex, as students need to move back and forth between in-class and online learning environments. In these experiences, students interact not only with the human elements (teaching staff and fellow students) but also with the non-human or material elements (Fenwick, 2015). For instance, students navigate online learning platforms, interact with a variety of technology-enabled learning tools, engage in online discussion forums, or learn across physical and online learning spaces (Fenwick, 2015). As a result, describing students’ learning by self-reported measures alone is insufficient to capture a comprehensive picture of their learning experiences in flipped classrooms. Hence, an in-depth understanding of students’ learning experiences in flipped classrooms requires combining self-reported and process measures and data. In Chinese universities, despite the large-scale transformation from traditional lecture-based learning to flipped classrooms, there is a dearth of systematic investigations of Chinese university students’ learning experiences in flipped classrooms that combine self-reported and process measures. To address this gap, the present study adopted student approaches to learning (SAL) research as a theoretical framework to examine the profiles of Chinese university students’ learning experience in flipped classrooms.

SAL research

SAL research is one of the guiding frameworks widely used in higher education to understand students’ experiences of learning (Zusho, 2017). SAL research identifies variations in key factors of students’ learning processes, such as conceptions of learning, approaches to learning, and perceptions of the learning environment (Trigwell & Prosser, 2020).

Conceptions of learning concern the ways students understand what the learning process and academic performance are, their beliefs about the nature of knowledge, and how they come to know that knowledge (Limbu & Markauskaite, 2015). SAL research has consistently identified two broad categories of conceptions – cohesive and fragmented conceptions (Säljö, 1979). While cohesive conceptions see learning as a knowledge re-construction process, in which new concepts are integrated into existing knowledge, fragmented conceptions view learning as knowledge reproduction and the accumulation of bits of unrelated information (Yang & Tsai, 2010).

In a variety of learning tasks and across a range of academic disciplines, SAL research has also identified two broad categories of approaches to learning – deep and surface (Trigwell & Prosser, 2020). Students who adopt deep approaches tend to learn in proactive, reflective, and engaging ways, whereas students who use surface approaches mainly learn through rote memorization and mechanistic activities (Nelson Laird et al., 2014). Past SAL research has repeatedly revealed systematic and logical interrelations between students’ conceptions, approaches, and their academic performance (Trigwell & Prosser, 2020). Students who hold cohesive conceptions in a course tend to adopt deep approaches and perform better in that course, whereas students with fragmented conceptions are likely to adopt surface approaches and attain poorer academic performance.

Students’ approaches to learning have also been found to be associated with their perceptions of characteristics of the learning environment (Trigwell & Prosser, 2020). For instance, students’ perceptions of high teaching quality, clear teaching goals, and good teaching organization tend to be positively associated with deep approaches, whereas perceptions of heavy workload, inappropriate assessments, and a lack of teacher–student interaction are often related to surface approaches (Guo, 2018).

In the last two decades, SAL research has been increasingly adopted to investigate students’ learning experiences in online and blended courses, such as online discussions (Han & Ellis, 2019), online collaborative writing (Limbu & Markauskaite, 2015), blended inquiry-based learning (Ellis, 2014), and problem-based learning (Samarakoon et al., 2013). In these online and blended learning designs, students’ approaches to using learning technologies and their perceptions of the integration between the face-to-face and online modes have been investigated. Students who adopt deep approaches to using online learning technologies use technologies meaningfully to broaden their views, deepen conceptual understanding, or facilitate learning. In contrast, those who favor surface approaches apply technologies in a simplistic manner, merely to fulfill practical purposes such as downloading documents, completing assignments, or satisfying course requirements (Ellis & Han, 2019). Adopting deep approaches to using online learning technologies has also been shown to be related to students’ perceptions of the integration between the face-to-face and online modes in flipped classrooms (Ellis et al., 2021).

Combining self-reported and process data to research university students’ learning experience in flipped classrooms

Despite the significant contribution of the SAL research framework to the literature on students’ experiences of learning, it predominantly assesses various aspects of such experiences using self-reported measures, such as focus groups, semi-structured interviews, and open-ended or Likert-scale questionnaires (Han, 2022). Self-reported measures and data have been criticized for being subjective, which may affect their accuracy in representing students’ actual experiences of learning (Zhou & Winne, 2012). Hence, researchers have proposed going beyond self-reported data alone by including other types of data, such as process data (Vermunt & Donche, 2017), which can reflect students’ learning (in particular, in online environments) in a more objective manner and in nuanced detail (Richardson, 2017).

Only a limited number of studies have adopted the SAL research framework to investigate university students’ learning experience in flipped classrooms by combining self-reported and process data (Ellis et al., 2017; Gašević et al., 2017; Han & Ellis, 2017, 2020a). These studies have two different research foci: (1) the extent to which combining self-reported and process data may improve the power of aspects of students’ learning experience to predict their academic performance (Ellis et al., 2017; Han & Ellis, 2020a) and (2) the extent to which self-reported and process data offer consistent evidence in describing students’ learning (Gašević et al., 2017; Han & Ellis, 2017).

While the research with the first focus has produced relatively consistent results, studies with the second focus have not reached conclusive findings. For instance, Han and Ellis (2017) found coherence between students’ self-reported positive perceptions of the course learning environment and their higher online participation rates measured by process data, whereas Gašević et al. (2017) found only partial alignment between the self-reported and process data, as only self-reported deep approaches to learning aligned with the observed approaches. Clearly, further research is required.

The present study and research questions

The present study was designed to fill a number of research gaps. First, it examined the learning experiences of Chinese university students, a population that has been less researched in the flipped classroom literature. Second, it extended previous investigations by covering a broader range of aspects in SAL research (i.e. conceptions of learning, approaches to learning, approaches to using online learning technologies, and perceptions of the learning environment) to identify the profiles of students’ learning experiences. Third, it addressed the two research foci mentioned above in a single study. Additionally, the study used two different types of process data, duration and frequency, which provide richer information than the single type of data used in previous research.

The present study addressed two research questions:

  1. (a) What are the profiles of Chinese university students’ learning experience in flipped classrooms as reflected by the self-reported data (conceptions, approaches, and perceptions) and the process data (duration of students’ online participation and frequencies of their participation in different online learning activities)? (b) To what extent are the two types of data aligned?

  2. What are the contributions of Chinese university students’ self-reported conceptions, approaches, and perceptions, duration of online participation, and frequencies of participation in different online learning activities to their academic performance in flipped classrooms?

Methods

The participants

Altogether 179 second-year mechanical engineering students at a Chinese public university specializing in science and engineering voluntarily participated in the study. The participants were predominantly male (n = 166), because mechanical engineering tends to attract male students in China.

Description of the flipped classroom course

All the participants were enrolled in a foundational mechanical engineering course – Theoretical Mechanics, a compulsory second-year course for students majoring in a Bachelor of Mechanical Engineering. The course was semester-long, lasting 16 weeks.

The face-to-face learning and teaching included three one-hour lectures and a one-hour tutorial per week. In the tutorials, key concepts and difficult points, practical exercises, and assignments were discussed amongst students in groups.

The online learning required compulsory participation before and after each week’s lecture and tutorial, and served to prepare for, review, and extend the face-to-face learning. The online learning activities are described below:

  • Pre-lecture online learning materials: consisted of readings and video clips covering the concepts to be discussed in the coming week’s lecture.

  • Pre-lecture quizzes: tested students’ understanding of the pre-lecture online learning materials.

  • Pre-tutorial online learning materials: consisted of preparation requirements for each tutorial and demonstration of the sample problem-solving tasks to be practiced and solved in the tutorials.

  • Online discussion board: continued discussions from the face-to-face lectures and tutorials.

  • Post-tutorial quizzes: had diverse formats, including calculations using formulas; model construction exercises; short answer questions; and problem-solving tasks which involved applying theories to tackle practical issues.

Instruments and data

The 5-point Likert scale questionnaire to collect self-reported data

The questionnaire was developed based on the SAL research framework (Biggs et al., 2001), and has been used in previous studies on students’ learning experience in blended course designs, which confirmed its validity and reliability (Ellis & Bliuc, 2016, 2019; Han & Ellis, 2020b). The questionnaire consisted of eight scales, described below; a short sketch of how the reported scale reliabilities (Cronbach’s α) can be computed follows the list:

  • Cohesive conceptions of learning conceive of learning theoretical mechanics as a process of in-depth reflection and thought clarification (8 items, α = .95; e.g. “The learning activities for theoretical mechanics allow us to better understand the topics from a number of perspectives”).

  • Fragmented conceptions of learning see learning theoretical mechanics as only serving simplistic purposes (7 items, α = .78; e.g. “The purpose of learning for theoretical mechanics is mostly to help us remember facts for our tasks”).

  • Deep approaches to learning capture learning as being independent, taking initiative, and critically reflecting on the learning process (9 items; α = .93; e.g. “I test myself on important topics until I understand them completely”).

  • Surface approaches to learning predominantly focus on rote memorization and practical strategies to pass examinations and satisfy course requirements (8 items; α = .88; e.g. “I see no point in learning material which is not likely to be in the examination”).

  • Deep approaches to using online learning technologies describe using online learning technologies as facilitating learning and deepening understanding of theoretical mechanics (6 items; α = .89; e.g. “I try to use the online learning technologies in this course to achieve a more complete understanding of key concepts”).

  • Surface approaches to using online learning technologies describe using online learning technologies as fulfilling practical purposes (8 items; α = .80; e.g. “I use online learning technologies in this course mainly to download files”).

  • Perceptions of online interactivity perceive the online learning as being interactive and value contributions by peers (4 items; α = .80; e.g. “Other students’ online submissions in this course encouraged me to investigate further sources of knowledge”).

  • Perceptions of online design perceive the online learning design as being of high quality and well-integrated with face-to-face learning (6 items; α = .92; e.g. “The online learning materials in this course are designed to make topics interesting to students”).
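The following minimal sketch illustrates how the internal consistency (Cronbach’s α) reported for each scale can be computed. It is provided for illustration only: the responses, the number of items, and the function name are hypothetical and are not the study’s actual data or analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the scale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 5-point Likert responses from six students to a 4-item scale.
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```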

The learning management system (LMS) to collect the process data of students’ online participation

Two types of process data were used in the study: (1) duration of online participation; and (2) frequencies of students’ participation in different online learning activities. The process data were collected through the analytic functions in the LMS – Tsinghua Education Online, which is widely adopted by universities in China.
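The paper does not describe the raw export format of Tsinghua Education Online. Purely as an illustration, the sketch below assumes a hypothetical per-event log (student ID, activity type, and minutes spent) and aggregates it into the two process measures used in this study: total duration of online participation and frequency of participation in each online learning activity.

```python
import pandas as pd

# Hypothetical LMS event log; the column names are assumptions, not the actual export schema.
log = pd.DataFrame({
    "student_id": ["s01", "s01", "s02", "s02", "s02"],
    "activity":   ["pre_lecture_quiz", "discussion_board",
                   "pre_lecture_quiz", "pre_lecture_materials", "post_tutorial_quiz"],
    "minutes":    [12.5, 8.0, 10.0, 22.0, 15.5],
})

# Duration of online participation: total minutes per student across all activities.
duration = log.groupby("student_id")["minutes"].sum().rename("duration_minutes")

# Frequency of participation: number of events per student in each activity type.
frequency = (log.groupby(["student_id", "activity"])
                .size()
                .unstack(fill_value=0))

process_data = frequency.join(duration)
print(process_data)
```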

Students’ academic performance

The course marks were used to represent students’ academic performance in the course. The course marks were aggregated scores of the assessment tasks, including weekly written problem-solving tasks (10%); a reflective report on the learning experience (10%); the quality of the postings in the online discussion board (10%); and a closed-book examination (70%).
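As a simple illustration of how an aggregated course mark can be formed from the stated weights, the sketch below uses hypothetical component scores on a 0–100 scale; it is not the course’s actual marking script.

```python
# Assessment weights as described above; the component scores are hypothetical.
weights = {"problem_solving": 0.10, "reflective_report": 0.10,
           "discussion_posts": 0.10, "closed_book_exam": 0.70}
scores = {"problem_solving": 85, "reflective_report": 78,
          "discussion_posts": 90, "closed_book_exam": 72}

course_mark = sum(weights[task] * scores[task] for task in weights)
print(f"Course mark: {course_mark:.1f}")  # 0.1*85 + 0.1*78 + 0.1*90 + 0.7*72 = 75.7
```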

Data collection

Data collection took place towards the end of the semester so that the participants had experienced the whole course before responding to the questionnaire. One week before data collection, each student was given a Participant Information Statement, which explained that participation in the study was completely voluntary and required completing a closed-ended questionnaire and giving permission to access their online participation data in the LMS and their assessment marks. Students who signed a Participant Consent Form were given access to the online questionnaire.

Data analysis

To answer research question 1a – the profiles of Chinese university students’ learning experience in flipped classrooms – a hierarchical cluster analysis was conducted to cluster students using the means of the eight scales in the self-reported questionnaire. Based on the cluster membership, one-way ANOVAs were performed to examine whether the duration and frequencies of students’ online participation and their academic performance differed between the clusters. The results of the one-way ANOVAs were used to answer research question 1b – the extent of alignment between the self-reported and process data on students’ learning experience in flipped classrooms.
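The analyses were run in SPSS; the Python sketch below is only an approximation of this step. It applies hierarchical (Ward’s) clustering to synthetic scale means, inspects the final merge distances to judge the number of clusters, and runs a one-way ANOVA comparing a process measure across the resulting clusters. The linkage method, the synthetic data, and the variable names are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Hypothetical data: 179 students x 8 self-reported scale means (5-point Likert scales).
scale_means = rng.uniform(1, 5, size=(179, 8))
# Hypothetical process measure, e.g. frequency of participation in the pre-lecture quizzes.
quiz_frequency = rng.poisson(10, size=179)

# Agglomerative clustering on the eight scale means (Ward's linkage assumed here).
Z = linkage(scale_means, method="ward")

# The distances of the last merges show how much heterogeneity each further merge adds;
# a large jump suggests stopping before that merge (the paper settled on two clusters).
print("Distances of the final merges:", Z[-4:, 2])

clusters = fcluster(Z, t=2, criterion="maxclust")  # two-cluster solution

# One-way ANOVA: does the process measure differ between the two clusters?
f_stat, p_value = f_oneway(quiz_frequency[clusters == 1], quiz_frequency[clusters == 2])
print(f"F(1, 177) = {f_stat:.2f}, p = {p_value:.3f}")
```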

To answer research question 2 – the contributions of the self-reported and process data on students’ learning experience to their academic performance – hierarchical regression analyses were performed. Before the regression analyses, correlation analyses were conducted between the self-reported and process data, as well as academic performance, to check for linear relations between the independent and dependent variables. The values of tolerance were also screened to check that the assumption of no multicollinearity was met. Three regression models were constructed. The first model used only the self-reported data as predictors of academic performance, because there is established SAL literature on the contributions of self-reported data to students’ academic performance. The second model added the process data of duration of students’ online participation to model 1. The third model added the process data of frequencies of students’ online learning participation to model 2. All the data analyses were conducted in SPSS 28.
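A sketch of this block-wise (hierarchical) regression procedure is given below using statsmodels rather than SPSS. The synthetic data and variable names are placeholders; the purpose is only to show how predictors are entered in blocks, how the increase in explained variance (ΔR²) is read off, and how multicollinearity can be screened via the variance inflation factor (VIF), whose reciprocal is the tolerance.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 179

# Hypothetical predictors and outcome (placeholders for the study's variables).
df = pd.DataFrame({
    "surface_approaches": rng.normal(3, 0.7, n),
    "fragmented_conceptions": rng.normal(3, 0.7, n),
    "surface_tech_use": rng.normal(3, 0.7, n),
    "duration_online": rng.gamma(5, 60, n),
    "freq_prelecture_quiz": rng.poisson(10, n),
})
df["course_mark"] = (70 - 3 * df["surface_approaches"]
                     + 0.3 * df["freq_prelecture_quiz"] + rng.normal(0, 5, n))

blocks = [
    ["surface_approaches", "fragmented_conceptions", "surface_tech_use"],  # model 1: self-reported
    ["duration_online"],                                                   # model 2: + duration
    ["freq_prelecture_quiz"],                                              # model 3: + frequencies
]

predictors, previous_r2 = [], 0.0
for i, block in enumerate(blocks, start=1):
    predictors += block
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df["course_mark"], X).fit()
    print(f"Model {i}: R^2 = {model.rsquared:.3f}, delta R^2 = {model.rsquared - previous_r2:.3f}")
    previous_r2 = model.rsquared

# Multicollinearity check on the final model: tolerance = 1 / VIF for each predictor.
X = sm.add_constant(df[predictors])
for j, name in enumerate(predictors, start=1):  # column 0 is the constant
    vif = variance_inflation_factor(X.values, j)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```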

Results

Results for research question 1a – the profiles of Chinese university students’ learning experience and 1b – the extent of consistency between the self-reported and process data on Chinese university students’ learning experience

The results of the cluster analysis and one-way ANOVAs are displayed in Table 1. Using the mean scores of the eight scales of students’ self-reported learning experience, the hierarchical cluster analysis produced a range of two-cluster to four-cluster solutions. The values of squared Euclidean distance showed a relatively large increase for the two-cluster solution compared to the three-cluster and four-cluster solutions, suggesting that a two-cluster solution was more appropriate. Of the 179 students, 86 were assigned to cluster 1 and 93 to cluster 2.

Table 1. Results of the hierarchical cluster analysis and one-way ANOVAs.

Results of the one-way ANOVAs suggest that the two clusters of students differed significantly on all the self-reported scales: cohesive conceptions: F(1,177) = 70.73, p < .01, η² = .29; fragmented conceptions: F(1,177) = 40.22, p < .01, η² = .19; deep approaches to learning: F(1,177) = 48.56, p < .01, η² = .22; surface approaches to learning: F(1,177) = 62.01, p < .01, η² = .26; deep approaches to using online learning technologies: F(1,177) = 78.65, p < .01, η² = .31; surface approaches to using online learning technologies: F(1,177) = 81.57, p < .01, η² = .32; perceptions of online interactivity: F(1,177) = 43.94, p < .01, η² = .20; and perceptions of online design: F(1,177) = 76.90, p < .01, η² = .30. Specifically, cluster 1 students reported more cohesive conceptions of learning and adopted deeper approaches to learning as well as deeper approaches to using online learning technologies. They were also more positive about the interactivity and design of the online part of the course than cluster 2 students. Cluster 2 students, on the other hand, reported more fragmented conceptions of learning theoretical mechanics and more surface approaches to learning and to using online learning technologies. At the same time, cluster 2 students perceived the online interactivity and online design more negatively than their peers in cluster 1.

Students’ duration of online participation did not differ between the two clusters. Of the five frequencies of students’ online learning participation, only the frequency of participation in the online pre-lecture quizzes differed significantly between the two clusters: F(1,177) = 5.08, p < .05, η² = .03. Cluster 1 students were observed to interact with the pre-lecture online quizzes more frequently than cluster 2 students. The two clusters also differed significantly on course marks: F(1,177) = 10.98, p < .01, η² = .06. Cluster 1 students, who reported a more desirable learning experience in the course and participated more actively in the online quizzes, also obtained higher course marks than cluster 2 students.
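For a one-way ANOVA, η² can be recovered directly from the reported F statistic and its degrees of freedom as η² = F·df₁ / (F·df₁ + df₂). The short check below reproduces, to rounding, the η² = .29 reported above for cohesive conceptions, F(1, 177) = 70.73; the helper function is illustrative, not part of the original analysis.

```python
def eta_squared(f_stat: float, df_between: int, df_within: int) -> float:
    """Eta-squared from a one-way ANOVA F statistic and its degrees of freedom."""
    return (f_stat * df_between) / (f_stat * df_between + df_within)

# Reported result for cohesive conceptions: F(1, 177) = 70.73, eta-squared = .29
print(f"{eta_squared(70.73, 1, 177):.2f}")  # -> 0.29
```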

Results for research question 2 – contributions of the self-reported and process data on students’ learning experience to their academic performance

Table 2 presents the results of the correlation analyses. With regard to the correlations between academic performance and the self-reported variables, the results show that the course marks were significantly and negatively correlated with fragmented conceptions of learning (r = −.17), surface approaches to learning (r = −.24), and surface approaches to using online learning technologies (r = −.16).

Table 2. Results of correlation analyses.

With regard to the correlations between academic performance and the process data, the results show that course marks were positively related to duration of online participation (r = .25). In addition, the correlations between course marks and the frequencies of participation in all five online learning activities were also significant and positive. Of these, the correlation with participation in the pre-lecture online quizzes was the strongest (r = .43), followed by the pre-lecture online learning materials (r = .36) and then the pre-tutorial online learning materials (r = .27). The correlations between academic performance and the online discussion board (r = .22) and the post-tutorial quizzes (r = .22) were the same and were the lowest.

Based on the significant correlations in Table 2, fragmented conceptions, surface approaches to learning, and surface approaches to using online learning technologies were used as predictors in regression model 1. In regression model 2, duration of students’ online participation was added to model 1. In the last model, frequencies of students’ participation in all five online learning activities were added to model 2 as independent variables.

The results of the three regression models are presented in Table 3.

Table 3. Results of hierarchical regression analyses.

Table 3 shows that in model 1, when the three self-reported variables were simultaneously entered into the regression equation, only surface approaches to learning (β = −.22) significantly and negatively predicted academic performance: F(3, 175) = 3.62, p < .05, f² = .04, explaining approximately 4% of the variance in academic performance. This suggests that the more a student adopted surface approaches, the more poorly he or she performed in the course. In model 2, after adding duration of online participation into the regression model, surface approaches to learning (β = −.22) still made a significant contribution to course marks. At the same time, duration of online participation (β = .25) also significantly and positively predicted students’ academic performance, explaining an extra 6% of the variance in academic performance: F(4, 174) = 5.95, p < .01, f² = .10. Regression model 3 added frequencies of students’ participation in the five online learning activities on top of model 2. While surface approaches to learning (β = −.18) still made a significant contribution to course marks, duration of online participation did not (β = −.02). Frequencies of participation in four online learning activities were significant contributors to academic performance (pre-lecture online learning materials: β = .16; pre-lecture quizzes: β = .27; online discussion board: β = .25; and post-tutorial quizzes: β = .30). Together, the self-reported and process data explained around 36% of the variance in academic performance, a large effect size: F(9, 169) = 11.91, p < .01, f² = .56. Adding the process data explained an extra 32% of the variance in academic performance.
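Cohen’s f² for a regression model is conventionally computed as f² = R² / (1 − R²). The sketch below reproduces, approximately, the effect size reported for the full model (around 36% of variance explained, f² = .56); it illustrates the conventional formula rather than the exact computation behind Table 3.

```python
def cohens_f2(r_squared: float) -> float:
    """Cohen's f-squared effect size for a regression model."""
    return r_squared / (1 - r_squared)

# Full model (model 3): roughly 36% of the variance in course marks explained.
print(f"f^2 = {cohens_f2(0.36):.2f}")  # -> 0.56
```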

Discussion

Before discussing the results of the study, it is important to acknowledge some limitations, which should be considered when interpreting the results. The participants were recruited from only one public university in China. Future studies should try to recruit Chinese university students from multiple public and private universities to make the findings more generalizable. In addition, the present study only collected observations of how students learnt in the online part of the flipped classroom; in the future, observations of how students learn in the face-to-face part should also be obtained. Moreover, future studies may consider including more diverse types of process data, such as time-stamped sequences of interactions with different online learning activities, which would offer more nuanced details of students’ learning experience in flipped classrooms.

The profiles of Chinese university students’ learning experience in the flipped classrooms

Using the key aspects of learning experience in SAL research, the present study identified two distinct profiles of Chinese university students’ learning experience in flipped classrooms. Slightly fewer than half of the students (48%) reported a more desirable learning experience, characterized by holding cohesive conceptions of learning theoretical mechanics, using deep approaches in both face-to-face and online learning, and having positive perceptions of both the human (perceptions of online interactivity) and non-human (perceptions of online design) elements of the learning environments in the course. In accordance with what they reported, this group of students was also observed to participate in the pre-lecture quizzes more frequently than their peers. These results suggest that this group of students took the opportunity to self-assess their pre-lecture learning, either to ensure that they understood what they had learnt online or to note down difficult points to pay closer attention to in the face-to-face lectures and tutorials. Furthermore, this group of students not only had a more desirable learning experience but also obtained significantly higher course marks than their peers.

In contrast, slightly more than half of the sample (52%) had a poorer learning experience, reflected in their reports of fragmented conceptions of learning theoretical mechanics, surface approaches, perceptions of a lack of peer online interaction, and not valuing the online design of the course site. Students who self-reported a less desirable learning experience were also observed to participate in the pre-lecture quizzes less frequently and obtained lower course marks than their classmates. While previous studies adopting SAL research found that contrasting profiles of learning experiences in flipped classrooms differed mostly in terms of approaches to learning (Ellis et al., 2017; Gašević et al., 2017; Han & Ellis, 2017, 2020a), our study demonstrated that the profiles of Chinese university students’ learning experience in flipped classrooms differed not only on approaches to learning, but also on conceptions of learning and perceptions of the learning environment.

Weak alignment between the self-reported and the process data

The results of the cluster analysis and the one-way ANOVAs demonstrated that the alignment between the self-reported data and the process data, which described how students learnt online more objectively, was rather weak. Specifically, duration of online participation and frequencies of participation in four of the five online learning activities did not differ between the two distinct profiles of students based on their self-reports. This finding is similar to the weak alignment identified by Gašević et al. (2017), who showed that the sequences of students’ observed learning events were consistent only with their self-reported deep approaches to learning in flipped classrooms, not with the surface approaches. One possible explanation for the weak alignment could be that the self-reported data measured students’ learning experience across the entire flipped classroom, which consisted of both face-to-face and online components, whereas the process data only captured the online part. Future studies should tease apart the face-to-face and online components in the self-reported questionnaire so that a better match between the two types of data can be achieved.

The joint contribution of the self-reported and process data of students’ learning experience to their academic performance in the flipped classrooms

The results of both the second and the third regression models show that including self-reported and process data of students’ learning experience explained much more variance in their academic performance than using the self-reported data alone. These results were largely consistent with previous studies investigating Western university students’ learning experiences in flipped classrooms using SAL research (Ellis et al., 2017; Han & Ellis, 2020a).

In regression model 2, including duration of online participation doubled the contribution to academic performance made by the self-reported data alone. Cho et al. (2021) reported similar results: the time students spent online significantly predicted their final grades in flipped engineering courses. In regression model 3, the frequencies of online participation explained eight times as much variance in students’ academic performance as the self-reported data did.

However, we noticed that in regression model 3, after introducing frequencies of online participation, duration of online participation was no longer significant. This could possibly be because the duration measure was an aggregation of the time spent on different online learning activities and was therefore likely to overlap with the frequencies of online participation. Similar results were reported by Ober et al. (2021), who found that the duration variable no longer made a significant prediction after the frequency of clicks on the course website was included.

The correlation results between the duration and frequency measures seemed to confirm our speculation, as duration of online participation was significantly associated with frequencies of participation in all five online learning activities. Most of the correlations were positive, suggesting that students who participated more frequently in the online learning also tended to spend more time learning online. However, duration was negatively related to the frequency of participation in the post-tutorial quizzes. These correlation results could mean that students approached different online learning activities using different strategies, such as distributing their time differently across these learning activities. Using data mining methods, Fincham et al. (2019) and Jovanović et al. (2017) also found that students adopting different study approaches in flipped classrooms differed in the time they spent on their courses.

The significant contributions of both self-reported and process data to students’ academic performance in flipped classrooms suggest that neglecting the more objective measures of what students actually did in learning yields only a fragmented picture of their learning experiences. Therefore, combining self-reported and process data is required to provide a more complete profile of students’ learning experience and to enhance our ability to identify important aspects of flipped classroom learning.

Practical implications for improving Chinese university students’ learning in flipped classrooms

The findings of the present study offer some practical implications for teachers seeking to improve Chinese university students’ learning experience in flipped classrooms. Teachers may use a questionnaire, such as the one employed in this study, to identify the profiles of students’ learning experience early in the course. They can then pair students so that those with relatively impoverished conceptions, approaches, and perceptions of flipped classroom learning can emulate peers with a better learning experience.

Given the substantial contributions of students’ participation in different types of online learning activities, teaching staff should strive to help students understand the importance of the online component of flipped classrooms. As our results showed that students with better and poorer learning experiences differed in their participation in the pre-lecture quizzes, teachers could design this online activity as an essential part of the assessment, which would encourage students to evaluate their understanding of the pre-lecture learning so that they are well prepared when attending the face-to-face lectures and tutorials (Jungic et al., 2015; Mok, 2014).

Geolocation information

Australia.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Data available on request due to privacy/ethical restrictions.

Additional information

Notes on contributors

Feifei Han

Feifei Han is a Senior Research Fellow at Australian Catholic University. Her current interests focus on educational technology and learning analytics. Her publications have appeared in journals such as The Internet and Higher Education, Computers & Education, and IEEE Transactions on Learning Technologies.

References

  • Biggs, J., Kember, D., & Leung, D. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71(1), 133–149. https://doi.org/10.1348/000709901158433
  • Cho, H., Zhao, K., Lee, C., Runshe, D., & Krousgrill, C. (2021). Active learning through flipped classroom in mechanical engineering: Improving students’ perception of learning and performance. International Journal of STEM Education, 8(1), 1–13. https://doi.org/10.1186/s40594-021-00302-2
  • Ellis, R., & Bliuc, A.-M. (2019). Exploring new elements of the student approaches to learning framework: The role of online learning technologies in student learning. Active Learning in Higher Education, 20(1), 11–24. https://doi.org/10.1177/1469787417721384
  • Ellis, R., Han, F., & Pardo, A. (2017). Improving learning analytics – combining observational and self-report data on student learning. Educational Technology & Society, 20(3), 158–169.
  • Ellis, R. A. (2014). Quality experiences of inquiry in blended contexts – university student approaches to inquiry, technologies, and conceptions of learning. Australasian Journal of Educational Technology, 30(4), 273–283. https://doi.org/10.14742/ajet.522
  • Ellis, R. A., & Bliuc, A. (2016). An exploration into first year university students’ approaches to inquiry and online learning technologies in blended environments. British Journal of Educational Technology, 47(5), 970–980. https://doi.org/10.1111/bjet.12385
  • Ellis, R. A., Bliuc, A.-M., & Han, F. (2021). Challenges in assessing the nature of effective collaboration in blended university courses. Australasian Journal of Educational Technology, 1–14. https://doi.org/10.14742/ajet.5576
  • Fenwick, T. (2015). Sociomateriality in medical practice and learning: Attuning to what matters. Medical Education, 48(1), 44–52. https://doi.org/10.1111/medu.12295
  • Fincham, E., Gašević, D., Jovanović, J., & Pardo, A. (2019). From study tactics to learning strategies: An analytical method for extracting interpretable representations. IEEE Transactions on Learning Technologies, 12(1), 59–72. https://doi.org/10.1109/TLT.2018.2823317
  • Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10
  • Guo, J. (2018). Building bridges to student learning: Perceptions of the learning environment, engagement, and academic performance among Chinese undergraduates. Studies in Educational Evaluation, 59, 195–208. https://doi.org/10.1016/j.stueduc.2018.08.002
  • Han, F. (2022). Recent development in university student learning research in blended course designs: Combining theory-driven and data-driven approaches. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2022.905592
  • Han, F., & Ellis, R. (2017). Variations in coherence and engagement in students’ experience of blended learning. In H. Partridge, K. Davis, & J. Thomas (Eds.), Proceedings of the ASCILITE2017 (pp. 268–275). University of Southern Queensland.
  • Han, F., & Ellis, R. (2020a). Combining self-reported and observational measures to assess university student academic performance in blended course designs. Australasian Journal of Educational Technology, 36(6), 1–14. https://doi.org/10.14742/ajet.6369
  • Han, F., & Ellis, R. (2020b). Initial development and validation of the perceptions of the blended learning environment questionnaire. Journal of Psychoeducational Assessment, 38(2), 168–181. https://doi.org/10.1177/0734282919834091
  • Han, F., & Ellis, R. A. (2019). Identifying consistent patterns of quality learning discussions in blended learning. The Internet and Higher Education, 40, 12–19. https://doi.org/10.1016/j.iheduc.2018.09.002
  • Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. The Internet & Higher Education, 33(4), 74–85. https://doi.org/10.1016/j.iheduc.2017.02.001
  • Jungic, V., Kaur, H., Mulholland, J., & Xin, C. (2015). On flipping the classroom in large first year calculus courses. International Journal of Mathematical Education in Science and Technology, 46(4), 1–8. https://doi.org/10.1080/0020739X.2014.990529
  • Karanicolas, S., Snelling, C., & Winning, T. (2018). Translating concept into practice: Enabling first-year health sciences teachers to blueprint effective flipped learning approaches. The University of Adelaide.
  • Limbu, L., & Markauskaite, L. (2015). How do learners experience joint writing: University students’ conceptions of online collaborative writing tasks and environments. Computers & Education, 82, 393–408. https://doi.org/10.1016/j.compedu.2014.11.024
  • Mok, H. (2014). Teaching tip: The flipped classroom. Journal of Information Systems Education, 25(1), 7–11.
  • Nelson Laird, T., Seifert, T., Pascarella, E., Mayhew, M., & Blaich, C. (2014). Deeply affecting first-year students’ thinking: Deep approaches to learning and three dimensions of cognitive development. Journal of Higher Education, 85(3), 402–432. https://doi.org/10.1080/00221546.2014.11777333
  • Ober, T., Hong, M., Rebouças-Ju, D., Carter, M., Liu, C., & Cheng, Y. (2021). Linking self-report and process data to performance as measured by different assessment types. Computers & Education, 167, Article 104188. https://doi.org/10.1016/j.compedu.2021.104188
  • Richardson, J. (2017). Student learning in higher education: A commentary. Educational Psychology Review, 29(2), 353–362. https://doi.org/10.1007/s10648-017-9410-x
  • Säljö, R. (1979). Learning about learning. Higher Education, 8(4), 443–451. https://doi.org/10.1007/BF01680533
  • Samarakoon, L., Fernando, T., Rodrigo, C., & Rajapakse, S. (2013). Learning styles and approaches to learning among medical undergraduates and postgraduates. BMC Medical Education, 13(1), Article 42. https://doi.org/10.1186/1472-6920-13-42
  • Teaching Committee of Ministry of Education. (2018). National standards on teaching quality for undergraduate programs. Higher Education Press.
  • Trigwell, K., & Prosser, M. (2020). Exploring university teaching and learning: Experience and context. Springer Nature.
  • Vermunt, J., & Donche, V. (2017). A learning patterns perspective on student learning in higher education: State of the art and moving forward. Educational Psychology Review, 29(2), 269–299. https://doi.org/10.1007/s10648-017-9414-6
  • Yang, Y., & Tsai, C. (2010). Conceptions of and approaches to learning through online peer assessment. Learning & Instruction, 20(1), 72–83. https://doi.org/10.1016/j.learninstruc.2009.01.003
  • Zhou, M., & Winne, P. (2012). Modeling academic achievement by self-reported versus traced goal orientation. Learning and Instruction, 22(6), 413–419. https://doi.org/10.1016/j.learninstruc.2012.03.004
  • Zusho, A. (2017). Toward an integrated model of student learning in the college classroom. Educational Psychology Review, 29(2), 301–324. https://doi.org/10.1007/s10648-017-9408-4