Social Work Education
The International Journal
Volume 43, 2024 - Issue 3

Doing it step by step: a flipped classroom approach to teaching statistical analysis in social work

Pages 546-569 | Received 19 May 2022, Accepted 08 Sep 2022, Published online: 03 Oct 2022

ABSTRACT

Social workers should be trained in both qualitative and quantitative research methods, irrespective of whether they aspire to conduct research on their own or use research to inform their practice. The apparently problematic position of quantitative research methods in social work suggests, however, a need to explore new forms of teaching statistical analysis. In this article, we propose a flipped classroom approach to teaching statistical analysis for social work. In the empirical analysis, we investigate how students perceive this way of learning statistical analysis at a Norwegian university. The data are based on 3 years of evaluation data from a course on statistical analysis for master’s level graduate students of social work, with 2 years of data taken from before the COVID-19 pandemic and the last data point being during a period of lockdown. We discuss the most important factors for succeeding with this approach and explore if and to what extent the perceptions of students differed during the pandemic compared to the two previous years. Based on the findings, we argue that a flipped classroom approach to teaching statistical analysis may be one way of changing the apparently problematic position of quantitative research methods in social work.

Introduction

Quantitative literacy and having the necessary skills to conduct and understand quantitative research have been key concerns in social work in recent years (see outline discussion by Sheppard, Citation2016, pp. 1521–1524). A review based on analysis of nearly 1,500 articles published over 10 years in three British-based international social work journals showed that although the number of quantitative articles has increased, along with the expansion of publications, quantitative articles continue to lag behind qualitative research articles (Sheppard, Citation2016, p. 1527). This may indicate an imbalance in research methodology competence among social workers that is alarming for an academic discipline criticized for having an ill-defined knowledge base in terms of robust research (Taylor & Sharland, Citation2015, p. 626). There are of course some justifiable reasons why researchers within social work tend to lean toward qualitative methods, such as aspirations to move the voices of service users and practitioners to the forefront and to incorporate social work values into this research, as discussed by Gleeson et al. (Citation2021, p. 13) in the UK context. We are not arguing that social work should be turned into a discipline preoccupied only with ‘interventions’ and ‘outcomes’ (Jacobsson & Meeuwisse, Citation2020, p. 286). Nonetheless, to educate social workers to be ‘research-minded’ (MacIntyre & Paul, Citation2013, p. 699), social workers need to be trained in both qualitative and quantitative research methods. This competence is important irrespective of whether social work students aspire to conduct research on their own or use research to inform their practice (MacIntyre & Paul, Citation2013, p. 690). However, there is a need to explore new forms of teaching statistical analysis to master’s students of social work, considering the prevalent anxiety toward mathematics and statistics among them (Forte, Citation1995; Green et al., Citation2001; Pan & Tang, Citation2005; Royce & Rompf, Citation1992).

The traditional lecture has been, and still is, the most common form of teaching at Norwegian universities (Ramsden, Citation2003; Skodvin, Citation2016). This is despite increasing awareness that this teacher-driven, monologue-influenced, and student-passive form of learning is not very efficient in terms of learning outcomes (Ramsden, Citation2003). One criticism of traditional lectures is that they represent a ‘rigid teacher-centered perception of teaching and learning’ (Ramsden, Citation2003, p. 147). Another is that it is unrealistic to assume that knowledge can be transferred orally from lecturer to student. Critics emphasize that traditional lectures are not suitable for achieving deep learning and are inefficient in terms of learning and practicing skills (Bligh, Citation2000; after Ramsden, Citation2003, p. 3; Skodvin, Citation2016). A flipped classroom approach that facilitates student-active learning, dialogue-based teaching, and, perhaps most importantly, forms of teaching that facilitate individually adapted learning (Foldnes, Citation2016, Citation2017; Steen-Utheim & Foldnes, Citation2018) counteracts much of the criticism directed toward traditional lectures. In the context of this article, we consider individually adapted learning as an opportunity for students to acquire knowledge at their own pace. Moreover, their learning progression can be adapted to prior experiences, efforts, and especially ambitions. In courses where skills are central, such as statistical analysis, introducing a flipped classroom could potentially provide far better learning outcomes compared to teaching based solely on traditional lectures (Foldnes, Citation2016, Citation2017; Steen-Utheim & Foldnes, Citation2018).

In this article, we begin by outlining previous research on the apparently problematic position of quantitative research methods in social work and suggestions for improving quantitative literacy among social work students. We then describe our framework and approach to flipped classroom teaching of statistical analysis in social work. Thereafter, we investigate how students perceive this way of learning statistical analysis at a Norwegian university. The data are based on 3 years of evaluation data from a course on statistical analysis for master’s level graduate students of social work. The first two data points (March 2019 and 2020) are from before the COVID-19 pandemic led to lockdowns in Norway, and the last data point was taken during the pandemic (March 2021). Based on these analyses, we discuss the most important factors for succeeding with a flipped classroom approach and furthermore explore if and to what degree students’ perception of it varied during the pandemic.

Social work and statistical analysis—a difficult friendship?

There are several issues that may underlie the apparently problematic position of quantitative research methods in social work. One of these relates to ‘methodological ideology’ (Sheppard, Citation2016, p. 1523) and the old (unnuanced) dichotomy between positivist and interpretivist approaches, with researchers ‘pledging allegiance’ to one over the other. At a more general level is the troubled relationship between social work as an academic discipline on the one hand and professional training on the other. This creates resistance or reluctance among educators, students, and practitioners toward engaging in research (MacIntyre & Paul, Citation2013, pp. 685–686). Elliott et al. (Citation2013) discuss factors that contribute to an environment in social work that fosters attitudes of reluctance toward learning and teaching statistics in research methods courses. A lack of emphasis on research methods and statistics (in terms of time devoted to them), along with faculty staff’s reluctance toward teaching these courses (due to their own uneasiness with research and statistics), may create an environment that fosters anxiety among social work students (Elliott et al., Citation2013, p. 84). Although the stereotypical reputation of social workers as ‘research reluctant’ (Epstein, Citation1987) has been qualified as too simplistic, research findings suggest that students with less statistical knowledge are more fearful of, or reluctant toward, research courses (Secret et al., Citation2003). Prior studies have shown that social work students report having more research and computer anxiety and find research less important to their profession than other student groups, such as psychology and business students (Green et al., Citation2001). A study of attitudes toward research among social work students in the USA showed that research orientation, referring to the perceived importance and usefulness of research to social work practice, increased students’ belief in the importance of research and decreased their research anxiety. Moreover, age and self-efficacy were associated with increased research interest (Bolin et al., Citation2012). A later study of first-year, bachelor-level social work students in Switzerland, using structural equation modeling (Gredig & Bartelsen-Raemy, Citation2018), showed, contrary to Bolin et al. (Citation2012), that fear of research courses predicted research orientation and interest in research courses. More specifically, the findings showed that higher levels of fear of research courses (worries, concerns, and strains) were associated with lower scores on research orientation (perceived importance of research, usefulness of research for social work practice, and perceived unbiased nature of research), and lower scores on research orientation were associated with less interest in research courses. Moreover, the level of fear of research directly influenced the level of interest in research courses. A recent study by the same authors compared undergraduates in Switzerland and Australia (Gredig et al., Citation2022). The findings showed that in both student groups interest in research courses was predicted by students’ fear of research courses and their research orientation. Fear of research courses was predicted by general self-efficacy and statistics anxiety. Fear of research courses did not, however, predict research orientation, contrary to earlier findings (Gredig & Bartelsen-Raemy, Citation2018).

Research anxiety has often been discussed in the literature in the context of anxiety toward mathematics, statistics, and computers (Bolin et al., Citation2012, p. 226), but also in terms of anxiety about research courses, as illustrated above (Gredig & Bartelsen-Raemy, Citation2018; Gredig et al., Citation2022). In an early exploratory study, Royce and Rompf (Citation1992, p. 270) found that two samples of undergraduates majoring in social work had much higher levels of mathematics anxiety than a cross-section of university undergraduates and had taken fewer mathematics courses in high school and college. Pan and Tang (Citation2005, p. 205) argued that statistics anxiety is not only due to a lack of training or insufficient skills, but is also related to students’ misperceptions about statistics, such as not having sufficient mathematics training to do well in statistics classes and prior negative experiences with statistics courses. Factors contributing to statistics anxiety include fear of mathematics, lack of connection to daily life, pace of instruction, and the attitude of the instructor (Pan & Tang, Citation2005, p. 209). Other studies have similarly linked mathematics anxiety to lack of self-confidence, fear of failure, lack of knowledge, and non-engagement by students (Finlayson, Citation2014, p. 105). Moreover, mathematics anxiety is often linked to teaching styles in the classroom based on traditional delivery methods, whereby instructors provide information that students passively receive and the instructor assumes a directive, authoritative role (Finlayson, Citation2014, p. 101).

Many suggestions have been made about pedagogical methods and principles for teaching research methods and statistics (Epstein, Citation1987; Pan & Tang, Citation2005). Epstein (Citation1987, p. 76) argued for the importance of starting where the students are or, as emphasized by Hattie (Citation2009, p. 238), that teachers should see learning from a student’s perspective. That is, to empathize with and acknowledge any resistance on the students’ behalf and encourage them to feel free to ask questions in class. In addition, students should be allowed to set the pace. Furthermore, assignments should be linked to practical issues and concerns, real-time data should be used, and assignment questions should be partialized and broken down into logical steps (Epstein, Citation1987, pp. 79–80). Moreover, humor should be used as a stress reduction device (Epstein, Citation1987, p. 85). Similarly, a focus group study exploring students’ experiences in a statistics class suggested helpful instructional strategies such as working on real-world problems, applying statistics in a research project, and reinforcing concepts through homework (Pan & Tang, Citation2005, p. 210). A mixed methods study examined, among other things, social work research teachers’ strategies for identifying and managing anxiety (Maschi et al., Citation2013). Such strategies included creating a supportive class climate; emotion-focused activities, such as openly discussing and acknowledging student fears and anxieties; cognitive-focused strategies, such as creating links between research and students’ interests and everyday lives; and action-focused strategies, such as discussing articles or critiquing each other’s work (Maschi et al., Citation2013, p. 810). Other positive experiences reported by Pan and Tang (Citation2005, p. 210) were providing lecture notes in advance of class and flexible availability of assistance from instructors and teaching assistants. In the same vein, an exploratory study by Tonsing (Citation2018, p. 225) showed that increased use of ‘immediacy’ by the teacher, i.e. a set of verbal (e.g. talking with students outside class) and non-verbal behaviors (e.g. smiling) that reflect psychological and physical availability, was associated with reduced levels of student anxiety. Also, Finlayson (Citation2014, p. 101) highlighted ‘constructivist teaching’, in contrast to traditional ways of teaching, which includes, among other things, activities that are interactive and student-centered, whereby students are encouraged to ask questions and pursue their interests, work in groups, and use manipulative materials as primary sources.

Many of these pedagogical strategies point to the importance of the instructor’s role, behavior and attitude being immediate, flexible, and available, fostering a teaching environment that encourages students to ask questions in class, makes efforts to reduce stress, shows empathy, and acknowledges any resistance on the students’ behalf. Furthermore, the instructional strategies also point to active forms of learning and allowing students to work at their own pace, in groups as well as on real data and real problems based on their interests and everyday lives. In the next section, we first present the data and methods used in the empirical analysis presented in this paper. Thereafter we outline our framework and approach to a flipped classroom with active forms of learning statistics that aims to increase quantitative literacy among social work students.

Data and methods

The empirical analysis, presented in the second part of this paper, is based on 3 years of evaluation data from courses in statistical analysis among master’s level graduate students of social work at a Norwegian university. The first two data points (March 2019 and 2020) are from before the COVID-19 pandemic led to lockdowns in Norway, and the final data point was taken during the pandemic (March 2021). A total of 203 students completed the questionnaire, and the response rate was 87% in 2019, 72% in 2020, and 71% in 2021 (see Table A2 in the Appendix for detailed information). These students were in their second semester of the master’s program in social work, child welfare, and social policy. The course is a 10 ECTS course and is the only mandatory course in statistical analysis these students must take. Most of the students have never previously taken a course in statistical analysis. After taking this course, the students may choose to take an elective course providing support for the 15 to 25 percent of each cohort who choose to use statistical analysis in their master’s thesis.

The course evaluation was performed online during the last lecture of the course each year in a fully anonymous set-up, and students were guaranteed total anonymity when providing their answers. Consent was given by taking part in the survey, and the students were informed about the use of the data in research and in the development of the course. The questions used in the course evaluation, presented in Table A1 in the Appendix, were not based on any validated instruments; nevertheless, we argue that they measure important aspects of learning and teaching. The course evaluation form used to generate the data for the empirical analysis was developed as part of the development of the course over time (see all included questions in the Appendix). Parts of the evaluation form consist of questions used in previous course evaluations at the university, modified to fit this course. Other questions were based on our experiences teaching statistical analysis to social work students and were tested in previous years of lecturing in this course. Some of the questions were also inspired by the main pedagogical elements in a flipped classroom approach, presented in the discussion below, namely ‘feed up’, ‘feedback’, and ‘feed forward’.

To identify different dimensions of learning and teaching, we used exploratory factor analysis. More specifically, we used common factor analysis to reduce the number of items before including them as independent variables in the subsequent regression analysis. Based on a reflective measurement model, we identified five dimensions of learning and teaching: ‘the learning environment dimension’, ‘the lecturer dimension’, ‘the traditional student’, ‘the prepared student’, and ‘the student active learning dimension’. In this article, we will present these dimensions and make use of them when analyzing different aspects of the students’ perceptions of learning statistical analysis using a flipped classroom approach.
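
To make this step concrete, the sketch below shows how a five-factor solution of this kind could be obtained in Python. It is a minimal illustration only: the file name, item names, and the use of scikit-learn’s maximum-likelihood factor analysis with varimax rotation are our assumptions and do not reproduce the exact items or software used for the analysis reported here.

```python
# Minimal sketch of the item-reduction step (illustrative assumptions only):
# evaluation items are assumed to be numeric columns of a pandas DataFrame.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("course_evaluation_items.csv")  # hypothetical file name

# Fit a five-factor common factor model (maximum likelihood) with varimax rotation.
fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
fa.fit(items)

# Inspect the loadings to see which items cluster on which dimension.
loadings = pd.DataFrame(
    fa.components_.T,
    index=items.columns,
    columns=["learning_environment", "lecturer", "traditional_student",
             "prepared_student", "student_active_learning"],
)
print(loadings.round(2))
```

Items loading clearly on a factor (roughly above 0.4, as reported in the Results section) would then be carried forward, either combined into an additive index or entered as separate variables in the regression analysis.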

Blended learning with a flipped classroom approach to teaching statistical analysis

Knowledge is not transferred in a loss-free process from teacher to student. Elements must be processed and put together in a way that makes sense for the student. This way of approaching learning is, as we see it, the basis for the principles of ‘blended learning’ with a ‘flipped classroom’ approach. Blended learning can be defined as: ‘ … the thoughtful integration of classroom face-to-face learning experiences with online learning experiences’, through which the strengths of synchronous (face-to-face) and asynchronous (flexible time) learning activities are integrated (Garrison & Kanuka, Citation2004, p. 96). Flipped classroom has been defined by Abeysekera and Dawson (Citation2015, p. 1) as:

… the information transmission component of a traditional face-to-face lecture (‘traditional lecture’) is moved out of class and the learning in-class are active, collaborative tasks. Students prepare for class by engaging with resources that cover what would have been in a traditional lecture. After class they follow up and consolidate their knowledge.

In essence, our understanding of blended learning with a flipped classroom approach follows Hattie (Citation2009, p. 238), who referred to it as ‘A model of visible teaching—visible learning’, whereby ‘teachers see learning through the eyes of the student. When students see themselves as their own teachers’.

Traditional lectures can make a significant impact on the learning process, but combined with more student-active forms of learning and formative assessments, it is possible to reduce the distance between students’ knowledge and the learning objective and to achieve even better learning conditions (Foldnes, Citation2016; Hattie & Timperley, Citation2007; Nicol & Macfarlane-Dick, Citation2006). Combining the strengths of the traditional lecture with more student-active forms of learning, as well as the use of self- and peer-assessment and feedback to increase learning outcomes, forms the foundation of our teaching.

Feed up, feedback, and feed forward—a flipped classroom approach to teaching statistical analysis

The aim of applying blended learning with a flipped classroom approach is to make use of pedagogical measures that support effective learning. The main pedagogical elements of such a teaching program, as we see it, are based on feedback that defines the goal (‘feed up’), providing a clear frame of reference for progress (‘feedback’), and showing the way to future learning opportunities (‘feed forward’). Feedback should also address four different levels of learning: the task, the process, self-regulation, and the self (Hattie & Timperley, Citation2007). The use of feedback for learning has proven very effective in the learning situation (Hattie & Timperley, Citation2007; Hattie, Citation2009).

As shown in Illustration 1 and 2, feed up is about clarifying learning goals and what students should have achieved on completing the course. Feedback sets the framework and enables the individual student to assess their progress and know how well they are doing. Students receive feedback on the tasks they perform based on an ‘expected standard’. Students need to be made aware of what is required, and a joint review of the requirements and examples of good answers is a way of facilitating an understanding of what is expected. Feed forward provides students with an indication of where they can go next and shows the way to greater learning opportunities, and thus greater challenges. This may lead to better self-regulation, access to more strategies, and new ways of working (Hattie & Timperley, Citation2007).

Illustration 1. How “feed up,” “feedback,” and “feed forward” at different levels of learning contribute to effective teaching (Hattie & Timperley, Citation2007).


Illustration 2. How “feed up,” “feedback,” and “feed forward” at different learning levels are implemented in our teaching (Hattie & Timperley, Citation2007).


Statistical analysis is a course that for most students is novel and different from all other courses they have previously attended. The fact that this course is so different, and especially because it requires an understanding of numbers, means that many students bring with them a good dose of statistical anxiety to the introductory lecture, as discussed in the introduction.

Feed up

Nicol and Macfarlane-Dick (Citation2006) emphasized that students can only achieve their learning goals if they understand the goals, gain ownership of them, and can assess their own progression. The first lecture is crucial for setting the tone, i.e. defining the goal and setting the framework for the course (feed up). We aim to establish a dialogue whereby students feel that they can be open about what they find challenging and can be shown that it is possible to master the course and achieve the learning goals. Our experience is that to set the tone and establish this dialogue, it is crucial to meet students at the point they are at in terms of knowledge. This principally implies, as we see it, that we as teachers strive to take the students’ perspective on learning statistical analysis and to enable them to see themselves as their own teachers (Hattie, Citation2009). This role is an important part of what we will later call the ‘lecturer dimension’. More specifically, this is about acknowledging the fact that statistical analysis is perceived as challenging, assuring students that experiencing these challenges is common (‘we’ve been there ourselves’), and outlining the teaching resources available for students to meet the challenges in a constructive way.

The introductory lecture draws on the strengths of the traditional lecture format by setting the framework for the course, i.e. content and progression, expectations in terms of self-effort, and engaging students in the academic work that will take place throughout the course (Ramsden, Citation2003). In the first lecture, we particularly emphasize the teaching plan as an important document governing the learning process of the course, one that the students must relate to in an active manner. The availability of course information and materials, and an intuitive structure of the learning process laid out in the teaching plan, is in our experience crucial for a flipped classroom approach to be successful. This is because we, as teachers, must let go of much of the control when we go from teacher-led to student-led teaching and put more pressure on students to prepare for class. As teachers, we facilitate and provide all the resources necessary to achieve the learning outcomes; however, students must actively engage in their own learning process by making use of all the resources made available to them.

In the upcoming empirical analysis, we will be looking at ‘prepared students’ and ‘traditional students’ as distinct dimensions, based on course evaluation questions. Given that a flipped classroom approach puts more pressure on students to prepare for class, our hypothesis is that prepared students evaluate this approach to learning statistical analysis more favorably than those who are less prepared. As for traditional students, this dimension includes questions meant to capture the opinion of those who feel less enthusiastic about student-led teaching and favor more teacher-led activities.

Another central part of the first lecture, and of the feed up part of the teaching plan, is to present the analysis portfolio that constitutes the main element in our flipped classroom approach to teaching statistical analysis. The analysis portfolio is a document guiding and presenting all the assignments the students work through during the course. In the first part of the portfolio, the students familiarize themselves with the statistical software and the dataset. They start writing an introduction and formulate a research question of interest to guide their analysis. Thereafter, the portfolio consists of various assignments of rising difficulty, starting with univariate analysis and moving on to bivariate and, lastly, multivariate analysis (an illustrative sketch of this progression is given below). Through the process of working with the analysis portfolio, the focus is on discussion and reflection. The students themselves apply the knowledge they have acquired by conducting the practical exercises and interpreting the results.
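
A simple, hypothetical illustration of this step-by-step progression is sketched below: the same dataset is first described one variable at a time, then examined across groups, and finally modeled with several predictors. The dataset, variable names, and use of pandas and statsmodels are assumptions made for illustration; they are not the actual portfolio assignments or software used in the course.

```python
# Illustrative sketch of the portfolio's progression from univariate to
# bivariate to multivariate analysis (hypothetical dataset and variables).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_data.csv")  # placeholder for the course dataset

# Step 1 (univariate): describe the distribution of a single variable.
print(df["life_satisfaction"].describe())

# Step 2 (bivariate): compare the outcome across groups.
print(df.groupby("gender")["life_satisfaction"].mean())

# Step 3 (multivariate): multiple regression with several predictors.
model = smf.ols("life_satisfaction ~ age + income + C(gender)", data=df).fit()
print(model.summary())
```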

The analysis portfolio is based on cooperative learning in the sense that students are encouraged to form groups, work together, and help each other out while working with the analysis portfolio. The students are also supervised in a group setting, allowing those with similar issues to listen in when we explain how to go about solving the task at hand. Through working in groups, taking part in discussions, and peer feedback, the analysis portfolio facilitates cooperative learning, with students working together to achieve their learning goals (Foldnes, Citation2016).

Several studies have indicated that cooperative learning encourages students to put in more effort to achieve learning goals than they would when learning on their own (Hassanien, Citation2007; Roseth et al., Citation2008; Springer et al., Citation1999). However, most of this research was based on studies of children in primary and secondary schools. There have been few empirical studies on the use of cooperative learning to strengthen learning outcomes in higher education (Foldnes, Citation2016; Herrmann, Citation2013). Nevertheless, Foldnes (Citation2016) found a highly significant increase in test and examination score performance for students in the flipped classroom group over the traditional lecture group. The study was based on an RCT design and included first-year undergraduate students at a Norwegian business school who attended statistics and mathematics courses. The author concluded that a flipped classroom approach implemented with cooperative learning is a more effective teaching method than the traditional lecture—homework format.

In our course, the teaching plan and analysis portfolio outline the direction for the students, and the various e-resources made available to them in the form of e-lectures, explanatory videos, and padlets provide them with access to alternative understandings of all the information they must relate to. In this way, they will have a better basis for understanding the specific subject they are working with in the analysis portfolio (Hattie & Timperley, Citation2007).

Feed forward

For the subsequent six lectures, the teaching plan outlines which e-lectures the students must watch in advance of the ordinary lectures. We use the first hour of each three-hour session to provide a short summary with emphasis on the most important elements from the e-lecture that the students have already watched. Hence, during this first hour, we use a traditional form of lecture that gives students the opportunity to ask questions about the content of the e-lecture that is summarized and to clarify ambiguities. The next 2 hours are spent working with the analysis portfolio that covers all parts of the course, from research question and simple descriptive statistics to multiple regression analysis and statistical modeling. The teaching plan outlines which part of the analysis portfolio the students are expected to be working with during a given lecture and the progress that is expected.

The focus now shifts from the assignment to the process, whereby the framework and goal are clarified, and the teaching is designed to support individually adapted learning through work with the analysis portfolio. Facilitating individually adapted learning has been central to the pedagogical approach in the course, enabling a learning situation whereby we as teachers can address the students’ issues in an individual manner. Compared to traditional lectures, where the whole class needs to follow the same pace, working with the analysis portfolio allows students to set the pace according to their own understanding of the subject at hand. Our experience is that this way of teaching significantly reduces stress and the anxiety that students often express about not understanding the subject. In a traditional lecture setting, some students may feel that progress is too slow, some may feel it is too fast, and some may feel it is just about right. In the cooperative learning setting, facilitated by working with the analysis portfolio, those students who feel secure can help those who feel less secure to gain a better understanding by explaining the subject at hand. At the same time, students struggling with a certain part of the analysis portfolio can feel confident about their understanding before moving on. As teachers, we are often invited into the discussion, asked to validate peer-to-peer explanations, and provide answers to other questions arising from the discussion.

Feedback

When we provide feedback on the different tasks included in the analysis portfolio, our approach assumes that it is often more effective to focus feedback on students’ interpretations of information than on missing information (Nicol & Macfarlane-Dick, Citation2006). As Hattie and Timperley (Citation2007, p. 82) put it: ‘To take on this instructional purpose, feedback needs to provide information specifically relating to the task or process of learning that fills a gap between what is understood and what is aimed to be understood’. The focus should therefore be to give hints to the student about how they can assess their own understanding and thus contribute to raising their level of knowledge (Nicol & Macfarlane-Dick, Citation2006). In our experience, such an approach to feedback further facilitates cooperative learning, in the sense that students need to be proactive in seeking, making sense of, and using comments on their performance or approach to learning from their peers and teachers. Thus, working with the analysis portfolio puts a focus on students’ actions in response to performance information from teachers, peers, and their own self-evaluation. This cooperative group work, based on actively asking and answering questions with almost instant feedback, is a powerful learning tool, since the feedback students receive is primarily from fellow students in their group (Foldnes, Citation2016; Winstone & Carless, Citation2020).

Feedback is an important part of the ‘learning environment dimension’ and also of what we will call the ‘student active learning dimension’ in the upcoming empirical analysis of the course evaluation data. The role of the teacher is to support the reflection process, whereby the students themselves find a satisfactory way of solving a task. This enables the students to work at their own pace and get statistical analysis ‘under the skin’ by ‘doing’ and not just ‘hearing’.

We have developed tutorial videos for each part of the analysis portfolio to provide students with an online resource for clarifying questions when working with the portfolio between lectures. These tutorials not only support the students when working with the analysis portfolio, they also provide them with an insight into what is required of them based on the outline of an expected standard for each part of the portfolio (Hattie & Timperley, Citation2007; Nicol & Macfarlane-Dick, Citation2006).

As students work through different parts of the portfolio, we aim to provide them with continuous confirmation of whether they are on the right path, filling in the knowledge gap between what they understand and the learning goals of the course (Hattie & Timperley, Citation2007). Through this dialogue, we as teachers not only provide students with information, but we actively seek to be part of a professional discussion about the feedback they receive from us (Nicol & Macfarlane-Dick, Citation2006). This dialogue is not only important for students’ learning, but also for our ability to continuously adjust the course material and information to be made available to further support the students in their work with the portfolio and achieve their learning goals.

Summing up, feed up, feedback, and feed forward cover important dimensions of teaching and learning based on the flipped classroom approach outlined above. Many of these elements find support in earlier studies and suggestions on pedagogical strategies for teaching statistics in the context of social work, whereby research reluctance and statistical anxiety may be important barriers to overcome, at least for some students. In the next part, we investigate how the students themselves perceive this way of learning statistical analysis. We use the COVID-19 pandemic as a critical test and investigate whether this way of teaching was perceived differently by the 2021 students, who had to attend the course in a wholly online setting during the general lockdown in Norway.

Results

In the empirical analysis of the course evaluation data, we begin by looking at the students’ ‘overall rating of the course’ before moving on to an exploratory factor analysis to identify the dimensions of learning and teaching. Thereafter follows a bivariate correlation analysis to investigate the linear relationship between each of the five dimensions and year. Finally, we perform a linear regression analysis in which we use the students’ overall rating of the course as the dependent variable and the dimensions, in some instances selected items from the dimensions, as well as year as independent variables. Our main purpose in performing these analyses is first and foremost to investigate the importance of the five dimensions of learning and teaching, namely ‘the learning environment dimension’, ‘the lecturer dimension’, ‘the traditional student’, ‘the prepared student’, and ‘the student active learning dimension’, for the students’ ‘overall rating of the course’.

We chose the ‘overall rating of the course’ as the dependent variable because this question covers all aspects of the course, not confining the student to evaluating specific aspects of it. Furthermore, since the evaluation was performed at the end of the course, the students had experienced the course in its totality. Also, we wanted to investigate how the five dimensions, as well as experiencing a fully online teaching situation during the COVID-19 pandemic, were related to the students’ overall experience of the course.

Overall, the students were very satisfied with the course. As shown in Figure 1, almost eight out of ten (76.8%) rated it as ‘very good’ or ‘excellent’ when looking at the results across all years. The 2019 cohort was the most satisfied, with 87.7% responding with ‘very good’ or ‘excellent’, whereas 80.9% of the 2020 cohort provided the same response. Not surprisingly, the 2021 cohort, who took the course during a period of lockdown, rated it much less favorably than the two previous cohorts, with 64.4% responding with ‘very good’ or ‘excellent’ for the ‘overall rating of the course’.

Figure 1. The students’ overall rating of the course by year (N = 198).

Mean: 4.1 (“very good”), Std. dev: .821, Skewness: -.584.

Following the general lockdown, the 2021 course could only take place online over Zoom, which created a significantly different learning environment compared with the two previous years, when all lectures were provided in a physical classroom setting. The flipped classroom approach is, as we have previously emphasized, based on a design whereby students are required to actively engage in their own learning process and make use of the resources available to them. Our experience is that many students find this stressful and demanding at first, but as they continue, they adapt to this way of teaching in an engaged manner. This was strongly underscored by how they rated the course, as shown in Figure 1.

The 2021 cohort experienced a ‘double-burden’ of student activation: not only did they have to adapt to an entirely online student reality, but they also had to manage a course in statistical analysis based on a flipped classroom approach. From our previous knowledge of how demanding a flipped classroom approach is for students to begin with, we were mentally prepared for a semester with many frustrated students, due to the extra burden of not being able to physically meet their teachers or fellow students. We were surprised when 64.4% of the 2021 cohort rated the course as ‘very good’ or ‘excellent’. If we were to add those who responded with ‘good’, a total of 97.3% of that cohort responded positively. Given the fact that this is a course in statistical analysis for master’s students of social work, and the fact that mathematics, statistics, and computer anxiety among students is well documented (Forte, Citation1995; Green et al., Citation2001; Pan & Tang, Citation2005; Royce & Rompf, Citation1992), we believe these results provide a strong argument for a flipped classroom approach to teaching statistical analysis in social work.

With this in mind, we were interested in further investigating the most important factors for succeeding with a flipped classroom approach. Furthermore, we wished to explore if and to what extent the perception of students during the COVID-19 pandemic differed from that of the cohorts of the two previous years. We began with an exploratory factor analysis, which identified five dimensions in the course evaluations with high to acceptable (>0.4) factor loadings, with the exception of one item (‘How many times did you attend class?’), which had a somewhat low loading (.304). Table 1 summarizes the factor loadings for the five dimensions that we have called ‘The lecturer dimension’, ‘The learning environment dimension’, ‘The traditional student’, ‘The prepared student’, and ‘The student active learning dimension’ (see the Appendix for the scree plot).

Table 1. Results from an exploratory factor analysis* – identifying five dimensions.

As shown in Table 1, only ‘The learning environment dimension’ and ‘The lecturer dimension’ have satisfactory internal consistency as measured by Cronbach’s alpha (above .70). The three other dimensions (‘The traditional student’, ‘The prepared student’, and ‘The student active learning dimension’) have somewhat low internal consistency (above .50, but below .70). Consequently, in the bivariate analysis presented below, we included ‘The learning environment dimension’ and ‘The lecturer dimension’ as additive indexes, whereas the most relevant questions in the other dimensions were included as separate variables.
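
For readers unfamiliar with these steps, the sketch below shows one way to compute Cronbach’s alpha for a set of items and, where the alpha is acceptable, to sum the items into an additive index. The item names and example values are hypothetical; only the alpha formula itself is standard.

```python
# Sketch: Cronbach's alpha and an additive index (hypothetical items and values).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical items belonging to the 'learning environment' dimension (1-5 scale).
env_items = pd.DataFrame({
    "objectives_clear":   [4, 5, 3, 4, 5],
    "materials_helpful":  [4, 4, 3, 5, 5],
    "increased_interest": [3, 5, 2, 4, 4],
})

print(f"Cronbach's alpha = {cronbach_alpha(env_items):.2f}")

# If alpha is above roughly .70, the items can be summed into an additive index.
env_index = env_items.sum(axis=1)
```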

Bivariate correlation analysis is useful for investigating the relationship between a dependent variable and each of the independent variables to assess whether and how they co-vary. The results of the analysis are shown in Table 2. The analysis shows that the strongest association is the correlation between the ‘overall rating of the course’ and ‘the learning environment dimension’, with a significant and strongly positive Pearson’s r of 0.708. The results indicate that students who found the learning environment favorable, including those who had more positive perceptions of the course objectives, material, and expectations and an increased interest in statistical analysis, rated the course more highly. The association between the ‘overall rating of the course’ and ‘the lecturer dimension’ is also strong, with a Pearson’s r of 0.647 (cf. Table 2). The association between the ‘overall rating of the course’ and the separate variables of the ‘student active learning’ dimension is moderate for students’ assessment of the organization of the course (Pearson’s r = 0.420) and for working with the analysis portfolio (Pearson’s r = 0.314), showing that students who were positive toward student-active forms of learning tended to rate the course more highly. The questions included in the ‘prepared student dimension’ are also positively and significantly associated with the ‘overall rating of the course’, albeit with somewhat weaker associations (cf. Table 2). The associations between the dependent variable and the separate items in the ‘traditional student’ dimension are not significant (cf. Table 2). The bivariate analysis also shows that students in the 2021 cohort were significantly less satisfied with the course.

Table 2. Bivariate correlations between the five dimensions and year.
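
The correlations reported above can be reproduced in a few lines; the sketch below is purely illustrative, with hypothetical column names standing in for the indexes and items in the evaluation data.

```python
# Sketch: bivariate Pearson correlations between the overall rating and the
# dimension indexes / selected items (hypothetical column names).
import pandas as pd

evals = pd.read_csv("course_evaluations.csv")  # placeholder

cols = ["overall_rating", "learning_environment_index", "lecturer_index",
        "portfolio_assessment", "hours_preparation"]
print(evals[cols].corr(method="pearson").round(3))
```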

In the next step, we analyzed the association between the ‘learning environment dimension’ and the ‘lecturer dimension’, in addition to the separate items in the ‘traditional student dimension’, the ‘prepared student dimension’, and the ‘student active learning dimension’, in a multivariate stepwise regression analysis (Table 3). Using this approach allows us to investigate the net effect of each variable, considering the other variables in the model, and to assess which dimensions of the flipped classroom approach are most important, i.e. have the strongest statistical effect on students’ overall rating of the course.

Table 3. Regression analysis using “overall rating of the course” as dependent variable and the five dimensions and year as independent variables.

The results, presented in Table 3, show that a one standard deviation increase in ‘the learning environment dimension’ is associated with a 0.707 standard deviation increase in the overall rating of the course, on average. However, introducing the lecturer dimension in Model 2 decreases the beta coefficient of ‘the learning environment dimension’ to 0.500, showing that some of the effect of ‘the learning environment dimension’ in Model 1 is explained by ‘the lecturer dimension’. The statistical effects of both ‘the lecturer dimension’ and ‘the learning environment dimension’ hold, with rather small changes, throughout the stepwise analysis, controlling for all the other variables in the model.

In line with the bivariate analysis, the effects of ‘the traditional student dimension’ items are not significant. However, ‘the prepared student dimension’ items have a positive and significant effect on ‘the overall rating of the course’, except for ‘hours spent on preparation for the course’, which has a significant negative effect on the overall rating (Models 4–6). Only one item in ‘the student active learning dimension’, namely the assessment of the learning outcomes of the flipped classroom approach compared to more traditional ways of lecturing, is significantly and positively related to ‘the overall rating of the course’, whereby a one standard deviation change increases the overall rating of the course by 0.117 standard deviations in Model 6, controlling for all other variables in the model (cf. Table 3).

The dummy year variables, comparing the students’ overall rating of the course in 2019 with the 2020 class and the 2021 pandemic cohort, are not significant, indicating that the year in which the students took the course has no significant effect on ‘the overall rating of the course’. As shown in Table 3, the dimensions and items included in the analysis explain a very high share of the variation in ‘the overall rating of the course’, from an R2 of 0.500 in Model 1 to 0.603 in Model 6. As shown in Model 2, ‘the learning environment dimension’ and ‘the lecturer dimension’ explain a total of 55.2% of the variation in ‘the overall rating of the course’. Even though the exploratory factor analysis identified ‘the learning environment dimension’ and ‘the lecturer dimension’ as two distinct dimensions in the course evaluation data, these two dimensions greatly overlap, with a bivariate correlation of .666 (as shown in Table 2).
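
As a rough illustration of the modeling strategy described above, the sketch below estimates a sequence of OLS models with standardized (beta) coefficients and year dummies. The variable names, the CSV file, and the use of statsmodels are our assumptions; the published models were estimated on the actual evaluation data and may have used different software.

```python
# Sketch of stepwise OLS models with standardized coefficients and year dummies
# (hypothetical variable names; 2019 as the reference year).
import pandas as pd
import statsmodels.formula.api as smf

evals = pd.read_csv("course_evaluations.csv")  # placeholder

# Standardize continuous variables so coefficients are beta weights
# (SD change in the outcome per SD change in the predictor).
num_cols = ["overall_rating", "learning_environment_index", "lecturer_index",
            "hours_preparation"]
evals[num_cols] = (evals[num_cols] - evals[num_cols].mean()) / evals[num_cols].std()

m1 = smf.ols("overall_rating ~ learning_environment_index", data=evals).fit()
m2 = smf.ols("overall_rating ~ learning_environment_index + lecturer_index",
             data=evals).fit()
m_full = smf.ols("overall_rating ~ learning_environment_index + lecturer_index"
                 " + hours_preparation + C(year, Treatment(reference=2019))",
                 data=evals).fit()

for name, m in [("Model 1", m1), ("Model 2", m2), ("Full model", m_full)]:
    print(name, "R2 =", round(m.rsquared, 3))
    print(m.params.round(3))
```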

Concluding discussion – social work and statistical analysis: a friendship in the making?

We have applied blended learning with a flipped classroom approach with the objective of making use of pedagogical measures that support effective learning. As previously underscored, the main pedagogical elements in such a teaching program, as we see it, are based on feedback that defines the goals (feed up), providing a clear frame of reference for progress (feedback), and showing the way to further learning opportunities (feed forward). In this article, we have presented the implementation and use of these pedagogical elements in our teaching and furthermore investigated students’ perception of learning statistical analysis using a flipped classroom approach.

The results from the empirical analysis, based on 3 years of evaluation data from our course in statistical analysis for master’s level graduate students of social work, show that the ‘learning environment dimension’ and the ‘lecturer dimension’ are especially important for explaining the students’ ‘overall rating of the course’. The ‘learning environment dimension’ includes items measuring important aspects of feed up (‘the course objectives were clear’), feedback (‘I think the e-lectures and analysis portfolio instruction videos on YouTube were very helpful and instructive’), and feed forward (‘the course increased my interest in quantitative methods/statistical analysis’).

In our experience, students finding that ‘the course textbooks were clear and well written’ is important when they try to grasp the goals (feed up) and serves as yet another resource for understanding and evaluating their work with the analysis portfolio (feedback). Furthermore, when the students find that ‘the course exceeded my initial expectations’, they are far more open to the possibility of exploring further learning opportunities (feed forward), including using statistical analysis in their master’s thesis. As teachers, we undoubtedly play a key role in facilitating a fruitful learning environment by defining the goals (feed up), providing a clear frame of reference for progress (feedback), and showing the way to greater learning opportunities (feed forward). When students find that ‘the lecturer demonstrated in-depth knowledge of the subject’, our interpretation is that they trust our ability to guide and help them overcome barriers that might otherwise cause stress and statistical anxiety.

The importance of the ‘lecturer dimension’ demonstrates that the behavior and attitude of the teacher are vital in terms of overall satisfaction with the course. Students who found that “the lecturer showed real commitment to the students’ learning” and that ‘the lecturer was enthusiastic about the course’ awarded a higher rating for the course. Furthermore, the ‘lecturer dimension’ is undoubtedly an important part of feed up in our flipped classroom approach, through which we as teachers try to take a student’s perspective on learning statistical analysis (‘the lecturer encouraged feedback from the class’) and communicate in a way that enables the students to see themselves as their own teachers (Hattie, Citation2009). These results also support earlier findings underscoring the importance of teachers being flexible and available to the students (Pan & Tang, Citation2005) and perhaps teacher immediacy (Tonsing, Citation2018).

Due to their lack of experience with statistical analysis, and possibly even the prevalence of statistical anxiety, meeting the students where they are seems especially important when teaching statistical analysis to master’s students of social work (Epstein, Citation1987). From our point of view, that is to empathize with the students, acknowledge any resistance they may experience, and encourage them to ask questions, which is also in line with the emotion-focused strategies suggested by Maschi et al. (Citation2013).

One of the items in the ‘student active learning dimension’ is based on the hypothetical question: ‘I think I learn more from the flipped classroom organization of the course (combination of e-lectures, short lectures, and doing the analysis portfolio in class) than I would by only listening to the lecturer for the whole lecture.’ It is not a surprise that students who are convinced of the flipped classroom approach also rate the course significantly more highly. In our experience, making students believe in this approach to teaching and learning is crucial for the success of the teaching program, seeing that it is student-centered and based on student activity (Finlayson, Citation2014, p. 101; Maschi et al., Citation2013; Ramsden, Citation2003).

The items that are part of the ‘traditional student dimension’, namely assessments about whether the course was based too much on student active forms of teaching and preference for more traditional lecturing, are not significantly related to the ‘overall rating of the course’, providing additional support for a flipped classroom approach to teaching statistical analysis.

The flipped classroom approach is more demanding of the students in terms of preparing for class. The negative effect of hours spent preparing for the course may suggest that some students, perhaps those who struggled the most to grasp the content of the course despite their preparation, did not manage to reach their aspirations in terms of learning outcomes. An alternative explanation is that some of the strongest students, perhaps those with some previous experience of statistical analysis (who prepared a lot), felt that their progression in the course was not satisfactory. For the first group of students, the level of difficulty of the course might have been too high; conversely, for the second group, it might have been too low, a point well worth considering in future adaptations of the course. Nonetheless, the findings also suggest that watching e-lectures, as well as reading the syllabus, was associated with higher course ratings, indicating that preparations beforehand are an important part of the students’ feed forward process.

Finally, our findings show that the students who took the course during the COVID-19 pandemic in 2021 did not rate the course significantly lower than those who took it in 2019 and 2020. These results are somewhat surprising, given that the 2021 cohort was taught entirely online through Zoom, creating a notably different learning environment and what we call a ‘double-burden’ of student activation. We expected that the students’ evaluation of the course would be influenced by some form of digital ‘tiredness’, which we cannot rule out, of course. Nonetheless, this finding suggests that the course was especially suitable for the digital reorganization that was imposed on all teaching due to infection control measures taken by the Norwegian government. Most of the resources had been developed beforehand (e-lectures, instructional videos, and the analysis portfolio), while others, such as digital blackboards (i.e. padlets) and grouping students into Zoom breakout rooms to work with the analysis portfolio, were developed along the way; these provided an opportunity to maintain the important dialogue-based aspects of our approach to a flipped classroom. During class, we teachers moved from one breakout room to the next to provide feedback and to discuss and reflect with students on their work with the analysis portfolio. In sum, we were able to carry out the course in a similar way to previous years, despite only meeting the students online.

Based on our experience with this form of teaching, supported by the findings from the evaluation data, we argue that a flipped classroom approach to teaching statistical analysis may be a means of changing the apparently problematic position of quantitative research methods in social work. Considering that traditional lectures are not adequate for learning and practicing skills, in addition to earlier findings on the prevalence of mathematics and statistics anxiety among social work students (Forte, Citation1995; Green et al., Citation2001; Pan & Tang, Citation2005; Royce & Rompf, Citation1992), a flipped classroom approach, such as the one outlined in this article, consists of important elements that may enhance learning outcomes. These elements are, as outlined above, feed up (defining goals), feedback (providing a clear frame of reference), and feed forward (showing the way to greater learning opportunities). Enhancing learning outcomes in statistical analysis may be important for attitudes toward research in general, and quantitative research in particular, among social workers. Perhaps statistical analysis and social work can be a friendship in the making?

Limitations

The analysis is restricted to 3 years of course evaluations from one Norwegian university; thus, the external validity of the findings may be limited. The course evaluation form is not based on validated instruments.

Acknowledgements

We would like to thank our colleagues for valuable discussions and comments on earlier drafts of the paper and express our gratitude to the two anonymous referees for useful and inspiring comments and suggestions for changes.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Due to the nature of this research, participants of this study did not agree for their data to be shared publicly, so supporting data is not available.

References

  • Abeysekera, L., & Dawson, P. (2015). Motivation and cognitive load in the flipped classroom: Definition, rationale and a call for research. Higher Education Research and Development, 34(1), 1–14. https://doi.org/10.1080/07294360.2014.934336
  • Bligh, D. A. (2000). What’s the use of lectures? (Vol. 30). Jossey-Bass Inc.
  • Bolin, B. L., Lee, K. H., GlenMaye, L. F., & Yoon, D. P. (2012). Impact of research orientation on attitudes toward research of social work students. Journal of Social Work Education, 48(2), 223–243. https://doi.org/10.5175/JSWE.2012.200900120
  • Elliott, W., Choi, E., & Friedline, T. (2013). Online statistics labs in MSW research methods courses: Reducing reluctance toward statistics. Journal of Social Work Education, 49(1), 81–95. https://doi.org/10.1080/10437797.2013.755095
  • Epstein, I. (1987). Pedagogy of the perturbed: Teaching research to the reluctants. Journal of Teaching in Social Work, 1(1), 71–89. https://doi.org/10.1300/J067v01n01_06
  • Finlayson, M. (2014). Addressing math anxiety in the classroom. Improving Schools, 17(1), 99–115. https://doi.org/10.1177/1365480214521457
  • Foldnes, N. (2016). The flipped classroom and cooperative learning: Evidence from a randomised experiment. Active Learning in Higher Education, 17(1), 39–49. https://doi.org/10.1177/1469787415616726
  • Foldnes, N. (2017). The impact of class attendance on student learning in a flipped classroom. Nordic Journal of Digital Literacy, 12(1–02), 8–18. https://doi.org/10.18261/issn.1891-943x-2017-01-02-02
  • Forte, J. A. (1995). Teaching statistics without sadistics. Journal of Social Work Education, 31(2), 204–218. https://doi.org/10.1080/10437797.1995.10672258
  • Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education, 7(2), 95–105. https://doi.org/10.1016/j.iheduc.2004.02.001
  • Gleeson, H., Pezzella, A., & Rahman, N. (2021). Learning to become evidence based social workers: Student views on research education and implementation in practice. Social Work Education, 1–17. https://doi.org/10.1080/02615479.2021.1965113
  • Gredig, D., & Bartelsen-Raemy, A. (2018). Exploring social work students’ attitudes toward research courses: Predictors of interest in research-related courses among first year students enrolled in a bachelor’s programme in Switzerland. Social Work Education, 37(2), 190–208. https://doi.org/10.1080/02615479.2017.1389880
  • Gredig, D., Heinsch, M., & Bartelsen-Raemy, A. (2022). Exploring social work students’ attitudes toward research courses: Comparing students in Australia and Switzerland. Social Work Education, 41(4), 451–471. https://doi.org/10.1080/02615479.2020.1849086
  • Green, R. G., Bretzin, A., Leininger, C., & Stauffer, R. (2001). Research learning attributes of graduate students in social work, psychology, and business. Journal of Social Work Education, 37(2), 333. https://doi.org/10.1080/10437797.2001.10779058
  • Hassanien, A. (2007). A qualitative student evaluation of group learning in higher education. Higher Education in Europe, 32(2–3), 135–150. https://doi.org/10.1080/03797720701840633
  • Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge. https://doi.org/10.4324/9780203887332
  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487
  • Herrmann, K. J. (2013). The impact of cooperative learning on student engagement: Results from an intervention. Active Learning in Higher Education, 14(3), 175–187. https://doi.org/10.1177/1469787413498035
  • Jacobsson, K., & Meeuwisse, A. (2020). ‘State governing of knowledge’ – constraining social work research and practice. European Journal of Social Work, 23(2), 277–289. https://doi.org/10.1080/13691457.2018.1530642
  • MacIntyre, G., & Paul, S. (2013). Teaching research in social work: Capacity and challenge. British Journal of Social Work, 43(4), 685–702. https://doi.org/10.1093/bjsw/bcs010
  • Maschi, T., Wells, M., Yoder Slater, G., MacMillan, T., & Ristow, J. (2013). Social work students’ research-related anxiety and self-efficacy: Research instructors’ perceptions and teaching innovations. Social Work Education, 32(6), 800–817. https://doi.org/10.1080/02615479.2012.695343
  • Nicol, D. J., & Macfarlane Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090
  • Pan, W., & Tang, M. (2005). Students’ perceptions on factors of statistics anxiety and instructional strategies. Journal of Instructional Psychology, 32(3), 205.
  • Ramsden, P. (2003). Learning to teach in higher education (2nd ed.). RoutledgeFalmer. https://doi.org/10.4324/9780203507711
  • Roseth, C. J., Johnson, D. W., & Johnson, R. T. (2008). Promoting early adolescents’ achievement and peer relationships: The effects of cooperative, competitive, and individualistic goal structures. Psychological Bulletin, 134(2), 223–246. https://doi.org/10.1037/0033-2909.134.2.223
  • Royce, D., & Rompf, E. L. (1992). Math anxiety: A comparison of social work and non-social work students. Journal of Social Work Education, 28(3), 270–277. https://doi.org/10.1080/10437797.1992.10778780
  • Secret, M., Ford, J., & Rompf, E. L. (2003). Undergraduate research courses: A closer look reveals complex social work student attitudes. Journal of Social Work Education, 39(3), 411–422. https://doi.org/10.1080/10437797.2003.10779146
  • Sheppard, M. (2016). The nature and extent of quantitative research in social work: A ten-year study of publications in social work journals. British Journal of Social Work, 46(6), 1520–1536. https://doi.org/10.1093/bjsw/bcv084
  • Skodvin, A. (2016). Fra kateter og kaos? Forelesning i forskjellige varianter [From the lectern to chaos? Lecturing in varying ways]. In H. I. Strømsø, K. H. L. Lycke, & P. Lauvås (Eds.), Når læring er det viktigste: Undervisning i høyere utdanning [When learning is what matters most: Teaching in higher education] (pp. 141–154). Cappelen Damm Akademisk.
  • Springer, L., Stanne, M. E., & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research, 69(1), 21–51. https://doi.org/10.3102/00346543069001021
  • Steen-Utheim, A. T., & Foldnes, N. (2018). A qualitative investigation of student engagement in a flipped classroom. Teaching in Higher Education, 23(3), 307–324. https://doi.org/10.1080/13562517.2017.1379481
  • Taylor, B. J., & Sharland, E. (2015). The creation of the European social work research association. Research on Social Work Practice, 25(5), 623–627. https://doi.org/10.1177/1049731514558686
  • Tonsing, K. N. (2018). Instructor immediacy and statistics anxiety in social work undergraduate students. Social Work Education, 37(2), 223–233. https://doi.org/10.1080/02615479.2017.1395009
  • Winstone, N., & Carless, D. (2020). Designing effective feedback processes in higher education: A learning-focused approach (1st ed.). Routledge. https://doi.org/10.4324/9781351115940

Appendix

Table A1. Questions asked in the course evaluations.

Table A2. Response rate.