Original article

Asking students how to best teach statistics virtually: results of focus group discussions

Article: 2347628 | Received 28 Feb 2023, Accepted 18 Apr 2024, Published online: 12 May 2024

ABSTRACT

Objective

The unique circumstances of COVID-19 have raised questions of best practices for how to teach statistics virtually. The present study evaluated which characteristics of statistics learning activities in the virtual environment increase undergraduate psychology students’ (a) engagement, (b) satisfaction, and (c) knowledge of statistics.

Method

Semi-structured focus groups (N = 13 participants, aged 21–58, 64.3% female, 28.6% male, 7.1% gender fluid) were conducted. The focus group conversations were transcribed and analysed using reflexive thematic analysis.

Results

The findings indicate three main themes: catering to/supporting external students, why learning statistics online does not work, and how to make teaching and learning statistics online work. Social norms that prioritise time efficiency over engagement and interaction mean that the immediacy of learning and engaging with an instructor is often lost virtually. Suggestions for how to improve teaching and learning statistics online centred on readily accessible content, statistical software access, a teaching delivery that is methodical and flexible, immediacy of instructor response, and prioritising student-student and student-instructor connection.

Conclusion

These findings illustrate key challenges specific to teaching and learning statistics virtually, with real implications for how to better design and implement related curriculum.

Key Points

What is already known about this topic:

  1. Students experience statistics and research methods anxiety and avoidance.

  2. The switch to emergency online learning during COVID-19 was challenging for educators and students.

  3. Difficulties occur in implementing active learning for statistics teaching in an online environment.

What this topic adds:

  1. Learning statistics online is considerably different to what is traditionally experienced on-campus. Those instructing these subjects need to be aware of the unique challenges faced in teaching and learning statistics content online.

  2. Educators must strive to counteract social norms that prioritise time efficiency over engagement and interaction. Genuine communication opportunities amongst staff and students will support student engagement, satisfaction, and confidence.

  3. When teaching statistics subjects online, content should be flexible and accessible, so that students can engage with it anytime and anywhere. Home access to statistical software and a software “how to” guide are essential for psychology students when studying statistics subjects online.

COVID-19 massively disrupted teaching and learning at the tertiary level (Maher et al., Citation2023; Müller et al., Citation2021). To reduce the spread of the virus, universities postponed on-campus activity, swiftly moving to the online environment (Müller et al., Citation2021). Making use of online learning models in response to an unforeseen emergency event is referred to as “emergency online learning” (EOL; Müller et al., Citation2021). Due to the suddenness of COVID-19, educators had minimal opportunities to prepare online learning materials, impacting the quality of subject delivery and content and educators’ ability to support students (Haardörfer & Livingston, Citation2021; Müller et al., Citation2021). The sudden move to the online environment was especially challenging for statistics educators (Maher et al., Citation2023) who faced practical issues such as student access to (often pay-walled) statistical software and poor internet quality (Haardörfer & Livingston, Citation2021). Additionally, active learning practices – which have been shown to increase student engagement and confidence when learning statistics (Allen & Baughman, Citation2016) – are difficult to replicate in the online environment, particularly under short notice (Müller et al., Citation2021).

The move to online learning was equally challenging for students, who experienced increased anxiety and depression (Elmer & Stadtfeld, Citation2020; Mantasiah et al., Citation2021), due to a reduction in in-person socialisation and difficulties in navigating online methods of communication (Alomyan, Citation2021; Dzakiria et al., Citation2005). Additionally, students experienced difficulty concentrating and feelings of lethargy (Alomyan, Citation2021).

Students consider statistics and research methods subjects difficult even when delivered in person. “Statistics anxiety” – a feeling of anxiousness, nervousness and a lack of confidence – is commonly reported among students (Earley, Citation2014; Murtonen, Citation2005). Outside of those considering a career in research, students often perceive statistics and research methods as irrelevant to their professional or clinical careers (Counsell et al., Citation2016; Strohmetz et al., Citation2023), and such perceptions often result in avoidance, disengagement, and a lack of interest in learning (Counsell et al., Citation2016; Earley, Citation2014; Lloyd-Lewis et al., Citation2024). Nonetheless, the ability to consult and critically evaluate psychological research is fundamental to operating as an “evidence-based practitioner”, and, thus, a core competency for psychology graduates as outlined by the International Declaration on Core Competences in Professional Psychology (International Association of Applied Psychology and International Union of Psychological Science, Citation2016). Indeed, in Australia, research methods and statistics subjects are considered a key component of undergraduate programmes by accrediting bodies (Australian Psychology Accreditation Council [APAC], Citation2019).

Successful implementation of online teaching involves creative uses of technologies and consideration of more than just the practicalities in conveying information (Brown et al., Citation2022; Pasco Dalla Porta & Ponce Regalado, Citation2021). It requires the creation of an online space that is friendly, informative and authentic (Brown et al., Citation2022). Research suggests that effectively teaching statistics online requires a user-friendly platform, videos and multimedia resources, self-assessment opportunities, shared learning spaces, and communication between teacher and students (Pasco Dalla Porta & Ponce Regalado, Citation2021). Content and relationships between concepts should be clearly outlined and a balance needs to be achieved with academic load (Pasco Dalla Porta & Ponce Regalado, Citation2021).

Given the differences between teaching online and on-campus, the challenges experienced by teaching staff, and the difficulties encountered by students, the current study aimed to explore psychology students’ experiences in learning statistics online. Specifically, it aimed to explore and identify factors that promote and hinder student engagement, confidence, and satisfaction when learning statistics online.

Method

Design

We used a qualitative design involving a series of focus groups to explore students’ experiences undertaking tertiary-level statistics subjects via an online modality. The use of focus groups is a well-established method for qualitative research and has been used with thematic analysis to explore the lived experiences of participants (Braun & Clarke, Citation2013, Citation2019). Ethical approval was granted by James Cook University’s Human Research Ethics Committee (#H8417).

Participants

Fourteen participants completed an expression of interest for the study; however, only 13 took part. Participants were aged 21–58 years (M = 35.79, SD = 11.39) and the majority (64.3%) identified as female (28.6% identified as male and 7.1% as gender fluid). All participants resided in Australia and had completed at least one university-level psychology statistics subject. Recruitment involved announcements made via university subject forums and public posts on Facebook pages belonging to student-led, tertiary psychology groups in Australia. Recruitment and data collection occurred between September and November 2021. Participants were provided a $25 (AUD) gift voucher for their participation.

Materials and procedure

In response to advertisements, respondents contacted the first author and consented to participate via an online survey. When consenting, participants reported their gender, age, and country of residence. Participants were then able to select a preferred focus group time.

The first author, who had no prior relationships with the participants, conducted five focus groups, consisting of up to four participants, online using Zoom. The first author first thanked participants for their time and reaffirmed participants’ consent verbally after a brief overview of the study. A semi-structured interview guide was used to elicit discussion of experiences relating to learning statistics online (see Appendix). At the conclusion of the session, participants were thanked for their time. Focus groups lasted 48–82 minutes, were audio recorded, and transcribed. Identifying data was removed and participants were given pseudonyms to ensure confidentiality during data analysis.

Data analysis

Thematic analysis (conducted using Microsoft Excel) was used to analyse the data, following an inductive, realist approach in which participants’ experiences were interpreted as accurate reflections of their realities (Braun & Clarke, Citation2006; Pickens & Braun, Citation2018). Codes were data-driven rather than fit to pre-existing frameworks (Braun & Clarke, Citation2013). The process involved familiarisation with the data through reading and re-reading transcripts, followed by coding features of the data relevant to the online learning environment (Braun & Clarke, Citation2013; Connelly & Peltzer, Citation2016). Examples of semantic codes included: “Visual connection to speaker is important to learning” and “Ability to rewatch lectures/demonstrations at own pace is beneficial”. Initial coding and theme development were conducted by the first author, followed by regular peer dialogue (Smith & McGannon, Citation2018) with the second author to review theme construction. Themes were then named and defined to outline the overall story of the analysis, and a final analysis report was produced (Braun & Clarke, Citation2006) following consultation with the third author, who provided a “fresh set of eyes” to review the analysis. The extracts presented are reproduced “as spoken”, with bracketed words added to improve readability. Extracts from different participants were selected to illustrate themes.

Results

Three main themes were generated from the accounts of students’ experiences: catering to/supporting external students; why it does not work; and how to make it work. Descriptions of these themes and their sub-themes are displayed in Table 1.

Table 1. Description of the resulting themes and sub-themes.

Catering to/supporting external students¹

Given that external students were more likely to study part-time (to allow for other life responsibilities), they often experienced longer periods between taking statistics subjects. Due to the length and depth of lecture material, external students were likely to search outside sources (e.g., YouTube) for clarification of concepts, as these delivered content in a concise and engaging way.

like the rest of us, we’ve got other things that take priority or other things that push their way up to the top. When you’re trying to work out the eta square, and you get “mum, can you take me to dancing” and you’re like, good, I didn’t want to do it anyway. So you just put it … on ice. Hey, and then three days later, you pick it up again because you got the assessment and you’re stressing because you haven’t put the time in because you don’t have the tools, the material, the understanding to give it what it needs.

We [part-time students] have taken a few more years to get where we are and we’ve still got a couple more years to go. Something that [the lecturer] kept doing throughout the unit was he would say, look, can you look back on your notes from this unit, or I’m not going to cover this on the basis that I am assuming you have retained basic knowledge of what you’ve learned in such and such a unit. For me and my mate, we were like this is something we did three years ago. We certainly haven’t kept our notebooks on that unit thinking this lecturer was going to tell us he’s not going to cover this topic because we already know that.

The lectures are too long. I don’t have [time]. And he would have them from four to six. So I’ve got kids that have got dancing, football, seventeen hundred different things that. And you know what? I’m the one who’s made the decision to go to uni and all that sort of stuff, but it’s supposed to be for distance ed. So make it distance ed and don’t have your lecture 11 o’clock during the day because when you’re working, you can’t. I can’t do any. I can’t tune in.

Why it does not work

Social norm of efficiency

Interrupting a live lecture through audio or video was considered daunting, rude, or embarrassing. More than that, participants felt that, in the online environment, time efficiency was valued over learning; thus, asking questions was seen as imposing on the lecturer’s and other students’ time. Interruptions were expected to be kept to a minimum to keep the lecture moving along. From participants’ accounts, it was clear that the online lecture environment was perceived to encourage passive engagement from both lecturers and students, rather than interactive engagement.

I ended up giving up asking questions … It just felt like, for some reason, online it almost felt like nagging to interject again.

I don’t know if this is an all-round Zoom thing, but he had a thing that would come up that would say meeting ending in 30 minutes, and he’d make a note, oh, we’ve got 30 minutes left, and then meeting ending in five minutes. Oh, we’ve got five minutes left. Almost like he was counting down as well. So, it was kind of - it was almost like do I ask a question and get an understanding of this, or do I let him get through the whole lecture?

It just seemed really awkward to ask him to stop and go back, which I think part of was also to do with some people didn’t have their cameras on and some people did, and some people you’d be talking and someone might accidentally jump in on top of you, and it sort of gets awkward as to where you were and if you would continue or they would continue. … . Whereas in an actual classroom, it’s more - it seemed like there was more directness of you could sort of go back ask the lecturer to restate that, whereas online, it just felt like, okay, we’re moving on.

I’ve got no problem with [asking a question verbally], but it just seemed more - yeah, like there was more of a disconnect to him sort of being bothered to fully answer the question, or even take note of the question. Whereas in class, you could almost pull him up on jumping over the question.

Limits immediacy

Compared with live (online or on-campus) lectures, where students could ask a question during or after the session, participants considered it more difficult to interact with lecturers online. Recorded lectures meant students needed to remember the part of the lecture the question related to and ask it later, relying on the lecturer remembering that specific point and understanding the question. Additionally, students felt that live, online lectures had a time limit, and that lecturers were often unwilling to extend online sessions (disconnecting promptly at the end), making it difficult for students to ask further questions. Given the challenge of speaking up in the first place, if lecturers were dismissive or unaware of the challenges posed by the online environment, participants were unlikely to indicate further that they had not fully grasped an answer being provided. Students frequently commented that emails or discussion board posts were left unanswered, or that lecturers provided a vague response pointing to prior knowledge.

Participants also felt tutorials transferred less effectively to the online environment. Attending online tutorials did not permit immediate access to a tutor or sharing learning experiences with the person sitting next to you (e.g., getting help with, or chatting about, the learning activities). It is these elements of in-person tutorials, as well as the instructor’s ability to see what students are doing, that are challenging to replicate online.

I think it’s an instant opportunity for one-on-one feedback that you have someone there who can come over directly look at what you’re engaging in at that time and immediately giving you that feedback, which you just can’t get in an online setting that tutor or lecturer can’t approach each individual’s analysis or what they’re currently working on as quickly or effectively …

One thing I found very difficult is the lack of engagement when it’s pre-recorded lectures. I’m one of these people that if I don’t understand something or I want clarification, I want to ask then and there to get it sorted out. Nine times out of ten, you’re not the only one that doesn’t understand it. … . So, you couldn’t ask anything in the [pre-recorded] lecture if you were stuck you had to try and remember the time on the video and then hope and pray that the lecturer would actually know what you were talking about when you sent an email.

Whereas in [an in-person] class, it feels like they’re just - you can stay after class if you need to or if you can’t, if that lecturer has got another class immediately after that one, or the classroom is being used, whatever, on campus, that lecturer was more than happy to say, oh, we can go and talk outside for a few minutes or you can come and see me in my office at such and such a time. Whereas online, it was like trying to catch fog sort of scenario, trying to pin him down to ask a question or get clarification on something.

How to make it work

Flexible and accessible content

Given the depth of content in statistics subjects (where statistical concepts from previous subjects are often embedded as assumed knowledge), students valued the ability to interact with lecture content in a way that suited them individually. Students felt short and concise videos were easier to navigate, digest, and watch in between other life responsibilities.

… short videos that you can stop and rewind and specific, so it’s on what is it? Eta square, how to find the eta square. It’s there. It’s three minutes or seven minutes. You look at it, they show you, they circle it, they show you. They write it out and I got it.

Recordings could also be rewatched and slowed to support student understanding.

It also helped that their lectures were recorded. When you’re in the lecture theatre, it’s not recorded sometimes, it’s going quite fast, and you can’t take down notes. And so, actually going back over what’s being said and being able to watch those lectures and get your head around it that way, then you know, not having, to kind of quickly write down notes, I think, was really helpful …

Comments concerning accessibility and flexibility extended to tutorial content as well. Flexibility to review tutorial activities before or after the tutorial itself was preferred.

Communication and connection

In addition to the ability to watch recorded lectures, students also valued the ability to attend live, online lectures. Students felt that live lectures allowed the student–lecturer interaction that naturally occurs during on-campus, in-person lectures, fostering a sense of involvement and engagement.

So I’m listening to a pre-recorded lecture, which I wondered whether that was half the problem because a lot of them were just pre-recorded. And I like the, for me, I love the interaction.

I did really like one of my lecturers … he’d do late night lectures which was really great. So, we’d all group up, have a glass of wine and that was quite good. It was quite interactive and lively discussions. It was relaxed and we’re all post-grad students. We’re all working, and we all had kids, so having a lecture late at night where we could just sit down, have a glass of wine and go, right, what are we learning? I really enjoyed that.

Students preferred lectures and tutorials that were designed to be interactive. For instance, the use of check-in points throughout the lecture creates opportunities for students to test their knowledge and provides time for them to ask questions if they do not understand content. If students have access to statistical software at home, tutorials can provide the opportunity to conduct a statistical analysis by following the tutor’s demonstration. This provides students with opportunities to ask questions along the way.

I do find it very helpful with doing online lectures, where it is quite difficult to engage because you’re at home, whatever the environment isn’t that academic environment. Like [my lecturer’s] lectures this year for our stats subject, she would like periodically do kind of like a check your knowledge, where we would use like on collaborate, we just select to like a multiple choice and just to kind of keep you engaged throughout the session which would usually just be talking and just listening and trying to pay attention. I found that quite helpful to stay engaged.

And my lecturer, he had like mini [examples] that we could well, we’d have it’s open “spazz” is what I used to call it [SPSS]. We have the program and he’d sit there and tell us, you know, how to go through it and run our own data, like while he was talking to us. Like, I found that quite engaging.

Students suggested utilising available, online communication tools. For instance, the “Chat box” function provides a non-intrusive way for students to communicate with lecturers during live lectures and tutorials. This allows students to ask questions when they are watching from locations that are not ideal for video participation (or when they are still in their pyjamas). However, lecturers need to remember to read out each question, to ensure clarity about what is being asked for students watching/listening to recordings (where the chat box may be absent).

Getting a question in the chat is very easy and comfortable.

I’m more likely to ask a question in the online chat than I am, to verbally talk.

Sometimes it is because you can type it and other people will go, yeah, I don’t get that either. But the problem is, it’s not recorded, so even though they record the tutorial, or they record the lecture or whatever, even if it’s live and you go back to watch it later, it goes I’m getting all these comments coming in, but you don’t know what the comments are. You can’t see them. It’s only if the lecturer or tutor reads it out that you have an idea of what someone asks. That’s the only downside of that. I’m pretty sure they don’t do recordings of tutes now. [My lecturer] did last year when they went online which helped a lot, but he did his on Collaborate and Blackboard, but a lot do them on Zoom and they don’t record them on Zoom.

Having a lecturer who was accessible to students was key for student satisfaction in the online environment. Since not all students were able to attend live lectures, being able to post questions to a discussion board and receive timely responses, or engage with teaching staff during a dedicated “question and answer” time, was highly valued.

I’m okay if they don’t have [class] live and it’s pre-recorded, if they’re really accessible to answer questions. I’ve found, you know, when they’re really accessible, you can write a question and an hour later you’ve got a thorough answer, fantastic. Or you can see that someone else has asked a question and they’ve got a thorough answer, fantastic.

I’ve even had lecturers that are happy for you to find them and discuss stuff. That is the most invaluable thing ever. I don’t mind if it’s online because then I can watch it at nine o’clock at night on a Thursday if that’s the only time I get. So, I enjoy that flexibility, but you still have to have that human contact … . If we used to be able to walk into lecture halls and talk to the teacher after for five minutes, then why can’t we still do the same thing now? I appreciate the lecturers that are available on email and phone and are timely because it makes a world of difference.

… she was happy to go back over the same thing as many times as was needed to get it and she would do her best to reword it in different ways to try and get you to understand it, and she would take her time to sit down with you, was really willing to make that personal connection and make sure that you were comfortable with your understanding of whatever the concept was.

The ability to connect with other students also increased student enjoyment and satisfaction. Students valued the use of subject discussion boards or Facebook groups, as these afforded a “social space” for students to connect and share experiences within the subject or ask for help. However, the benefit of these varied based on lecturer involvement and other students’ use. These spaces also meant that students had “friends”, making attending any “live” components less daunting.

I created the Facebook group and we were all like bouncing off each other. You know, I was zooming with friends like other students and stuff trying to get this all sorted because he wouldn’t answer emails and he wouldn’t explain like, it was all just pre-recorded lectures.

And like, there’s a discussion board, you know, a uni thing, which is actually quite busy. Well, surprisingly, it was good for me.

we have the online but undergrad psychology page and then going well back when I was, you know, I was creating pages for all the subjects that I was doing and I’d encourage, you know, the talking between the students there. And so, when we were in lectures, like with forensic psychology and put into groups, you know, I had “friends”, as such that if we were put into the same group, it was cool.

Statistical software

A “how to” manual for the statistical software package was essential, and considered more valuable for engagement, enjoyment, and satisfaction than any textbook teaching statistical concepts. Access to statistical software at home meant that students could practise the steps involved in running an analysis in their own time, taking as long as they needed to understand the process. Further, extra practice datasets were thought to increase student confidence through repeated practice with data analyses.

On that note of SPSS and just during the start of COVID they made it available to students to use on their personal laptops and I think that that should just be made regularly available to students that we that we can have a student subscription and do that at home and don’t have to go, and I feel like that’s a big issue for expected to work online. But we’re not given the tools – we have to either purchase it ourselves or be willing to come into uni.

If they give you an example and go “Here’s how this works. Now you’ve got ten examples to go through in your workbook”. That’s really helpful. You can just go back and it’s that practicing and then you go “Okay, I’ve got the hang of this”. That would be really helpful.

Discussion

The present study explored tertiary students’ experiences in learning statistics subjects online, specifically in identifying factors that increased (or hindered) engagement, satisfaction, and confidence. Considering our research aims, from the three main themes identified (see Table 1), two overarching concepts characterise students’ experiences and desires: access and interaction. In brief, participants identified that readily available and easily accessible (and digestible) content would support their lifestyle (and, thus, subject engagement) and learning. Additionally, participants identified that communication (and opportunities for connection) increased their engagement, satisfaction, and confidence, and created further learning opportunities. These two broader concepts and their implications for teaching and learning statistics online are discussed below.

Interestingly, while the content of statistics is perceived by many students to be challenging and anxiety-provoking (Earley, Citation2014; Murtonen, Citation2005), to the extent that students often avoid or show little interest in these subjects (Counsell et al., Citation2016; Lloyd-Lewis et al., Citation2024), our participants focused not on the difficulty of the material, but on the technologies involved and their use. Indeed, we find that teaching and learning statistics online depends on how creatively educators make use of technologies (Brown et al., Citation2022; Pasco Dalla Porta & Ponce Regalado, Citation2021) and how well they support their students (Haardörfer & Livingston, Citation2021).

Access

Students prioritised (repeated) access to recordings of subject content, materials ahead of lectures and tutorials, statistical software at home, and additional datasets for practice. Acknowledging this, content delivery should be packaged in short “bursts” or “bites” and directions to foundational knowledge should be provided. This might include existing YouTube videos (short, accurate, and visually engaging) or a library of materials within the university’s chosen virtual learning environment (e.g., videos, additional activities, or readings on various statistical topics for revision). Students remarked that this would increase their satisfaction, because they would spend less time searching for foundational content and would be able to mentally digest challenging content in between other responsibilities.

Moreover, participants felt it was important that statistical software was available on their own computers. Statistics educators often design subjects around SPSS (Maher et al., Citation2023), an identified concern among participants given its cost for home use (particularly in addition to purchasing a textbook). Additionally, an accompanying, high-quality “how to” guide for the prescribed software was considered imperative for learning statistics online. This type of guide (which needs to cover how to conduct different analyses and how to interpret output) was regarded as more valuable to students’ learning than statistics textbooks, which were often considered expensive and useless. Those teaching statistics should weigh the financial costs of software and texts against the value they add to learning; for example, weighing the use of free software without established “how to” guides against paid software with associated “how to” guides and other open-source learning materials.
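To illustrate (beyond what participants themselves described), a minimal sketch of the kind of worked step such a guide might contain is shown below, using free, open-source Python tools as one hypothetical alternative to paid software; the dataset, group labels, and scores are invented purely for illustration.

```python
# A minimal sketch (not drawn from the study) of the kind of worked step a
# software "how to" guide might cover: a one-way ANOVA with eta squared,
# using free, open-source Python tools. The dataset below is hypothetical.
import pandas as pd
from scipy import stats

# Hypothetical practice dataset: exam scores for three tutorial groups.
data = pd.DataFrame({
    "group": ["A"] * 5 + ["B"] * 5 + ["C"] * 5,
    "score": [62, 70, 68, 74, 66,
              71, 75, 80, 78, 73,
              85, 82, 88, 90, 84],
})

# Split the scores by group and run the one-way ANOVA.
groups = [g["score"].to_numpy() for _, g in data.groupby("group")]
f_stat, p_value = stats.f_oneway(*groups)

# Eta squared = SS_between / SS_total, computed from the group and grand means.
grand_mean = data["score"].mean()
ss_total = ((data["score"] - grand_mean) ** 2).sum()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
eta_squared = ss_between / ss_total

print(f"F = {f_stat:.2f}, p = {p_value:.4f}, eta squared = {eta_squared:.3f}")
```

Whatever software a guide is built around, the point raised by participants is that it should walk students from raw data through to running the analysis and interpreting the resulting output.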

Interaction

Consistent with the findings of Maher et al. (Citation2023), who identified educators’ difficulty in determining students’ engagement with material online, students in the present study also reported feeling greater disconnection online. This pertained to both instructor-student and student-student interactions and was tied to social norms that discouraged asking questions. Acknowledging that “reading the room” when delivering content online differs from being able to gauge interest and understanding in a lecture auditorium, instructors should consider their own perceptions of the online environment and how their own behaviours or mannerisms might be interpreted by others.

The participants offered several practical suggestions to encourage interaction: offering live online lectures (which are recorded for accessibility and include a question-and-answer component); weekly online “office hour” meetings; and increased use of online discussion boards. When delivering a lecture, educators should have their video on, and consciously create an environment that encourages questions (verbally and via chat postings) and discussion. Our participants indicated use of the chat function was a non-intrusive way to ask questions (although the instructor needs to voice questions before answering, so that those watching the recording have context regarding the answer). Participants also suggested learning checkpoints (e.g., poll questions) throughout online sessions as a way for students and instructors to evaluate their learning and either ask questions or reiterate content.

Additionally, statistics subjects need to be designed for the online space in a way that provides opportunities for students to engage with other students. Again, participant responses focused not on the content, but on the discussion and application of content among those involved (Brown et al., Citation2022; Pasco Dalla Porta & Ponce Regalado, Citation2021). To support camaraderie and “shared learning” in tutorials, numbers should be kept to a size that is practical for engagement, participants’ videos should be on where possible, and breakout rooms should be utilised. Other practical suggestions include the use of chat functions and discussion boards, as well as digital communication tools external to the university (e.g., student-led Facebook groups, which allow students to interact and ask questions of each other more openly than on an instructor-monitored discussion board). Thus, it is important to think about how educators not only use, but support, communication tools to ensure students are able to communicate.

Limitations and future directions

The present study is not without limitations. First, the small sample was drawn from Australian psychology students. The response to, and impact of, COVID-19 differed in Australia compared with other countries (e.g., Brooks et al., Citation2020; Van Agteren et al., Citation2020) and, therefore, the challenges experienced by students during this time may not be representative of other countries. Second, this study focused on the experience of learning statistics online, rather than explicitly comparing the perspectives and experiences of online, on-campus, and blended study. Such a comparison, as well as consideration of individual differences in technological self-efficacy, falls to future research. Additionally, we cannot disassociate participants’ responses from the broader context of experiencing COVID-19. Thus, it would also be important to critically examine students’ experiences with online subjects that have undergone years of development and refinement. Since EOL is marked by a lack of preparation time (Müller et al., Citation2021), it is possible that educators with more time and experience might be able to implement or improve upon the suggestions offered by students in this study.

Third, regarding student engagement, confidence and satisfaction, it falls on future research to consider the relative importance of different design elements and online features and to explicitly test the impact of changes to subject design and delivery in line with the proposed suggestions. Moreover, future research is well placed to also identify if changes to these factors increase students’ understanding (e.g., making use of student grades as an objective measure of understanding). Lastly, because the findings highlight the importance of social engagement (which has also been shown to be an important indicator of student outcomes; Conklin & Garrett Dikkers, Citation2021; Rajalingam et al., Citation2021), future research should explicitly consider both instructor-student and student-student interactions. Methodological designs permitting longitudinal data collection from instructors and students would further our understanding of how best to teach and learn statistics online.

Nonetheless, the current research indicates that learning statistics online is considerably different to the traditional on-campus experience. Those instructing these subjects need to be aware of, and have a plan to address, the unique challenges faced in teaching and learning statistics content online. As the psychology student-participants in this study highlighted, because students may feel overwhelmed, anxious, or disconnected from the subject and/or its content, creating a flexible and accessible learning environment replete with meaningful interaction and opportunities for communication is key to success.

Author contributorship

AK and DM secured funding for the project. AK, KF, and DM collaboratively developed the study and KF conducted the data collection. KF conducted the data analysis, with input from AK and DM. KF drafted an initial version of the manuscript, with AK and DM offering input and revisions. All authors collaborated to approve the final version of the manuscript.

Acknowledgements

The authors wish to thank the individuals who participated in the research and JCU staff members for their early feedback on this project.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The raw data and materials presented in this article are not readily available in compliance with the study’s ethical approval. Requests regarding the data should be directed to Amanda Krause, [email protected]. No aspects of the study were pre-registered.

Additional information

Funding

This research received financial support via the James Cook University JCUA and JCUS Cross-collaboration scheme.

Notes

1. External enrolment is defined as “off-campus” or online delivery.

References

  • Allen, P. J., & Baughman, F. D. (2016). Active learning in research methods classes is associated with higher knowledge and confidence, though not evaluations or satisfaction. Frontiers in Psychology, 7, Article 279. https://doi.org/10.3389/fpsyg.2016.00279
  • Alomyan, H. (2021). The impact of distance learning on the psychology and learning of university students during the COVID-19 pandemic. International Journal of Instruction, 14(4), 585–10. https://doi.org/10.29333/iji.2021.14434a
  • Australian Psychology Accreditation Council. (2019). Accreditation standards for psychology programs: Version 1.2. https://psychologycouncil.org.au/wp-content/uploads/2021/09/APAC-Accreditation-Standards_v1.2_rebranded.pdf
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
  • Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. SAGE Publication.
  • Braun, V., & Clarke, V. (2019). Reflecting on reflexive thematic analysis. Qualitative Research in Sport, Exercise, and Health, 11(4), 589–597. https://doi.org/10.1080/2159676X.2019.1628806
  • Brooks, S. K., Webster, R. K., Smith, L. E., Woodland, L., Wessely, S., Greenberg, N., & Rubin, G. J. (2020). The psychological impact of quarantine and how to reduce it: Rapid review of the evidence. The Lancet, 395(10227), 912–920. https://doi.org/10.1016/S0140-6736(20)30460-8
  • Brown, A., Lawrence, J., Foote, S., Cohen, J., Redmond, P., Stone, C., Kimber, M., & Henderson, R. (2022). Educators’ experiences of pivoting online: Unearthing key learnings and insights for engaging students online. Higher Education Research and Development, 42(7), 1593–1607. https://doi.org/10.1080/07294360.2022.2157798
  • Conklin, S., & Garrett Dikkers, A. (2021). Instructor social presence and connectedness in a quick shift from face-to-face to online instruction. Online Learning: The Official Journal of the Online Learning Consortium, 25(1), 135–150. https://doi.org/10.24059/olj.v25i1.2482
  • Connelly, L. M., & Peltzer, J. N. (2016). Underdeveloped themes in qualitative research: Relationship with interviews and analysis. Clinical Nurse Specialist, 30(1), 52–57. https://doi.org/10.1097/NUR.0000000000000173
  • Counsell, A., Cribbie, R. A., & Harlow, L. L. (2016). Increasing literacy in quantitative methods: The key to the future of Canadian psychology. Canadian Psychology / Psychologie canadienne, 57(3), 193–201. https://doi.org/10.1037/cap0000056
  • Dzakiria, H., Idrus, R. M., & Atan, H. (2005). Interaction in open distance learning: Research issues in Malaysia. Malaysian Journal of Distance Education, 7(2), 63–77. https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=484e6973ead0349707589f05d9827b30e312e14a
  • Earley, M. A. (2014). A synthesis of the literature on research methods education. Teaching in Higher Education, 19(3), 242–253. https://doi.org/10.1080/13562517.2013.860105
  • Elmer, T., & Stadtfeld, C. (2020). Depressive symptoms are associated with social isolation in face-to-face interaction networks. Scientific Reports, 10(1), 1444. https://doi.org/10.1038/s41598-020-58297-9
  • Haardörfer, R., & Livingston, M. (2021). Teaching statistics in public health during COVID-19: Lessons learned from spring 2020 and recommendations for the future. Pedagogy in Health Promotion, 7(1), 25–28. https://doi.org/10.1177/2373379920963608
  • International Association of Applied Psychology and International Union of Psychological Science. (2016). International declaration on core competences in professional psychology. https://iaapsy.org/policiesinitiatives/ipcp-documents/
  • Lloyd-Lewis, B. L., Miller, D. J., & Krause, A. E. (2024). Against all odds: Students’ interest in, and perceived value of, research and non-research psychology subjects. Psychology Learning & Teaching, 23(1), 65–89. https://doi.org/10.1177/14757257231222647
  • Maher, K., Krause, A. E., & Miller, D. J. (2023). The impact of COVID-19 on tertiary statistics teaching practices in Australia. Scholarship of Teaching and Learning in Psychology. Advance online publication. https://doi.org/10.1037/stl0000352
  • Mantasiah, R., Yusri, G., Sinring, A., & Aryani, F. (2021). Assessing verbal positive reinforcement of teachers during school from home in the covid-19 pandemic era. International Journal of Instruction, 14(2), 1037–1050. https://doi.org/10.29333/iji.2021.14259a
  • Müller, A. M., Goh, C., Lim, L. Z., & Gao, X. (2021). COVID-19 emergency e-learning and beyond: Experiences and perspectives of university educators. Education Sciences, 11(1), 19. https://doi.org/10.3390/educsci11010019
  • Murtonen, M. (2005). University students’ research orientations: Do negative attitudes exist toward quantitative methods?. Scandinavian Journal of Educational Research, 49(3), 263–280. https://doi.org/10.1080/00313830500109568
  • Pasco Dalla Porta, M. M., & Ponce Regalado, M. D. F. (2021). Assessment of the effectiveness of virtual modules for teaching management research methods. Ubiquitous Learning: An International Journal, 14(1), 47–64. https://doi.org/10.18848/1835-9795/CGP/v14i01/47-64
  • Pickens, C., & Braun, V. (2018). “Stroppy bitches who just need to learn how to settle”? Young single women and norms of femininity and heterosexuality. Sex Roles, 79, 431–448. https://doi.org/10.1007/s11199-017-0881-5
  • Rajalingam, S., Kanagamalliga, S., Karuppiah, N., & Caesar Puoza, J. (2021). Peer interaction teaching–learning approaches for effective engagement of students in virtual classroom. Journal of Engineering Education Transformations, 34, 425–432. https://doi.org/10.16920/jeet/2021/v34i0/157191
  • Smith, B., & McGannon, K. R. (2018). Developing rigor in qualitative research: Problems and opportunities within sport and exercise psychology. International Review of Sport and Exercise Psychology, 11(1), 101–121. https://doi.org/10.1080/1750984x.2017.1317357
  • Strohmetz, D. B., Ciarocco, N. J., & Lewandowski, G. W., Jr. (2023). Why am I here? Student perceptions of the research methods course. Scholarship of Teaching and Learning in Psychology. Advance online publication. https://doi.org/10.1037/stl0000353
  • Van Agteren, J., Bartholomaeus, J., Fassnacht, D. B., Iasiello, M., Ali, K., Lo, L., & Kyrios, M. (2020). Using internet-based psychological measurement to capture the deteriorating community mental health profile during COVID-19: Observational study. JMIR Mental Health, 7(6), e20696. https://doi.org/10.2196/20696

Appendix

Focus Group Question Schedule