Academic performance among pharmacy students using virtual vs. face-to-face team-based learning

Article: 2349205 | Received 19 Nov 2023, Accepted 17 Apr 2024, Published online: 13 May 2024

Abstract

Introduction

This study compares pharmacy students’ performance using face-to-face (FTF) team-based learning (TBL) vs. virtual TBL across multiple courses and different academic levels while accounting for student demographic and academic factors.

Methods

The study included pharmacy students from different academic levels (P1–P3) who were enrolled in three didactic courses taught using FTF TBL and virtual TBL. Multiple generalized linear models (GLMs) were performed to compare students’ performance on individual readiness assurance tests (iRATs), team readiness assurance tests (tRATs), team application exercises (tAPPs), summative exams, and total course scores using FTF TBL vs. virtual TBL, adjusting for students’ age, sex, race, and cumulative grade point average (cGPA).

Results

The study involved a total of 356 pharmacy students distributed across different academic levels and learning modalities: P1 students [FTF TBL (n = 26), virtual TBL (n = 42)], P2 students [FTF TBL (n = 77), virtual TBL (n = 71)], and P3 students [FTF TBL (n = 65), virtual TBL (n = 75)]. In the P1 cohort, the virtual group had higher iRAT and tRAT scores but lower tAPP scores than the FTF TBL group, with no significant differences in summative exams or total course scores. For P2 students, the virtual TBL group had higher iRAT and tRAT scores but lower summative exam scores and total course scores than the FTF TBL group, with no significant differences in tAPP scores. In the P3 student group, the virtual TBL group had higher iRAT, tRAT, tAPP, summative exam, and total course scores than the FTF TBL group.

Conclusions

Students’ performance in virtual TBL vs. FTF TBL in the pharmacy didactic curriculum varies depending on the course content, academic year, and type of assessment.

Introduction

The field of pharmacy education has undergone significant transformations in recent years, spurred by technological advancements, shifts in pedagogical approaches, and the global impact of the coronavirus disease 2019 (COVID-19) pandemic. As the world grappled with pandemic challenges, educational institutions had to adapt rapidly, ensuring the continuity of learning while prioritizing the safety and well-being of students and educators. This emergency shift resulted in a notable transition from traditional face-to-face (FTF) instructional methods to a virtual format, impacting various aspects of higher education, including pharmacy programs [Citation1].

Even before the onset of the COVID-19 pandemic, pharmacy education was evolving to embrace innovative teaching methodologies, moving away from conventional didactic lectures and laboratory-based learning. A notable transformative pedagogical approach gaining traction was team-based learning (TBL), a learner-centered strategy fostering student active learning, collaboration, critical thinking, and problem-solving skills [Citation2, Citation3]. TBL involves students working in teams to apply their knowledge to solve complex, real-world problems, enhancing student engagement, retention, and cognitive skills [Citation1–5].

The TBL framework typically consists of various components that enable effective student learning. First, students prepare for class based on clear learning objectives and materials provided by instructors [Citation6]. Next, the readiness assurance process takes place, which involves both individual readiness assurance tests (iRATs) and team readiness assurance tests (tRATs), where instructors offer immediate clarification as needed [Citation6]. Finally, students work together in teams to apply the key concepts they have learned to team application exercises (tAPPs) [Citation6]. However, the sudden shift to virtual learning during the COVID-19 pandemic presented unique challenges for implementing active learning methods like TBL in pharmacy education [Citation6].

These challenges included the lack of adequate technology resources, difficulties in maintaining student engagement and motivation, the need to modify teaching approaches and learning materials for virtual instruction, ensuring fair assessment methods, addressing concerns of academic integrity, and maintaining social interaction and collaboration between students and educators [Citation6].

Two studies in the literature compared students’ performance on various components of TBL using virtual TBL vs. FTF TBL. Franklin et al. (2016) explored students’ performance in a hybrid pharmacokinetics course that was offered using virtual TBL to one group and FTF TBL to two other student groups [Citation5]. The study found that one of the FTF TBL groups had higher iRAT scores, while the other had lower iRAT scores than the virtual TBL group [Citation5]. However, the virtual TBL group outperformed the two FTF TBL groups in tRATs [Citation5].

Shen et al. (2024) conducted a study comparing third-year medical students’ scores on iRATs, tRATs, experiment reports, and final exams in a basic laboratory course using virtual TBL vs. FTF TBL [Citation7]. Students’ scores on iRATs were similar between the two learning modalities, but tRAT scores were lower in the virtual TBL group than in the FTF TBL group [Citation7]. The study also showed that students’ scores did not differ on the experiment reports or final exams [Citation7].

However, the studies conducted by Franklin et al. (2016) and Shen et al. (2024) did not compare the performance of students on tAPPs using virtual TBL with those using FTF TBL [Citation5, Citation7]. Also, they focused only on one course and did not consider the potential variations in students’ demographics and academic factors across different study cohorts, which could have affected students’ performance on various assessments [Citation5, Citation7]. Moreover, the study by Franklin et al. (2016) did not compare students’ performance on summative assessments or overall course grades between the virtual TBL and FTF TBL groups [Citation5].

A critical gap in the literature exists in assessing pharmacy students’ performance using FTF TBL and virtual TBL while accounting for student demographic and academic factors. The current literature lacks a thorough examination of various TBL components, including iRATs, tRATs, tAPPs, summative assessment scores, and final course grades using FTF TBL vs. virtual TBL. Therefore, our study aims to bridge this gap by comparing pharmacy students’ performance on the various components of FTF TBL vs. virtual TBL, adjusting for relevant student demographic and academic factors.

Materials and methods

Study design, population, and course description

This study compared students’ performance using FTF TBL vs. virtual TBL across three 14-week didactic courses at different academic levels within the pharmacy didactic curriculum at The University of Texas at Tyler Fisch College of Pharmacy.

The first course was PHAR 7203 Introduction to Medicinal Chemistry (2 credit hours). This course was taught to first-year pharmacy (P1) students, with 26 students enrolled in the FTF TBL course (Spring 2022) and 42 students enrolled in the virtual TBL course (Spring 2021).

The second course was PHAR 7483 Integrated Pharmacotherapy 3: Cardiovascular (4 credit hours). This course was taught to second-year pharmacy (P2) students, with 77 students enrolled in the FTF TBL course (Spring 2019) and 71 students enrolled in the virtual TBL course (Spring 2021).

The third course was PHAR 7377 Pharmacoepidemiology and Pharmacoeconomics (3 credit hours). This course was taught to third-year pharmacy (P3) students, with 65 students enrolled in the FTF TBL course (Fall 2021) and 75 students enrolled in the virtual TBL course (Fall 2020).

These courses were randomly selected, representing three different major areas in pharmacy curricula in the United States (pharmaceutical sciences for the P1 course, clinical sciences for the P2 course, and social/administrative/behavioral sciences for the P3 course), classified according to the Accreditation Council for Pharmacy Education (ACPE) Standards 2016 [Citation8].

Each course was taught using FTF TBL for one student cohort and virtual TBL for another cohort within the same academic level (P1–P3) during a separate semester. The content of each course remained similar between virtual TBL and FTF TBL offerings, including formative assessments (iRATs, tRATs, tAPPs) and summative assessments (exams). The instructors for these courses had at least four years of TBL experience.

The assessment questions for iRATs, tRATs, tAPPs, and summative exams were similar from year to year. While we acknowledge the possibility of compromised integrity of the iRATs, tRATs, and tAPPs due to students’ access to these assessments from previous years, we maintained the integrity of summative assessments by using ExamSoft® for testing under proctored conditions. Additionally, students were not allowed to take pictures or copy any exam questions during the test or when reviewing their individual exams later, further preserving the integrity of summative exams.

The study included students from different academic levels (P1 to P3) in the pharmacy school program to encompass various topics, teaching and assessment approaches, and different learning levels. All students in these groups had previously completed at least one semester of didactic courses delivered using TBL, as this teaching method is the predominant pedagogy in most didactic courses at The University of Texas at Tyler Fisch College of Pharmacy. Each course had 4–6 students in each team, and these teams remained constant throughout the duration of the semester.

During the transition to virtual TBL amid the COVID-19 pandemic, all class meetings, discussions, and collaborations for these three didactic courses were conducted using Zoom© audiovisual teleconferencing (Zoom Video Communications, San Jose, CA). The virtual TBL approach was designed to mirror the structure, content, and assessments of the FTF TBL sessions. The instructors, module sequence, and evaluation framework remained the same. The virtual TBL approach was divided into three phases:

Phase I: students were assigned lecture videos created by the instructor, required textbook readings, and primary research journal articles to read before class;

Phase II: iRATs and tRATs were administered remotely using ExamSoft® (ExamSoft, Inc., Dallas, TX) or Canvas© (Instructure, Inc., Salt Lake City, UT); students were then assigned to Zoom© breakout rooms based on their team assignments to discuss the tRATs, defend their answers, and resolve any discrepancies, with the instructor acting as a facilitator;

Phase III: following the readiness assurance process, students stayed in their assigned Zoom© breakout rooms to collaborate on tAPPs, building on the readiness materials and applying key concepts from each course module. Instructors could join these breakout rooms as facilitators to oversee and engage with the teams.

To ensure academic integrity during virtual TBL activities, third-party online testing and proctoring platforms, including ExamMonitor™ (ExamSoft, Inc., Dallas, TX) or ProctorU (Meazure Learning, Inc., Birmingham, AL), were utilized during summative assessments. A schematic diagram of the study design and participant flow is shown in Figure 1.

Figure 1. Study design, participants, and course description. FTF = face-to-face; TBL = team-based learning; iRATs = individual readiness assurance tests; tRATs = team readiness assurance tests; tAPPs = team application exercises; P1 = first professional year; P2 = second professional year; P3 = third professional year.


Outcomes

Two assessment types were used to compare students’ academic performance in virtual TBL vs. FTF TBL: 1) formative assessments (iRATs, tRATs, and tAPPs), which help students engage in active learning, collaborative problem-solving, and critical thinking; and 2) summative assessments (exams), which provide a more comprehensive evaluation of overall academic performance.

Data analysis

Descriptive statistics were used to summarize student demographic and academic factors. The Shapiro-Wilk test was performed to assess the normality of data on students’ demographic and other characteristics. The test results showed that the data were not normally distributed (p < .05). Therefore, continuous data were summarized using median and interquartile range (IQR), while categorical data were presented using frequencies and percentages. To compare students’ demographics and other characteristics between the FTF TBL and virtual TBL student cohorts for each course, the nonparametric Wilcoxon rank-sum (Mann–Whitney U) exact test was used for continuous variables, and Fisher’s exact test was used for nominal variables.
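For readers who wish to reproduce this style of cohort comparison, the sketch below illustrates the tests described above. It is a minimal illustration in Python (SciPy) rather than the Stata workflow the authors report, and the file name and column names (modality, age, sex) are hypothetical placeholders for a deidentified roster.

# Illustrative sketch only (the authors used Stata 18, not Python);
# file and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("p2_cohort.csv")  # hypothetical deidentified roster
ftf = df[df["modality"] == "FTF"]
virtual = df[df["modality"] == "virtual"]

# Normality check that motivated the nonparametric tests
_, p_norm = stats.shapiro(df["age"])

# Continuous characteristic: exact Wilcoxon rank-sum (Mann-Whitney U) test
_, p_age = stats.mannwhitneyu(ftf["age"], virtual["age"], method="exact")

# Nominal characteristic: Fisher's exact test on a 2 x 2 sex-by-modality table
sex_table = pd.crosstab(df["sex"], df["modality"])
_, p_sex = stats.fisher_exact(sex_table)

print(f"Shapiro-Wilk p = {p_norm:.3f}; age p = {p_age:.3f}; sex p = {p_sex:.3f}")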

Box-and-whisker plots were employed to visually display the median (Q2), first quartile (Q1), third quartile (Q3), and outliers for students’ performance on various course assessments (iRATs, tRATs, tAPPs, summative exams, and total course scores) when comparing FTF TBL to virtual TBL. Generalized linear models (GLMs) with a Gaussian family and identity link function were fitted to compare students’ performance in different course assessments (iRATs, tRATs, tAPPs, summative exams, and total course scores) using FTF TBL vs. virtual TBL. These comparisons were adjusted for student covariates: age, sex, race, and cumulative GPA (cGPA). We controlled for these covariates because they have been identified in the literature as factors that may affect students’ academic performance and success in pharmacy education [Citation9–15].
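The adjusted comparison can be illustrated with a Gaussian GLM of the same general form. The sketch below uses Python’s statsmodels as a stand-in for the authors’ Stata models; the data file and column names (irat_score, modality, cgpa, etc.) are assumed for illustration, and the Gaussian family’s default identity link matches the specification above.

# Illustrative sketch only (the authors used Stata 18, not Python);
# file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("p2_scores.csv")  # hypothetical deidentified score data

# Gaussian GLM (identity link is the Gaussian default), regressing one
# assessment score on TBL modality while adjusting for age, sex, race, and cGPA
model = smf.glm(
    "irat_score ~ C(modality, Treatment(reference='FTF')) + age + C(sex) + C(race) + cgpa",
    data=df,
    family=sm.families.Gaussian(),
).fit()

# The coefficient, z statistic, and p-value on the modality term correspond
# to the virtual vs. FTF contrasts reported in the Results section
print(model.summary())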

All statistical tests were two-sided and conducted at an a priori significance level of 0.05 using Stata® version 18.0 (StataCorp LLC, College Station, TX).

Results

A total of 356 pharmacy students were included in this study, distributed across academic levels and learning modalities as follows: P1 students [FTF TBL (n = 26), virtual TBL (n = 42)], P2 students [FTF TBL (n = 77), virtual TBL (n = 71)], and P3 students [FTF TBL (n = 65), virtual TBL (n = 75)].

The P2 students in the FTF TBL group were significantly older than those in the virtual TBL group (Z = 3.99; p < .001). Conversely, the P3 students in the FTF TBL group were significantly younger than those in the virtual TBL group (Z = −3.0; p = .003). There were no significant differences in the distribution of students’ sex between the FTF TBL and virtual TBL groups in any of the professional years. A notable finding was the difference in the distribution of race/ethnicity among P2 students (p = .015). Additionally, P2 students in the FTF TBL group had a lower cGPA than those in the virtual TBL group (Z = −2.12; p = .034). No significant differences in cGPA were observed between the FTF TBL and virtual TBL groups in the P1 or P3 years.

Table 1 provides a detailed overview of the characteristics of pharmacy students stratified by their professional school year and TBL modality.

Table 1. Characteristics of pharmacy students stratified by students’ professional school year and TBL modality.

Comparison of pharmacy students’ performance using FTF TBL vs. virtual TBL

For P1 students, virtual TBL resulted in significantly higher iRAT scores than FTF TBL (Z = 6.97; p < .001) after adjusting for students’ age, sex, race, and cGPA. Similarly, tRAT scores were significantly higher in the virtual TBL group compared to the FTF TBL group (Z = 4.05; p < .001) in the adjusted model. However, tAPP scores were significantly lower in the virtual TBL group compared to the FTF TBL group (Z = −4.76; p < .001), after adjusting for students’ age, sex, race, and cGPA. No significant difference was observed in summative exam scores or total course scores between the FTF TBL and virtual TBL groups. A visual comparison of P1 students’ performance in various course assessments (iRATs, tRATs, tAPPs, summative exams, and total course scores) using FTF TBL vs. virtual TBL can be found in Figure 2.

Figure 2. Comparison of P1 students’ performance using FTF TBL vs. virtual TBL. FTF = face-to-face; TBL = team-based learning; iRATs = individual readiness assurance tests; tRATs = team readiness assurance tests; tAPPs = team application exercises; P1 = first professional year; P2 = second professional year; P3 = third professional year.


In the P2 students, the virtual TBL group achieved significantly higher iRAT scores compared to the FTF TBL group (Z = 3.53; p < .001), after adjusting for students’ age, sex, race, and cGPA. Likewise, the virtual TBL group had significantly higher tRAT scores compared to the FTF TBL group (Z = 11.19; p < .001) in the adjusted model. Conversely, the summative exam scores were significantly lower in the virtual TBL group compared to FTF TBL (Z = −4.71; p < .001), accounting for students’ age, sex, race, and cGPA. Similarly, the total course score was significantly lower in the virtual TBL group compared to the FTF TBL group (Z = −3.2; p = .001) in the adjusted model. No significant differences were observed in tAPP scores between the FTF TBL and virtual TBL groups. A visual comparison of P2 students’ performance in various course assessments using FTF TBL vs. virtual TBL is presented in Figure 3.

Figure 3. Comparison of P2 students’ performance using FTF TBL vs. virtual TBL. FTF = face-to-face; TBL = team-based learning; iRATs = individual readiness assurance tests; tRATs = team readiness assurance tests; tAPPs = team application exercises; P1 = first professional year; P2 = second professional year; P3 = third professional year.


For P3 students, the virtual TBL group had significantly higher iRAT scores compared to the FTF TBL group (Z = 2.87; p = .004) after adjusting for students’ age, sex, race, and cGPA. Similarly, the virtual TBL group had significantly higher tRAT scores compared to the FTF TBL group (Z = 2.04; p = .042) in the adjusted model. The same was observed for tAPP scores, which were significantly higher in the virtual TBL group than in the FTF TBL group (Z = 1.96; p = .05). Moreover, the virtual TBL group achieved significantly higher summative exam scores than the FTF TBL group (Z = 3.06; p = .002), adjusting for students’ age, sex, race, and cGPA. Similarly, the students in the virtual TBL group had significantly higher total course scores than the FTF TBL group (Z = 3.54; p < .001) in the adjusted model. Figure 4 provides a visual comparison of P3 students’ performance in various course assessments using FTF TBL vs. virtual TBL.

Figure 4. Comparison of P3 students’ performance using FTF TBL vs. virtual TBL. FTF = face-to-face; TBL = team-based learning; iRATs = individual readiness assurance tests; tRATs = team readiness assurance tests; tAPPs = team application exercises; P1 = first professional year; P2 = second professional year; P3 = third professional year.


A comprehensive comparison of students’ scores in iRATs, tRATs, tAPPs, summative exams, and total course scores using FTF TBL vs. virtual TBL, stratified by students’ professional year and adjusted for students’ age, sex, race, and cGPA using multiple GLMs, is provided in Table 2.

Table 2. Generalized linear models comparing students’ performance in various course assessments using FTF TBL vs. virtual TBL stratified by students’ professional year.

Discussion

The abrupt shift from FTF teaching to virtual learning during the COVID-19 pandemic presented unique challenges for pharmacy education. One of the difficulties was implementing active learning strategies like TBL in a virtual environment. To our knowledge, this is the first study to compare pharmacy students’ performance on different TBL components (iRATs, tRATs, tAPPs, summative exams, and total course scores) between FTF TBL and virtual TBL while accounting for various student demographic and academic factors. The study compares FTF TBL and virtual TBL across different academic levels within a pharmacy program. By addressing the limitations of earlier research, this study can contribute essential insights into the effectiveness of each teaching modality in pharmacy education.

Formative assessments

The results of our study showed that in all three classes, virtual TBL groups consistently performed better than their FTF TBL counterparts in both iRATs and tRATs. This finding is supported by a previous study conducted by Franklin et al. (2016), which compared a single virtual TBL group with two FTF TBL groups enrolled in a pharmacokinetics course in different settings [Citation5]. The study found that while one of the FTF TBL groups achieved higher iRAT scores, the other group scored lower than the virtual TBL group. However, the virtual TBL group had higher tRAT scores than both FTF TBL groups in the study [Citation5].

The P1 students in the virtual TBL group had lower tAPP scores compared to the FTF TBL group. Interestingly, the P3 students in the virtual TBL group had higher tAPP scores, while there were no significant differences in tAPP scores among P2 students in the virtual TBL and FTF TBL groups. This may suggest that the impact of the virtual modality on tAPP performance is multifaceted, possibly influenced by factors such as course content alignment, students’ technological preparedness, and prior experiences with virtual learning.

Summative assessments

No significant differences were found in summative exams and total course scores between P1 students in the virtual TBL and FTF groups. This is consistent with various studies evaluated in a meta-analysis suggesting virtual learning can be as effective as traditional classroom-based instruction [Citation16]. Additionally, a study conducted by Shen et al. (2024) compared virtual TBL and FTF TBL and found that third-year medical students in both the virtual TBL and FTF TBL groups achieved similar outcomes, as demonstrated by comparable scores on experiment reports and final exams [Citation7].

Our study has shown that the P2 students who participated in the virtual TBL group had lower scores on their final exams and overall course grades than the FTF TBL group. However, the students in the P3 group who used virtual TBL had higher scores on their summative assessments compared to the FTF TBL group. The findings are contrary to the results of a comparative analysis by Shen et al. (2024), which involved 179 third-year medical students enrolled in a basic medical laboratory course [Citation7]. Their research showed that there were no significant differences in academic outcomes between the virtual TBL and FTF TBL groups, as demonstrated by comparable scores on experiment reports and final exams [Citation7].

It is important to differentiate between statistical significance and academic significance when interpreting study findings. While certain data points may show statistical differences, it is crucial to assess the practical implications of these variations in the context of pharmacy education. In some cases, even though statistically significant, observed differences may only represent minor changes that may not translate into substantial academic distinctions. Hence, we emphasize the importance of considering both statistical and academic significance to gain a comprehensive understanding of how virtual TBL impacts pharmacy student performance.

Key takeaways

This study aimed to compare pharmacy students’ performance using FTF TBL vs. virtual TBL across various TBL components and academic levels within the pharmacy program. The study provides insights into the impact of the transition to virtual TBL and how it affected different TBL components, which varied across classes depending on their advancement in the pharmacy program and the course content.

It is crucial to note that the performance of students in both FTF TBL and virtual TBL can vary due to several factors. These factors may include individual characteristics of the students, specific attributes of the courses, learning contexts, motivation levels, and the dynamic evolution of virtual learning environments [Citation6]. Additionally, the course’s inherent characteristics, subject matter intricacies, the effectiveness of instructor facilitation, and the level of advancement and quality of technological infrastructure can all affect students’ performance in virtual TBL [Citation6].

This research study contributes to the existing literature by investigating TBL across multiple courses, content areas, and academic levels within the pharmacy program while controlling for relevant student demographics and academic factors. Additionally, we compared students’ performance in virtual TBL vs. FTF TBL across all TBL components (iRATs, tRATs, tAPPs, summative assessments, and total course scores).

Limitations

While this study contributes valuable insights into the performance of pharmacy students using FTF TBL and virtual TBL modalities across different course components, it is essential to acknowledge certain limitations that may affect the generalizability and interpretation of the findings.

First, the study was conducted at a single pharmacy school, potentially limiting the generalizability of the findings to other institutions with different curricular structures, student populations, and faculty expertise. The effectiveness of TBL in virtual settings may be influenced by the specific context and resources available at each institution.

Second, the study included three didactic courses with different content areas. It is essential to recognize that the impact of virtual TBL on students may differ depending on the nature of the course content. Nonetheless, by including courses from different content areas and academic levels within the pharmacy program, we aimed to enhance the generalizability of the findings across a broader spectrum of pharmacy education scenarios. Future research may consider further course content diversification to ascertain the robustness of observed patterns.

Third, this study analyzed the impact of transitioning from FTF TBL to virtual TBL over a single semester during the COVID-19 pandemic. Long-term implications and sustained effects of virtual TBL on students’ performance remain unexplored. Future research could investigate the durability of these findings over multiple semesters.

Fourth, the shift to virtual learning occurred under emergency circumstances due to the pandemic, which may have influenced students’ performance using virtual TBL vs FTF TBL. The extent to which these findings can be generalized to situations where virtual TBL is implemented more intentionally warrants further examination.

Fifth, factors external to the study, including students’ previous experiences with online learning, access to technology, and technological proficiency, may have influenced their performance. These external variables were not explicitly controlled for in the analysis.

Sixth, while efforts were made to ensure the academic integrity of assessments during virtual TBL, we acknowledge the inherent challenge in definitively controlling for the integrity of the assessment questions. Testing and proctoring platforms aimed to mitigate risks, but it is challenging to ascertain whether a compromise occurred due to the nature of remote assessments and if it might explain some of the study findings. The historical and non-experimental nature of the data collection further complicates efforts to quantify and control for potential integrity issues in the analysis. Additionally, there is the possibility of compromised integrity of the iRATs, tRATs, and tAPPs due to students’ access to these assessments from previous years. This limitation underscores the need for caution in interpreting findings related to assessment outcomes.

Seventh, while the assessments used in this study were designed to be consistent year-over-year, the absence of specific reliability measures may introduce a level of uncertainty regarding the stability of assessment outcomes across different administrations. Future studies could benefit from incorporating reliability measures, such as the Kuder–Richardson (KR) coefficient, to strengthen the interpretation and generalizability of assessment findings.

Finally, a power analysis was not performed in this study. The decision not to conduct a formal power analysis stemmed from the exploratory nature of this investigation, where we aimed to uncover trends and variations rather than testing predefined hypotheses. While we acknowledge the importance of statistical power considerations, the absence of a power analysis is a limitation of our study. Future research with a more hypothesis-driven approach and larger sample sizes can delve deeper into these aspects to strengthen the generalizability and robustness of the findings.

Despite these limitations, the intricate findings and nuances uncovered within this investigation shed light on potential areas where further exploration can enhance pedagogical practices and address the evolving needs of students engaged in virtual TBL.

Future research can compare students’ performance in an intentionally designed virtual TBL course to the FTF TBL approach. Moreover, to make the findings more generalizable, future research can expand the comparison of virtual TBL and FTF TBL to a wider range of course content areas within pharmacy education. Finally, it would be beneficial to investigate how students’ previous experiences with online learning may impact their performance in virtual TBL.

Conclusions

This research investigated the shift from FTF TBL to virtual TBL and its impact on pharmacy students’ academic performance. Our study revealed that P1 students in the virtual TBL group had higher iRAT and tRAT scores compared to those in the FTF TBL group. However, P1 students’ performance in tAPPs was lower in virtual TBL than in FTF TBL. On the other hand, P2 students had lower summative exam and total course scores in virtual TBL compared to their FTF TBL counterparts. In contrast, P3 students excelled in the virtual TBL environment, demonstrating improved performance across all assessments compared to the P3 FTF TBL group. In conclusion, our study demonstrated that students’ performance in virtual TBL compared to FTF TBL in the pharmacy didactic curriculum varies depending on the course content, academic year, and type of assessment.

Author contributions

Osama A. Shoair: Conceptualization, Methodology, Formal analysis, Investigation, Resources, Data curation, Writing – original draft, Writing – review & editing, Visualization, Supervision, Project administration, Funding acquisition.

Ethical committee approval

This study was reviewed and granted expedited review approval by The University of Texas at Tyler Institutional Review Board (Protocol number FY2021-108). The study used only deidentified, historical student grades and did not involve direct interaction with the students. The University of Texas at Tyler Institutional Review Board granted the study a waiver of consent; as such, there was no requirement to obtain informed consent from the students. The data used in our research were obtained in accordance with all relevant privacy and ethical guidelines, ensuring complete anonymization of the information to safeguard the students’ identities.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

In accordance with the Family Educational Rights and Privacy Act (FERPA) regulations, the student data analyzed in this study has been deidentified and reported in aggregate form to ensure the privacy and confidentiality of individual students. Raw data will not be shared publicly to maintain compliance with FERPA guidelines, which prioritize the protection of student information. The study data will be shared by the corresponding author upon reasonable request.

Additional information

Funding

This work was supported by the 2021 American Association of Colleges of Pharmacy (AACP) Scholarship of Teaching and Learning (SOTL) Grant.

References

  • Brazeau G, Romanelli F. Navigating the unchartered waters in the time of COVID-19. Am J Pharm Educ. 2020;84(3):1. doi: 10.5688/ajpe8063.
  • Parmelee D, Michaelsen LK, Cook S, et al. Team-based learning: a practical guide: AMEE guide no. 65. Med Teach. 2012;34(5):e275–e287. doi: 10.3109/0142159X.2012.651179.
  • Allen RE, Copeland J, Franks AS, et al. Team-based learning in US colleges and schools of pharmacy. Am J Pharm Educ. 2013;77(6):115. doi: 10.5688/ajpe776115.
  • Ofstad W, Brunner LJ. Team-based learning in pharmacy education. Am J Pharm Educ. 2013;77(4):70. doi: 10.5688/ajpe77470.
  • Franklin AS, Markowsky S, De Leo J, et al. Using team-based learning to teach a hybrid pharmacokinetics course virtual and in class. Am J Pharm Educ. 2016;80(10):171. doi: 10.5688/ajpe8010171.
  • Shoair OA, Smith WJ, Abdel Aziz MH, et al. Pharmacy students’ perceptions and attitudes toward face-to-face vs. virtual team-based learning (TBL) in the didactic curriculum: a mixed-methods study. Med Educ Online. 2023;28(1):2226851. doi: 10.1080/10872981.2023.2226851.
  • Shen J, Qi H, Mei R, et al. A comparative study on the effectiveness of online and in-class team-based learning on student performance and perceptions in virtual simulation experiments. BMC Med Educ. 2024;24(1):135. doi: 10.1186/s12909-024-05080-3.
  • Accreditation standards and key elements for the professional program in pharmacy leading to the doctor of pharmacy degree (“Standards 2016”). Chicago, Illinois; February 2015. Available from: https://www.acpe-accredit.org/pdf/Standards2016FINAL.pdf. Accessed 01 October 2023.
  • Spivey CA, Chisholm-Burns MA, Johnson JL. Factors associated with student pharmacists’ academic progression and performance on the national licensure examination. Am J Pharm Educ. 2020;84(2):7561. doi: 10.5688/ajpe7561.
  • Tejada FR, Parmar JR, Purnell M, et al. Admissions criteria as predictors of academic performance in a three-year pharmacy program at a historically black institution. Am J Pharm Educ. 2016;80(1):6. doi: 10.5688/ajpe8016.
  • Windle JM, Spronken-Smith RA, Smith JK, et al. Preadmission predictors of academic performance in a pharmacy program: a longitudinal, multi-cohort study. Curr Pharm Teach Learn. 2018;10(7):842–853. doi: 10.1016/j.cptl.2018.04.018.
  • Chisholm MA, Cobb HH, III, Kotzan JA. Significant factors for predicting academic success of first-year pharmacy students. Am J Pharm Educ. 1995;59(4):364–370. doi: 10.1016/S0002-9459(24)04472-3.
  • Kelly KA, Secnik K, Boye ME. An evaluation of the pharmacy college admissions test as a tool for pharmacy college admissions committees. Am J Pharm Educ. 2001;65(3):225–230.
  • Wu-Pong S, Windridge G, Osborne D. Evaluation of pharmacy school applicants whose first language is not English. Am J Pharm Educ. 1997;61(1):61–66. doi: 10.1016/S0002-9459(24)08141-5.
  • Carroll CA, Garavalia LS. Gender and racial differences in select determinants of student success. Am J Pharm Educ. 2002;66(4):382–387.
  • U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Evaluation of evidence-based practices in online learning: a meta-analysis and review of online learning studies. Washington, D.C.; September 2010. Available from: https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf. Accessed 29 April 2024.