
Rethinking online assessment from university students’ perspective in COVID-19 pandemic

Article: 2082079 | Received 18 Aug 2021, Accepted 05 May 2022, Published online: 06 Jun 2022

Abstract

The recent COVID-19 pandemic prompted the implementation of online teaching and online assessment. Online assessment can be challenging for both teachers and students due to technical, academic, and ethical issues. In this study, we adopted both qualitative and quantitative approaches to evaluate (1) the perceived effectiveness of online assessment; (2) barriers and problems of using online assessment; and (3) suggestions for improvement. The online survey was conducted in May 2020, and 752 full-time undergraduate and postgraduate students completed the questionnaire. Forty-three undergraduate students attended an individual interview between May and June 2020. A total of 739 (98.3%) students had experience of taking online assessment during the COVID-19 outbreak. The survey results revealed that only 16.6% of students were satisfied with their online assessment arrangements. The major difficulty that students encountered was technical problems (52.6%). The majority of students (72.6%) agreed that online assessments were more affected by computer problems and internet connection than traditional examinations. Students expressed that teachers’ feedback was essential for their learning, and they wished to receive timely and detailed feedback on their performance. Students suggested that technical support should be provided and that standardized measures should be taken to ensure academic honesty.

1. Introduction

Assessment plays a crucial role in education and is considered a core component of effective learning (Gikandi et al., Citation2011). Assessment provides feedback to teachers and students, and the results serve as a guide for correction (Bloom, Citation1968). Numerous assessment methods are available, such as examinations, tests, quizzes, written assignments, individual projects, and group projects. The assessment methods used should be in line with the course objectives, and they often vary among programs (Organisation for Economic Co-operation and Development, Citation2013). Traditionally, the majority of assessment methods require face-to-face contact, such as an end-of-term final examination or an in-class presentation (Vyas et al., Citation2021).

Following the outbreak of Coronavirus Disease 2019 (COVID-19) in January 2020, the Hong Kong government implemented a series of preventive measures. Classes were suspended and social gatherings were discouraged (Centre for Health Protection, Citation2021). The university switched the majority of its teaching and learning activities to online mode to reduce human-to-human contact and to cater for the needs of overseas students. As the COVID-19 pandemic persisted throughout the semester (World Health Organization, Citation2021), online teaching remained the preferred option and online assessment became inevitable.

While many local universities have stated plans and strategies for adopting technology to promote e-learning, most teachers and students had only limited experience of conducting online assessment (Coniam et al., Citation2014; Evans et al., Citation2020; Foung & Chen, Citation2019). Under COVID-19, modification of assessment methods was often needed (Carrillo & Flores, Citation2020). Common strategies included shifting from traditional summative assessment to formative assessment, changing the final written examination to a project or term paper, or hosting online examinations (Cleland et al., Citation2020; Nic Dhonncha & Murphy, Citation2021). In our university, three types of online examinations were adopted: (i) synchronous or asynchronous online examinations with a lockdown browser, webcam, and video analysis to detect and prevent cheating (Aguirre & Selampinar, Citation2020); (ii) synchronous online examinations with any online quiz system and video-conferencing tool; and (iii) paper-form synchronous online examinations invigilated via video-conferencing tool.

Nevertheless, with such sudden and massive changes in assessment methods, both teachers and students raised many concerns. There were worries that technical problems might arise during online examinations, that projects or assignments might not be as effective as examinations in assessing students’ learning, and that academic integrity would be difficult to ensure and plagiarism or cheating difficult to prevent during online assessments (Darling-Hammond & Hyler, Citation2020; Deranek & Parnther, Citation2015). The current study aimed to collect students’ opinions, identify barriers and problems, and gather suggestions for improving online assessment.

2. Literature review

Chickering and Gamson suggested seven principles of good practice for teaching and learning in undergraduate education (Chickering & Gamson, Citation1987): encouraging contact between students and faculty, developing cooperation among students, encouraging active learning, giving prompt feedback, emphasizing time on task, communicating high expectations, and respecting diverse ways of learning. These principles have been widely adopted as performance criteria to evaluate the effectiveness of assessment, including online assessment in web-based virtual classrooms (Gorsky & Blau, Citation2009; Tartavulea et al., Citation2020). Morgan and O’Reilly proposed ten key qualities of online assessment (Morgan & O’Reilly, Citation2005): a clear rationale and consistent pedagogical approach; explicit values, aims, criteria, and standards; relevant, authentic, and holistic tasks; awareness of students’ learning contexts and perceptions; sufficient and timely formative feedback; a facilitative degree of structure; an appropriate volume of assessment; validity and reliability; certifiability as students’ own work; and continuous improvement via evaluation and quality enhancement. Both frameworks place great emphasis on giving feedback and engaging students in the assessment.

Effective feedback has been suggested to be essential for stimulating students to learn and to develop effective studying strategies (Gikandi et al., Citation2011). The greatest advantages of online assessment are its flexibility and accessibility (Rolim & Isaias, Citation2019): flexibility around the time and place of taking an assessment enhances students’ learning experience. Technology also helps consolidate the reliability and validity of online assessment by providing timely interactive feedback (Bajzek et al., Citation2008). Students generally received more prompt feedback from peer assessment and computer-marked assessment than from teacher-marked assessment (Ogange et al., Citation2018). Nevertheless, there were challenges in the design, development, and delivery of online assessment and evaluation. The quality of the feedback generated from online assessment could be a concern; reported problems included unclear feedback from tutors and vague suggestions for improvement (Higgins et al., Citation2001; Weaver, Citation2006). The choice of assessment methods should be in line with the course objectives, but assessment options might be limited by the online setting (Hickman et al., Citation2005). Securing and proctoring online tests was crucial (Howlett & Hewett, Citation2005), and teachers should pay special attention to question design when traditional testing is delivered in distance learning in order to maintain academic integrity. Students’ acceptance of online assessment was another issue: inadequate peer support could affect motivation and confidence in online assessment (Webb & Jones, Citation2009).

As the mode of communication and the learning paradigm shifted, assessment practices in the online environment should be adjusted to direct teaching and promote learning (Bartley, Citation2005). Teachers’ awareness of transforming the way students are assessed through technology was rising (Sampson et al., 2014). Previous studies revealed that multiple factors apart from personal ability could affect students’ performance, such as the quality of teaching, the level of the course, familiarity with the examination format, and the quality of examination items (Inuwa et al., Citation2012). These multiple factors should therefore be considered when researching students’ preferences among assessment methods.

3. Methods

3.1. Study design and study population

The study population included undergraduate and postgraduate students from all eight faculties at the participating university.

The current study consisted of two parts.

The first part was a cross-sectional study that aimed to collect quantitative data. Students were invited to fill in an online questionnaire in May 2020; the invitation was sent to all eligible students through email. The questionnaire contained five sections, namely (1) types of online assessment method used; (2) evaluation of the overall effectiveness of online assessment from students’ perspectives; (3) problems encountered using online assessments, including written assignments, presentations, tests, quizzes, and examinations; (4) students’ perception of teachers’ competency in the implementation of online assessment; and (5) obstacles or difficulties in online assessment. A five-point Likert scale with responses from “strongly agree” to “strongly disagree” was used; for challenges and suggestions, open-ended questions were used. The questionnaire was pre-tested on 15 students in April 2020 before large-scale distribution, and minor rephrasing of the questions was done after the pilot testing.

The second part aimed to collect qualitative data on students’ opinions and suggestions on online assessment through individual interviews. Students were invited to participate in an in-depth individual online interview between May and June 2020; the invitation was sent to all eligible students through email. Data collected from the online questionnaire were used as a reference when developing the interview questions. Findings from the questionnaire and the interviews are reported separately.

3.2. Data processing and statistical analysis

Descriptive statistics such as means, standard deviations, and percentages were used to report the questionnaire results. A chi-squared test was used to determine whether students’ perceptions of difficulty with different types of online assessment differed among faculties. A coding scheme was developed based on all the collected qualitative data: two researchers coded separately, then discussed and agreed on the coding results. The inter-coder agreement for the open-ended questions was almost 99%. Students’ responses to the semi-structured questions, such as the types of barriers and difficulties encountered in online assessments and the support needed, were summarized into main themes using content analysis. Based on the latest official information reported by the University Grants Committee in 2020, 20,464 full-time students from the eight faculties of the participating university were eligible to participate in this research. The estimated sample size required to achieve a 95% confidence level (significance level of 0.05) with a 5% margin of error was 378 students. IBM SPSS Statistics (version 26) was used to analyse the questionnaire data.
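The two quantitative steps above can be made concrete with a short sketch. The following Python snippet is not the authors’ code: it assumes SciPy is available, and the contingency counts are hypothetical placeholders rather than the study’s data. It reproduces the finite-population sample-size estimate of 378 and illustrates the kind of chi-squared test of independence described here.

```python
import math
from scipy.stats import chi2_contingency

# --- Sample size for a proportion, 95% confidence, 5% margin of error ---
N = 20464   # eligible full-time students (UGC figure cited above)
z = 1.96    # z-score for a 95% confidence level
p = 0.5     # most conservative assumed proportion
e = 0.05    # margin of error

n0 = (z ** 2) * p * (1 - p) / e ** 2   # infinite-population estimate (~384)
n = n0 / (1 + (n0 - 1) / N)            # finite-population correction
print(math.ceil(n))                    # -> 378, matching the text

# --- Chi-squared test: perceived difficulty vs. assessment type ---
# Rows: assessment types; columns: [difficulty, no difficulty].
# These counts are invented placeholders, not the study's data.
observed = [
    [120, 250],   # written assignments
    [140, 230],   # presentations
    [260, 110],   # online examinations
]
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
```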

The individual interviews were conducted by one researcher, audio-recorded, and transcribed by student helpers. A general inductive approach was adopted in the data analysis. All the transcripts were first read and checked for accuracy by two researchers independently, and a coding frame was developed after discussion. The transcripts were then read several times to identify themes and categories: they were coded by one researcher and checked by another researcher, and if new codes emerged, the coding frame was revised and the transcripts were reread according to the new structure. This process was used to develop categories, which were then summarized and classified into broad themes covering students’ experience of, opinions on, and suggestions for online assessment.

3.3. Ethics

The current study was approved by the University Survey and Behavioural Research Ethics Committee (Reference No. SBRE-19-624). All students provided informed consent before filling in the questionnaire or joining an interview.

4. Results

A total of 752 students completed the online survey. The majority (86.6%) of respondents were undergraduate students. Respondents came from all eight faculties of the university, namely the Faculty of Arts, the Faculty of Business, the Faculty of Education, the Faculty of Engineering, the Faculty of Law, the Faculty of Medicine, the Faculty of Science, and the Faculty of Social Science. Table 1 shows the demographic distribution of the participants in the online survey. In addition, 43 students from the eight faculties attended the individual interviews.

Table 1. Demographics of questionnaire respondents

Table 2 summarizes students’ engagement in online assessment during COVID-19. A total of 739 (98.3%) students had taken an online assessment during the COVID-19 outbreak. The most frequently used methods were online tests, quizzes, and examinations taken at a fixed time and place within a fixed period (69.0%), followed by open-book online tests, quizzes, and examinations (59.7%), online presentations (54.1%), and group projects and collaborative writing through an online platform (50.3%). There were 526 (69.9%) students who attempted online assessments at least 5 times, and 222 (29.5%) who attempted them at least 10 times; on average, students took online assessments 6.5 times.

Table 2. Online assessment methods that students attempted during COVID-19

Table 3 summarizes students’ opinions on online assessment. A total of 125 (16.6%) students were satisfied with their online assessment arrangements, while 348 (46.3%) were dissatisfied. There were 546 students (72.6%) who agreed that online assessments were more affected by computer problems and internet connection than traditional examinations; 435 (57.8%) thought it was more convenient for students to do online assessments at their preferred location and at a desired time slot; 422 (56.1%) agreed that feedback given by the teacher was prompt and easy to understand; 418 (55.6%) agreed that teachers gave good instructions and guidance for their online assignments; and 393 (52.3%) reported that inexperience with the new format made online assessments more stressful.

Table 3. Students’ opinion on online assessment from questionnaire

Table 4 summarizes teachers’ and students’ competency with online assessment. The major barrier that students faced was technical problems (52.6%). A higher percentage of students reported having no difficulty with written assignments (P < 0.01) and presentations (P = 0.02) than reported difficulty, whereas for online examinations a higher percentage reported difficulties than did not (P < 0.01). The majority of students agreed that their teachers could make use of an online discussion board to encourage questions and discussion (83.9%) and that their teachers looked for opportunities to provide feedback (71.5%). Around half of the students thought that their teachers were capable of assisting self- and peer assessment (54.9%) and that their teachers were skilful in using synchronous technologies to communicate in real time (48.9%). Only 21.8% of students agreed that their teachers were skilful in using formative assessment, such as ungraded quizzes, to check students’ learning.

Table 4. Teachers’ and students’ competency with online assessment from questionnaire

Table 5 summarizes students’ comments on online assessment and their suggestions for improvement from the individual interviews. Students expressed a preference for traditional assessment in professional or practicum courses. They thought their learning could be facilitated if teachers returned students’ course work earlier and provided more timely and detailed feedback. To avoid cheating, students suggested that a lockdown browser or camera be used for examination invigilation, and that the examination format be adjusted by setting more challenging, application-oriented questions that test students’ understanding of fundamental concepts and prevent copying answers directly from online sources. Assessment rubrics for presentations should also be revised in accordance with the virtual context. At the university level, students hoped a set of rules could be developed to standardize practice across departments. They also hoped that technical problems with online assessment could be addressed by allowing students to take online examinations on campus or to get real-time support from the university information technology service unit.

Table 5. Students’ opinion and suggestions from individual interviews

5. Discussion

The COVID-19 pandemic has caused a paradigm shift in many aspects of daily life (Khan & Jawaid, Citation2020). One major change is the implementation of online teaching in primary, secondary, and tertiary institutions (Moorhouse, Citation2020; Ng et al., Citation2020). While there are debates on the pros and cons of online teaching, issues with online assessment have also attracted teachers’ and students’ attention (Elzainy et al., Citation2020). Assessment is a crucial component of teaching and learning, as it evaluates students’ achievement of course learning outcomes (Gorsky & Blau, Citation2009; Tartavulea et al., Citation2020). Online assessment tools have been available for a long time; however, they have not often been adopted for major assessments due to concerns over validity, reliability, and dishonesty (Guangul et al., Citation2020).

When comparing computer-based examinations with paper-and-pencil examinations, some studies suggested that performance results were similar (Martinez et al., Citation2009), although some students reported that their performance was adversely affected by the online environment (Dillon & Clyman, Citation1992). Detailed studies revealed that multiple factors could affect student performance (Inuwa et al., Citation2012). In the current study, fewer than 20% of students were satisfied with the online assessment arrangements. Nearly half of the students did not like online examinations and preferred traditional paper-and-pencil examinations. The major barrier students encountered in online assessment was technical problems: 72.6% of students thought they were more affected by computer problems and internet connection than in a traditional examination. Around 40% of students found it more difficult to focus on a test in an online assessment environment, and over 50% expressed that inexperience with the new format had made online assessments more stressful. The current study echoes literature findings that students’ performance in online assessment might be affected by multiple factors, and that students might be more stressed by online assessment (Stowell & Bennett, Citation2010).

Ensuring academic honesty is one of the major challenges in online assessment (Deranek & Parnther, Citation2015). In a recent study conducted at a university in Jordan, nearly 45% of students admitted misconduct or dishonesty during remote online examinations, including seeking help from friends or searching for answers from all possible sources (Elsalem et al., Citation2021). Academic dishonesty concerns not only faculty members but also students (McGee, Citation2013; Spaulding, Citation2009). Students who attended the individual interviews pointed out that it was crucial for the university to consolidate a set of rules and regulations to standardize practice across departments. Common strategies that students proposed included changing closed-book examinations to open-book examinations, making the use of a lockdown browser and camera for examination invigilation compulsory, and adjusting the examination format by setting more challenging and practical questions. The decision to change assessment methods requires serious consideration, especially for professional programs (Sagoo et al., Citation2020). Nevertheless, it is clear that program-specific modifications are inevitable as the COVID-19 pandemic persists.

Changing the assessment format by increasing the proportion of formative assessment is often proposed for online assessment. Formative assessment serves as a tool to boost students’ achievement and identify learning gaps (Hayes, Citation2015). Effective integration of formative assessment with sustained interactions among learners and teachers supports higher-order deep learning and fosters the formation of a meaningful learning community (Sorensen & Takle, Citation2005). Successful feedback is frequent and detailed; it should focus on students’ performance, be timely, be appropriate to students’ conception of knowledge, and meet the learning objectives (McCarthy, Citation2017). Computer-marked assessment can provide more prompt feedback to students, and interactive formative feedback has proven useful in helping students overcome shyness in expressing themselves (Ogange et al., Citation2018). In the current study, students expressed that getting feedback was essential to their learning. While most students thought that feedback given by the teacher was fast and easy to understand, they had less interaction with, and received less feedback from, their classmates. Students’ comments correlate with previous findings that common difficulties in implementing online formative assessment were the lack of a supportive peer-learning environment and the lack of high-quality feedback (Webb & Jones, Citation2009). Future development of online assessment should therefore include strategies to engage students in discussion. In research conducted at a Korean online university in 2019, six factors were found to have a direct impact on student engagement in the e-learning environment: psychological motivation, peer collaboration, cognitive problem solving, interactions with instructors, community support, and learning management (Lee et al., Citation2019). Moreover, engaged learners tended to demonstrate good communication skills and proficient cooperative and self-directed learning through effective use of online technology (Dixson, Citation2015). Future research on overcoming the barriers in online assessment could therefore focus on increasing students’ engagement in a collaborative learning environment by boosting their competencies in online learning and shaping high-quality learning outside the traditional classroom (Golladay et al., Citation2000).

The current study has several limitations. Firstly, the survey was conducted in a single tertiary educational institution, so the applicability of the findings to other universities worldwide may be limited. Secondly, the survey only covered students’ perspectives on common types of online assessment; more complicated types, such as practicums, laboratory testing, or micro-teaching, were not covered. Thirdly, detailed research into the faculties’ distribution of online formative and summative assessments and the difficulties faced in implementing different types of online assessment was not within the scope of this research. Nevertheless, the current research is a call to the different stakeholders in higher education to pay attention to the difficulties students have faced in taking both formative and summative online assessments. Through this research, different stakeholders in the university will be encouraged to reconsider the possibility of allowing more online courses in different faculties. Future research should also focus on ways to establish a comprehensive system of validation for university examinations taken online.

6. Conclusion

In conclusion, students’ satisfaction with online assessment was low, and their major concern related to technical problems during assessment. The current study reveals an urgent need to explore how to develop a safe, valid, and reliable online assessment system that can meet the needs of higher education in Hong Kong. The current research project is only the beginning of an exploration of how students in different faculties responded to online assessment during the pandemic and a summary of their major problems and difficulties. Future research should focus on the accreditation process for online assessment, including how universities could provide universal, formal, third-party recognition of students’ competence in performing specific tasks.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the University Grants Committee Teaching Development and Language Enhancement Grant (2019–22), HK$99,904, “Evaluation of the Concerns and Barriers of Online Assessment at CUHK – Students’ Perspectives” (Project Code: 4170684).

References

  • Aguirre, J. D., & Selampinar, F. (2020). Teaching Chemistry in the time of COVID-19: Memories and the classroom. Journal of Chemical Education, 97(9), 2909–2913. https://doi.org/10.1021/acs.jchemed.0c00742
  • Bajzek, D., Brooks, J., Jerome, W., Lovett, M., Rinderle, J., Rule, G., & Thille, C. (2008). Assessment and instruction: Two sides of the same coin. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 560–565). Association for the Advancement of Computing in Education (AACE).
  • Bartley, J. M. (2005). Assessment is as assessment does: A conceptual framework for understanding online assessment and measurement. In S. Howell & M. Hricko (Eds.), Online assessment and measurement: Foundations and challenges (pp. 1–45). IGI Global.
  • Bloom, B. S. (1968). Learning for mastery. Instruction and curriculum. Regional education laboratory for the Carolinas and Virginia, topical papers and reprints, number 1. Evaluation Comment, 1(2), n2. https://files.eric.ed.gov/fulltext/ED053419.pdf
  • Carrillo, C., & Flores, M. A. (2020). COVID-19 and teacher education: A literature review of online teaching and learning practices. European Journal of Teacher Education, 43(4), 466–487. https://doi.org/10.1080/02619768.2020.1821184
  • Centre for Health Protection, Department of Health. (2021). Retrieved March 30, 2021, from https://www.chp.gov.hk/en/index.html
  • Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 3(7), 1–6. https://files.eric.ed.gov/fulltext/ED282491.pdf
  • Cleland, J., McKimm, J., Fuller, R., Taylor, D., Janczukowicz, J., & Gibbs, T. (2020). Adapting to the impact of COVID-19: Sharing stories, sharing practice. Medical Teacher, 42(7), 772–775. https://doi.org/10.1080/0142159X.2020.1757635
  • Coniam, D. (Ed.). (2014). English language education and assessment: Recent developments in Hong Kong and the Chinese Mainland. Springer.
  • Darling-Hammond, L., & Hyler, M. E. (2020). Preparing educators for the time of COVID … and beyond. European Journal of Teacher Education, 43(4), 457–465. https://doi.org/10.1080/02619768.2020.1816961
  • Deranek, J., & Parnther, C. (2015). Academic honesty and the new technological frontier. The Hilltop Review, 8(1), 4. https://scholarworks.wmich.edu/cgi/viewcontent.cgi?article=1134&context=hilltopreview
  • Dillon, G. F., & Clyman, S. G. (1992). The computerization of clinical science examinations and its effect on the performances of third-year medical students. Academic Medicine, 67(10), S66–S68.
  • Dixson, M. D. (2015). Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning, 19(4), n4. https://doi.org/10.24059/olj.v19i4.561
  • Elsalem, L., Al-Azzam, N., Jum’ah, A. A., & Obeidat, N. (2021). Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Annals of Medicine and Surgery, 62, 326–333. https://doi.org/10.1016/j.amsu.2021.01.054
  • Elzainy, A., El Sadik, A., & Al Abdulmonem, W. (2020). Experience of e-learning and online assessment during the COVID-19 pandemic at the College of Medicine, Qassim University. Journal of Taibah University Medical Sciences, 15(6), 456–462. https://doi.org/10.1016/j.jtumed.2020.09.005
  • Evans, J. C., Yip, H., Chan, K., Armatas, C., & Tse, A. (2020). Blended learning in higher education: Professional development in a Hong Kong university. Higher Education Research & Development, 39(4), 643–656. https://doi.org/10.1080/07294360.2019.1685943
  • Foung, D., & Chen, J. (2019). A learning analytics approach to the evaluation of an online learning package in a Hong Kong University. Electronic Journal of e-Learning, 17(1), 11–24. https://files.eric.ed.gov/fulltext/EJ1215541.pdf
  • Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004
  • Golladay, R. M., Prybutok, V. R., & Huff, R. A. (2000). Critical success factors for the online learner. Journal of Computer Information Systems, 40(4), 69–71. https://doi.org/10.1080/08874417.2000.11647468
  • Gorsky, P., & Blau, I. (2009). Online teaching effectiveness: A tale of two instructors. The International Review of Research in Open and Distributed Learning, 10(3). https://doi.org/10.19173/irrodl.v10i3.712
  • Guangul, F. M., Suhail, A. H., Khalit, M. I., & Khidhir, B. A. (2020). Challenges of remote assessment in higher education in the context of COVID-19: A case study of Middle East College. Educational Assessment, Evaluation and Accountability, 32(4), 519-535. https://doi.org/10.1007/s11092-020-09323-x
  • Hayes, A. M. (2015). From assessment to instruction: The impact of online formative assessment in reading on teachers’ planning and instruction in the middle school English Language Arts classroom (Order No. 10110670) [Doctoral dissertation]. ProQuest Dissertations & Theses. https://www.proquest.com/dissertations-theses/assessment-instruction-impact-online-formative/docview/1798048466/se-2
  • Hickman, C. J., Bielema, C., & Gunderson, M. (2005). Challenges in the design, development, and delivery of online assessment and evaluation. In S. Howell & M. Hricko (Eds.), Online assessment and measurement: Foundations and challenges (pp. 132–164). IGI Global.
  • Higgins, R., Hartley, P., & Skelton, A. (2001). Getting the message across: The problem of communicating assessment feedback. Teaching in Higher Education, 6(2), 269–274.
  • Howlett, B., & Hewett, B. (2005). Securing and proctoring online tests. In S. Howell & M. Hricko (Eds.), Online assessment and measurement: Foundations and challenges (pp. 300–329). IGI Global.
  • Inuwa, I. M., Taranikanti, V., Al‐Rawahy, M., & Habbal, O. (2012). Anatomy practical examinations: How does student performance on computerized evaluation compare with the traditional format? Anatomical Sciences Education, 5(1), 27–32. https://doi.org/10.1002/ase.254
  • Khan, R. A., & Jawaid, M. (2020). Technology Enhanced Assessment (TEA) in COVID 19 Pandemic. Pakistan Journal of Medical Sciences, 36(COVID19–S4), S108–S110. https://doi.org/10.12669/pjms.36.COVID19-S4.2795
  • Lee, J., Song, H. D., & Hong, A. J. (2019). Exploring factors, and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11(4), 985. https://doi.org/10.3390/su11040985
  • Martínez, T., Vidal-Abarca, E., Gil, L., & Gilabert, R. (2009). On-line assessment of comprehension processes. The Spanish Journal of Psychology, 12(1), 308–319.
  • McCarthy, J. (2017). Enhancing feedback in higher education: Students’ attitudes towards online and in-class formative assessment feedback models. Active Learning in Higher Education, 18(2), 127–141. https://doi.org/10.1177/1469787417707615
  • McGee, P. (2013). Supporting academic honesty in online courses. Journal of Educators Online, 10(1), 1–31. https://doi.org/10.9743/JEO.2013.1.6
  • Moorhouse, B. L. (2020). Adaptations to a face-to-face initial teacher education course ‘forced’ online due to the COVID-19 pandemic. Journal of Education for Teaching, 46(4), 609–611. https://doi.org/10.1080/02607476.2020.1755205
  • Morgan, C., & O’Reilly, M. (2005). Ten key qualities of assessment online. In S. Howell & M. Hricko (Eds.), Online assessment and measurement: Foundations and challenges (pp. 86–101). IGI Global.
  • Ng, T., Chu, S., Li, X., Reynolds, R., & Chan, M. (2020). Business (Teaching) as usual amid the COVID-19 pandemic: A case study of online teaching practice in Hong Kong. Journal of Information Technology Education: Research, 19(1), 775–802. https://doi.org/10.28945/4620
  • Nic Dhonncha, E., & Murphy, M. (2021). Learning new ways of teaching and assessment: The impact of COVID‐19 on undergraduate dermatology education. Clinical and Experimental Dermatology, 46(1), 170–171. https://doi.org/10.1111/ced.14364
  • Ogange, B., Agak, J., Okelo, K., & Kiprotich, P. (2018). Student perceptions of the effectiveness of formative assessment in an online learning environment. Open Praxis, 10(1), 29–39. https://doi.org/10.5944/openpraxis.10.1.705
  • Organisation for Economic Co-operation and Development. (2013). Student assessment: Putting the learner at the centre. In Synergies for better learning: An international perspective on evaluation and assessment (pp. 29–39). OECD Publishing.
  • Rolim, C., & Isaias, P. (2019). Examining the use of E‐assessment in higher education: Teachers and students’ viewpoints. British Journal of Educational Technology, 50(4), 1785–1800. https://doi.org/10.1111/bjet.12669
  • Sagoo, M. G., Vorstenbosch, M., Bazira, P. J., Ellis, H., Kambouri, M., & Owen, C. (2020). Online assessment of applied anatomy knowledge: The effect of images on medical students’ performance. Anatomical Sciences Education, 14(3), 342–351. https://doi.org/10.1002/ase.1965
  • Sorensen, E. K., & Takle, E. S. (2005). Investigating knowledge building dialogues in networked communities of practice. A collaborative learning endeavor across cultures. Interactive Educational Multimedia: IEM, 10, 50–60. http://www.ub.edu/multimedia/iem
  • Spaulding, M. (2009). Perceptions of academic honesty in online vs. face-to-face classrooms. Journal of Interactive Online Learning, 8(3), 183–198. http://www.ncolr.org/jiol/issues/pdf/8.3.1.pdf
  • Stowell, J. R., & Bennett, D. (2010). Effects of online testing on student exam performance and test anxiety. Journal of Educational Computing Research, 42(2), 161–171. https://doi.org/10.2190/EC.42.2.b
  • Tartavulea, C. V., Albu, C. N., Albu, N., Dieaconescu, R. I., & Petre, S. (2020). Online teaching practices and the effectiveness of the educational process in the wake of the COVID-19 pandemic. Amfiteatru Economic, 22(55), 920–936.
  • Vyas, V. S., Kemp, B., & Reid, S. A. (2021). Zeroing in on the best early-course metrics to identify at-risk students in general chemistry: An adaptive learning pre-assessment vs. traditional diagnostic exam. International Journal of Science Education, 43(4), 552–569. https://doi.org/10.1080/09500693.2021.1874071
  • Weaver, M. R. (2006). Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31(3), 379–394.
  • Webb, M., & Jones, J. (2009). Exploring tensions in developing assessment for learning. Assessment in Education: Principles, Policy & Practice, 16(2), 165–184.
  • World Health Organization. (2021). WHO Coronavirus (COVID-19) Dashboard. https://covid19.who.int