
Assessment considerations during lockdown in Norway: An exploratory case study with focus on misconducts in university mathematics

Article: 2210456 | Received 14 Oct 2022, Accepted 01 May 2023, Published online: 11 May 2023

Abstract

The present paper raises a discussion about assessment formats in mathematics courses at Norwegian universities during the Covid lockdown. Assessment proved to be challenging since the European GDPR regulations are strictly interpreted in Norway, making proctoring at home difficult. Based on analyses of nine university teachers’ feedback on how exams were carried out at their universities during lockdown, a discussion is raised about assessment modes and misconduct. The result shows how a framework from another research field can be adjusted to analyze data about the assessment situations. Next, by utilizing the different components (themes) of the adjusted framework, we shed light on perspectives on misconduct in un-proctored home exams. In doing so, the paper informs the discussion on challenges related to assessing students in mathematics at home. The results are relevant for future educational settings since changes in the demographic profile of students increase the topicality of online assessment.

PUBLIC INTEREST STATEMENT

The present work raises a discussion about assessment formats in mathematics courses at Norwegian universities during the Covid lockdown. This proved to be challenging since the European GDPR regulations are strictly interpreted in Norway, making proctoring at home difficult. Based on an analysis of nine university teachers’ feedback on how un-proctored home exams were carried out at their universities, a discussion is raised about assessment modes and misconduct. The analysis shows how a set of four different perspectives can be used to shed light on this issue. It informs the discussion about challenges related to assessing students in mathematics at home. The results are relevant for future educational settings since changes in the demographic profile of students increase the topicality of online assessment.

Introduction

In March 2020, well over 100 countries worldwide instituted a lockdown due to the COVID-19 outbreak. Norway faced a full lockdown, and university students had to study from home. According to Crawford et al. (Citation2020), data from 172 sources in 20 countries show that a majority switched to fully online education, which was also the case in Norway. The lockdown lasted for nearly two years with a few opening periods, and during these years both teachers and students gained experience in online education and assessment. Meeting physically was not recommended, and home exams became the new norm. However, the youth of today interact on social networks (Jukes et al., Citation2010) and want to get information in dynamic ways (Dineva et al., Citation2019), and thus ought to be familiar with the use of technology in teaching. This also applies to mathematics teaching, as surveyed by Engelbrecht et al. (Citation2020). They show that students in mathematics collaborate well in digital environments, often by drawing on technology in the process. This is despite the fact that mathematics can be challenging online, since it is difficult to write mathematical symbols in various digital forums (Trenholm & Peschke, Citation2020).

Online teaching has its challenges, a vital one being that it puts more responsibility on the students. They have to work with resources themselves instead of being guided through them in a teacher-led face-to-face setting (Trenholm & Peschke, Citation2020). Assessing students online is even more challenging, particularly when it comes to integrity. The review by Butler-Henderson and Crawford (Citation2020) on online examinations shows that the most prevalent focus among researchers looking into online examinations relates to cheating. A crucial question is whether to utilize online proctoring of students. There is growing evidence of its effectiveness, but there are many concerns, especially regarding students’ privacy and anxiety caused by a stressful situation (Butler-Henderson & Crawford, Citation2020; Eaton & Turner, Citation2020). In Norway, legal regulations that protect privacy are strict (Lovdata, Citation2018); thus, online proctoring of written examinations is difficult. Without invigilation, there is a risk of academic misconduct, as many scholars have highlighted (references in Butler-Henderson & Crawford, Citation2020). This is the context of the present study, which focuses on assessment of mathematics in an un-proctored environment.

Researchers have pointed to the powerful influence assessment practices have on students’ learning (Marriott & Lau, Citation2008); investigating assessment issues in home exams is therefore important. The present paper aims to contribute to this by illuminating different perspectives that may be taken on such examinations in mathematics. It is an exploratory case study in which nine university teachers from different university campuses in Norway were asked open questions about the assessment situation in an anonymous questionnaire. The paper aims to answer the following question:

During lockdown in Norway, which considerations have influenced university teachers’ choice of assessment modes in mathematics, and what considerations related to misconduct have been taken?

Theoretical background

The research question calls for a closer look at assessment, home exams, proctoring and misconduct. In addition, a relevant framework by Bjerrum Nielsen (Citation2003) will be explained.

Assessment

It is well documented in research that assessment in higher education guides what students perceive to be the curriculum in a subject (Ramsden, Citation2003). Students adjust their approach to learning to what they interpret to be the assessment requirements (Entwistle & Entwistle, Citation1991), and there is a strong relation between assessment and instruction (Heuvel-Panhuizen & Becker, Citation2003). Assessment is a main motivator for students’ learning in a course and should thus be aligned with the intended outcomes (Biggs & Tang, Citation2011). However, the direct relationship between assessment and learning has been questioned, since deeper approaches to learning cannot be assumed simply by changing the assessment demands (Struyven et al., Citation2005); still, assessment comes in different designs with different aims and with an indisputable importance (Suurtamm et al., Citation2016).

As pointed out by Wanner and Palmer (Citation2018), there is an increasing interest in including assessment for learning and assessment as learning, where students are asked to assess their own learning. Still, in mathematics, assessment of learning, that is summative assessment in a closed-book examination, is traditionally favored by lecturers (Iannone & Simpson, Citation2011). Iannone and Simpson show that in the UK, timed written exams with no access to external material dominated (2011), a pattern that had only slightly declined a decade later (Iannone & Simpson, Citation2021). While the general assessment literature emphasizes students’ dislike of traditional assessment formats, mathematics students prefer them (Iannone & Simpson, Citation2015). When moving from face-to-face to online instruction, though, significant changes in communication, interaction and assessment are needed (Trenholm & Peschke, Citation2020).

Home exams

Modes of online assessment in mathematics are growing (Greenhow, Citation2015), but for many teachers and students a transition to fully online examinations during Covid was a new experience. Trenholm and Peschke (Citation2020) describe a typical fully online mathematics course as having mixed assessment practices, with machine-marked testing mechanisms in a learning management system as one component. The other component, however, is proctored examinations, usually handwritten exams. Trenholm and Peschke identify six differences in teaching practices between face-to-face and online instruction, with significant changes in communication, interaction and assessment being needed. For the assessment part, decisions about invigilation are vital.

Online proctoring

Online proctoring may require students to accept camera surveillance of their workspace during the exam. In Norway, a number of carefully maintained laws regulate such a setting. Laws protect personal privacy (Datatilsynet, Citation2022), and in a society that offers education free of charge it is problematic to ask students to purchase video equipment (Lovdata, Citation2006, § 3.1). Additionally, Article 6 of the EU’s General Data Protection Regulation (GDPR) is strictly interpreted (Lovdata, Citation2018), meaning that proctoring should be voluntary. If some students are not comfortable with being proctored online during an exam, they should be offered another but similar assessment mode, and teachers should ensure that this alternative in no way disadvantages the student.

In their survey of literature about online examinations, Butler-Henderson and Crawford (Citation2020) find that the texts focus on themes like anxiety, perception, performance, cheating, technology and authentication, and security. While the latter are related to physical settings, anxiety, perception and performance concern individual features. According to the survey, the literature is inconsistent on whether home exams generate test anxiety, since some results show higher and others lower degrees of such anxiety (Butler-Henderson & Crawford, Citation2020). Nevertheless, proctoring deters misconduct (Hylton et al., Citation2016).

Misconducts

As reviewed by Eaton and Turner (Citation2020), there is a generous amount of research on why and how students are involved in cheating. Reasons may include high levels of anxiety related to academic integrity, including raised consciousness about mental health. Misconduct depends on what types of subjects are assessed, and mathematics courses are in a category that is typically assessed in a proctored environment (Trenholm, Citation2007). In un-proctored settings at home, misconduct may have both social and technological perspectives. For the former, Trenholm offers a range of unauthorized help on assessments: paid/unpaid surrogates, unauthorized collaboration and unauthorized coaching (2007, p. 284). The technological perspective of cheating concerns the use of non-permitted tools. Such tools may be written materials like textbooks and notes, online tutoring sites like Chegg (https://www.chegg.com/study) and calculators like Wolfram Alpha (https://www.wolframalpha.com/examples/mathematics). Tutoring sites present not only solutions but also how the tasks are accomplished (Richardson, Citation2021). A variety of remedies have been suggested to mitigate the cheating potential, be it proctoring, preventing unauthorized use of equipment/devices/test banks or limiting social activity (Hearn Moore et al., Citation2017). Still, as emphasized by Butler-Henderson and Crawford (Citation2020), research has focused primarily on technological challenges of cheating rather than ethical and social aspects.

The BN framework

Bjerrum Nielsen (Citation2003) presented a framework on “doing gender” that embraces four perspectives: structural, symbolic, interactional and personal gender. This framework, called the BN framework, has similarities with themes developed in the present work. The first perspective is structural gender, which concerns gender interpreted in relation to the social structure and the working environment. The second perspective is symbolic gender, which relates to structures like symbols and symbolic dialogues that show what is normal in a society; this evolves over a longer period, influenced by social structures that develop over time. The third perspective is interactional gender, which is about relations between individuals. The final perspective is personal gender, which is how individuals comprehend gender as a personal matter. The four perspectives represent different viewpoints used to look at the same situation (Wedege, Citation2007).

Methodology

The context and data collection

Ten teachers from universities around Norway were asked to respond to an anonymous questionnaire. These included universities that offer mathematics education (3), mathematics in engineering education (5) and both types of education (2). Some of the participants had leading positions within their mathematics community, but most were teachers and researchers. Anonymity was secured by distributing the questionnaire through an external webserver (nettskjema.no). In order to capture the teachers’ own explanations without predetermined statements, the questionnaire contained only three open questions, of which two are relevant for the present paper:

Q1: Can you describe which types of assessment you/your colleagues have used during the Corona epidemic (type, place/platform, implementation, grading scale, etc.) and what experiences you had with these?

Q2: Can you describe how you/your colleagues have tried to adjust assessment methods to avoid use of aids that are not allowed, e.g., illegal cooperation, use of not-permitted tools, another person taking the exam, and how this has worked?

After two reminders, nine responses were obtained. The mean length of the responses to Q1 was 178 words; for Q2 it was 120 words.

The sampling strategy when selecting quotes from the answers was to show the range of feedback but also to highlight responses with insightful opinions. Seven of the teachers were healthily skeptical about home exams in mathematics in this particular setting. Quotes from these teachers can be argued to be typical cases for the group (Miles et al., Citation2019). The remaining two teachers had slightly different feedback. One was unreservedly positive about home exams in this setting (T3). The other pointed out that he/she taught a special group of students (T4). Quotes from these two teachers were not representative of the entire group of teachers, but they are nevertheless valuable because they give reflective feedback on the investigated topic. Most of the teachers gave explanations and adequate reflections. A few, however, mainly stated facts and problems. Thus, some teachers are quoted more than others. The answers are translated from Norwegian.

Data analysis drawing on thematic analysis

As data were collected through open questions to the teachers, thematic analysis was an advantageous analysis tool (Braun & Clarke, Citation2006). Its appropriateness was determined before the questionnaire was distributed. After the data had been collected, the first step included a number of readings of the responses to become familiar with the texts. Eventually, some initial codes were developed and the responses were coded using these. With these temporary results at hand, it was realized that some codes were related and some had commonalities. This is part of the next step in thematic analysis: searching for themes (Braun & Clarke, Citation2006). During this phase, it was realized that the themes resembled the different perspectives given in the BN framework on gender (Bjerrum Nielsen, Citation2003). This was because the teachers referred to different levels of influence on the assessment situation, levels that could correspond to the perspectives. The BN framework had been used in an earlier work (Rensaa & Fredriksen, Citation2022).

The most general level is the structural perspective. In the BN framework, this is about traditional practices between genders established as social structures in society. In the assessment situation, this refers to the society’s formalized rules and regulations prevailing in the home exam situation. A similar resemblance was found between symbolic gender and symbolic assessment issues. In the BN framework, this is about what becomes normal and natural for women and men to do over time as they live with the social structures of society. In home examinations, this encapsulates practices established within the frames given by rules and regulations. At the individual level, teachers in the present study referred both to types of interactions between students and to personal decisions made by each student. Both had parallels to perspectives in the BN framework. Interactional gender in the BN framework is about the continuous social interaction in which gender perspectives are created, seen as something that is done. In an assessment situation, this is how students collaborate. The final and most local level of analysis is each person’s personal perspective. In the BN framework, this perspective refers to gender as a personal matter in which individuals shape their lives. Bjerrum Nielsen splits this into subjectivity, the “what you are”, and identity, the “what you have”, both being personal feelings. In the current data, reports about personal perspectives are given at a meta level through the teachers’ references to students’ personal behavior or actions. Subjectivity deals with “what students are”, while identity may be interpreted as how much work each student has chosen to put into the mathematics course or exam. The BN framework thereby acted as a catalyst for a revision of the initially developed themes, which is part of the final step in a thematic analysis (Braun & Clarke, Citation2006).

To exemplify the analysis, consider the following quote, which refers to the use of essays over a longer period of time as an assessment form:

T1: Of course, it is possible to get someone else to write an essay for you, but it requires much more than timed home exams. We therefore believe that this happens to a much lesser extent.

Originally, this quote was coded as chosen level of cooperation on exams, as students may take advantage of the situation and leave the work to someone else, and was thus part of a personal setting theme. However, by relating it to Bjerrum Nielsen’s personal perspectives, two realizations were made. The first was that the quote is about levels of misconduct rather than a personal perspective. Thus, the final coding of the quote was threshold of misconducts within a context theme. Secondly, the quote illuminates the difference between collaboration, being shared work between students (Hadjerrouit, Citation2012), and cooperation, where tasks are split between students (Barkley et al., Citation2014). This clarified the interactional theme.

To test the reliability of the coding, a fellow researcher was given a longer data sample together with the list of codes within each theme. The match/mismatch ratio in this test was 19:7. All mismatches represented one researcher finding more of the existing codes in the data than the other, the fellow researcher in five cases, the author in two. A follow-up discussion of this enriched the view on the data and was useful in a final review of the coding.
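For orientation, and read simply as a proportion (an interpretation offered here as an approximation, not a formal inter-rater statistic), the 19:7 ratio corresponds to agreement on 19 of the 26 coding decisions, that is, roughly 73%.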

Some limitations

The collected data comprise a relatively low number of responses, nine in all, which implies that some mathematics communities in Norway are missing. It is, however, a case study, which can “contribute uniquely to our knowledge of individual, organizational, social and political phenomena” (Yin, Citation1984). While two teachers mainly stated facts and thus are not quoted, their feedback was valuable for obtaining information about what has been done at universities around Norway.

Data received as written texts are not as rich as data from interviews, and follow-up questions are not possible. A counterbalancing argument, though, is that answers could be submitted anonymously, giving respondents an opportunity to reveal their thoughts without being recognized. This was a weighty argument in the present case.

Analysis

The previous section explains how the BN framework inspired themes that cover the different perspectives on home examinations, resulting in the following reorganized themes:

  • Structurally influenced assessment issues (SIAI)

  • Symbolic assessment issues (SAI)

  • Interactional assessment considerations (IAC)

  • Personal assessment values (PAV)

Similar to how the gender perspectives represent different points of view on the same situation (Wedege, Citation2007), the themes capture different lenses on the assessment modes described by the university teachers. Each theme is presented through its codes (in italics) and extracts from the data set to illustrate how the themes were relevant to the data.

SIAI: Structurally influenced assessment issues

Rules of society are given by law regulations from the Norwegian government, stating provisions for offering education at a high international level (Lovdata, Citation2005). In this, the level of the discipline is important. Relevant to assessment issues is §5 on students’ right to appeal. In home exams due to Covid restrictions, however, proctoring and the threshold of misconduct are inevitable issues, since the setting differs from traditional proctored school examinations:

T6: For my part, I have not seen myself able to ensure that no one else takes the exam. Since we do not use e.g. proctoring (video surveillance) it is almost impossible to know about this. Instead, I have focused my efforts on making collaboration between students more difficult. This is often done by dividing the course into about 5–6 topics, and a selection (about 5) different tasks are made for each of these topics. The exam draws a random task from each topic for each student.

A major worry for this teacher is the possibility of a substitute student taking the exam, which represents the dominating issue of authenticating learning (Butler-Henderson & Crawford, Citation2020). This is the worst type of misconduct, as the student has not done any part of the exam themselves. Teacher T6 has capitulated on this point. He pursues an assumption that the students do the exams themselves and tries to meet the challenge of possible collaboration by giving each student a unique set of tasks on the exam.
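As an illustration only, the kind of arrangement T6 describes can be sketched in a few lines of code. The topic names and task labels below are hypothetical and not taken from the study; the sketch merely shows how one random task per topic can be drawn for each student in a reproducible way.

```python
import random

# Hypothetical task bank: 5 topics with 5 alternative tasks each,
# mirroring the arrangement T6 describes (5-6 topics, about 5 tasks per topic).
TASK_BANK = {
    "limits": ["L1", "L2", "L3", "L4", "L5"],
    "derivatives": ["D1", "D2", "D3", "D4", "D5"],
    "integrals": ["I1", "I2", "I3", "I4", "I5"],
    "series": ["S1", "S2", "S3", "S4", "S5"],
    "differential equations": ["E1", "E2", "E3", "E4", "E5"],
}

def generate_exam(student_id: str) -> dict[str, str]:
    """Draw one random task from each topic for the given student.

    Seeding the generator with the student id makes the draw
    reproducible, so the same exam set can be regenerated later,
    for instance when grading or handling appeals.
    """
    rng = random.Random(student_id)
    return {topic: rng.choice(tasks) for topic, tasks in TASK_BANK.items()}

if __name__ == "__main__":
    for student in ["student_001", "student_002"]:
        print(student, generate_exam(student))
```

With five alternatives in each of five topics there are 5^5 = 3125 possible exam sets, so two students rarely receive identical sets; the arrangement does not prevent collaboration, but it makes direct copying less useful.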

SAI: Symbolic assessment issues

Symbolic assessment issues are about types of tasks, as illuminated by T6 above, but also about types of exams, types of marks and the task setting, like time and workload. Preparing students for the type of tasks is important, especially if the format is new. The following teacher comments on going from earlier years’ written examinations to online oral examinations when society was closed down:

T9: We arranged for oral exams in spring 2021. This had to be prepared and we did it by arranging oral trial exams. Before the real oral exam, we asked each student which grade she or he aimed at getting. Then we asked questions on the exam in line with this level of difficulty to see if they met the requirement.

Earlier years’ exam tasks are often interpreted by students as guidelines to the assessment requirements (Entwistle & Entwistle, Citation1991). If the assessment format is new, as described by T9, the guidelines change. Iannone et al. (Citation2020) explain how preparing for an oral examination by solving previous years’ closed-book exams does not help. Thus, students would need to prepare for the new format in other ways. In the above case, trial exams were run. As part of this, the students got an indication of which grade level in mathematics they were at. Thus, it was possible to ask them to suggest their grade and let the level of difficulty of the examination questions correspond to this.

IAC: Interactional assessment considerations

In an assessment situation, the codes within interactional considerations are dominated by statements about collaboration between students. There are assessment modes where such collaboration is permitted, even presumed, as in project work and some portfolios, but mostly collaboration is not permitted. Other codes are about Covid infection, as the lockdown situation made collaboration take new forms. Also, interaction in terms of a common agreement between teachers and students on what is permitted is part of the interaction theme. This last code is about trust, as teachers prohibit collaboration and rely on students to act accordingly. Teachers may ask students to sign a vote of confidence on this, as exemplified by Richardson (Citation2021, Figure 1). One of the teachers illuminates another way of doing this: assessment as a type of joint work between students and teachers with a common goal of achieving the best assessment situation:

T3: For written exams, in spring 20 we turned them into 1-week home exams with workload corresponding to 1 day and with a pass/fail grade to reduce stress for students and teachers. It worked very well and turned the exam into a big learning activity (collaboration was allowed).

The quote illuminates how students may be trusted, implying that the assessment situation is one that students and teachers can jointly take responsibility for. Collaboration challenges have been removed by allowing students to work together. This way of dealing with students is the opposite of asking them to sign a vote of confidence, since students here are given the trust expressed by the ministry as “responsibility for own learning” (KUF, Citation2001). In such situations, some students may grow with the confidence and act accordingly. Others, however, may take advantage of it, seeking to reduce their own effort when allowed to collaborate.

PAV: Personal assessment values

Subjectivity, as part of the personal assessment values, deals with “what students are” and includes negative feelings towards the examination like being nervous, stressed, frustrated and so on. Similarly, positive feelings like being satisfied and conscientious are also included in the subjectivity perspective on assessment. Identity, as the second part of the personal assessment values, may be interpreted as how much work each student has chosen to put into the mathematics course or exam. Chosen level of work before the exam includes preparedness and to what extent students rely on supporting material and collaboration. Chosen level of work on the exam has the same components but with the addition of drawing on misconduct. The following excerpt illuminates how personal behavior influences the situation, in a case where the group of students is of an authoritative type:

T4: The experience is that these students work very independently, almost too much. If I say that a project work can take place in groups they create groups, otherwise they automatically work individually. Sometimes I miss the informal discussions where we can find answers together. But the positive thing about this is that they work 100% individually, also during tests.

Nearly all the teachers in the present study brought up students’ personal values when discussing assessment settings, most often associated with stress or cheating. However, the quote by T4 emphasizes a vital argument about students in an assessment situation: their work habits are linked to their personal values.

Discussion

The answer to the research question about considerations that influence the assessment modes is given by the components of the adjusted framework described in the previous section. The present section discusses misconduct by utilizing the framework.

SIAI: Structurally influenced assessment issues

SIAI is about the contexts of written home exams. With respect to misconduct, a major issue is proctoring, which in such exams usually means online proctoring. Here, law regulations about online proctoring are fundamental. A number of the teachers voiced complaints on this issue, as exemplified by the following quote:

T1: It is a general problem that no matter how you set up an individual home exam, you cannot be sure that you are testing the individual student’s skills when there are no possibilities for surveillance.

In a proctored school examination, students cannot use illegal tools, discuss solutions with others or get someone else to do the exam (Hylton et al., Citation2016). In an un-proctored home exam, such misconduct may take place (Trenholm, Citation2007). Still, not all the teachers were in favor of this type of surveillance:

T3: We have done nothing but make the students responsible and emphasize that we have a trust-based system. We have little faith in video surveillance.

This statement is in line with what the laws in Norway state (Lovdata, Citation2018). The quotes given by T1 and T3 illustrate the variety of opinions on the proctoring issue when exams are taken at home. Discussions about this have been raised frequently among university teachers in Norway during the pandemic. Some teachers advocate the use of online proctoring, while university managers explain why this is not possible (Mikkelsen, Citation2022). Nevertheless, enquiries have shown that a majority of Norwegian universities have looked into possibilities for monitoring students on home exams (Bye, Citation2022).

The context perspective includes many components of importance when discussing online proctoring. One is the technical arrangements, raising issues about what equipment one can ask students to buy to set the scene for an online proctored home exam. Another is to know what is actually proctored, since students may arrange a camera to avoid visibility of non-permitted tools. In Norway, there are laws that regulate some of this, like what it is reasonable to ask students to buy (Lovdata, Citation2006, §3.1) and what individuals can be asked to monitor in a private atmosphere (Datatilsynet, Citation2022). However, there are also individual aspects when it comes to online proctoring issues. The setting may create stressful situations and anxiety among students (Butler-Henderson & Crawford, Citation2020), feelings that may prevent them from performing optimally on the exam. Online proctoring intrudes on privacy, since home is a personal arena. Such interventions are strictly regulated in Norway, based on Article 6 of the EU’s General Data Protection Regulation (GDPR) (Lovdata, Citation2018). Students who are not comfortable with online proctoring must be offered an alternative assessment mode similar to being proctored online at home. This is difficult when society is locked down.

The above arguments show that regulations about online proctoring in Norway to a very large extent defend the privacy of the students. This comes at the cost of preventing misconduct, since online proctoring is intended to ensure a fairer examination. To disregard the possibility of cheating by stating that students are responsible for their own learning may be too idealistic. It is based on the presumption that students study subjects to learn and to be able to use their knowledge, not to achieve the best grade possible. In practice, un-proctored exams may tempt some students to cheat to achieve better grades than they deserve. Thus, it is reasonable to conclude that law regulations in Norway hinder fair assessment results when students have exams at home. Arguments about plagiarism control are less relevant in mathematics, at least for traditional calculation tasks. To make plagiarism visible, tasks must be formulated differently, for instance by asking students to explain a concept rather than do a calculation. This raises another perspective on home examinations: the established practices of task formulation.

SAI: Symbolic assessment issues

SAI is about the established practices that set the educational scene for students’ assessment situation. Some of the codes in SAI are of a practical nature, like types of exams and types of marks; others deal with specifications like which aids are permitted and what adaptations may be made when there are few or many students in a course. With reference to misconduct, tasks may be designed with the aim of minimizing the possibilities of cheating (Hearn Moore et al., Citation2017). The teachers in the present investigation had various suggestions on how the tasks could be formulated to reduce the possibilities of cheating. The following excerpt is an example:

T5: What we did to some extent was to have more reasoning types of tasks rather than pure arithmetic tasks, since copying reasoning is more visible than copying correct answers. Many of those I reported for cheating I reported because they had identical WRONG answers. Identical correct answers are more difficult to get people for.

Giving students reasoning types of tasks sounds wise when students are doing exams at home. When tasks are mainly calculative, it is difficult to reveal collaboration. Besides, there are many available online tools that can do calculations for you and also show the arithmetic steps, e.g. Wolfram Alpha and PhotoMath (Richardson, Citation2021). Calculatory tasks may tempt students to use such tools. One of the teachers demonstrated his awareness of such tools with an opening calculatory task on a home exam that asked students to use a tool of their choice to find the answer. He called this a warm-up task. When both students and teachers know about the available tools, there is no point in pretending they do not exist. A warm-up task substantiates why the rest of the tasks cannot ask similar types of questions. Yet another reason why procedural tasks should be avoided is that they may be solved without understanding the mathematics. Students can simply memorize solutions to tasks that can be solved in a step-by-step manner and solutions that, with little effort, can be translated to other situations (Lithner, Citation2008).

The above arguments are in favor of not asking calculatory questions in home exams, but arguments for giving other types of tasks are not new. The aim is not primarily to prevent misconduct, but rather to enhance students’ mathematical knowledge. One example is essays as an examination form (Iannone & Simpson, Citation2011, Citation2021). Essays were also emphasized as valuable by one of the teachers in the present study (see the quote in the methodology section), preferably in more advanced mathematics courses. This sounds reasonable, since freshman students may lack skills in written communication about mathematics (Rensaa, Citation2014). Another example is tasks where students are asked to find errors in a given argument (Richardson, Citation2021). Such tasks were also stated by one of the teachers to be useful on home exams, since they make the use of computational tools more difficult. When doing such tasks, students can of course work together to find the wrong parts. But if they are asked to argue why there is an error, the description can reveal collaboration through plagiarism control.

The variety of types of tasks that can be given in home exams to limit the possibilities of cheating is illuminated by the codes in the analysis section. Common to all is that before such new types of exam tasks are given, students should be prepared for the change. This is because previously given examination tasks serve as guidelines to what students interpret as relevant to learn (Entwistle & Entwistle, Citation1991), i.e. the real curriculum (Ramsden, Citation2003). Tasks represent a type of “deal” between the students and the teacher, as students often assume this year’s exam to be in line with the previous year’s exams. Breaking this deal by giving new types of tasks should therefore be prepared for. Lithner (Citation2008) has asserted the value of giving students tasks that ask for more than performing a set of familiar algorithms, instead requiring more creative solutions involving sequences of actions that are new to the student. Changing the types of tasks is valuable, but still problematic if students are not ready for them.

IAC: Interactional assessment considerations

If a teacher makes a set of tasks for a home examination and states the permitted aids on it, it is a matter of trust between the teacher and the students that these requirements are met. The philosophical nature of trust is not discussed here. A vital component, though, is collaboration between fellow students during exams. In some settings, such collaboration is wanted and perhaps presumed, like in portfolios where students work with a variety of projects that go into their file. Such collaboration prepares students for their working life, since team-based problem solving is regarded as essential for society to function (Griffin et al., Citation2012). It presumes that the students contribute some acquired knowledge, and this is not necessarily realized by all students:

T9: The good and the average students seem to benefit from working together in a home exam. The good ones learn a lot from explaining solutions to the average ones, and the average ones are able to understand the explanations. The losers are the low achieving students. They believe that studying is not necessary since they can ask for help from the good students on the exam, but then their limited knowledge prevent them from understanding the explanations during the exam. Thus, they solve the tasks in such a way that it is revealed that they have not understood what they are doing.

By low achieving students, the teacher means those performing poorly (in Norwegian: “svake studenter”). The teacher’s reflection stands in contrast to T3’s statement in the analysis section, which said that collaboration on home exams was appreciated as joint learning work. T9 has experienced some clear disadvantages of collaboration. He or she reflects on low achieving students who may take advantage of the situation in a non-constructive way. These students interpret the situation as an opportunity to draw heavily on high achieving students through collaboration on un-proctored home exams. This indicates that students may interpret the value of collaboration in different ways. Students learn when observing and interacting with others, but new knowledge must complement their existing understanding (Barkley et al., Citation2014). If low achieving students regard it as an opportunity to exploit high achieving students, rather than utilizing the possibilities of collaboration to increase their knowledge in mathematics, it becomes a type of false security. It is based on a belief that it is possible to pass the exam without much work. Collaboration represents a joint effort among students to solve problems collectively (Hadjerrouit, Citation2012), and each individual needs to bring knowledge into the work to increase their own learning (Barkley et al., Citation2014). If they do so, collaboration can be valuable and rewarding for learning in a home exam situation. However, this depends heavily on the individual student’s attitude towards the purpose of the exam. If the opinion is that an exam acts as a motivator for learning, collaboration can be beneficial. As highlighted by T9‘s explanation, there are still some students who have the wrong approach to such collaboration. Therefore, each student’s personal attitude is a highly relevant issue.

PAV: Personal assessment values

The previous T9 statement about low achieving students not necessarily profiting from collaboration on home exams may be interpreted through PAV lenses in terms of identity and subjectivity. Low achieving students may not realize what their lack of knowledge entails. If so, they believe that help from fellow students or tools will be sufficient to pass the course without much work. This reveals an identity as a student in a community of students who, merely by having this role, feels entitled to reap solutions during a home examination. According to Gee (Citation2000), identity is “being recognized as a certain ‘kind of person,’ in a given context” (Gee, Citation2000, p. 99). The subjectivity aspect concerns students with high self-efficacy, since self-efficacy is defined by Bandura as “an individual’s own judgement of how well one can execute courses of action required to deal with prospective situations” (Bandura, Citation1982, p. 122). Self-efficacy has been shown to be closely related to task performance in mathematics among engineering students in Norway (Rensaa & Tossavainen, Citation2022), but this connection may be disturbed if students think that help from others is enough to perform well on home exams. Low achieving students who reduce their own effort on these grounds have a mismatch between personal perceptions and reality. Such mismatches may develop when one is part of a collaboration community that provides solutions effortlessly. In a home exam, there is limited time to exchange full explanations. Furthermore, similar solutions may be regarded as plagiarism, and anxiety and stress may affect the teamwork (Butler-Henderson & Crawford, Citation2020). High-performing students may not have time to discuss solutions in detail, nor do they want to run the risk of being caught for plagiarism. Similarly, relying on tools during an exam may turn out to be time-consuming and may require adjustments by the student using them, all of which can reveal a lack of understanding, as described by teacher T9 earlier.

Collaboration is valuable in a learning process, but not all students realize this. One of the teachers concludes that individual examinations are a necessary addition:

T7: Mandatory assignments/portfolios are of great value, but we see that the correlation between results here and in the proctored exams (like school exams) is not satisfactory: collaboration is good and valuable, but many do not take it seriously and do not learn the content well enough unless knowing that they are ‘seen in the cards’ individually.

Conclusion

Assessment in mathematics is often seen to be summative proctored exams (Iannone & Simpson, Citation2011, Citation2021), and a sudden move to un-proctored home examinations changes the setting significantly. The present paper has investigated what considerations university teachers in Norway had to take into account in order to meet the challenges arising in home examinations in mathematics. When the teachers were asked about this, the issue of cheating was, not surprisingly, a major concern. Analyses of their feedback show that these worries had different perspectives, relating both to physical settings about the context (SIAI) and how to arrange the exam (SAI), and to individual settings about interactions between students (IAC) and personal dealings (PAV). At the context level, the absence of possibilities to proctor students at home set the scene for decisions on how to arrange the exams: type of exam, type of content and permitted aids. The teachers’ arguments both supported and problematized online proctoring, with ethical issues raised. These issues are framed by the Norwegian society’s interpretation of individual rights. It shows that online proctoring cannot be utilized in an assessment situation unless the interpretation at the societal level changes. At the individual level, both interactions between students and personal dealings depend upon each student’s decisions. These are personal decisions determined by every single student’s conscience. However, as brought up in the discussion, students may not realize their own limitations and therefore may not make wise decisions. The conclusion from this is that the answer to the research question about considerations around home exams is indefinite, since it depends on the society’s interpretation of individuality and the individual students’ conscience. Still, problematizing the issue is important. The arguments brought forward by the teachers in the present study and discussed accordingly contribute to the discussion on ethical issues that probably takes place in most societies around the world.

The educational setting of the present investigation was highly unusual, as the Norwegian community in March 2020 was locked down overnight due to Covid. Nearly no preparations had been made, since no one had experienced such consequences of a pandemic before. Accordingly, assessment practices had to be changed in the middle of the semester. Degrees of lockdown varied with infection rates, but the lockdown lasted in Norway for nearly two years. Periodically, the universities opened for students to have more traditional examinations, like proctored school examinations, but this brought about other types of worries, as the following quote shows:

T5: This worked well, but the students experienced it as unfair to (potentially) be deprived of the opportunity to take exams because they had a cold, so many showed up with a cold without reporting it. From an infection prevention point of view, this was not wise.

Butler-Henderson and Crawford (Citation2020) emphasize that research on online examinations has been more engaged with technical challenges related to cheating than with social and ethical aspects, and that the latter need research attention in the future. The current framework encourages a broader view, considering issues from four perspectives that are relevant lenses for interpreting home examinations. However, a conclusion based on the discussion in this article is that home exams in our country should be avoided, unless the students are particularly obedient or honest. Within the permitted settings of such exams, one simply cannot assess the students fairly, as there are so many temptations to seek help. If help is obtained, the home exam will not assess all students’ knowledge, only some students’ knowledge and other students’ cleverness in using available aids. Tools designed specifically for mathematics support have been available, and therefore a problem, for a long time. But the issue has gained new relevance for a wider range of subjects in the wake of the introduction of ChatGPT (https://openai.com/blog/chatgpt). Since students have such aids at hand, home exams should be avoided. As long as online proctoring is not allowed, as in Norway, proctored school exams should be the preferred option. The results are relevant for future educational settings post-Covid, since changes in the demographic profile of students increase the topicality of online assessment.

Acknowledgments

Sincere thanks to the nine experts who gave their time freely to answer the questionnaire. Also, thanks to Professor Raymond Kristiansen, UiT, for proofreading the text.

Disclosure statement

No potential conflict of interest was reported by the author.

Additional information

Notes on contributors

Ragnhild Johanne Rensaa

Ragnhild Johanne Rensaa received her PhD in mathematics from the Norwegian University of Science and Technology in Trondheim, Norway. She is professor of mathematics at the Arctic University of Norway and holds a particular interest in mathematics education. Her research areas are mathematics teaching and learning of engineering students but also mathematical content and gender perspectives in mathematics.

References

  • Bandura, A. (1982). Self-efficacy mechanism in human agency. The American Psychologist, 37(2), 122–147. https://doi.org/10.1037/0003-066X.37.2.122
  • Barkley, E. F., Major, C. H., & Cross, K. P. (2014). Collaborative learning techniques: A handbook for college faculty (2nd ed.). Wiley & Sons, Inc.
  • Biggs, J. B., & Tang, C. (2011). Teaching for quality learning at university: What the student does (4th ed.). McGraw-Hill: Open University Press.
  • Bjerrum Nielsen, H. (2003). One of the boys? Doing gender in scouting. World Organization of the Scout Movement.
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
  • Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers and Education, 159, 104024. https://doi.org/10.1016/j.compedu.2020.104024
  • Bye, K. (2022). Ti av elleve læresteder har vurdert å overvåke studentene [Ten out of eleven educational institutions have considered monitoring the students]. https://khrono.no/ti-av-elleve-laeresteder-har-vurdert-a-overvake-studentene/667021
  • Crawford, J., Butler-Henderson, K., Rudolph, J., Malkawi, B., Glowatz, M., Burton, R., Magni, P. & Lam, S. (2020). COVID-19: 20 countries’ higher education intra-period digital pedagogy responses. Journal of Applied Learning & Teaching, 3(1), 9–28. https://doi.org/10.37074/jalt.2020.3.17
  • Datatilsynet. (2022). Kameraovervåking - hva er lov? [Camera surveillance - what is allowed?]. https://www.datatilsynet.no/personvern-pa-ulike-omrader/overvaking-og-sporing/kameraovervaking/
  • Dineva, S., Nedeva, V., & Ducheva, Z. (2019). Digital generation and visualization in E-learning. Proceedings of the 14th international conference on virtual learning ICVL 2019. University of Bucharest.
  • Eaton, S. E., & Turner, K. L. (2020). Exploring academic integrity and mental health during COVID-19: Rapid review. https://doi.org/10.5281/zenodo.4256816
  • Engelbrecht, J., Llinares, S., & Borba, M. C. (2020). Transformation of the mathematics classroom with the internet. ZDM – Mathematics Education, 52(5), 825–841. https://doi.org/10.1007/s11858-020-01176-4
  • Entwistle, N. J., & Entwistle, A. (1991). Contrasting forms of understanding for degree examinations: The student experience and its implications. Higher Education, 22(3), 205–227. https://doi.org/10.1007/BF00132288
  • Gee, J. P. (2000). Identity as an analytic lens for research in education. Review of Research in Education, 25, 99–125. https://doi.org/10.2307/1167322
  • Greenhow, M. (2015). Effective computer-aided assessment of mathematics; principles, practice and results. Teaching Mathematics and Its Applications, 34(3), 117–137. https://doi.org/10.1093/teamat/hrv012
  • Griffin, P., McGaw, B., & Care, E. (2012). Assessment and teaching of 21st century skills. Springer.
  • Hadjerrouit, S. (2012). Investigating technical and pedagogical usability issues of collaborative learning with Wikis. Informatics in Education an International Journal, 11(1), 45–64. https://doi.org/10.15388/infedu.2012.03
  • Hearn Moore, P., Head, J. D., & Griffin, R. B. (2017). Impeding students’ efforts to cheat in online classes. Journal of Learning in Higher Education, 13(1), 9–23.
  • Heuvel-Panhuizen, M. V. D., & Becker, J. (2003). Towards a didactic model for assessment design in mathematics education. In A. Bishop, C. Clements, J. Keitel, J. Kilpatrick, & F. K. S. Leung (Eds.), Second International Handbook of Mathematics Education (pp. 689–716). Kluwer Academic Publisher.
  • Hylton, K., Levy, Y., & Dringus, L. P. (2016). Utilizing webcam-based proctoring to deter misconduct in online exams. Computers and Education, 92-93, 53–63. https://doi.org/10.1016/j.compedu.2015.10.002
  • Iannone, P., Czichowsky, C., & Ruf, J. (2020). The impact of high stakes oral performance assessment on students’ approaches to learning: A case study. Educational Studies in Mathematics, 103(3), 313–337. https://doi.org/10.1007/s10649-020-09937-4
  • Iannone, P., & Simpson, A. (2011). The summative assessment diet: How we assess in mathematics degrees. Teaching Mathematics and Its Applications, 30(4), 186–196. https://doi.org/10.1093/teamat/hrr017
  • Iannone, P., & Simpson, A. (2015). Students’ preferences in undergraduate mathematics assessment. Studies in Higher Education, 40(6), 1046–1067. https://doi.org/10.1080/03075079.2013.858683
  • Iannone, P., & Simpson, A. (2021). How we assess mathematics degrees: The summative assessment diet a decade on. Teaching Mathematics and Its Applications, 41(1), 22–31. https://doi.org/10.1093/teamat/hrab007
  • Jukes, I., McCain, T. D., & Crockett, L. (2010). Understanding the digital generation: Teaching and learning in the new digital landscape. 21st Century Fluency Project.
  • KUF. (2001). Stortingsmelding nr. 27: Gjør din plikt - krev din rett [Do your duty - claim your rights]. https://www.regjeringen.no/contentassets/eebf61fb4a204feb84e33355f30ad1a1/no/pdfa/stm200020010027000dddpdfa.pdf
  • Lithner, J. (2008). A research framework for creative and imitative reasoning. Educational Studies in Mathematics, 67(3), 255–276. https://doi.org/10.1007/s10649-007-9104-2
  • Lovdata. (2005). Lov om universiteter og høyskoler (universitets- og høyskoleloven) [Act relating to universities and university colleges]. https://lovdata.no/dokument/NL/lov/2005-04-01-15
  • Lovdata. (2006). Forskrift om egenbetaling ved universiteter og høyskoler [Regulation on self-payment at universities and colleges]. https://lovdata.no/dokument/SF/forskrift/2005-12-15-1506
  • Lovdata. (2018). Lov om behandling av personopplysninger (personopplysningsloven) [Act on the processing of personal data (Personal Data Act)]. https://lovdata.no/dokument/NL/lov/2018-06-15-38/gdpr%2FARTIKKEL_6#gdpr/ARTIKKEL_6
  • Marriott, P., & Lau, A. (2008). The use of on-line summative assessment in an undergraduate financial accounting course. Journal of Accounting Education, 26(2), 73–90. https://doi.org/10.1016/j.jaccedu.2008.02.001
  • Mikkelsen, S. (2022). Faglærer sjekket markedet for juks på nett. Ble tilbudt en god besvarelse for 500 kroner [Teacher checked the market for online cheating. Was offered a good solution for 500 NOK]. https://www.universitetsavisa.no/andrey-chesnokov-eksamen-eksamensjuks/faglaerer-sjekket-markedet-for-juks-pa-nett-ble-tilbudt-en-god-besvarelse-for-500-kroner/206094
  • Miles, M. B., Huberman, A. M., & Saldaña, J. (2019). Qualitative data analysis: A methods sourcebook (4th ed.). SAGE.
  • Ramsden, P. (2003). Learning to teach in higher education (2nd ed.). Routledge.
  • Rensaa, R. J. (2014). The impact of lecture notes on an engineering student’s understanding of mathematical concepts. The Journal of Mathematical Behavior, 34, 33–57. https://doi.org/10.1016/j.jmathb.2014.01.001
  • Rensaa, R. J., & Fredriksen, H. (2022). Gender perspectives on a flipped classroom environment. Cogent Education, 9(1). https://doi.org/10.1080/2331186X.2022.2115832
  • Rensaa, R. J., & Tossavainen, T. (2022). Norwegian freshmen engineering students’ self-efficacy, motivation, and view of mathematics in light of task performance. Nordic Journal of STEM Education, 6(1), 15. https://doi.org/10.5324/njsteme.v4i2.3927
  • Richardson, S. (2021). Mathematics assessment integrity during lockdown: Experiences in running online un-invigilated exams. International Journal of Mathematical Education in Science and Technology, 53(3), 1–11. https://doi.org/10.1080/0020739X.2021.1986161
  • Struyven, K., Dochy, F., & Janssens, S. (2005). Students’ perceptions about evaluation and assessment in higher education: A review. Assessment and Evaluation in Higher Education, 30(4), 325–341. https://doi.org/10.1080/02602930500099102
  • Suurtamm, C., Thompson, D. R., Kim, R. Y., Moreno, L. D., Sayac, N., Schukajlow, S., Silver, E., Ufer, S., & Vos, P. (2016). Assessment in mathematics education: Large-scale assessment and classroom assessment (1st ed.). https://doi.org/10.1007/978-3-319-32394-7
  • Trenholm, S. (2007). A review of cheating in fully asynchronous online courses: a math or fact-based course perspective. Journal of Educational Technology Systems, 35(3), 281–300. https://doi.org/10.2190/Y78L-H21X-241N-7Q02
  • Trenholm, S., & Peschke, J. (2020). Teaching undergraduate mathematics fully online: A review from the perspective of communities of practice. International Journal of Educational Technology in Higher Education, 17(1), 1–18. https://doi.org/10.1186/s41239-020-00215-0
  • Wanner, T., & Palmer, E. (2018). Formative self-and peer assessment for improved student learning: The crucial factors of design, teacher participation and feedback. Assessment and Evaluation in Higher Education, 43(7), 1032–1047. https://doi.org/10.1080/02602938.2018.1427698
  • Wedege, T. (2007). Gender perspectives in mathematics education: Intentions of research in Denmark and Norway. ZDM – Mathematics Education, 39, 251–260. https://doi.org/10.1007/s11858-007-0026-3
  • Yin, R. K. (1984). Case study research: Design and methods (1st ed.). Sage Publications.