Research Article

Directing students to profound open-book test preparation: The relationship between deep learning and open-book test time

Pages e16-e21 | Published online: 23 Dec 2010

Abstract

Background: Considering the growing amount of medical knowledge and the focus of medical education on acquiring competences, using open-book tests seems inevitable. A possible disadvantage of these tests is that students underestimate test preparation.

Aims: We examined whether students who used a deep learning approach needed less open-book test time, and how students performed on open-book questions asked in a closed-book setting.

Method: Second- (N = 491) and third-year (N = 325) students prepared half of the subject matter to be tested closed-book and half to be tested open-book. In agreement with the Board of Examiners, some questions in the closed-book test concerned open-book subject matter, and vice versa. Data were gathered on test time, deep learning and preparation time. Repeated measures analysis, t-tests and partial correlations were used to analyse the data.

Results: We found a negative relationship between deep learning and open-book test time for second-year students. Students scored the lowest on closed-book questions about open-book subject matter.

Conclusions: Reduction of the available test time might force students to prepare longer and deeper for open-book tests. Further research is needed to identify variables that influence open-book test time and to determine how restrictive this time should be.

Introduction

Medical knowledge is growing and changing fast (Glasziou 2007). It has therefore become important that students develop the competence to handle large amounts of information (Cain 1979; Smidt 1999; Simons et al. 2000). Using open-book tests is one way to deal with the growing amount of knowledge and to assess this competence (Glasziou 2007; Heijne-Penninga et al. 2008a). A possible disadvantage of open-book tests is that students underestimate test preparation. The general research question analysed in this study is how test preparation influences open-book test performance.

Students perceive open-book tests as easy and unthreatening (Zeidner 1994; Ben-Chaim & Zoller 1997; Koutselini-Ioannidou 1997), which could be why they experience less examination stress and underestimate the need to prepare for open-book tests (Feller 1994; Loulou 1995; Hay 1996; Study Guide Zone 2009). Broyles et al. (2005) found that, after the implementation of open-book tests, 40% of the students changed their study behaviour in such a way that they spent less time studying and read the subject matter less deeply. Other studies indicated that open-book tests were preceded by less subject matter revision (Boniface 1985; Koutselini-Ioannidou 1997). Students also spent less time on open-book than on closed-book test preparation (Boniface 1985; Eilertsen & Valdermo 2000; Broyles et al. 2005; Heijne-Penninga et al. 2008b).

In an academic setting, the preferred way of test preparation is the deep learning approach (Bruinsma 2003; Abraham et al. 2006). Students are advised to prepare in a deep way for open-book tests: ‘prepare a summary’, ‘formulate your own commentary and arguments’ (Study Guide Zone 2009) and ‘conduct open-ended discussions on specific topics with peers’ (Loulou 1995). They are also advised to spend as much time on open-book test preparation as they would on closed-book test preparation.

Students who used a deep learning strategy or spent considerable time on open-book test preparation did not achieve higher test scores (Heijne-Penninga et al. 2010). A possible explanation for this finding is that the amount of time available to complete the test is too large. If students get ample time to complete an open-book test, students with less effective learning strategies can compensate by consulting their references more often. However, it is likely that students who do not use a deep learning approach and/or spend less time on open-book test preparation need more time to answer all open-book questions. The available test time could influence open-book test preparation more than closed-book test preparation, because open-book test time is used not only to fill in the right answers but also to consult reference sources. To our knowledge, no studies have examined the relationship between time spent on test preparation, deep learning, time needed to answer open-book questions and performance. Therefore, we examined how open-book test preparation (deep learning and individual time spent) is related to the test time needed to answer open-book questions. Furthermore, we investigated, in a quasi-experimental setting, how students performed when they had prepared for an open-book test but had to answer the questions without consulting their references, and vice versa.

Method

Context

The study was performed at the University Medical Center Groningen. Every written knowledge examination in the Bachelor's programme of the medical curriculum covers the theory of a 10-week module. Each module is examined in three sessions. Until 2003, it was common to assess students by closed-book tests. Since 2003, tests have comprised a closed-book and an open-book part. The closed-book part concentrates on core knowledge: the knowledge that every medical professional should know immediately, without needing to consult reference sources. The open-book part covers backup knowledge: the knowledge students need to understand and apply properly, with the use of reference sources if so desired. The curriculum coordinator and the block coordinators jointly make the distinction between core and backup knowledge. The curriculum coordinator is responsible for the content and design of the entire curriculum, and the block coordinators are responsible for their respective blocks. The core knowledge is explicitly communicated to students.

All tests are in multiple-choice format and start with the closed-book part. After answering the closed-book questions, students hand in their answer forms and may then open their references to complete the second part of the exam, which contains the open-book questions. A complete test consists, on average, of 48 open-book and 80 closed-book questions. In a previous study, open- and closed-book tests were shown to have equal levels of difficulty and good reliabilities (values ranging from 0.71 to 0.85), with the reliabilities of open-book tests being slightly lower than those of closed-book tests (Heijne-Penninga et al. 2008a).

Respondents and procedure

Second- (N = 491) and third-year (N = 325) medical students participated in this study. They had been exposed to open- and closed-book questions from the first year of their medical training. To test our expectations in a real assessment setting without breaching the current examination rules, the following study design was applied in two modules.

At the start of the module the students were informed that the first of their regular three assessment sessions was included in an experimental context. They were informed that, for the benefit of this study, some questions in the closed-book part would concern open-book subject matter and vice versa, but that these questions would be merely for research purposes and would not be part of their final test scores. The students were instructed to prepare the subject matter from the second and fourth weeks for a closed-book test and the subject matter from the first and third weeks as usual – the core knowledge for the closed-book part and the backup knowledge for the open-book part of the test.

To avoid any disadvantage from answering the experimental questions, the students received extra time to answer them. They had to answer 70 closed-book questions in 2.25 h and 30 open-book questions in 3 h.

Overall, the tests contained four types of questions, as given in Table 1.

Table 1.  Different types of questions

As given in Table 1, type 1 questions assessed core knowledge from weeks 1 and 3 and all knowledge from weeks 2 and 4. The students were instructed to prepare this knowledge for a closed-book test, and type 1 questions were indeed part of the closed-book test. Type 2 questions assessed the backup knowledge from weeks 1 and 3. The students were instructed to prepare this knowledge for the open-book test, but the type 2 questions turned out to be part of the closed-book test.

Care was taken to ensure that the amount of subject matter and the number of questions did not overload the students. The faculty Board of Examiners approved this project.

Instruments

Deep learning and preparation time

The Test for Deep Information Processing (DIP) was used to measure the level of deep learning (Schouwenburg & Schilder 1996). The DIP has been validated in the Netherlands and consists of 24 items divided into three scales: ‘Critical reading’, ‘Broaden one's context’ and ‘Structuring’. Students completed this questionnaire twice, once with respect to their open-book test preparation and once with respect to their closed-book test preparation. The formulation of the items was tailored to the assessment format.

Examples of items are ‘When I read a text while preparing for this open/closed-book test I quickly distinguish facts from side issues’ (critical reading), ‘When I read a text while preparing for this open/closed-book test I compare what I read with things I already know’ (broaden one's context) and ‘When I read a text while preparing for this open/closed-book test I make notes of the most important issues’ (structuring). All the items were scored on a 5-point Likert scale from 1 ‘never’ to 5 ‘always’. An extra question, ‘How many hours a week did you spend preparing for this open/closed-book test?’ was added to the questionnaire.

Test time

During the tests, students were asked to note the time they started and the time they finished the tests.

Open- and closed-book test performances

Our investigation was focused on the first two tests of two modules. We gathered test scores for all students and calculated a score for each type of question (type 1 – closed-book questions about closed-book subject matter; type 2 – closed-book questions about open-book subject matter; type 3 – open-book questions about open-book subject matter; and type 4 – open-book questions about closed-book subject matter).

Analyses

Differences in mean scores between second- and third-year students were calculated with independent t-tests. Partial correlations were calculated to examine the relationships between open-book test time, preparation time, deep learning and test score. A partial correlation is the correlation between two variables while controlling for one or more other variables. Repeated measures analysis was used to analyse the mean differences between the test scores.
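
As an illustration of the partial-correlation step, the sketch below computes a partial correlation as the Pearson correlation of the residuals left after regressing both variables on the control variables. This is a minimal sketch with invented data and variable names, not the analysis code actually used in the study.

```python
import numpy as np

def partial_corr(x, y, controls):
    """Partial correlation of x and y, controlling for the columns of
    `controls`: the Pearson correlation of the OLS residuals of x and y
    after regressing each on the control variables."""
    Z = np.column_stack([np.ones(len(x)), controls])   # design matrix with intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residuals of x
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residuals of y
    return np.corrcoef(rx, ry)[0, 1]

# Invented example: open-book test time vs. deep learning, controlling
# for closed-book preparation time. All numbers are made up.
rng = np.random.default_rng(0)
cb_prep = rng.normal(10.0, 2.0, 200)                   # hypothetical hours/week
deep = 0.4 * cb_prep + rng.normal(0.0, 1.0, 200)       # hypothetical deep learning score
test_time = -0.5 * deep + 0.3 * cb_prep + rng.normal(0.0, 1.0, 200)
r = partial_corr(test_time, deep, cb_prep.reshape(-1, 1))
print(round(r, 2))  # a negative value, mirroring the sign reported for year 2
```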

Results

One-third of the respondents were male and two-thirds were female, which is representative of our student population. The mean ages of the second- and third-year students were 20.5 and 21.3 years, respectively. Log transformations were used to normalize the non-normally distributed data.
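
The effect of such a log transformation can be sketched as follows: a right-skewed distribution (as time measurements often are) is pulled toward symmetry. The data and the skewness helper here are invented for illustration.

```python
import numpy as np

def skewness(x):
    """Sample skewness: the third standardized moment of the data."""
    x = np.asarray(x, dtype=float)
    return np.mean(((x - x.mean()) / x.std()) ** 3)

# Invented right-skewed preparation times (hours per week); a log
# transform pulls in the long right tail before parametric analysis.
rng = np.random.default_rng(2)
hours = rng.lognormal(mean=2.0, sigma=0.5, size=500)
before = skewness(hours)          # clearly positive (right-skewed)
after = skewness(np.log(hours))   # near zero after the transform
```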

Test preparation and test time

The DIP questionnaire was completed by 419 second-year (85%) and 244 third-year (75%) students. Table 2 gives the mean scores for open-book test time, preparation time and level of deep learning. The mean scores on the Test for DIP (level of deep learning) are comparable with the mean scores of other university students (Schouwenburg & Schilder 1996).

Table 2.  Mean scores and standard deviations (SDs) of test time, preparation time and level of deep learning

Second- and third-year students differed significantly in open-book test time (t(416) = 6.37, p < 0.01), open-book preparation time (t(662) = 2.25, p < 0.01) and deep learning (t(661) = −2.99, p < 0.01). Third-year students spent less test time and less preparation time, and prepared more deeply. On average, students spent 4.9 min answering an open-book question. The correlations between open-book test time, preparation time and level of deep learning are given in Table 3.

Table 3.  Correlations controlled for preparation time for the closed-book test (cb prep time) and level of deep learning for the closed-book test (cb dl)

Open-book test time was significantly related only to the deep learning of second-year students (r = −0.14); for third-year students this relationship was not found. Furthermore, a relationship was found between preparation time and level of deep learning in both study years (r = 0.28 and 0.41, respectively).
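
The cohort comparisons above use independent-samples t-tests; the pooled-variance form sketched below yields the kind of degrees of freedom reported in the text (e.g. t(661) for 419 + 244 respondents). The data are invented for illustration; this is not the authors' analysis code.

```python
import numpy as np

def independent_t(a, b):
    """Student's t statistic for two independent samples with pooled
    variance; returns (t, degrees of freedom)."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1)
                  + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    t = (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var * (1 / na + 1 / nb))
    return t, na + nb - 2

# Invented open-book test times in minutes for the two cohorts,
# sized like the questionnaire samples (419 and 244 respondents).
rng = np.random.default_rng(1)
year2 = rng.normal(150.0, 20.0, 419)   # hypothetical second-year times
year3 = rng.normal(140.0, 20.0, 244)   # hypothetical third-year times
t, df = independent_t(year2, year3)    # df = 419 + 244 - 2 = 661
```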

Test preparation and performance

Complete test scores were available for 365 second-year (74%) and 270 third-year (83%) students. Table 4 gives the mean proportional score on each type of question and shows that third-year students outperformed the second-year students.

Table 4.  Score in proportion for the different types of questions

Figures 1 and 2 show graphic representations of the scores by question type. In both study years, students scored lowest on the closed-book questions about open-book subject matter (ob in cb = type 2; p < 0.00).

Figure 1. Mean score against type of question year 2. • Differs significantly from all other scores (p < 0.00). 1, Closed-book question in closed-book test; 2, open-book question in closed-book test; 3, open-book question in open-book test; and 4, closed-book question in open-book test.

Figure 2. Mean score against type of question year 3. ▴ Differs significantly from 3 and 4 (p < 0.05); • Differs significantly from all other scores (p < 0.00). 1, Closed-book question in closed-book test; 2, open-book question in closed-book test; 3, open-book question in open-book test; and 4, closed-book question in open-book test.

Discussion/conclusion

In this study, we examined the relationships between test preparation (deep learning and individual time spent), the test time needed to answer open-book questions, and performance. The results showed that second-year students who used a deeper learning approach for an open-book test needed less time to complete all questions. In addition, students had the most difficulty answering questions about open-book subject matter that turned out to be tested under closed-book conditions.

The first part of our study concentrated on the possible relationship between open-book test preparation (deep learning and individual time spent) and the test time needed to answer open-book questions. We found a negative relationship between deep learning and open-book test time only for second-year students. This suggests that students who focus on understanding and interpreting the learning material are better able to locate information and can therefore answer open-book questions more quickly (Francis 1982).

We found that third-year students used a deeper learning approach and spent less open-book test time than second-year students. However, we did not find a relationship between deep learning and test time within the group of third-year students. One possible reason is that third-year students had gained more experience with open-book tests. Preparing for open-book tests and using references during the tests seem to be difficult skills that can be improved by experience (Broyles et al. 2005). Consequently, third-year students needed less time to complete the questions, regardless of their level of deep learning. Another possible reason is that, owing to the quasi-experimental setting, we had extended the available test time to ensure that students were not disadvantaged by participating in our study. This extra time could have given the students an opportunity to compensate for poor preparation by looking up more information (Weber et al. 1983). Moreover, Broyles et al. (2005) found that students did not adjust their study behaviour when preparing for an open-book test if it was a timed test. Restricting the open-book test time would therefore benefit students who had prepared in a deeper way. If students are aware of this advantage, they might prepare for open-book tests as thoroughly as for closed-book tests. However, further research is needed to identify the variables that influence open-book test time and to determine how restrictive the response time for open-book questions should be.

The results of our study did not reveal a relationship between individual time spent on test preparation and test time used. This counterintuitive outcome might be explained by the finding that, besides students who use a deep learning approach, students who prepare less efficiently also need more preparation time (Wilkinson et al. 2007). Inefficient learners probably also need more time to answer all questions. Therefore, the positive influence of the longer preparation time of deeper learners could have been neutralized by the negative influence of inefficient learners, so that no relationship was found between preparation time and test time spent.

The second part of our study concerned the influence of test preparation on performance. We found that students performed worst when they had prepared the subject matter under the assumption that it would be tested under open-book conditions and then had to answer the questions without consulting their references. They had less difficulty answering open-book questions about subject matter they had prepared for a closed-book test. A possible explanation for these outcomes is that students prepared less thoroughly for open-book tests (Boniface 1985; Koutselini-Ioannidou 1997; Heijne-Penninga et al. 2008b). Consequently, they were not able to answer the questions about open-book subject matter without consulting their references. Another explanation might be that students prepared very differently for open- and closed-book tests. When preparing for open-book tests, students did not focus on remembering knowledge at all, since there seemed to be no need to (Feldhusen 1961; Bouman & Riechelman 1995; Broyles et al. 2005). However, students' search strategies were more effective when they were able to recall knowledge, that is, knew it by heart (Francis 1982; Brye et al. 2005). Consequently, knowledge recall is also required when preparing for open-book tests. If the available test time were restricted, students would probably become aware of the need to remember knowledge, since they would not have time to refer to their sources for every detail.

A strong point of this study is that two large cohorts of students participated. All students had at least 1 year of experience with open-book tests, which is important since test formats influence student learning behaviour and motivation (Frederiksen 1984; Cohen-Schotanus 1999; Van der Vleuten & Schuwirth 2005). In previous studies on open-book tests, students took an open-book test only once, specifically for research purposes. Experience with open-book tests changes the way students prepare and their motivation; without such experience, students would not know what to expect (Francis 1982; Broyles et al. 2005).

A possible limitation of our study is that the numbers of open-book questions about closed-book subject matter and closed-book questions about open-book subject matter were small, and the available test time was generous. It was not possible to increase the number of questions and still carry out our study in a real assessment setting without breaching the current examination rules. We wanted to perform this study within an authentic context, because in a purely experimental setting students' motivation and drive to perform well would be lost, which in turn would have influenced their behaviour before and during the tests. In addition, students received extra test time to avoid any disadvantage from participating in our study. Even though the number of experimental questions was small and the available test time generous, the results were in line with our expectations and were similar in both cohorts.

In conclusion, second-year students who prepared for open-book tests in a deep way needed less time to answer all questions. Restricting the available test time is a possible way to direct students towards desirable test preparation, including a focus on knowledge recall. Further research into the relationship between test preparation and open-book assessment is required.

Acknowledgement

The authors thank Mrs J. Bouwkamp-Timmer for her critical and constructive comments on several drafts of the manuscript.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

References

  • Abraham RR, Kamath A, Upadhya S, Ramnarayan K. Learning approaches to physiology of undergraduates in an Indian medical school. Med Educ 2006; 40: 916–923
  • Ben-Chaim D, Zoller U. Examination-type preferences of secondary school students and their teachers in the science disciplines. Instr Sci 1997; 25: 347–367
  • Boniface D. Candidates’ use of notes and textbooks during an open-book test. Educ Res 1985; 27: 201–209
  • Bouman IN, Riechelman HW. Open book exams: Aims, facts and future. Med Teach 1995; 17: 240
  • Broyles IL, Cyr PR, Korsen N. Open book tests: Assessment of academic learning in clerkships. Med Teach 2005; 5: 456–462
  • Bruinsma M. Effectiveness of higher education: Factors that determine outcomes of university education [dissertation]. GION, University of Groningen, Groningen 2003
  • Brye KR, Savin MC, Gbur EE. Graduate-level science classes: How important are extra time, open notes and book when evaluating students with calculation-oriented exams? NACTA J 2005, June. Available from: http://www.findarticles.com/p/articles/mi_qa4062/is_200506/ai_n13643632
  • Cain JC. Continuing medical education. JAMA 1979; 242: 1145–1146
  • Cohen-Schotanus J. Student assessment and examination rules. Med Teach 1999; 21: 318–321
  • Eilertsen TV, Valdermo O. Open-book assessment: A contribution to improved learning? Stud Educ Eval 2000; 26: 91–103
  • Feldhusen JF. An evaluation of college students’ reactions to open-book tests. Educ Psychol Meas 1961; 21: 637–646
  • Feller M. Open-book testing and education for the future. Stud Educ Eval 1994; 20: 235–238
  • Francis J. A case for open-book tests. Educ Rev 1982; 24: 13–26
  • Frederiksen N. The real test bias: Influences of testing on teaching and learning. Am Psychol 1984; 39: 193–202
  • Glasziou P. How can we prepare students for the information flood? Presentation at AMEE 2007, August 25–29, Trondheim, Norway
  • Hay I. Examinations I: Preparing for an exam. J Geogr High Educ 1996; 20: 137–143
  • Heijne-Penninga M, Kuks JBM, Schönrock-Adema J, Snijders TAB, Cohen-Schotanus J. Open-book tests to complement assessment-programmes: Analysis of open and closed-book tests. Adv Health Sci Educ 2008a; 13: 263–273
  • Heijne-Penninga M, Kuks JBM, Hofman WHA, Cohen-Schotanus J. Influence of open and closed-book tests on medical students’ learning approaches. Med Educ 2008b; 42: 967–974
  • Heijne-Penninga M, Kuks JBM, Hofman WHA, Cohen-Schotanus J. The influence of need for cognition, deep learning and preparation time on open and closed-book test scores. Med Educ 2010; 44: 884–891
  • Koutselini-Ioannidou M. Testing and life-long learning: Open-book and closed-book examination in a university course. Stud Educ Eval 1997; 23: 131–139
  • Loulou D. Making the A: How to study for tests. Pract Assess Res Eval 1995; 4(12). Available from: http://PAREonline.net/getvn.asp?v=4&n=12
  • Schouwenburg HC, Schilder AJE. Handleiding bij de Test voor Diepgaande Leerstof Verwerking DLV’95 [Manual for the Test for Deep Information Processing]. Studie Ondersteuning, Groningen 1996
  • Simons PRJ, van der Linden J, Duffy T. New learning: Three ways to learn in a new balance. New learning, PRJ Simons, J van der Linden, T Duffy. Kluwer Academic Publishers, Dordrecht 2000; 1–20
  • Smidt SR. Is it time to close the book on closed-book examinations? Med Health R I 1999; 82: 285–288
  • Study Guide Zone. 2009. Open-book tests. Available from: http://www.studyguidezone.com/openbooktests.htm
  • Van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: From methods to programmes. Med Educ 2005; 39: 309–317
  • Weber LJ, McBee JK, Krebs JE. Take home tests: An experimental study. Res High Educ 1983; 18: 473–483
  • Wilkinson TJ, Wells JE, Bushnell JA. Medical student characteristics associated with time in study: Is spending more time always a good thing? Med Teach 2007; 29: 106–110
  • Zeidner M. Reactions of students and teachers towards key facets of classroom testing. School Psychol Int 1994; 15: 39–53
