
Using criterion-referenced assessment and ‘preflights’ to enhance education in practical assignments

Pages 29-36 | Published online: 15 Dec 2015

Abstract

This paper looks at the reasons students often complain of ‘not enough time’ in practical assignments. One reason is that they come across ‘sticking points’ at a late stage and cannot deal with them satisfactorily. The use of criterion-referenced assessment is one way to identify sticking points before a task is set. A ‘preflight’ (drawing on Just-in-Time Teaching ideas) is a short assignment which helps students identify difficulties in advance and get started on the main task even before it is set. The paper identifies ways of building ‘preflights’ and criterion-referenced assessment into experiential education.

Introduction

Students often complain that they ‘do not have enough time’ to do continuous assessment assignments. In the field, the reason may be that they have not planned sufficiently well; in the laboratory, perhaps, that they did not do a measurement correctly and have to repeat it. Whether in the lab or the field, task feedback can be provided almost instantaneously by a demonstrator or tutor: ‘Try it this way, don’t forget you have to…’, etc. This is useful experiential learning, and the feedback helps to make sure that the activity is accomplished before the departure time. What frequently needs to be shown is some piece of tacit knowledge, and the aim is to make this more explicit.

When students are writing on their own, this ‘lack of time’ often manifests itself in that common human trait, procrastination. When problems arise here, there can be extreme difficulties: failure to do as well as expected (by student or tutor), perhaps through non-completion of part of the task, or, indeed, failure to submit a piece of assessed work at all. A less immediate, but no less serious, problem may arise when knowledge needs to be built upon, and an earlier misunderstanding shows up as a subsequent error in an examination or in practical work. This paper presents some ways in which these difficulties can be recognised and minimised, if not avoided.

The concepts and practical issues discussed here are designed to be part of the learning experience of practical work. The first is the development of criterion-referenced assessment (CRA), both through detailed examination of submissions to help identify students’ ‘sticking points’ and through its subsequent use as a marking scheme that students can relate directly to their learning experiences. The second intervention is the use of ‘preflights’ before practical work (or a lecture) takes place. These interventions are aligned to assessment by the use of CRA.

The ideas and implementations presented here are part ‘tutor focussed’ and part ‘student focussed’, but can be implemented with little difficulty. Some examples are provided. Quotations from year two students (in italics), taken from their web-folios, are used to illustrate some experiences of using these tools.

Experiential learning

Active student participation is an important part of learning (Healey and Roberts, 2004). In fact, rather more than ‘participation’ is required. What tasks are students actually asked to do? There is good evidence from education and cognitive psychology that experiential learning is a major way in which people ‘learn’ (Dror, 2006a; Dror, 2006b; Humphrey, 2006). This is a good justification for field trips and laboratory work. However, there is also a need to make learning experiences part of a pedagogic construct which provides alignment of tasks. Following Biggs (2003), we should therefore provide learning experiences in which the curriculum (the structure rather than the content of what we teach — the syllabus), teaching methods, learning environments and assessment procedures are consistent with one another. Constructivist ideas figure largely in this approach and, although not elaborated upon here, a problem-based learning approach (Coles, 1997; Savin-Baden and Major, 2004) is an important methodology where tutors require their students to have good learning experiences in the laboratory or field.

Problem-based learning (PBL) can be directly related to constructivist approaches; a good discussion is presented by Savery and Duffy (1995). Following Lebow (1993), they suggest eight instructional principles which follow from the constructivist and PBL approach outlined above:

  1. Anchor all learning activities to a larger task or problem

  2. Support the learner in developing ownership for the overall problem or task

  3. Design an authentic task

  4. Design the task and the learning environment to reflect the complexity of the environment in which learners should be able to function at the end of learning

  5. Give the learner ownership of the processes used to develop a solution

  6. Design the learning environment to support and challenge the learner’s thinking

  7. Encourage testing ideas against alternative views and alternative contexts

  8. Provide opportunity for and support reflection on both the content learned and the learning process.

Such principles are all very well for designing learning activities, but there remain instances of students (as individuals or in groups) who have problems in understanding the basic principles behind an experiment or piece of practical work. Further, if we are to be suitably student-centred, how (well) do tutors know the problems students have? In any subject there are concepts to be understood before progress can be made. Several authors have indicated these ‘threshold concepts’ and their importance in students’ understanding (Cousin, 2006; Land et al., 2005; Meyer and Land, 2003).

Tutors often complain in general terms that students’ work schedules are too crowded to fit in more practical work. However, as suggested in the introduction, the perceived lack of time often arises because students leave things until the last minute, and relatively minor difficulties for the experienced can become major problems for the inexperienced. The devices outlined in this paper relate largely to principles 2, 4, 6 and 8 listed above. Although we are concerned here with practical tasks in physical geography and geology, these ideas could be used in many educational settings.

Difficulties with concepts, tacit knowledge base and ‘sticking points’

Students frequently have problems with specific ideas or concepts (e.g. ‘regression’, ‘pH’) or generalities (e.g. ‘numbers’). Examples of these ‘threshold concepts’ and ‘troublesome knowledge’ have been recorded in Planet 17 (2006), where ideas and interventions to alleviate the problem have been discussed. There are also problems students encounter with a new topic, perhaps one that goes beyond their previous knowledge; these, too, are usually easy to identify. Less easy to spot are difficulties when students think they know about a subject, perhaps from A Level or from a previous year. We have found that this is particularly true for techniques (such as sampling) or tools (such as Excel). Further, there are topics that tutors believe their students should know about; this is often the case with module systems, where students have forgotten something, have not understood it fully, or only believe they have understood it. Finally, there are problems concerned with tacit knowledge. Tacit knowledge, following Polanyi (1967), is that which is not made explicit. It has become increasingly discussed in a wide variety of knowledge environments beyond the educational (Baumard, 1999; Collins, 2001; Eraut, 2000; Goranzon and Ennals, 2005). It will not be discussed in detail here: suffice it to say that it is increasingly significant in informal learning, such as that provided by groupwork in practicals and laboratories. Furthermore, tacit knowledge is likely to be important in situated learning, where social learning matters and individual help is usually close at hand (Mayes and de Freitas, 2007).

We call issues which cause the variety of difficulties mentioned above ‘sticking points’. Sometimes a few minutes of instruction or guidance (which may come from other members of a group or the tutor) will solve the issue. How, then, can these sticking points be identified? Perhaps neither staff nor students realise their presence until very late in the period allowed for a practical submission. We have tried two methods of identification: ‘advance feedback’ and ‘preflights’. The former is a list of mistakes, or things not done well or correctly, identified from previous years and reported on the module website before students start their practical. This can amount to a substantial list. However, despite being shown it at the start of the practical, many students fail to use it, even though doing so could gain them extra marks. The reasons for this are not clear; it may simply be that the list is too long and that students cannot identify from it the mistakes or omissions that they are about to make, or have made. It does, however, point to a wider problem: if students do not use feedback from previous years, what does this imply about their use of feedback on their own work? Before explaining ‘preflights’, we turn to a method which is important both in assessment and in helping to identify ‘sticking points’.

Table 1 A simple typology of ‘sticking points’ (see also text related to ‘troublesome knowledge’).

Criterion-Referenced Assessment (CRA)

We have no data on the marking schemes for practical work used by tutors at our own or other institutions, in GEES subjects or more generally. There may be generalised criteria for essays and essay-based examinations, but these are not applicable to reports and practical work. However, asking colleagues (at several GEES meetings) suggests that many practical and fieldwork submissions are graded according to a perceived notion of expectations for the piece of work; a mental checklist of items that students should include. The expectations may provide a mark scheme, but one that is only implicitly approved by the marker. This has been called a ‘connoisseur approach’ (Rust et al., 2003). In preference, a criterion-referenced assessment scheme should be used (Harvey, 2004; Whalley, 2008). Such schemes are supported in the current literature as being explicit and student-centred, and they improve student understanding and provide consistency in marking (Price and O’Donovan, 2006; Saunders and Davis, 1998; Woolf, 2004). Our experience supports the use of CRA in practical reporting and for fieldwork notebooks. We shall report this work and the tool used in detail elsewhere. Here, we report some ways in which CRA can be used to identify ‘sticking points’ and to produce better student understanding and attainment.

Detailed evaluation of the task is used to produce a set of criteria which correspond to the main activities in the practical piece of work. There may be any number of criteria (e.g. fifty or more) depending upon the task set. This requires careful reading of student submissions, and thus takes considerable time. However, it does provide a basis for subsequent CRA and for the production of a simplified mark scheme for tutors and students. Use of detailed marking schemes has revealed instances of students not appreciating features pointed out on a fieldtrip, things being pointed out incorrectly (by a postgraduate assistant), incorrect use of a formula, and so on. The patterns of problems are identified by this detailed CRA approach. Problems so identified were, or could subsequently have become, ‘sticking points’, so detailed criterion marking helps to show where correction, additional instruction or feedback is needed.

Students are presented with a much simpler scheme, usually of six to twelve categories or sub-topics, each worth from five to twenty marks. Figure 1 is an example for a field and laboratory practical in which students examined grain size variation across a beach-to-dune system. This scheme was made explicit to students before they embarked upon the work. There is thus alignment between student participation, the practical requirement and the assessment. Student marks in each category are mailed back after marking, together with appropriate comments (Figures 2 and 3). Students can thus see their marks against each part of the task, and this provides specific feedback. Furthermore, the statistics of the class as a whole can be included (on the module website together with more detailed comments) or mailed to each student (Figure 4).

Due to [the] marking system I was able to see where I went wrong and therefore hopefully work on these things in future practicals.

Figure 1 A criterion-referenced assessment for a practical in a year 2 geomorphology module.

Figure 2 Part of the spreadsheet mailed back to students individually, showing marks achieved in each sub-topic (1–9) and the remarks made. The numbers shown here for 1–9 are checks of the spreadsheet, showing the maximum mark achievable for each.
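By way of illustration, the mail-back step could be automated along the following lines. This is a minimal sketch, not the actual facility we used: it assumes a hypothetical file marks.csv with one row per student, an email column, and one column per sub-topic, and simply prints each student’s summary, with class statistics, where a mail-merge would send it.

    import csv
    import statistics

    # Hypothetical layout of marks.csv: student_id, email, then one
    # column per sub-topic criterion (e.g. "1" to "9"), each holding a mark.
    with open("marks.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    criteria = [k for k in rows[0] if k not in ("student_id", "email")]

    # Class mean and best mark per criterion, so students can see where they stand.
    stats = {c: (statistics.mean(float(r[c]) for r in rows),
                 max(float(r[c]) for r in rows)) for c in criteria}

    for r in rows:
        lines = [f"Marks for {r['student_id']}:"]
        for c in criteria:
            mean, best = stats[c]
            lines.append(f"  sub-topic {c}: {r[c]} (class mean {mean:.1f}, best {best:.0f})")
        print("\n".join(lines))  # a real mail-merge would e-mail this to r['email']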

‘Preflights’ and experiential tasks

If student ‘sticking points’ can be identified before going into the field or laboratory, then learning experiences are likely to be maximised and made more enjoyable (and, in the case of bad weather, perhaps less unpleasant). A basic design for a fieldwork or laboratory task suggests a formal linkage of pedagogic structure, student activities and tasks, and assessment and feedback. How might this best be arranged and implemented?

‘Preflights’ help students tackle the various difficulties mentioned above. The name follows from pre-flight checks (rather than pre-publishing checks, although their derivation is the same); ‘warm up’ is an alternative name. These activities are done in advance of the events where they are required (lecture, lab or fieldwork). The idea has been developed in particular by Novak et al. (1999) as part of a ‘just-in-time teaching’ arrangement.

The various schemes devised by Oliver and Herrington (Oliver et al., 2007; Oliver and Herrington, 2001) provide a good starting point for setting up an appropriate preflight in conjunction with the identification of ‘sticking points’ mentioned above (Figure 5). The important aspect is that preflights should not be a major task for students — although we feel that they should be as experiential as possible in tackling the problem or warming up for the event. They should involve an active task, not just ‘read up on’ — although reading may be a necessary part of the task. Examples we have used include: ‘write a spreadsheet to do this calculation’, ‘define these terms via Wikipedia’, ‘plan the sequence of events to do x’, ‘using theory, plan what you will do in the field’ and ‘write a simple HTML index file’. In essence, these can (and should) be done in a short period of time, perhaps 20 minutes. If the task does take longer than expected, then students know that there is a problem. How students (and tutors) tackle this emergent problem depends upon circumstances and the support mechanism(s) provided.
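To make the first of these examples concrete, the sketch below transposes the ‘write a spreadsheet to do this calculation’ preflight into Python rather than a spreadsheet; the sieve data are invented for illustration. A student might be asked to compute the mean grain size and sorting of a sample by the method of moments before the beach-to-dune practical.

    # Hypothetical preflight: mean grain size and sorting from sieve data.
    # phi: grain size in phi units; weight: mass (g) retained on each sieve.
    phi = [-1.0, 0.0, 1.0, 2.0, 3.0]
    weight = [5.2, 18.4, 41.7, 25.1, 9.6]

    total = sum(weight)
    # Method-of-moments mean and standard deviation (sorting).
    mean_phi = sum(p * w for p, w in zip(phi, weight)) / total
    sorting = (sum(w * (p - mean_phi) ** 2
                   for p, w in zip(phi, weight)) / total) ** 0.5

    print(f"mean = {mean_phi:.2f} phi, sorting = {sorting:.2f} phi")

A result that looks wrong, or a calculation that takes far longer than the 20 minutes suggested above, flags a sticking point before the practical itself.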

Figure 3 Generalised list of marking criteria. The mail facility inserts an individual’s name, marks and appropriate comments from the spreadsheet listing, as in Figure 2.

Figure 4 Spreadsheet section mailed to students to show where they stand in relation to the class.

‘Preflights provided the basis for practical work and were a great asset.’

The preflights can be e-mailed in to the tutor and may be assessed or not, and made advisory or mandatory before going to the event (field or laboratory). From experience, we suggest that preflights are assessed, not as standalone assignments but integrated into the summative CRA scheme (e.g. Figure 6). The marks may be only 5% of the whole, and the checking can be done by individuals or groups as much as by the tutor.

If submitted, e-mails can be checked rapidly to see whether students have accomplished the task or not. In fact, depending on what is set up, you may only need to see that students have done the task rather than scrutinise the actual result. In other cases, students can be asked to check their results against a pre-derived answer which is sent to them or placed on the module website. It is then their responsibility to see what they did: if correct, fine; if there is an error, they need to revisit the ‘preflight’ task. This process involves students in their own learning and can provide self-generated feedback.
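Where the result is numerical, this self-check amounts to a comparison within a tolerance. A minimal sketch, with invented values, of what a student (or a tutor’s checking script) might do:

    # Hypothetical self-check against the tutor's pre-derived answer.
    REFERENCE = 1.86   # posted answer (invented value)
    TOLERANCE = 0.05   # allowance for rounding differences

    my_result = 1.84   # the student's own preflight result

    if abs(my_result - REFERENCE) <= TOLERANCE:
        print("Within tolerance: ready for the practical.")
    else:
        print("Outside tolerance: revisit the preflight task first.")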

‘Again at the beginning I thought pre-flight’s were time consuming and I was annoyed at the fact that they weren’t graded. However, after completing a few pre-flight’s I realised that they helped me plan for my practicals and helped me understand the practical more.’

Figure 5 Temporal ‘activity diagram’, modified from Oliver et al. (2007), for a rule-based learning design. Designs are also available for incident-based, strategy-based and role-based systems.

Figure 6 A practical mark scheme where element 1 was automatically given if the preflight was sent in advance. In this case the task was to set up a PowerPoint page in poster format; element 9 also counted in the pre-flight task.

‘I found completing these preflights rewarding as they gave you an early insight into what to expect in the actual practicals.’

As the previous student comment shows, not all preflights need to be marked, although some were, and the experience and value were assumed to carry over. The first practical task, however, had an automatic mark awarded if the ‘preflight’ was submitted within five days of its publication (Figure 6). This was only 5%, and the preflight might not have been ‘worth’ 5% of the total, but another element was also incorporated. The importance here is not just to identify sticking points, valuable though this may be, but to get students started on the task and to provide incentives (Hogarth and McKenzie, 1991). Students know that they will be rewarded just for doing this. Such ‘marks in hand’ incentives are akin to ‘conditional cash transfers’ in development aid (Rawlings and Rubio, 2005).

Even though setting up the equation didn’t take long, the fact that it was already taken care of in a preflight helped relieve the pressure.

JiTT and ‘Warm ups’

Just-in-Time Teaching (JiTT) is:

a technique for teaching and learning that uses the Internet to improve student success in college science courses by enhancing and extending classroom instruction via the Web.

Novak et al. (1999), in their presentation of Just-in-Time Teaching, equate ‘preflights’ (the term used at the US Air Force Academy) with ‘warm ups’ (used at Indiana University–Purdue University Indianapolis). Here we use the former for a more formal check that all is well beforehand, perhaps as part of an assessed piece of work — something that needs to be done. We view a ‘warm up’ as a formative task to get students involved, perhaps before a lecture. In effect, there is little to differentiate the two other than the style of implementation. Tasks for both can be set in class, although web-based technologies provide the most flexible medium of operation. Novak et al. (1999) list examples of such tasks and activities, although all are in physics and engineering. Some geomorphological examples of ‘warm ups’ are given at: http://web.gg.qub.ac.uk/people/staff/whalley/teaching/jitt/warmups.html.

Students ‘preflighting’ their own work

We have found that students often fail to achieve their potential in a piece of assessed work by not carefully re-reading it and looking for mistakes and non sequiturs (perhaps themselves the result of ‘sticking points’). Unsurprisingly, this is often because they leave things until the last minute and then have little time to re-read. It is partly because they lack experience of this requirement; a ‘save-print-hand in’ attitude is common, particularly where the time left may be of the order of a few minutes before a deadline. One way of showing the benefits of a re-read is to make a practical submission a preflight itself. For example, an extra day can suddenly be allocated on submission day; students then have the chance to re-read their work, make corrections and (re-)submit, thus seeing the advantage and importance of this time allocation. In this sense, the ‘preflight’ is similar to debugging a computer program. However, this ploy cannot be used more than once a session or students will begin to expect extra time!

Summary: Problem-based learning, apprenticeships and tacit knowledge

It has been argued elsewhere (Whalley, 2008) that students need experience in skills and problem-solving techniques, and that these tasks need to be experiential. However, students cannot easily be taught how to solve problems, although some techniques and schemes (Allen and Allen, 1997; Eco, 1980) can be very helpful. In effect, students need an apprenticeship, especially to experience and practise the tacit items of a knowledge base and the tricks of the trade. For the most part, providing course content is not a problem; identifying sticking points is a way to provide this assistance. E-learning can be brought in in ways which can be tailored to modules and local circumstances (Savin-Baden, 2007). ‘Preflights’ and warm ups do not need to be mandatory or summatively assessed. Using preflights depends upon a certain amount of imagination but, coupled with criterion-referenced assessment, they help the learning experience and give students confidence. Class variability does not allow us to say statistically that late submissions have been reduced, or that marks have been ‘better’. Nevertheless, these techniques, especially when coupled, make for better student engagement and involvement, and can use CRA itself as one form of feedback. ‘Preflight’ activities provide active engagement as well as helping to avoid ‘sticking points’, and can be mediated by a variety of web-based technologies such as web-form submission. Imagination from tutors and ease of use by students are also important in identifying sticking points and troublesome knowledge, and in implementing these devices in an active and experiential learning environment.

References

  • Allen, R. E. and Allen, S. D. (1997). Winnie-the-Pooh on Problem Solving. London: Methuen.
  • Baumard, P. (1999). Tacit Knowledge in Organisations. London: Sage.
  • Biggs, J. (2003). Teaching for Quality Learning at University. Buckingham: Open University Press.
  • Coles, C. (1997). Is problem-based learning the only way? In The Challenge of Problem-Based Learning (D. Boud and G. Feletti, Eds.). London: Kogan Page, 313-325.
  • Collins, H. M. (2001). What is tacit knowledge? In The Practice Turn in Contemporary Theory (T. R. Schatzki, K. Knorr-Cetina and E. von Savigny, Eds.). London: Routledge, 107-119.
  • Cousin, G. (2006). An introduction to threshold concepts. Planet 17: 4-5.
  • Dror, I. (2006a). It is not what you teach, but what they learn that counts! Learning Light, October: http://www.learninglight.com/FileRetriever.aspx?id=129 (accessed 28 March 2008).
  • Dror, I. (2006b). The architecture of human cognition paves the way to efficient and effective learning. Learning Light, October: http://www.learninglight.com/FileRetriever.aspx?id=119 (accessed 28 March 2008).
  • Eco, U. (1980). The Name of the Rose. London: Picador Books.
  • Eraut, M. (2000). Non-formal learning and tacit knowledge in professional work. British Journal of Educational Psychology 70: 113-136.
  • Goranzon, B. and Ennals, R. (2005). Dialogue, Skill and Tacit Knowledge. Chichester: Wiley.
  • Harvey, H. (2004). Assessment criteria: reflections on current practices. Assessment & Evaluation in Higher Education 29: 479-493.
  • Healey, M. and Roberts, J. (2004). Engaging Students in Active Learning: Case Studies in Geography, Environment and Related Disciplines. Cheltenham: Geography Discipline Network, University of Gloucestershire, 140 pp.
  • Hogarth, R. M. and McKenzie, C. R. M. (1991). Learning from feedback: exactingness and incentives. Journal of Experimental Psychology: Learning, Memory, and Cognition 17: 734-752.
  • Humphrey, N. (2006). Seeing Red: A Study in Consciousness. Cambridge, MA and London: Belknap Press of Harvard University Press.
  • Land, R., Cousin, G., Meyer, J. H. F. and Davies, P. (2005). Threshold concepts and troublesome knowledge (3): implications for course design and evaluation. In Improving Student Learning: Equality and Diversity. Oxford: OCSLD.
  • Lebow, D. (1993). Constructivist values for systems design: five principles towards a new mindset. Educational Technology Research and Development 41: 4-16.
  • Marrs, K. A. and Novak, G. (2004). Just-in-Time Teaching in biology: creating an active learner classroom using the internet. Cell Biology Education 3: 49-61. http://www.lifescied.org/cgi/content/full/3/1/49 (accessed 24 March 2008).
  • Mayes, T. and de Freitas, S. (2007). Learning and e-learning: the role of theory. In Rethinking Pedagogy for a Digital Age: Designing and Delivering E-learning (H. Beetham and R. Sharpe, Eds.). London: Routledge, 13-25.
  • Meyer, J. H. F. and Land, R. (2003). Threshold concepts and troublesome knowledge (1): linkages to ways of thinking and practising within the disciplines. In Improving Student Learning — Ten Years On (C. Rust, Ed.). Oxford: Oxford Centre for Staff Learning and Development. Also available as ETL Project Occasional Report 4.
  • Novak, G. M., Patterson, E. T., Gavrin, A. D. and Christian, W. (1999). Just-in-Time Teaching. Upper Saddle River, NJ: Prentice Hall.
  • Oliver, R., Harper, B., Wills, S., Agostinho, S. and Hedberg, J. (2007). Describing ICT-based learning designs that promote quality learning outcomes. In Rethinking Pedagogy for a Digital Age: Designing and Delivering E-learning (H. Beetham and R. Sharpe, Eds.). London: Routledge, 64-80.
  • Oliver, R. and Herrington, J. (2001). Teaching and Learning Online: A Beginner’s Guide to E-learning and E-teaching in Higher Education. Mt Lawley, Western Australia: Edith Cowan University.
  • Polanyi, M. (1967). The Tacit Dimension. New York: Doubleday.
  • Price, M. and O’Donovan, B. (2006). Improving performance through enhancing student understanding of criteria and feedback. In Innovative Assessment in Higher Education (C. Bryan and K. Clegg, Eds.). Abingdon: Routledge, 100-109.
  • Rawlings, L. B. and Rubio, G. M. (2005). Evaluating the impact of conditional cash transfer programs. The World Bank Research Observer 20: 29-55.
  • Rust, C., Price, M. and O’Donovan, B. (2003). Improving students’ learning by developing their understanding of assessment criteria and processes. Assessment & Evaluation in Higher Education 28: 146-164.
  • Saunders, M. N. K. and Davis, S. M. (1998). The use of assessment criteria to ensure consistency of marking: some implications for good practice. Quality Assurance in Education 6: 162-171.
  • Savery, J. R. and Duffy, T. M. (1995). Problem based learning: an instructional model and its constructivist framework. In Constructivist Learning Environments: Case Studies in Instructional Design (B. Wilson, Ed.). Englewood Cliffs, NJ: Educational Technology Publications, 135-148.
  • Savin-Baden, M. (2007). A Practical Guide to Problem-Based Learning Online. London: Routledge.
  • Savin-Baden, M. and Major, C. H. (2004). Foundations of Problem-Based Learning. Buckingham: Society for Research into Higher Education and Open University Press.
  • Whalley, W. B. (2008). What should a (Geography) degree for the 21st century be like? Planet 19: 36-41.
  • Woolf, H. (2004). Assessment criteria: reflections on current practices. Assessment & Evaluation in Higher Education 29: 479-493.
