Engineering Education
a Journal of the Higher Education Academy
Volume 8, 2013 - Issue 2

Research Article

Using Online Testing for Engineering Studies

Pages 77-89 | Published online: 15 Dec 2015

Abstract

The use of online resources in the assessment of engineering modules is generally being encouraged in higher education institutions. This paper reflects upon the experience of open book testing of first year engineering students in two elements of assessment in a continuously assessed, two-semester module. This mode of assessment is more convenient for the large percentage of part-time students on the programme, who only attend one day per week. A longitudinal study over a four year period shows that not all students like to be assessed in this way, even though evidence indicates improvement in pass rates over those achieved from more traditional modes of testing.

Introduction

In higher education, we are being encouraged towards greater use of online resources for teaching and assessment. As a late-career academic who came to teaching after a career in industry, my initial concern was teaching the right material at the right level. I soon realised that with large class sizes the marking load could be quite high and that, since online tests could be set up on our student web service, doing at least one assessment online in a module would be helpful, both to myself and my students. This is particularly true for those who study part time, as they are working in industry and only attend for one intense day per week. They have to fit personal study into evenings and weekends, so effective time management is crucial for them.

I decided to start with a first year module where the teaching is over 24 weeks/two semesters and assessment is continuous. My thinking was that if one test per semester were online/open book it would help both student and lecturer workload management. The other assessments were then a report/essay and an exam style final test each semester. My expectations were that students would prefer this open book mode of assessment to more formal, exam style testing, marking and feedback and that it might be possible to enhance pass rates for the module by using online resources, under my previous personal assumption that most students are web savvy.

Pedagogical background

The pedagogy of using technology for teaching and assessment has its roots in twentieth century behavioural and cognitive theories of education (Harasim 2012). Behaviourist theories from the early twentieth century (Thorndike 1905, Skinner 1948) rely upon the observation of people’s behaviours and how to change them, and are based upon empirical, observable and measurable factors and responses to stimuli. In terms of teaching, this means that the teacher provided the stimuli (information, formulae, experiments or ideas) and the learner reacted to them by accepting, or questioning if they did not understand the material provided. This is quite a traditional way of teaching: it is rigid and inflexible, and does not help the teacher to know whether the learner has understood, as understanding cannot be seen or measured and only by testing can the student ‘be seen to have learned’ (Harasim 2012). In engineering teaching, the model (Goodhew 2010) was to ensure that the design process (the formulae and calculations) was communicated; as long as the student could emulate it at exam time, they could pass the course. Actual understanding of the material, for many students, did not come until well into their time in professional practice, when confronted by real world problems to solve.

Moving forward into the mid-twentieth century, cognitive theories emerged (Atkinson & Shiffrin 1968, Baddeley & Hitch 1974). These stressed the importance of the mind (the unseen) in learning and attempted to emulate its functions by providing artificial and intelligent tutoring mechanisms that mimic the mind’s processes. These still required the learner to assimilate what was taught, but it was possible to do this at the most appropriate time for them, which might not be in a classroom environment. It was a complex issue to cater for the different modes of learning of all students, such as preferences for visual, sensory or auditory assimilation of data, alongside the required formula and design process content. In terms of teaching, a generic teaching model and economies of scale were applied.

Most recently, constructivist theories of learning have evolved (Vygotsky 1962, Piaget 1970). These require the student to take an active role and responsibility for his/her learning within the context of making sense of their own world. These theories recognise that an individual cannot be programmed and that their way of being depends upon nature, nurture and, most importantly, the social context they live in. It is these facets of human life that dictate how learners construct and interpret meaning for themselves, via transactions with teachers, peers and their social and knowledge networks (Laurillard 2012). In constructivism, knowledge is not absolute or even static, but changes in the perceptions of the learner over time and depends upon the context they find themselves in at any given point in time. From this perspective, learners are the creators of their own knowledge and the teacher’s role, then, is to facilitate access to knowledge sources and networks that meet both their needs and those of their programme of study (Biggs 2003). This is where online resources are particularly applicable, as they have the ability to provide a wide range of information sources and knowledge networks that the student can engage with, guided by the teacher, who is him/herself the hub of the knowledge network for his/her students (Wolf & Kolb 1984, Bransford et al. 2004).

Goodhew (2010), in his book about teaching engineering, refers to chartered engineers as creative problem solvers and innovators. They need to be ‘rational and pragmatic, interested in the practical steps necessary for a concept to become reality … want to solve problems … and have strategies … employing their knowledge in a flexible manner’ (Engineering Benchmark Statement, QAA, 2006, cited in Goodhew 2010: 10). These abilities and skills do not come naturally to all engineers, so somehow they need to be taught. It seems that online resources have a role to play in facilitating this knowledge transfer, due to their facility for accessing a range of sources and explanations of theory in a format that is digestible to students with different interests and abilities (Creanor et al. 2007–2008).

The role of information technology in engineering teaching

Given the global social context of life that is facilitated by web and telecommunications today, it can be no surprise that the attitudes, behaviours, expectations and ways of learning of students today are different from those of previous generations. The sound bite, all access, shared information culture that has emerged over the last 20 years means that ways of gathering information and knowledge are very different in the twenty-first century from before. No longer is the teacher the fount of all knowledge and expertise – this is available freely via a range of online resources – but the teacher needs to be a guide to what the appropriate knowledge is, where to access it and how to interpret it (Case 2008). This means that the way engineering is taught must also be quite different in future.

Data overload

Over time, the amount of knowledge students are expected to assimilate has increased. I remember that when I was in my final undergraduate year, the year below me was already studying a proportion of what I was studying. Thirty-five years later, the expansion of knowledge has been exponential and not only do graduates have to be engineers; they also have to think globally, be entrepreneurs, intrapreneurs and team players. This is in order to have the transferable skills required to compete in a global market [e.g. as in the Institution of Civil Engineers (ICE) strategy (ICE 2013) and other Engineering Council member institutions] and means that, as teachers, we must learn to support and facilitate in ways that may seem quite alien to us (Farrell 2003). After all, we were taught in a different learning and teaching paradigm and our comfort zone may well still be there.

Another issue is the underpinning learning that students have had before they come to study. This may differ wildly due to culture, language, types of school or college qualifications taken. Lack of prior access to information technology (IT) services may also limit the students’ ability to engage with all the web-based services that are available to them. I certainly assumed when I started teaching that all students younger than me were au fait with social and learning technologies that I often struggle to engage with and interpret myself, but many are not.

Using online resources in teaching

There is a range of services that online resources can provide to support and facilitate learning and teaching. They do need to be moderated by the teacher though, as not all forms of IT are suitable or appropriate for all types of learning and teaching.

At the early stages of learning, it may be better to control access to external, uncontrolled data until the student is discerning enough to use it properly (Goodhew 2010). Internal virtual learning environments (VLEs) and other university fora can provide a wide range of learning resources, including lecture slides, videos, podcasts, tutorial answers, discussion boards, blogs, wikis and generalised feedback information, as well as online tests and assignments which can be carried out remotely by the student and automatically marked or moderated by the teacher. Excel spreadsheets and modelling software can facilitate the exploration of theory through practical examples/experiments that test the students’ understanding.

As the student develops and progresses and more complex concepts are studied, access to external online resources is useful to support guided learning (Fry et al. 1999). These can be incorporated into lecture slides and introduced to the student as part of a wider world of learning that they can choose to access. They must be seen to be relevant to the discourse, and their introduction needs to be timed to demonstrate the more global aspects and concerns that arise at a given point in the teaching. They can also be used to recapture attention at times when dense or complex material is being imparted. Links to regulatory guidance, professional body resources or global environmental scenarios, maps and climate change charts, for example, can broaden the impact of the face-to-face learning and allow the student to explore his/her own interests alongside the core material in a controlled and guided way.

What online resources can and cannot do

Online services are a resource for learning and teaching, but are not a substitute for it. The student still has to assimilate the knowledge and the teacher still has to impart it, or guide the discovery of it in a digestible form. Online resources can, however, make it more fun to learn and show the relevance of the taught material to the real world the student will eventually have to work in (Bates et al. 2007–2008). They can help students to keep up by accessing material in their own time and at the time of day when they learn best, which is not necessarily in class. They should also help to enhance progress and grades.

Methodology

An online test each semester was instituted in 2009 for first year fluid mechanics and thermodynamics students in a 24 week taught module. The test comprises a series of mathematically oriented questions which require the use of theoretical equations to solve engineering problems. This format works well with calculations that have a specific answer, and rounding can be accounted for by setting a range around the answer. Some questions have several sections and some have one overall answer. Although all questions carry the same overall score, limitations of layout in the software mean that questions with fewer sections can appear to be marked more heavily.

A typical question would be:

A reversible Rankine steam power plant operates with a boiler pressure of 40 bar and a condenser pressure of 1 bar. Both the turbine and the feed pump operate adiabatically and the work required by the feed pump is negligible. The temperature at the turbine inlet is 600°C.

  1. Determine the enthalpy at the main four state points in the cycle (10)

  2. Determine the thermal efficiency of the plant (5)

  3. Determine the temperature at turbine outlet (5)
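A question of this kind can be worked through numerically as below. This is an illustrative sketch only: the steam-table figures are rounded values assumed for demonstration, and the script is my own, not part of the module's marking scheme.

```python
# Illustrative Rankine cycle solution; the steam-table values below are
# rounded approximations assumed for demonstration purposes.

# State 1: turbine inlet, 40 bar, 600 degC (superheated steam table)
h1 = 3674.0   # kJ/kg
s1 = 7.369    # kJ/kg.K

# Saturation properties at the condenser pressure, 1 bar
hf, hg = 417.5, 2675.0   # kJ/kg
sf, sg = 1.303, 7.359    # kJ/kg.K
t_sat = 99.6             # degC

# State 2: turbine outlet. Expansion is reversible and adiabatic, so s2 = s1.
# Here s1 marginally exceeds sg, so the exit is just superheated; treating it
# as dry saturated steam is a reasonable approximation.
x2 = (s1 - sf) / (sg - sf)        # dryness fraction (slightly above 1)
h2 = hg if x2 >= 1.0 else hf + x2 * (hg - hf)

# States 3 and 4: saturated liquid at 1 bar; pump work is negligible, h4 ~= h3.
h3 = h4 = hf

# Thermal efficiency = net work out / heat supplied in the boiler
eta = (h1 - h2) / (h1 - h4)

print(f"h1 = {h1}, h2 = {h2:.0f}, h3 = h4 = {h3} kJ/kg")
print(f"Thermal efficiency ~= {eta:.1%}")
print(f"Turbine outlet temperature ~= {t_sat} degC (just superheated)")
```

With these assumed property values the efficiency comes out at roughly 31%, and the turbine outlet temperature sits just above the 1 bar saturation temperature; marking within a range absorbs the effect of different students rounding the table look-ups differently.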

Similar examples are given in class and the students can refer to class notes (open book) when attempting the test, in their own time. The test is automatically assessed by the grade centre software. Feedback is given instantly: an answer is marked correct if it falls within a narrow range of the expected value, and the correct answer is cited as feedback.
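The range-based marking can be sketched as follows. The function name and the one per cent tolerance are my own illustrative assumptions, not the grade centre's actual implementation.

```python
# Sketch of range-based marking for numerical answers, used to absorb
# rounding differences. The function name and the default tolerance are
# illustrative assumptions, not the grade centre's implementation.

def mark_numeric(submitted: float, correct: float, rel_tol: float = 0.01) -> bool:
    """Accept an answer if it falls within +/- rel_tol of the correct value."""
    return abs(submitted - correct) <= rel_tol * abs(correct)

# Example: an efficiency question with correct answer 0.307. A student
# rounding intermediate enthalpies might submit 0.305.
print(mark_numeric(0.305, 0.307))   # within the 1% band -> True
print(mark_numeric(0.32, 0.307))    # outside the band -> False
```

The width of the band is a judgement call: too narrow and legitimate rounding is penalised, too wide and genuinely wrong working can slip through.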

The students are given a dummy test to allow them to practise at answering questions open book in this format, to help to overcome any lack of expertise in using the software. They can access this dummy up to five times and for as long as they wish in a sitting. It remains open for three weeks, in advance of the actual test period. The dummy test is shorter than the actual test, but has a number of similar questions. Its main purpose is familiarisation with the software and test administration, before doing the actual test.

The actual assessed test then follows the same format as the dummy but with more questions; it allows a single attempt of one hour and is open for three weeks to allow students to take it at a time to suit their workload (each online test is worth 15% of total marks). Two other assessments, one a written piece of work (20% of total marks) and the other a formal test (15% of total marks), are also taken in each semester. After the first, trial test, the students were canvassed for their acceptance of the test through a student feedback survey. After a bedding-in period, during which verbal feedback was gathered and used to improve the test, this survey was repeated to test whether the outcomes and student views had changed.

Findings

Marks in general

A general observation is that, as shown in Table 1, over the four years since its inception, most students pass these online, open book tests. If they do not, it is usually because they have not practised effectively with the dummy test, press the submit button too early, or run out of time (noted from verbal student feedback). They appreciate, in the main, being able to do the tests at a time that suits them best. As we have a large percentage of part-time students, this flexibility of assessment is helpful to them. The first year, 2009–2010, was a pilot year for the test and the feedback from students was fed into the subsequent testing set-up.

Table 1 Marks for online tests from 2009–2010 to 2012–2013.

There are always some students who have difficulty executing these types of test correctly, but most get a better pass rate for these tests than for more formal styles of testing (Table 1, compared with Tables 2–5). Over the 24 weeks of module teaching this balances the marks overall, testing not just memory, but the ability to use theory in different ways and apply it under different test conditions, as well as identifying where numeracy and writing skills require work. Over time, the test has been adjusted with student feedback to:

  • Open up all questions at the start to allow students to choose which to do first

  • Add a value range to answers to allow for calculation rounding

  • Cite the actual expected answer so students can rework offline and see where they went wrong.

This has enhanced performance, although the questions themselves have not changed in the period.

Table 2 Marks for other tests in 2009–2010.

Table 3 Marks for other tests in 2010–2011.

Table 4 Marks for other tests in 2011–2012.

Table 5 Marks for other tests in 2012–2013.

Another interesting finding is that students fare worse in written coursework than in numerical class tests, possibly due to a focus on science/mathematics rather than verbal/writing skills during pre-learning. In each semester, the first numerical test to be taken is the online one, with the second being a formal exam style test in the last week. Written courseworks are issued in the middle of the semester for submission in around week 10. This is to spread the workload, and cognisance is taken of work being issued on other modules, via consultation with colleagues and the students themselves.

It is also interesting to note the variation in marks with cohort, as every year presents different data. Some years, e.g. 2012–2013, have a cohort with poorer writing/analytical skills than other years, whereas numerical testing reveals greater numeracy. In other years, this trend is reversed, and in some years everyone seems to converge on the middle ground in both areas of skill.

This would seem to indicate that assessment is not an exact science, in that all people respond differently to different ways of being tested and their variety in any year cannot be predicted. The best we can do then, as assessors, is to:

  • Listen to student feedback

  • Take a balanced view of what they indicate may be helpful to their performance

  • Use established and recognised assessment designs

  • Monitor a range of cohorts over several years to ensure that fairness and learning outcomes are achieved.

Online testing

The modus operandi for taking the tests and strong encouragement to practise with the dummy test first were highlighted several times in the run up to the testing period in both years the survey was run. The online testing has run for four years now and pass rates for it have generally been high.

The first survey about the online test for first years showed (Table 6) that, although the majority of respondents (87%) found the test easy to navigate, 83% did not feel they had sufficient feedback, and only 47% actually liked doing tests in this way. During post-survey discussion with students, they indicated that:

  • They prefer open book testing, but would like it:

    1. in a formal test setting or

    2. online without a time limit [see Table 6, comments (number of students with this preference indicated in parentheses)] and

  • They would like more details of the solution, or a link to one, and the calculated units in each question should be specified.

These responses were taken into account and the answers to the test question are now incorporated into the module documentation.

Table 6 The 2009–2010 student feedback survey.

Some students failed to log in/out correctly, despite having tried the dummy test, which is accessed and submitted in the same way. Some students asked for all questions to be open at the same time so they could choose which to answer first, as they would in a normal test.

These inputs were fed into the following year’s test format, which had units specified, answer ranges and all questions available at once, to enable the students to choose which to do first. Submission of the test only occurs once the student is satisfied he/she has answered them all, unless he/she runs out of time, in which case the test closes, without marking the last uncompleted question. Any student who fails to get a pass mark is offered a paper version of the test.

In the second survey four years later (Table 7), again the majority of respondents (95%) found navigation easy, but said they would prefer individual question pages – the opposite of the original cohort – or a better layout of questions. Again, some could not sit the test, despite having tried the dummy test, and some were locked out, ‘not realising’ it was a one-shot test. Some (Table 7, comments) asked for a worked example, which is given in lecture notes and should have been a help in executing the online test. Again, students asked for details of, or links to, a solution, even though lecture notes have similar examples. Some students said it was easier to make mistakes online under a time constraint, and one thought the dummy test should be directly relevant to the actual test. In fact, the dummy test questions are similar to the first couple of actual test questions. One student in this survey thought the questions should be evenly rated so that incorrect answers would not skew the results.

Table 7 The 2012–2013 student feedback survey.

Student engagement in the survey

This feedback is interesting in that, despite continual improvement in the test administration over the years, student perceptions do not really seem to reflect that improvement. In fact, at the last survey, a greater percentage of respondents did not like doing tests in this way (57% in 2012–2013 versus 47% in 2009–2010). The percentage of the cohort responding to the survey also increased in the second survey (68% in 2012–2013 versus 48% in 2009–2010). It is not clear without further investigation whether this is significant in terms of engagement generally, or in terms of preference for test method.

Pass rates and progression

Pass rate and progression analysis is complicated by students who withdraw and/or resit more than once. Students are allowed up to four attempts at a module overall and so in some cases will carry one forward to the following session. The data in Table 8 attempt to smooth out these issues by removing those students from the analysis.

Table 8 Smoothed out module pass rates since the online testing has been in place.

Student numbers vary, partly due to changes to other programmes, which at various times engage with civil engineering during their first year of study, and partly due to the economic climate, which reduces part-time, company funded student numbers. Part-timers achieve higher overall module marks than do full-timers (Table 9), probably because they are company funded, their employers’ expectations are high, and their personal motivation to do well is strong. They also generally exhibit more maturity in their work organisation; in the first year of study this can cause a significant difference in attitude, behaviour, quality of work and pass rates between full- and part-timers.

Table 9 Part-time (PT) versus full-time (FT) overall first diet percentage at level (no data for 2009–2010).

Conclusions

The full-time engineering students I teach have better numeracy than writing skills in their first year of study. There is a greater spread of marks for these students than for company funded part-timers, whose marks are generally better in any case. Findings conclude that:

  • Our engineering students are more numerical than verbal in their skill sets in first year and so fare worse in written coursework than in numerical tests at this level of study.

  • Many first year students have not had the benefit of regular or wide access to IT facilities during pre-learning, which inhibits their engagement with, and liking for online testing until they are more confident with using computers and a range of software and university web-based services.

  • Over the four years of this investigation, it was noted that different cohorts of students display differences in these skill sets year-by-year, which has implications for changes and enhancements to assessments over time. Assessment is therefore not an exact science, in that different students respond differently to different ways of being tested and their variety in any year cannot be predicted.

  • Lecturers need therefore to listen to student feedback and take a balanced view of what students indicate may be helpful to their performance. Established and recognised assessment designs and monitoring of cohorts across several years are needed to ensure that assessments are fair and sustainable.

  • Students like open book testing and want detailed solutions to be provided as feedback soon after testing so that they can get the most out of the learning experience.

  • Despite continual adjustments and improvements to the online test over the four years, based on student feedback, improvements are not reflected in student perceptions of the test.

Positives of online testing:

  • Most students found the test easy to navigate, especially having practised with the dummy test beforehand.

  • The online test provides an alternative type of assessment for students who do not perform so well in more formal examinations or written courseworks and allows them to increase their average pass rates for the module.

  • The online test particularly suits part-timers (noted from verbal feedback), who do much of their study in their own time and this has tended to enhance their marks overall. In general the students like being able to do the test ‘in their own time’.

  • The test forces students to consult their lecture notes early in the semester in order to complete the test, using similar worked examples from class. This should support learning outcomes related to theory and design of engineering solutions to problems set.

  • More students tend to pass these online tests than more formal exam style closed book tests, despite their lack of enjoyment in executing them. If they do not, it is usually because they have not practised effectively with the dummy test, or press the submit button too early in the test.

  • My workload, instead of being focused on marking, has been used to explain how the online testing works and to facilitate discussions in class during the dummy test practice period.

Negatives of online testing:

  • Some students find the time constraints stressful, even in an open book test, when they have had a practice test beforehand to help them.

  • Despite expectations of working with ‘digital natives’, many students do not like online testing due to their lack of experience with this type of software and despite having a dummy test to practise on.

  • The nature of the online test, i.e. involving questions that require calculations that have specific answers, tends to cause stress when time limited for some students.

  • Most students would like a link to, or a more detailed solution to be provided as feedback. The software has limited feedback capability for this type of questioning, but answers are provided for students in their web-based module link.

  • This mode of assessment does not seem to have had any impact upon the reluctance of many students to engage with the lecture notes/course material and did not seem to increase their familiarity with the module topics in many cases.

Ideas for future development of the online test:

  • I need to explain with greater clarity the purpose and learning objectives of the online test and to introduce the students to the dummy test on screen during class to overcome their reluctance to engage fully with this type of assessment.

  • I need to ensure a balanced question structure in terms of mark allocation so that students feel the marking is fair, particularly if they ran out of time on a more heavily weighted question.

  • It is essential to give written feedback, as answers to questions posed, in order that students can see where they went wrong exactly.

  • In addition, more investigation is required into why a proportion of students say they do not like this type of test even if they do well in it.

In summary, using online testing does not suit everyone and requires up-front, detailed explanation of how the test works and practice with the software before testing can begin. Equally, balancing questions in the test is important to reducing student concerns about large questions potentially skewing the final mark. To get real student engagement in the online test process, students need to see the benefits of it for them, in their mode of learning. Good students and part-timers gain benefit from this type of testing, which they can do at a time that suits them best. On balance, online testing can be seen, from this limited study, to be an effective method of assessment, but it needs to be set up and administered in a clear and consistent fashion.

References

  • Atkinson, R.C. and Shiffrin, R.M. (1968) The Psychology of Learning and Motivation (eds. K.W. Spence and J.T. Spence), 2nd edition, pp. 89–125. New York: Academic Press.
  • Baddeley, A.D. and Hitch, G.J.L. (1974) The Psychology of Learning and Motivation: Advances in Research and Theory (ed. G.A. Bower), 8th edition, pp. 47–89. New York: Academic Press.
  • Bates, J., Hardy, J., Hill, J. and McKain, D. (2007–2008) How design of online learning materials can accommodate the heterogeneity of student abilities, aptitudes and aspirations. Learning and Teaching in Higher Education 2, 3–25.
  • Biggs, J. (2003) Teaching for Quality Learning at University, 2nd edition. Milton Keynes: OU Press.
  • Bransford, J.D., Brown, A.L. and Cocking, R.R. (2004) How Students Learn: History, Mathematics, and Science in the Classroom. National Research Council of the National Academies.
  • Case, J. (2008) Education Theories on Learning. An Informal Guide for the Engineering Education Scholar. Engineering Subject Centre Guide. York: The Higher Education Academy.
  • Creanor, L., Trinder, K., Gowan, D. and Howells, C. (2007–2008) Lifelong learning and technology: Views from the learners. Learning and Teaching in Higher Education 2, 26–41.
  • Farrell, M. (2003) Collaborative Circles: Friendship, Dynamic and Creative Work. Chicago, IL: University of Chicago Press.
  • Fry, H., Ketteridge, S. and Marshall, S. (1999) A Handbook for Teaching and Learning in Higher Education – Enhancing Academic Practice. London: Routledge.
  • Goodhew, P. (2010) Teaching Engineering. All You Need to Know about Engineering Education but Were Too Afraid to Ask. York: UK Centre for Materials Solutions, The Higher Education Academy.
  • Harasim, L. (2012) Learning Theory and Online Technologies. Online resource.
  • Laurillard, D. (2012) Teaching as a Design Science. London: Routledge.
  • Piaget, J. (1970) Science of Education and the Psychology of the Child. New York: Orion Press.
  • Skinner, B.F. (1948) ‘Superstition’ in the pigeon. Journal of Experimental Psychology 38, 168–172.
  • Thorndike, E.L. (1905) The Elements of Psychology. New York: A. G. Seiler.
  • Vygotsky, L.S. (1962) Thought and Language. Cambridge, MA: MIT Press.
  • Wolf, D.M. and Kolb, I. (1984) Career development, personal growth and experiential learning. In Organisational Psychology: Readings on Human Behaviour (eds. D. Kolb, I. Rubin and J. MacIntyre), 4th edition. Englewood Cliffs, NJ: Prentice-Hall.
