Engineering Education
a Journal of the Higher Education Academy
Volume 8, 2013 - Issue 2
Research Article

Making Learning Accessible and Encouraging Student Independence with Low Cost Developments

Pages 15-29 | Published online: 15 Dec 2015

Abstract

This paper focuses on the production and use of learning resources within an engineering curriculum that support the development of independent learners. The most important contribution is to propose pragmatic learning approaches which improve the student learning experience and reward their engagement. The paper gives evidence that the introduction of new learning resources and approaches need not be labour intensive, expensive or complicated, and can thus be both cost effective and straightforward to implement, creating a win/win scenario for staff and students. Several examples of novel learning resources and approaches developed by the author for use within an engineering curriculum are presented and evaluated.

Introduction

Many academics have focussed attention on the potential of technology to enhance the student learning experience, and indeed student learning. Since the late 1990s there has been an increasing use of the Web and indeed virtual learning environments, as evidenced by conference series such as ‘Web based Education’ and the ‘Blended Learning Conference’. Many authors have shown that the 24/7 availability of web resources is much appreciated by students and increases both access and diversity, thus meeting the needs of students with different preferred learning styles. Nevertheless, it has equally been noted (e.g. CitationRossiter & Rossiter 2007) that provision of resources in itself is insufficient; if resources are not embedded effectively enough into the curriculum, lectures and assessment, then many students simply do not access them.

A parallel issue is the need to engage students early on (CitationYorke 2006) and provide a framework which ensures they are active in their studies; often lectures alone fail to achieve this. As part of transition, it is important that the curriculum is designed to encourage independent learning and active engagement by the students (CitationRossiter & Gray 2012, CitationThomas 2012). Once again technology can be a major facilitator of these objectives (CitationOliver and Herrington 2003, CitationFidler et al. 2006, CitationNortcliffe and Middleton 2007, CitationParson et al. 2009, CitationRossiter 2011).

A worthwhile aim for academic departments planning curricula and module development is to answer questions over the extent to which proposed learning resources and approaches are valuable, which means resolving issues such as:

  1. Does interaction with this resource provide a learning experience which is significantly more valuable than just reading notes or attending lectures? If not, is it worth the effort?

  2. Is there a strong incentive or necessity for students to engage with the resource? If not, the effort spent on producing the resource may be wasted.

  3. Is the resource engaging and/or interactive, thus promoting student activity as opposed to passivity?

  4. Can the resource be accessed anywhere and anytime? Poor accessibility is a significant obstacle to student usage.

  5. Can the resource be produced efficiently and at low cost? Can it be re-used so that there is payback over a long time scale? For most academic staff, efficient use of their time is a prime driver in any learning and teaching developments; cash may be a smaller obstacle.

  6. Do the proposed resources provide learning opportunities which would otherwise be absent, and thus enhance the student experience and learning? If not, why are we bothering?

  7. Do the resources and approaches selected encourage students to develop their independent learning skills and attitudes?

This paper will propose and evaluate a few relatively novel approaches which are pragmatic because they provide resources which: (i) promote active student engagement; (ii) are different from lectures and textbooks; (iii) have improved accessibility over alternatives; (iv) are embedded so that students need to use them; (v) are relatively cheap in staff time to develop; (vi) have the potential for long-term usage and thus good payback; and, most importantly of all, (vii) require the students to develop their independent learning skills and associated time management.

The paper will focus on three main areas. The second section will look at laboratory provision and propose alternatives to remote and virtual laboratories which are easier to manage and still give significant improvements in student accessibility to hardware. The third section will consider lecture recording, which is not new in itself, and demonstrate, with evidence, that the effective use of technology enables the modernisation of the lecture, thus giving significant opportunities to improve student engagement and personal responsibility without detriment to core support requirements. The fourth section will revisit the role of the virtual learning environment and propose a minor, but we believe novel, use of quiz environments for enabling efficient assessment and feedback of learning outcomes where staff may not immediately think a quiz is appropriate. The paper finishes with some conclusions.

Improving accessibility to laboratories

There is a lot in the literature about remote access laboratories (e.g. CitationAbdulwahed 2010, CitationQiao et al. 2010, CitationReload 2010) and indeed virtual laboratories (e.g. CitationMemoli 2011). It is recognised that where feasible these are a valuable addition to the student learning experience. However, it is also known that the development and maintenance of a remote access laboratory is non-trivial and thus may not be as cost efficient as first suspected. For a start, such laboratories can only be effective if reliable, which means technicians have to prioritise maintenance and regular checks throughout the period the laboratory is available.

In the author’s department, trials with remote access laboratories led to the conclusion that, with the exception perhaps of relatively simple equipment, the overhead of ensuring reliability was not achievable with the resources available. Specifically, the judgement was that more interesting equipment may have many more failure modes which are easily corrected when a student is in situ, but not for someone who is remote. Moreover, a remote laboratory is still remote in that students are not actually working on the hardware, but instead making choices which are implemented by software only. Hence a pilot study was done using the concept of an anytime access laboratory. The evaluation shows that this approach is cheaper in terms of staffing cost and preparation, and yet more effective in terms of student participation and engagement than traditional laboratory classes, although at a small price in accessibility.

Anytime access laboratories

The proposal is to develop some laboratories which students can access and use outside of the normal timetable, thus in theory at any time between 9 a.m. and 5 p.m. (normal building opening hours) when they do not have lectures or other commitments (the easiest time was within the six hours of timetabled laboratory slots, as typically students’ compulsory attendance is just three hours a fortnight). The equipment is given a fixed position for the weeks where it is required, and students are encouraged to come in and use it at their convenience. Critically, this means the cost, both cash and staff time, is relatively minimal; once the equipment exists, leaving it set up and ready to use is a normal day-to-day job for technical staff.

There is no registration, no demonstrator (although technicians will be around for emergencies), no requirement for students to access alone or in groups and thus attendance is entirely flexible. However, students are required to collect evidence of their interaction with the equipment within a given access window; this could be screen dumps, results data or other as appropriate to the associated assessment.

Remark

It is accepted that some students may choose not to attend and instead plagiarise their colleagues’ results, but motivated students will want to attend and use the equipment; the prime objective here is to improve access, and thus learning experiences, for those who want this.

Learning outcomes with anytime access laboratories

This paper gives some generic guidance on how to ensure students get the most learning from the provision of accessible laboratory equipment. The proposal is to use something akin to the trilab approach described by CitationAbdulwahed (2010). Laboratory sheets comprise a number of components:

  1. Preparation material which focuses on technical learning outcomes such as: which module content is relevant, which computations may be required, anticipating likely results, which background reading or research might be useful and so forth.

  2. Instructions for a virtual laboratory (CitationMemoli 2011, CitationRossiter & Shokouhi 2012) are available online 24/7. This emulates the activities of the real hardware and thus allows students to anticipate likely tests, conclusions and to reinforce what they need to understand before attendance.

  3. Instructions for attendance follow steps 1 and 2. The expectation is that the student can now be highly efficient and focus solely on issues linked to usage of the equipment rather than theoretical aspects.

Of course, students can also revisit the hardware on a later date if their first visit revealed a lack of understanding and the need for further preparation or indeed gave imperfect results.

Reinforcing the learning and promoting engagement

In the author’s department, the intention was to have relatively open-ended objectives in order to encourage independent learning, research-led learning and experimentation. The philosophy was to let the students try whatever they like, within reason, and this freedom may also spark some enthusiasm. This is in stark contrast to more typical laboratories that often have relatively fixed activities and outcomes.

However, effective assessment of the student engagement is critical in order to encourage students to get the most from the opportunity, as the well-known mantra is ‘assessment drives learning’. If there is no associated assessment, that is, if the activity is not properly embedded in the curriculum, many students will not bother.

Hence, the associated assignment was designed to have limited requirements on technical learning outcomes and was instead focussed on a more reflective type of exercise. Students gave a group presentation to a personal tutor, thus in a relaxed atmosphere where dialogue would flow more freely; also, preparing a group presentation encouraged student reflection on the activities before the tutorial. Following the presentation, the tutor engaged the students in a discussion of technical and other possible learning outcomes; the expectation was that students would be active in such a discussion as it was based on something they had just sweated over producing. The mark awarded was linked to the presentation, the evidence provided and student participation in any discussion, and indeed it was common for tutors to give different marks to students in the same group.

Description of anytime access laboratories

For completeness a brief description of the hardware used and associated learning outcomes is given here. Two hardware activities were made available. Both activities were chosen to be intrinsically safe and minimal risk so that although a technician needs to be in the locality, students do not need careful monitoring or instruction while using the hardware.

Experiment 1 consists of a DC servo system for which students can vary the input voltage. A simple DC servo is known to have velocity dynamics which can be approximated quite well by a first-order differential equation; this laboratory is used both to affirm that assumption and to give students practice in model fitting using real equipment/data. The LabVIEW interface allows students to perform three distinct tasks. Critically, each of these tasks can be performed on the actual equipment in just a few minutes.

  1. Apply a steady supply voltage at numerous different values, positive and negative, and plot the associated steady-state angular velocities to determine the system gain. The plot also allows students to observe any non-linearities and issues linked to using real data.

  2. A square wave signal is used to produce the step response from which students can validate the shape, as is expected for a first-order system, and estimate the time constant.

  3. The final activity encourages students to produce a step response for an ideal system with their estimated gain/time constant and compare with the responses from the equipment.
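The three tasks above can be sketched in software. The following Python snippet is a stand-in for the LabVIEW/MATLAB tools actually used in the laboratory (the gain, time constant and input voltage are invented values for this sketch); it simulates a first-order velocity response and then recovers the gain and time constant exactly as students are asked to do with the hardware data:

```python
import numpy as np

# Illustrative first-order velocity model: v' = (K*u - v)/tau.
# K_true, tau_true and u are invented values standing in for the real servo.
K_true, tau_true = 1.5, 0.4   # gain and time constant of the 'servo'
u = 2.0                        # steady input voltage
dt = 0.001
t = np.arange(0.0, 3.0, dt)

# Simulate the step response with forward Euler
v = np.zeros_like(t)
for k in range(1, len(t)):
    v[k] = v[k-1] + dt * (K_true * u - v[k-1]) / tau_true

# Task 1: estimate the system gain from the steady-state velocity
K_hat = v[-1] / u

# Task 2: estimate the time constant as the time taken to reach
# 63.2% of the final value (the standard first-order result)
tau_hat = t[np.argmax(v >= 0.632 * v[-1])]

# Task 3: compare the ideal first-order response, built from the
# estimated gain/time constant, with the 'measured' response
v_ideal = K_hat * u * (1.0 - np.exp(-t / tau_hat))
fit_error = np.max(np.abs(v - v_ideal))
```

The 63.2% rule used for `tau_hat` follows because a first-order response reaches 1 − e⁻¹ ≈ 0.632 of its final value after one time constant.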

Experiment 2 is a position control system: in essence, a DC servo drives a carriage along a track (about 2 m in length), and the objective is to design a proportional plus integral (PI) controller which moves the carriage to the target point smoothly, correctly and fast enough. Students are encouraged to experiment with:

  1. Proportional only control – this produces offset and overactive control signals for large values of the proportional gain.

  2. Integral only control – while this removes offset, it is impossible to obtain satisfactory transient responses: for small integral gain the response is too slow, and for larger integral gain the response is too oscillatory.

  3. Proportional and integral – students should be able to get a good response.

Students will also notice many ‘real hardware’ features such as stiction, jerkiness in movement, etc. while reinforcing their understanding of the role and design of PI compensators. Again, one can easily perform a wide range of tests in under 10 minutes.
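A hedged sketch of these control experiments can be written against a simplified linear carriage model. In the Python snippet below (not the actual rig software; the plant parameters, gains and the constant friction-like disturbance `d` are all invented for illustration), the disturbance stands in for the stiction that causes the offset students observe:

```python
def simulate(Kp, Ki, T=15.0, dt=0.005):
    """Simulate a simplified carriage position loop x' = v, tau*v' = -v + u - d
    under PI control u = Kp*e + Ki*integral(e); returns the final tracking error.
    The constant disturbance d mimics the friction that causes P-only offset."""
    tau, d, r = 0.2, 0.5, 1.0            # invented plant values; 1 m target
    x = v = integ = 0.0
    for _ in range(int(T / dt)):         # forward Euler integration
        e = r - x
        integ += e * dt
        u = Kp * e + Ki * integ          # Ki = 0 gives proportional-only control
        v += dt * (-v + u - d) / tau
        x += dt * v
    return r - x

err_p  = simulate(Kp=2.0, Ki=0.0)   # P-only: settles with offset d/Kp = 0.25
err_pi = simulate(Kp=2.0, Ki=1.0)   # PI: the integral term supplies d, no offset
```

The design point mirrors the laboratory: with proportional control alone the loop balances the disturbance only by holding a non-zero error (d/Kp), whereas the integral term accumulates until it cancels the disturbance and the error goes to zero.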

Evaluation of anytime access laboratories

This section gives a brief evaluation of the efficacy of the anytime access laboratories.

Staff and financial resource

From a staff perspective the laboratories are potentially very efficient to run. Assuming the laboratory is already open for other formal timetabled slots, or indeed the technicians are located in the laboratory space anyway, there is no extra staff load in having the equipment available. Moreover, as students need only access the activity for relatively short periods, and can return, it is possible to operate with a single piece of equipment, whereas formal timetabled laboratories often need numerous duplicate copies of each set of equipment. Including the assessment in a personal tutorial slot gave tutors something meaningful to do with their tutees, which they and the students seemed to enjoy, but without increasing actual workload on those staff.

Students

A survey of the participating students, cohort size 70, was taken in the first lecture of semester 2 (hence most students were in attendance):

  • Most (about 80%) accessed the equipment in small groups.

  • Access was not as good as expected (or desired) due to the temporary housing of departmental laboratories in a different building 400 yards from the technician office and thus technicians were unable to be present as often as desirable. Nevertheless most students still made good use of the equipment.

  • Eighty per cent of students engaged in active discussion of the laboratory and associated theory with fellow students.

  • The majority (about 85%) felt anytime access laboratories were a good idea and helped their learning; the problems with access probably explain those who did not think this. Eighty per cent felt this concept should be extended and maintained.

  • The majority (about 75%) used the opportunity to experiment a little and try things out rather than doing the minimum required for the assignment.

  • The majority (about 80%) felt the assignment helped their learning.

In summary, while there are some teething issues linked closely to local logistics, the overall impression is of a concept that will work very well in the future once technicians are situated in the same corridor as the laboratories. The laboratories have been effective in generating some independent learning and reflection, student discussion and critically, at relatively low cost in terms of staff time. Consequently, this pilot study demonstrates this is a win/win approach.

  1. Workload for staff was minimal, both in preparation and assessment, and certainly not beyond alternatives. Also, equipment requirements are, if anything, lower than obvious alternatives.

  2. Students enjoyed the free access, and evidence from the tutorial sessions is that they engaged quite deeply in learning outcomes that many students usually struggle with, providing definite evidence of independent learning.

Virtual lectures, lecture recording and changing the role of lectures

There is a growing awareness among ordinary academics that the role of the lecture needs to change (CitationMazur 2012). Historical practice arose when there were no computers, copying was difficult or expensive and textbooks were not so readily available, and consequently a prime role of the lecture was didactic, that is, focussed on the delivery of information. Nevertheless, it is recognised that such delivery mechanisms encourage relatively passive behaviour from the students and hence may be ineffective for encouraging learning. Moreover, many students believe having the notes is enough until revision time and thus do not engage. It is now straightforward to give students access to comprehensive notes (hard copy and/or soft copy) and thus a lecture no longer needs to fulfil a didactic requirement.

Simple improvements on hardcopy notes are possible. For example, it is now relatively easy to produce animations, perhaps with sound, which means the resources engage more than just reading (Porter 2007); it is well known that the more senses that are engaged, the more likely the brain is to retain some of what is going on. Secondly, in conjunction with the ready availability of computing, there has been an increase in learning resources which support student activity, for example, the use of virtual environments in which students are players (CitationGuzman et al. 2006, CitationKhan & Vlacic 2006, CitationCameron 2009). Perhaps one of the most obvious and straightforward mechanisms for obtaining some of the earlier benefits is lecture recording (CitationRossiter et al. 2009), as this gives students the opportunity to revisit the lecture in their own time and update/correct their notes; notably, students are active in this process and many evaluations reinforce the benefits to students.

A logical outcome of the earlier observations is the potential to separate didactic information giving from student activity. Information, in the main, does not need to be delivered in a didactic lecture, unless emphasis is needed, because it can equally be delivered in several other formats. The example pursued in this paper is to suggest that short lecture snippets covering core information such as data and algorithms (CitationWilliams & Fardon 2005, CitationRossiter & Gray 2010, CitationSaunders & Hutt 2012) can be made available on the Internet, perhaps in place of formal lecture delivery and possibly even in place of hard copy notes. Students are required to view these in their own time as self study and make personal notes as appropriate. Consequently, the lecture time itself can be used for ‘student activities’ which improve engagement and learning.

Readers will be interested in what format a short lecture snippet should take. To begin, there are two obvious alternatives:

  1. Record traditional lectures and save these.

  2. Create new e-resource material specifically.

A weakness of recording traditional lectures is that there is little opportunity to change the lecture content and delivery, as it must then contain the core data and algorithms; hence such recordings, while known to be valuable in themselves, have much more limited potential for tackling student engagement. The downside of creating new e-resource material is the associated transient workload, although the resource can be re-used over a number of years, and indeed modules, if made in a sufficiently generic manner.

Creation of short lecture snippets

The most appropriate form of lecture snippet depends upon the intended usage. In the author’s case, the motivation was simple.

To provide detailed coverage of core module learning outcomes, in a didactic fashion, that students can use in combination with textbooks and tutorial sheets to learn and apply core skills. By providing the didactic content online, the lecturer is then freed to use lecture time for student activity which encourages engagement, understanding, reflection and engineering problem solving. Moreover, the lecturer can be less stressed about ‘time keeping’ or covering enough content. Expecting students to view the online material before the relevant lectures also embeds an independent learning expectation into the module delivery.

In order to succeed therefore, the lecture snippets have to be effective as learning resources from which students could pursue independent learning. Earlier experiences with the teaching of MATLAB (CitationRossiter & Gray 2010) had demonstrated, over several consecutive student cohorts, that students responded well to independent learning tasks where a supporting online lecture was provided. Nevertheless, certain points were important, such as the need to replicate the form of information display available in a ‘chalk and talk’ lecture:

  • Don’t show everything at once or students do not know what to look at and become confused. Use animations to introduce items onto the screen just a few at a time.

  • When doing worked solutions of a mathematical nature, solve these by hand rather than introducing several lines of pre-formatted mathematical equations. Students can follow the process of the mathematics being developed, simultaneously associate with the words the lecturer is using and it also enforces a speed of delivery which is more appropriate. (The videos on CitationMathtutor (2012) are a good example of this where the lecturer wrote on paper or a white board and the process was videoed.)

However, if the practice of producing lecture snippets is to be possible for an average academic, it needs to be possible in the privacy of an office and without recourse to expensive camera equipment and other hardware requirements. Ideally it should be possible on a normal desktop computer with readily available microphones and other input devices (such as screens which allow input from a pen). Fortunately there are a number of software tools available now, at relatively low cost, which facilitate this. For completeness it is noted that the author used MyEcho software in conjunction with a Sympodium screen; this produces mp4 files which capture whatever happens on the screen, synchronised with audio. The author made all his videos at his desk in his office.

Minor practical issues

Videos made in an office may be ‘imperfect’ both in terms of audio quality and presentation: there is a high potential for disruption, for example through knocks on the door, phone calls or similar, and also it is harder to write neatly and in straight lines on a screen than on a piece of paper. However, on the positive side, the lack of expense or need to book a recording suite means that videos can be made very efficiently and within relatively small gaps during the day. For example, a typical video of 15 minutes can be recorded in 15 minutes, assuming the background work is complete. Checking the video takes another 15 minutes but can be done later if necessary. This flexibility makes it easier for busy academics to make steady progress in producing a set of videos to support a given learning objective, because they can utilise the small spaces that arise in their timetable on a regular basis.

Evaluation of using lecture snippets to support student engagement, learning and independent study

The main evaluation here is based on a module in the area of modelling analysis and control. The videos existed before the module started and thus the lecture delivery could be modified to integrate the videos into the learning process. For information, videos are stored at the following website http://www.youtube.com/channel/UCMBXZxd-j6VqrynykO1dURw.

Staff

Assuming relatively small-scale interruptions, it is possible to make about three videos in a working day from scratch, roughly equivalent to a one hour lecture. This comprises the time for generating the slides, which is roughly equivalent to that needed to prepare the corresponding lecture, plus an extra 1–2 hours for the actual recording, checking and uploading. While this takes more time than lecture preparation in the first year, the e-resources produced have long term use and thus will reap efficiency benefits in due course, especially as resources can be shared across several modules and departments. Nevertheless, for staff, the prime focus is the development of an effective student learning environment, and the inclusion of videos gives a significant enhancement at relatively low cost in time.

Students

A survey was undertaken during the sixth week of lectures to determine student perceptions so far. The cohort were finalists and hence the requirement to view material before a lecture was entirely novel. Results of the questionnaire are summarised in Figure 1 and show that the response is largely very positive. A few students did not like the expectation that they prepare for a lecture, but these were the minority. Some neutral responses may be due to students having not yet ‘got around’ to using the videos properly. A large number of students had not accessed the videos at all, or only once, and many admitted lack of organisation, ‘mea culpa’, etc. Nevertheless this points to an underlying issue of student expectations and culture; they are used to a didactically led curriculum and facilitating a change to student-led learning will not happen easily in a single module in the final year.

Figure 1 Results of the questionnaire.

Remark

Students on several other modules (the videos are relevant to several different cohorts and modules) gained access to the videos after lectures, and for completeness a survey was conducted to gain their views on the project. Surprisingly, there were over 4000 YouTube views in just a few months (November–January) from a student cohort of about 200. The vast majority were very positive about the existence of the recordings and their potential uses, and large numbers had used them. The vast majority found the recordings helpful for learning. A good majority found it easier to learn from recordings than textbooks and notes, and about 50% would support a change in lectures to more problem solving, with students expected to view videos in advance.

In summary, this pilot study is very encouraging and the videos are now available for future years on several modules so staff can begin the process of rethinking how to make the most of valuable lecture time.

Using the virtual learning environment to improve engagement while reducing assessment loads

Virtual learning environments are widely used and discussed in the literature and thus there are many well known uses which encourage student activity. Consequently this paper focuses on a relatively small innovation, whose intention is to encourage student engagement with other learning resources. The key proposal is to explore the potential within a quiz environment (CitationBull et al. 2006) to assess student engagement and learning whilst ensuring constructive alignment of the curriculum and assessments (CitationOliver & Herrington 2003). The main aim here is to achieve a substantial reduction in the time taken by staff to assess student learning, without detriment to the potential for effective feedback.

Quiz environments have been used extensively (e.g. CitationSangwin 2010, CitationHELM 2012) to test student learning in areas such as mathematics (lots of questions which require numerical answers) and medicine (lots of multiple-choice questions), to name just a few. Here, we show how the quiz environment can be used to test learning in computer programming: in particular, to assess the production of correct code and, indirectly, to give students the motivation to undertake responsibility for self-validation of their work, which is linked to the development of independent learning skills. Of particular note to staff here is the use of the quiz environment to provide automated marking for tasks which may involve students undertaking significant ‘working in advance’ to prepare a solution.

Background on existing practice for assessing MATLAB skills

The author’s department focuses on two main languages within its systems and control programs, that is MATLAB and C; this paper focuses on MATLAB. While MATLAB does allow programs to be written and indeed is internationally popular in the control community and industry, there are a number of obstacles to effective inclusion of MATLAB assessment in the curriculum (CitationIrving & Crawford 2010, CitationRossiter et al. 2011, CitationRossiter 2013). Most significantly, marking student work is time consuming, especially for large classes and consequently several authors have investigated different techniques for improving this efficiency.

A very popular method is some form of automated marking, whereby student code is marked automatically and the marking code generates generic feedback and marks, compiling all of these for the entire cohort. Certainly, with large classes, such automated marking saves time and potentially allows students to receive very rapid and personalised feedback. Nevertheless, there are a number of weaknesses, such as: (i) the feedback is rather generic and may not pinpoint the actual problem or failing; (ii) if students do not submit code with the correct naming or other conventions, substantial time may be needed to correct this before submitting code for marking; (iii) authoring effective marking code is a substantial task in itself and thus will not become efficient until class sizes are large; (iv) automated marking gives no credit or insight into program design and layout, comments and other important skills.
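As an illustration, the core of such a marking script can be very small. The Python sketch below (the function, tolerance and cohort data are all invented for illustration) checks each submitted numerical answer against a reference value and attaches exactly the kind of generic feedback criticised in point (i):

```python
# Minimal sketch of the automated-marking idea: compare each student's
# submitted numerical answer against a reference value and compile
# marks plus generic feedback for the whole cohort.
def mark_submission(answer, reference, tol=0.01, marks=5):
    """Award full marks if the answer is within tolerance, else zero."""
    if abs(answer - reference) <= tol:
        return marks, "Correct to within tolerance."
    return 0, "Output differs from the reference; check your model setup."

# Hypothetical cohort of (student id, submitted answer) pairs
reference = 0.21
cohort = [("s1", 0.21), ("s2", 0.33), ("s3", 0.208)]
results = {sid: mark_submission(ans, reference) for sid, ans in cohort}
```

Even this toy version shows the trade-off: the marking is instant and scales with cohort size, but the feedback cannot say where in the student's code the error arose.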

Where classes are smaller, the author has preferred to use face-to-face marking at the computer. While this can take five minutes per student, it has the advantage of being personalised, involves dialogue, can identify precisely where student code is failing or can be improved, and so forth. Hence, given the quality of feedback available in a relatively short time, this is definitely more efficient and more effective than using marking code for class sizes up to around 80. However, face-to-face marking is only efficient when the number of marking criteria is small; as soon as the number of criteria increases, or the criteria become more complex to assess, completing this in five minutes per student becomes unmanageable.

Using a quiz environment to assess MATLAB competence

In making judgements about the most efficient mechanism for marking student MATLAB skills, one needs to be precise about what exactly is to be assessed, e.g.

  1. Program structure, syntax, design, comments, etc.

  2. Program produces correct numerical or other output.

The most efficient mechanism is different for each of these. Face-to-face marking is efficient for the former as widely varying program designs and comments may be equally good. However, the emphasis of this section is on numerical or other output which can be tested very efficiently by some form of automated system.

Quiz environments (e.g. CitationHELM 2012) are very good at assessing simple numerical answers, or for matching/multiple-choice type questions. An experienced user can write questions at the rate of one every few minutes, especially where the required answer is simple. A typical example would be:

Write a program to solve for the step response of G(s) = (s + 1)/[(s² + 4s + 1)(s + 3)] and determine the output at 3.2 seconds. Give the answer to two decimal places.
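The model answer for such a question takes only a few lines of code. The sketch below uses Python's scipy.signal as a stand-in for the MATLAB workflow (in MATLAB itself, `step(tf([1 1],[1 7 13 3]))` produces the same response); the simulation grid is an arbitrary choice.

```python
import numpy as np
from scipy.signal import TransferFunction, step

# G(s) = (s + 1) / ((s^2 + 4s + 1)(s + 3))
#      = (s + 1) / (s^3 + 7s^2 + 13s + 3) after expanding the denominator
G = TransferFunction([1, 1], [1, 7, 13, 3])

# Simulate the step response on a fine grid and read off y(3.2)
t, y = step(G, T=np.linspace(0, 10, 2001))
y_32 = float(np.interp(3.2, t, y))
print(round(y_32, 2))
```

The quiz environment then only needs to compare the submitted number against this value to two decimal places.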

Hence, assessing student competence at using MATLAB to compute exact numerical answers can be done very efficiently, certainly much more easily than by generating automated marking code. Alternatively, students may be asked to produce a figure representing a given engineering scenario; to obtain this figure students may need to compute, correctly, the solutions to some challenging numerical problems, and can thus only do so if they are competent MATLAB users.

The author gives students indicative questions many weeks in advance so they can prepare and validate their code, especially as some tasks may require relatively complex coding and the use of numerous functions. The ability to assess the production of figures also enables staff to assess students' ability to use software or code provided to them. On the day of the quiz, under examination conditions, the scenarios are only slightly different, so that a student who understands their own code can make the requisite edits rapidly and obtain a correct answer, whereas a student who 'borrowed' code will not know how to make them.
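Generating the slightly different exam-day scenarios can itself be automated from a question template. The following is a hypothetical sketch: the template wording, parameter ranges and function name are all assumptions.

```python
import random

def make_variant(seed):
    """Produce an exam-day variant of an indicative question by
    perturbing its parameters; template and ranges are illustrative."""
    rng = random.Random(seed)                  # seeded, so each variant is reproducible
    a = rng.choice([2, 3, 4, 5])               # coefficient altering the pole positions
    t_query = round(rng.uniform(2.0, 4.0), 1)  # time at which the output is requested
    return (f"Write a program to solve for the step response of "
            f"G(s) = (s + 1)/[(s^2 + {a}s + 1)(s + 3)] and determine the "
            f"output at {t_query} seconds. Give the answer to two decimal places.")
```

A student who wrote and understood the indicative version only needs to change two numbers in their own code; one who did not will struggle to locate them.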

Examples of a range of questions are given in Figure 2.

Figure 2 Examples of questions.

In summary, it is possible to assess student competence at using MATLAB tools, both code that is provided and code they have been instructed to produce themselves in advance, over a wide range of topics and with relatively little set-up time. In fact, for tests under pseudo-examination conditions, the author often reduces the set-up time further still by writing the questions in a word-processor document, so that the quiz environment is used solely to collect, store and mark student responses; writing neat questions in a word processor is far easier than doing so in a quiz environment. The document does not need to be printed as it can be provided on the same virtual learning environment site as the quiz.

Evaluation of using a quiz environment to assess MATLAB competence

The key evaluation point here is staff load: how efficiently can the technical progress of a large number of students be marked across a wide variety of MATLAB skill areas? In simple terms, producing the quiz is the same load as developing any assessment. Thereafter, marking is essentially cost free for staff, as opposed to the earlier scenarios of either generating automated marking code (two days' work – CitationIrving & Crawford 2010) or marking each student's work face-to-face (over 10 minutes each for an assignment covering around 15–20 different skills). Even better, by using the quiz environment within the virtual learning environment, the marks are automatically logged and immediate feedback is provided to each student. Consequently, from a staff perspective the advantages are clear.

Nevertheless, we should emphasise that this form of task also plays a major role in developing student independent learning: students need to prepare and validate their own code in advance of the assessment day.

Finally, it is noted that students get the same quality of feedback as with the obvious alternative approaches. For more fundamental programming errors and misunderstandings there is no simple alternative to regular tutorials where students can gain formative feedback and guidance.

Conclusions

The paper has made several linked contributions. The underlying message is that, by making imaginative use of technology that is now readily available, one can significantly improve student engagement, the learning experience, feedback and the embedding of independent learning without necessarily increasing staff loading. This paper has proposed and evaluated three separate pragmatic approaches and demonstrated that each gives a win/win scenario, that is, benefits both for student development and experience and for staff. In particular, the following benefits are observed:

  1. Anytime access laboratories allow students more flexible access and the opportunity to explore or undertake research-led learning. The cost in terms of staff time and equipment is less than providing multiple copies of a piece of equipment and including them in a formal timetable. Effective embedding via an appropriate assessment ensures students achieve useful learning.

  2. The development of video-type lecture materials is now a simple task that academic staff can do in their own offices. This paper gives evidence that this is achievable without excessive workloads and also enables the modernisation of the lecture, as encouraged in the literature (e.g. CitationMazur 2012). The approach appears to be popular with many students, effective at improving engagement and embeds a requirement for independent learning. Moreover, the resources can be used across many separate modules.

  3. Assignment marking is one of the major workloads for academic staff and thus a barrier to students receiving rapid and regular feedback. This paper has shown how imaginative uses of quiz environments can facilitate marking of quite complex learning outcomes and questions with minimal requirements on staff marking time. In addition, using this approach, students need to undertake independent learning and validation of that learning in order to prepare effectively.

References

  • Abdulwahed, M. (2010) Towards Enhancing Laboratory Education by the Development and Evaluation of the trilab Concept, PhD Thesis, Loughborough University.
  • Bull, S., Quigley, S. and Mabbott, A. (2006) Computer based formative assessment to promote reflection and learner autonomy. Engineering Education 1 (1).
  • Cameron, I., (2009) Pedagogy and immersive environments in the curriculum, Blended Learning conference.
  • Fidler, A., Middleton, A. and Nortcliffe, A. (2006) Providing added value to lecture materials to an iPod generation. In Proceedings of the 6th Conference of the International Consortium for Educational Development, Sheffield.
  • Guzman, J., Astrom, K., Dormido, S., Hagglund, T. and Piguet, Y. (2006) Interactive learning modules for PID control. In Proceedings of the 7th IFAC Symposium on Advances in Control Education ACE06, Madrid.
  • HELM (2012) Helping engineers learn mathematics. Available at http://helm.lboro.ac.uk/.
  • Irving, A. and Crawford, A. (2010) Automated assessment and feedback on MATLAB assignments, ICTM10 proceedings. Available at http://www.ictmt10.org.uk/.
  • Khan, A and Vlacic, L. (2006) Teaching control: benefits of animated tutorials from viewpoint of control students. In Proceedings of the 7th IFAC Symposium on Advances in Control Education ACE06, Madrid.
  • Mathtutor (2012) Available at http://www.mathtutor.ac.uk.
  • Mazur, E. (2012), Plenary, Educating the Innovators of the 21st Century Lecture. In Proceedings of the ISEE.
  • Memoli, P. (2011) Virtual experiments, Project funded by HESTEM. Available at http://www.edshare.soton.ac.uk/6589/1/preloader-diode.html.
  • Nortcliffe, A.L. and Middleton, A. (2007) Audio feedback for the iPod generation. International Conference on Engineering Education, Portugal.
  • Oliver, R. and Herrington, J. (2003) Exploring technology-mediated learning from a pedagogical perspective. Journal of Interactive Learning Environments 11 (2), 111–126.
  • Parson, V., Reddy, P., Wood, R. and Senior, C. (2009) Educating an iPod generation: undergraduate attitudes, experiences and understanding of vodcast and podcast use. Learning, Media and Technology 34 (3), 215–228.
  • Porter, P. (2009) Using two and three dimensional graphical animations in powerpoint, Blended learning conference.
  • Qiao, Y., Liu, G., Zheng, G. and Luo, C. (2010) Design and realization of networked control experiments in a web-based laboratory. In Proceedings of the UKACC 2010, Manchester.
  • Reload (2010) Real labs operated at a distance. Available at http://www.engsc.ac.uk/mini-projects/reload-real-labs-operated-at-dist (accessed 1 September 2010).
  • Rossiter, J.A. (2011) Which technology can really enhance learning within engineering? International Journal of Electrical Engineering Education 48 (3), 231–244.
  • Rossiter, J.A. (2013) Efficient assessment of MATLAB for control. In ACE 2013, the 10th IFAC Symposium on Advances in Control Education, Sheffield.
  • Rossiter, J.A. and Gray, L. (2010) Supporting development of independent learning skills. In Engineering Education Conference, Madrid.
  • Rossiter, J.A. and Gray, L. (2012) Using teamwork to engage students and manage transition. Engineering Education Journal 7 (1), 48–59.
  • Rossiter, J.A. and Rossiter, D. (2007) A blended learning module design approach to engage new students. In Second International Blended Learning Conference, Bedford.
  • Rossiter, J.A. and Shokouhi, Y.B. (2012) Developing virtual laboratories for introductory control. In Proceedings of the UKACC 2012.
  • Rossiter, J.A., Nortcliffe, A., Griffin, A. and Middleton, A. (2009) Using student generated audio to enhance learning. Engineering Education Journal 4 (2), 52–61.
  • Rossiter, J.A., Irving, A., Lynch, S., Mohtahdi, C. and Becerra, V. (2011) HEA Discipline Workshop and Seminar Series 2011–12, The use of MATLAB within Engineering degrees. Available at http://www.shef.ac.uk/acse/events/heaseminar2011.
  • Sangwin, C. (2010) Who uses STACK. Available at http://mathstore.ac.uk/headocs/WhoUsesSTACK.pdf (accessed September 2013).
  • Saunders, F.C. and Hutt, I. (2012) Richness, responsiveness and relationship: using rich media materials to enhance the teaching of core concepts. In EE2012, International Conference on Innovation, Practice and Research in Engineering Education.
  • Thomas, L. (2012) A summary of “What works: student retention and success programme”. Higher Education Academy.
  • Williams, J. and Fardon, M. (2005) On-demand internet-transmitted lecture recordings: attempting to enhance and support the lecture experience. In ALT-C 2005: Exploring the Frontiers of e-Learning – Borders, Outposts and Migration, Manchester.
  • Yorke, M. (2006) The first year experience in higher education in the UK: report on phase 1 of a project funded by the Higher Education Academy.
