Engineering Education
a Journal of the Higher Education Academy
Volume 7, 2012 - Issue 2
Original Articles

Manufacturing excellent engineers: skill development in a Masters programme

Pages 38-50 | Published online: 15 Dec 2015

Abstract

An MPhil programme, delivered by the Engineering Department at the University of Cambridge, claims to be excellent at preparing graduates for manufacturing industry careers. The course uses a combination of different educational experiences, including industry-based assignments, industrial visits and practical exercises.

This research explores how problem solving skills are developed during the first module, Induction, which is designed to enable students to undertake their first industrial assignment.

From the literature, four conditions necessary for skill development were identified:

  • Provision of a skill description, making explicit key components

  • A number of different experiences with a range of contextual variables

  • A teaching process which includes regular feedback and student reflection

  • Students motivated to learn.

These were used to construct a skill development framework (SDF).

Using a case study research design, multiple types of evidence were collected to test for the above conditions using both classroom observation and questionnaire methods.

The results confirmed the presence of the SDF conditions at different levels, with reflection aspects considered the weakest. Conflicting results were obtained regarding the students’ self-awareness of skill levels. A plausible explanation is a change in the students’ frame of reference.

This initial study set out to develop a better understanding of the process of skill development. Whilst the SDF appears reasonable, there is a need for further work in three broad areas of defining skills, assessing skills and developing reflection skills.

Introduction

The purpose of this study was to develop a better understanding of skill development in higher education (HE), with particular reference to the more complex skills used in work-related contexts. The aim of this initial work was to construct a skill development framework from the literature and then test it by evaluating the development of problem solving skills in a Masters programme. As skill evaluation can be resource-intensive, student self-assessment was explored as a potential resource-efficient option. Further areas were identified in which to continue research related to skills development.

The Industrial Systems, Manufacture and Management (ISMM) MPhil programme at the University of Cambridge (Institute for Manufacturing, 2011) claims to be excellent at preparing graduates for careers in industry. The programme, initiated 46 years ago, was designed to prepare engineering graduates for operational roles in industry. A study at the 36-year point (Ridgman and Wiggins, 2003) concluded that the programme was very successful but could not determine whether this was a function of the course or, for example, effective selection of candidates.

Four industrial assignments, accounting for 50% of the programme assessment, are a key component of ISMM. Groups of two students spend two weeks working on a real-life issue of some significance to a company. Students present their problem definition, analysis and proposed solutions on their last day and submit a written report the following week.

The first module of the programme, Induction, lasts four weeks and is followed by the students’ first industrial assignment. During Induction there is a focus on developing students to undertake industrial assignments. Problem solving skills are considered fundamental and a method of developing these has been established. This starts with a lecture on problem solving and is then followed by five experience-based exercises, with group membership, type of task and group size being varied. Course tutors have found that this preparation has enabled the vast majority of students to successfully attempt the problem faced during their first industrial assignment.

Literature review

This research considers the broad fields of employability, higher education and professional development, and how these contribute to skill development. The key bodies of knowledge and how they interrelate are presented in Figure 1.

Employability

Individual skills are a common feature of employability models (Hillage and Pollard, 1998; Knight and Yorke, 2002; Dacre Pool and Sewell, 2007).

The work of Knight and Yorke has been key to developing a definition and model of employability for HE. Graduate employability is defined as ‘a set of achievements — skills, understanding and personal attributes — that makes a graduate more likely to gain employment and be successful in their chosen occupations’ (Yorke, 2006). A key claim (Knight and Yorke, 2004) is that this is an academic research-driven view, not one driven by the perceived needs of employers or governments.

The USEM model (Knight and Yorke, 2004) identifies four key components comprising understanding (U), skilled practice (S), efficacy beliefs (E) and metacognition (M). A key feature of the model (see Figure 2) is that the efficacy beliefs provide a foundation to employability and feed the U, S and M components.

The E component represents a person’s belief that they can make an impact on a situation. It includes a broad range of theoretical contributions (Knight and Yorke, 2004), including Bandura’s work on self-efficacy and work on practical intelligence (Sternberg and Grigorenko, 2000) that has a particular resonance as it closely resembles ISMM problem solving. The term skilled practice (S) was chosen to capture the view that skills are context specific, not easily transferable and difficult to assess.

There are few publications citing this model being used in practice. It is suggested that assessment could be a factor, as broad complex constructs such as employability resist summative assessment (Knight and Yorke, 2003).

Figure 1 Key bodies of knowledge

Figure 2 USEM model of employability (Knight and Yorke, 2002)

Skills

There is no simple definition for skill (Eraut, 1994; Tight, 1996; Moon, 2004). At a high level there is broad consensus that a skill is the ability to do something, for example: ‘skill […] is the ability to do something that has been learnt’ (Moon, 2004). Also agreed is that there is a range of skill types, including physical, practical and cognitive (Moon, 2004).

However, below this level, agreement disappears — with overlapping categories and interpretations of skill, as in the case of “employability” and “professional” skills. Each category is typically labelled and defined for a particular community, causing difficulties in direct and consistent translation.

In HE, neither “employability” nor “professional” is used in the specification of programme content. Four categories are used (QAA, 2006): knowledge and understanding, intellectual skills, practical skills and transferable skills, with transferable skills appearing to map most closely to employability skills.

There is broad agreement that a skill requires some knowledge “that” combined with some knowledge “how” (Eraut, 1994; Moon, 2004). In terms of professional skills, Eraut states that they ‘require unique combinations of propositional knowledge, situational knowledge and professional judgement’. As judgements are made relative to a particular context, it is important that skill development is undertaken during context-specific activities.

Skills are learned by a combination of methods (Eraut, 1994; Moon, 2004; Tether et al., 2005), including education, training and experience.

There are a number of models which propose different levels of skill in a professional or work context, such as the five-level Dreyfus model (Dreyfus and Dreyfus, 1986) and the four-level International Project Management Association (IPMA) competence baseline used to describe project management skills (IPMA, 2011). A common thread is the use of multiple experiences to develop higher levels of context-specific knowledge and judgement-making skills.

Kolb (1984) proposed a general model of learning by experience. Whilst all Kolb’s dimensions are likely to contribute to learning, the circular model is perceived as being too simplistic (Coffield et al., 2004; Race, 2010). However, the circle is helpful in reinforcing that a number of related experiences support a person’s learning.

Race (2010) responds to the concerns raised about Kolb’s model by identifying factors that underpin successful learning. Race accepts that there are relationships between these factors but argues that they will vary between people. A key difference from Kolb’s model is the presence of wanting or needing to learn.

In summary, four key aspects of skill development are identified from the above literature: student motivation, experience, feedback and reflection. These concepts will now be explored further for HE.

Student motivation

A common categorisation of student motivation is intrinsic and extrinsic (Harter, 1981). Intrinsically motivated students are curious and want to learn, whilst the extrinsically motivated worry about grades and approval by others.

Skill development could be a challenge for extrinsically motivated students, given that it is problematic (Knight and Yorke, 2003). However, motivation can increase with understanding (Race, 2007) and understanding can be developed by experience and interaction. Principles to promote motivation include discussion of importance and utility, provision of clear and accurate feedback and the provision of stimulating and interesting tasks (Pintrich, 2003).

Experience

Of the experience-based teaching methodologies, problem-based learning (PBL) (Barrett and Moore, 2011) and project-based learning (PjBL) (Graham and Crawley, 2010) would appear to be relevant. However, the best match is consultancy projects used in MBA programmes (Jennings, 2002).

Exercises or simulations are able to provide a wide range of skill development opportunities (Jennings, 2002; Goodhew, 2010). It is noted that they are particularly effective for learning related to complex situations and they tend to be rare in practice due to the time required to develop them (Goodhew, 2010).

Experience can also be used to develop a student’s self-efficacy — a key component of the USEM model. Ways of influencing efficacy development are identified as mastery and vicarious experiences (Bandura, 1995).

Feedback

Feedback can be summative in terms of what was or wasn’t achieved and formative in identifying how performance can be improved in future. Some widely recognised indicators of effective feedback include being prompt (Race, 2010) and being positive and constructive (Landsberg, 2003).

Feedback can be relevant to the whole class or to specific individuals or groups, thus different mechanisms are required to ensure this crucial aspect of learning is carried out effectively and efficiently (Race, 2010).

Reflection

Reflection is a process that connects the notions of learning and thinking (Moon, 2004). As well as being key in the learning process (Kolb, 1984), it also plays a significant role in ongoing professional development. The idea of the reflective practitioner was developed by Schön (1987), who describes an outcome as the ability to handle complex problems with confidence, skill and care.

Moon (2004) reports a depth dimension to reflection, with shallow reflection less likely to be effective in supporting learning. Reflective learning can be improved by providing a clear purpose and a list of questions linked to associated learning goals. However, students have been found to have limited understanding of reflection in terms of its value, what it means and how it is undertaken in practice (Moon, 2004).

Framework development

Individual components have been described above and their success in supporting learning could be amplified if they are constructively aligned (Biggs, 2003). This suggests that a skill development framework (SDF), integrating the components identified from the literature, would help inform the design and delivery of skill development activities. To that end, an SDF must be practical and appropriate for a wide range of skills.

The following conditions for the effective development of complex skill sets are identified:

  1. The provision of a description of the skill, making explicit the key components of knowledge “that” and knowledge “how”, with examples of the typical judgements associated with delivery of the skill

  2. A number of different experiences with a range of contextual variables and at the level of difficulty to provide mastery and/or vicarious experiences

  3. A teaching process which includes student reflection and the provision of feedback after each experience

  4. Students motivated to learn, of which indicators are engagement in learning activities and learning outcomes linked to summative assessment.

Indicators are identified for each condition. The number has been limited to a maximum of four to present a framework of practical size. This is shown in Table 1.

The SDF will be tested with regard to problem solving skills, so it is important that these are defined. As with many skills, these have been labelled and defined for a particular community — in this case by the Institute for Manufacturing (IfM) — and defined in relation to industrial problems that students face.

In the absence of a formal definition, a working definition was compiled by extracting key learning points or learning outcomes from course documentation. This identified three key components: problem solving, managing a project and group work.

The first test of the SDF was to evaluate how it relates to the development of problem solving skills, with the objective of determining if the conditions identified are present in practice. In Induction this is carried out in a lecture followed by five exercises and from now on is referred to as the development method (DM).

Further testing of the SDF will require a mechanism to measure levels of problem solving skills. This has been identified as problematic in terms of resource requirements and ability to provide a reliable scaled measurement. With problem solving being much less complex than employability, and a quick method of assessment being very attractive in terms of resource requirements, it was decided to investigate whether a self-assessment instrument could be used.

Methodology

This exploratory research lends itself to the case study research method set out by Yin (2009), using ISMM Induction as the case. To enable unbiased results, measures were taken to ensure that the research had minimal impact on student or facilitator behaviour. The overall research design is shown in Figure 3.

The research requires multiple sources of evidence to be collected using both qualitative and quantitative approaches. The methods used were observation of the DM and questionnaires which were applied at the start and the end of the DM.

This research design does not take into account the student perspective on skill development. As this could have a significant impact, it will be investigated via the questionnaire at the end of the DM.

The data collection method for each aspect of the SDF is identified in Table 2. Each cell has a different coding: I denotes indicator, M denotes method and R will be used later to denote result.

At the end of each exercise, each group was given time to discuss their performance and identify three aspects that had gone well and three that could be improved. Results presented to the class were analysed. Four categories were found: problem solving, group working, project management and presentation. These align with the definition of problem solving skills, with the addition of presentations.
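The sorting of reflective outputs into categories could, in principle, be sketched as a simple keyword coder. The following is purely illustrative: the four category names come from the analysis above, but the keyword lists and the `categorise` function are assumptions for illustration, not the (manual) coding method actually used in the study.

```python
# Hypothetical sketch of coding free-text reflection points into the four
# categories found (problem solving, group working, project management,
# presentation). The keyword lists are illustrative only.
CATEGORIES = {
    "problem solving": ["problem", "analysis", "solution", "data"],
    "group working": ["team", "group", "roles", "communication"],
    "project management": ["time", "plan", "schedule", "milestone"],
    "presentation": ["slides", "present", "audience"],
}

def categorise(point: str) -> str:
    """Return the first category whose keywords match the reflection point."""
    text = point.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "uncategorised"

print(categorise("We divided roles well within the team"))  # group working
```

In practice such coding was presumably done by hand; a keyword approach like this would only be a crude first pass over the groups' three-good/three-to-improve lists.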

Table 1 SDF — indicators

Figure 3 Research design

Table 2 SDF — showing research method per indicator

Table 3 Judgement levels

A consolidated view of the DM was produced by judging the adequacy of each component of the SDF using the four levels of judgement described in Table 3.

To understand the student perspectives of skill development, three open questions were asked in order to identify what aspects of Induction had influenced their learning of skills. All reasons were analysed and answers were sorted by those considered to be more supportive of skill development (e.g. problem solving exercises) and less supportive of skill development (e.g. subject-specific lectures).

The data collection for self-assessment was carried out via questionnaires. The start questionnaire was designed to capture self-assessment of skill levels using both a comparison with their previous undergraduate peer groups and self-confidence in their ability to perform skills in an industrial company environment. The final questionnaire was designed to repeat the capture of self-assessment of skill levels described above and explore how students related their current level and understanding of skill to their level at the beginning of the module. From a cohort of 49 students, response rates of 94% and 96% were achieved for the start and end questionnaires. This resulted in 88% complete data sets where comparisons could be made between the start and end. With such high response rates the findings from this research are considered valid for this course.
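The response-rate arithmetic above can be reproduced with hypothetical data. The cohort size (49) and the three reported rates come from the text; the student IDs and the pattern of incomplete returns below are invented so that the counts work out, and are not the actual questionnaire data.

```python
# Illustrative sketch only: hypothetical questionnaire returns constructed
# to mirror the response rates reported in the text (94%, 96%, 88%).
cohort = 49

start_complete = set(range(1, 47))        # 46 of 49 returned at the start
end_returned = set(range(1, 48))          # 47 of 49 returned at the end
end_incomplete = {5, 12, 30}              # assume three end returns had gaps
end_complete = end_returned - end_incomplete

start_rate = round(100 * len(start_complete) / cohort)   # 46/49 -> 94%
end_rate = round(100 * len(end_returned) / cohort)       # 47/49 -> 96%

# Matched complete data sets allow per-student start/end comparison.
matched = start_complete & end_complete
matched_rate = round(100 * len(matched) / cohort)        # 43/49 -> 88%

print(start_rate, end_rate, matched_rate)
```

Note that the matched figure is below the intersection implied by the raw return counts, which is why some returns must be treated as incomplete rather than missing.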

Ethical issues in relation to the students were considered (Creswell, 2009). All were informed about the research project and reassured that all data would be kept confidential and used only for the purposes of research.

Results

Comparison of SDF to DM

Each element of the SDF is compared to the DM and the results are summarised in Table 4.

Reflection output results

Each output (see Table 5) was shaded to indicate the skill aspect to which it relates, using the key below.

Adequacy of the DM

Given the above results, the levels of judgement described in Table 3 were applied and the results are shown in Table 6.

Student self-assessment

For student self-assessment relative to the module start, the majority of students failed to answer the set of questions as intended. However, it was possible to determine that 57% thought their skill levels had increased and 47% felt that problem solving skills were more complex than they had thought at the beginning of the module.

Student perception of skill development

The percentage of correct answers per question was 75%, 88% and 51%, indicating that understanding of skill development is variable.

Table 4 Results — SDF applied to DM

Table 5 Reflective outputs — example results from Exercise 2b

Table 6 Skill development framework — level of presence results

Table 7 Student self-assessment results

Discussion

Comparison of the SDF to the DM

Reflection aspects were considered to be the weakest. The exercise was problematic as it did not encourage individual, focused or in-depth reflection and students were required to reflect quickly, immediately following an exercise. However, it did provide an effective way to identify issues and prompt discussion on aspects of skill development.

The outputs were judged superficial due to their general nature (e.g. time management) and contradicting statements (e.g. ‘good at timing’ and ‘rushed too much then had to go back and redo’).

Reflective outputs remained superficial throughout the exercises as students demonstrated poor levels of reflective skills and instruction on reflective skills was not provided. The issues identified were mostly project management or team working aspects and, as the students had only had limited descriptions of these elements, there was little basis for reflection.

Some elements of feedback provision, such as prompt feedback of a workable solution and supporting handouts, reflected good practice. However, the workable solution was presented quickly and, as the reflection activity followed immediately, there was little time for digestion.

Levels of student engagement fell during feedback activities. Although this could be due to tiredness on completion of the activity, given that other aspects of skill development have been found to be poorly understood, it would be useful to test understanding of feedback.

Two components of the skill description in which some weakness was identified were the descriptions of knowledge “how” and “judgement”. These aspects lend themselves to further explanation following an exercise as they are often context-specific. This raises the question of how to incorporate additional teaching time or resources to support this activity.

Strengths and limitations of SDF

The SDF describes skill development as a multi-dimensional construct in which all components should be present, ideally at an adequate or fully adequate level.

As with any framework that attempts to capture complex constructs, there is a danger of oversimplification. It is argued that the teaching process condition should be split into separate feedback and reflection conditions. There would then be five conditions, which would map closely to the core components of Race’s learning model (Race, 2010). Another issue is that many components are interrelated, a feature poorly represented in the current presentation.

As an indicator, student engagement appears to offer a useful lens, with the ability to highlight whether components are working effectively or not. This correlates with the literature on student motivation which identifies links to many aspects of the learning process.

This framework is very much at the initial stages of development, having been developed as an evaluative tool in one context and only in relation to problem solving skills. A more reliable method of evaluation is required to support further testing. It should also be possible to revise the SDF for use as a design tool at a later stage.

Student self-assessment of skill levels

The importance of student self-assessment lies not only in its contribution to students’ ability to reflect but also in the practical consideration of the resources required to run, and provide external assessment for, these forms of experiential exercise.

Any self-assessment of skill is made in comparison with a reference point, such as a peer’s performance or a defined level of competency. Plausible explanations for the conflicting results point to issues of reference point selection and recalibration.

One reference point was a student’s undergraduate peer group. Valid results rely on this reference point remaining fixed for the duration of Induction. It is suggested that for many their reference point changed to be more in line with the current cohort — typically top level undergraduates.

A second reference point was an “industrial company environment”. Given that this was where careers were sought, it might be expected that they had a perception of this. In practice, many students had little exposure to industrial company environments (as determined from work experience data in the start questionnaire).

The final reference point tested was themselves at the beginning of the module. Again, this is problematic as it relies on students’ ability to keep this point fixed during the module. It was possible to ask more qualitative questions (e.g. is the skill more complex than they thought? Do they know how to improve?). This might form the basis of a reflective exercise towards the end of the module that could help students consolidate their skills learning and identify where they need to improve. These conclusions are similar to those of Yorke and Knight (2007).

Conclusions and further work

Using the SDF to analyse the DM gave the following insights:

  • All conditions of the SDF were found to be present in the DM but with different degrees of judged adequacy. Eight of the 15 SDF components were considered fully adequate, five adequate and two less than adequate. Reflection aspects were weakest and exercise aspects the strongest.

  • The exercises were judged fully adequate, providing a range of problem types and contexts as well as the level of challenge to produce mastery experiences.

  • The knowledge “that” aspect of the skill description was good but there was limited provision of knowledge “how” and “judgement” aspects. These are often context-specific, so lend themselves to further explanation following an exercise.

  • There was a lack of formal definition of problem solving skills.

  • There was limited provision of formative feedback and opportunities for general feedback were limited by the class time allocated.

  • A combination of poor student reflection skills, the use of a general reflective exercise and its timing resulted in superficial reflective outputs.

Since the DM was previously considered to be well developed this suggests that the SDF, even in its current early stage of development, could be a useful evaluation tool.

Student self-assessment proved problematic: students were not able to reliably identify changes to their levels of problem solving skills during Induction, and conflicting ratings resulted. Problems arose with identifying appropriate reference points that remain fixed for the duration of Induction.

Whilst the SDF appears reasonable, further work is needed to enable future testing and refinement. This involves defining problem solving and developing effective assessment methods, as well as addressing two further problems discovered during the study: students’ poor understanding of skill development and their poor reflective skills.

Three phases of investigation are proposed, defined by the research questions below and shown diagrammatically in Figure 4.

Phase 1

  • What is the problem solving skill set?

  • How can performance levels be defined and measured?

  • How can students’ understanding of skills be developed?

Phase 2

  • How can students assess their skill levels?

  • How can reflective learning skills be developed?

  • What reflective learning tools support skill development?

Phase 3

  • How can the DM be improved?

  • What implications are there from Phases 1 and 2 for the SDF?

Acknowledgements

The author wishes to thank Dr David Good, Dr Rick Mitchell and Dr Tim Minshall for their support and advice.

Figure 4 Phases of further work

References

  • Bandura, A. (1995) Exercise of personal and collective efficacy in changing societies. In: Bandura, A. (ed.) Self-efficacy in changing societies. Cambridge: Cambridge University Press, 1-45.
  • Barrett, T. and Moore, S. (2011) An introduction to problem-based learning. In: Barrett, T. and Moore, S. (eds.) New approaches to problem-based learning. Abingdon and New York: Routledge, 3-17.
  • Biggs, J. (2003) Teaching for quality learning at university. 2nd edition. Maidenhead: Open University Press.
  • Coffield, F., Moseley, D., Hall, E. and Ecclestone, K. (2004) Learning styles and pedagogy in post-16 learning. London: Learning and Skills Research Centre.
  • Creswell, J. W. (2009) Research design. Thousand Oaks: Sage.
  • Dacre Pool, L. and Sewell, P. (2007) The key to employability: developing a practical model of graduate employability. Education & Training, 49 (4), 277-289.
  • Dreyfus, H. L. and Dreyfus, S. E. (1986) Mind over machine: the power of human intuition and expertise in the era of the computer. Oxford: Blackwell.
  • Eraut, M. (1994) Developing professional knowledge and competence. London: The Falmer Press.
  • Goodhew, P. (2010) Teaching engineering. Liverpool: UKCME.
  • Graham, R. and Crawley, E. (2010) Making projects work: a review of transferable best practice approaches to engineering project-based learning in the UK. Engineering Education: Journal of the Higher Education Academy Engineering Subject Centre, 5 (2), 41-49.
  • Harter, S. (1981) A new self-report scale of intrinsic vs extrinsic motivation in the classroom. Developmental Psychology, 17 (3), 302-312.
  • Hillage, J. and Pollard, E. (1998) Employability: developing a framework for policy analysis. London: Department for Education and Employment.
  • Institute for Manufacturing (2011) Industrial Systems, Manufacturing and Management MPhil programme. Available from http://www.ifm.eng.cam.ac.uk/ismm/ [accessed 28 July 2011].
  • IPMA (2011) IPMA competence baseline. Available from http://ipma.ch/resources/ipma-publications/ipma-competence-baseline/ [accessed 12 July 2011].
  • Jennings, D. (2002) Strategic management: an evaluation of the use of three learning methods. Journal of Management Development, 21 (9), 655-665.
  • Knight, P. T. and Yorke, M. (2002) Employability through the curriculum. Tertiary Education and Management, 8 (4), 261-276.
  • Knight, P. T. and Yorke, M. (2003) Assessment, learning and employability. Maidenhead: SRHE and Open University Press.
  • Knight, P. T. and Yorke, M. (2004) Learning, curriculum and employability in higher education. London and New York: RoutledgeFalmer.
  • Kolb, D. A. (1984) Experiential learning. Englewood Cliffs, New Jersey: Prentice Hall.
  • Landsberg, M. (2003) The tao of coaching. London: Profile Books.
  • Moon, J. A. (2004) A handbook of reflective and experiential learning. London and New York: RoutledgeFalmer.
  • Pintrich, P. R. (2003) A motivational science perspective on the role of student motivation in learning and teaching contexts. Journal of Educational Psychology, 95 (4), 667-686.
  • QAA (2006) Guidelines for preparing programme specifications. Available from http://www.qaa.ac.uk/Publications/InformationAndGuidance/Documents/guidelines06.pdf [accessed 22 October 2012].
  • Race, P. (2007) The lecturer’s toolkit. London and New York: Routledge.
  • Race, P. (2010) Making learning happen: a guide for post-compulsory education. London: Sage.
  • Ridgman, T. W. and Wiggins, C. N. (2003) Postgraduate problem-based learning for manufacturing. London: IEE.
  • Schön, D. A. (1987) Educating the reflective practitioner. San Francisco: Jossey-Bass.
  • Sternberg, R. J. and Grigorenko, E. L. (2000) Practical intelligence and its development. In: Bar-On, R. and Parker, J. D. A. (eds.) The handbook of emotional intelligence: theory, development, assessment and application at home, school and in the workplace. San Francisco: Jossey-Bass, 215-243.
  • Tether, B., Mina, A., Consoli, D. and Gagliardi, D. (2005) A literature review on skills and innovation. Manchester: ESRC Research Centre for Research on Innovation and Competition.
  • Tight, M. (1996) Key concepts in adult education and training. London: Routledge.
  • Yin, R. K. (2009) Case study research, design and methods. Thousand Oaks: Sage.
  • Yorke, M. (2006) Employability in higher education: what it is — what it is not. York: Higher Education Academy.
  • Yorke, M. and Knight, P. T. (2007) Evidence-informed pedagogy and the enhancement of student employability. Teaching in Higher Education, 12 (2), 157-170.
