Research Article

Student reactions to the development of professional engineering competencies

Received 02 Aug 2023, Accepted 03 May 2024, Published online: 17 May 2024

ABSTRACT

The ability of engineering graduates to function as successful professionals depends not only on technical disciplinary knowledge but also on a wide range of professional competencies. Students often react differently to educators’ attempts to develop professional competencies compared to technical competencies. Understanding the nature of these reactions is important if we are to design the most effective educational approaches to the development of professional competencies. In this paper we report on an exploratory factor analysis that used data from a detailed survey of students (N = 339) to identify underlying factors that provide insights into the variations in student reactions. Five factors were identified from this analysis: capability; learning experience; learning outcomes; employment experience; and environment. Two of these factors (learning experience and environment) also exhibited significant differences in impact between professional and technical competencies. A number of implications for educational design are explored.

Introduction

The ability of contemporary engineering graduates to function as successful professionals depends not only on their technical disciplinary knowledge, but also on a wide range of ‘professional competencies’ (Scott and Yates 2002). There is a wide range of interpretations of the term ‘professional competencies’ (see Malheiro et al. 2019 for a discussion), though in this paper we adopt the interpretation encapsulated within the Engineers Australia Stage 1 Competency Framework – i.e. those competencies that apply to all engineers and that are necessary for competent performance in a holistic way at the stage of attaining registration. This essentially covers the broader transferable skills that go beyond the specific disciplinary technical or scientific knowledge and abilities.

Most degree programs that lead to an accredited engineering qualification include a focus on these competencies within their defined learning outcomes. Indeed, the Washington Accord, and hence the various national accrediting bodies, explicitly include related learning requirements (International Engineering Alliance 2019). Despite this increased and mandated emphasis on teaching professional skills within the engineering curriculum, many employers still find that the professional skill development of engineering students does not meet their expectations (Byrne, Weston, and Cave 2020). Whilst the approach taken to developing students’ professional competencies within engineering programs varies considerably (Arlett et al. 2010; Shuman, Besterfield-Sacre, and McGourty 2005), any negative student perceptions of, and/or reactions to, this learning and development can be problematic. While some resistance may result from a failure to demonstrate the case for the learning, or from poor scaffolding, curriculum design or teaching methods, learning that stretches students beyond where they currently are, or that feels institutionally coerced, is highly likely to provoke resistance from a significant number of students (Brookfield 2017).

Hence it is important for students to feel that their learning is valuable and relevant to their learning goals and desired achievement. Negative student reactions often undermine the learning opportunities provided, irrespective of the teaching approach used by educators, reducing student motivation and engagement and inhibiting achievement of the intended outcomes.

This suggests that if we are to improve the development of students’ professional engineering competencies within our engineering programs then, as educators, we should ensure we understand the students’ views – and especially those that may be related to any negative reactions to this development. Whilst there is no shortage of assumptions made by educators regarding student views, and hence regarding reactions to the design of educational approaches, there is little specific research that has tested these assumptions.

The purpose of the research described in this paper is to contribute to addressing this shortcoming. By understanding students’ views more clearly it may be possible to design learning activities in ways that respond to, or even divert or defuse, these views and lead to stronger outcomes. A clearer understanding of the nature and diversity of students’ views should inform improvements in scaffolding, learning activity, and assessment design that increase both the value students place on their professional development and their engagement with it, leading to improvements in the achievement of the associated learning outcomes. Within this context, in this paper, we explore the following research questions:

RQ1: Are there specific factors that may explain differences in students’ reactions to the learning of professional vs technical competencies within engineering degree programs?

RQ2: Are the identified factors influenced by: (a) the level of employment experience in professional settings; and/or (b) the level of study in non-engineering disciplines?

Background

Professional competency in engineering

The requirement for engineering graduates to have developed broader professional competencies is now well recognised. This is strongly reflected in program accreditation criteria, where various professional competencies are identified and often accompanied by detailed indicators of attainment (e.g. ABET 2023; ENAEE 2021; Engineers Australia 2018; UK Engineering Council 2014). Possibly most importantly, it is evident in the Washington Accord (International Engineering Alliance 2019) that underpins each of the national frameworks. The Accord includes a set of 12 elements that together form a graduate attribute profile. Many of these elements relate to non-technical competencies (e.g. WA8 – Ethics; WA10 – Communication).

Beyond the need for specific professional competencies, there is also a growing recognition that graduates need to integrate their technical expertise and their broader professional competencies into a coherent whole (Crosthwaite 2019; Passow and Passow 2017). This has been acknowledged in various reviews of engineering education (Graham 2012; King 2008; National Academy of Engineering 2004) and is also reflected in the emergence of a range of ‘integrated engineering’ programs within engineering curricula (Bates et al. 2022).

These integrated engineering programs typically adopt approaches such as open-ended and/or multidisciplinary projects as a pathway to integrating different competencies and developing a more coherent professional identity. Often these programs have been informed by research into the development of professional identities (Trede, Macklin, and Bridges 2012). Jackson (2016) explores the relationship between professional identity formation and graduate employability. A common theme in this research is the exploration of the factors that contribute to the development of this identity. Whilst undergraduates have been shown to hold strong disciplinary identities from quite an early stage (particularly in professional disciplines such as medicine and engineering), the evolution of this identity into a form that reflects that of disciplinary professionals often relies on significant exposure to practice (see, for example, Kemmis et al. 2020; Schatzki 2012). In some cases, this has been shown to be because practice exposes students to specific discordances between their current concepts (often more connected to an ‘academic identity’) and the nature of professional practice (Pratt, Rockmann, and Kaufmann 2006).

Development of professional competencies

It is reasonable to consider the question of how, and indeed whether, professional competencies can be ‘taught’. Shuman, Besterfield-Sacre, and McGourty (2005) consider this question in the context of the ABET ‘professional skills’, citing several successful program examples. They acknowledge that assessing these skills is a major challenge, but note that:

We are very positive about a number of creative ways that these skills are being learned, particularly at institutions that are turning to global and/or service learning in combination with engineering design projects to teach and reinforce outcome combinations. We are also encouraged by work directed at assessing these skills, but recognize that there is considerable research that remains to be done. (Shuman, Besterfield-Sacre, and McGourty 2005, 41)

Possibly the most common approach, across a wide range of disciplines, to developing professional competencies has been the use of internships, practicums, or industry placements (Ryan, Toohey, and Hughes 1996). This is potentially related both to accreditation requirements and to the long history of related research. In terms of the former, accreditation bodies often suggest, either explicitly or implicitly, that time spent directly in industry settings is a preferred approach. For example, the Engineers Australia (EA) accreditation criteria refer specifically to ‘workplace placements’, denoted as EPP (Engineering Professional Practice) below:

Student engineers need in addition to knowledge, formative experiences of how engineering professionals: a) Think, work and continually learn … EPP must culminate in a set of meaningful experiences that result in the habituation of professional working styles. … The outcome should be that student engineers are able to aggregate different experiences towards their portfolio of EPP. For maximum pedagogical value, education programs should be designed to enable student engineers to complete this requirement prior to the final study period (semester, trimester, term, etc). The recommended EPP is nominally the equivalent of 60 days (12 weeks) in a workplace placement. For accreditation, documentation must be provided explaining how the various experiences contribute to the equivalent 60 days, and how they contribute to the overall education design. The overall EPP experiences should enhance a graduate’s capacity to move with ease into a professional workplace. (Engineers Australia 2018, 17–18)

There is also significant research that explores the value of explicit industry engagement. In many cases, this goes further and argues that professional expertise can only be fully developed in ‘practice’, and hence that academic programs on their own cannot be sufficient. For example, the Dreyfus model of skill acquisition (Dreyfus and Dreyfus 1986) explores students’ learning stages, and especially the move from analytical to intuitive decision making. Similarly, the theory of situated learning (Lave and Wenger 1991) recognises the importance of the social context in which learning occurs, and hence argues that professional learning should take place, at least in part, within the same context within which that learning will be applied – i.e. practice settings.

More recent work comparing various theories of expertise identifies a common theme where ‘an emphasis on the holistic nature of expertise, with the implication that experts’ understanding cannot be analysed into components, leads to different types of curricula, where engagement in real-life situations is emphasized’ (Gobet and Chassy 2009, 172).

Whilst there are some valuable exceptions, such as the work by Trevelyan (2013) and Reich et al. (2015), the importance of practice contexts to the development of professional understanding is often not clearly articulated, and hence not adequately addressed, within curricula and professional development frameworks in higher education. For example, the EA Stage 1 Competency Standard (Engineers Australia 2018) describes 16 categories covering specific knowledge and competencies, the ability to apply that knowledge, and broader professional attributes. It does not, however, make explicit mention of understanding the cultural practices and contexts of the engineering profession.

Student perceptions of learning outcomes

There is a broad range of research that considers the ways in which students’ perceptions of intended learning outcomes affect their achievement of those outcomes. For example, research by Ryan and Deci (2000) shows that learning motivation is reduced if a student perceives there to be low value in the concepts being learnt.

Within engineering there have been several studies that have investigated engineering students’ perceived value of the skills needed for engineering practice. Caeiro-Rodríguez et al. (2021), in a review of the pedagogical methods used to teach professional skills in engineering education in five European countries, asked students whether they thought that their courses allowed them to develop the skills they considered important for their academic and future professional careers. Only 14% of students responded positively, with 45% negative and 41% undecided.

In addition, many studies have found that students perceive technical skills as the most important for professional success (Forman and Freeman 2013; Winters et al. 2013). Nguyen (1998) found that both students and academics rated the importance of technical skills and knowledge more highly than did those working in the profession. In contrast, recent successful graduates identified professional skills and competencies as being the most important for their success (Scott and Yates 2002).

While there have been numerous studies looking at engineering students’ perceptions of the value and importance of professional skills, few if any have considered students’ perceptions of where these skills are best learnt. Having recognised that students’ views on professional competencies will affect their approach to learning, and that this approach, in turn, will affect the learning outcomes, we argue that there is a significant need to improve educators’ understanding of those students’ views within higher education.

Method

As discussed previously, in designing educational activities related to the development of professional competencies, we often make assumptions regarding why students might respond in certain ways. These assumptions then drive our pedagogic approaches. As an example, if we were to assume that students largely believe that professional competencies are important, but that they are better learnt in practice settings, then we would better understand the need to ensure that our educational approaches to developing professional competencies emphasise authentic practice settings.

It is therefore reasonable to consider whether there are specific factors that play lesser or greater roles in shaping students’ reactions to learning and developing professional competencies. In order to answer this question, we undertook a detailed survey of students’ views in this area.

The design of the survey was informed partially by the limited existing literature on student reactions to the development of professional practice (Kadi and Lowe 2019; Passow 2012) – though this literature generally explored the reactions themselves rather than the factors that influenced them. We therefore also explored student feedback and reflections on the existing programs at the lead author’s university (Lowe and Kadi 2019). Specific question domains included seeking students’ views on each of the following, with respect to a range of different competencies:

  • The quality of teaching of each competency

  • The respondents’ degree of interest in each competency

  • The degree of difficulty in becoming capable in each competency

  • Whether each competency should be taught within degree programs

  • The respondents’ perceived level of capability (both now, and at earlier stages) with regard to each competency

  • The importance of each competency at varying career stages

  • The extent to which each competency is underpinned by rigorous theory

  • Where it is easier to learn each competency: in academia vs industry

The specific competencies asked about in the survey were drawn from the Engineers Australia Stage 1 Competency Standard and covered both technical and professional competencies (Engineers Australia 2018). These competencies are broadly similar to those in both EUR-ACE (ENAEE 2021) and ABET (2023). As such, the findings from this research should be relevant across jurisdictional contexts. To assist in assessing this relevance, the following is a general mapping from the specific EA competencies assessed to those used in EUR-ACE:

An initial survey was designed and then pilot tested with an initial cohort of 30 respondents. These respondents were then interviewed to assess their interpretation of the questions, allowing us to assess the construct validity. The survey was refined based on this evaluation, resulting in the survey provided in Appendix 1. The survey was then disseminated to students enrolled in undergraduate and postgraduate engineering degrees at the University of Sydney. The participants were recruited through broadcast announcements on student forums. Participation was anonymous and voluntary. Details on the participation demographics are provided in the results section.

In stage 1, we carried out an exploratory factor analysis (EFA) (Costello and Osborne 2005) on the data to identify the underlying factors that were most significant in accounting for the variations in students’ responses. For each of the identified factors we then looked at student responses to the questions that contributed to those factors – and in particular explored whether there were significant differences in the responses for professional competencies and technical competencies.

In stage 2 we repeated the analysis, but filtered the data into subsets associated with the level of employment experience in professional settings (stage 2a) and with significant experience of study in a non-engineering discipline (stage 2b), so that the impact of these aspects could be assessed through comparison.

Results

We began by removing responses that contained incomplete or erroneous data: e.g. where the survey was abandoned whilst only partially complete, or where a respondent had clearly given the lowest response to all questions (a sketch of this cleaning step is shown after the demographic summary below). This left N = 339 responses. Summary demographic data for these 339 respondents include:

  • Age: 55.8% aged 20 or younger; 38.4% aged 21–25; 5.9% aged 26 or older

  • Gender: 58.4% male; 40.7% female; 0.9% other or would rather not say. It is worth noting that the percentage of female respondents in the sample population (40.7%) was higher than that in the study population (∼30%). This may suggest a form of selection bias.

  • Study stage: 32.5% first year undergraduate; 52.8% middle years undergraduate; 7.4% final year undergraduate; 6.4% postgraduate

  • Level of cumulative experience in any type of employment: 14.8% none; 15.6% <1 month; 17.1% 1 month to <3 months; 26.0% 3 months to <12 months; 17.4% 1 year to <3 years; 9.1% 3+ years

  • Level of cumulative experience in professional employment: 69.0% none; 11.8% <1 month; 5.9% 1 month to <3 months; 7.4% 3 months to <12 months; 1.5% 1 year to <3 years; 4.4% 3+ years

  • Engineering discipline: 5.6% aeronautical; 22.1% biomedical; 10.6% electrical/electronic; 18.3% civil; 8.3% chemical; 9.4% mechanical; 10.0% mechatronics; 12.7% software; 3.0% other

  • Have they studied 6 months or more of another discipline: 37.2% yes; 62.8% no
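To make the response-cleaning step above concrete, the following is a minimal sketch in Python/pandas. It is illustrative only, not the authors’ actual pipeline: the DataFrame name `raw` and the assumption that the lowest response option is coded as 1 are hypothetical.

```python
import pandas as pd

def clean_responses(raw: pd.DataFrame) -> pd.DataFrame:
    # Drop abandoned surveys: any row with missing answers.
    complete = raw.dropna()
    # Drop clearly careless surveys: rows where every answer equals the
    # lowest response option (assumed here to be coded as 1).
    straight_lined = (complete == 1).all(axis=1)
    return complete[~straight_lined]

# cleaned = clean_responses(raw)   # left N = 339 responses in this study
```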

An analysis of the resulting data was then undertaken using an exploratory factor analysis (EFA). The first step was to remove any open-ended questions (e.g. Q7.1) and questions with nominal (rather than ordinal or interval) data (e.g. Q1.4, Q1.8). The remaining questions were then handled differently depending on their nature (see Appendix 1 for the full list of questions):

Questions where participants selected from an ordinal rating: Q1.1 (rated 1–4), Q1.3 (rated 1–9), Q1.6 and Q1.7 (rated 1–7). For these questions we calculated the average response.

Questions where participants rated each of the eight competencies on a 5-point Likert scale: Q2.1–Q2.3, Q4.2, Q5.1–Q5.3. For each of these questions, we separately calculated the average response for the four technical competencies (labelled in the following as, for example, Q2.1t) and the average response for the four professional competencies (labelled as, for example, Q2.1p).

Questions where participants placed the eight competencies in a rank order: Q3.1–Q3.3, Q4.1. For each of these questions, we separately calculated the average ranking for the four technical competencies (e.g. Q4.1t) and the average ranking for the four professional competencies (e.g. Q4.1p). Note that because participants ranked all competencies (professional and technical) together, the results for each pair of items were symmetric (e.g. Q4.1t was symmetric with Q4.1p) and so only one item from each pair was included.
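As an illustration of this aggregation, the sketch below uses hypothetical column names (e.g. Q2.1_c1 … Q2.1_c8, with c1–c4 technical and c5–c8 professional; these labels are ours, not the survey’s) to show how the paired ‘t’/‘p’ items could be computed. The symmetry noted above follows because the eight ranks always sum to 36, so the two four-item rank means always sum to 9.

```python
import pandas as pd

TECH = ["c1", "c2", "c3", "c4"]   # hypothetical technical competency ids
PROF = ["c5", "c6", "c7", "c8"]   # hypothetical professional competency ids

def split_means(df: pd.DataFrame, q: str) -> pd.DataFrame:
    # Average the four technical and the four professional ratings (or
    # rankings) for question block `q`, e.g. 'Q2.1' -> Q2.1t and Q2.1p.
    out = pd.DataFrame(index=df.index)
    out[f"{q}t"] = df[[f"{q}_{c}" for c in TECH]].mean(axis=1)
    out[f"{q}p"] = df[[f"{q}_{c}" for c in PROF]].mean(axis=1)
    return out

# For the ranking questions (Q3.1-Q3.3, Q4.1) the eight ranks sum to 36,
# so mean_t + mean_p = 9 and only one item of each pair carries information.
```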

This data preparation resulted in 32 data items for each participant, as shown in Table 1. Having identified the data to incorporate into the exploratory factor analysis, it is appropriate to first test the univariate and multivariate normality of the data, given that the correlation measures that underpin the analyses assume normally distributed data. Univariate normality can be assessed by considering the skewness and kurtosis of the responses for each item (a normal distribution has a skewness of zero and an excess kurtosis of zero). Whilst there is no definitive threshold, Hair et al. (2010) argue that it is reasonable to consider data to be normally distributed if skewness is between -2 and +2 and kurtosis is between -7 and +7; this holds for all items except Q1.3 (which asked about the respondents’ study/career stage). This item was therefore removed from subsequent analyses, given that including data with high skew or kurtosis can lead to the identification of artificial factors.

Table 1. Analysis of survey items: tests for univariate and multivariate normality.
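A sketch of this univariate screen, assuming `items` is a DataFrame holding the aggregated items, might look as follows (scipy’s kurtosis defaults to excess kurtosis, matching the thresholds above):

```python
import pandas as pd
from scipy.stats import skew, kurtosis

def normality_screen(items: pd.DataFrame) -> pd.DataFrame:
    # Skewness and excess kurtosis per item, against the thresholds of
    # Hair et al. (2010): |skewness| <= 2 and |excess kurtosis| <= 7.
    out = pd.DataFrame({
        "skew": items.apply(skew),
        "excess_kurtosis": items.apply(kurtosis),  # Fisher definition: normal = 0
    })
    out["acceptable"] = (out["skew"].abs().le(2)
                         & out["excess_kurtosis"].abs().le(7))
    return out

# Items failing the screen (here, only Q1.3) are dropped before the EFA.
```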

In terms of multivariate normality, applying Mardia’s test (Mardia 1970) gives a skewness for the whole data set of 158.1 and a kurtosis of 1133, both with p values well below the accepted threshold of 0.05. This indicates that whilst the data is approximately univariate normal, it is not multivariate normal, and hence an analysis based on simple regression is likely to give flawed results. A more robust factor score estimation technique is appropriate, such as Bartlett’s weighted least-squares scores (Watkins 2018), rather than the more usual regression method. Also, to ensure that the data is suitable for factor analysis (by testing sampling adequacy for each variable), the Kaiser–Meyer–Olkin (KMO) measure was calculated for each item – see Table 1. In general a value of 0.8 or higher is desirable, but values as low as 0.6 are considered acceptable. Several items are close to the threshold for exclusion (e.g. Q3.2p) but none are sufficiently low to require removal. Finally, Cronbach’s alpha was calculated for the test data; the value of 0.734 indicates acceptable (though not high) reliability.
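These diagnostics could be reproduced along the following lines. This is a sketch under stated assumptions, not the authors’ code: Mardia’s statistics are computed directly from their definitions, while the KMO and Cronbach’s alpha calls rely on the third-party factor_analyzer and pingouin packages.

```python
import numpy as np
from scipy import stats
from factor_analyzer.factor_analyzer import calculate_kmo
import pingouin as pg

def mardia(X: np.ndarray):
    # Mardia's (1970) multivariate skewness and kurtosis, with their
    # asymptotic chi-square and normal p values.
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S_inv = np.linalg.pinv(np.cov(Xc, rowvar=False, bias=True))
    D = Xc @ S_inv @ Xc.T
    b1 = (D ** 3).mean()             # multivariate skewness
    b2 = (np.diag(D) ** 2).mean()    # multivariate kurtosis
    skew_p = stats.chi2.sf(n * b1 / 6, df=p * (p + 1) * (p + 2) / 6)
    z_kurt = (b2 - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)
    return b1, skew_p, b2, 2 * stats.norm.sf(abs(z_kurt))

# `items` is assumed to be the DataFrame of retained survey items.
# b1, p_skew, b2, p_kurt = mardia(items.to_numpy())
# kmo_per_item, kmo_overall = calculate_kmo(items)  # want >= 0.6 per item
# alpha, ci = pg.cronbach_alpha(data=items)         # ~0.73 reported above
```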

An exploratory factor analysis was then carried out, using principal components for extraction and varimax rotation (Costello and Osborne 2005). The top 15 eigenvalues for the extracted factors are shown in Table 2, with the associated scree plot in Figure 1.

Figure 1. EFA scree plot.

Table 2. Eigenvalues (and cumulative contribution to variation) resulting from EFA.

Using the Kaiser criterion (i.e. eigenvalues > 1) suggests retaining 10 factors, which together account for 68% of the variation in the student responses. The top five of these, however, account for almost 50% of the variation. The item weightings, determined using Bartlett’s method (Taherdoost, Sahibuddin, and Jalaliyoon 2022), are shown in Table 3, with the most significant values highlighted (i.e. these show the contribution of each item to each factor).

Table 3. Item weightings for the top 5 factors, with higher weightings highlighted. Those cases where there are significant differences between responses to questions on professional vs technical competencies are shown in bold.
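A hedged sketch of the extraction and rotation step is shown below, using the third-party factor_analyzer package (an assumed tool, not necessarily what the authors used). Its ‘principal’ extraction method stands in for the principal-components extraction reported above, and the printed loadings are a proxy for, not a reproduction of, the Bartlett-method item weightings in Table 3.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# First pass, unrotated, just to obtain the eigenvalues used for the
# scree plot (Figure 1) and the Kaiser criterion.
fa0 = FactorAnalyzer(rotation=None, method="principal")
fa0.fit(items)                            # `items` as in the earlier sketches
eigenvalues, _ = fa0.get_eigenvalues()
n_keep = int((eigenvalues > 1).sum())     # Kaiser criterion; 10 in this study

# Second pass: retain those factors and apply a varimax rotation.
fa = FactorAnalyzer(n_factors=n_keep, rotation="varimax", method="principal")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.iloc[:, :5].round(2))      # cf. the top-5-factor weightings
```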

By considering the specific items that contributed to each factor, we can assign an interpretation to the factors as follows (showing the percentage of the variation in respondents’ responses attributed to each factor, as listed in Table 2):

  • F1 (16%): Capability: Students’ perception of their own capability with respect to different competencies

  • F2 (11%): Learning Experience: Students’ perceptions of their experience of learning different competencies (explicitly with respect to professional competencies)

  • F3 (9%): Learning Outcomes: Students’ perceptions of the quality of learning outcomes achieved (for both technical and professional competencies)

  • F4 (6%): Employment Experience: Age and level of employment experience

  • F5 (5%): Environment: Students’ perceptions as to the environment in which each competency might best be learnt (e.g. university vs workplace)

It is useful to note the different patterns in the data in Table 3. There are some cases where both a ‘p’ (professional) item and a ‘t’ (technical) item contribute similarly to a given factor. An example of this is the contribution of Q2.1t and Q2.1p to factor 3 (Q2.1 asks whether the participant believes that their University does a good job of developing each competency). In such a case, the nature of the question suggests that it is relevant to the factor, but that it does not explain differing student reactions to professional and technical competencies. In other cases, the paired ‘t’ and ‘p’ items contribute quite differently to a given factor. An example of this is the contribution of Q6.1t and Q6.1p to factor 2 (Q6.1 asked participants whether they found the study of each competency interesting). In this case, the difference suggests that this element of factor 2 played a contributory role in explaining differences in student reactions to professional competencies, but not to technical competencies.

Having identified the various factors, it is useful to extend the observations above regarding the difference in the contributions to these factors by questions on professional competencies vs questions on technical competencies. Table 3 highlights those item pairs (i.e. professional vs technical) where there is a significant difference in the weighting on at least one factor. As can be seen, there are no significant differences for F3 and F4, but there are for the other three factors.

For F1, Capability, the students’ response to the Question 5.3 set (their current capability with respect to the eight competencies) shows that their response to the professional competencies affects their overall reactions much more than their response to the technical competencies.

For F2, Learning Experience, the differences are much starker. Responses to numerous questions on professional competencies (particularly those in section 6, related to the nature of the learning experience) tended to drive the overall reaction, whereas the same was not true for technical competencies. This is discussed in more detail in the following section.

And finally, for F5, Environment, there is an interesting difference, particularly with regard to Q4.2, which relates to where a particular competency might best be learnt. Students’ responses for professional and technical competencies both strongly affected their overall reactions, but in opposite directions.

Discussion

We should begin by emphasising that this research has focused on determining possible underlying factors that affect students’ reactions to the teaching of engineering competencies, and in particular professional competencies. We are not assessing the validity of those student views.

It is also worth noting that our approach, based on exploratory factor analysis, has inherent limitations. Factor analysis is essentially a technique for explaining variations in observed correlated variables in terms of a smaller number of unobserved variables. The observed variables are derived directly from the participant responses to the survey questions, and so the design of the survey, and especially the choice of the survey questions, can directly influence the factors that are subsequently identified. This does not invalidate the outcomes, but it does mean that the factors identified should not be seen as definitive. Rather, they provide one set of explanations for variations in student responses. These explanations can then inform decision making in the design of educational approaches in this area.

In our particular analysis, the five factors identified were as follows:

  1. Capability: Students’ perception of their own capability with respect to different competencies.

  2. Learning Experience: Students’ perceptions of their experience of learning different competencies, explicitly with respect to professional competencies.

  3. Learning Outcomes: Students’ perceptions of the quality of learning outcomes achieved, for both technical and professional competencies.

  4. Employment Experience: Age and level of experience working in different contexts.

  5. Environment: Students’ perceptions as to the environment in which each competency might best be learnt: university vs workplace.

Understanding these factors is significant in guiding the design of curricula as well as pedagogical approaches to teaching engineering competencies. For example, F1 (students’ perception of their own capability with regard to different competencies) was the factor that explained the greatest variation in the survey questions included in the factor analysis. These questions were designed to explore student reactions to the teaching of various engineering competencies, so this suggests that students’ perception of their own capability plays a significant role in influencing their reactions, though whether it is the most significant factor depends on the extent to which the survey questions comprehensively captured variations in student reactions. Nevertheless, this does suggest that if students misperceive their own capability then it may lead to unjustified reactions. This in turn supports arguments that it is highly beneficial to consider how we assist students to self-assess their own capability accurately: given that this self-assessment has an impact, we want that impact to be based on an accurate assessment.

Another potentially significant finding relates to the second factor: learning experiences. It was this factor that showed the greatest difference between professional and technical competencies. The data suggest that variations in learning experiences with professional competencies are likely to elicit stronger student reactions than equivalent variations in learning experiences with technical competencies. Whilst this has a range of implications, an interesting one relates to how we interpret student feedback, given that it suggests that poor learning experiences might be judged more harshly in courses where professional competencies are taught. This, in turn, has implications for performance appraisals of those teaching in this area. This is an area that warrants further investigation.

One final aspect worth commenting on relates to student perceptions of the preferred environment within which to learn different competencies (Factor 5). Students appear to assume, without a clear basis, that professional competencies are more effectively learnt within a workplace environment, whereas technical competencies are more effectively learnt within a university environment. It is interesting to note that the question that contributed most strongly to this factor was Q4.1, which asked students to identify which competencies most needed an understanding of associated theory. This may suggest that a lack of appreciation of the nature, or indeed existence, of theoretical concepts and frameworks for professional competencies is a major driver of student reactions. Again, this is an area that would warrant deeper investigation.

Finally, we acknowledge that our study only included engineering students from one university, limiting the generalisability of our findings. We have also relied on self-reporting; while this is generally accepted as an effective way to investigate attitudes, alternative methods and forms of evidence may provide different insights.

Conclusions

The success of engineering graduates in making an impact in their future careers relies on their ability to attain a wide range of professional competencies alongside the competencies associated with their technical discipline. Through accreditation, modern engineering education requires that the development of professional competencies be central to the learning provided within engineering degree programs. However, teaching to support and promote professional competency development in students comes with challenges, including resistance from the students themselves. This paper suggests that if we are to improve our attempts to educate students with a core purpose of kick-starting their own professional development whilst at university then, as educators, we need to better understand the negative views and reactions students hold towards this development. The research described herein contributes towards a better understanding of student reactions to the teaching of professional competencies within an engineering degree program.

The main research instrument for data collection was a survey completed by 339 student participants representing more than eight engineering disciplines, which sought their views on various factors that could contribute to their reactions towards learning and developing professional competencies (as defined by Engineers Australia, in line with the definitions presented in accreditation documentation used in accordance with the internationally recognised Washington Accord). An exploratory factor analysis identified underlying factors that were interpreted as influences on students’ responses to the teaching and development of the professional competencies they have experienced. Students’ personal perceptions of their own capability, their learning experiences, their attainment of different competencies, and the environment in which they believe competencies are best learnt (i.e. university vs workplace) were four of the five factors highlighted in this study. The fifth factor was their past employment experience. It was also found that the factors associated with students’ perceptions of their own capability and learning experience influenced their response to the learning of professional competencies more than their response to technical competencies. Additionally, students were found to hold views on the environment in which each type of competency is best learnt: the workplace being preferred for professional competencies and the university environment for technical competencies.

Notwithstanding the limitations of exploratory factor analysis, whereby the results are influenced by the nature of the questions posed in the survey, this study nevertheless provides a relevant set of explanations that can inform pedagogical approaches aimed at addressing negative student reactions to the development of professional competencies, and it identifies where there is a need for further research.

Ethics approval

This work was carried out with ethics approval from The University of Sydney Human Research Ethics Committee (HREC), under project 2020/576.

Acknowledgments

The authors wish to acknowledge the substantial contributions of the wider team involved in the development and implementation of the Professional Engineering programs at the University of Sydney and University College London.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

David Lowe

David Lowe is Professor of Software Engineering at the University of Sydney. From 2012 to 2019 he was the Deputy Dean (Education) in the Faculty of Engineering. He has authored over 200 refereed research papers and 3 textbooks. He is a Principal Fellow of the HEA and a recipient of the 2019 Australian Council of Engineering Deans National Award for Engineering Education Excellence.

Emanuela Tilley

Emanuela Tilley is Professor of Engineering Education and Director of the Integrated Engineering Programme at University College London. Professor Tilley has established an international profile in leading curriculum design. She is a Board Director for SEFI (European Society for Engineering Education), a School Governor and UCL Engineering Sponsor at Elutec Academy in southeast London, and a Principal Fellow of the Higher Education Academy (PFHEA).

Keith Willey

Keith Willey is currently Professor and Director of Innovation In Engineering & IT Education at the University of Technology Sydney. He has held previous academic appointments at UNSW and the University of Sydney. He is a Principal Fellow of the Higher Education Academy and a recipient of numerous awards, including the Australian Council of Engineering Deans National Award for Engineering Education Excellence (2010 & 2020) and the AAEE Distinguished Member Award (2020).

Kate Roach

Kate Roach is an Associate Professor at University College London. She supports young engineers to develop a greater understanding of the impacts of technology on life, society and cultures so that they become responsive designers in the future.

References

  • ABET. 2023. "Criteria for Accrediting Engineering Programs." https://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2023-2024/.
  • Arlett, C., F. Lamb, R. Dales, L. Willis, and E. Hurdle. 2010. “Meeting the Needs of Industry: The Drivers for Change in Engineering Education.” Engineering Education 5 (2): 18–25. https://doi.org/10.11120/ened.2010.05020018.
  • Bates, R., S. Lord, E. Tilley, and J. Carpenter. 2022. "A Community Framing of Integrated Engineering." 2022 ASEE Annual Conference & Exposition.
  • Brookfield, S. D. 2017. Becoming a Critically Reflective Teacher. 2nd ed. Wiley.
  • Byrne, Z. S., J. W. Weston, and K. Cave. 2020. “Development of a Scale for Measuring Students’ Attitudes Towards Learning Professional (i.e. Soft) Skills.” Research in Science Education 50: 1417–1433. https://doi.org/10.1007/s11165-018-9738-3.
  • Caeiro-Rodríguez, M., M. Manso-Vázquez, F. A. Mikic-Fonte, M. Llamas-Nistal, M. J. Fernández-Iglesias, H. Tsalapatas, … L. T. Sørensen. 2021. “Teaching Soft Skills in Engineering Education: A European Perspective.” IEEE Access 9.
  • Costello, A. B., and J. W. Osborne. 2005. “Best Practices in Exploratory Factor Analysis: Four Recommendations for Getting the Most from Your Analysis.” Practical Assessment, Research & Evaluation 10 (7): 1–9.
  • Crosthwaite, C. 2019. "Engineering Futures 2035: A Scoping Study." http://www.aced.edu.au/downloads/Engineering Futures 2035_Stage 1 report for ACED_May_16_2019.pdf.
  • Dreyfus, S. E., and H. L. Dreyfus. 1986. Mind Over Machine. New York: Free Press.
  • ENAEE. 2021. EUR-ACE: Framework Standards and Guidelines, European Network for Accreditation of Engineering Education. https://www.enaee.eu/wp-content/uploads/2022/03/EAFSG-04112021-English-1-1.pdf.
  • Engineers Australia. 2018. Accreditation Management System: Accreditation Criteria User Guide – Higher Education (AMS-MAN-10).
  • Forman, S. M., and S. F. Freeman. 2013. "The Unwritten Syllabus: It’s not Just Equations—Student Thoughts on Professional Skills." Presented at the 120th ASEE Annual Conference & Exposition, Atlanta, Georgia.
  • Gobet, F., and P. Chassy. 2009. “Expertise and Intuition: A Tale of Three Theories.” Minds and Machines 19 (2): 151–180. https://doi.org/10.1007/s11023-008-9131-5.
  • Graham, R. 2012. "Achieving Excellence in Engineering Education: The Ingredients of Successful Change." The Royal Academy of Engineering (Vol. 101, Issue March). http://epc.ac.uk/wp-content/uploads/2012/08/Ruth-Graham.pdf.
  • Hair, J., W. C. Black, B. J. Babin, and R. E. Anderson. 2010. Multivariate Data Analysis. 7th ed. New Jersey: Pearson Educational International.
  • International Engineering Alliance. 2019. "Washington Accord." http://www.ieagreements.org/accords/washington/.
  • Jackson, D. 2016. “Re-conceptualising Graduate Employability: The Importance of pre-Professional Identity.” Higher Education Research & Development 35 (5): 925–939. https://doi.org/10.1080/07294360.2016.1139551.
  • Kadi, A., and D. Lowe. 2019. Diversity in Student Initial Reactions to a Professional Engagement Program. In Proc of AAEE2019: 30th Annual Conference of the Australasian Association for Engineering Education, University of Southern Queensland, Brisbane.
  • Kemmis, S., C. Edwards-Groves, R. Jakhelln, S. Choy, G.-B. Wärvik, L. Gyllander Torkildsen, and C. Arkenback-Sundström. 2020. “Teaching as Pedagogical Practice. Ch.5.” In Pedagogy, Education and Praxis in Critical Times, edited by K. Mahon, C. Edwards-Groves, S. Francisco, M. Kaukko, S. Kemmis, and K. Petrie, 85–116. Singapore: Springer.
  • King, R. 2008. "Engineers for the Future". http://www.altc.edu.au/carrick/go/home/grants/pid/343.
  • Lave, J., and E. Wenger. 1991. Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.
  • Lowe, D., and A. Kadi. 2019. An Analysis of Possible Predictors of Student Early Engagement with Professional Development Opportunities. Research in Engineering Education Symposium (REES). Cape Town, South Africa: SASEE.
  • Malheiro, Benedita, Pedro Guedes, Manuel F. Silva, and Paulo Ferreira. 2019. “Fostering Professional Competencies in Engineering Undergraduates with EPSISEP.” Education Sciences 9 (2): 119. https://doi.org/10.3390/educsci9020119.
  • Mardia, K. V. 1970. “Measures of Multivariate Skewness and Kurtosis with Applications.” Biometrika 57: 519–530. https://doi.org/10.1093/biomet/57.3.519.
  • National Academy of Engineering. 2004. "The Engineer of 2020: Visions of Engineering in the new Century." National Academy of Engineering. https://www.voced.edu.au/content/ngv:63792.
  • Nguyen, D. Q. 1998. “The Essential Skills and Attributes of an Engineer: A Comparative Study of Academics, Industry Personnel, and Engineering Students.” Global Journal of Engineering Education 2 (1): 65–74.
  • Passow, H. J. 2012. “Which ABET Competencies do Engineering Graduates Find Most Important in Their Work?” Journal of Engineering Education 101 (1): 95–118. https://doi.org/10.1002/j.2168-9830.2012.tb00043.x.
  • Passow, H. J., and C. H. Passow. 2017. “What Competencies Should Undergraduate Engineering Programs Emphasize? A Systematic Review.” Journal of Engineering Education 106 (3): 475–526. https://doi.org/10.1002/jee.20171
  • Pratt, M. G., K. W. Rockmann, and J. B. Kaufmann. 2006. “Constructing Professional Identity: The Role of Work and Identity Learning Cycles in the Customization of Identity among Medical Residents.” Academy of Management Journal 49 (2): 235–262. https://doi.org/10.5465/amj.2006.20786060.
  • Reich, A., D. Rooney, A. Gardner, K. Willey, D. Boud, and T. Fitzgerald. 2015. “Engineers’ Professional Learning: A Practice-Theory Perspective.” European Journal of Engineering Education 40 (4): 366–379. https://doi.org/10.1080/03043797.2014.967181.
  • Ryan, R. M., and E. L. Deci. 2000. “Intrinsic and Extrinsic Motivations: Classic Definitions and new Directions.” Contemporary Educational Psychology 25 (1): 54–67. https://doi.org/10.1006/ceps.1999.1020.
  • Ryan, G., S. Toohey, and C. Hughes. 1996. “The Purpose, Value and Structure of the Practicum in Higher Education: A Literature Review.” Higher Education 31 (3): 355–377. https://doi.org/10.1007/BF00128437.
  • Schatzki, T. 2012. “A Primer on Practices: Theory and Research.” In Practice-Based Education: Perspectives and Strategies, edited by J. Higgs, R. Barnett, S. Billett, M. Hutchings, and F. Trede, 13–26. Rotterdam: Sense Publishers.
  • Scott, G., and K. W. Yates. 2002. “Using Successful Graduates to Improve the Quality of Undergraduate Engineering Programmes.” European Journal of Engineering Education 27 (4): 363–378. https://doi.org/10.1080/03043790210166666.
  • Shuman, L. J., M. Besterfield-Sacre, and J. McGourty. 2005. “The ABET ‘Professional Skills’ – Can They Be Taught? Can They Be Assessed?” Journal of Engineering Education 94 (1): 41–55. https://doi.org/10.1002/j.2168-9830.2005.tb00828.x.
  • Taherdoost, H., S. Sahibuddin, and N. Jalaliyoon. 2022. “Exploratory Factor Analysis; Concepts and Theory.” Advances in Applied and Pure Mathematics 27: 375–382.
  • Trede, F., R. Macklin, and D. Bridges. 2012. “Professional Identity Development: A Review of the Higher Education Literature.” Studies in Higher Education 37 (3): 365–384. https://doi.org/10.1080/03075079.2010.521237.
  • Trevelyan, J. 2013. “Towards a Theoretical Framework for Engineering Practice.” In Engineering Practice in a Global Context, edited by B. Williams, J. Figueiredo, and J. Trevelyan, 33–60. London: CRC Press.
  • UK Engineering Council. 2014. "UK-SPEC: UK Standard for Professional Engineering Competence." https://www.engc.org.uk/standards-guidance/standards/uk-spec/fourth-edition-implemented-from-31-december-2021/.
  • Watkins, M. W. 2018. “Exploratory Factor Analysis: A Guide to Best Practice.” Journal of Black Psychology 44 (3): 219–246. https://doi.org/10.1177/0095798418771807.
  • Winters, K. E., H. M. Matusovich, S. Brunhaver, H. L. Chen, K. Yasuhara, and S. Sheppard. 2013. "From Freshman Engineering Students to Practicing Professionals: Changes in Beliefs About Important Skills Over Time." Presented at the 120th ASEE Annual Conference & Exposition, Atlanta, Georgia.

Appendix

Appendix 1: survey questions

The following is a summary of the questions in the survey. For those questions that ask students to comment on, or rate, various engineering competencies, the following list of competencies is used (the distinction between professional and technical is used for our analysis, and is not made explicit to the survey respondents).

List 1: List of competencies used in Questions

  • Technical competencies

    • Understanding of underlying mathematics and science foundations

    • Technical knowledge associated with your particular field of engineering

    • Ability to clearly define and creatively solve open-ended problems

    • Ability to apply a systematic design approach addressing multiple perspectives

  • Professional competencies

    • Understanding of how other disciplines (including business, law and social sciences) intersect with engineering

    • Skills in communicating in both technical/non-technical and both written/verbal forms

    • Ability to work effectively as a member of a team

    • An understanding of professional/ethical obligations and an ability to manage your own development

Welcome/Consent

Q0.1: I have read the Participant Information Statement, and consent to participate (anonymously) in this study as outlined in the statement: Yes/No

Demographic Information

In this section we will ask a few short questions about your background. This will help us to determine whether students with different backgrounds perceive things differently with regard to engineering capability development.

Q1.1: What is your age? [Dropdown with ranges]

Q1.2: What is your gender? [Dropdown]

Q1.3: Which of the following best describes your current study/work stage? [Dropdown]

Q1.4: In which country are you now studying/working? [Dropdown]

Q1.5: In which country did you spend the largest part of your life prior to starting your engineering studies? [Dropdown]

Q1.6: Approximately how much total time have you spent working in any jobs (including part-time jobs, unskilled work, summer jobs, etc.)? (Give your answer in terms of equivalent full-time work; e.g. if you have worked 1 day per week for 2 years then this would be equivalent to about 5 months full time) [Dropdown]

Q1.7: Approximately how much total time have you spent working in any professional jobs (i.e. roles that would normally expect the person to have a degree)? (Again, give your answer in terms of equivalent full-time work) [Dropdown]

Q1.8: Which engineering discipline are you mainly studying or working in (select the one that is closest)? [Dropdown]

Q1.9: Have you previously studied (for 6 months or more), or are you currently studying in parallel, another degree program separate from your engineering? [Yes/No]

Preliminary views

And now a few questions on your views about the development of competencies. Note that through the rest of this survey we will use the term ‘competency’ to refer to a broad range of skills, capabilities, etc. We will also separate technical and professional competencies. By these terms we mean:

Technical competencies: this refers to those aspects that are based on specialist technical knowledge, grounded in an understanding of maths and science, and which underpin your particular engineering discipline. Examples of these include the ability to calculate the stress on a structural beam, or the required control parameters for a PID motor controller, or knowledge of a particular programming language or modelling software.

Professional competencies: this refers to those broader competencies that support your application of the technical competencies and practice of engineering within a workplace setting. Examples include the ability to project manage an activity, to communicate complex technical concepts to a non-technical audience, or to deal with complexity and uncertainty. These types of competencies are more likely to be common across disciplines – some specifically within Engineering (such as general design or systems engineering skills) and others across a broader range of professions (such as teamwork or creativity).

The following 3 questions each ask separately about the following 2 items:

• Technical competencies

• Professional competencies

For each question, respond using the following scale: [5-point Likert scale: strongly disagree; somewhat disagree; neither agree nor disagree; somewhat agree; strongly agree]

Q2.1: Please indicate whether you agree or disagree that your University does a good job developing each competency type: [2 items, Scale]

Q2.2: Please indicate whether you agree or disagree that you were already capable in each competency type when you commenced your University degree: [2 items, Scale]

Q2.3: Please indicate whether you agree or disagree that each competency type should be a core component of your Engineering degree program: [2 items, Scale]

Importance of competencies

In this set of questions we will ask you about your views on which competencies (from a set of 8) might be the most important.

Q3.1: Looking for your first job

Before you have your first job, you are applying to companies. Potential employers will be likely to judge you, and offer you a job, based on their perception of your level of ability with regard to some or all of these competencies. Can you order them (by dragging them up and down) from most important to least important, according to how important they are in terms of impressing a recruiter and resulting in you being offered a job (i.e. what competencies might you emphasise in your CV)? [Ordering of items from List 1]

Q3.2: Working in your first engineering job

The list below contains the same items as the previous question, but this time you should order them according to how important you feel they will be to you in being able to actually carry out your job during the first year of your job as a new graduate engineer (i.e. which competencies might you actually use the most). [Ordering of items from List 1]

Q3.3: Working as an experienced engineer

And again, the same set of eight competencies. This time, imagine you are now a more experienced engineer, maybe 5–10 years into your career. You have a few younger engineers reporting to you in your team, are responsible for liaising with clients, and have to meet budgets, manage your projects and report to senior management on progress. So this time you should order these competencies according to how important you feel they will be to you in this more senior role. [Ordering of items from List 1]

Developing competencies

OK, now we want to change focus. Instead of considering the importance of each competency, we want to explore how you think that competency might be best developed.

Q4.1: Theory vs practice

Becoming really capable with some competencies may require students to first learn the background theory (e.g. learning how to sketch realistic drawings might be helped by first learning about geometry), whereas other competencies might be developed just through practice without needing to know any formal theory first.

Put the list of competencies below into order (by dragging them up and down), starting at the top with the one that most needs an understanding of formal theory, and ending at the bottom with the one that needs the least formal theory. [Ordering of items from List 1]

Q4.2: Work vs University

Rate each of the following competencies in terms of whether it is easier to learn at University or easier to learn in a work environment.

Rating of each item from List 1 against the following scale: [5 point scale: Much easier at University; A little easier at University; About the same; A little easier at work; Much easier at work]

Competence level

We are almost done. We now want to explore how capable you think you are with respect to each of the eight competencies. For each competency can you judge this in three ways:

For each of the following questions, rate each of the 8 competencies using: [5 point scale: 0 = No capability or no awareness; 1 = Not very competent; 2 = Moderately competent; 3 = Quite competent; 4 = Very competent]

Q5.1: At the time you started your degree, how capable did you think you were? (e.g. When I started my degree I thought I was really quite competent at writing code. Rating = 3)

Q5.2: Now, looking back to the start of the degree, how capable do you think you were at the start of your degree? (e.g. I now realise I was wrong, and that when I started the degree I was actually not very competent at coding at all. Rating = 1)

Q5.3: How capable do you think you are right now? (I've improved, and am now actually moderately competent at coding. Rating = 2)

Nature of learning experience

And finally a few quick questions to wrap it all up. We want to explore a little bit about learning different competencies.

The following 5 questions each ask separately about:

• Technical competencies

• Professional competencies

Q6.1: Rate how interesting you have found each of the following areas of study: [Sliding Scale, from 0 = Not interesting at all to 100 = Extremely interesting]

Q6.2: Rate how difficult you found it to understand the material and concepts associated with each of the following areas of study: [Sliding Scale, from 0 = Extremely easy to 100 = Extremely difficult]

Q6.3: Rate how well taught you have found each of the following areas of study: [Sliding Scale, from 0 = Extremely bad to 100 = Extremely good]

Q6.4: Rate your overall experience of the classes and learning activities that you have done at University and which were aimed at developing each of the following: [Sliding Scale, from 0 = Extremely negative to 100 = Extremely positive]

Q6.5: Rate how much you think you have learnt so far in your engineering degree with respect to each of the following: [Sliding Scale, from 0 = None at all to 100 = A great deal]

Q6.6: With specific regard to where you have developed the professional skills that you do have, can you rate the contribution to your development from different areas? [Sliding Scale, from 0 = None at all to 100 = A great deal; + Not applicable]

• In subjects mainly focused on professional skills

• In subjects mainly focused on technical skills

• In other subjects (e.g. project units)

• During employment

• Somewhere else (personal life, family, volunteering, etc.)

Survey End

Q7.1 And that's it!

Do you have any other comments on the competencies and skills you have (or haven't) learnt at University, or anything else that you think might be helpful to us in understanding your views and experiences? [Open ended response]