Research Article

Influences on U.S. undergraduate engineering students’ perceptions of ethics and social responsibility: findings from a longitudinal study

Pages 88-99 | Received 06 Dec 2021, Accepted 22 Nov 2022, Published online: 08 Dec 2022

ABSTRACT

Engineering students’ views of ethics and social responsibility can be complex, multi-faceted, and influenced by participation in diverse experiences. To explore these influences, we surveyed engineering undergraduates at four U.S. universities to understand how their perceptions of ethics and social responsibility changed over time and whether changes were related to participation in curricular and co-curricular experiences. Students were surveyed three times: during the first, fifth, and eighth semesters of their undergraduate studies. We analyse the responses of students (n = 226) who responded to all three surveys. We report results from five measures used in the survey: Fundamentals of Engineering/Situational Judgement, Ethical Climate Index, Justice Beliefs, Political and Social Involvement Scale, and Moral Disengagement. Analysis used two-way mixed ANOVA to identify changes over time on these measures, including whether changes were influenced by self-reported participation in certain experiences (internships, service-learning, ethics instruction, etc.). When we compared groups of students – those who did and did not participate in various experiences – we saw no interaction effects for most measures. We hypothesise this reflects a pattern of self-selection into experiences. Our findings suggest the difficulty of developing impactful ethics interventions, given that students arrive at university with pre-existing knowledge and perceptions about ethics and morality.

1. Introduction

According to ABET accreditation guidelines, undergraduate degree programs are expected to cultivate ethical and professional responsibilities among engineering students (ABET, Inc Citation2018). Yet in their systematic review of engineering ethics interventions, Hess and Fore (Citation2018) observed that many engineering programs do not include ‘an explicit focus on students’ ethical development’ (p. 552). Leading stakeholders like the National Academy of Engineering have begun identifying best practices and exemplary programs as ‘a resource for those who seek to improve the ethical development of engineers at their own institutions’ (National Academy of Engineering Citation2016). While such efforts are encouraging, there remains limited evidence about how engineering students develop ethical and professional responsibilities through participation in various experiences and interventions (Hess and Fore Citation2018).

To better understand what U.S. students learn about ethics and social responsibility, and with the intent of identifying practices to improve engineering ethics education, we conducted a longitudinal study of engineering undergraduates enrolled in four U.S. universities. This research question guided our analysis:

How do foundational measures and understandings of social and ethical responsibility change during a four-year engineering degree program in relation to specific learning experiences?

We present results addressing students’ responses to relevant measures, considering whether they participated in a variety of curricular and co-curricular experiences (study abroad, internships, undergraduate research, etc.) over the course of four years.

2. Literature review

Our work is informed by several well-known and widely accepted theories of moral development. In general, the development of morality is a process that becomes both ‘more complex and more decentred’ as people age and are exposed to or participate in more experiences (Vozzola and Senland Citation2022, p. 121). These experiences can be informal (e.g. family and friends) or formal (e.g. religion or school). Colby et al.’s (Citation1983) extensive, 20-year longitudinal study showed that the participants they studied (n = 58 boys/men) moved through Kohlberg’s stages of moral development over time (with only 4% regressing to a lower stage and no participants skipping over a stage). In addition to age, changes in moral judgement were also related to the subjects’ eventual education level, intelligence (as measured by IQ at the initiation of the study), and socioeconomic status. Further, these researchers found that development at each stage is enhanced (i.e. becomes more consistent) with ‘environmental support’ (Colby et al. Citation1983, p. 102).

Other researchers have studied the impacts of various environmental supports in engineering education, including curricular interventions like ethics courses or exposure to individual case studies. For example, Self and Ellison’s (Citation1998) study of two student cohorts found that their moral reasoning abilities significantly improved following a course on ethical and social issues within engineering, as measured by the Defining Issues Test (DIT). In contrast, Drake et al. (Citation2005) found that exposure to a brief module, or even a full semester ethics course, did not improve engineering students’ moral reasoning skills as measured by the Defining Issues Test-2 (DIT-2). Loui (Citation2006) also used the DIT-2 to study the effects of engineering case studies on students’ moral reasoning. He found that viewing a case study, followed by class discussion, led to an increase in DIT-2 scores. A later study found that students who completed an engineering ethics course showed increased awareness of responsibility, and confidence in their decisions, when faced with two engineering-related case studies (Hashemian and Loui Citation2010).

Additionally, Finelli et al.’s (Citation2012) work suggests that both formal curricular experiences and co-curricular experiences influence engineering students’ ethical development. Their survey of 3,914 students attending 18 U.S. institutions found 76% of respondents participated in at least one engineering-related co-curricular activity (e.g. engineering design competitions) and 68% participated in non-engineering related co-curricular activities (e.g. political organisation). They propose such experiences could help develop critical thinking abilities about ethics. Qualitative research by Burt et al. (Citation2013) found students who participate in co-curricular experiences reported having a broader perspective on ethics and their roles as engineers. Researchers in other fields have found that undergraduate students completing internships in accounting (Brown-Liburd and Porco Citation2011), recreation management (Craig and Oja Citation2013), and education (Oja, Graham, and Andrew Citation2011) all showed increases in moral judgement as measured by the DIT-2 instrument.

Others have attempted to measure changes in students’ perceptions of ethics and social responsibility following participation in co-curricular activities. Johnston, Caswell, and Armitrage (Citation2007) surveyed engineering students who participated in a three-week Engineers Without Borders (EWB) project; nearly 70% indicated improved social and environmental awareness. Bielefeldt and Canney’s (Citation2016) study of 448 engineering students over eighteen months found community service activities (e.g. Engineers Without Borders) improved attitudes towards social responsibility. Knight et al. (Citation2016) surveyed 918 U.S. engineering students about learning outcomes associated with their involvement in co-curricular activities, finding that engineering professional societies, honour societies, engineering service activities, and undergraduate research were settings where students reported learning about societal impacts of engineering. Bielefeldt et al. (Citation2018) found about one-quarter of 1,118 surveyed U.S. engineering students indicated learning about ethics and social impacts through engineering service groups or professional societies; these results were substantiated by qualitative work by Rulifson and Bielefeldt (Citation2018). Polmear, Chau, and Simmons (Citation2020) found that undergraduate participation in employment, sports, and design competitions was positively correlated with ethical standards and outcomes (e.g. social engagement, cross-cultural awareness), thus highlighting the value of co-curricular activities in developing understanding of ethics and social responsibility. Outside of engineering education, Brandenberger and Bowman (Citation2015) similarly found that participation in college experiences such as service-learning and study abroad led to students having ‘higher levels of prosocial orientation’ (p. 339).

This accumulation of evidence suggests that students’ participation in curricular interventions and co-curricular activities may influence attitudes and knowledge about ethics and social responsibility. We sought to further quantify the relationship between engineering undergraduate students’ participation in diverse experiences and understanding of ethics and social responsibility.

3. Methods

We report quantitative data collected from a longitudinal, mixed-methods study of undergraduate engineering students at four U.S. universities. Surveys were administered at three time points (Figure 1). Students were eligible to complete the initial survey if they were age 18 years or older, first-semester undergraduates, and enrolled full-time in an engineering or technology major at Arizona State University (ASU), Brigham Young University (BYU), Colorado School of Mines (Mines), or Purdue University. The four universities represent a variety of institution types (public and private, varying sizes, geographic regions, levels of research activity, etc.).

Figure 1. Quantitative data collection overview.

3.1. Longitudinal data collection

In fall 2015, 757 students responded to the first survey (Zoltowski et al. Citation2016). These students were contacted again in the fall of their junior year (Fall 2017), with 319 students completing this mid-point survey (Howland et al. Citation2018). The final survey was distributed in Spring 2019. Eligibility criteria for the final survey included completion of at least six semesters of study in, or graduation from, an engineering or technology major at one of the participating universities. A total of 284 eligible students responded to the final survey, with 226 students responding to all three surveys. The responses of those 226 students are the basis for the reported results. Students received gift cards for completing each survey ($5 for the first survey and $10 for subsequent surveys) and median completion time was 26 minutes.

3.2. Survey instrument

The survey comprised eight sections measuring various aspects of students’ perceptions of ethics and social responsibility. We report the changes over time seen on the five validated measures used (71 items in total), in relation to the students’ participation in various experiences. These measures include a set of Fundamentals of Engineering/Situational Judgement items, the Ethical Climate Index, Justice Beliefs, the Political and Social Involvement Scale, and Moral Disengagement. These measures cover a wide range of constructs related to ethics and social responsibility and have previously published evidence of valid results. Overall changes on these measures are reported in Howland et al. (Citationn. d.).

Our paper explores how scores on these instruments relate to participation in various activities. Students were also asked about participation in seventeen types of experiences that could impact views of social and ethical responsibility. These experiences were selected as they represent a broad spectrum of activities that undergraduate students may elect to participate in. These experiences included:

  1. Volunteer regularly (1+ time per month for 6 months or longer)

  2. Mission or volunteering trip (any location)

  3. Work or internship in a non-profit organisation

  4. Work or internship in a government organisation

  5. Work or internship in an engineering-related organisation

  6. Travel to or living in a developing country/region

  7. Student government (e.g. serving on a university-wide undergraduate council or a similar representative organisation for students at their institution)

  8. Service-learning course

  9. Formal religious instruction (e.g. seminary, religiously-affiliated school, etc.)

  10. Formal ethics instruction (e.g. a workshop, course, etc.)

  11. Honours program (the honours program at Colorado School of Mines has an emphasis on ethics)

  12. Extracurricular organisation like Engineers Without Borders or similar (student-led chapter of an engineering-focused service organisation based at the student’s university)

  13. Engineering professional society (e.g. IEEE, SWE)

  14. Undergraduate research experience

  15. Fraternity or sorority

  16. Study abroad (any duration)

  17. Grand Challenges Scholars Program (a curricular/co-curricular program sponsored by the U.S. National Academy of Engineering designed to prepare students to address the complex challenges facing today’s society)

3.3. Measures

The Fundamentals of Engineering/Situational Judgement measure comprises eight multiple-choice items scored as correct or incorrect. This scale assesses students’ knowledge of ethics and how to approach ethical dilemmas. Five items are similar to items on the Fundamentals of Engineering (FE) exam and three items ask students how they would respond to typical ethical dilemmas. These three scenarios were adapted from a previous project and reviewed by multiple domain experts to establish content and face validity (Jesiek, Buswell, and Zhu Citation2018).

The Ethical Climate Index is a 19-item measure of workplace ethical climate adapted to align with a university context (Arnaud Citation2010; Cronbach’s alpha 0.907 [2015] and 0.921 [2019]). Students responded to items on a seven-point scale from ‘completely false’ to ‘completely true’ with higher scores indicating students perceive their university climate to be more ethical.

The Justice Beliefs measure includes two subscales, Justice for Others and Justice for Self, each of which has four items measured on a seven-point scale from ‘strongly disagree’ to ‘strongly agree’ (Lucas, Zhdanova, and Alexander Citation2011). These two subscales measure respondents’ beliefs about distributive justice, defined as ‘evaluations of the fairness of outcomes, allocations, or distribution of resources’ (Lucas, Zhdanova, and Alexander Citation2011, p. 15) in relation to both other people (Justice for Others; Cronbach’s alpha 0.900 [2015] and 0.929 [2019]) and themselves (Justice for Self; Cronbach’s alpha 0.885 [2015] and 0.879 [2019]).

The Political and Social Involvement Scale (PSIS) asks students to rate the importance of twelve social and political issues (e.g. ‘enhancing racial understanding’) on a four-point scale from ‘not important’ to ‘essential’ (Blaich and Wise Citation2011). Responses of ‘very important’ or ‘essential’ were scored as 1 and ‘somewhat important’ and ‘not important’ were scored as 0 (Spinosa et al. Citation2008).
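The dichotomous scoring rule for the PSIS described above can be sketched as follows. This is an illustrative reconstruction; the item wording and exact response labels used in the survey are assumptions.

```python
# Four-point importance scale used by the PSIS (labels assumed from the text above).
RESPONSES = ["not important", "somewhat important", "very important", "essential"]

def score_psis_item(response: str) -> int:
    """Dichotomise one PSIS response: top two categories score 1, the rest 0."""
    if response not in RESPONSES:
        raise ValueError(f"unknown response: {response!r}")
    return 1 if response in ("very important", "essential") else 0

def psis_total(responses: list[str]) -> int:
    """Total PSIS score: count of the twelve issues rated in the top two categories."""
    return sum(score_psis_item(r) for r in responses)
```

For example, a student rating three issues ‘essential’ and the remaining nine ‘not important’ would receive a total score of 3.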

The Moral Disengagement measure includes twenty-four items that measure the propensity of respondents to act in ways that disregard relevant moral considerations (Detert, Trevino, and Sweitzer Citation2008; Cronbach’s alpha 0.883 [2015] and 0.889 [2019]). Students responded on a five-point scale from ‘strongly disagree’ to ‘strongly agree’ with a higher score indicating that a student is more accepting of moral disengagement.
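The Cronbach’s alpha values reported for the measures above follow the standard internal-consistency formula: alpha = k/(k − 1) × (1 − Σ item variances / variance of totals). A minimal stand-alone computation, using made-up data rather than the study’s responses:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of k lists, each holding one item's scores for all respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's scale total
    item_var_sum = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

Two perfectly correlated items yield alpha = 1.0; uncorrelated items drive alpha towards zero.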

We did not use the Defining Issues Test-2 (DIT-2), despite its frequent use in ethics research, due to its cost and length. We designed our survey to use a variety of complementary measures and including the DIT-2 would have required elimination of other relevant measures.

3.4. Demographic information

Response rates for the three surveys are shown in Table 1. BYU had a higher attrition rate than the other three universities because many of its students complete a lengthy service mission (18 to 24 months long) while enrolled. Thus, some of the initial BYU respondents did not meet the eligibility criteria to participate in subsequent surveys. Demographic details for respondents to all three surveys (n = 226) are provided in Table 2. These 226 respondents are not entirely representative of U.S. students enrolled in undergraduate engineering programs. Nationally, 26.3% of undergraduate engineering students are female; in our research, 34.3% of our respondents were female. White students compose 60.6% of undergraduate U.S. engineering students, but 74.8% of our respondents were white. International students represented 4.9% of our respondents but 9.9% of undergraduate engineering students enrolled in the U.S. (Roy Citation2019). Of our 226 respondents, 11 indicated they had not participated in any co-curricular experiences; on average, respondents reported participating in 5 or 6 co-curricular experiences (mean = 5.2, median = 5, mode = 6; Table 3).

Table 1. Number of responses to each survey.

Table 2. Demographic information for 226 students who completed all three surveys.

Table 3. Number of experiences in which the 226 respondents participated.

3.5. Analysis

Students’ responses to the 71 items on the final survey were matched to their responses on the two previous surveys using each university’s approved Institutional Review Board (IRB) protocol. In the U.S. setting, an IRB is a group within a university that ensures that research on human subjects is conducted in an ethical manner. We generally only detail statistically significant results (i.e. p ≤ 0.05) though there are instances where a lack of change in one group (p > .05) may be discussed to highlight changes in another group.

In the analysis, student survey respondents were classified into groups of participants and non-participants for the seventeen experiences (meaning ultimately there are 34 groups as each experience has both participants and non-participants). Using independent sample t-tests, we determined that there was no difference in age between the groups of participants and non-participants for each of the 17 experiences. Participants were defined as students who indicated on the mid-point survey and/or final survey that they had participated in the experience within the last two years. Non-participants were defined as respondents who at no point (mid-point and final survey) indicated that they participated in these experiences. Because the group defined as participants included students who participated at any point during their four years of undergraduate studies, only scores from the initial and final surveys are examined here. The mid-point survey, then, is only used to determine if the respondent had or had not participated in any of these experiences and scores from that survey are not used in the analyses presented here (see Howland et al. (Citation2018) and Howland et al. (Citationn. d.) for additional information about those results).
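The classification rule above can be expressed compactly. This is an illustrative sketch, not the study’s analysis code; the data representation (sets of reported experiences per survey) is an assumption.

```python
def classify(mid_survey: set[str], final_survey: set[str], experience: str) -> str:
    """Classify a student for one experience: 'participant' if the experience was
    reported on the mid-point survey and/or the final survey, else 'non-participant'."""
    reported = experience in mid_survey or experience in final_survey
    return "participant" if reported else "non-participant"
```

For example, a student who reported an internship only on the final survey is still counted as a participant for that experience.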

The intent of the analyses was to determine if changes in students’ scores over time were related to participation in co-curricular and curricular activities. We analysed how participation in these activities at any point during their four years of college was related to their responses to various measures on the final survey.

In each case, participants and non-participants were compared using a two-way mixed ANOVA looking for an interaction effect of group by time, which detects if one group (i.e. participants) changed on a measure differently compared to another group (i.e. non-participants). When an interaction was found, simple main effects were analysed and reported.
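The group-by-time interaction the mixed ANOVA tests corresponds to a difference of differences across the four cell means. The actual analysis used an F-test; this sketch, with made-up scores, only computes the contrast that the interaction term reflects.

```python
def mean(xs):
    return sum(xs) / len(xs)

def interaction_contrast(p_initial, p_final, np_initial, np_final):
    """(change over time for participants) minus (change for non-participants).
    A value near zero means both groups changed similarly (no interaction)."""
    return (mean(p_final) - mean(p_initial)) - (mean(np_final) - mean(np_initial))
```

For instance, if participants’ scores are flat while non-participants improve by 0.5 points, the contrast is −0.5, the kind of pattern an interaction effect would flag.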

Due to the small sample size, no comparisons were made for two experiences, student government (n = 15) and Grand Challenges Scholars Program (n = 5). Also, analyses showed that participation in study abroad was not associated with differences between participants and non-participants. We discuss the fourteen remaining experiences with statistically significant differences between participants and non-participants.

4. Results

For four of the measures (Fundamentals of Engineering/Situational Judgement [FESJ], Ethical Climate Index [ECI], Justice Beliefs, and Moral Disengagement), we saw interaction effects when comparing the scores of students who participated in three activities to those of non-participants (Table 4). These three activities were participation in an honours program, service-learning course(s), or a fraternity/sorority. An interaction effect describes when groups of students (participants and non-participants) had different changes in a measure over time. For example, all students could have started with similar scores on the initial survey, but by the final survey participants scored higher than non-participants (or vice versa). Interaction effects were found only for select experiences and measures. No interaction effects were found on the Political and Social Involvement Scale for any of the 17 experiences – a result we discuss later.

Table 4. Measures with statistically significant interactions between participants and non-participants.

For these analyses, Table 4 presents the mean for each relevant scale from each survey (initial and final), followed by its standard error. These means are reported for participants (P) and non-participants (NP) in the experiences where this interaction effect was noted.

4.1. Honours program

Honours program participants, when compared to non-participants, had a statistically significant interaction by group and time (F(1,217) = 4.489, p = 0.035, partial eta squared = 0.020) on the FESJ (Fundamentals of Engineering/Situational Judgement) measure. Simple main effects showed that participants initially scored higher on the FESJ measure than non-participants (F(1,221) = 17.115, p < 0.001, partial eta squared = 0.072). However, on the final survey, participants scored similarly on the FESJ compared to non-participants (F(1,220) = 2.436, p = 0.120, partial eta squared = 0.011).

Only students who did not participate in honours programs had a statistically significant increase in scores on the FESJ measure, with an average increase of 0.5 points (F(1,145) = 13.132, p < 0.001, partial eta squared = 0.083). This indicates honours students started with relatively higher scores on this measure but did not change over time, while non-honours students scored lower initially and increased over time (eventually rising to the same level as honours students).
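The partial eta squared effect sizes reported throughout this section can be recovered from the F statistic and its degrees of freedom, since partial eta squared = (F × df1) / (F × df1 + df2). A quick check against the honours-program statistics above:

```python
def partial_eta_squared(f: float, df1: int, df2: int) -> float:
    """Partial eta squared recovered from an F statistic and its degrees of freedom."""
    return (f * df1) / (f * df1 + df2)

# The interaction F(1, 217) = 4.489 reported above corresponds to
# partial eta squared of approximately 0.020, matching the reported value.
```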

4.2. Fraternity or sorority

For the FESJ measure, there was a statistically significant interaction between fraternity/sorority participants and non-participants (F(1,217) = 5.593, p = 0.019, partial eta squared = 0.025). Looking at simple main effects showed that initially, participants scored similarly on the FESJ measure to non-participants (F(1,221) = 0.079, p = 0.779). On the final survey, participants scored lower on the FESJ measure than non-participants (F(1,220) = 8.701, p = 0.004, partial eta squared = 0.038). Thus both groups of students scored similarly initially but non-participants scored higher on the final survey.

Only fraternity/sorority non-participants had a statistically significant increase in scores on the FESJ measure over time. Participants did not change over time (F(1,33) = 0.708, p = 0.406). Non-participants saw a statistically significant average increase of 0.5 points between the initial and final surveys (F(1,184) = 14.644, p < 0.001, partial eta squared = 0.081). This indicates that non-participants improved their FESJ scores while participants’ scores did not change over time.

There was also a statistically significant interaction between groups (fraternity/sorority participants compared to non-participants) and time on the Justice for Self scale (F(1,220) = 4.562, p = 0.034, partial eta squared = 0.020). Therefore, we analysed simple main effects. Fraternity/sorority participants initially scored higher on the Justice for Self subscale than non-participants (F(1,224) = 5.097, p = 0.025, partial eta squared = 0.022). On the final survey, fraternity/sorority participants did not score significantly different on the Justice for Self subscale than non-participants (F(1,220) = 0.157, p = 0.692).

Fraternity/sorority participants had a statistically significant average decrease of 2.4 points between the initial and final surveys on the Justice for Self scale (F(1,34) = 10.015, p = 0.003, partial eta squared = 0.228) while non-participants did not change over time (F(1,186) = 3.591, p = 0.060).

4.3. Service-learning course

For the Moral Disengagement measure, there was a statistically significant interaction between groups (service-learning participants and non-participants) and time (F(1,218) = 4.355, p = 0.038, partial eta squared = 0.020). Simple main effects showed that service-learning participants scored similarly on the Moral Disengagement measure to non-participants (F(1,224) = 0.004, p = 0.949) on the initial survey. On the final survey, service-learning course participants scored lower on the Moral Disengagement measure than non-participants (F(1,218) = 5.962, p = 0.015, partial eta squared = 0.027). This indicates that both groups scored similarly on the initial survey, but scores diverged on the final survey. This divergence is seen in the statistically significant effect of time on Moral Disengagement (F(1,50) = 4.004, p = 0.050, partial eta squared = 0.075). The change in scores over time was statistically significant for the service-learning participants (decrease of 2.7 points, p = 0.050) but not for the non-participants (increase of 0.8 points, p = 0.357). This finding is unusual because the Moral Disengagement scores, overall, were stable over time (Kim, Jesiek, and Howland Citation2021b; Howland, et al. Citationn. d.), yet participation in service-learning led to a measurable decrease in students’ reported willingness to morally disengage.

4.4. Experiences with consistent differences between participants and non-participants

There were also measures and experiences where differences between participants and non-participants existed but those differences were consistent over time (i.e. no interaction effects). Table 5 illustrates these differences between participants and non-participants using an ‘overall mean’, which is the mean across both time points for experiences with no interaction effects. This difference between participants and non-participants is particularly pronounced for the PSIS, and is further discussed in the Discussion section. The Justice Beliefs measure is not included in Table 5 as there were no differences between participants and non-participants, other than the interaction effect reported above.

Table 5. Measures with statistically significant differences between groups of participants and non-participants (differences between groups are consistent over time).

4.5. Summary

Our results reflect interesting contrasts when comparing groups of participants and non-participants in various activities over time. Performance on the Fundamentals of Engineering/Situational Judgement measure (FESJ) among honours program participants did not change, as they already scored relatively high, but non-participants’ scores improved over time. Conversely, fraternity/sorority participants’ FESJ scores did not change, while non-participants’ scores improved over time, leaving participants with relatively lower final scores. Fraternity/sorority participants also showed a decrease in Justice for Self subscale scores, while non-participants saw no changes over time. Finally, students who participated in service-learning decreased their scores on the Moral Disengagement measure but non-participants’ scores were unchanged over time. However, in all cases, the effect sizes were small, indicating the reported changes may be statistically significant but not practically significant. We saw no interaction effects for the Political and Social Involvement Scale (PSIS) or the other ten experiences.

5. Discussion

For most of the experiences that we investigated, there were no interaction effects for participants and non-participants over time. This means that for most of the activities where differences between participants and non-participants existed, these two groups were already different at the start of their undergraduate studies (Table 5), this difference was not linked to participation in a particular experience, and the difference was maintained across the study’s duration. This was surprising, as this study aimed to specifically investigate experiences hypothesised to impact students’ views of ethics and social responsibility. We find little evidence that participation in such experiences altered students’ views in ways that differed from any changes in non-participants, at least according to the deployed survey measures. We posit three reasons for these results.

5.1. Methodological differences

Variation between our results and prior work could be due to methodological differences. Much of the prior work investigating specific experiences took place on the timescale of a single course or shorter (Drake et al. Citation2005; Loui Citation2005, Citation2006; Clancy, Quinn, and Miller Citation2005). Such research might detect short-term impacts but not whether such impacts persist over longer time periods. Other work compared impacts across student populations (e.g. first-years vs. seniors) rather than tracking individual students (Shuman et al. Citation2004; Harding et al. Citation2004). For example, Finelli et al.’s (Citation2012) large-scale study similarly investigated students’ curricular and co-curricular experiences and their connection to the students’ ethical development. The researchers surveyed cross-sections of first-year through senior engineering students at 18 U.S. institutions about the nature of their experiences. However, by not tracking individual students longitudinally, such studies risk conflating the impacts of the interventions with other factors.

One exception can be found in the work of Bielefeldt and Canney (Citation2016), whose longitudinal study explored social responsibility attitudes among civil, mechanical, and environmental engineering students over 1.5 years and in relation to participation in a variety of experiences. They found that most students (57%) did not change significantly in their social responsibility attitudes. The remaining participants were split roughly evenly between those whose scores increased and decreased over time (20% and 23%, respectively). Students whose scores increased over time were more likely to cite engineering courses as contributing to their views of social responsibility compared to those whose scores decreased over time (Bielefeldt and Canney Citation2016). Their research joins a growing body of work reporting mixed results on the central question of this paper.

In light of this prior work, one possible explanation for the lack of interaction in our study between students’ scores on the survey instruments and the majority of experiences in which they participated is that any changes in students’ views and perceptions were not retained in the years following their participation. Short-term effects may have been present, but did not persist. Research in the business and medical education fields has similarly shown impacts of ethical interventions are often short-lived (Weber Citation1990; Beagan Citation2003).

5.2. Self-selection

Another possible explanation for our results is that students self-select into the activities we investigated. This is exemplified by our results from the Political and Social Involvement Scale (PSIS; Table 5), where we consistently see differences in PSIS scores between participating and non-participating students (but no interaction effects). As one example, students who participate in service-learning courses score higher, across time, on the PSIS than students who do not. It is perhaps unsurprising that students who have high scores on the PSIS (i.e. students who rated volunteering, improving their communities, etc. as ‘essential’ or ‘very important’) actually participate in those experiences. We believe results such as this are indications of students self-selecting into experiences that affirm pre-existing attitudes and values, rather than being changed by participation in these activities.

Self-selection factors have been hypothesised in other studies of engineering ethics and social responsibility. Jesiek et al. (Citation2013) suggested that students who opt into global service-learning experiences may have stronger orientations towards political and social involvement (measured using the PSIS) compared to their peers. Similarly, James (Citation2006) notes the potential for self-selection to bias research results within business ethics. The students in our study may reflect two types of self-selection: (a) those who elect to participate in these activities may be different, prior to participation, from students who elect not to participate (as discussed above), and (b) the types of students who are willing to participate in a study about ethics may be different from those who choose not to participate in the study.

Additionally, our results show that students who reported participating in service-learning scored higher on the Fundamentals of Engineering/Situational Judgement (FESJ) measure across time. Considering the lack of interaction, it is possible that students who pursued service-learning as college students had already pursued such an experience prior to their entrance to college, and thus had pre-existing knowledge tied to ethical judgement and/or already had an interest in political and social issues. Alternatively, it is possible that such pre-existing knowledge led them to pursue service-learning as college students and to score higher on the FESJ and PSIS measures.

As another example of potential evidence for self-selection, we can examine the data from participants in a fraternity/sorority. Participation in a fraternity/sorority is the one activity we investigated whose participants performed consistently worse on multiple measures (higher Moral Disengagement scores and lower Ethical Climate Index and Political and Social Involvement Scale scores) compared to non-participants. Our data seem to suggest that students who perceive their university environments as less ethical, place less value on social involvement, and are more willing to morally disengage also tend to be more interested in participating in a fraternity or sorority.

5.3. Certain experiences do lead to change

It is worth noting, however, that there was evidence of interaction effects on select measures for three experiences, suggesting that these experiences do lead to measurable changes in perceptions of ethics and social responsibility. A set of interaction effects was found for students who reported participating in honours programs during their undergraduate studies. Students who entered honours programs started college with higher Fundamentals of Engineering/Situational Judgement (FESJ) scores, and these scores remained high over time. Students who did not participate in an honours program showed a significant increase on the FESJ measure. It seems that participation in an honours program may have shown no effect because participating students already scored highly on the FESJ measure compared to non-participants.

A second set of interaction effects concerned students who participated in a fraternity or sorority during college. Participants and non-participants initially had similar scores on the FESJ, and participants had consistent scores over time. Those who did not participate in a fraternity or sorority showed an increase in their scores over time. It seems that participating in fraternities/sororities lessened the positive increases that non-participants experienced on the FESJ measure over time. We also found an interaction effect for these students on the Justice for Self subscale, where participants had a bigger decrease on this measure than non-participants. The experience of being in a fraternity or sorority may have led students to be less confident that they would experience justice in their own lives.

Perhaps the most interesting interaction is the one for students who participated in service-learning courses. These students, on average, started with similar scores on the Moral Disengagement measure as non-participants but only the participant group saw a decrease in their scores over time, meaning they were less morally disengaged by the final survey. Because we saw no other changes over time for other experiences or for the students overall on this measure, this is an unusual finding. There may be some unique aspects of service-learning courses that reduce moral disengagement among participating students.

Finally, we note that there were no statistically significant differences between the initial and final outcomes of any measure on our survey for the respondents who did and did not participate in study abroad. This was unexpected because study abroad has often been framed as particularly impactful for students’ perceptions of ethics and social responsibility (Luo and Jamieson-Drake Citation2015).

5.4. Future work

Data analysis is ongoing, including continued work on the interviews conducted in the students’ first and final years of college. These interviews have provided additional nuance beyond what is captured by these measures, particularly regarding the influence of co-ops and internships on students’ understanding of ethics and social responsibility (Claussen et al. Citation2021a; Kim, Howland, and Jesiek Citation2021a). Finally, a new phase of the project will track these survey respondents as they begin professional work (Claussen et al. Citation2021b), motivated in part by the mixed and sometimes contradictory results from previous research investigating how engineers’ views of ethics and social responsibility are impacted by the school-to-work transition. For example, Cech found that the decrease in engineering students’ commitment to public welfare that occurs over their undergraduate studies fails to rebound when they enter professional practice (Cech Citation2014), while other large-scale studies of engineering alumni and professionals suggest that ethics remains important for engineers across career stages (Lattuca et al. Citation2014; Trevelyan Citation2014). As part of our new project, we have administered a fourth survey in the early years of our longitudinal participants’ careers to understand how their views change or stabilise during the transition to professional work.

5.5. Limitations

This study does have limitations, including those related to the demographic characteristics of the respondents. As noted above, students who identified as white, female, and domestic (U.S.) students were over-represented. As such, these results should not be considered generalisable to undergraduate engineering students throughout the United States or elsewhere. Our research approach similarly has limitations. It was exploratory in nature and sought to identify co-curricular experiences that had an influence on engineering students’ perceptions of ethics and social responsibility, as measured by the instruments we selected. As discussed above, this research is continuing, and we look forward to results from additional surveys as well as from the qualitative analyses that are also ongoing.

6. Conclusion

Significant investments have been directed towards curricular interventions and other programming with the goal of bolstering social and ethical commitments of students in engineering and other STEM fields (Baligar et al. Citation2018; Balakrishnan, Tochinai, and Kanemitsu Citation2017; Gorur, Hoon, and Kowal Citation2020; Safatly et al. Citation2020; You and Lee Citation2011). Yet the main story we find in our data is one of self-selection. Most students seem to opt into experiences aligned with their pre-existing values and commitments. While this does not rule out the possibility of incremental or even transformative growth for some students, we were not able to detect much in the way of aggregate impacts.

If these findings prove valid and replicable, they point towards three main implications. First, most types of experiences (e.g. taking a single ethics course, participating in a service-learning experience) are probably not substantial enough to have measurable, long-term impacts on most students. Educational interventions aiming to significantly shift student perceptions of ethics and social responsibility may require a longer duration, potentially cutting across multiple years and many courses as suggested by an ethics across the curriculum approach (e.g. Englehardt and Pritchard Citation2018). Second, more attention should be paid to recruitment to identify and reduce barriers to participation for students who might benefit the most from interventions – but are also least likely to participate. Requiring certain kinds of learning experiences might also prove impactful. Third, efforts to improve recruitment into, and enhance the impacts of, such interventions should avoid a one-size-fits-all approach, including recognising the wealth of pre-existing perspectives and experiences that students possess when they arrive at university. This is especially true for required curricular experiences, where a failure to meet students ‘where they are’ may lead to subpar or even regressive learning outcomes.

Nonetheless, the general lack of evidence in this study regarding whether and how specific types of experiences impacted students’ perceptions of ethics and social responsibility does not imply that students’ perceptions are unchanged by participation. Indeed, students have reported the importance of some co-curricular activities (Rulifson and Bielefeldt Citation2019; Bielefeldt et al. Citation2018). What these results do suggest is that changing students’ perceptions may be difficult without coordinated, concerted long-term effort and outreach. Martin, Conlon, and Bowe (Citation2021) noted that engineering ethics education is a ‘complex and multi-layered system’ (p. 3), and such efforts to provoke changes in students’ perceptions of ethics and social responsibility will need to occur at all levels, ranging from individual students and instructors to institutions, policies (including those issued by accrediting bodies), and even the ‘wider cultural milieu’ (p. 3).

It may also be the case that such changes are difficult to detect using existing quantitative measures, suggesting the need for more sensitive survey instruments and/or further qualitative investigations by engineering ethics education researchers. Engineering ethics instructors could be encouraged to use existing standardised assessment instruments, as some research has shown that many are unfamiliar with these instruments (Bielefeldt and Canney Citation2016). In addition, many existing instruments are lengthy (e.g. the DIT-2), which may discourage instructors from using them in their courses. Amidst growing expectations that engineering schools cultivate these attitudes in their students, increased collaborative efforts within and across courses, curricula, and co-curricular activities could yield changes that are both quantitatively and qualitatively measurable. Further, provoking such changes may require more intensive and widespread efforts to weave ethics into the fabric of engineering education.

Acknowledgments

This material is based upon work supported by the National Science Foundation under Grants 1449370, 1449470, 1449479, 2024304, and 2024301. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We would also like to acknowledge the contributions and assistance of Dayoung Kim, Gregg Warnick, Debra Fuentes, and Randall Davies.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the National Science Foundation [1449479]; National Science Foundation [1449470]; National Science Foundation [1449370]; National Science Foundation [2024301]; National Science Foundation [2024304].

References

  • ABET, Inc. 2018. Criteria for Accrediting Engineering Programs: Effective for Reviews During the 2019–2020 Accreditation Cycle. ABET, Inc. https://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2019-2020/
  • Arnaud, A. 2010. “Conceptualizing and Measuring Ethical Work Climate: Development and Validation of the Ethical Climate Index.” Business and Society 49 (2): 345–358. doi:10.1177/0007650310362865.
  • Balakrishnan, B., F. Tochinai, and H. Kanemitsu. 2017. “Engineering Ethics Education: A Case Study in Japan and Malaysia.” 2017 IEEE Global Engineering Education Conference (EDUCON), 128–131. doi:10.1109/EDUCON.2017.7942836.
  • Baligar, P., S. Kavale, M. Kaushik, G. Joshi, and A. Shettar. 2018. Engineering Exploration: A Collaborative Experience of Designing and Evolving a Freshman Course. 2018 World Engineering Education Forum - Global Engineering Deans Council, WEEF-GEDC 2018. doi:10.1109/WEEF-GEDC.2018.8629768.
  • Beagan, B. 2003. “Teaching Social and Cultural Awareness to Medical Students: ‘It’s All Very Nice to Talk About It in Theory, but Ultimately It Makes No Difference.’” Academic Medicine 78 (6): 605–614. doi:10.1097/00001888-200306000-00011.
  • Bielefeldt, A. R., and N. E. Canney. 2016. “Changes in the Social Responsibility Attitudes of Engineering Students Over Time.” Science and Engineering Ethics 22 (5): 1535–1551. doi:10.1007/s11948-015-9706-5.
  • Bielefeldt, A. R., M. Polmear, D. Knight, C. Swan, and N. Canney. 2018. “Education of Electrical Engineering Students About Ethics and Societal Impacts in Courses and Co-Curricular Activities.” 2018 IEEE Frontiers in Education Conference (FIE). doi:10.1109/FIE.2018.8658888.
  • Blaich, C., and K. Wise. 2011. “The Wabash National Study—the Impact of Teaching Practices and Institutional Conditions on Student Growth.” American Educational Research Association Annual Meeting, New Orleans, LA, April 8–12, 2011. http://www.liberalarts.wabash.edu/storage/Wabash-Study-Student-Growth_Blaich-Wise_AERA-2011.pdf
  • Brown-Liburd, H. L., and B. M. Porco. 2011. “It’s What’s Outside That Counts: Do Extracurricular Experiences Affect the Cognitive Moral Development of Undergraduate Accounting Students?” Issues in Accounting Education 26 (2): 439–454.
  • Burt, B. A., D. D. Carpenter, M. Holsapple, C. J. Finelli, and R. M. Bielby. 2013. “Out-Of-Classroom Experiences: Bridging the Disconnect Between the Classroom, the Engineering Workforce, and Ethical Development.” International Journal of Engineering Education 29 (3): 714–725.
  • Cech, E. A. 2014. “Culture of Disengagement in Engineering Education?” Science, Technology, & Human Values 39 (1): 42–72. doi:10.1177/0162243913504305.
  • Clancy, E. A., P. Quinn, and J. E. Miller. 2005. “Assessment of a Case Study Laboratory to Increase Awareness of Ethical Issues in Engineering.” IEEE Transactions on Education 48 (2): 313–317. doi:10.1109/TE.2004.842900.
  • Claussen, S., S. J. Howland, S. Nittala, C. B. Zoltowski, and B. K. Jesiek. 2021a. “Longitudinal Qualitative Case Study of One Student’s Perceptions of Ethics and Social Responsibility: Corvin’s Story.” Paper presented at the ASEE Annual Conference and Exposition, July 26-29, 2021.
  • Claussen, S., B. K. Jesiek, C. B. Zoltowski, and S. J. Howland. 2021b. “Early Career Engineers’ Views of Ethics and Social Responsibility: Study Overview.” Paper presented at the ASEE Annual Conference and Exposition, July 26-29, 2021.
  • Colby, A., L. Kohlberg, J. Gibbs, M. Lieberman, K. Fischer, and H. D. Saltzstein. 1983. “A Longitudinal Study of Moral Judgment.” Monographs of the Society for Research in Child Development 48 (1/2): 1–124.
  • Craig, P. J., and S. N. Oja. 2013. “Moral Judgement Changes Among Undergraduates in a Capstone Internship Experience.” Journal of Moral Education 42 (1): 43–70. doi:10.1080/03057240.2012.677603.
  • Detert, J. R., L. K. Trevino, and V. L. Sweitzer. 2008. “Moral Disengagement in Ethical Decision Making: A Study of Antecedents and Outcomes.” The Journal of Applied Psychology 93 (2): 374–391. doi:10.1037/0021-9010.93.2.374.
  • Drake, M., P. M. Griffin, R. Kirkman, and J. L. Swann. 2005. “Engineering Ethical Curricula: Assessment and Comparison of Two Approaches.” Journal of Engineering Education 94 (2): 223–231. doi:10.1002/j.2168-9830.2005.tb00843.x.
  • Englehardt, E. E., and M. S. Pritchard, eds. 2018. Ethics Across the Curriculum—pedagogical Perspectives. Springer.
  • Finelli, C. J., M. A. Holsapple, E. Ra, R. M. Bielby, B. A. Burt, D. D. Carpenter, T. S. Harding, and J. A. Sutkus. 2012. “An Assessment of Engineering Students’ Curricular and Co-Curricular Experiences and Their Ethical Development.” Journal of Engineering Education 101 (3): 469–494.
  • Gorur, R., L. Hoon, and E. Kowal. 2020. “Computer Science Ethics Education in Australia - a Work in Progress.” Proceedings of 2020 IEEE International Conference on Teaching, Assessment, and Learning for Engineering, TALE 2020, 945–947. doi:10.1109/TALE48869.2020.9368375.
  • Harding, T. S., D. D. Carpenter, C. J. Finelli, and H. J. Passow. 2004. “Does Academic Dishonesty Relate to Unethical Behavior in Professional Practice? An Exploratory Study.” Science and Engineering Ethics 10 (2): 311–324.
  • Hashemian, G., and M. C. Loui. 2010. “Can Instruction in Engineering Ethics Change Students’ Feelings About Professional Responsibility?” Science and Engineering Ethics 16 (1): 201–215. doi:10.1007/s11948-010-9195-5.
  • Hess, J. L., and G. Fore. 2018. “A Systematic Literature Review of US Engineering Ethics Interventions.” Science and Engineering Ethics 24 (2): 551–583. doi:10.1007/s11948-017-9910-6.
  • Howland, S. M. J., B. K. Jesiek, S. A. Claussen, and C. Z. Zoltowski. n.d. “Measures of Ethics and Social Responsibility Among Undergraduate Engineering Students: Findings from a Longitudinal Study.” Science and Engineering Ethics (revised paper under review).
  • Howland, S. M. J., G. M. Warnick, C. Z. Zoltowski, B. K. Jesiek, and R. Davies. 2018. “A Longitudinal Study of Social and Ethical Responsibility Among Undergraduate Engineering Students: Comparing Baseline and Mid-Point Survey Results.” Proceedings of the ASEE Annual Conference and Exposition, Salt Lake City, UT, June 24–27, 2018.
  • James, H. S., Jr. 2006. “Self-Selection Bias in Business Ethics Research.” Business Ethics Quarterly 16 (4): 559–577. doi:10.5840/beq200616449.
  • Jesiek, B. K., N. T. Buswell, and Q. Zhu. 2018. “Global Engineering Competency: Assessment Tools and Training Strategies.” Proceedings of the 2018 ASEE Annual Conference and Exposition, Salt Lake City, UT, June 24–27, 2018.
  • Jesiek, B. K., A. Dare, J. Thompson, and T. Forin. 2013. “Global Engineering Design Symposium: Engaging the Sociocultural Dimensions of Engineering Problem Solving.” Proceedings of the 2013 ASEE Annual Conference and Exposition, Atlanta, GA, June 23–26, 2013.
  • Johnston, C. R., D. J. Caswell, and G. M. Armitrage. 2007. “Developing Environmental Awareness in Engineers Through Engineers Without Borders and Sustainable Design Projects.” The International Journal of Environmental Studies 64 (4): 501–506.
  • Kim, D., S. J. Howland, and B. K. Jesiek. 2021a. “Encountering Engineering Ethics in the Workplace: Stories from the Trenches.” Paper presented at the ASEE Annual Conference and Exposition, July 26-29, 2021.
  • Kim, D., B. K. Jesiek, and S. J. Howland. 2021b. “Longitudinal Investigation of Moral Disengagement Among Undergraduate Engineering Students: Findings from a Mixed-Methods Study.” Ethics & Behavior 32 (8): 691–713. doi:10.1080/10508422.2021.1958330.
  • Knight, D. W., A. R. Bielefeldt, N. E. Canney, and C. Swan. 2016. “Macroethics Instruction in Co-Curricular Settings: The Development and Results of a National Survey.” 2016 IEEE Frontiers in Education Conference (FIE). doi:10.1109/FIE.2016.7757437.
  • Lattuca, L., P. Terenzini, D. Knight, and H. K. Ro. 2014. 2020 Vision: Progress in Preparing the Engineer of the Future. Ann Arbor, MI: University of Michigan, School of Education. https://deepblue.lib.umich.edu/handle/2027.42/107462
  • Loui, M. 2005. “Ethics and the Development of Professional Identities in Engineering Students.” Journal of Engineering Education 94 (4): 383–390.
  • Loui, M. 2006. “Assessment of an Engineering Ethics Video: Incident at Morales.” Journal of Engineering Education 95 (1): 85–91. doi:10.1002/j.2168-9830.2006.tb00879.x.
  • Lucas, T. S., L. Zhdanova, and S. Alexander. 2011. “Procedural and Distributive Justice Beliefs for Self and Others: Assessment of a Four-Factor Individual Differences Model.” Journal of Individual Differences 32 (1): 14–25. doi:10.1027/1614-0001/a000032.
  • Luo, J., and D. Jamieson-Drake. 2015. “Predictors of Study Abroad Intent, Participation, and College Outcomes.” Research in Higher Education 56 (1): 29–56. doi:10.1007/s11162-014-9338-7.
  • Martin, D. A., E. Conlon, and B. Bowe. 2021. “A Multi‑level Review of Engineering Ethics Education: Towards a Socio‑technical Orientation of Engineering Education for Ethics.” Science and Engineering Ethics 27 (5): 60. doi:10.1007/s11948-021-00333-6.
  • National Academy of Engineering. 2016. Infusing Ethics into the Development of Engineers: Exemplary Education Activities and Programs. The National Academies Press. doi:10.17226/21889.
  • Oja, S. N., S. E. Graham, and M. D. Andrew. 2011. “Change in Moral Judgment of Teaching Interns in a Full-Year Internship.” Journal of Research in Character Education 9 (1): 35–56.
  • Polmear, M., A. D. Chau, and D. R. Simmons. 2020. “Ethics as an Outcome of Out-Of-Class Engagement Across Diverse Groups of Engineering Students.” Australasian Journal of Engineering Education 16 (1): 64–76.
  • Roy, J. 2019. Engineering by the Numbers. American Society for Engineering Education. https://ira.asee.org/wp-content/uploads/2019/07/2018-Engineering-by-Numbers-Engineering-Statistics-UPDATED-15-July-2019.pdf
  • Rulifson, G., and A. Bielefeldt. 2018. “Influence of Internships on Engineering Students’ Attitudes About Socially Responsible Engineering.” 2018 IEEE Frontiers in Education Conference (FIE). doi:10.1109/FIE.2018.8658647.
  • Rulifson, G., and A. Bielefeldt. 2019. “Evolution of Students’ Varied Conceptualizations About Socially Responsible Engineering: A Four Year Longitudinal Study.” Science & Engineering Ethics 25: 939–974.
  • Safatly, L., M. Itani, I. Srour, and A. El-Hajj. 2020. “A Comprehensive Overview of Approaches to Teaching Ethics in a University Setting.” Journal of Civil Engineering Education 146 (2). doi:10.1061/(ASCE)EI.2643-9115.0000009.
  • Self, D. J., and E. M. Ellison. 1998. “Teaching Engineering Ethics: Assessment of Its Influence on Moral Reasoning Skills.” Journal of Engineering Education 87 (1): 29–34.
  • Shuman, L. J., M. F. Sindelar, M. Besterfield-Sacre, H. Wolfe, R. L. Pinkus, Miller B. Olds, and C. Mitcham. 2004. “Can Our Students Recognize and Resolve Ethical Dilemmas?” Proceedings of the 2004 ASEE Annual Conference and Exposition, Salt Lake City, UT, June 20–23, 2004.
  • Spinosa, H., J. Sharkness, J. H. Pryor, and A. Liu. 2008. Findings from the 2007 Administration of the College Senior Survey (CSS): National Aggregates. Higher Education Research Institute. May. https://www.heri.ucla.edu/PDFs/CSS_2007%20Report.pdf
  • Trevelyan, J. 2014. The Making of an Expert Engineer. Leiden, The Netherlands: CRC Press.
  • Vozzola, E. C., and A. K. Senland. 2022. Moral Development: Theory and Applications (2nd Ed.). New York: Routledge.
  • Weber, J. 1990. “Measuring the Impact of Teaching Ethics to Future Managers: A Review, Assessment, and Recommendations.” Journal of Business Ethics 9 (3): 183–190.
  • You, M., and Y. Lee. 2011. “Design Ethics Education in Taiwan: A Study of Syllabi of Ethics-Related Courses.” Lecture Notes in Computer Science 6775: 594–603.
  • Zoltowski, C. B., B. K. Jesiek, S. A. Claussen, and D. H. Torres. 2016. “Foundations of Social and Ethical Responsibility Among Undergraduate Engineering Students: Project Overview.” Proceedings of the 2016 ASEE Annual Conference and Exposition, New Orleans, LA, June 26–29, 2016.