Research Article

Building capacity in engineering education research through collaborative secondary data analysis

Marie C. Paretti, Jennifer M. Case, Lisa Benson, David A. Delaine, Shawn Jordan, Rachel L. Kajfez, Susan M. Lord, Holly M. Matusovich, E. Tyler Young & Yevgeniya V. Zastavker
Pages 8-16 | Received 30 Jul 2022, Accepted 11 Jan 2023, Published online: 23 May 2023

ABSTRACT

This paper proposes the use of collaborative secondary data analysis (SDA) as a tool for building capacity in engineering education research. We first characterise the value of collaborative SDA as a tool to help emerging researchers develop skills in qualitative data analysis. We then describe an ongoing collaboration that involves a series of workshops as well as two pilot projects that seek to develop and test frameworks and practices for SDA in engineering education research. We identify emerging benefits and practical challenges associated with implementing SDA as a capacity building tool, and conclude with a discussion of future work.

1. Introduction

Building capacity in engineering education research (EER) occurs on multiple levels, from national and institutional infrastructures to development of individual researchers. This paper addresses the individual level, focusing on developing emerging researchers’ capacity in qualitative methods. The need for such work surfaced early in the emergence of EER as a field. For example, in their study of skilled technical researchers who participated in the 2005 U.S.-based Rigorous Research in Engineering Education (RREE) workshops, Borrego (2007) identified multiple challenges facing participants, including developing research questions, applying theoretical frameworks to research design, operationalising constructs in data collection, and, most relevant for this paper, implementing qualitative methods.

Similar observations have surfaced around the globe. For example, Jawitz, Case, and Marshall (2009) described their journeys as South Africans building EER capacity through studies on diversity and on student experiences. They showed the need to move beyond positivist epistemological approaches familiar in engineering towards interpretive and critical methodologies from the social sciences; their account foregrounded challenges associated with making sense of qualitative data through varying theoretical lenses. More recently, Dart, Trad, and Blackmore’s (2021) study of participants in the Australasian Association for Engineering Education’s Winter School, designed to introduce researchers to EER, identified qualitative data collection and analysis as needed skills central to their development. Similarly, Gardner and Willey (2018) found that intellectual engagement with qualitative research formed a component of identity transformation among a group of Australasian technical engineering faculty who transitioned to EER.

Today, while the global growth of EER graduate programs provides one mechanism for new researchers at some institutions to address these challenges, others still have limited options. In this paper, we argue that secondary data analysis (SDA) offers one powerful but mostly overlooked means to address this gap. SDA can help build capacity in qualitative research by lowering the time and cost barriers associated with data collection while allowing emerging researchers to grapple with the messy complexity of qualitative data. Simultaneously, SDA collaborations can enable the original researchers to more fully explore their data in new ways. Moreover, when collaborations cross institutional and national boundaries, they can both build global capacity and spur needed intra- and inter-national comparisons.

The global challenge of capacity building stems partly from cross-national variations in the evolution of EER. The education of engineers has been a focus for scholarly debate since the late 1800s (Case 2017), but it was in the late 1900s that EER as a distinctive research field began to coalesce. In the U.S., significant funding from the National Science Foundation (NSF) for EER was a key driver, launching major efforts to build capacity. The NSF-funded RREE workshops provided a pathway for engineering educators to develop skills in engineering education research (Streveler and Smith 2006). The Journal of Engineering Education, already nearly a century old, explicitly aligned itself with systematic empirical research, publishing articles that proposed tenets for EER (Radcliffe 2006; “The Research Agenda for the New Discipline of Engineering Education” 2006; Streveler and Smith 2006). Similarly, the Center for the Advancement of Engineering Education (CAEE) published its report identifying future research directions (Atman et al. 2010). Simultaneously, U.S. universities began establishing PhD programs in EER (Benson et al. 2010).

Globally, EER has a longer, if somewhat scattered, history (Borrego and Bernhard 2011). For example, while contexts like Australia, New Zealand, and South Africa have limited dedicated research funding available compared to the U.S., EER in these countries arose in response to government imperatives for engineering education reform. These imperatives were taken up by universities in various ways, leading to distinctive career patterns for EER academics located in engineering schools that are strongly linked to impacting student experiences (Klassen et al. 2023; Kumar et al. 2021). In contrast, China has seen dramatic expansion of engineering education programs led by state priorities, and EER has largely been established by policy-oriented scholars based in education schools (Cao et al. 2021; Klassen et al. 2023).

Globally, then, various national imperatives and supports have helped drive the growth of EER. This growth has been accompanied by an expansion of journals and conferences; the Research in Engineering Education Network (REEN) lists 17 general engineering education journals and 11 discipline-specific journals (https://reen.co/eer-journals/), while the Engineering Education List Wiki (http://engineeringeducationlist.pbworks.com/) includes additional EER journals and more general education journals that publish EER. EER conferences are held around the globe, including those sponsored by Australasian, European, and American engineering education professional societies (AAEE, SEFI, and ASEE, respectively) as well as the International Conference on Engineering Education and Innovation, the Research in Engineering Education Symposium, the Annual Colloquium on International Engineering Education, and others.

Despite this growth, opportunities for learning EER remain limited. This challenge is especially acute for countries and institutions with limited or no dedicated research funding and/or few if any formal programs to train researchers. The growth of EER doctoral programs such as those in the U.S., Sweden, Denmark, Malaysia, South Africa, Spain, India, and elsewhere provides one mechanism for individuals with both access to and time for graduate work. But undergraduates and engineering educators who seek to complement or replace their technical engineering research with EER have fewer opportunities. In the U.S., two NSF funding programs support capacity building for those with PhDs by pairing new researchers with experienced scholars who serve as mentors; one of those is limited to new researchers without experience in education or social science research, while the other is more broadly constructed. But the two-year time frame for these grants, along with the need to engage with an experienced EER mentor, still poses barriers, and access is limited to those eligible for NSF funding. The Winter and Summer Schools sponsored by the Australasian Association for Engineering Education (https://aaee.net.au/winter-summer-school/) provide another key global opportunity, with a focus on learning to design robust research studies. Similar workshops are held in conjunction with professional society conferences such as ASEE and SEFI. However, the time constraints of these programs limit their ability to build capacity in data collection and analysis, and participation is again limited to those with both time and funding.

As a result, the question remains: How can new researchers gain skills not only in research design, but in data collection and analysis, to support their development as EER scholars?

2. Secondary data analysis as a tool for capacity building

In response to this question, we posit that secondary data analysis (SDA) represents a significant untapped opportunity for helping new researchers develop skills in qualitative methods. While existing textbooks and workshops help new scholars learn the basics of research design, data collection and data analysis – especially in qualitative approaches – are messy processes that require sustained, engaged practice with real data and all its richness, ambiguities, and limitations. SDA, we suggest, can provide new researchers with access to robust, complex data sets that are well-suited to sustained analysis, while also yielding insights into data collection through interview and focus group transcripts, field notes, observations, and/or videos.

2.1. An expanding vision of SDA

To situate our discussion, we first unpack our own journey. As scholars working in the U.S., we initially grounded our interest in SDA in the wealth of data collected with NSF funding. Over the past decade, NSF’s Engineering Education and Centers (EEC) division alone has funded over 500 projects, representing $150 M USD, with most projects collecting new data. In qualitative research in particular, researchers can rarely fully mine the resulting rich data sets before moving to the next project (Johri, Vorvoreanu, and Madhavan 2016). Moreover, across prominent EER journals, few studies explicitly identify SDA as a method, suggesting that its use is not widespread.

With funding from NSF (ironically), authors Paretti, Case, and Matusovich (along with collaborators Joachim Walther and Nicola Sochacka) undertook a project to shift this paradigm by developing a framework for SDA that could expand its value and use in EER. The project brought together a group of U.S. researchers both to discuss SDA in a series of workshops and to implement it through pilot projects (Case et al. 2022). The core team conducted a systematic review of publications to identify NSF-funded qualitative research studies with high potential for SDA (i.e. those with particularly rich data sets) that represented a range of contexts and populations. We then invited a diverse (in terms of gender, race, EER experience, and institutional characteristics) group of study authors to participate in the workshop series. While all of the researchers involved currently work in the U.S., we represent a range of institutional contexts and include international scholars with experiences of EER in other countries.

As we convened, our vision of SDA grew substantially. We began by thinking of SDA as data collected by one set of researchers for one purpose being subsequently analysed by different researchers asking new questions. But by the end of our first workshop, we recognised that SDA also includes the same researchers returning to their data later with the same or different questions, as well as researchers merging data sets across projects. Equally important, we began to conceptualise data not simply as an artefact of data collection, but as a product itself that could be intentionally designed for subsequent sharing and analysis, either publicly through data repositories or with other researchers upon request.

Finally, and most relevant for this discussion, we expanded our understanding of the value of SDA, from simply more fully mining the data to using SDA to train new researchers by providing access to data they may otherwise lack the expertise, time, and/or funding to collect. This use of SDA is especially important for scholars interested in learning qualitative methods because these methods often require significant investments of time for data collection, funding for participant compensation, and time or funding for transcription and de-identification – investments that easily put such work out of reach. While education-related quantitative data sets are often available publicly through government agencies and from within institutions, we were unable to identify any such public qualitative EER data sets. A search in 2022 of data repositories such as the U.S. government’s Data.gov, the Qualitative Data Repository (https://qdr.syr.edu/), the European Union’s data repository (https://data.europa.eu), and the Inter-university Consortium for Political and Social Research (ICPSR - https://www.icpsr.umich.edu/web/pages/) for data related to engineering education yielded only a few data sets addressing secondary school mathematics rather than EER. In many cases, ‘engineering education’ was not even a recognised keyword phrase, and varying combinations of terms and search strategies produced no results. Similarly, we were unable to identify published studies in EER journals that explicitly reflected SDA collaborations in which the researcher who collected the data shared it with a researcher who was not part of the original study.

The current landscape, then, suggests that new EER researchers who want to build capacity in qualitative methods must collect new data – either as doctoral students in EER-related graduate programs or independently. Re-examining our assumptions about and approaches to data collection and data sharing could simultaneously allow experienced researchers to make fuller use of rich data sets that represent significant investments of time and money, and generate collaborations that build capacity in the field globally. In addition, using SDA to build capacity across boundaries can advance comparative global research (Jesiek, Borrego, and Beddoes 2010); data sharing across projects, institutions, and countries could advance the key imperatives noted earlier that undergird the growth of EER (Borrego and Bernhard 2011).

2.2. Current SDA practices within and beyond EER

SDA, though not common in EER, is not new; it has long been discussed across social science fields (refer to Walther, Sochacka, and Pawley 2016 for a brief review of this work). Within EER, Advances in Engineering Education devoted a special issue in 2016 to data sharing that highlighted both opportunities and challenges associated with SDA in qualitative research (Johri, Vorvoreanu, and Madhavan 2016). In terms of data collection for SDA, the work of the international Design Thinking Research Symposium described by Adams, Radcliffe, and Fosmire (2016) has long modelled the process of intentionally enabling diverse scholars to bring their own questions, tools, and perspectives to a shared data set. Regarding comparative work, Trevelyan (2016) argued persuasively for the need to intentionally develop and combine qualitative data sets from different researchers to better understand engineering work globally. More recently, moves towards radically transparent research, including participant ownership and publication of transcripts, are also beginning to offer new ways of making qualitative data public (e.g. Chua 2012; Mazzurco 2016).

Such approaches are still nascent, however, and despite increasing scholarly advocacy for data sharing and increasing mandates from funding agencies globally, few qualitative EER researchers seem to be designing data sets for such access. For qualitative researchers in particular, concerns about epistemological fidelity, informed consent, data ownership, participant confidentiality, and related ethical issues pose significant barriers (Johri et al. 2016; Walther, Sochacka, and Pawley 2016). Johri, Yang, et al.’s (2016) study of the perceptions of data sharing among engineering education researchers found that while many respondents supported the idea of data sharing, these concerns all emerged as potential barriers. Moreover, access to data alone is not sufficient to build capacity because while repositories can provide experienced researchers with rich data sets, they cannot teach emerging researchers how to use them. Similarly, repositories do not readily capture the implicit knowledge embedded in data collection that is needed for informed data analysis.

In light of these barriers, Walther, Sochacka, and Pawley (2016) identified key considerations for sharing qualitative data in EER. Building on the framework for interpretive research quality (Walther, Sochacka, and Kellam 2013), they focus on communicative validation – that is, the social construction of meanings from the data in ways that are attuned to contextual considerations, participants’ accounts of their experiences, and the conventions of the research community. Using example cases, they highlight the ways in which the study contexts, the researchers’ backgrounds, and the emergent nature of qualitative data collection all shape the resulting data in ways that make public data sharing challenging from both ethical and quality perspectives.

3. SDA in practice

The complexities articulated by Walther, Sochacka, and Pawley (2016), however, also make collaborative SDA valuable for building capacity. We acknowledge the difficulty of developing metadata about the context, participants, and researcher knowledge and positionality that is sufficiently rich to ethically support posting many qualitative EER data sets to public repositories. However, we posit that collaborative SDA, in which new researchers work with one or more of the original researchers, simultaneously helps new researchers learn and helps the original researchers more fully explore how their own framing, positionality, and social realities influenced both the making and the handling of the data. As Walther, Sochacka, and Kellam (2013) note, deep attention to context and processes is essential to research quality; robust engagement around these issues in SDA can help new researchers learn to attend to research quality in a sustained way even as it invites experienced researchers to make their implicit practices and assumptions increasingly explicit.

To illustrate these issues, we turn to two pilot projects that have emerged from our SDA work. At the second workshop, we invited participants to submit ideas for small projects, supported through our grant, to advance our understanding of SDA broadly. The two resulting projects exemplify, in different ways, the potential to use SDA for capacity building. The first involves developing undergraduate researchers, while the second involves developing a graduate researcher.

3.1. Project 1: SDA to train undergraduate researchers at an undergraduate institution

The first project is a collaboration between an EER scholar at a research-intensive institution (author Kajfez), where both external funding and EER graduate advising are supported and rewarded, and an EER scholar at a teaching-focused institution (author Zastavker) who supports the development of undergraduate researchers with no previous background or formal training in EER. Without either time and funding for research or a ready pool of graduate students to help collect and analyse data, scholars in contexts such as Zastavker’s often work with undergraduate researchers. But since EER is not an undergraduate subject, and undergraduates have far fewer research hours available than graduate students, training these emerging scholars is often time-intensive, with limited return on investment in data collection or analysis. In this case, the SDA collaboration offered a rapid on-ramp (five weeks) for two new engineering undergraduates to gain qualitative analysis skills in EER.

Kajfez oversaw the original data collection through an externally-funded research project designed to understand how engineering identities and student communities develop from the first year through to graduation. The data are interviews with over 35 engineering students; some completed three interviews throughout their undergraduate experience, while others only completed one or two. The full data set includes 77 interviews representing approximately 60 hours of data collection (for which participants were compensated), followed by hours of transcription and cleaning to remove identifying information, reflecting the often high cost of qualitative research in both time and money. While the project achieved its original goals (Faber et al. 2021; Kajfez et al. 2021), the richness of the data meant many emerging issues were underexplored. These underexplored issues in turn created an opportunity to train undergraduate researchers while also more deeply investigating engineering student experiences in ways that benefitted the whole research team.

To enable the collaboration, both researchers worked with their Institutional Review Boards (IRBs). Kajfez maintained data control, access, and ownership; Zastavker and her students were enrolled into the existing study as external collaborators, working with de-identified transcripts only. But because data alone, even with written explanations, were not sufficient, Kajfez periodically met with Zastavker and her research team. As Zastavker mentored her students through data analysis, they maintained a ‘parking lot’ of questions for Kajfez regarding both the context and the content of the interviews. Kajfez could then negotiate the tensions between providing enough rich context and ethically maintaining the intended confidentiality. In addition, Kajfez helped create boundary conditions for the new team, enabling them to remain focused on issues within the broad scope of the original study that could be addressed with the existing data. Kajfez thus became an important mentor in helping the students understand both data collection and data analysis, though at a lower time commitment than the original study. Importantly, the resulting publication(s) will be co-authored by the full team: Kajfez, Zastavker, and the two undergraduates.

In addition, Kajfez and Zastavker had the two undergraduates continually reflect on their learning and on the process of engaging with this data. In this case, the undergraduates were reading transcripts from participants who are effectively peers – undergraduates at other institutions experiencing engineering programs. However, the contexts were extremely different (e.g. large research-focused institution versus small private institution). This component added a second layer to the training as it allowed these emerging researchers to explore their own positionality in complex ways, and to reflect on who they are as ethical and empathetic scholars, engineers, and individuals. Engaging engineering students in EER through SDA focused on the undergraduate experience thus had a secondary effect: encountering the experiences of peers at other institutions through qualitative data analysis supported development of the undergraduate researchers’ own identities as learners, engineers, and scholars. This process opened doors for them to question their positionality and role in creating and participating in their own communities.

At the same time, this cross-context work allowed Kajfez and Zastavker to explore comparative research questions even without comparative data sets. Their collaboration has elicited new insights into the original data (manuscripts in process) as well as questions such as: How do specific contexts set up learning cultures differently and how does learning about those cultures allow for personal and professional growth of those studying them? What can we learn from students’ learning through their engagement in SDA? How does participation in this type of SDA research support engineering undergraduates’ development into more empathetic and ethical global citizens and engineers?

This SDA project thus illustrates multiple layers of capacity building. By enabling the relatively rapid training of inexperienced undergraduate engineering students, the project provided Zastavker with a trained research team for her own projects that would have otherwise been less accessible. At the same time, it enabled Kajfez to see the data through fresh perspectives and explore new issues in ways that raised questions about the role of context that might otherwise have remained invisible. And finally, it allowed both the engineering students and the senior researchers to identify new questions related to the impacts of engaging engineering students in EER.

3.2. Project 2: SDA in a doctoral dissertation to bring a new lens to existing data

The second project, a collaboration between EER scholars at two research-intensive institutions, addresses capacity building at the graduate level. In this case, author Jordan (a non-Indigenous scholar) and his research team had conducted interviews with engineering professionals who were members of the Navajo Nation to develop culturally-relevant engineering design curricula for Navajo middle school students (Jordan et al. 2019). Author Young, a graduate student advised by author Delaine, is an emerging scholar whose interests centre on the experiences of Indigenous peoples in engineering in the U.S., but who faced challenges in developing research capacity in this area. Neither Delaine nor Young identify as Indigenous, and while Delaine’s research focuses on historically marginalised individuals, his expertise does not include Indigenous populations. At the same time, policies governing the sovereignty of Indigenous nations in the U.S., coupled with a history of abusive research practices, mean that researchers generally must obtain approvals from both tribal and university review boards when potential participants live within the geographic boundaries of an Indigenous nation. Approvals from tribal review boards appropriately involve significant relationship- and trust-building, but that process easily exceeds the typical dissertation timeframe. In this case, then, SDA did not simply enable Young to ‘do a dissertation’ involving Indigenous peoples, but also enabled Delaine and Young to learn ethical and responsible research practices in this area by engaging with an existing rich, robust, ethically collected data set.

Project 2 differs from Project 1 in that its SDA involved asking questions that fell outside the scope of the original research. The data include transcripts of interviews with 20 Navajo engineers about how they experienced, understood, and applied engineering design and practice in the context of their culture and community; the project goal was to develop culturally relevant middle-school curricula. Young’s interests, in contrast, concern how participants’ understandings of tribal sovereignty and their identities as tribal citizens mediate their perception of engineering and their academic or work pursuits. This divergence meant that the team first had to determine whether the data could support this new analysis, which involved detailed joint discussions about the proposed focus, as well as reviews of and reflections on the data by the original researcher. But the researchers also considered whether the data should be used for this new analysis. SDA can honour participants’ time by making more complete use of existing data rather than continually inviting people to re-answer versions of the same questions. But re-using data from historically marginalised communities requires close attention to the interests of the community in light of past actions (e.g. the case of Henrietta Lacks (Skloot 2010)). Thus, before involving either IRBs or participants, the team asked whether the new analysis would produce outcomes aligned with the benefits offered by the original study and whether consenting to the SDA would be consistent with participants’ original reasons for agreeing to interviews. Re-use, that is, must benefit the community and the participants, not simply the researchers.

Once the team determined that SDA was both feasible and ethically appropriate, they worked with both universities’ IRBs. Like Kajfez, Jordan retained control of and access to the original data, and Delaine and Young were given access only to de-identified transcripts. However, given policies related to Indigenous peoples and tribal sovereignty, Jordan’s university also required that he obtain new consent from the participants for SDA. Because these participants were living outside the Navajo Nation, the team did not technically need to also obtain approval from the nation, but they still considered whether they were ethically bound to do so. Given that the participants were adults and would be re-consented individually, they opted only for the reconsent. Fourteen of the original 20 participants provided this consent, and their reconsent helped confirm the research team’s assessment of the project’s value. That is, though the team made an initial decision regarding SDA, the participants had final say. Such practices are particularly important given critiques from Indigenous scholars about the ways non-Indigenous researchers impose themselves upon communities, extracting value only for the benefit of the researcher (Deloria 1969; Tuhiwai Smith 1999).

Importantly, the months of negotiation needed to obtain all approvals themselves provided significant capacity building. The rich discussions among the research team, paired with both detailed documentation of the process and (as in Project 1) ongoing reflections by Young, helped Young move from an academic understanding of ethical research with Indigenous peoples, gleaned through training modules, to a grounded experience of practice. Equally important, this capacity building occurred without additional undue burden on the Navajo participants. An existing researcher functioned as an ally, training the new researcher rather than placing the onus on the Navajo Nation to again educate those outside their community.

As in Project 1, Jordan also met periodically with Delaine and Young, both to provide context not available from the transcripts or other project documentation and to continually help ensure that the SDA remained ethically aligned with the needs and expectations of the Navajo Nation and the individual participants. Such bounding, though also part of Project 1, takes on an additional layer of importance when dealing with data from marginalised peoples. Here, too, Jordan has benefited from his time investment through further exploration of the data and emerging co-authored publications (in process).

Capacity building in this project, then, focused less on learning data analysis (though such learning has certainly happened), and more on the processes of conducting research with participants from marginalised communities. Using SDA to build such capacity is, as suggested above, crucial in light of both the need to better understand the experiences of marginalised peoples in engineering and the potential burden involved in asking members of these groups to again and again educate new researchers in appropriate, ethical, equitable practice. SDA enabled Delaine and Young, both outside the Indigenous community, to gain expertise more rapidly and at far less cost to potential participants in ways that would otherwise have been impossible. (Though beyond the scope of this paper, the team has since added a fourth researcher, an Indigenous person who recently completed an undergraduate engineering degree and is now interested in learning EER, creating an additional capacity building opportunity while simultaneously bringing an Indigenous voice into the team directly.)

4. SDA as capacity building: insights from practice

These two examples suggest several recommendations for SDA both in general and in the context of capacity building in particular.

First, broadly, neither original research study was designed with SDA in mind, leading to extended negotiations with university review boards. Ideally, researchers could plan for SDA prior to data collection, first carefully considering whether the planned data could and should be available for SDA and, then, as appropriate, defining the project scope and subsequent documentation (consent forms, participant information sheets, etc.) with potential SDA work in mind. Participants should be aware of how their data both will and may be used, as well as when and why they might be re-contacted. As exemplified by Project 2, such considerations are of heightened importance for research with marginalised communities, and researchers must balance the power of SDA as a learning opportunity with the risks and benefits to the community. Given the variation in human subjects review within and across national boundaries, there may be no single ‘best practice’ or ‘standard form’ researchers around the world can use, but ethically, such considerations should be at the forefront of any SDA work.

Second, in our capacity-building SDA projects, engagement with the original researcher(s) was essential in helping emerging scholars learn the complexity and nuances of data collection and analysis in context. While experienced researchers may be able to work effectively with well-documented published qualitative data sets (though our field does not yet have effective standards for such documentation), the emerging researchers in these projects needed periodic dialogue with the original researchers to understand how the data was collected and constructed, and how those processes shape subsequent analyses. Qualitative data is shaped by tacit knowledge; as a result, emerging researchers – even graduate students trained in EER – benefit when that tacit knowledge becomes explicit. Such engagement certainly mitigates researchers’ ethical concerns about how their data might be (mis)used in SDA, but equally importantly, in these cases, even with experienced EER scholars Zastavker and Delaine as local mentors, the emerging researchers learned extensively from the original researchers. This engagement, of course, increases the costs of SDA for the original researcher, making mutual benefits key.

This second point also undergirds questions of trust between researchers. While ‘trust’ can cover a range of issues, these projects highlight two: 1) the need to trust the new researcher to bring the same level of respect not just to the data, but to the participants whom they have not met, and 2) the need to trust the new researchers with the potential flaws or gaps in the data – metaphorically, SDA is like having a houseguest who sees all the dust under the furniture. Qualitative data is often highly personal for participants and researchers, and researchers who share the data are unavoidably making themselves and their work vulnerable.

Third, in capacity building SDA, reflective practice for those learning EER is key. Beyond the memoing qualitative researchers typically engage in during data analysis, both projects established structured reflection guidelines about the learning process. These guidelines enabled the emerging scholars to identify questions about the research process and about themselves as researchers that elicited learning moments throughout the team discussions. Moreover, for undergraduate researchers, engaging in reflective SDA in EER transformed their personal and professional identities.

5. Conclusions and considerations moving forward

By removing data collection costs, the collaborative SDA projects described here allowed emerging researchers to build their understanding of qualitative methods, as well as the policies governing qualitative data collection across institutions, the ethics of working with human subject data, and the deep contextual factors and assumptions that shape data collection. At the same time, the original researchers further unpacked their assumptions and biases, seeing the data in new ways and simultaneously developing new practices for future data collection. And, for those exploring research with marginalised communities, carefully considered SDA can help reduce the research burdens often imposed on these groups.

However, both pilot projects are still situated within an ‘inner circle’ of EER within one country. In Project 1, an experienced EER scholar without access to graduate students used SDA to build capacity in undergraduate engineering students, while in Project 2, experienced EER scholars partnered to build capacity in an EER doctoral student. Moreover, the collaborations began in workshops that brought established EER scholars together. As such, these projects illustrate the potential to use SDA for capacity building while nonetheless raising questions about how individuals outside these networks – and outside nations where such networks exist – can forge such collaborations.

To begin addressing these questions, we are developing conference workshops to foster discussion among current and emerging EER scholars. Even these venues, though, pose barriers – they reach only those who know about them, can afford to attend, and choose to do so. We are also working with journal editors to identify expectations for the quality and contributions of SDA, ideally in conjunction with one or more special issues.

Still, building collaborations such as those described here is an investment of time, and collaborative SDA is not without time costs to the original researchers. In our work, beyond hosting the initial workshops to illuminate key issues and generate potential collaborations, we funded an intensive 1.5-day working session for the two pilot teams early in the project, with additional funded working sessions for manuscript development scheduled.

Despite the challenges, however, this work illustrates the potential of collaborative SDA for growing EER internationally. Current EER scholars who choose to share data with emerging scholars serve the community and the field, not only adding to their own research portfolios (which could likely be done with less effort), but also deepening their work, honouring the time invested by their participants, and growing the future of the field.

Acknowledgments

We would like to thank everyone who has contributed to this project, including Joachim Walther and Nicola Sochacka who helped develop the grant, all workshop participants, and Hope House, who provided logistical support. We also thank the participants who were involved in the two projects now being used for SDA.

This material is based on work supported by the National Science Foundation under Award No. 2039864. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

The work was supported by the National Science Foundation (U.S.) [2039864].

Notes on contributors

Marie C. Paretti

Marie C. Paretti is a Professor in the Department of Engineering Education at Virginia Tech. She holds a Ph.D. in English from the University of Wisconsin - Madison.

Jennifer M. Case

Jennifer M. Case is Department Head and Professor in the Department of Engineering Education at Virginia Tech. She holds a Ph.D. in Education from Monash University in Australia.

Lisa Benson

Lisa Benson (she/her) is a Professor of Engineering and Science Education at Clemson University. She holds a Ph.D. in bioengineering from Clemson University.

David A. Delaine

David A. Delaine is an Associate Professor within the Department of Engineering Education at The Ohio State University’s College of Engineering. He holds a Ph.D. in electrical engineering from Drexel University.

Shawn Jordan

Shawn Jordan is an Associate Professor in the Ira A. Fulton Schools of Engineering at Arizona State University. He holds a Ph.D. in Engineering Education from Purdue University.

Rachel L. Kajfez

Rachel L. Kajfez is an Associate Professor of Engineering Education at The Ohio State University. She holds a Ph.D. in Engineering Education from Virginia Tech.

Susan M. Lord

Susan M. Lord is Chair and Professor of Integrated Engineering at the University of San Diego. She holds a Ph.D. in Electrical Engineering from Stanford University.

Holly M. Matusovich

Holly M. Matusovich (she/her) is a Professor of Engineering Education at Virginia Tech. She holds a Ph.D. in engineering education from Purdue University.

E. Tyler Young

E. Tyler Young is a Ph.D. student and graduate research associate at The Ohio State University. He holds a B.S. in Aerospace & Mechanical Engineering from Case Western Reserve University.

Yevgeniya V. Zastavker

Yevgeniya V. Zastavker is a Professor of Physics and Education at Olin College of Engineering. She holds a Ph.D. in Biophysics from MIT.

References

  • Adams, R. S., D. Radcliffe, and M. Fosmire. 2016. “Designing for Global Data Sharing, Designing for Educational Transformation.” Advances in Engineering Education 5 (2): 1–24. https://advances.asee.org/publication/designing-for-global-data-sharing-designing-for-educational-transformation/.
  • Atman, C. J., S. D. Sheppard, J. Turns, R. S. Adams, L. N. Fleming, R. Stevens, R. A. Streveler, et al. 2010. Enabling Engineering Student Success: The Final Report for the Center for the Advancement of Engineering Education.
  • Benson, L. C., K. Becker, M. M. Cooper, O. H. Griffin, and K. A. Smith. 2010. “Engineering Education: Departments, Degrees and Directions.” International Journal of Engineering Education 26 (5): 1042–1048.
  • Borrego, M. 2007. “Conceptual Difficulties Experienced by Trained Engineers Learning Educational Research Methods.” Journal of Engineering Education 96 (2): 91–102. doi:10.1002/j.2168-9830.2007.tb00920.x.
  • Borrego, M., and J. Bernhard. 2011. “The Emergence of Engineering Education Research as an Internationally Connected Field of Inquiry.” Journal of Engineering Education 100 (1): 14–47. doi:10.1002/j.2168-9830.2011.tb00003.x.
  • Cao, Y., X. Ma, J. M. Case, B. K. Jesiek, D. B. Knight, W. C. Oakes, M. C. Paretti, X. Tang, Z. Xie, and H. Zhao. 2021. Visions of Engineers for the Future: A Comparison of American and Chinese Policy Discourses on Engineering Education Innovation. 2021 ASEE Virtual Annual Conference, Virtual.
  • Case, J. M. 2017. “The Historical Evolution of Engineering Degrees: Competing Stakeholders, Contestation Over Ideas, and Coherence Across National Borders.” European Journal of Engineering Education 42 (6): 974–986. doi:10.1080/03043797.2016.1238446.
  • Case, J., H. Matusovich, M. C. Paretti, N. Sochacka, and J. Walther. 2022. Changing the Paradigm: Developing a Framework for Secondary Analysis of EER Qualitative Datasets. ASEE Annual Conference & Exposition. https://peer.asee.org/41938
  • Chua, M. 2012. RAT: The Document. Mel Chua Blog. Accessed May 18 2023. http://melchua.com/blog/2012/10/30/rat-the-document/
  • Dart, S., S. Trad, and K. Blackmore. 2021. “Navigating the Path from Technical Engineering to Engineering Education Research: A Conceptual Model of the Transition Process.” European Journal of Engineering Education 46 (6): 1076–1091. doi:10.1080/03043797.2021.1992609.
  • Deloria, V., Jr. 1969. Custer Died for Your Sins: An Indian Manifesto. Norman, Oklahoma: University of Oklahoma Press.
  • Faber, C. J., R. L. Kajfez, D. M. Lee, L. C. Benson, M. S. Kennedy, and E. G. Creamer. 2021. “A Grounded Theory Model of the Dynamics of Undergraduate Engineering Students’ Researcher Identity and Epistemic Thinking.” Journal of Research in Science Teaching 59 (4): 529–560. doi:10.1002/tea.21736.
  • Gardner, A., and K. Willey. 2018. “Academic Identity Reconstruction: The Transition of Engineering Academics to Engineering Education Researchers.” Studies in Higher Education 43 (2): 234–250. doi:10.1080/03075079.2016.1162779.
  • Jawitz, J., J. Case, and D. Marshall. 2009. “Grappling with Methodologies in Educational Research: Science and Engineering Educators Finding Their Way.” In Researching Possibilities in Mathematics, Science and Technology Education, edited by K. Setati, R. Vithal, C. Malcom, and R. Dhunpath, 139–152. Hauppauge, NY: Nova Science Publishers.
  • Jesiek, B. K., M. Borrego, and K. Beddoes. 2010. “Advancing Global Capacity for Engineering Education Research (AGCEER): Relating Research to Practice, Policy, and Industry.” Journal of Engineering Education 99 (2): 107–119. doi:10.1002/j.2168-9830.2010.tb01048.x.
  • Johri, A., M. Vorvoreanu, and K. Madhavan. 2016. “Guest Editorial: Data Sharing in Engineering Education.” Advances in Engineering Education 5 (2). https://advances.asee.org/publication/guest-editorial-data-sharing-in-engineering-education/.
  • Johri, A., S. Yang, M. Vorvoreanu, and K. Madhavan. 2016. “Perceptions and Practices of Data Sharing in Engineering Education.” Advances in Engineering Education 5 (2): 1–25. https://advances.asee.org/publication/perceptions-and-practices-of-data-sharing-in-engineering-education/.
  • Jordan, S. S., C. H. Foster, I. K. Anderson, C. A. Betoney, and T. J. D. Pangan. 2019. “Learning from the Experiences of Navajo Engineers: Looking Toward the Development of a Culturally Responsive Engineering Curriculum.” Journal of Engineering Education 108 (3): 355–376. doi:10.1002/jee.20287.
  • Kajfez, R. L., D. Lee, K. Ehlert, C. Faber, L. Benson, and M. Kennedy. 2021. “A Mixed Method Approach to Understanding Researcher Identity.” Studies in Engineering Education 2 (1): 1–15. doi:10.21061/see.24.
  • Klassen, M., B. K. Jesiek, L. Zheng, and J. M. Case. 2023. “Institutionalizing Engineering Education Research: Comparing Australia, China, and the United States.” In Engineering, Social Sciences, and the Humanities: Have Their Conversations Come of Age?, edited by S. H. Christenson, A. Buch, E. Conlon, C. Didier, C. Mitcham, and M. Murphy, XXII, 427. Cham, Switzerland: Springer Cham. doi:10.1007/978-3-031-11601-8.
  • Kumar, S. S., Y. Gamieldien, J. M. Case, and M. Klassen. 2021. Institutionalizing Engineering Education Research: Comparing New Zealand and South Africa. Research in Engineering Education Symposium & Australasian Association for Engineering Education Conference, Perth. https://aaee.net.au/wp-content/uploads/2021/11/REES_AAEE_2021_paper_310.pdf
  • Mazzurco, A. 2016. “Methods to Facilitate Community Participation in Humanitarian Engineering Projects: Laying the Foundation for a Learning Platform.” PhD diss., Purdue University.
  • Radcliffe, D. F. 2006. “Shaping the Discipline of Engineering Education.” Journal of Engineering Education 95 (4): 263–264. doi:10.1002/j.2168-9830.2006.tb00901.x.
  • “The Research Agenda for the New Discipline of Engineering Education.” 2006. Journal of Engineering Education 95 (4): 259–261. doi:10.1002/j.2168-9830.2006.tb00900.x.
  • Skloot, R. 2010. The Immortal Life of Henrietta Lacks. New York, NY: Crown Publishers.
  • Streveler, R. A., and K. A. Smith. 2006. “Conducting Rigorous Research in Engineering Education.” Journal of Engineering Education 95 (2): 103–105. doi:10.1002/j.2168-9830.2006.tb00882.x.
  • Trevelyan, J. 2016. “Extending Engineering Practice Research with Shared Qualitative Data.” Advances in Engineering Education 5 (2): 1–31. https://advances.asee.org/publication/extending-engineering-practice-research-with-shared-qualitative-data/.
  • Tuhiwai Smith, L. 1999. Decolonizing Methodologies: Research and Indigenous Peoples. London, England: Zed Books.
  • Walther, J., N. W. Sochacka, and N. N. Kellam. 2013. “Quality in Interpretive Engineering Education Research: Reflections on an Example Study.” Journal of Engineering Education 102 (4): 626–659. doi:10.1002/jee.20029.
  • Walther, J., N. W. Sochacka, and A. L. Pawley. 2016. “Data Sharing in Interpretive Engineering Education Research: Challenges and Opportunities from a Research Quality Perspective.” Advances in Engineering Education 5 (2): 1–16. https://advances.asee.org/publication/data-sharing-in-interpretive-engineering-education-research-challenges-and-opportunities-from-a-research-quality-perspective/.