PROFESSIONAL EDUCATION & TRAINING

Professional development of STEM-H faculty: Implications of a cross-disciplinary workshop series for improving research on student learning

Article: 2043995 | Received 06 Jul 2021, Accepted 30 Jan 2022, Published online: 26 Feb 2022

Abstract

A growing number of science, technology, engineering, mathematics, and health sciences (STEM-H) faculty are interested in, and find value in, learning how to conduct research in the education setting of their discipline. Recent funding initiatives aim to expand education research efforts and capacity among disciplinary faculty in order to properly evaluate and ultimately improve STEM-H education. This study explores the impact of an innovative professional learning workshop series for faculty in university STEM-H disciplines who are interested in learning how to study their teaching and its impacts more rigorously. STEM-H faculty came to the workshop series with a variety of motivations, including to better understand and evaluate their teaching practices, to acquire better methods of studying and sharing what they learn, and to improve their education research skills overall. Our interdisciplinary research team collected data through open-ended pre- and post-series surveys and individual interviews conducted six months after the series. We engaged in consensual qualitative analysis to extract themes. Findings indicated that STEM-H faculty left the workshop series with (1) deepened perspectives on the complexities of education research, (2) stronger insights and awareness about key components of education research, and (3) actionable next steps identified for pursuing their individual education research goals. We explore potential follow-up steps as we build and institutionalize education research support structures to sustain this interest.

PUBLIC INTEREST STATEMENT

Science, technology, engineering, mathematics, and health sciences (STEM-H) faculty are interested in, and find value in, learning how to conduct research in the education setting of their discipline. Various funding initiatives are expanding education research efforts and capacity among disciplinary faculty. This study explores the impact of an innovative professional learning workshop series for faculty in university STEM-H disciplines who are interested in learning how to study their teaching and its impacts more rigorously. Findings indicated that STEM-H faculty left the workshop series with (1) deepened perspectives on the complexities of education research, (2) stronger insights and awareness about key components of education research, and (3) actionable next steps identified for pursuing their individual education research goals. We explore potential follow-up steps as we build and institutionalize education research support structures to sustain this interest.

1. Introduction

Across the disciplines encompassed by STEM and Health Sciences (STEM-H), there has been growing interest among disciplinary faculty in learning more about conducting education research. The US National Science Foundation ([NSF] 2019) has called for proposals for “Building Capacity for Science, Technology, Engineering, Mathematics (STEM) Education Research” via its program solicitation 20–521. The program supports projects that build STEM-H faculty capacity to carry out high-quality STEM education research. This is motivated in part by the recognition that improved STEM education will benefit from qualitative and quantitative research (National Science Board, 2018), and by the need to evaluate the effectiveness of various initiatives being explored (National Academies of Sciences, Engineering and Medicine, 2018).

In response to specific STEM-H faculty interest expressed on our campus in how best to conduct STEM-H education research, we developed a brief, focused introductory workshop series. A detailed description of the workshop series rationale and design decisions is found in Zhong et al. (2020). Recognizing that education research is a complex endeavor, in this paper we investigate whether this particular set of workshop experiences was efficacious in meeting participants’ needs in spite of the relatively brief time commitment.

2. Literature review & framework

We first present an overview of disciplinary education research perspectives. This research landscape includes perspectives on Education Research (ER), Discipline-Based Education Research (DBER), and the Scholarship of Teaching and Learning (SoTL) communities. We then summarize bodies of literature that have identified aspects of education research with which STEM-H disciplinary faculty are likely to be unfamiliar, so that we can explore whether the workshop series produced meaningful growth in participants’ understanding of these aspects. Finally, we outline the evidence gap in the literature about STEM-H faculty development of education research skills.

3. The scholarly research landscape

The first aim of our workshop series was to help workshop participants develop an understanding of the diversity of education research contexts and goals. We characterize the disciplinary education research landscape as communities spanning ER, DBER, and SoTL, with the understanding that there is much debate over what constitutes membership in one community or another. Here, we present a summary of this synthesis; interested readers can find a fuller description of the characterization of that landscape in Zhong et al. (2020).

We introduced workshop participants to a definition of Education Research (ER) articulated by the American Educational Research Association (2019):

Scholarship in the field seeks to describe, understand, and explain how learning takes place and how formal and informal contexts of education affect all forms of learning. Education research embraces the full spectrum of rigorous methods appropriate to the questions being asked and also drives the development of new tools and methods.

Some scholars frame DBER as a subset of ER (Bodner, 2016), with the specific intent to address discipline-specific problems and challenges. In contrast to ER more broadly, DBER is distinguished by combining expert knowledge of a science or engineering discipline, and of the challenges of learning and teaching in that discipline, with the general science of learning and teaching.

Finally, we introduced the Scholarship of Teaching and Learning (SoTL) as an approach that developed in parallel with DBER (Boyer, 1990). According to McKinney (2007), SoTL

involves the systematic study of teaching and/or learning and the public sharing and review of such work. … SoTL focuses on teaching and learning at the college level, and is primarily classroom based. Ideally, SoTL also involves application and use. (p. 10)

One of the most important takeaways we sought to convey to participants was the extensive variation in characterizations of the relationships between the ER, DBER, and SoTL communities, and the considerable overlap between their contexts and research aims. Characterizations in the literature range from zero to complete overlap, depending on the researcher’s perspective (Clegg, 2012; Dolan et al., 2018; Kanuka, 2011; Larsson et al., 2017; Shipley et al., 2018). We positioned our workshop series as helpful for strengthening the quality of research anywhere in this landscape.

4. New concepts for STEM-H disciplinary faculty

We designed the workshop series to intentionally impact participants’ understandings of concepts central to education research but likely unfamiliar to many STEM-H disciplinary faculty. Borrego and Streveler (2014) discussed the importance of teaching STEM-H faculty that crafting education research questions is a process: it involves familiarizing oneself with the literature to properly identify variables of interest and to develop a conceptual framework that provides a lens for analysis. Borrego and Streveler (2014) stated that natural science scholars, in particular, may struggle with the idea of a conceptual framework given that much of their disciplinary research is founded on universal scientific truths. STEM-H disciplinary scholars may also find education research measurement challenging, beginning with the very idea that a variable of interest may not be directly measurable but instead may be a latent (hidden) construct. Social science survey development and psychometric testing likely involve unfamiliar processes (Borrego & Streveler, 2014).

5. Prior ER faculty development efforts: the gap

A final component that Borrego and Streveler (2014) deemed important when introducing STEM-H disciplinary faculty to education research was interdisciplinary collaboration. Interdisciplinary collaborators can support the education research initiatives of STEM-H disciplinary faculty (McBride & Kanekar, 2015; Webb, 2019). However, collaborative research efforts are not common (Talanquer, 2014). Social scientists are often not involved in DBER (Lo et al., 2019), and many note that for DBER fields to grow, there is a great need for collaboration between DBER, SoTL, and ER researchers (e.g., Arthurs, 2019; Lo et al., 2019; Shipley et al., 2018).

In recognition of the need for such collaborations, in 2019 NSF issued a proposal request for “Building Capacity for Science, Technology, Engineering, Mathematics (STEM) Education Research” (updated to program solicitation 20–521) to support projects that build STEM-H faculty’s capacity to carry out high-quality STEM education research. Recent awards have targeted STEM graduate students or early career professionals seeking to become STEM education researchers (Istaplet & Hancock, 2019; Roman-Dixon, 2019; Tipton, 2019). Outside of NSF-funded projects, we found only one study specifically targeting STEM faculty development to conduct education research. Nadelson (2016) found that creating a faculty community of practice to increase STEM faculty expertise in education research was an effective way to enhance understanding of education research.

Recognizing the need for similar endeavors, we intended our workshop series to respond to growing interest, both on our campus and nationally, in supporting STEM-H faculty and disciplinary researchers’ development of education research skills. In addition to serving as a delivery vehicle to address faculty interest, the workshop series served a dual purpose as a research methodology (Ørngreen & Levinsen, 2017). First, our workshop series provided a shared place for faculty, both in in-person meetings and online, that encouraged the practice of community (Wenger et al., 2002) across time and distance. Second, the workshop process enabled us to investigate the phenomena of faculty experiences in constructing or deconstructing knowledge and/or understanding. Ørngreen and Levinsen (2017) identified three distinct perspectives on workshop experiences: 1) as a means of achieving a goal, 2) as a practice of investigation into the format, and 3) as a research methodology that produces reliable and valid data about the experience. Our workshop series was designed as a means of revealing faculty’s new competencies, practices, knowledge, or ideas while producing reliable and valid data that detailed the learning experiences of faculty.

This workshop series was designed to be a short, focused overview of the spectrum of education research skill sets, intended to guide participating faculty regarding possible next steps should they choose to continue this strand of their work. In this study, we explored the efficacy of this workshop series in strengthening disciplinary STEM-H faculty’s education research skill sets through a focus on three research questions.

6. Research questions

How does a short, focused workshop series influence STEM-H faculty’s …

  1. … perspectives of the complexities of education research contexts?

  2. … insight and/or awareness about key components of high-quality education research?

  3. … ability to identify actionable next steps for themselves?

7. Methods

Workshop Series Design: Topics, Experiences, and Sequencing

To best fit within the busy schedules of as many interested STEM-H faculty as possible, the workshop design was a focused, brief introduction to the challenging and complex endeavor of conducting education research. A four-part series, consisting of 75-minute sessions once a week for four consecutive weeks, was chosen as the best compromise between a limited time commitment and the opportunity for a reasonably thorough introduction (see Zhong et al., 2020, for a more complete description of the workshop series development). Key workshop series topics are summarized in Table 1.

Table 1. Overview of workshop series on education research in STEM-H (see Zhong et al., 2020)

8. Workshop series facilitators and research team

Workshop series facilitators included personnel from several different parts of the university. This effort was led by collaborators from the university’s college of education, school of engineering, and professional learning center. All of these project facilitators had previously worked with a variety of university STEM-H disciplinary faculty on related efforts, including with some workshop participants. The workshop facilitators included:

  • A professor of science education and director of an education research center in mathematics and science teaching and learning

  • A professor of engineering and director of an engineering education center focused on scholarship of teaching and learning engineering

  • Two personnel from the university’s professional learning center

  • Two advanced doctoral students in education with expertise in education research methodologies

The research team for this study of the workshop series’ efficacy comprises the workshop facilitators minus the personnel from the university’s professional learning center: two faculty and two education doctoral students in the candidacy phase of their programs. All four have worked together for multiple years on similar studies that required consensual qualitative analysis techniques, which were also used in this study (see analytic approach). These prior experiences with one another have generated a sense of collegial trust and rapport that strengthens the authenticity of the consensual qualitative analysis outcomes. Alongside this authenticity, the research team’s breadth of expertise and experience supports varying analytic perspectives, bolstering credibility, dependability, and overall trustworthiness of outcomes (Guba & Lincoln, 1994). As part of the consensual qualitative research process, personal biases and expectations were discussed, recorded, and revisited throughout analysis.

In addition to the positive aspects of the research team’s prior relationships with one another and with some workshop participants, we also recognize potential biases. As recommended by Hill et al. (2005), we acknowledge two researcher biases that could potentially impact the study:

  1. Existence of long-term professional working relationships between researchers and some workshop series participants.

  2. One faculty researcher was the chair of the department for three workshop series participants.

9. Workshop series participants

STEM-H faculty at the University of Louisville were invited to participate in this optional workshop series. The invitation went to STEM-H faculty known by the workshop facilitators to be interested in education research, as well as to leaders within STEM-H departments at the university. Faculty participants were at various stages of their careers (see Table 2). Because of their level of independent professionalism and responsibility, in this paper we refer to all of them as faculty participants even though three were not technically in faculty positions.

Table 2. Workshop series participants

10. Data collection

Workshop participants completed a short, open-ended pre-survey questionnaire prior to the first workshop day and a post-workshop series questionnaire (see Appendix A). Individual interviews were conducted approximately six months following the workshop series. Eleven of the 13 original faculty participants agreed to the individual interview; the other two faculty indicated that they were unable to participate due to an unexpected increase in workload caused by the COVID-19 pandemic. However, these two faculty did provide written retrospective feedback and described the work they had continued since the workshop series concluded. The semi-structured interviews were conducted virtually by the doctoral student researchers using a protocol developed by the research team (see Appendix A). Interviews were digitally video-recorded, field notes were taken, and the recordings were then transcribed. The three timepoints of data collection, as well as the instruments with a brief description of each, are outlined in Table 3.

Table 3. Data collection timeline and instruments (see Appendix A)

11. Analytic approach

Guided by our research questions, an inductive analysis approach of consensual qualitative coding (Hill, 2012) was conducted by all four researchers. This approach allowed multiple perspectives to inform the extraction of participants’ workshop series experiences. Consensual qualitative analysis is more likely to reduce individual researcher bias since it involves an iterative coding process in which common themes are informed by multiple people with varying perspectives. We chose emergent coding as our core data analytic strategy, rather than a priori coding, since the literature does not support a priori specification of coding themes for our research questions. Because the body of data collected for this study included both written and interview responses at three timepoints, in the first analysis phase we maintained each participant as a single case across timepoints. Each unique idea expressed by participants was characterized as a data element for analytic purposes; thus, some sentences included multiple embedded ideas whereas other sentences may have overlapped with prior ideas already expressed. In the second analytic phase we compared across the 13 cases to identify common themes and trends.

The data analysis involved a three-step iterative process: (a) sorting and characterizing data elements into broad domains (phase 1—within each case); (b) categorizing the data within domains into specific core ideas relevant to our research questions (phase 1—within each case); and (c) cross-analyzing cases to extract common themes across participants and timepoints (phase 2; Hill et al., 2005).

We analyzed the pre-survey and post-survey data immediately following the workshop series conclusion. The two doctoral students independently performed the initial open coding, then met as a team of two to discuss and establish an initial list of emergent domains. The team of four then met to characterize the essence of each domain, after which the two university faculty independently analyzed the raw data to align it within an established domain or create a new one. The full research team then held a consensus meeting to finalize the domains. The same process was then used to generate and finalize specific core ideas within each domain, applying this process separately to each of the three timepoints.
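To make the two-phase structure of this analysis concrete, the sketch below shows one minimal way coded data elements could be organized and cross-tabulated in Python. It is purely illustrative and is not the tooling we used; the participant identifiers, domain names, and field names are hypothetical placeholders.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class DataElement:
    """One unique idea expressed by a participant (hypothetical structure)."""
    participant: str   # e.g., "P01"
    timepoint: str     # "pre-survey", "post-survey", or "6-month interview"
    domain: str        # broad domain assigned during phase 1
    core_idea: str     # specific core idea within that domain


# Hypothetical coded elements; real elements would come from consensus coding sessions.
elements = [
    DataElement("P01", "post-survey", "deepened perspectives", "complexity of ER"),
    DataElement("P02", "6-month interview", "key components", "conceptual frameworks"),
    DataElement("P01", "6-month interview", "next steps", "grant proposal"),
]

# Phase 1: keep each participant as a single case across timepoints.
cases = defaultdict(list)
for el in elements:
    cases[el.participant].append(el)

# Phase 2: cross-case analysis -- count how many distinct participants
# contributed to each (domain, core idea) pairing.
theme_support = defaultdict(set)
for el in elements:
    theme_support[(el.domain, el.core_idea)].add(el.participant)

for (domain, idea), participants in sorted(theme_support.items()):
    print(f"{domain} / {idea}: {len(participants)} participant(s)")
```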

12. Findings

Findings of this study are organized by the three research questions that guided thematic analyses of the data. Findings across the three timepoints of data collection are interwoven to robustly present the impact of the overall workshop experience on participants. Emergent domains (section headers within each question) are operationalized by reporting specific core ideas extracted from participant responses.

13. Research Question #1: deepened perspectives

13.1. Overall strengthened perspectives

Faculty participants came to the workshop series with a well-grounded sense of many key features and complexities of Education Research (ER), with 11 of the 13 having a variety of education research projects already in the works. Despite varied prior experiences, all participants expressed having deepened their overall perspectives. A common expression of these strengthened perspectives was increased confidence: “I think it’s [the workshop] certainly helped me put more of what I’d like to try to do into a context, where I can now at least verbally describe what I think I should be doing … ” One participant who was new to ER said, “ … [the workshop] provided an overview of where to start … ”

One participant emphasized how essential it is to focus on the intricacies in ER, “There’s a reason that this [education research skill development] is a four-year study [degree] and this is not something you do in an afternoon despite you having a physics degree.” A post-survey response from another participant summed up their overall enhanced perspective of the intricacies by saying, “ … [my biggest learning from workshop was] how complicated education research is … ” Comments such as these suggest that the workshop series was effective in conveying the message that ER complexities are nontrivial.

Applied new perspectives to own circumstances. An active application of deepened perspectives was evident in the frequent instances of participants connecting workshop experiences to their own specific circumstances. “[The workshop] made the waters less muddy … it certainly helped me try to organize them [my research questions] on my own.” Others expressed the value they found in the resources and ideas shared during the workshop series. “ … these resources [from the workshop] were hugely helpful [for her work as a medical education researcher with med school faculty], almost like a train-the-trainer model … ” Others indicated an intent to incorporate select resources by sharing them with their own students when appropriate. Most participants echoed that the workshop series provided a deeper appreciation of the many intricacies involved in conducting quality ER.

Faculty wanted more. One strand of evidence that the participants found the workshop series helpful in deepening their own perspectives of ER was a consistent request for additional follow-up experiences. Almost all participants made at least one explicit comment about wanting some sort of continuation of the workshop series: “I think my big thing is it kind of left me wanting more … ”; “we should have another meeting of [the workshop] cohort.” Participants also shared many positive comments about the value of the workshop series: “very pleased with what the workshop went over” and “Overall, really well put together [workshop] …,” explaining that the workshop series allowed him to strengthen the quality of his own ER efforts.

13.2. Specific perspectives most impacted

In addition to general comments about the overall value of the workshop series for deepening their individual perspectives, several specific aspects of growing awareness of ER intricacies were commonly highlighted.

Validity. One particularly valuable feature of enhanced ER understanding was attention to validity, especially the validity of interpretations of results. The essence of ER is foundationally grounded in the social sciences; for many STEM faculty and discipline-specific researchers, the noise present in measuring human traits can be uncomfortable and disconcerting, and there can be a natural tendency to ascribe stronger meaning to numerical results than may be warranted. In the follow-up interview, a participant spoke to this point, “ … how can we do this [ER] that has the most validity for what we’re trying to do? And I think that [the workshop] really hammered it home in a way that I don’t think is talked about enough.” Another participant described a particular ER project he was involved with and commented that the outcome measure of final exam grade was not significantly impacted by the study’s intervention. He then reflected on validity considerations by noting that this outcome measure may not be appropriately sensitive to their intervention, and said, “Now [following the workshop], I’d like to go back and look at some other factors … ”

Value of interdisciplinary teams. The workshop series provided access to ER experts, which several participants cited as a motivation for their participation: “You know, sometimes it’s nice when you have someone who’s in an authority position, someone who’s an expert to tell you.” Several participants spoke of their desire to form interdisciplinary research teams; “[I] desire to be a part of a team with multiplicity of skills so that the outcomes would be high-quality and contribute meaningfully to the field as a whole … ” Most emphasized the value of interdisciplinary teams by highlighting differences between ER and their own field’s research approaches; “ … the world of education research is very different from my discipline specific research.” Responses such as these underscored a strengthened perspective about the potential value of creating interdisciplinary teams to collectively engage in high-quality ER.

14. Research Question #2: key components

Many of the participants had previous experience engaging in ER and were familiar with most or all key components necessary to conduct such research. Additional or enhanced insight about four specific key ER components clearly emerged in the data: (1) research question formulation, (2) importance and use of conceptual frameworks, (3) data analysis decision-making, and (4) coherence across all components. Participants frequently used their own ER work to describe how these workshop topics strengthened these insights.

14.1. Research questions

One participant reflected on her progress in refining her research questions since the workshop series, “ … it was a research question I was already working on, so I have to move forward a little bit with that project … ” For most, prior to the workshop series, the process of formulating education research questions had been given relatively little thought, since many were accustomed to viewing that research component as straightforward. One participant expressed recognition of this as a personal area of weakness as well as the helpfulness of the workshop series experience, “ … that’s where I was weakest at, [formulating] research questions for education. This [workshop] really helped … ”

Workshop facilitators illustrated that in education research, the constructs embedded in a typical education research question often require additional careful consideration and definition. Important human-related behaviors, attitudes, or abilities are often messy and complicated in comparison to questions about the physical world, where most of our participants had concentrated their STEM disciplinary research. For example, if one were to ask, “do my students do better if ….” then the core construct of “do better” needs to be clearly defined. That question could mean one or more of many different things, such as higher course grades, higher final exam scores, greater interest in the topic by semester end, stronger interactive participation, or stronger ability to transfer problem-solving skills to unique problems.

During the workshop series we shared a process for identifying the key constructs embedded in a research question. One consideration was the need for measurability of constructs. Measurability requires clear and specific articulation, something often unnecessary in STEM-H research where outcome variables are well-defined by the physical world (e.g., lower blood pressure, water pipe ability to withstand pressure surges). One participant explained their new thinking, “I like the perspective on developing research questions that are, you know, measurable … I don’t remember all the pieces [ER components], but measurable is a big one that I think we forget … it [a research question] has to be testable; it can’t just be there … starting with an idea and taking the time to own that idea [articulate specificity] and seeking feedback from others.”

This heightened appreciation of well-crafted research questions carried throughout many participants’ follow-up interview responses. “I mostly benefited from [the workshop], while trying to put some structure around the different kinds of questions you might want to think about … ” and “ … we were able to sit down and develop independent research questions … the [workshop] structure and clarity allowed me to think back and really pinpoint the context necessary for forming a good, specific research question” were common reflections.

14.2. Conceptual frameworks

Many participants were accustomed to well-defined existing frameworks for their STEM research, in contrast to what might be needed in ER. One participant described how a well-defined conceptual framework guides the education research and provides an interpretive lens: “ … now, I keep circling back to things because I think there are some interesting, unanswered questions with those projects [prior studies in his discipline] … in terms of moving forward, those were going to be spaces where I now could imagine varying angles [varying conceptual frameworks] … ”

One participant expressed how identifying and clarifying a clear conceptual framework would add needed structure to ongoing collaborative ER work he was doing with other STEM colleagues, “ … there was just so much information and it [ongoing research project] wasn’t very targeted … I think that using what we learned from the workshop has really helped us sort it out and has helped us look at our framework a little bit better. We’ve really homed in on what we need and where that’s going to take us next … really targeting what we’re looking at.” Another reflected on how the workshop series helped him realize that any analysis of data absent a conceptual framework could be deficient in interpretable meaning, “We could possibly even hang a story around it [analysis of existing data], but I will have no context, no literature … ”

The centrality of conceptual frameworks across the research process was a key concept that participants took away from workshop experiences. One workshop series pedagogical approach was to consistently return to a few specific examples through the entire process (developing research questions, conceptual frameworks, methodological design considerations, analyses, interpretations) to illustrate coherence, especially emphasizing the centrality of the conceptual framework. Participants described the use of these examples as “ … very helpful in understanding how these [conceptual frameworks] fit into the formation of research questions and study design.” Some participants noted that, “ … conceptual framework, it’s not something I think about a lot … [the workshop] emphasis on flexibility and the reason to define terms was very helpful and enlightening … ” Participants also recognized this as a potentially challenging aspect for them to master, “This [conceptual frameworks] is an important topic, but its hard to grasp and it doesn’t feel like I came away with a strong understanding … ”

14.3. Data analyses

The workshop series resource that faculty participants referenced most was one supporting data analyses. On day 3, we presented a flow chart depiction of data analysis decision-making (adapted from Howell, 2008; see Appendix B). Participants were guided in the workshop series to develop their own ER research questions, consider what sorts of data might be useful to collect, and then explore how this data analysis decision tree could help them navigate the analysis stage.

This resource was particularly helpful for considering non-experimental study designs, including the types of data one might collect, which in turn leads to considering how to analyze those data. Practical challenges of structuring a true experimental design in education were noted: “as a scientist, I got trained in thinking about how things evolve over time, and that [control of data collection] becomes really difficult, because once [students] leave my classroom, my ability to control what’s going on [relationships between variables] drops to almost nothing.” Many participants articulated a better understanding of how the nature of the research questions informs the nature of the data. The helpfulness of this data analysis decision tree for considering research design elements was highlighted by one participant, “In my personal opinion, and based on prior experience coming into the workshop, the decision tree for analysis was the most useful component of the entire workshop thus far. I plan on utilizing this for all my future research.”
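As a rough illustration of the kind of decision-tree logic such a flow chart encodes (Appendix B shows the actual, non-comprehensive chart used in the workshop), the sketch below walks a few common branches in Python. It is a simplified approximation for this paper only, not a reproduction of the workshop resource or of Howell (2008); the prompts and test suggestions are assumptions and omit many branches (nonparametric alternatives, assumption checks, multivariate designs, and so on).

```python
def suggest_analysis(outcome_type: str, n_groups: int = 0, paired: bool = False,
                     predictor_type: str = "categorical") -> str:
    """Return a rough analysis suggestion for a simplified set of cases.

    outcome_type: "categorical" or "continuous" outcome variable
    n_groups: number of groups being compared (when the predictor is categorical)
    paired: whether the same participants are measured repeatedly
    predictor_type: "categorical" or "continuous" predictor/explanatory variable
    """
    if outcome_type == "categorical":
        # Counts or proportions across categories
        return "chi-square test of association (or Fisher's exact test for small samples)"

    # Continuous outcome from here on
    if predictor_type == "continuous":
        return "correlation / simple linear regression"
    if n_groups == 2:
        return "paired t-test" if paired else "independent-samples t-test"
    if n_groups > 2:
        return "repeated-measures ANOVA" if paired else "one-way ANOVA"
    return "descriptive statistics; revisit the research question and data structure"


# Example: did students in two course sections differ on a continuous exam score?
print(suggest_analysis("continuous", n_groups=2, paired=False))
# -> independent-samples t-test
```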

14.4. Coherence across all components

This final emergent domain for research question 2 about key research components is not actually about an individual component, but rather about the need for, and value of, intentional coherence across all components of ER. This was not an unfamiliar concept to our participants, but when discussing the intricacies of the interdependence of study components, many of our participants identified themselves or colleagues in STEM as having overlooked to some degree the essential nature of this notion. One participant said of this oversight, “ … but you can totally do some research … like, I could probably fit an S-curve to whatever numbers I have and possibly even hang a story around it … but I will have no context, no literature, so pairing up with other people’s education research [referring to situating the study in literature] and determining what needs to be answered [research questions] … that would turn into something useful … ”.

One participant described a strengthened understanding of the value of education research coherence by contrasting that with the unhelpfulness of a decontextualized suite of resources and techniques, “I was hoping that it [the workshop] would just be a collection of very useful, practical, application information … absolutely, what I took away from that [workshop] was a lot of best practices that were summarized for me … it’s nice that somebody thinking about education research put together this workshop, because the context [education research design] is loaded, and just putting together a lot of the tools doesn’t always translate [or lead to coherence] … ”

15. Research Question #3: next steps

Our final research question was to explore if the workshop series was helpful in supporting next steps the faculty participants wanted to pursue related to ER. We extracted four domains from the data: (1) dissemination efforts, (2) crafting grant proposals, (3) barriers to next steps, and (4) requests for follow-up.

15.1. Dissemination

Unsurprisingly, most participants reported a desire to disseminate their work by publishing in a journal or conference proceedings or by sharing with their department to improve the teaching and learning of their students. Most had the impression that the possibilities for ER in their field were promising and fruitful. One participant indicated, “I’m very excited and I think that the field [DBER] is wide open for publication … we [STEM department] don’t really have anything that’s substantial in a journal for scholarship of teaching and learning … this [workshop] helps us improve.” Most participants shared specific dissemination efforts they had engaged in within the 6 months following the workshop series.

15.2. Grant proposals

Some faculty participants reported the completion of, or active engagement in, ER grant writing, due in part to their workshop series experience. Most faculty who reported that their next steps included a grant proposal described how the workshop series provided clarification for their ER approach. One participant discussed how the workshop series connected ER work that had begun prior to the workshop series to a now more plausible grant proposal direction, “ … we are still looking to move forward with some analysis on the previous data … hoping to get to the point where we’re able to use some of that [analysis] to write a grant proposal … ” An overview of workshop participants’ reported next-step outcomes is provided in Table 4.

Table 4. Overview of faculty participants’ next-step outcomes

15.3. Barriers to faculty’s intended next steps

COVID-19 pandemic. The onset of the COVID-19 pandemic, occurring just a couple of weeks following the completion of the workshop’s final day, brought a wave of unprecedented shutdowns, changes, and adjustments to the university. In the follow-up interviews almost all reported a necessary pause to their initial ER study plans because of the pandemic, but most indicated that they still planned to pursue the project in the future. A common description of the pandemic-driven barrier to intended next steps was, “So, I haven’t done any work on the research yet. Again, it’s the sort of combined research that I was focused on with [my interdisciplinary research team], and so we haven’t really met. The summer was just crazy (referring to the onslaught of the COVID-19 pandemic) and difficult.” Despite the COVID-19 pandemic interrupting implementation of intended education research, most participants expressed that they found the workshop series ideas to continue to be helpful in future planning.

Limited time. A common response reflected faculty’s understanding of the substantial time commitment that ER requires, and that the workshop series helped some conceptualize the magnitude of the work. One participant explained, “being a relative novice in doing education research and trying to figure out how to do it [ER] … I know how I’m approaching it [ER] but I also know that that’s not the only way, maybe not even the correct way. I worry about that and spending so much time drawing those connections … the examples [during the workshop] were helpful.”

One participant spoke about bridging the gap between having classroom data and making the time commitment to proper ER study design, “Now we [department research team] have mountains of data, we need to put it [data collected] together, form research questions and then write the publications. It just needs to be done but time is difficult … the sifting through and kind of staying on the structure [ER structure] … I feel like that’s the hardest part.” Similar reflections of the necessary time commitment were expressed by multiple participants, sharing that the workshop series underscored the complexity of high-quality ER work and concomitant time required.

15.4. Requesting additional workshop series follow-up

Faculty expressed appreciation for what the time-limited workshop series was able to do, but as a next step many expressed a wish for additional follow-up incorporating more specific guidance and input on their own studies. In part this is likely because these participants acquired insight into some of the complexities to address, saying, “ … I was very pleased with what the workshop went over. But all that [ER study development] is still very much like, I still don’t have a very good sense for how to attack those [ER] questions.” Another said, “I would like to see [how others are] applying some of the workshop concepts to their own studies … you guys [ER workshop facilitators] did that for me but … more is always better.” Another response, indicating a next step of more time focused on individual circumstances, noted, “I know that everybody [ER facilitators] was already donating time, but I think it would have been helpful to have one more session … our group was always full of questions and really good discussion … we were able to talk to several of you guys [ER facilitators/experts] about how to approach a particular question or research question.” Other faculty responded similarly in their follow-up interviews, indicating a wish for future follow-up that could be targeted to their specific circumstances.

16. Discussion

16.1. Positive impact on faculty

While many participants came into the workshop series with some prior education research experience and even current projects, they indicated that the workshop series activities led to a greater appreciation of the complexities and nuances of education research. This empirically supports Borrego and Streveler’s (2014) suggestion that conceptual hurdles exist for STEM faculty conducting education research. Through our workshop series, it emerged not only that these conceptual differences were indeed new to some participants, but also that the brief introduction provided by the workshop series constituted a meaningful step in advancing their understanding.

In addition to participants learning about the complexities of education research, some participants also spoke of learning when to step back. One participant frankly stated that simply because he could conduct a statistical analysis did not mean that it should be done or would constitute quality education research. As a complement to the idea of recognizing one’s own limitations in education research expertise, participants also voiced a renewed appreciation for interdisciplinary collaboration. It is important to note that this cohort of participants was by no means collaboration-averse at the beginning of the workshop series; several participants had previously collaborated with the workshop facilitators and/or each other.

Coming out of the workshop series, participants spoke positively of the multiple opportunities for collaboration presented by the workshop series, including learning about others’ research contexts and methods, building a cohort of peers from whom they could receive feedback, and even starting research projects together. Moreover, the workshop series itself is a product of an effective interdisciplinary collaboration among the workshop facilitators, who are faculty in engineering and education and personnel from the University’s faculty development center. To elaborate, the idea for the workshop was sparked by dialogue between the faculty development center and STEM-H disciplinary faculty who wanted to initiate or refine their SoTL/education research endeavors and desired guidance. The faculty development center tapped the expertise of faculty and students from the education college, as well as the director of the engineering school’s center for teaching and learning, to craft the workshop content. STEM-H disciplinary faculty with existing relationships to the facilitators identified more participants by spreading the word to interested colleagues. As Henderson et al. (2017) state, the ideal function of a cross-disciplinary DBER-STEM alliance is to facilitate the transfer of ideas and methods across STEM fields, contribute to the growth of an interdisciplinary research community with shared projects, norms, and practices, and reinforce the value of DBER-STEM work to its members and to university stakeholders. Our workshop series highlights the fact that this work is iterative and long-term, and that similar activities can be both the fruit of collaborative outreach between campus organizations and a foundation for future STEM-H education research.

In addition to expressing an overall appreciation, participants highlighted specific content from the workshop series that they felt lent clarity and organization to their own work. While the workshop series did not drastically alter participants’ general research goals or interest areas, many participants spoke of using their newly acquired knowledge to focus their interest area and identify key literature, refine their research questions, better structure their procedures, or provide additional structure for data collection and analysis.

Participants with prior ER knowledge still expressed appreciation for the review of key concepts, the clear and succinct presentation of common analysis approaches, or the introduction to material that was less familiar to them. Participants’ positive responses to the content of the workshop series suggest that even though the workshop series was intentionally quite brief to fit within participants’ busy schedules, the selection of topics, presentation approaches, and overall sequencing of ideas were effective for STEM-H faculty regardless of their existing levels of experience.

17. Project contexts bolstering claims of positive impact

Two aspects of our study design are worth discussing in support of our claim of positive workshop series impacts. First, as described in the methods section, we (the facilitators) have preexisting relationships with many of the workshop participants, having served as collaborators and consultants on numerous occasions over multiple years. Our study results and interpretations have been enriched by our contextual knowledge of the participants and their ER agendas. Existing collaborative relationships also enhanced the validity of our data, because participants responded quite openly and honestly, both about strengths and in voicing unmet needs.

Second, the quality of our data was strengthened because it was drawn from three different points in time as well as collected in different formats (written and interviews). The longitudinal nature of this qualitative data offered insight into participants’ immediate reactions as well as reflections enriched by a half-year perspective. This time spacing of the data collection provided a robust opportunity to explore our research questions about both immediate and longer-term impacts from our relatively brief workshop series. Despite a 6-month time lag, participants were still able to discuss numerous specific aspects of the half-year-distant workshop experience and its impact on them.

18. Lessons learned for future iterations

A critical examination of the workshop design and evidence of faculty experiences led the research team to identify some features that may be improved in future iterations. We learned that the pre-readings, while reported as helpful by some participants, could have been more strongly aligned with the initial meeting’s purpose and thus more impactful, with a more detailed and systematic guide for processing those ideas prior to the first face-to-face meeting. Similarly, a more targeted, streamlined approach for presenting key concepts may have yielded stronger faculty learning gains with more efficient use of the very limited time available. A goal of the facilitators was to not trivialize the nuances of education research design, and thus some of the workshop time was used to promote faculty discussion and creative brainstorming. On reflection, based on faculty feedback, we realize this instructional time may have been too broadly structured and thus may not have been the most productive use of that time. This may have been primarily because the vast majority of faculty participants were already somewhat familiar with education research, and we did not need to emphasize aspects of complexity and challenge as much as we initially planned. We also recognize that the last-minute (pre-COVID pandemic) decision to video-record the sessions and make them available asynchronously for interested faculty unable to attend in person resulted in less-than-optimal quality, so that those videos were at best only marginally helpful for interested parties. In future iterations we believe the workshop series could be made more impactful by: (a) focusing the pre-readings with additional structure; (b) omitting brainstorming and in-person idea-creation time in favor of more time spent on core workshop learning goals; and (c) more thoroughly planning the video-recording for asynchronous access to ensure better-quality resources for those not able to physically attend each session.

19. Conclusions

The findings presented in this paper provide support for the efficacy of the brief workshop series in deepening and broadening STEM-H faculty’s understanding of the nuances of education research, including the ability to apply workshop series topics to their own research agendas. Faculty reported leaving the workshop series with tangible next steps in pursuing their education research goals and, despite the challenges and interruptions of an ongoing global pandemic, expressed continued interest in accomplishing these goals even months later.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The authors received no direct funding for this research.

Notes on contributors

Patricia A. S. Ralston

The research team is composed of senior professors and advanced doctoral students. The team for this study is also part of a long-standing research group focused on interdisciplinary education research. As a whole, the research team has well over 50 years of cumulative academic research experience, 5 of which were spent working together in STEM-H-specific education research initiatives. The goals of the research team remain grounded in collaborative approaches to all research projects, with special appreciation for interdisciplinary teams. This study was a culmination of the team’s many years of working with STEM-H faculty development and the research that followed.

References

  • American Educational Research Association. (2019). What is education research? Retrieved https://www.aera.net/About-AERA/What-is-Education-Research
  • Arthurs, L. A. (2019). Undergraduate geoscience education research: Evolution of an emerging field of discipline-based education research. Journal of Research in Science Teaching, 56(2), 118–19. https://doi.org/10.1002/tea.21471
  • Bodner, G. M. (2016). Changing how data are collected can change what we learn from discipline-based education research. 2016 physics education research conference proceedings. Sacramento, CA (American Association of Physics Teachers). Retrieved https://www.compadre.org/per/items/detail.cfm?ID=14294
  • Borrego, M., & Streveler, R. A. (2014). Preparing engineering educators for engineering education research. In A. Johri & B. M. Olds (Eds.), Cambridge handbook of engineering education research (pp. 457–475). Cambridge University Press.
  • Boyer, E. L. (1990). Scholarship reconsidered: Priorities of the professoriate. The Carnegie Foundation for the Advancement of Teaching.
  • Clegg, S. (2012). Conceptualising higher education research and/or development as ‘fields’: A critical analysis. Higher Education Research & Development, 31(5), 667–678. https://doi.org/10.1080/07294360.2012.690369
  • Dolan, E. L., Elliott, S. L., Henderson, C., Curran-Everett, D., St. John, K., & Ortiz, P. A. (2018). Evaluating discipline-based education research for promotion and tenure. Innovative Higher Education, 43(1), 31–39. https://doi.org/10.1007/s10755-017-9406-y
  • National Science Foundation. (2019). Building capacity for Science, Technology, Engineering, Mathematics (STEM) education research (ECR: BCSER). Program solicitation NSF 20-521. Retrieved https://www.nsf.gov/pubs/2020/nsf20521/nsf20521.htm?WT.mc_id=USNSF_25&WT.mc_ev=click
  • Guba, E., & Lincoln, Y. (1994). Competing paradigms in qualitative research. In N. Denzin & Y. Lincoln (Eds.), Handbook of qualitative research (pp. 163–194). Sage.
  • Henderson, C., Connolly, M., Dolan, E. L., Finkelstein, N., Franklin, S., Malcom, S., & St. John, K. (2017). Towards the STEM DBER alliance: Why we need a discipline-based STEM education research community. Journal of Engineering Education, 106(3), 349–355. https://doi.org/10.1186/s40594-017-0076-1
  • Hill, C. E. (2012). Consensual qualitative research: A practical resource for investigating social science phenomena. American Psychological Association.
  • Hill, C. E., Knox, S., Thompson, B. J., Williams, E. N., Hess, S. A., & Ladany, N. (2005). Consensual qualitative research: An update. Journal of Counseling Psychology, 52(2), 196. https://doi.org/10.1037/0022-0167.52.2.196
  • Howell, D. C. (2008). Fundamental statistics for the behavioral sciences (6th ed.). Wadsworth.
  • Istaplet, L. S., & Hancock, G. (2019). NSF Award #1937745 abstract: Quantitative Research Methods for STEM Education Scholars Program. Retrieved https://www.nsf.gov/awardsearch/showAward?AWD_ID=1937745&HistoricalAwards=false
  • Kanuka, H. (2011). Keeping the scholarship in the scholarship of teaching and learning. International Journal for the Scholarship of Teaching and Learning, 5(1). https://doi.org/10.20429/ijsotl.2011.050103
  • Larsson, M., Mårtensson, K., Price, L., & Roxå, T. (2017). Constructive friction? Exploring patterns between education research and the scholarship of teaching and learning. In Transforming patterns through the scholarship of teaching and learning: The 2nd European Conference for the Scholarship of Teaching and Learning (pp. 161–165). http://konferens.ht.lu.se/fileadmin/_migrated/content_uploads/Larsson_etal.pdf
  • Lo, S. M., Gardner, G. E., Reid, J., Napoleon-Fanis, V., Carroll, P., Smith, E., & Sato, B. K. (2019). Prevailing questions and methodologies in biology education research: A longitudinal analysis of research in CBE-Life sciences education and at the society for the advancement of biology education research. CBE-Life Sciences Education, 18(9), 1–10. https://doi.org/10.1187/cbe.18-08-0164
  • McBride, L. G., & Kanekar, A. S. (2015). The scholarship of teaching and learning: Origin, development, and implications for pedagogy in health promotion. Perspectives on Pedagogy, 1(1), 8–14. https://doi.org/10.1177/2373379914557498
  • McKinney, K. (2007). Enhancing learning through the scholarship of teaching and learning: The challenges and joys of juggling. Jossey-Bass.
  • Nadelson, L. (2016). The influence and outcomes of a STEM education research faculty community of practice. Journal of STEM Education, 17(1), 44–51. https://jstem.org/jstem/index.php/JSTEM/article/view/1898/1731
  • National Academies of Sciences, Engineering, and Medicine. (2018). Indicators for monitoring undergraduate STEM education. The National Academies Press. Retrieved https://www.nap.edu/catalog/24943/indicators-for-monitoring-undergraduate-stem-education
  • National Science Board. (2018). Our nation’s future competitiveness relies on building a STEM-capable U.S. workforce: A policy companion statement to science and engineering indicators. National Science Foundation. Retrieved https://www.nsf.gov/nsb/sei/companion-brief/NSB-2018-7.pdf
  • Ørngreen, R., & Levinsen, K. (2017). Workshops as a research methodology. Electronic Journal of E-learning, 15 (1), 70–81. Retrieved https://files.eric.ed.gov/fulltext/EJ1140102.pdf
  • Roman-Dixon, E. (2019). NSF Award #1937490 Abstract: Institute in Critical Quantitative, Computational, and Mixed Methods Training for Underrepresented Scholars. Retrieved https://www.nsf.gov/awardsearch/showAward?AWD_ID=1937490&HistoricalAwards=false
  • Shipley, T. F., McConnell, D., McNeal, K. S., Petcovic, H. L., & St. John, K. E. (2018). Transdisciplinary science education research and practice: Opportunities for GER in a developing STEM Discipline-Based Education Research Alliance (DBER-A). Journal of Geoscience Education, 65(4), 354–362. https://doi.org/10.5408/1089-9995-65.4.354
  • Talanquer, V. (2014). DBER and STEM education reform: Are we up to the challenge? Journal of Research in Science Teaching, 51(6), 809–819. https://doi.org/10.1002/tea.21162
  • Tipton, E. (2019). NSF Award #1937633 Abstract: Modern Meta-Analysis Research Institute. Retrieved https://www.nsf.gov/awardsearch/showAward?AWD_ID=1937633&HistoricalAwards=false
  • Webb, A. S. (2019). Navigating the lows to gain new heights: Constraints to SOTL engagement. The Canadian Journal for the Scholarship of Teaching and Learning, 10(2). https://doi.org/10.5206/cjsotl-rcacea.2019.2.8173
  • Wenger, E., McDermott, R. A., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Harvard Business Press.
  • Zhong, J., Ralston, P. A., Tinnell, T. L., Tretter, T., & Brown, M. (2020, June). Designing a streamlined workshop for STEM-H faculty engaged in the scholarship of teaching and learning. Paper presented at the 2020 ASEE Virtual Annual Conference.

Appendix A Data Collection Instruments

Pre-Survey

  1. Why did you choose to participate in this workshop? (broadly – motivations, reasons, curiosities, …)

  2. What do you hope to gain from participating in this workshop? (as specific as possible please – could be several)

  3. How would you characterize your prior experiences/knowledge of education research? (could include brief summaries of aspects such as if you’ve conducted a self-study of your teaching, for how long, etc.)

Post-Survey

Requested reflections on the most informative and least valuable aspects of the various topics addressed in the workshop series.

6-Month Follow Up Post-Interview

Semi-structured Interview Question Protocol (with follow-ups by interviewer depending on response):

  1. Thinking back on your experience with the workshop, do you feel like you were able to address some or parts of your original motivation to participate?

  2. At the end of the workshop we had you think about possible next steps, were you able to complete these, given the impact of COVID-19?

  3. Any future next steps, given all the changes due to the implications of COVID-19?

Appendix B

Data Analysis Decision Tree (not comprehensive)