Journal of Education for Teaching
International research and pedagogy
Volume 49, 2023 - Issue 4

How to promote student teachers’ research knowledge and skills online

Pages 569-582 | Received 13 Oct 2021, Accepted 12 Aug 2022, Published online: 08 Dec 2022

ABSTRACT

Promoting research knowledge and skills (RKS) is an important task of teacher training programmes. One way of fostering RKS is through research-based learning (RBL). However, student teachers often struggle with the acquisition of RKS and do not benefit from RBL like students of other study programmes. This paper suggests e-learning as one part of the solution to support the design of courses that prepare students for conducting research projects. We present an interactive e-learning module designed to develop student teachers’ RKS and analyse the effectiveness of (1) the e-learning module, (2) its combination with online instruction and (3) online instruction only. Unlike previous research, we assessed the development of student teachers’ RKS (n = 402) using a standardised test. Findings revealed that both e-learning formats were effective in developing students’ RKS. Moreover, the study provided evidence of the sustainability of learning gains obtained through the e-learning module. Both e-learning formats proved superior to online instruction only, which in turn did not promote students’ RKS significantly, regardless of whether the course taught research methods explicitly or dealt with the contents of research. Thus, the development of further e-learning opportunities in the context of RBL should receive more attention in the future.

Introduction

Assessing, using and generating empirical evidence is part of teachers’ professional competence and a foundation of their professional development (Sachs 2016; Evans, Waring, and Christodoulou 2017). Therefore, teachers require research knowledge and skills (RKS), which enable them to inform, evaluate and innovate their own everyday practice and that of their colleagues.

One approach to promote RKS is research-based learning (RBL). In Germany, RBL has gained increasing importance due to the implementation of the long-term school practicum (‘Praxissemester’), which in most of Germany’s federal states takes place during students’ master’s studies. As part of their practicum, student teachers are required to conduct their own research projects in which they analyse one aspect of teaching or school practices based on empirical methods and scientific theories (Weyland 2019).

However, student teachers often struggle with the content and pace in the courses preparing them for their research projects, as they are confronted with many unfamiliar concepts that they need to absorb and apply to their own research in a short amount of time (Bailey and van Harken 2014). Hence, student teachers tend to show little enthusiasm about research methods, which makes teaching these courses a ‘rocky road’ for many teacher educators (Christenson et al. 2002). Students’ lack of RKS, which also manifests in their practicum reports (Flores 2018), is regarded as one reason why student teachers often perceive RBL as being excessively demanding, stressful and frustrating (van Katwijk et al. 2019). This may contribute to the fact that they do not benefit from RBL as do students in non-teaching professions (Thiem, Preetz, and Haberstroh 2020).

To date, little is known about student teachers’ RKS, the areas they struggle with most and how these can be developed (Reis-Jorge 2005; van der Linden et al. 2015). The few existing studies focus on analysing the development of students’ RKS in traditional classroom settings and measure them via students’ self-reported knowledge or concept maps (van der Linden et al. 2012, 2015).

The current study thus presents an interactive e-learning module designed to promote student teachers’ RKS and analyses its effectiveness to inform future online practices in the context of RBL. We compare how far students develop their RKS through (1) the e-learning module, (2) its combination with online instruction and (3) online instruction only. In this study, the term online instruction exclusively refers to courses that took place as synchronous online meetings during scheduled course sessions. The e-learning module, by contrast, provided asynchronous self-study with automated feedback. To measure student teachers’ RKS, we used a standardised test. This allowed us to make differentiated statements about how students develop their RKS in the different cognitive dimensions needed for RBL.

Research-based learning and its cognitive requirements

In recent decades, research units have become part of many teacher education programmes worldwide. However, the ways in which research components have been integrated into the curricula vary and range from reading and interpreting educational research to carrying out research projects (van Katwijk et al. 2019). When we refer to RBL, we follow the definition of Huber (2014). According to Huber, RBL (which he prefers to call learning by inquiry) takes place when students conduct research in a self-regulated manner, starting with the development of a research question and leading to the presentation of its results.

RBL is regarded as a good opportunity to foster student teachers’ RKS (Gess, Deicke, and Wessels 2017). At the same time, student teachers need a certain degree of RKS to cope with the demands of RBL. To specify the cognitive requirements student teachers need for RBL in empirical educational research contexts, Cammann et al. (2018) developed a theoretical model that considers the knowledge and skills underlying the engagement with research (i.e., reading and using it) as well as the engagement in research (i.e., conducting it) (Borg 2010). According to the model, student teachers’ RKS is operationalised in this study by five empirically confirmed dimensions that represent the phases of a research cycle: research question (including the transformation of a practical problem into a research question, henceforth research question), research design, data collection, data analysis as well as interpretation and utilisation of data (Figure 1). Each of the dimensions is represented by three criteria that describe the concrete requirements for RBL independently of the methodology the student uses.

Figure 1. The requirements of RBL in empirical educational research contexts (translated from Cammann et al. 2018, with kind permission).


The e-learning module

Based on the requirements of RBL (Cammann et al. 2018; see Figure 1), an e-learning module was developed to encourage and assist student teachers in acquiring RKS. The e-learning module focuses on quantitative methods, which students perceive as especially difficult (Murtonen and Lehtinen 2003). In the e-learning module, the students proceed through the entire research process on the basis of a best-practice example, since the use of examples from research practice is one of the most important factors contributing to students’ acquisition of RKS (van der Linden et al. 2012). Corresponding to the phases of the research cycle, the e-learning module consists of five units, which take about 10 to 15 minutes each.

The e-learning module is oriented towards the principle of a goal-based scenario (GBS) (Schank et al. 1994). In a GBS, the learners have to solve problems to achieve a predefined goal that is relevant to them. The reasons for using elements of a GBS were twofold: First, it seemed suitable because the engagement with research methods is unpopular among many student teachers and a GBS creates a motivating learning environment. Second, the idea was to create a setting in which students can not only acquire RKS but also encounter situations of educational practice in which this knowledge can be utilised, enabling them to transfer it to their own research projects – typically the most difficult part for students.

Following the principle of a GBS, the e-learning module is based on a cover story: The users accompany a fellow student through conducting her research project. The acquisition of RKS is therefore continuously embedded within an authentic context. For example, when the students learn about the different types of sampling methods, they concurrently get to know how their fellow student could have realised these in her research project. During each research step, the students have to assist their fellow student with planning, designing and analysing her project (mission). Thereby, the users get the opportunity to apply their knowledge and skills. To fulfil this mission, they have to perform scenario operations, which, for example, include giving advice on research decisions, such as the selection of an appropriate research question. As in each GBS, the learners immediately get feedback on each task. They are either given the explanation for a wrong decision or verbally confronted with the consequences of their decisions (e.g., feedback on a data collection task: ‘If Isabel chose to collect the teachers’ subjects at her school in her questionnaire, she might violate the principle of anonymity’.).

Aside from the feedback for self-assessment, the e-learning module has further interactive components (Palacios and Evans 2013). Learners can control the pace and content presented to them. For instance, the e-learning module contains help buttons to look up technical terms or additional practical examples for ambitious students. In this way, the e-learning module accounts for the heterogeneous prior knowledge of its learners.
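
To make the feedback mechanism concrete, the following minimal sketch (our illustration, not the authors’ implementation; the class names, option texts and feedback strings are hypothetical) shows how a GBS task with immediate, choice-specific feedback could be represented:

```python
from dataclasses import dataclass

@dataclass
class ScenarioOption:
    """One possible decision the learner can recommend (illustrative)."""
    text: str
    correct: bool
    feedback: str  # explanation of a wrong choice or the consequence of the decision

@dataclass
class ScenarioTask:
    """A goal-based-scenario task: the learner advises the fictional fellow student."""
    mission: str
    options: list[ScenarioOption]

    def answer(self, choice: int) -> str:
        """Return immediate feedback for the selected option."""
        option = self.options[choice]
        prefix = "Correct: " if option.correct else "Not quite: "
        return prefix + option.feedback

# Hypothetical data-collection task modelled on the example quoted above.
task = ScenarioTask(
    mission="Help Isabel design the questionnaire for her research project.",
    options=[
        ScenarioOption(
            text="Also record each teacher's subject.",
            correct=False,
            feedback=("If Isabel collects the teachers' subjects at her school, "
                      "she might violate the principle of anonymity."),
        ),
        ScenarioOption(
            text="Record only the variables needed to answer the research question.",
            correct=True,
            feedback="Collecting only the necessary variables protects anonymity.",
        ),
    ],
)

print(task.answer(0))  # immediate feedback on the first (incorrect) decision
```

The property mirrored here is that every scenario operation returns feedback at once, either explaining a wrong decision or spelling out its consequences.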

The present study

The aim of this study is to analyse the effectiveness of the e-learning module, the e-learning module combined with online instruction and online instruction only. For that reason, we compare the following groups: The first intervention group (IG1) completed all five e-learning units consecutively at the beginning of the semester. In the second intervention group (IG2), the e-learning units were contextualised within research methods courses and completed on a weekly basis. In the control groups, the students received online instruction exclusively. In the first control group (CG1), the students were systematically trained in research methods. To examine whether student teachers also acquire RKS when research methods are not explicitly covered as a topic, the second control group (CG2) dealt with instructional research from a content-related perspective. Hence, we address the following research questions:

(I) To what extent do student teachers acquire RKS through

(1) the e-learning module (IG1)

(2) the e-learning module combined with online instruction (IG2)

(3a) online instruction focusing on research methods (CG1)

(3b) online instruction focusing on the contents of teacher research (CG2)?

We predict that student teachers in all groups will improve their RKS. We assume that students in IG2 will have the highest learning gains. Since students’ RKS were developed in greater depth and over a longer period in CG1 than in IG1, we predict higher growth for the former group. We expect the smallest increase in RKS among student teachers in CG2, as the course did not explicitly focus on research methods. Furthermore, we assume that the students will have gains in all dimensions of RKS. Since data on the effectiveness of the different units of the e-learning module do not yet exist, we have not formulated specific hypotheses on how the different units will promote student teachers’ RKS.

(II) Can the learning effects of the e-learning module in IG1 be identified at the end of the semester?

We predict that the students will be more competent in data analysis when tested again at the end of the semester (i.e., three months later) because the courses in IG1 focused on teaching statistical knowledge. Regarding the other four dimensions, we assume that the students will perform worse compared to their RKS measured immediately after the completion of the e-learning module due to forgetting effects.

Methodology

Context

The study was carried out at the University of Cologne during the academic year 2020/21. To attain a teaching degree, students have to earn bachelor’s and master’s degrees. The investigated courses were offered to student teachers starting their master’s studies. They had not previously attended a course on empirical research methods in the context of their educational studies. As part of the courses, the students were prepared to conduct a research project during their long-term school practicum in the second semester. All seminars were accompanied by a lecture on the basics of qualitative and quantitative research. We examined students’ development of RKS in ten courses, each comprising eight content sessions of 90 minutes. The courses were taught by three different lecturers with 10 to 45 years of teaching experience. To minimise potential teaching effects, we tried to make sure that each lecturer taught intervention as well as control groups; however, this was not possible for CG2.

Design and participants

Based on a convenience sample, a quasi-experimental design was realised, as it was not feasible for institutional reasons to randomly assign participants to the conditions, i.e. the courses (Figure 2). To avoid self-selection bias, the instructional format of the courses was not mentioned in the course catalogue.

Figure 2. Graphical representation of the research design. The Roman numerals display the e-learning units; the rectangles represent the online sessions (Figure created by the authors).


IG1, in which the student teachers worked through all five e-learning units at the start of the semester, consisted of four courses. To assess the isolated effect of the e-learning module on students’ RKS, the students were tested before and after completing the e-learning module. They were asked to complete the e-learning module and both tests by the third week of each course. In the penultimate week of the semester, they took an additional follow-up test to measure long-term training effects. IG2 also comprised four seminars. In this condition, the students completed the e-learning units in preparation for the respective online sessions. Pre- and post-tests took place during the second and second-last weeks of the semester. The same applied to both control groups, which contained only one seminar each, as the study focused on investigating the effectiveness of the e-learning module and its combination with online instruction.

All tests were integrated into the seminars and administered online. During data collection, university ethics procedures were followed. An eight-digit code that was created by each student and based on non-sensitive information (e.g., the first two letters of the mother’s name) served to match time points and ensure anonymity.

Overall, 402 student teachers participated in the study (pre-test: 391, post-test: 371, follow-up: 98). Twelve students had to be excluded from the analysis because they reported in the post-test that they had not used the e-learning module; four students were excluded because they attended two courses at the same time. In addition, one participant at the pre-test, ten at the post-test (IG1: 3, IG2: 7) and three at the follow-up were excluded as error outliers (Aguinis, Gottfredson, and Joo 2013): these students took less than 50% of the average processing time to complete the questionnaires, so their observations were not regarded as valid representations of the construct under investigation.
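
The matching via the anonymous code and the processing-time rule can be sketched as follows; this is a minimal illustration with hypothetical file and column names, not the original analysis script:

```python
import pandas as pd

# Assumed layout: 'code' holds the anonymous eight-digit code, 'time_sec' the processing time.
pre = pd.read_csv("pretest.csv")
post = pd.read_csv("posttest.csv")

def drop_error_outliers(df: pd.DataFrame) -> pd.DataFrame:
    """Remove responses completed in less than 50% of the average processing time."""
    threshold = 0.5 * df["time_sec"].mean()
    return df[df["time_sec"] >= threshold]

pre, post = drop_error_outliers(pre), drop_error_outliers(post)

# Match the two time points via the student-generated code.
matched = pre.merge(post, on="code", suffixes=("_pre", "_post"))
print(len(matched), "students completed both the pre-test and the post-test")
```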

A total of 345 student teachers completed both the pre-test and post-test (IG1: 135, IG2: 107, CG1: 70, CG2: 33) (Table 1). This corresponds to return rates of 92% for IG1, 90% for IG2, 95% for CG1 and 97% for CG2. The majority of the students were female (74%) and in the first semester of their master’s programme (85%). Their average age was 24.8 years (SD = 3.3). The participants were enrolled in the five types of teacher training programmes offered at the University of Cologne (primary school: 8%, lower secondary school: 14%, lower and upper secondary school: 37%, special needs education: 42%, vocational school: 1%). These percentages correspond approximately to those of the university’s general population of master’s student teachers.

Table 1. Demographic characteristics for both intervention (IG1 and IG2) and control groups (CG1 and CG2).

Course contents, structure and methods

In IG2 and CG1, students received instruction on research methods. The content taught in these seminars (Table 2) was identical and structured according to the five phases of the research process, covering the requirements needed for RBL (Cammann et al. 2018; see Figure 1). After an introduction to the research process, one to two sessions were usually dedicated to each phase of the research cycle with a focus on quantitative methods. The sessions comprised brief input presentations by the lecturer and tasks in which the students had to apply their knowledge to examples from practice in groups (van der Linden et al. 2012).

Table 2. Content taught in IG2 and CG1.

The focus of IG1 was on promoting students’ statistical knowledge (e.g., measures of central tendency, significance testing). In doing so, the courses in IG1 also touched upon the basics of empirical research (e.g., types of hypotheses, research instruments). However, they did not teach RKS step by step along the phases of the research process like IG2 or CG1. The statistical concepts were addressed in the context of educational questions, for example, measuring the influence of clarity of instruction on student achievement. During each session, the relevant theoretical and statistical concepts were presented and explained by the lecturer. Afterwards, the students calculated the statistics using fictitious data and discussed the results in groups.

CG2 dealt with research on a content-related level. The course addressed aspects of teaching (e.g., instructional quality, diagnostics) and school development, different research approaches to the teaching profession (e.g., the expert paradigm) as well as empirical studies within these approaches. For each of the topics, the students presented and discussed relevant theoretical concepts, research results and exemplary studies in groups.

Instrument

To assess student teachers’ development of RKS, a standardised test (Cammann et al. 2020) was used. In the test, participants were confronted with situations that student teachers encounter when conducting a research project during their long-term practicum. The test is based on the model of the requirements of RBL (Cammann et al. 2018; see Figure 1). Each of the five dimensions is represented by seven to twelve items (see sample items in the Supplements). All criteria are covered by at least one item referring to the quantitative or qualitative research paradigm or by several items that equally pertain to both research paradigms.

A scaling analysis according to item response theory (IRT) was conducted using the software ConQuest (Adams, Wu, and Wilson 2015), which attributes (a) a difficulty parameter to each item based on the solution rate and (b) an ability parameter to each participant according to their performance, using a maximum likelihood procedure. To increase the analytical power (Bond, Yan, and Heene 2020), observations from all three time points were included in one scaling file (concurrent scaling) (von Davier, Carstensen, and von Davier 2006).

One-dimensional and five-dimensional IRT scaling analyses were performed to find evidence of dimensionality in the data. EAP reliability, which is comparable to Cronbach’s alpha, was good for the one-dimensional model (.81) and at least acceptable for the five dimensions (research question .70; research design .80; data collection .79; data analysis .66; interpretation and utilisation of data .78). All weighted mean squares fell into the recommended range for the one-dimensional (0.87 < MNSQ < 1.12) as well as for the five-dimensional model (0.90 < MNSQ < 1.11) (Bond, Yan, and Heene 2020). On average, item discrimination was good (rit = .32). A comparison of the global model fit (Akaike information criterion, Bayesian information criterion and χ² test, see Supplements) indicated that the five-dimensional model fitted the data better than the one-dimensional model. Due to the high intercorrelations between some dimensions (.64 < r < .95) (see Supplements) and to gain a general impression of the development of student teachers’ RKS, we will additionally report analyses based on the overall test scores.
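
The scaling itself was carried out in ConQuest; purely to illustrate how an IRT analysis attributes difficulty parameters to items and ability parameters to persons, the following self-contained sketch fits a simple Rasch model by joint maximum likelihood to simulated data. The data, the estimation method and the unweighted fit statistic are simplifications and assumptions on our part; ConQuest uses marginal maximum likelihood and weighted mean squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated dichotomous responses (placeholder for the real test data): 300 persons x 45 items.
n_persons, n_items = 300, 45
true_theta = rng.normal(0, 1, n_persons)          # person abilities
true_b = rng.normal(0, 1, n_items)                # item difficulties
prob = 1 / (1 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random((n_persons, n_items)) < prob).astype(float)

def rasch_jml(X, n_iter=50):
    """Joint maximum likelihood estimation for the Rasch model (didactic sketch)."""
    theta = np.zeros(X.shape[0])
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # Newton-Raphson step for person abilities with item difficulties held fixed ...
        p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
        theta += (X - p).sum(axis=1) / (p * (1 - p)).sum(axis=1)
        theta = np.clip(theta, -5, 5)   # keep estimates for perfect/zero scores finite
        # ... then for item difficulties with person abilities held fixed.
        p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
        b -= (X - p).sum(axis=0) / (p * (1 - p)).sum(axis=0)
        b -= b.mean()                   # identify the scale: mean item difficulty = 0
    return theta, b

theta_hat, b_hat = rasch_jml(X)

# Unweighted (outfit) mean-square fit per item, a simplified cousin of the weighted MNSQ above.
p = 1 / (1 + np.exp(-(theta_hat[:, None] - b_hat[None, :])))
outfit = ((X - p) ** 2 / (p * (1 - p))).mean(axis=0)
print("first five item difficulties:", np.round(b_hat[:5], 2))
print("first five outfit MNSQ values:", np.round(outfit[:5], 2))
```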

Data analysis

Data analysis was conducted using IBM SPSS Statistics. The students’ answers were coded as correct (1) or incorrect (0). A one-way ANOVA was first used to analyse whether the groups differed in their RKS at the pre-test. Subsequently, descriptive analyses were performed on the groups’ pre- and post-test scores. The development of student teachers’ RKS between the pre- and post-tests as well as between the post-test and follow-up was examined with paired t-tests. To identify differences in the development of RKS between the four groups, we used a repeated-measures ANOVA with time as the within-subject factor and group as the between-subject factor, followed by Bonferroni post hoc tests.
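
As the analyses were run in SPSS, the following sketch only outlines the main steps in Python for readers who wish to retrace them. The file name, column names and the paired-samples variant of Cohen’s d (computed from the difference scores) are assumptions on our part, and the 2 × 4 mixed ANOVA with Bonferroni post hoc tests is not reproduced here:

```python
import pandas as pd
from scipy import stats

# Hypothetical data layout: one row per student with columns
# 'group' (IG1, IG2, CG1, CG2), 'pre' and 'post' (overall test scores).
scores = pd.read_csv("matched_scores.csv")

# One-way ANOVA: did the four groups differ in RKS at the pre-test?
pre_by_group = [g["pre"].to_numpy() for _, g in scores.groupby("group")]
print(stats.f_oneway(*pre_by_group))

# Paired t-test and an effect size for the pre-post development within each group.
for name, g in scores.groupby("group"):
    t, p = stats.ttest_rel(g["post"], g["pre"])
    diff = g["post"] - g["pre"]
    d = diff.mean() / diff.std(ddof=1)   # Cohen's d based on difference scores
    print(f"{name}: t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```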

Results

There was no significant difference among the groups’ RKS at the pre-test [F(3, 341) = 1.89, p = .131, ηp2 = .02]. As assumed, the students in both intervention groups increased their RKS significantly (p < .001) (Table 3). While the effect size was small for the student teachers in IG1 (d = .39), it was moderate for the students in IG2 (d = .59) (Cohen 1988). Contrary to our hypothesis, no significant difference between the participants’ overall test scores at the beginning and the end of the semester was found in the control groups.

Table 3. Student teachers’ overall test scores at the pre- and post-tests.

A 2 × 4 ANOVA (time × groups) yielded significant main effects for the factor time [F(1, 341) = 6.34, p = .012, ηp2 = .02] and group [F(3, 341) = 6.02, p = .001, ηp2 = .05] as well as a significant interaction [F(3, 341) = 8.13, p < .001, ηp2 = .07]. In line with our hypothesis, Bonferroni-adjusted post hoc tests showed that the students in IG2 improved their RKS significantly more than the students in CG1 (p = .009) and CG2 (p = .013). In addition, the students in IG1 achieved a significantly greater increase in their mean test scores from pre- to post-test than the students in both control groups (p ≤ .032). Unexpectedly, there was no significant difference between the two intervention groups.

Our hypothesis that the students would improve their RKS in all dimensions was only partly confirmed, as student teachers in all groups performed worse on the dimension research question at the post-test (Table 4). In all other dimensions, students in the intervention groups improved their knowledge significantly with small to medium effect sizes. In both control groups, the students did not enhance their knowledge significantly in any of the five dimensions. Only CG1’s increase in the dimension data analysis showed a trend towards significance (p = .093) with a small effect size (d = .20).

Table 4. Student teachers’ test scores in the five dimensions at the pre- and post-tests.

Contrary to our hypothesis that there would be forgetting effects in IG1, the overall scores of the student teachers who completed the e-learning module at the beginning of the semester did not decrease significantly between the post-test and follow-up [t(87) = −0.28, p = .784, d = .03]. As expected, there was a small gain in the dimension data analysis; however, it was not significant [t(87) = −1.50, p = .138, d = .16].

Discussion

Main research findings

This study examined the extent to which student teachers acquire RKS through the presented e-learning module, online instruction and their combination. The e-learning module as a ‘stand-alone’ format (IG1) as well as the e-learning module combined with online instruction (IG2) proved to be effective educational formats. Thus, the design of the e-learning module seems suitable for promoting student teachers’ RKS and can serve as a reference point for developing future digital learning opportunities in this context. The descriptive results indicate the superiority of the combination of the e-learning module and online instruction. This corresponds to findings on blended learning showing that combined formats are more effective in terms of student achievement (Bernard et al. 2014).

Both the e-learning module as a ‘stand-alone’ and in conjunction with online instruction were more effective than the two online instruction courses. Even though the increase was only small in IG1, it can still be regarded as substantial considering that the students merely received an e-learning module of about one and a half hours and had no interaction with their peers or lecturers.

Remarkably, neither of the two online instruction seminars in this sample improved student teachers’ RKS on average, regardless of whether it explicitly taught research methods (CG1) or dealt with research from a content-related perspective (CG2), although all seminars were accompanied by a lecture on research methods. One explanation for the absence of learning gains might be the lack of fit between test and course contents. This explanation, however, only applies to CG2, since the contents in CG1 were constructively aligned with the test (Biggs and Tang 2011). The students in CG1 might not have improved their RKS due to a larger course size, which might have led to less active participation and student engagement. The interactive e-learning module, by contrast, actively engaged the students in the learning and research process, which is likely to have contributed to its effectiveness. On the one hand, the e-learning module allowed the students to constantly apply and assess their knowledge. On the other hand, by simulating a research process, the e-learning module tried to convey the feeling of being an active participant in conducting a research project, which has been shown to help students benefit from research (Healey 2005).

A closer look at the development of RKS in the five dimensions revealed that student teachers in the intervention groups improved their RKS in all dimensions, apart from the dimension research question. As students’ RKS declined in this dimension in all groups independently of the content covered, this development might be associated with the difficulty of items in the dimension, which was already far below average in the pre-test. In future studies, students’ apparent decline in this dimension should therefore be scrutinised. The fact that students were able to increase their knowledge in all other dimensions is important evidence of the e-learning module’s effectiveness. It shows that the e-learning module promotes students’ knowledge in dimensions in which they had comparatively high (research design) and low (data analysis) prior knowledge. Nevertheless, the relatively low scores in analysing and interpreting data at the post-test suggest that student teachers struggle particularly with these dimensions. Since basic statistical knowledge and the ability to interpret empirical studies or standardised tests (such as PISA) are essential aspects of teachers’ professional competence, future courses should focus specifically on these dimensions.

Additionally, we investigated the sustainability of learning gains achieved through the e-learning module for those students who completed it at the beginning of the semester. The findings revealed that student teachers’ RKS remained high after three months. On the basis of the results, reasons for this sustainability can only be suggested tentatively. It seems possible that elements of the e-learning module (e.g., case-based learning, active engagement, immediate feedback) facilitated sustainable learning. In addition, the fact that students were repeatedly confronted with related concepts in the seminars and the lecture might have refreshed their RKS and thus prevented forgetting effects.

Limitations and directions for future research

When interpreting the findings of this study, the following aspects need to be taken into account: First, we used a convenience sample and a quasi-experimental design. Therefore, the results in general, and those for the control groups in particular, should be treated with caution, as both control groups contained only one seminar each (70 and 30 students, respectively). Due to sample size limitations, the influence of further factors (e.g., teacher training programmes) could not be considered in the statistical calculations. Second, the standardised test used in this study does not cover all facets of research. Thus, the students in all groups might have also acquired knowledge on other aspects of research (e.g., knowledge of current research findings or calculating statistics). Third, the course sessions took place online due to the Covid-19 pandemic. Even though studies show that online instruction is at least as effective in developing student knowledge as face-to-face instruction in higher education (Ebner and Gegenfurtner 2019), the specific circumstances of the study must be considered. Hence, it is desirable to perform replication studies to examine whether the e-learning formats are also more effective when online instruction is replaced by traditional classroom teaching.

In addition, the findings on the effectiveness of the e-learning module are limited by the context in which it was used. Since the implementation of RBL differs significantly across German universities, further studies are necessary to analyse the e-learning module’s effectiveness in other curricular contexts. Finally, the question remains to what extent students are capable of transferring the acquired knowledge and skills to their research projects and to what extent they further develop their RKS while conducting research. It would also be interesting to explore to what extent the different instructional settings foster student teachers’ affective-motivational dispositions. Promoting these dispositions is also important to motivate student teachers to put the acquired knowledge into practice (Blömeke, Gustafsson, and Shavelson 2015). Therefore, a follow-up study is already being carried out in which these three aspects are investigated. Knowing when and to what extent students develop the different facets of research competence will provide important information on how to support RBL in the context of long-term school practica more effectively.

Acknowledgements

We would like to thank the teacher educators and student teachers who participated in this study.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work is part of a larger project called “Zukunftsstrategie Lehrer*innenbildung Köln (ZuS): Inklusion und Heterogenität gestalten”. ZuS is part of the ’Qualitätsoffensive Lehrerbildung’ (Quality Initiative for Initial Teacher Education), a joint initiative of the Federal Government and the Länder which aims to improve the quality of teacher training. The programme is funded by the Federal Ministry of Education and Research [grant number 01JA1815]. The work was supported by the Schmittmann-Wahlen Stiftung.

References

  • Adams, R. J., M. L. Wu, and M. R. Wilson. 2015. ACER ConQuest: Generalised Item Response Modelling Software. Version 4 [computer programme]. Camberwell: Australian Council for Educational Research.
  • Aguinis, H., R. K. Gottfredson, and H. Joo. 2013. “Best-Practice Recommendations for Defining, Identifying, and Handling Outliers.” Organizational Research Methods 16 (2): 270–301. doi:10.1177/1094428112470848.
  • Bailey, N. M., and E. M. van Harken. 2014. “Visual Images as Tools of Teacher Inquiry.” Journal of Teacher Education 65 (3): 241–260. doi:10.1177/0022487113519130.
  • Bernard, R. M., E. Borokhovski, R. F. Schmid, R. M. Tamim, and P. C. Abrami. 2014. “A Meta-Analysis of Blended Learning and Technology Use in Higher Education: From the General to the Applied.” Journal of Computing in Higher Education 26 (1): 87–122. doi:10.1007/s12528-013-9077-3.
  • Biggs, J., and C. Tang. 2011. Teaching for Quality Learning at University: What the Student Does. 4th ed. Maidenhead: Open University Press.
  • Blömeke, S., J. E. Gustafsson, and R. J. Shavelson. 2015. “Beyond Dichotomies: Competence Viewed as a Continuum.” Zeitschrift für Psychologie 223 (1): 3–13. doi:10.1027/2151-2604/a000194.
  • Bond, T. G., Z. Yan, and M. Heene. 2020. Applying the Rasch Model: Fundamental Measurement in the Human Sciences. 3rd ed. New York: Routledge. doi:10.4324/9780429030499.
  • Borg, S. 2010. “Language Teacher Research Engagement.” Language Teaching 43 (3): 391–429. doi:10.1017/S0261444810000170.
  • Cammann, F., K. Darge, K. Kaspar, and J. König. 2018. “Anforderungen Forschenden Lernens im Praxissemester: Entwicklung eines Modells und erste empirische Befunde zur Validität.” [Requirements of Research-Based Learning During the Practical Semester: Development of a Model and First Empirical Results Concerning Its Validity]. Herausforderung Lehrer_innenbildung 1 (2): 17–34. doi:10.4119/UNIBI/hlz-57.
  • Cammann, F., K. Darge, K. Kaspar, and J. König. 2020. “Forschendes Lernen in der Lehrkräftebildung: Erfassung und Struktur von studentischen Kompetenzen.” [Research-Based Learning in Teacher Training: Measurement and Structure of Student Competencies]. In Evidenzbasierung in der Lehrkräftebildung, Edition ZfE, Band 4, edited by I. Gogolin, B. Hannover, and A. Scheunpflug, 13–37. Wiesbaden: Springer VS. doi:10.1007/978-3-658-22460-8_2.
  • Christenson, M., R. Slutsky, S. Bendau, J. Covert, J. Dyer, G. Risko, and M. Johnston. 2002. “The Rocky Road of Teachers Becoming Action Researchers.” Teaching and Teacher Education 18 (3): 259–272. doi:10.1016/S0742-051X(01)00068-3.
  • Cohen, J. 1988. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale: Erlbaum. doi:10.4324/9780203771587.
  • Ebner, C., and A. Gegenfurtner. 2019. “Learning and Satisfaction in Webinar, Online, and Face-to-Face Instruction: A Meta-Analysis.” Frontiers in Education 4 (92): 1–11. doi:10.3389/feduc.2019.00092.
  • Evans, C., M. Waring, and A. Christodoulou. 2017. “Building Teachers’ Research Literacy: Integrating Practice and Research.” Research Papers in Education 32 (4): 403–423. doi:10.1080/02671522.2017.1322357.
  • Flores, M. A. 2018. “Linking Teaching and Research in Initial Teacher Education: Knowledge Mobilisation and Research-Informed Practice.” Journal of Education for Teaching 44 (5): 621–636. doi:10.1080/02607476.2018.1516351.
  • Gess, C., W. Deicke, and I. Wessels. 2017. “Kompetenzentwicklung durch Forschendes Lernen.” [Competence Development Through Research-Based Learning]. In Forschendes Lernen: Wie die Lehre in Universität und Fachhochschule erneuert werden kann, edited by H. Mieg, and J. Lehmann, 79–90. Frankfurt: Campus.
  • Healey, M. 2005. “Linking Research and Teaching to Benefit Student Learning.” Journal of Geography in Higher Education 29 (2): 183–201. doi:10.1080/03098260500130387.
  • Huber, L. 2014. “Forschungsbasiertes, Forschungsorientiertes, Forschendes Lernen: Alles dasselbe? Ein Plädoyer für eine Verständigung über Begriffe und Unterscheidungen im Feld forschungsnahen Lehrens und Lernens.” [Research-Led, Research-Oriented, Research-Based Learning: All the Same? A Plea for an Agreement on the Terms and Their Differentiation in the Field of Research-Related Teaching and Learning]. Das Hochschulwesen 62 (1+2): 22–29.
  • Murtonen, M., and E. Lehtinen. 2003. “Difficulties Experienced by Education and Sociology Students in Quantitative Methods Courses.” Studies in Higher Education 28 (2): 171–185. doi:10.1080/00313830500109568.
  • Palacios, L., and C. Evans. 2013. The Effect of Interactivity in E-Learning Systems. Newcastle upon Tyne: Cambridge Scholars Publishing.
  • Reis-Jorge, J. M. 2005. “Developing Teachers’ Knowledge and Skills as Researchers: A Conceptual Framework.” Asia-Pacific Journal of Teacher Education 33 (3): 303–319. doi:10.1080/13598660500286309.
  • Sachs, J. 2016. “Teacher Professionalism: Why Are We Still Talking About It?” Teachers and Teaching: Theory and Practice 22 (4): 413–425. doi:10.1080/13540602.2015.1082732.
  • Schank, R. C., A. Fano, B. Bell, and M. Jona. 1994. “The Design of Goal-Based Scenarios.” The Journal of the Learning Sciences 3 (4): 305–345. doi:10.1207/s15327809jls0304_2.
  • Thiem, J., R. Preetz, and S. Haberstroh. 2020. “‘Warum soll ich forschen?’: Wirkungen Forschenden Lernens bei Lehramtsstudierenden.” [‘Why Should I Do Research?’: Effects of Research-Based Learning among Student Teachers]. Zeitschrift für Hochschulentwicklung 15 (2): 187–207. doi:10.3217/zfhe-15-02/10.
  • van der Linden, W., A. Bakx, A. Ros, D. Beijaard, and L. van den Bergh. 2015. “The Development of Student Teachers’ Research Knowledge, Beliefs and Attitude.” Journal of Education for Teaching 41 (1): 4–18. doi:10.1080/02607476.2014.992631.
  • van der Linden, W., A. Bakx, A. Ros, D. Beijaard, and M. Vermeulen. 2012. “Student Teachers’ Development of a Positive Attitude Towards Research and Research Knowledge and Skills.” European Journal of Teacher Education 35 (4): 1–19. doi:10.1080/02619768.2011.643401.
  • van Katwijk, L., A. Berry, E. Jansen, and K. van Veen. 2019. “‘It’s Important, but I’m Not Going to Keep Doing It!’: Perceived Purposes, Learning Outcomes, and Value of Student Teacher Research among Educators and Student Teachers.” Teaching and Teacher Education 86: 1–11. doi:10.1016/j.tate.2019.06.022.
  • von Davier, A. A., C. H. Carstensen, and M. von Davier. 2006. “Linking Competencies in Horizontal, Vertical, and Longitudinal Settings and Measuring Growth.” In Assessment of Competencies in Educational Contexts, edited by J. Hartig, E. Klieme, and D. Leutner, 53–80. Göttingen: Hogrefe.
  • Weyland, U. 2019. “Forschendes Lernen in Langzeitpraktika: Hintergründe, Chancen und Herausforderungen.” [Research-Based Learning in Long-Term Internships: Backgrounds, Chances and Challenges]. In Herausforderung Kohärenz: Praxisphasen in der universitären Lehrerbildung, edited by M. Degeling, N. Franken, S. Freund, S. Greiten, D. Neuhaus, and J. Schellenbach-Zell, 25–64. Bad Heilbrunn: Verlag Julius Klinkhardt. doi:10.25656/01:17265.