TEACHER EDUCATION & DEVELOPMENT

Difficulties pre-service science teachers encountered in conducting research projects at teacher education college

Article: 2196289 | Received 15 Feb 2023, Accepted 24 Mar 2023, Published online: 01 Apr 2023

Abstract

The purpose of this study was to investigate the difficulties pre-service teachers experience in conducting and writing research projects. A concurrent mixed research design was utilized to identify the primary difficulties. To collect the data, the study employed a questionnaire and document analysis. A closed-ended questionnaire was administered to natural science candidates at a teacher education college. The questionnaire data were analyzed using statistical procedures such as frequencies and percentages. For the document analysis, nine research papers were selected and analyzed against a set of evaluation criteria. Two coding cycles (in vivo and axial) were used to evaluate the quality of the papers, and the results were embedded in the quantitative questionnaire data. Contrary to previous findings, the most challenging area candidates struggled with when conducting research was found to be a lack of knowledge and skills in analyzing, organizing, and interpreting data. It was also found that the candidates lacked primarily procedural knowledge, and to some extent epistemic knowledge, needed to accomplish research projects. Thus, it was suggested that courses and delivery strategies should include high-level knowledge, such as procedural and epistemic knowledge, to promote students’ problem-solving skills.

1. Introduction

A country’s social and economic progress is inextricably connected to the quality of education provided to learners. Thus, to maintain quality education it becomes imperative to integrate research with education to satisfy rapidly growing and constantly changing societal needs. In this regard, educational research, which entails systematic investigation and application of empirical methods to solve educational challenges, plays a critical role and is required to contribute to the existing body of knowledge by providing solutions to a variety of pedagogical problems while also improving teaching and learning practices (Gay et al., Citation2012; Johnson & Christensen, Citation2008).

In most African countries, including Ethiopia, teaching-learning practices have been criticized for failing to prepare students for today’s challenges (Alemu et al., Citation2017; Joshi & Verspoor, Citation2013). As a result, in this era, higher-level scientific knowledge and skills, such as critical thinking, problem-solving, scientific reasoning, and scientific inquiry, are more vital than ever (Sjøberg, Citation2018). This implies that students need adequate knowledge and skills to engage in high-level activities, such as performing research projects, solving daily problems, and practicing new tools, ideas, and discoveries. Such learners can handle even the most difficult tasks with little help from their instructors (Carr et al., Citation2015). Furthermore, there is a correlation between such autonomous learning and the development of research skills (Azmi & Daud, Citation2018). Therefore, it should be a concern of anyone engaged in the teaching profession to examine teaching-learning activities and make sure that students are actually learning as intended. Questioning and reflecting on one's work in such a manner usually paves the way for further investigation.

In line with this, Haller and Kleine (Citation2001) pointed out that well-crafted research begins with a clearly stated research problem, which may be derived from gaps in the existing literature or from an unsolved problem in practice. The rationale for conducting this research, therefore, emanated from the experience of the first author in teaching consecutive research courses. The courses cover the theoretical concepts of educational research and discuss how to write scientific research and techniques for research report writing. The courses were designed so that candidates exercise their skills in identifying research problems, forming a hypothesis, researching literature, gathering and analyzing data, drawing conclusions based on evidence, and eventually writing and presenting research reports. Thus, the course was intended to enable the pre-service teachers to examine the underlying assumptions behind their day-to-day performances (teaching-learning activities), find out what works and what does not work for them in their local school setting, and then apply the new knowledge gained and skills developed for the benefit of their students and their own professional development.

In view of this, in the subsequent practical research course, the candidates conduct their research in groups of five to six members in parallel with the practicum, when they leave campus for about six weeks to their respective catchment schools for actual teaching and learning practice. However, the effectiveness of the candidates’ performance is heavily determined by how the theoretical course is designed and delivered and by the knowledge and skills the candidates gained from it, because the second course is self-regulated project work to be accomplished by the candidates themselves. In this course, candidates are expected to devise what to do, manage the overall work, practice researching on their own without much dependence on their instructors, and take responsibility for completing it within the allocated time. This implies that such tasks demand that candidates be equipped with the abilities to think creatively, reasonably, and critically, and to develop skills of communication and collaboration with others (Halpern, Citation2014). Hence, there is a need to find out whether candidates are equipped with the reasoning abilities that could enable them to accomplish such demanding tasks.

Although there are studies that have identified the challenges researchers face in conducting research work, most focused on in-service teachers (deBorja, Citation2018; O’connor et al., Citation2006), and they did not examine how reasoning skills relate to conducting research projects. Thus, it is vital to investigate the higher-order thinking abilities candidates need to conduct research projects, as such practical issues require further investigation and alternative solutions.

To this end, the authors’ frequent observations of class activities, evaluations of research project papers and presentations, and the naive responses given to questions raised made the candidates’ immature reasoning ability apparent, which prompted the researchers to conduct this study. The purpose of this study was, therefore, to identify the difficulties that the candidates faced both in conducting a research project and in writing a research report, in relation to the different forms of scientific knowledge needed to conduct research. To achieve this, the following basic research questions were developed:

1. What difficulties do candidates encounter in conducting research projects?

2. In what research skills do candidates experience difficulty when writing research reports?

2. Conceptual framework

African nations, including Ethiopia, are struggling with several challenges, including political unrest due to ethnic disputes, poverty, corruption (power is viewed as a source of wealth), students’ poor academic performance, and other societal concerns (Joshi & Verspoor, Citation2013; MoE & JICA, Citation2016). Conducting research projects is one of the significant tools for producing new knowledge, resolving conflicts, and solving a variety of human and environmental problems (Fayomi et al., Citation2018). To tackle global, local, and personal problems, learners must be equipped with the knowledge required to identify problems that are amenable to research as well as the necessary skills for observation, measurement, data gathering, analysis, and evaluation. To this end, there is a need to identify the major difficulties that might hinder candidates from conducting research projects properly and to determine how to solve these problems, which necessitates the use of relevant abilities.

Some studies have revealed that one of the difficulties in conducting research projects for beginners is their inadequate knowledge of the research process (Akyürek & Afacan, Citation2018; Rahman et al., Citation2014). According to the literature, finding a researchable problem is the most challenging step for novice researchers (Gay et al., Citation2012). They also argued that this problem even causes anxiety and stress among candidates. Others faced difficulty in writing a literature review because it takes candidates a long time to identify relevant, related, and up-to-date literature on the selected topic (deBorja, Citation2018; O’connor et al., Citation2006; Toquero, Citation2021). This shows the need to identify areas of difficulty in undertaking research projects for novice researchers and to propose possible suggestions.

According to other studies, one of the causes of these challenges is that classroom learning places less emphasis on skills that require students’ higher-order knowledge (Dole et al., Citation2016). Most sub-Saharan African classroom learning is heavily dominated by passive methods and characterized by low-order thinking, which does not promote learners’ higher cognitive abilities and denies them autonomous learning (Alemu et al., Citation2017; Joshi & Verspoor, Citation2013). Learning strategies that do not foster higher abilities may hinder learners in performing activities that require higher-order thinking, such as conducting research projects, which demand higher-level knowledge and skills.

Therefore, there is a need to investigate candidates’ ability to conduct research projects in relation to different forms of knowledge. To perform projects that require higher abilities, candidates need to develop the skill of thinking critically and reasoning at a higher level. The literature confirms that scientific reasoning encompasses different forms of scientific knowledge: content, procedural, and epistemic (Kind & Osborne, Citation2017; Kind, Citation2013). Content knowledge refers to knowledge of facts, concepts, ideas, and theories about the natural world and its explanations (OECD, Citation2016). Procedural knowledge involves the tools and procedures used to generate evidence and solve inquiry tasks in scientific reasoning (Gott et al., Citation2008). Scientists use this knowledge to design experiments, make reliable measurements, and present and analyze data. Epistemic knowledge, on the other hand, refers to the knowledge needed to evaluate results and evidence (Kelly & Duschl, Citation2002). Scientists use this knowledge in their reasoning as criteria for evaluating knowledge, data, and results. Thus, to reason critically and perform tasks that demand higher ability, one needs to have theoretical conceptualization (using content knowledge), follow a recognized procedure (using procedural knowledge), and adopt certain epistemic criteria to evaluate knowledge (using epistemic knowledge) (Kind, Citation2013).

Traditionally, however, most classroom teaching has focused mainly on content knowledge alone and ignored the need to teach scientific reasoning using the procedural and epistemic knowledge that is vital for higher-level thinking (Alemu et al., Citation2017; Kind & Osborne, Citation2017; Osborne, Citation2014). Studies have revealed that these two forms of scientific knowledge are useful for candidates to engage in scientific practices and investigations, such as developing scientific questions, generating hypotheses, evaluating evidence, and drawing scientific conclusions (Abate et al., Citation2020; Abate, Michael & Angell, Citation2021; Vorholzer et al., Citation2020). This suggests that there is a need to investigate candidates’ project tasks and activities to determine the extent to which these two forms of knowledge are exercised.

Content knowledge might include candidates’ knowledge of concepts, definitions, laws, and theories. However, procedural knowledge is needed in conducting research projects to apply that content knowledge in a practical context. This means procedural knowledge is required to identify problems and gaps in studies, develop research questions, and conceptualize the problem under study (Vorholzer et al., Citation2020). Procedural knowledge enables candidates to identify methods of inquiry, procedures, algorithms, and techniques (Gott et al., Citation2008). It describes tools and procedures and how they can be applied using constructed knowledge (Kind, Citation2013). This allows candidates to select a proper method of data gathering and analysis, select a sampling strategy, and set guidelines to analyze and interpret data. Epistemic knowledge, on the other hand, is characterized by comparing and contrasting the findings with the literature, judging and evaluating results, and deciding how to contextualize them (Gott et al., Citation2008; Kind, Citation2013). At this level of reasoning, the candidates are expected to consider different viewpoints, develop a relationship between findings and literature to create a meaningful picture of the study (synthesis), draw conclusions, judge what the findings imply, and suggest where and how to use the results.

To conduct research projects, candidates are required to be equipped with the skills of problem identification and problem-solving (Gay et al., Citation2012). According to Gay et al. (Citation2012), the first three steps (in Table 1) are considered elements of problem identification, while the next five steps are considered problem-solving. Equipped with procedural knowledge, candidates can identify the problems to be solved and then develop the skill of solving them by applying both the procedural and epistemic forms of scientific knowledge (Kind, Citation2013). Thus, the contention of this paper is to identify the difficulties the candidates faced in conducting research projects concerning procedural and epistemic knowledge, along with suggesting research-based teaching-learning strategies that might help overcome the difficulties by jointly considering the problem identification and problem-solving aspects. Hence, Figure 1 demonstrates the combination of these aspects in relation to the different forms of scientific knowledge.

Figure 1. A Conceptual framework that depicts the skills required to conduct research projects in terms of procedural and epistemic knowledge.


Table 1. Demonstrates the candidates’ activities in relation to the constructs of procedural and epistemic knowledge

Upon conducting a research project, the candidates are expected to complete activities such as identifying a research problem to be solved, developing research questions or hypotheses, researching the literature, selecting valid sampling and data gathering tools, analyzing and interpreting data, discussing results, and drawing conclusions based on valid evidence. Though there is no clear demarcation between the forms of scientific knowledge, and they sometimes interact with each other (Kind, Citation2013), the literature was reviewed to categorize the activities into the two forms of scientific knowledge. Hence, Table 1 depicts the categorization of the candidates’ activities with respect to the constructs that represent the procedural and epistemic forms of scientific knowledge.

3. Method

The study incorporated both qualitative and quantitative approaches to see the same reality from different perspectives and to make it consistent with the selected research paradigm, the nature of the problem to be investigated, and the type of research questions used. More specifically, an embedded concurrent design was employed to enable the researchers to collect both quantitative and qualitative data simultaneously (Creswell & Clark, Citation2018). Quantitative data (questionnaire) was used to identify the difficulties that the candidates face in conducting research activities. The qualitative data (document analysis) was embedded and used to substantiate the results obtained through quantitative data. As a result, the study followed a mixed research method with qualitative data embedded in quantitative data.

3.1. Research participants and sampling techniques

The target population for this study was third-year natural science stream (biology, chemistry, and physics departments) candidates who had taken the theoretical and practical research courses. One reason for selecting this stream was that these were the classes in which one of the authors was assigned to teach the pre-requisite theoretical research course in the previous semester. This allowed the author to observe and notice the problems or learning difficulties candidates experienced while attending classes and preparing their research proposals in groups. In addition, the second author observed the struggle science candidates faced in conducting research projects while evaluating their research project papers and oral presentations. Finally, the natural science stream usually offers courses related to problem solving, which has implications for future study.

The availability sampling approach was used to choose these candidates from the natural science stream (33 Physics: Female = 3, Male = 30; 34 Chemistry: Female = 10, Male = 24; and 37 Biology: Female = 15, Male = 22). The pilot-tested questionnaires were then administered in person to 89 candidates in the classroom, from which 82 properly completed questionnaires were retained and grouped as described in the next sections.

3.2. Data gathering instruments and validation

The data for this study were collected via a questionnaire and document analysis. The questionnaire items were adapted from Gay et al. (Citation2012). Two college instructors with more than 10 years of experience in teaching college courses and sufficient experience in conducting research were involved to validate the tools. The instructors provided comments on the questions’ ordering, wording, and conceptual concerns, alignments of the items to the forms of scientific knowledge, and the ways to improve the questions.

The experts’ comments and suggestions for improvement were taken into consideration while revising the items for administration. Four questions were deleted from the original questionnaire (two questions from each), and all of the questions were changed to address language aspects, statement length, and clarity issues. Finally, eight questions were created to find out what obstacles most candidates faced when doing research activities. They were designed in such a way that informants could rate the difficulty of each of the eight research processes or activities, in logical order, by responding very difficult, difficult, less difficult, or easy. The questionnaire was also pilot tested for internal consistency using SPSS version 20, yielding a reliability of 0.73.
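
The article does not name the internal-consistency coefficient produced by SPSS; as a minimal sketch, assuming it was Cronbach's alpha computed over the eight four-point items, the same check could be reproduced as follows (the function and the pilot data are hypothetical, not the authors' instrument or output):

```python
# Illustrative sketch only: Cronbach's alpha for an 8-item, 4-point questionnaire.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """responses: rows = respondents, columns = questionnaire items."""
    k = responses.shape[1]                          # number of items (8 here)
    item_vars = responses.var(axis=0, ddof=1)       # sample variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 20 respondents rating 8 items on a 1-4 scale.
rng = np.random.default_rng(7)
pilot_responses = rng.integers(1, 5, size=(20, 8))
print(f"alpha = {cronbach_alpha(pilot_responses):.2f}")
```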

The document analysis was also used to consolidate the results obtained by the questionnaire. Document analysis is a technique for gathering qualitative data to review or evaluate documents (Bowen, Citation2009). Document analysis data were used to corroborate findings across different data sources, and the criteria were established from the literature. In a reference book by Gay et al. (Citation2012), the authors established four criteria for each of the eight steps in the research process. The two instructors, who were selected for validation, examined the criteria developed to determine the quality of the papers and provided comments. The revised final version was then used to evaluate the selected papers based on the comments.

3.3. Data analysis procedures

Both quantitative and qualitative data were collected at the same time for this study. Quantitative data were collected to answer research question 1 from 82 candidates through the questionnaire. The information gathered from the closed-ended questionnaire was tallied, organized into tables, and then analyzed using simple statistical procedures, such as frequencies and percentages.
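
As an illustration of this tallying step, the short sketch below counts how many candidates selected each rating for one questionnaire item and converts the counts to percentages of the kind reported in Table 3; the item responses shown are invented for illustration, not the study's data:

```python
# Minimal sketch of the frequency/percentage tally described above.
from collections import Counter

RATINGS = ["very difficult", "difficult", "less difficult", "easy"]

def tally_item(responses):
    """Return {rating: (frequency, percentage)} for one questionnaire item."""
    counts = Counter(responses)
    n = len(responses)
    return {r: (counts[r], round(100 * counts[r] / n, 1)) for r in RATINGS}

example = ["difficult", "very difficult", "difficult", "easy", "less difficult"]
print(tally_item(example))
```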

Bowen (Citation2009) suggested three steps for analyzing data from document analysis. The two authors first skimmed over the entire collection of papers. The purpose of this first step was to decide which of the research projects the candidates had produced would be considered for inclusion and reviewed for this research. The candidates submitted a total of 13 research project papers, all of which were done in teams. In this step, the researchers assessed and identified the research papers that met the study’s evaluation criteria. The first researcher identified three research papers that needed to be excluded, while the second identified four research papers that were missing one or more of the eight elements of the research process. One of the research papers, for example, did not have a review of the literature as a separate section, while another merely presented the results without any discussion. Two other research papers were found to have poor organization, a lack of flow, and language issues, as well as missing one or more research study steps. The two researchers then discussed which research papers should be included and which should be discarded. Eventually, it was decided that the nine papers that met the requirements would be included in this study, whereas four research papers were omitted because they failed to meet one or more steps of the research process.

In the second step, the authors read the research papers thoroughly on their own. The two authors went through the research papers, looking for frequent concepts, words, phrases, and sentences that represented the criteria. To categorize the data, two coding cycles (in vivo and axial) were used (Saldana, Citation2013). Commonly used terms, words, phrases, and sentences that represented the criteria were found via in vivo coding. The identified terms, sentences, and phrases were assigned to the selected criteria in the second cycle of coding (axial coding). A step was scored “YES” when terms, phrases, and sentences associated with the criterion were present (see Table 2 as an example) and “NO” when no such terms, phrases, sentences, or words were present. For the review of the literature, for instance, writings that included areas needing further investigation and discussed differences among studies were categorized under the criterion of “gap identification” and scored “YES”; when no such terms, phrases, or sentences were present, it was scored “NO.”

Finally, the authors judged the quality of the research papers using the evaluation criteria for each of the eight research steps. The authors then assigned score values (YES or NO) to the research papers based on the criteria and achieved an average inter-rater agreement of 87%. Disagreements were resolved through discussion, and the first researcher’s scores were ultimately used in this study. Likewise, the qualitative data were quantified and embedded into the quantitative questionnaire data. To embed the data and analyze them quantitatively, research papers that met one of the four quality criteria (for each step) were assigned to very difficult, research papers with two qualities were assigned to difficult, research papers with three qualities were assigned to less difficult, and research papers with four qualities were assigned to easy on the questionnaire rating scale. The results of the document analysis were then tabulated, as shown in Table 4.
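
A minimal sketch of this quantification step is given below, assuming each paper receives a YES/NO score on the four criteria for a given research step; the function names, the rating mapping shown, and the example scores are an illustration of the procedure described above, not the authors' instrument:

```python
# Sketch: the count of criteria met (1-4) maps onto the questionnaire rating
# scale, and inter-rater agreement is the percentage of matching YES/NO scores.
RATING_BY_CRITERIA_MET = {1: "very difficult", 2: "difficult",
                          3: "less difficult", 4: "easy"}

def rate_step(criteria_scores):
    """criteria_scores: four booleans (YES/NO) for one research step of one paper."""
    return RATING_BY_CRITERIA_MET[sum(criteria_scores)]

def percent_agreement(rater_a, rater_b):
    """Share of YES/NO scores on which the two raters agree."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

print(rate_step([True, True, False, False]))                        # -> "difficult"
print(round(percent_agreement([True, False, True], [True, True, True]), 1))  # -> 66.7
```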

Table 2. Depicts the coding of paper 7 for literature review by one of the authors

4. Results

The research process consists of a sequence of activities or steps to be followed in carrying out research properly. The procedure begins with an identification of the research problem and ends with a conclusion. The researchers in this study, therefore, attempted to uncover important difficulty areas candidates faced in conducting a research project and aspects of research skills difficulty in research report writing.

4.1. Difficulties inherent to conducting research projects

This section presents the difficulties inherent in conducting research projects as perceived by the candidates. Accordingly, the candidates’ responses with respect to the eight research project steps are presented in Table 3.

Table 3. Aspects of research activities that candidates face difficulties in

As can be seen in Table 3, the extent of the difficulties that the candidates faced differs considerably from one area to another. The top three areas in which the candidates faced difficulty were as follows: unfamiliarity with data analysis and interpretation (77 percent of candidates found it difficult), data collection and organization (67 percent), and sampling and selecting data gathering tools (65 percent). Reviewing related literature, which 61 percent of candidates found difficult, was the next challenge in sequence. According to the candidates’ views, they lacked the procedural knowledge that is required to accomplish such activities.

On the other hand, formulating research questions or objectives (for 57% of the candidates) and discussing results (for 62%) were not perceived as difficult. Concluding the findings and identifying the research problem also did not appear to be difficult tasks, as the scores were distributed more or less evenly across the response categories.

From the candidates’ responses, it can be argued that the most confounding part of conducting research was the methodology section, whereas problem identification and drawing conclusions were not viewed as challenging tasks. This was substantiated by the document analysis. Table 4 shows the results of the document analysis of the nine research projects conducted by groups of pre-service teachers.

Table 4. Difficulty as demonstrated in research report writing

4.2. Difficulties related to writing research report

This section presents the difficulties demonstrated in writing research project papers as per the authors’ evaluation. Accordingly, the quality of the papers with respect to the eight research project steps is presented in Table 4.

The discussion of the results was the most challenging part of the candidates’ research project papers, followed by data analysis and interpretation. The main findings within each paper lacked justification, and the relationship between the data and the literature was not adequately discussed. In addition, the document analysis made apparent that in 6 (66%) and 3 (33%) of the papers, the data were simply described and not properly interpreted, lacked analysis procedures, and did not make proper use of statistical tools. This is consistent with the questionnaire result in Table 3. Contrary to the questionnaire result, however, the discussion of results was a significant problem for the candidates: little attempt at discussing the results was evident in nearly all of the papers (9; 100%) reviewed through document analysis. For instance, the evidence obtained from reviewing the literature was not adequate, and the candidates seemed unable to integrate and synthesize it with their research findings. Instead, they simply jumped to conclusions and recommendations without sufficient and relevant data to support their claims. Thus, drawing conclusions and judging what the implications of the findings could be was the third area where considerable difficulty (8; 88%) was reflected in writing up the research reports. These results indicate that the candidates lacked the procedural and epistemic knowledge required to accomplish the activities.

The reviewing of literature, data collection and organization, and sampling and data gathering tools were found to be other areas of concern where difficulty was demonstrated in the papers reviewed in the document analysis (77%, 77%, and 66%, respectively). Although the extent of the difficulty varies, what is displayed here complements the questionnaire result, because the reviewed papers manifested the candidates’ struggle to identify appropriate and relevant literature, employ proper sampling strategies, use up-to-date references, and adequately cover the range of concepts discussed in the paper. Thus, candidates face considerable difficulty, particularly in determining the appropriate method, selecting tools, collecting and organizing data, and analyzing and interpreting data.

The review of the research papers again indicates that both problem identification and formulating basic research questions appeared to be less difficult compared with other sections (44% and 55%, respectively). This complements the results in Table 3 above. Thus, the findings could imply that candidates consider themselves capable of identifying a research problem and formulating the basic research questions accordingly. However, in the context of a real classroom situation, identifying suitable research problems for investigation may not be as simple as the candidates may think. The reason the findings appeared consistent could be that candidates simply obtain crude research problems from issues discussed in the classroom or from senior batches, which are then shaped or refined based on concepts discussed in the classroom and guidance obtained from their instructors during research proposal development. The other reason for the relatively lower difficulty of these activities could be that they are more related to conceptual knowledge. Therefore, the actual evaluation of the candidates’ papers revealed that candidates lacked the procedural and epistemic knowledge needed to conduct the research tasks. The results further show that problem-solving is more demanding for the candidates than problem identification.

5. Discussion

This study investigated the major areas of difficulty the science candidates faced in conducting research projects. The study identified the activities the candidates are expected to accomplish in conducting research projects from the theoretical and practical courses. An attempt was made to examine how the candidates’ activities align with the knowledge required to complete the tasks. Thus, the candidates’ ability to conduct research projects was investigated with respect to their skills of problem identification and problem-solving, in relation to procedural and epistemic knowledge. To achieve this, data were collected using a questionnaire (candidates’ perceived level of difficulty) and document analysis (actual difficulty as it appeared in the research project paper evaluation). Accordingly, the results for the two research questions are discussed vis-à-vis the literature in the following sections.

5.1. Difficulties candidates experienced in conducting research projects

As shown in Tables 3 and 4, candidates seemed to have more difficulty solving problems than identifying them. Data analysis and interpretation, data collection and organization, and determining appropriate data gathering tools were the areas in which candidates faced considerable difficulty, along with reviewing the literature. Surprisingly, identifying the research problem and concluding the findings, at the two ends of the research process, perhaps for the reasons mentioned above, were not considered challenging research activities.

Thus, candidates’ comprehension requires not just having specific knowledge but also being able to apply it. This means that the candidates’ important skills should be evident in the work they exhibit as well as in the things they can build, write, and design (Roth, Citation1998). This implies that practical research demands producing new research output with the knowledge gained and the skills developed, rather than rehearsing what has been learned for simple tasks that could be achieved by content knowledge alone.

Besides, the findings are in some ways in disagreement with previous studies. According to some studies, the most difficult area of conducting scientific research for inexperienced researchers is identifying a problem rather than solving it (Gay et al., Citation2012; Rahman et al., Citation2014). Others have found that organizing data and data analysis, from the methods section, are among the problematic areas in conducting research (deBorja, Citation2018; O’connor et al., Citation2006). This seems to be very compatible with what is presented, particularly in item 6 under Table 3.

Some other studies have identified writing the review of the literature section as another challenging area for naïve researchers in conducting research (deBorja, Citation2018; O’connor et al., Citation2006; Toquero, Citation2021). Again, this complements the results of this study, as it was another research activity in which candidates were confronted with difficulty. However, the first two research activities, i.e., identifying the research problem and formulating the research questions, were viewed by the candidates as relatively unchallenging areas. The reason for this could be that these activities are more aligned with conceptual knowledge than with procedural and epistemic knowledge, as they are categorized under hypothesis generation, which is dominated by conceptual knowledge more than the other two (Kind, Citation2013).

5.2. Difficulties demonstrated in writing research reports

One of the remarkable points revealed in the candidates’ research report writing concerned the discussion of results and the drawing of relevant conclusions based on evidence, because these are research activities that were perceived as relatively simple tasks yet appeared to be the most challenging in actual practice while candidates were writing their research reports. Likewise, identifying the research problem and formulating research questions or objectives were not found to be very challenging in writing research reports, because these are also among the issues partially addressed during proposal development.

On the other hand, O’connor et al. (Citation2006) found that organizing and writing findings, writing discussions, and defining research questions were reasonably simple. Other than defining research questions, this seems to be quite contradictory to what is displayed in Table 4. Though the ranking of difficulty areas varies from study to study, it appears that the most difficult areas reported in studies are data analysis and interpretation, and the literature review. This complements the results of this study. Thus, the study findings imply that candidates lacked mostly the procedural knowledge, and partly the epistemic knowledge, required to perform such demanding tasks.

The literature also agrees with this result, claiming that the primary cause of the ineffectiveness of inquiry tasks and activities that require higher-order understanding is the failure to design and incorporate procedural knowledge in students’ learning (Arnold et al., Citation2021; Gott & Murphy, Citation1987). Other reasons could be related to the fact that incorporating these forms of scientific knowledge into the classroom context is more demanding than teaching content knowledge, because doing so requires instructors to have adequate knowledge and skill (Abate et al., Citation2020), requires candidates’ willingness to move from the familiar comfort zone of content knowledge (Osborne, Citation2014), and runs against instructors’ belief that conceptual knowledge is more important than the other two forms of scientific knowledge (Hurrell, Citation2021).

6. Conclusions and implications

From what has been discussed so far, one can reasonably conclude that conducting research requires sufficient scientific knowledge and a set of problem-solving skills. However, the research courses the candidates took did not prepare them for conducting research projects that need high-level knowledge and skills. As could be seen in the questionnaire results, the difficulties were evident particularly in analysing, organizing, and collecting data and in sampling and tool selection, whereas the document analysis results showed that candidates were significantly challenged in the discussion of results, data analysis, and drawing evidence-based conclusions, respectively.

As a result, it was found that the candidates encountered considerable difficulty in carrying out research work independently. Although the extent varies, candidates were confronted with much more difficulty in the problem-solving aspects than in the problem-identification research activities. This could most probably be related to the fact that the ways in which the courses were taught and the assessment techniques employed did not adequately lift the learners from the lower rungs of the learning ladder to higher-order thinking.

It is also reasonable to argue that, among other factors, the candidates’ inability to carry out research projects in groups was rooted in, or linked to, the way the pre-requisite theoretical research course was delivered to the learners. Thus, before the candidates engage in doing their practical research, instructors should ascertain that candidates have adequately understood the basic research concepts and procedures involved in conducting research during proposal development.

In line with this, instructors should utilize varied teaching strategies that give candidates sufficient learning opportunities to exercise activities such as identifying suitable research problems, collecting and analysing data, and drawing evidence-based conclusions, in such a way as to develop higher forms of scientific knowledge (procedural and epistemic knowledge), which in turn enable them to conduct practical research projects.

Furthermore, the teaching goals in education and teacher education curriculum frameworks should introduce the three forms of scientific knowledge explicitly: content, procedural, and epistemic knowledge (Arnold et al., Citation2021). This study indicated that the candidates’ ability in conducting research projects is limited to lower-level activities, which could probably be accomplished with content knowledge alone. This suggests that there is a need to shift from content knowledge alone to procedural and epistemic knowledge in order to foster the higher-order thinking required to handle demanding activities (Abate et al., Citation2020; Kind & Osborne, Citation2017). More studies are also required to explore the difficulties of conducting research projects, incorporating the instructors’ role in supporting the candidates in carrying out research activities.

Acknowledgments

We would like to express our appreciation to Professor Fred Lubben for his contributions, comments, and suggestions.

Disclosure statement

No potential conflict of interest was reported by the authors.

References

  • Abate, T., Michael, K., & Angell, C. (2020). Assessment of Scientific Reasoning: Development and Validation of Scientific Reasoning Assessment Tool. Eurasia Journal of Mathematics, Science and Technology Education, 16(12), em1927. https://doi.org/10.29333/ejmste/9353
  • Abate, T., Michael, K., & Angell, C. (2021). Upper Primary Students’ Views Vis-à-Vis Scientific Reasoning Progress Levels in Physics. Eurasia Journal of Mathematics, Science and Technology Education, 17(5), em1958. https://doi.org/10.29333/ejmste/10834
  • Akyürek, E., & Afacan, Ö. (2018). Problems Encountered during the Scientific Research Process in Graduate Education: The Institute of Educational Sciences. Higher Education Studies, 8(2), 47–14. https://doi.org/10.5539/hes.v8n2p47
  • Alemu, M., Kind, P., Tadesse, M., Atnafu, M., & Michael, K. (2017). Challenges of Science Teacher Education in Low-income Nations - the Case of Ethiopia. Esera-17 Conference Proceedings Part 13. STRAND, 13, 1782.
  • Arnold, J. C., Mühling, A., & Kremer, K. (2021). Exploring core ideas of procedural understanding in scientific inquiry using educational data mining. Research in Science & Technological Education, 41(1), 1–21. https://doi.org/10.1080/02635143.2021.1909552
  • Azmi, N., & Daud, N. (2018). A Relationship Between Research Skills and Autonomous Learning Among Postgraduate Students. International Journal of Business, Economics and Law, 18(6), 2289–1552.
  • Bowen, G. A. (2009). Document Analysis as a Qualitative Research Method. Qualitative Research Journal, 9(2), 27–40. http://dx.doi.org/10.3316/QRJ0902027
  • Carr, R., Palmer, S., & Hagel, P. (2015). Active learning: The importance of developing a comprehensive measure. Active Learning in Higher Education, 16(3), 173–186. https://doi.org/10.1177/1469787415589529
  • Creswell, J., & Clark, V. (2018). Designing and conducting mixed methods research (3rd ed.). Sage.
  • deBorja, J. M. (2018). Teacher action research: Its difficulties and implications. Humanities & Social Science Reviews, 6(1), 29–35. https://doi.org/10.18510/hssr.2018.616
  • Dole, S., Bloom, L., & Kowalske, K. (2016). Transforming pedagogy: Changing perspectives from teacher-centered to learner-centered. Interdisciplinary Journal of Problem-Based Learning, 10(1). https://doi.org/10.7771/1541-5015.1538
  • Fayomi, O. S. I., Okokpujie, I. P., & Kilanko, O. O. (2018). Challenges of Research in Contemporary Africa World. IOP Conf. Series.
  • Gay, L. R., Mills, G. E., & Airasian, P. W. (2012). Educational research: Competencies for analysis and applications (10th ed.). Pearson.
  • Gott, R., & Roberts, R. (2008). Concepts of evidence and their role in open-ended practical investigations and scientific literacy: Background to published papers. Retrieved from https://cofev.webspace.durham.ac.uk/wp-content/uploads/sites/299/2022/05/Gott-Roberts-2008-Research-Report.pdf
  • Gott, R., & Murphy, P. (1987). Assessing Investigation at Ages 13 and 15. Assessment of Performance Unit Science Report for Teachers, 9. Hatfield: ASE; Department of Education and Science, Welsh Office, Department of Education for Northern Ireland.
  • Haller, E. J., & Kleine, P. F. (2001). Using educational research: A school administrator’s guide. Addison Wesley Longman.
  • Halpern, D. F. (2014). Critical thinking across the curriculum: A brief edition of thought & knowledge. Routledge.
  • Hurrell, D. P. (2021). Conceptual knowledge or procedural knowledge or conceptual knowledge and procedural knowledge: Why the conjunction is important to teachers. Australian Journal of Teacher Education, 46(2), 57–71. https://doi.org/10.14221/ajte.2021v46n2.4
  • Johnson, B., & Christensen, L. (2008). Educational research: Quantitative,qualitative, and mixed approaches. Sage.
  • Joshi, R., & Verspoor, A. (2013). Secondary education in Ethiopia: Supporting Growth and Transformation. World Bank.
  • Kelly, G. J., & Duschl, R. (2002). Toward a research agenda for epistemological studies in science education. Paper presented at the National Association for Research in Science Teaching.
  • Kind, P. (2013). Establishing assessment scales using a novel disciplinary rationale for scientific reasoning. Journal of Research in Science Teaching, 50(5), 530–560. https://doi.org/10.1002/tea.21086
  • Kind, P., & Osborne, J. (2017). Styles of scientific reasoning: A cultural rationale for science education? Science Education, 101(1), 8–31. https://doi.org/10.1002/sce.21251
  • MoE & JICA. (2016). Strategic Policy for National Science, Technology and Mathematics Education. Addis Ababa.
  • O’connor, K., Greene, H., & Anderson, P. (2006, April). Action research: A tool for improving teacher quality and classroom practice. Paper presented at the American Educational Research Association 2006 Annual Meeting.
  • OECD. (2016). Education at a Glance 2016: OECD Indicators. OECD Publishing.
  • Osborne, J. (2014). Teaching scientific practices: Meeting the challenge of change. Journal of Science Teacher Education, 25(2), 177–196. https://doi.org/10.1007/s10972-014-9384-1
  • Rahman, S., Yasin, R. M., Salamuddin, N., & Surat, S. (2014). The use of metacognitive strategies to develop research skills among postgraduate students. Asian Social Science, 10(19), 271–275. https://doi.org/10.5539/ass.v10n19p271
  • Roth, T. L. R. (Ed.). (1998). The Role of the University in the Preparation of Teachers (1st ed.). Routledge. https://doi.org/10.4324/9780203982068
  • Saldana, J. (2013). The coding manual for qualitative researchers (2nd ed.). Sage.
  • Sjøberg, S. (2018). The power and paradoxes of PISA: Should we sacrifice Inquiry-Based Science Education (IBSE) to climb on the Rankings? NorDina, Nordic Studies in Science Education, 14(2), 186–202. https://doi.org/10.5617/nordina.6185
  • Surif, J., Ibrahim, N. H., & Mokhtar, M. (2012). Conceptual and procedural knowledge in problem solving. Procedia-Social and Behavioral Sciences, 56, 416–425. https://doi.org/10.1016/j.sbspro.2012.09.671
  • Toquero, C. M. D. (2021). “Real-world”: Pre-service teachers’ research competence and research difficulties in action research. Journal of Applied Research in Higher Education, 13(1), 126–148. https://doi.org/10.1108/JARHE-03-2019-0060
  • Vorholzer, A., Von Aufschnaiter, C., & Boone, W. J. (2020). Fostering upper secondary students’ ability to engage in practices of scientific investigation: A comparative analysis of an explicit and an implicit instructional approach. Research in Science Education, 50(1), 333–359. https://doi.org/10.1007/s11165-018-9691-1