
Teaching science & technology: components of scientific literacy and insight into the steps of research

Pages 1916-1931 | Received 16 Mar 2022, Accepted 20 Jul 2022, Published online: 05 Aug 2022

ABSTRACT

Trust in science is indispensable for the development of global health, and can certainly be gained by developing the scientific literacy of the whole population. The article presents various definitions of scientific literacy and seeks its connections to science education. It further explores the progress in developing scientific research skills among university students. The students participated in a one-week seminar and workshop on scientific literacy and were given an assignment. Using a pretest and posttest, we assessed the development of the students’ scientific research skills, specifically the following procedural skills in scientific research: (a) a student poses high-quality research questions; (b) a student formulates a scientifically testable hypothesis(es) that includes a dependent and an independent variable; and (c) a student designs the experiment. The results show that the students’ procedural knowledge of scientific research improved in the areas covered. The experiment showed no statistically significant difference in procedural knowledge of scientific research between undergraduate and postgraduate students. Scientific activities, properly integrated into the educational process, have great potential at all levels of education to improve the scientific literacy of the entire population.

Introduction and background to scientific literacy

In the time of the epidemic, we experience great contradictions: on the one hand, there are science and scientists, who have proven themselves with an extremely rapid response and are discovering ways to successfully combat the virus in various fields. On the other hand, there is a surprisingly large number of people who persistently search for arguments against scientific findings. However, there is no development without research. Therefore, people's trust in science and the work of scientists is very important for further development and can certainly be gained by developing the scientific literacy of the entire population.

Pella et al. (Citation1966) pioneered an empirical basis for defining scientific literacy and established the framework for identifying individuals as scientifically literate. Their work was further elaborated by Showalter (Citation1974, p. 450), who defined seven dimensions of scientific literacy: A scientifically literate person:

  • understands the nature of science,

  • applies relevant scientific concepts, principles, laws and theories accurately and reliably,

  • uses scientific processes to solve problems and make decisions, and strengthens his or her own understanding,

  • communicates taking into account different aspects and in accordance with the values of science,

  • understands and appreciates scientific and technological enterprises and their attitudes to individual aspects of society,

  • as a result of scientific education, has developed a richer and more exciting view of the world and a desire for further, lifelong learning, and

  • has developed many skills for manipulating science and technology.

Scientific literacy is considered by the authors from different perspectives (Aiman & Hasyda, Citation2020; Amir et al., Citation2017; Avsec & Savec, Citation2019; Bybee, Citation1997; Rutt & Mumba, Citation2021).

According to Glaze (Citation2018), scientific literacy is ‘the knowledge and understanding of scientific concepts and processes required for personal decision making, participation in civic and cultural affairs, and economic productivity’.

Scientific literacy as a goal of education for sustainable development focuses attention on the development of learning, inquiry, and ‘transfer’ skills, enabling young people to use the acquired knowledge and skills in everyday and professional activities (Jonāne, Citation2015). There seems to be a widespread consensus that scientific literacy is a ‘good thing’, but let us first explain the arguments for such a ‘silent’ consensus. Laugksch (Citation2000) has presented the following arguments in detail:

  1. The first argument relates to the nation's economic prosperity and is based on the fact that national prosperity depends on the successful competition in international markets, and international competition is always based on a strong national research and development programme.

  2. The second argument is also related to the economic prospects of the countries. A higher level of scientific literacy in the population leads to more support for science.

  3. The third argument is that the public's expectations and trust in science are largely related to their understanding of the goals, processes and capabilities of science. And precisely because a large part of science is publicly funded and the public has a legitimate interest in science, it is important that they understand it properly.

  4. The fourth argument, which starts at the level of social relations, concerns the relationship between science and culture.

The arguments put forward for promoting scientific literacy in the population include the benefits to economies, science itself, science policy and democratic practice, and society (Thomas & Durant, Citation1987). However, scientific literacy also has benefits for individuals. More aware and informed citizens can more easily and effectively make personal decisions, e.g. about smoking, vaccination, and the like.

Higher and more advanced knowledge of the nature of science and technology helps individuals to resist pseudoscientific information that negatively influences them through various media. A high level of scientific literacy enables individuals to feel more confident and qualified in dealing with issues that arise in their daily lives related to science, but it also gives them better employment opportunities.

Science and technology education

It is the role of faculties of science not only to teach our content but to ‘provide students with the necessary background knowledge and skills for this pursuit’ and to ‘help motivate students to value scientific knowledge and skills’ (Glaze, Citation2018; Koballa et al., Citation1997). It is a well-known fact that learning science and technology enables the development of an individual's scientific literacy (Brown et al., Citation2005; Glaze, Citation2018; Shwartz et al., Citation2005).

Discussions about the definition and the role of science education date back to the 1980s (Roberts, Citation1988). The definition of science education depends on what the individual considers to be the overriding goal of science education; it therefore depends on who is dealing with it (i.e. the student, teacher, parent, scientist, or politician) and on his or her conception of science (Hodson, Citation1998; Roberts, Citation1988). Still others see its aim as responding to economic needs, social crises, or environmental problems. Science education curricula, therefore, cover a very wide range of learning content. Basic scientific knowledge and concepts come from historical research, and if we want to teach these to young people at a higher cognitive level, it is essential to include the development of scientific thinking and procedural skills, and thus the development of scientific literacy, in science education.

How should science be taught and what scientificity level should be adopted in science and technology education?

There has been much interest in research examining science education in the academic environment and the effectiveness of different teaching approaches.

Holbrook & Rannikmae (Citation2009) proposed that in order to strengthen true scientific literacy (Shamos, Citation1995) or multidimensional scientific literacy in an educational context, developing socio-scientific decision-making and scientific problem-solving skills is more important than a basic understanding of fundamental content knowledge. Their teaching model leads to an approach through which ‘education through science’ is perceived as a more appropriate description of the teaching emphasis than ‘science through education’.

Teaching based on this model is strongly related to activity theory (Roth & Lee, Citation2004; van Aalsvoort, Citation2004a, Citation2004b) where student needs and motivation form the major focus (Holbrook & Rannikmae, Citation2009). Holbrook & Rannikmae's (Citation2009) model is in line with the categorisation of learning goals by Hodson (Citation1998). He divided the goals of science education into the following categories:

  • Learning science enables the acquisition of theoretical knowledge that reaches the level of understanding.

  • Learning about science develops an understanding of nature and scientific methods, respect for its history and development, and an awareness of the complex interactions among science, technology, society, and the environment.

  • Doing science develops process knowledge in scientific inquiry and problem solving.

Learning is successful when it is embedded in authentic (regular daily) and meaningful activities (Hodson, Citation1998; Jonāne, Citation2015; Kaeedi et al., Citation2021). Thus, learning science means that students are involved in the activities of the scientific community with the help of mentors/teachers.

DebBurman (Citation2002) suggests that involving students in real research projects promotes interest in complex scientific content, and contributes to the development of scientific experimental skills. Inquiry-based learning as a combination of problem-based learning, independent work, research projects, fieldwork, case studies, and investigations offers many opportunities for learning activities that put the student at the centre of the learning process and allow for many different ways of inclusion in the curriculum (Kahn & O’Rourke, Citation2005).

We believe that student-centered learning enables the development of a good researcher, which will not only benefit their direct study but will also lay the foundations for lifelong learning.

Scientific research skills

In addition to pure science content knowledge, the development of procedural knowledge and scientific research skills is crucial for an individual's scientific literacy (Aiman & Hasyda, Citation2020; Cresswell et al., Citation2015). Procedural knowledge can be understood as tools or scientific research methods and techniques used in scientific research. In its recommendations for the implementation of the study process, the Boyer Commission on Educating Undergraduates in the Research University (Citation1998) emphasised the importance of research-based learning. This perspective has also been emphasised by Lemon et al. (Citation2013), who highlighted the importance of research skills, and even rank them among the competencies of the twenty-first century.

Teaching scientific research (theoretically and practically), and engagement in research endeavours enhances research intellectual and practical skills, nurtures high-order cognitive skills (e.g. critical appraising, problem troubleshooting, idea processing and wise judging), augments interest in inquiry-based learning, generates scientific publications, encourages involvement in future research activities and ultimately supports entry to varying research-focused careers. (Abu-Zaid, Citation2014)

Through problem solving, students develop their procedural knowledge, skills, and conceptual understanding. By observing and experimenting scientifically, they further develop and expand their scientific research skills. In the teaching/education process, proper planning of learning activities is therefore very important for the development and consolidation of conceptual and procedural knowledge (Ploj Virtič & Šorgo, Citation2016; Šorgo & Ploj Virtič, Citation2020). They should be designed to include a variety of scientific research methods that enable students to better understand the general nature of scientific research. They will no longer discover for themselves only the methods of scientific research, but also the way conceptual knowledge is organised in science. Different scientific disciplines offer different pedagogical approaches. In the field of science, these are often experiments, but they are not always feasible. The obstacle may lie in the nature of the experiment, which may be unethical or impractical in terms of time and space. In such cases, it is very practical to introduce different models into the learning process (Pisano et al., Citation2020).

When talking about procedural knowledge, one must consider the different methodological approaches in the various disciplines. The methodology of scientific research differs slightly between disciplines, but we can see from the following examples of the most used methods that it has a common denominator:

Natural science:

  • prior knowledge, experience and observation,

  • research question,

  • hypothesis,

  • designing experiments and testing hypotheses.

Social science:

  • prior knowledge and experience,

  • research question,

  • hypothesis,

  • designing an experimental or non-experimental study,

  • using qualitative or quantitative statistical methods to test hypotheses.

Scientific research procedural steps, therefore, depend on the scientific discipline. There are not many studies that deal in-depth with the individual procedural steps of scientific research, but nevertheless the authors define scientific literacy through various procedural steps.

Marteel-Parrish and Lipchock (Citation2018) define the following components: (i) mastery of the scientific literature; (ii) career preparation and marketing skills; (iii) enhancement of scientific communication skills (oral, written and visual); (iv) awareness of ethical codes and guidelines in research; and (v) an appreciation of the role of science in contemporary moral/societal issues, especially sustained science literacy.

Kaeedi et al. (Citation2021), in their study highlighting the important role of universities in improving research procedural skills, divide them into: identifying and formulating research questions; improving the ability to read and review the literature; promoting understanding of the philosophical foundations of research; improving skills in planning and conducting research; and enabling students to correct their mistakes in their research projects during the course.

Studies on developing and assessing scientific competencies

The development of scientific literacy, like the concept of scientific literacy itself, is an extremely complex undertaking. Its effectiveness is influenced by many factors and has been studied by various authors.

Zhu (Citation2019) investigated how students’ science literacy is influenced by their attitudes. His findings show that the way students’ attitudes affect their scientific competencies does not differ (at a statistically significant level) by gender or district. His findings confirm that ‘scientific competencies and attitudes are closely related and should be equally valued in science classrooms, in relation to scientific knowledge’ (Zhu, Citation2019, p. 2107).

The level of scientific literacy of 15-year-olds at the global level is monitored by the OECD, which assesses science literacy in the PISA survey (Cresswell et al., Citation2015). Scientific literacy is assessed as a combination of scientific competencies, knowledge and attitudes (Table 1). About 50% of the tasks assess the competence to explain phenomena scientifically, 30% the competence to interpret data and evidence scientifically, and 20% the competence to design and evaluate scientific research.

Table 1. Major components of the PISA 2015 Framework for Scientific Literacy (OECD, Citation2013).

A deep understanding of 15-year-olds’ scientific research skills enables educators to integrate learning activities, help them better teach science, and support school systems in promoting science as a fundamental skill.

Several authors have reported examples of learning activities that promote the development of scientific research skills and competencies (Ratte et al., Citation2018; Tsai, Citation2015; Tsai et al., Citation2020; Weeks et al., Citation2014).

However, we have not found a systematic approach in the literature that would build procedural knowledge and scientific research skills. The scientific research procedural steps we consider in the present study are summarised from the descriptors for procedural knowledge developed in the national project ‘Natural Science and Mathematics Literacy: Encouraging Critical Thinking and Problem Solving’:

  • posing high-quality research questions whose answers can be tested experimentally,

  • formulating a scientifically testable hypothesis(es) based on the research question and related knowledge, which includes a dependent and an independent variable, and

  • designing the experiment by defining the variables (dependent and independent) to be studied.

Methodology

The study was conducted as part of a short-term visiting professorship at the SPI Doctoral School at the University of Lille (France). The students participated in a week-long seminar and workshop, followed by the quasi-experiment described below.

The sample of the study

The study was conducted with a sample of 31 students: 14 undergraduate students from the Faculty of Science and Technology (Physics, Physics-Chemistry) and 17 postgraduate students from the SPI PhD School (Nanoscience/Nanotechnology).

Description of the study

During the seminar entitled ‘Components of Scientific Literacy and Insight into the Steps of Research in History of Physics’, the students learned about the historical aspect of physical theories and discussed the importance of scientific literacy in daily life. After the seminar, they were given two assignments.

Using the pre-test and the post-test, we assessed and statistically estimated the development of the students’ scientific research skills. The pre-test (Appendix A) defined a scientific problem and enabled students to think like a scientist.

The post-test (Appendix B) followed an evaluation of the pre-test results and a comprehensive discussion. The evaluation of the pre-test results served as a starting point for the discussion. I selected some examples of the answers in the pre-test, which we analysed together, focusing on the most common errors revealed in the pre-test, such as formulating a hypothesis that is not testable because it involves several factors at once. I also pointed out the importance of considering the hypotheses when designing an experiment … In doing so, I encouraged participants to think outside the box. The comprehensive discussion highlighted important factors of procedural knowledge of scientific research in various cases, covering different areas of both the natural and social sciences.

In the post-test, the students were given the same task, but they were asked to find their own problem and tackle it using the steps they had learned.

Assessing criteria

As a starting point for developing criteria for assessing the progress of students’ scientific literacy, we have taken the descriptors for procedural knowledge developed in the national project ‘Natural Science and Mathematics Literacy: Encouraging Critical Thinking and Problem Solving’ (NA-MA-POTI) (Table 2).

Table 2. Descriptors for procedural knowledge in scientific research (adapted from Bačnik et al., Citation2019).

Following the descriptors in Table 2, we have summarised the scientific research skills to be focused on in the study:

  1. a student poses high-quality research questions whose answers can be tested experimentally;

  2. a student formulates a scientifically testable hypothesis(es) based on the research question and related knowledge, which includes a dependent and an independent variable;

  3. a student designs the experiment by defining the variables (dependent and independent) to be studied.

Research question and hypotheses

We set two research questions:

RQ1: What’s the difference in students’ scientific literacy between undergraduate and postgraduate students?

RQ2: How can we improve students’ scientific literacy through appropriate learning activities?

To find the answers to the research questions, we hypothesised the following:

H1: The planned learning activity will statistically significantly improve students’ ability to pose high-quality research questions.

H2: The planned learning activity will statistically significantly improve students’ ability to formulate scientifically testable hypotheses that include a dependent and an independent variable.

H3: The planned learning activity will statistically significantly improve students’ ability to design the experiment.

H4: There is a statistically significant difference in procedural knowledge of scientific research between undergraduate and postgraduate students.

Statistical analysis

The collected data were analysed qualitatively and quantitatively. Students’ responses to the pre-test and post-test were categorised through qualitative analysis, as described in the coding section. The data were also analysed with descriptive statistics using SPSS. To identify statistically significant differences between students based on their level of study, we performed a one-way analysis of variance (ANOVA), based on a confirmed assumption of a normal distribution of the data. The learning progress of the students was quantified with Cohen's d effect size, calculated using the Psychometric Online Engine (Lenhard & Lenhard, Citation2016) and interpreted according to the recommendations on that website. Margins were set as follows: 0 < no effect < 0.2 < small effect < 0.5 < medium effect < 0.8 < large effect.
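To make the effect-size step concrete, it can be reproduced in a few lines of Python. This is a minimal sketch using a pooled standard deviation and the margins stated above; the study itself used the Psychometric Online Engine, and the function names here are illustrative:

```python
import math

def cohens_d(mean_pre, sd_pre, mean_post, sd_post):
    """Cohen's d: absolute mean difference over the pooled standard deviation."""
    pooled_sd = math.sqrt((sd_pre ** 2 + sd_post ** 2) / 2)
    return abs(mean_post - mean_pre) / pooled_sd

def interpret_effect(d):
    """Interpretation margins used in the study: 0.2 / 0.5 / 0.8."""
    if d < 0.2:
        return "no effect"
    if d < 0.5:
        return "small effect"
    if d < 0.8:
        return "medium effect"
    return "large effect"
```

For example, a pre-test mean of 2.0 and a post-test mean of 3.0, each with a standard deviation of 1.0, yield d = 1.0, a large effect under these margins.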

Coding of the responses

Coding was done using criteria that we developed based on an in-depth review of students’ responses in the pre-test and post-test. The criteria (as follows) were developed for each procedural scientific research skill we focused on in the study.

Students’ responses to the first task ‘Define a Research Question (RQ)’ were divided into 4 categories:

  • The RQ is irrelevant/not defined as a question

  • The RQ is deficient/a question is transferred to another area

  • The RQ is deficient/a question is too specific

  • The RQ is relevant/answers can be verified experimentally

Students’ responses to the second task ‘Set the Hypotheses (Hs) based on the RQ and underline the dependent variables’ were divided into 5 categories:

  • The Hs are not posed/not defined as an assumption

  • The Hs are not posed/there is just a list of factors

  • The Hs are deficient/no variables are included

  • The Hs are relevant, variables included, dependent variables not recognised

  • The Hs are relevant, variables included, dependent variables recognised

Students’ responses to the third task ‘Design the Experiment to test your Hypotheses’ were divided into 4 categories:

  • The research plan is irrelevant/not possible to check the Hs

  • The research plan is deficient/no variables are included

  • The research plan is deficient, variables included

  • The research plan is relevant
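Once each response has been assigned to a category, the frequency distributions reported in the Results section reduce to simple tallies. A minimal Python sketch; the category codes and responses below are invented for illustration, not the study data:

```python
from collections import Counter

def frequency_distribution(coded_responses, categories):
    """Percentage of responses falling into each category, rounded to 0.1%."""
    counts = Counter(coded_responses)
    n = len(coded_responses)
    return {c: round(100 * counts[c] / n, 1) for c in categories}

# Hypothetical codes for the RQ task: 1 = irrelevant ... 4 = relevant
categories = [1, 2, 3, 4]
pre = [1, 1, 2, 3, 3, 4]   # invented pre-test codes
post = [2, 3, 4, 4, 4, 4]  # invented post-test codes
```

Comparing `frequency_distribution(pre, categories)` with `frequency_distribution(post, categories)` gives the kind of pre-/post-test comparison tabulated below.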

Results

After completing the qualitative analysis and coding the responses, the data were analysed quantitatively using the SPSS programme. The descriptive statistics are presented in Table 6. Table 3 shows the frequency distribution of the results by category for the first task ‘Define a Research Question (RQ)’ in the pre- and post-test.

Table 3. The frequency distribution of the results by category for the first task ‘Define a Research Question (RQ)’ in the pre- and post-test.

The results in Table 3 show that in the pre-test the answers in the first three categories dominate, while in the post-test the number of correctly defined research questions increases sharply.

Table 4 shows the frequency distribution of the results by category for the second task ‘Set the hypotheses based on the RQ and underline the dependent variables’ in the pre- and post-test.

Table 4. The frequency distribution of the results by category for the second task ‘Set the hypotheses based on the RQ and underline the dependent variables’ in the pre- and post-test.

The improvement in students’ ability to set the hypotheses based on the RQ is clear:

  • 32.3% of students were unable to define assumptions in the pre-test, while only 3.2% remained in this category in the post-test.

  • Significant progress can also be seen in the 4th category, where 19.4% ranked in the pre-test and as many as 35.5% in the post-test, and in the 5th category, where 9.7% ranked in the pre-test and twice as many (19.4%) in the post-test.

Table 5 shows the frequency distribution of the results by category for the third task ‘Design the experiment to test your hypotheses’ in the pre- and post-test.

Table 5. The frequency distribution of the results by category for the third task ‘Design the experiment to test your hypotheses’ in the pre- and post-test.

The improvement in students’ ability to design an experiment is evident:

  • 32.3% of students were unable to design an experiment to test the hypotheses in the pre-test, while only 3.2% remained in this category in the post-test.

  • Significant progress can also be seen in the 3rd category, where 9.7% of students ranked in the pre-test and as many as 35.5% in the post-test, and in the 4th category, where 22.6% ranked in the pre-test and twice as many (45.2%) in the post-test.

The similarity of the results in Tables 4 and 5 can be related to the fact that prior knowledge and the ability to formulate hypotheses are necessary for the correct design of an experiment that allows hypotheses to be tested. It is therefore a logical explanation that the design outcome improved in the post-test because the students also improved their ability to formulate hypotheses. Although the improvement in students’ procedural knowledge of scientific research (the ability to pose high-quality research questions, formulate scientifically testable hypothesis(es) and design the experiment) is evident from Tables 3–5, this is not sufficient to confirm that the progress is statistically significant. Therefore, Cohen's d effect size, representing the absolute value of the difference between the pre-test and post-test results in terms of their common standard deviation, was calculated and interpreted as described in the statistical analysis section.

The descriptive statistics and Cohen's d effect sizes are shown in Table 6.

Table 6. Descriptive Statistics and Cohen's d Effect Size (Cohen, Citation1988).

Using the Cohen's d values from Table 6, we can verify H1, H2 and H3, which we established at the beginning of our study. In particular:

  • The effect size of the improvement in students’ ability to pose high-quality research questions is medium; therefore, H1 is confirmed.

  • The effect sizes of the improvement in students’ ability to formulate scientifically testable hypotheses containing dependent and independent variables and to design the experiment are large; therefore, H2 and H3 are confirmed.

We were also interested in whether there was a statistically significant difference in the advancement of scientific literacy between undergraduate and postgraduate students (the test of H4). Using the Kolmogorov-Smirnov test, we found that the variables were normally distributed, justifying the use of one-way analysis of variance. To compare the means, we performed a one-way ANOVA, which revealed no statistically significant differences: p = .982 for posing a high-quality RQ, p = .105 for formulating the hypotheses, and p = .181 for designing the experiment. Therefore, H4 (there is a statistically significant difference in procedural knowledge of scientific research between undergraduate and postgraduate students) is rejected.
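The group comparison can be illustrated with a one-way ANOVA F statistic computed from first principles. This is a sketch only (the study used SPSS), and any scores fed to it would be invented for illustration:

```python
def one_way_anova_f(*groups):
    """F statistic and degrees of freedom for a one-way ANOVA over lists of scores."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    # between-groups and within-groups sums of squares
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within
```

Given F and the degrees of freedom, the p-value follows from the F distribution (e.g. via `scipy.stats.f.sf`); as in the study, p ≥ .05 means the null hypothesis of equal group means cannot be rejected.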

Discussion

Based on the results, we can discuss the research questions defined at the beginning of our study:

What's the difference in students’ scientific literacy between undergraduates and postgraduates?

In the results of the study, we did not find a statistically significant difference in procedural knowledge about scientific research between undergraduate and postgraduate students.

These results are in line with a previous study (Nja, Citation2019), which found no statistically significant differences in the level of scientific literacy based on the year of study.

The results were also expected, because all students in the selected sample are science-oriented, and the historical aspect of physical theories taught in the seminar was on average new to undergraduate students and completely new to postgraduate students. Future work is to replicate this seminar/workshop experience with a sample of social science or humanities students. In this way, a difference could be assessed and appreciated. Finally, the methodology of scientific research differs slightly from discipline to discipline, and the transfer of methodology is not trivial.

Another logical interpretation of the achieved result comes from the basic definition of scientific literacy (Cresswell et al., Citation2015; Jonāne, Citation2015), which interprets scientific literacy as transferable skills and separates them from content knowledge as defined in the curriculum.

The result obtained is therefore further evidence that without the planned development of elements of scientific literacy, there is no guarantee that the population will improve the level of scientific literacy and develop scientific research skills.

How can we improve students’ scientific literacy through appropriate learning activities?

We can conclude from the study that the improvement in students’ scientific literacy is the result of the planned research activities. The most important part of the workshop was the evaluation of the results after the pre-test and the comprehensive discussion. The discussion highlighted important factors of procedural knowledge of scientific research in different case studies (e.g. data, modelling, the historical foundation of science, the difference between historical facts and current physics knowledge, applied science & technology in society, etc.). Such workshops with scientific activities are suitable for different levels of education, especially higher education. According to Holbrook & Rannikmae (Citation2009), at the primary education level student interest in science is generally positive; however, conducting a workshop for younger students, whose level of development does not yet allow for abstract thinking, requires concrete experiments or, in exceptional cases, a virtual experiment.

Previous research (Gucluer & Kesercioglu, Citation2012; Sastradika & Defrianti, Citation2019) has shown that any additional direct experience with experiments improves a person's scientific literacy at least somewhat. In line with the third dimension of scientific literacy (Showalter, Citation1974, p. 450), which states that a scientifically literate person uses scientific processes to solve problems, we highlight the scientific research procedural steps as one of the most important elements of scientific literacy. Student-centered learning enables the development of a good researcher, which will not only benefit their direct study but will also lay the foundations for lifelong learning.

However, the results of this study suggest that it is not enough to simply conduct an experiment and focus on its content. In addition to direct experience with experiments, it is also very important to discuss the research process itself, and to articulate and justify each step in the research process in order to increase awareness of the importance of the research procedure. The results described not only confirm Glaze's (Citation2018) findings but also extend them with concrete pointers for discussion, focusing on the procedural steps of scientific research.

Believing that this type of scientific activity, regularly integrated into the educational process, improves the scientific literacy of the population, we strongly recommend that educators and lecturers adopt this practice.

Conclusion

We can conclude that students already have some understanding of how to define research questions, pose hypotheses and subsequently design experiments to test them. This is not surprising, considering that the students had already gained some experience in scientific research during their science education. After completion of the seminar and a comprehensive discussion and evaluation, confidence in all areas increased. This was most notable in the area of designing experiments to test the hypotheses posed, although more could still be achieved in this area to provide greater understanding.

Transferring what has been learned to one's own example is an indicator of the level of understanding. Structured thinking and consideration of the different factors/variables that can influence the outcome are important for research work. Indirectly, this also brings an important improvement in scientific literacy and in thinking outside the box.

If science education is to enhance students’ acquisition of scientific literacy, its focus should therefore be moved away from content-led teaching towards more problem-oriented scientific research activities, corresponding to the ‘doing science’ concept (Hodson, Citation1998).

Limitations of the study

Every study has its limitations, and we are aware of them. In our case, we point out the following: the study was conducted on a relatively small sample of students, and all students in the sample are science or technology oriented.

Ethics statement

Before the experiment began, the respondents signed with their initials only, so that they could not be identified.

According to the university's regulations, no approval from an ethics committee is required for such an experiment, as no sensitive personal data was collected.

Acknowledgement

The author would like to thank Prof. Dr. Raffaele Pisano and the Research Centre of the University of Lille for their kind reception.

Disclosure statement

The author reports no potential conflict of interest.

Additional information

Funding

This work was supported by the Slovenian Research Agency under the Core Project: ‘Computationally intensive complex systems’, grant number P1-0403, and the Slovenian National Project ‘Science and mathematics literacy, encouraging critical thinking and problem solving’.

References

  • Abu-Zaid, A. (2014). Research skills: The neglected competency in tomorrow’s 21st-century doctors. Perspectives on Medical Education, 3(1), 63–65. https://doi.org/10.1007/s40037-013-0087-7
  • Aiman, U., & Hasyda, S. (2020). The influence of process oriented guided inquiry learning (pogil) model assisted by realia media to improve scientific literacy and critical thinking skill of primary school students. European Journal of Educational Research, 9(4), 1635–1647. https://doi.org/10.12973/eu-jer.9.4.1635
  • Amir, A., Mandler, D., Hauptman, S., & Gorev, D. (2017). Discomfort as a means of pre-service teachers’ professional development – an action research as part of the ‘research literacy’ course. European Journal of Teacher Education, 40(2), 231–245. https://doi.org/10.1080/02619768.2017.1284197
  • Avsec, S., & Savec, V. F. (2019). Creativity and critical thinking in engineering design: The role of interdisciplinary augmentation. Global Journal of Engineering Education, 21(1), 30–36.
  • Bačnik, A., Slavič Kumer, S., Bah Brglez, E., Eršte, S., Golob, N., Gostinčar Blagotinšek, A., Hajdinjak, M., Hartman, S., Ivančič, G., Kljajič. S., Majer Kovačič, J., Mohorič, A., Moravec, B., Novak, N., Pavlin, J., Repnik, R., Vičič, T. (2019). Gradniki naravoslovne pismenosti. [Science literacy building blocks]. https://www.zrss.si/wp-content/uploads/2021/11/2021-11-15-Gradniki-NARAVOSLOVNA-PISMENOST_11V_16_07_2021.pdf.
  • Boyer Commission on Educating Undergraduates in the Research University. (1998). Reinventing undergraduate education: A blueprint for America's research universities. Stony Brook, NY.
  • Brown, B. A., Reveles, J. M., & Kelly, G. J. (2005). Scientific literacy and discursive identity: A theoretical framework for understanding science learning. Science Education, 89(5), 779–802. https://doi.org/10.1002/sce.20069
  • Bybee, R. W. (1997). Achieving scientific literacy: From purposes to practices. American Association for the Advancement of Science.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Erlbaum.
  • Cresswell, J., Schwantner, U., & Waters, C. (2015). A review of international large-scale assessments in education: Assessing component skills and collecting contextual data. PISA, The World Bank, Washington, D.C./OECD Publishing, Paris. https://doi.org/10.1787/9789264248373-en.
  • DebBurman, S. K. (2002). Learning how scientists work: Experiential research projects to promote cell biology learning and scientific process skills. Cell Biology Education, 1(4), 154–172. https://doi.org/10.1187/cbe.02-07-0024
  • Glaze, A. L. (2018). Teaching and learning science in the 21st century: Challenging critical assumptions in post-secondary science. Education Sciences, 8(1), 12. https://doi.org/10.3390/educsci8010012
  • Gucluer, E., & Kesercioglu, T. (2012). The effect of using activities improving scientific literacy on students’ achievement in science and technology lesson. Online Submission, 1(1), 8–13.
  • Hodson, D. (1998). Teaching and learning science: Towards a personalized approach. McGraw-Hill Education (UK).
  • Holbrook, J., & Rannikmae, M. (2009). The meaning of scientific literacy. International Journal of Environmental and Science Education, 4(3), 275–288.
  • Jonāne, L. (2015). Analogies in science education. Pedagogika, 119(3), 116–125. https://doi.org/10.15823/p.2015.027
  • Kaeedi, A., Nasr Esfahani, A., Sharifian, F., & Moosavipour, S. (2021). Research methods curriculum in graduate program: An investigation of the world’s top universities’ approaches to design, implementation and evaluation. Educational Measurement and Evaluation Studies, 10(32). https://doi.org/10.22034/emes.2021.242398
  • Kahn, P., & O’Rourke, K. (2005). Understanding enquiry based learning. In T. Barrett, I. MacLabhrainn, & H. Fallon (Eds.), Handbook of enquiry and problem based learning (pp. 1–12). AISHE and CELT, NUI Galway.
  • Koballa, T., Kemp, A., & Evans, R. (1997). The spectrum of scientific literacy. The Science Teacher, 64(7), 27–31.
  • Laugksch, R. C. (2000). Scientific literacy: A conceptual overview. Science Education, 84(1), 71–94. https://doi.org/10.1002/(SICI)1098-237X(200001)84:1<71::AID-SCE6>3.0.CO;2-C
  • Lemon, T. I., Lampard, R., & Stone, B. A. (2013). Research skills for undergraduates: A must! Perspectives on Medical Education, 2(3), 174–175. https://doi.org/10.1007/s40037-013-0054-3
  • Lenhard, W., & Lenhard, A. (2016). Calculation of effect sizes. Psychometrica. https://www.psychometrica.de/effect_size.html. https://doi.org/10.13140/RG.2.2.17823.92329
  • Marteel-Parrish, A. E., & Lipchock, J. M. (2018). Preparing chemistry majors for the 21st century through a comprehensive one-semester course focused on professional preparation, contemporary issues, scientific communication, and research skills. Journal of Chemical Education, 95(1), 68–75. https://doi.org/10.1021/acs.jchemed.7b00439
  • Nja, C. O. (2019). Scientific literacy of undergraduate science education students in the University of Calabar, Cross River State, Nigeria. Quest Journals Journal of Research in Humanities and Social Science, 7(5), 35–39.
  • OECD. (2013). PISA 2015 Draft Science Framework. www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Science%20Framework%20.pdf.
  • Pella, M. O., O'hearn, G. T., & Gale, C. W. (1966). Referents to scientific literacy. Journal of Research in Science Teaching, 4(3), 199–208. https://doi.org/10.1002/tea.3660040317
  • Pisano, R., Vincent, P., Dolenc, K., & Ploj Virtič, M. (2020). Historical foundations of physics & applied technology as dynamic frameworks in pre-service STEM. Foundations of Science, 26(3), 727–756. https://doi.org/10.1007/s10699-020-09662-4
  • Ploj Virtič, M., & Šorgo, A. (2016). Can we expect to recruit future engineers among students who have never repaired a toy? Eurasia Journal of Mathematics, Science and Technology Education, 12(2), 249–266. https://doi.org/10.12973/eurasia.2016.1201a
  • Ratte, A., Drees, S., & Schmidt-Ott, T. (2018). The importance of scientific competencies in German medical curricula - the student perspective. BMC Medical Education, 18(1), 1–10. https://doi.org/10.1186/s12909-018-1257-4
  • Roberts, D. A. (1988). Development and Dilemmas in Science Education (pp. 27–54)
  • Roth, W. M., & Lee, S. (2004). Science education as/for participation in the community. Science Education, 88(2), 263–291. https://doi.org/10.1002/sce.10113
  • Rutt, A. A., & Mumba, F. (2021). Pre-service teachers' enactment of language- and literacy-integrated science instruction in linguistically diverse science classrooms. Journal of Research in Science Teaching, 59(4). https://doi.org/10.1002/tea.21739
  • Sastradika, D., & Defrianti, D. (2019). Optimizing inquiry-based learning activity in improving students’ scientific literacy skills. Journal of Physics: Conference Series, 1233(1), 012061. IOP Publishing. https://doi.org/10.1088/1742-6596/1233/1/012061
  • Shamos, M. (1995). The myth of scientific literacy. Rutgers University Press.
  • Showalter, V. M. (1974). What is unified science education? Program Objectives and Scientific Literacy. Prism II, 2(3–4), 1–6.
  • Shwartz, Y., Ben-Zvi, R., & Hofstein, A. (2005). The importance of involving high-school chemistry teachers in the process of defining the operational meaning of “chemical literacy”. International Journal of Science Education, 27(3), 323–344. https://doi.org/10.1080/0950069042000266191
  • Šorgo, A., & Ploj Virtič, M. (2020). Engineers do not grow on trees. Global Journal of Engineering Education, 22(3), 168–173.
  • Thomas, G., & Durant, J. (1987). Why should we promote the public understanding of science? In M. Shortland (Ed.), Scientific literacy papers (pp. 1–14). Department for External Studies, University of Oxford.
  • Tsai, C. Y. (2015). Improving students’ PISA scientific competencies through online argumentation. International Journal of Science Education, 37(2), 321–339. https://doi.org/10.1080/09500693.2014.987712
  • Tsai, C. Y., Lin, H. S., & Liu, S. C. (2020). The effect of pedagogical GAME model on students’ PISA scientific competencies. Journal of Computer Assisted Learning, 36(3), 359–369. https://doi.org/10.1111/jcal.12406
  • van Aalsvoort, J. (2004a). Logical positivism as a tool to analyse the problem of chemistry’s lack of relevance in secondary school chemical education. International Journal of Science Education, 26(9), 1151–1168. https://doi.org/10.1080/0950069042000205369
  • van Aalsvoort, J. (2004b). Activity theory as a tool to address the problem of chemistry’s lack of relevance in secondary school chemical education. International Journal of Science Education, 26(13), 1635–1651. https://doi.org/10.1080/0950069042000205378
  • Weeks, A., Bachman, B., Josway, S., Laemmerzahl, A. F., & North, B. (2014). Guiding student inquiry into eukaryotic organismal biology using the plasmodial slime mold Physarum polycephalum. The American Biology Teacher, 76(3), 196–200. https://doi.org/10.1525/abt.2014.76.3.8
  • Zhu, Y. (2019). How Chinese students’ scientific competencies are influenced by their attitudes? International Journal of Science Education, 41(15), 2094–2112. https://doi.org/10.1080/09500693.2019.1660926