
Factors in Information Literacy Education

Pages 116-130 | Published online: 29 Feb 2008

Abstract

Information literacy has long been discussed in the field of library science but has only recently been applied in specific academic disciplines. This article assesses student learning of information literacy skills by analyzing data collected from three semesters of an Introduction to Comparative Politics course. Variables such as major discipline, gender, class year, and grades on several performance indicators are used to identify key patterns in successful information literacy learning among students. Questions that drive this research include: How do major disciplines approach information literacy differently? Is information literacy discipline specific? Does gender affect information literacy aptitude? Do upper-division students still need information literacy education? Which students are most deficient in their pretest knowledge of information literacy? What types of exercises are effective in teaching information literacy? Through analysis of our data, we address these questions and isolate the most significant factors in student learning of information literacy skills. Our data suggest that information literacy knowledge is content sensitive: not only is information literacy significantly associated with several performance indicators, but it also appears to be discipline specific.

The quotation “knowledge is power” has widely been attributed to Sir Francis Bacon in the late 1500s. His insight has tremendous relevance at the beginning of the twenty-first century, in the information age, as the volume of information confronting individuals daily seems to rise exponentially. Citizens in the contemporary world must be equipped to access and to process information efficiently. Information literacy skills can make this task easier. Information literacy is defined as “a set of abilities requiring individuals to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information” (Association of College and Research Libraries 2000).

Given that information literacy skills are needed, what accounts for them, and how can information literacy learning best be accomplished? This article examines the factors that account for differing levels of student learning of information literacy skills. We discuss the apparently natural connection between political science and information literacy and explore the prospect that information literacy might be contextualized, relating more closely to some academic disciplines than to others. We then describe an information literacy module administered to students in three different classes during successive semesters. These students took a pre- and posttest, and their information literacy competencies at each point in time are discussed. Finally, quantitative analysis is used to examine the factors that account for information literacy learning.

Political Science and Information Literacy

Political science professors often lament the under-informed nature of the average college student. After all, our courses are perhaps the most likely on campus to engage regularly with the contemporary issues of the day. We need our students to be equal to the task, and pausing to recapitulate even briefly the relevant details of key events can be cumbersome. Thus, political science and information literacy seem inherently linked. It seems odd, then, that political science appears to lag behind other disciplines in incorporating information literacy education into its curriculum; psychology, a kindred social science discipline, has identified information literacy as a major competency and its sixth learning goal and outcome (Halonen 2002). Additionally, little research has explored the application of information literacy to the political science curriculum apart from Marfleet and Dille (2005).

A central supposition in the Marfleet and Dille article is that information literacy is “content neutral,” meaning that it is fostered equally well across courses and academic disciplines (2005, 187). This article investigates that claim more fully, contending that there may be reason to expect a logical fit and affinity between political science and information literacy; such affinities may exist more widely, and some disciplines may even show a natural aversion. Marfleet and Dille do not explicitly examine major or other discipline indicators in their analysis. A central question of this article, therefore, is whether information literacy learning is discipline specific.

Approach and Method

The goal of this investigation is to isolate key explanatory factors in information literacy learning. The data have already been analyzed to show that students learned information literacy skills through the application of the information literacy module described below (Williams, Goodson, and Howard 2006). This article attempts to demonstrate why that learning occurred.

A posttest provides one indicator of information literacy learning. A pretest/posttest design provides for assessment of the acquisition of student information literacy knowledge over the semester. The pretest is administered in the first or second week of the semester; the posttest is administered one week after the research paper is submitted. The same questions are asked on both the pretest and posttest. Test questions were designed by correlating student learning outcomes for the course with the appropriate information literacy standards set forth by the Association of College and Research Libraries (2000). For each information literacy competency standard, at least one and often several questions were designed to evaluate whether students possess this knowledge or ability (Figure 1). For instance, one group of questions asks students to use a concept map to organize a set of ideas. These questions evaluate Information Literacy Competency Standard Four: “The information literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose.” Another series of test questions asks students about conducting searches for information using the library's online catalog. These questions evaluate Information Literacy Competency Standard Two: “The information literate student accesses needed information effectively and efficiently.”

Figure 1 Information literacy competency standards.
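As a small illustration of the blueprint described above, each standard can be paired with the test items written to assess it. The mapping below is hypothetical; only Standards Two and Four are quoted in the text, and the item topics are our own illustrative placeholders, not the study's actual instrument (which Figure 1 summarizes).

```python
# Hypothetical standard-to-item blueprint; keys and item topics are
# illustrative stand-ins for the mapping summarized in Figure 1.
blueprint = {
    "Standard 2: accesses needed information effectively and efficiently":
        ["online catalog search", "database search strategy"],
    "Standard 4: uses information effectively for a specific purpose":
        ["organizing ideas with a concept map"],
}

# Each standard is evaluated by at least one, and often several, questions.
for standard, items in blueprint.items():
    print(f"{standard} -> {len(items)} item(s): {', '.join(items)}")
```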


A culminating research paper project provides a second indicator of information literacy learning. The instructor provides a grading matrix for this project showing students that information literacy skills carry substantial weight in the paper grade. The instructor also reminds students, as they work on skill-building exercises over the course of the semester, that each exercise is designed to help them improve their research paper. The exercises, discussed below, take students through the steps of project management for the research paper. For the research paper assignment, the Country Issue Study, students select one country outside the United States and explore a contemporary issue facing that country, whether social, political, or economic. The paper is expected to analyze the dimensions of the problem, to describe its effects on the country, and to speculate about possible solutions, as well as the likelihood of those solutions given the present context. The paper is required to be five to seven pages in length, to use a minimum of six scholarly sources, and to begin with a thesis statement addressing the research question.

The raw independent variables considered in this study were drawn from available student demographic information that could be readily derived and coded for each student. Upon project initiation, we expected several factors could account for information literacy learning: gender; major field of study; division (upper or lower); class year; overall course grade, as a measure of general aptitude; and pretest score, as an indicator of the exogenous level of information literacy knowledge possessed by the student. As indicated previously, a posttest score and paper grade were also included as indicators of the level of information literacy knowledge possessed by the student at the end of the course. Finally, the semester of the course was included as a control variable to ensure that the population was relatively normally distributed across semesters and that no oddities occurred in the treatment administered each semester (see Appendix 1 for variable coding).
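To make the coding concrete, the sketch below shows one way such records might be prepared for analysis. The authors' actual scheme appears in Appendix 1, so every column name and value here is an illustrative assumption, not the study's data.

```python
# A minimal coding sketch; all names and values are invented for illustration.
import pandas as pd

raw = pd.DataFrame({
    "gender":   ["F", "M", "F", "M"],
    "major":    ["Political Science", "Business", "Social Science", "Pre-Law"],
    "class":    ["Sophomore", "Senior", "Junior", "Freshman"],
    "semester": ["Fall", "Spring", "Fall", "Spring"],
    "pretest":  [52, 61, 55, 47],     # scores on a 100-point scale
    "posttest": [68, 70, 66, 64],
    "paper":    [85, 88, 82, 79],
    "course":   [88, 91, 84, 80],
})

coded = raw.copy()
coded["female"] = (raw["gender"] == "F").astype(int)              # 1 = female
# Pre-law is housed in the political science department (see Analysis below).
coded["polisci"] = raw["major"].isin(["Political Science", "Pre-Law"]).astype(int)
class_year = {"Freshman": 1, "Sophomore": 2, "Junior": 3, "Senior": 4}
coded["class_year"] = raw["class"].map(class_year)
coded["upper_division"] = (coded["class_year"] >= 3).astype(int)  # juniors/seniors
```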

In terms of methodology, we first examine the data using correlational analysis to assess the factors related to information literacy levels at the outset of the course. Second, we use simple descriptive statistics to examine differences in the development of information literacy by major field of study. Finally, we conduct regression analysis to assess the relative impact of our independent indicators on information literacy knowledge, as measured by student posttest and paper scores. Before detailing the findings of our analysis, we describe the information literacy module that was employed.
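As a rough illustration of these three analytic steps, the following sketch runs the same pipeline on simulated data. The coefficients used to generate the data merely echo the pattern of results reported below; none of the numbers are the study's.

```python
# Illustrative pipeline on simulated data; nothing here is the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "class_year": rng.integers(1, 5, n),   # 1 = freshman ... 4 = senior
    "female":     rng.integers(0, 2, n),
    "polisci":    rng.integers(0, 2, n),
})
# Simulated scores loosely echoing the reported pattern: pretest rises with
# class year; posttest rises with pretest, class year, and a polisci major.
df["pretest"] = 45 + 3 * df["class_year"] + rng.normal(0, 8, n)
df["posttest"] = (20 + 0.6 * df["pretest"] + 3 * df["class_year"]
                  + 7 * df["polisci"] + rng.normal(0, 8, n))

# Step 1: correlational analysis of pretest score and student demographics.
print(df[["pretest", "female", "polisci", "class_year"]].corr().round(2))

# Step 2: simple descriptive statistics by (here, binary) field of study.
print(df.groupby("polisci")[["pretest", "posttest"]].mean().round(1))

# Step 3: OLS regression of posttest score on the independent indicators.
print(smf.ols("posttest ~ pretest + class_year + polisci", data=df).fit().summary())
```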

An Information Literacy Module for Developing Information Literacy Skills

Information literacy education was introduced into the Introduction to Comparative Politics course at the University of West Florida initially in an attempt to generate higher-quality research papers. Skill-building exercises were administered over the course of the semester to create a building-block effect: students were taught to start with a puzzle or curiosity, then to formalize the research question, to search for relevant information, to analyze that information, and to reason to conclusions.

The first exercise in the information literacy module is the Search Term Triangulation exercise (Appendix 2). The exercise is designed to prompt students to consider ideas for their paper topic, to focus on specific words they can use in searching the library catalog and databases for information, and to think in terms of asking a research question and using the paper to answer it. The motivation for this exercise is the idea that all good research begins with a puzzle. The Search Term Triangulation exercise emerged from a comment made during library instruction given by the Social Science Librarian at the University of West Florida. While showing students how to perform an advanced search for journal articles in a database, the librarian observed that two search terms often do not provide specific enough search parameters, whereas three search terms provide a broad topic, a focal point, and a clarifying term. This combination of three elements, she indicated, tends to produce narrower results and yields a list of resources targeting the topic more precisely. The exercise developed for the Country Issue Study assignment requires students to insert a country at the top point of a triangle, an issue facing the country at the lower left point, and a clarifying term at the lower right point. Students then insert three synonyms for each term just inside the triangle diagram at each point, corresponding to instruction about truncation and the use of alternate words when searching databases and library catalogs. Finally, students formulate a research question that incorporates all three terms and write this question beneath the triangle diagram.
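For example, the triangulated terms and their synonyms translate directly into a Boolean search string of the kind most library databases accept. The topic, synonyms, and operator syntax below are hypothetical; exact wildcard and phrase conventions vary by database.

```python
# Build a Boolean query from three triangulated term groups (hypothetical
# example; truncation '*' and phrase quoting conventions vary by database).
def triangulated_query(country, issue, clarifier):
    """Each argument lists a main term followed by its synonyms."""
    def group(terms):
        quoted = [f'"{t}"' if " " in t else t for t in terms]
        return "(" + " OR ".join(quoted) + ")"
    return " AND ".join(group(terms) for terms in (country, issue, clarifier))

# A hypothetical Country Issue Study topic: country, issue, clarifying term.
print(triangulated_query(
    ["Nigeria", "Niger Delta"],
    ["oil", "petroleum", "crude"],
    ["conflict", "violen*", "insurgen*"],
))
# (Nigeria OR "Niger Delta") AND (oil OR petroleum OR crude) AND (conflict OR violen* OR insurgen*)
```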

When the class meets with a librarian for library instruction tailored to the demands of the Country Issue Study assignment, the Search Term Triangulation exercises are given to the librarian in advance to aid in preparing the session. The librarian is then able to use the topics that students intend to investigate to demonstrate search strategies and useful resources. Because students begin thinking about their topics before library instruction, and because the librarian works from the students' own research questions during the session, students have been observed to take more notes and to ask the librarian more, and better, questions than in classes where the information literacy exercises were not employed. During library instruction, the librarian addresses not only basic search strategies, the operation of the library catalog, and access to databases and general references with country-specific information; she also brings in books addressing the students' own research questions and shows journal article search results, tested in advance, using the triangle terms from several students in the class. The librarian also covers how to detect source bias, how to distinguish a popular from a scholarly source, and how to evaluate resources generally for appropriateness to a topic. Following the library instruction, students are given two exercises designed to review this information: one covers the difference between popular and scholarly sources, and the other reviews how to evaluate sources that one might consider using.

The next step, once students are engaged in preliminary research, is to have them submit an annotated bibliography. This exercise demands ten scholarly sources, four books and six scholarly journal articles, even though the final paper requires only six scholarly sources; the surplus encourages students to be selective, choosing the best sources from among those obtained for the paper. The annotated bibliography assignment asks students to provide an abstract for each scholarly journal article and a one-paragraph summary of the key ideas covered in each scholarly book. Students must then provide one or two sentences explaining how they feel the resource addresses their research question and how they might use it. The goal is to have students think in terms of the utility of a resource for addressing the particular question they are posing, in addition to assessing its academic quality and appropriateness.

Once the annotated bibliography is submitted and the professor's comments have been returned, students are given a chance to refine their Search Term Triangle and research question in light of the comments received. Often they will be told that a topic is too broad or too narrow and asked to modify it accordingly. Other times they will be told that the resources they have found do not appear to address the question they are asking directly, and they will be asked either to find a different type of resource or to refine their topic and research question. In any case, the period just after the initial research is completed is a prime opportunity to sharpen the paper's focus before moving forward with the project.

The next several exercises review some basics of scholarly writing, from academic integrity to the structural components of a research paper. First, a plagiarism exercise gives students a quotation from a book; the quote is then plagiarized in seven different ways, and students are asked to identify the problem with each of the seven examples of use of that material (Appendix 3). Second, a Thesis Identification exercise asks students to underline the thesis statement in three paragraphs taken from articles published in key journals in the discipline of political science. After being taught to identify other authors' thesis statements, students develop their own thesis statement by answering the research question that they developed through their research. Third, the technique of concept mapping is introduced and demonstrated, and students complete a concept map for their research question (Appendix 4). One week later, using the professor's comments on the concept map, students number its bubbles to indicate the order in which they plan to cover the various ideas. Numbering the concept map provides a way to structure the research question visually; it appeals to visual learners and serves much like an outline, providing a premeditated and logical ordering for the paper. Finally, students are taught the basics of American Psychological Association (APA) formatting and complete an exercise in which they correct errors in the use of APA style. This ensures that students remain consistent with one formatting style and reminds them that this is a formal paper, to be structured and formatted accordingly. APA style was selected because it is a social science writing style and is supported by our university writing center.

Analysis

The first goal of this analysis is to shed light on the student factors that influence information literacy. The pretest scores illustrate student information literacy at the beginning of the course. Examining the correlation matrix, we see that one variable in particular is highly correlated with information literacy scores: the student's class is a positive indicator of pretest scores. The further along a student is in his or her college career, the higher the pretest score (p < .01 level). This relationship appears to be more nuanced than a simple lower-division/upper-division distinction; there are real differences between freshmen and sophomores and particularly between juniors and seniors. Not surprisingly, senior students receive the highest scores on the pretest (p < .05 level).

Female students scored no higher or lower on the pretest than male students, but the correlation matrix does reflect a unique characteristic of our sample: there are significantly fewer female than male political science and social science majors (p < .01 level). Student major, however, is not associated with pretest scores. Political science majors and social science majors share similar scores with the other disciplines represented in the class (Table 1).

Table 1. Correlation matrix of literacy pretest and student demographics

Information Literacy by Discipline

While the correlation matrix reveals no significant relationship between student major (classified as political science or social science) and information literacy pretest score, examining separate declared majors across the pretest and posttest yields important findings (Table 2). First, the highest pretest scores are among social science majors (56%), and the lowest scores are among pre-law majors (47%); pre-law is part of the political science department and thus is classified in the correlation matrix as a political science major. Limiting our analysis to declared majors, the top half of the distribution of mean major scores includes political science, international studies, social science, and business; the remaining half includes pre-law, humanities, education, and science.

Table 2. Improvement across information literacy tests by major

Looking to the posttest scores, we see that the majors receiving the highest scores are political science majors (71%), and the students receiving the lowest scores are education majors (50%) (Table 2). The majors making the most notable progress from pretest to posttest are political science and international studies, along with pre-law and humanities (mean net gains between 17 and 18 points). The majors making the least improvement are business, education, science, and social science (mean net gains of roughly 7 to 14 points); business majors demonstrate notably less improvement (6.8 points) than other majors (see note 1).

The uneven gains across disciplines require further consideration. As hypothesized above, we expected that major disciplines might be unequally associated with information literacy education, because different disciplines place different emphases on information. Some academic fields, such as political science and economics, require readily available sources of contemporary information to match changing social conditions, while others, such as mathematics and English literature, have more fixed content. Table 2 may therefore reflect a potential bias of our pretest/posttest in favor of political science and/or social science students. Drawing from comparative politics themes, some of the test questions appear content sensitive while others remain content neutral. Figure 2 presents examples of each type.

Figure 2 Sample pretest/posttest questions.


Determinants of Information Literacy

The final table illustrates the significant determinants of student improvement on information literacy measures over the course of the semester. Three instruments are used to assess information literacy progress. The first measure is the pretest administered at the beginning of the semester to measure student information literacy prior to exposure to the course. The model predicting student pretest scores reiterates the findings of the correlational analysis (Table 3). The only indicator significantly associated with student performance on the pretest is the student's class (p < .01 level). For every one-year increase in a student's class standing (freshman, sophomore, junior, senior), pretest scores increase by approximately three points. Equating this numerical increase to letter grades, the roughly ten-point gap that accumulates across the three class years separating freshmen from seniors means that incoming seniors score over a full letter grade higher on this information literacy assessment than incoming freshmen. Major, however, does not appear to be significantly associated with information literacy at the time of the pretest.

Table 3. Determinants of information literacy

The same assessment instrument was also administered at the end of the semester to measure improvement. The model predicting student posttest scores suggests several significant indicators associated with information literacy. First, a student's pretest and posttest scores are significantly and positively associated (p < .01 level). Of additional importance is the student's class: every one-year increase in student class is associated with a nearly three-point increase in student performance on the posttest (p < .05 level). Finally, a student's major is also a significant indicator of his or her posttest score: political science majors scored an average of seven points higher on the posttest, controlling for pretest score and class (p < .01 level). Overall, these three indicators account for over a quarter of the variance in student posttest scores.
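To make these magnitudes concrete, the posttest model can be written in approximate form. The intercept and the exact pretest coefficient are not reported in the text, so they are left symbolic here:

\[
\widehat{\text{posttest}} \;=\; \beta_0 \;+\; \beta_1\,\text{pretest} \;+\; 3\,(\text{class year}) \;+\; 7\,(\text{political science major})
\]

Under this approximation, a senior political science major is predicted to score about (3 × 3) + 7 = 16 points higher on the posttest than a freshman non-major with the same pretest score.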

The third model uses the same indicators to predict student performance on a final paper assignment. Comparatively, the model is very weak in predicting paper grades. None of the factors are significant indicators of student performance on the paper except for the posttest grade, and even this association is relatively weak (p < .10 level).

One of the most interesting findings is that a student's class and major are significantly associated with his or her posttest score but seem unrelated to his or her paper grade. While it is possible that there is a disciplinary bias in the construction of the pre- and posttest, it is also possible that although students do not demonstrate uniform improvement in information literacy on the multiple-choice posttest, they have in fact uniformly learned over the course of the semester as measured by the paper. In the end, we see an evolution in student information literacy (Figure 3). While some students enter the course with an experiential advantage, by the end of the semester the academic playing field seems to have been leveled, with students demonstrating information literacy regardless of class or major.

Figure 3 Model of student learning.


Discussion and Implications

In this article, we examined the factors that influence student information literacy. Through a series of evaluative instruments, we find that students demonstrate disparate levels of information literacy, associated particularly with academic class. This is a comforting finding: it suggests that over the course of the college career students do acquire information literacy skills. Nonetheless, students overall have much to learn pertaining to information literacy, and students who completed exercises over the course of a semester demonstrated marked improvement. This improvement is most evident in the mean scores by major presented in Table 2.

Of particular note, several elements of this study suggest that student improvement is somewhat contingent upon major: students from kindred disciplines demonstrated greater learning than those from more distant fields. This finding suggests that we should be sensitive to potential disciplinary biases in our assessment instruments, and future research should examine possible disparities in definitions of information literacy across disciplines. Based on our experience at the University of West Florida, we have found that students enter our courses with different degrees of preparation for research-based writing projects. For example, we have taught upper-division students from other majors who had never made use of the campus library during their college experience. Through a study on incorporating information literacy education conducted under the auspices of the College of Arts and Sciences at our university, an interdisciplinary research team found that while some departments placed a premium on encouraging information literacy skills, others did not. This was reflected in the departmental Academic Learning Compacts, in which each department articulates the objectives for student accomplishment over the course of study in the major.

Finally, this study strongly indicates that we should be careful in interpreting findings concerning information literacy. Measurement matters. The assessment tools we utilize to capture student learning are sensitive and can produce different results. In this study, the pre- and posttest measures were significantly related to our student demographic data, while the paper measure was not. One of the strengths of the approach utilized here is the quasi-experimental design. Measuring student information literacy at different intervals over the course of the semester captures the character and evolution of student learning. Future work should further make use of quasi-experimental design to more fully understand the dynamics at work in student development of information literacy.

Notes

Note to Table 1: ∗p ≤ .05; ∗∗p ≤ .01; ∗∗∗p ≤ .001 (two-tailed).

Note to Table 2: ∗ = Social Science major; ∗∗ = Social Science major, and specifically Political Science major. Cell entries in the first column represent the student's major. Cell entries in the second and third columns represent the mean pretest and posttest scores by major. Cell entries in the final column represent the mean net gain, the improvement from the pretest student mean to the posttest student mean by major. Highlighted entries represent those majors with means or gains in the top half of the overall distribution of declared majors.

Note to Table 3: ∗p ≤ .10; ∗∗p ≤ .05; ∗∗∗p ≤ .01 (two-tailed). The dependent variables are the student-level pretest score, posttest score, and paper grade. Assignments were graded on a 100-point scale. Entries are unstandardized regression coefficients, with standard errors in parentheses. Posttest is excluded from the first model because it reflects an exercise subsequent to the initial pretest.

∗ = Variable operationalized as both a dependent and independent variable throughout analysis.

1. It should be noted that only two business majors in the sample completed both the pretest and the posttest. Findings concerning business majors should therefore be treated with caution.

References

  • Association of College and Research Libraries. 2000. “Information Literacy Competency Standards for Higher Education.” Association of College and Research Libraries. 〈www.ala.org/acrl/ilcomstan〉 (September 19, 2004).
  • Halonen, Jane S. (Chair). 2002. Undergraduate Psychology Major Learning Goals and Outcomes: A Report [online]. Task Force on Undergraduate Psychology Major Competencies, American Psychological Association's Board of Educational Affairs. 〈http://www.apa.org/ed/pcue/taskforcereport2.pdf〉 (February 7, 2006).
  • Marfleet, B. Gregory and Brian J. Dille. 2005. “Information Literacy and the Undergraduate Research Methods Curriculum.” Journal of Political Science Education 1: 175–190.
  • Williams, Michelle Hale, Kymberly Goodson, and Gary Howard. 2006. “Weighing the Research Paper Option: The Difference that Information Literacy Skills Can Make.” PS: Political Science and Politics 39 (July): 513–519.

Appendix 1: Variable Descriptions and Coding

Appendix 2: Search Term Triangulation Exercise

Topic Triangulation (example)

Assignment: In conceptualizing your project or searching library resources, using three terms can provide much more focused results. Triangulate your topic by filling in the three boxes and the accompanying synonym lines below. Then use the three terms to construct a research question.

Appendix 3: Plagiarism Exercise

Source: adapted from Burkhardt, Joanna et al. 2003. Teaching Information Literacy. Chicago: American Library Association.

The quotation below is from the book Polyarchy by Robert A. Dahl. Read the original quotation. Selections 1–7 are ways in which someone might use this information in a term paper. Which of these constitute plagiarism and which are acceptable? Underline examples of plagiarism. On the next page, explain why the underlined items constitute plagiarism. Be ready to discuss your answers.

Original quotation:

Where the equations of classical liberalism went wrong was in supposing that any alternative to competitive capitalism necessarily required a centrally directed economy, whereas in fact competition among privately owned firms is by no means the unique method of decentralizing an economy.

Selection 1

Where the equations of classical liberalism went wrong was in supposing that any alternative to competitive capitalism necessarily required a centrally directed economy, whereas in fact competition among privately owned firms is by no means the unique method of decentralizing an economy.

Selection 2

Where the equations of classical liberalism went wrong was in supposing that any alternative to competitive capitalism necessarily required a centrally directed economy, whereas in fact competition among privately owned firms is by no means the unique method of decentralizing an economy (Dahl 1971).

Selection 3

“Where the equations of classical liberalism went wrong was in supposing that any alternative to competitive capitalism necessarily required a centrally directed economy, whereas in fact competition among privately owned firms is by no means the unique method of decentralizing an economy” (Dahl 1971).

Selection 4

Where the equations of classical liberalism went wrong was in assuming that any alternative to competitive capitalism had to consist of a centrally directed economy, whereas in fact competition among privately owned firms is by no measure the unique method of decentralizing an economy (Dahl 1971).

Selection 5

“Where the equations of classical liberalism went wrong was in assuming that any alternative to competitive capitalism had to consist of a centrally directed economy, whereas in fact competition among privately owned firms is by no measure the unique method of decentralizing an economy” (Dahl 1971).

Selection 6

Equations of classical liberalism were incorrect in assuming that any alternative to competitive capitalism must be in the form of a centrally planned economy. In fact, competition between privately owned corporations is hardly the only method of decentralizing an economy (Dahl 1971).

Selection 7

Contrary to previous assumptions, competition among private firms is not the only method available for decentralizing an economy. In this way classical liberalism has erred in its calculation that a centrally planned economy is the only alternative to competitive capitalism (Dahl 1971).

Problems constituting plagiarism:

SELECTION 1

SELECTION 2

SELECTION 3

SELECTION 4

SELECTION 5

SELECTION 6

SELECTION 7

Appendix 4: Concept Mapping Exercise

Concept Mapping (example)

Assignment:

Part I. Fill in your refined Research Question in the center circle, above the center line.

Part II. Based on preliminary research, fill in subtopics and factors related to your research question in the outer bubbles.

Part III. Fill in a Thesis in the center circle, below the center line.

Part IV. Number the terms in the outer bubbles to organize and to structure your paper.
