
Discovery on a Budget: Improved Searching without a Web-Scale Discovery Product

Pages 129-136 | Published online: 08 Apr 2013

Abstract

Discovery is a key component of a library's services, and user expectations are high. Even if a Web-scale discovery system is not in the cards, there is plenty a library can do to improve discovery for its users. Librarians at Southern Illinois University Edwardsville have conducted a series of user studies focused on discovery tasks using the library's website and catalog. The lessons learned from these studies have led to an ongoing redesign of the library's website as well as recommendations for instruction. The presentation shares successful strategies for evaluating and improving discovery, with no expensive software or programming skills necessary.

INTRODUCTION

Discovery is crucial to the success of a library and its users. Even the most current and complete collection is useless if patrons are unable to find the information that is relevant to them. As the discovery process has moved online, libraries and vendors have worked to develop solutions that are easy and effective for a broad range of users. Most recently, libraries have embraced Web-scale discovery tools that allow a user to search quickly and easily across a broad range of resources including article databases, institutional repositories, and the catalog.

However, such tools may be undesirable or infeasible for many libraries. Primarily due to budget restrictions, Lovejoy Library at Southern Illinois University Edwardsville (SIUE) has been unable to implement a fully featured discovery tool. Nevertheless, the library staff have recognized that there are other ways to remove barriers and improve discovery and research for users. The discovery process is affected by many factors, including the research question and search terms, the navigability of the library's website and other search tools, and the terminology employed by those tools. This session discussed a program of user studies and the site redesigns guided by those studies, with the intent of improving discovery. The presenters relayed the core lessons learned during the process as well as the methods for conducting the user studies, in the hopes that attendees would study their own populations rather than simply relying on the reported results.

RESOURCE DISCOVERY

Lovejoy Library's VuFind discovery interface was implemented in June 2010 and branded as UFind. Lovejoy Library is a member of the Consortium of Academic and Research Libraries in Illinois (CARLI) and, with seventy-six other CARLI member libraries, participates in the I-Share system, which provides a merged union catalog of the holdings of all I-Share members. The I-Share union catalog also uses the VuFind discovery interface. The library also has access to WorldCat Local quick start, provided free to all institutions that subscribe to WorldCat from FirstSearch. In our present configuration, quick start users can only connect directly to FirstSearch databases, most of which are not full-text. Lovejoy Library users can also reach journals and databases through the e-journal list (our SFX implementation, branded SIUE Journal Link), lists of databases by subject, and the A–Z list of databases and e-resources.

USER STUDIES

By 2009, Lovejoy's website had become very difficult to navigate, so the Web Task Force was formed to assess and redesign the library website. A commercial tool for improving resource discovery was not an option, so the Task Force decided to approach the website redesign from the bottom up, with a series of user studies. The first user study, Paper Study One, used a paper worksheet that simulated navigating the library website. Participants were asked to answer seven navigational questions of varying complexity. They also completed two additional surveys: one collecting demographic information and one with wrap-up questions soliciting input on ways to improve the website. Paper Study One included 109 participants: 79 undergraduates, 11 graduate students, 14 faculty and staff, and 5 participants who identified themselves as “other.” The data revealed confusion regarding the website, both in terms of its organization and its use of library jargon.

Several revisions to the website were implemented as a result of the data from Paper Study One. The central area of the homepage was reduced to three broad sections, each with no more than four links. The left column was reserved for quick links to specific resource discovery tools, and the right column for public relations items such as new books, featured resources, and trials.

A second paper study was conducted using the redesigned website. The survey instrument from Paper Study One was reused; however, the paper worksheets included images from the redesigned website. Paper Study Two included 75 participants: 65 undergraduates, 6 graduate students, and 4 faculty and staff. This time around, participants did a better job of completing the discovery tasks, but there was still confusion about possible overlap between the links, and it was evident that some of the terminology used on the website was still not clear.

Because so much information had been moved from the homepage to secondary pages, the Task Force decided another paper study would not yield the information needed to continue improving the website. It was determined that an observational study on the live website would garner better data. Observational Study One was limited to two tasks, with investigators noting each link clicked and the time taken to complete each task. The demographic and wrap-up questions from the previous paper studies were also used. Observational Study One included 25 participants: 18 undergraduates, 5 graduate students, and 2 faculty and staff. About half of the participants were unable to complete the discovery tasks due to confusion about the secondary Web pages. This study showed that while the redesigned website was generally easier to navigate, improvements were still needed. Participants did not understand the difference between the journal list, the list of all e-resources (databases), and the list of e-resources (databases) arranged by subject. As a result of the data from the study, the order of links on the homepage was modified and some of the terminology was changed.

To test the latest set of modifications, a second observational study was administered, using the same survey instrument as Observational Study One. Some Task Force members felt that the “eResources A–Z” link on the homepage was confusing to users. To determine whether this link was actually causing problems, half of the participants saw a homepage with the “eResources A–Z” link, and half saw a homepage without it. The secondary pages were the same for both groups. Observational Study Two included 50 participants: 32 undergraduates, 16 graduate students, and 2 participants who identified themselves as “other.” Participants were much more successful with the tasks than the Observational Study One participants. For task one, 53 participants selected the correct answer, while 22 selected other answers. For task two, 57 participants selected the correct answer, while 18 selected other answers. The different versions of the homepage did not affect the results.

The four studies illustrated how a successful website redesign can be guided by user input, and they emphasized the importance of regularly surveying users and applying those results to website redesign. Even though significant improvements were achieved, Observational Study Two showed that further improvements were still needed.

When Lovejoy Library implemented UFind in June 2010, significant efforts were made to get the word out about the new interface. By late 2011, it was becoming apparent that even though users seemed fairly confident using UFind, they might not be using it as effectively as possible. Reference Desk staff noticed that most users were doing only keyword searches and might not understand the difference between an author, title, and subject search. There was also a concern that the facets were not being used for limiting, and that users were scrolling through long result sets. Two Technical Services librarians and the Instruction Librarian decided it was time to do more observational studies, this time aimed at the catalog. The decision was made to do three separate observational studies focusing on e-books, searching and facets, and the I-Share catalog. Each study consisted of four questions on the three topics plus demographic and wrap-up questions. Investigators sat next to the participants and encouraged them to think aloud as they worked through the tasks. The three studies included a total of 42 participants: 35 undergraduates and 7 graduate students.

For the e-book study, we wanted to determine whether participants could find e-books in our catalog, whether they could distinguish between an e-book and a print book, and whether they would sort through a long hit list to find e-books. The study revealed that participants were not narrowing their searches to “electronic” and usually scrolled through long lists of results. Not all participants were able to distinguish the e-book version from the print version.

For the Lovejoy searching study, the investigators hoped to find out whether participants could navigate the UFind catalog, whether they could decipher the catalog displays, and whether they were using the facets for limiting. Results showed that participants could generally navigate the catalog and did understand the bibliographic and holdings displays. Participants seldom used the facets, however, and usually favored scrolling over limiting.

In the I-Share searching portion of the study, investigators wanted to determine whether participants could navigate the I-Share catalog, whether they understood the I-Share display, whether they were limiting by facets, and whether they understood the difference between the UFind catalog and the I-Share catalog. Investigators found that participants were adept at navigating the I-Share catalog and were able to interpret the displays, but some participants had problems finding local holdings. A surprising finding was that participants did use the facets for narrowing in I-Share, perhaps because of the very large result sets retrieved in the shared catalog.

Across the three studies, keyword searches accounted for 56% of participants' searches, title keyword searches for 34%, subject keyword searches for 8%, and author keyword searches for 2% (even though no questions required an author keyword search). Even though all the questions would have been easier to answer using facets for limiting, participants used facets in only 23% of their searches. The study also found that participants who had received library instruction used an average of 1.62 non-keyword searches and 1.04 facet limits across their four searches, while participants without library instruction averaged 1.38 non-keyword searches and 0.69 facet limits. Although interesting, these differences are not statistically significant.

The demographic surveys filled out by participants showed that many students had received introductory-level instruction and instruction tied to a senior assignment, but very little in between. This indicated a gap between the first and final years of college, the period when students are building their research skills. This gap also meant that students who missed freshman-level instruction for some reason may not have had any library instruction until they reached the end of their studies. The investigators felt that this gap could be filled with instruction sessions or workshops focusing on specific topics, such as using e-books or using facets to narrow Online Public Access Catalog (OPAC) searches. In addition, the investigators suggested partnerships with teaching faculty as a way to fill this gap and improve students' searching skills.

CORE LESSONS

While the website and OPAC user studies and redesigns had immediate, institution-specific impact, some central themes also emerged that could inform the work of other libraries. The following six themes represent the core lessons of our studies for improving users’ research and discovery experiences.

Names and Language

Every member of the North American Serials Interest Group (NASIG) conference audience would likely have a definition for terms such as database, research, periodical, or electronic resource. Undergraduate students, on the other hand, might be stumped or might define the terms in very different ways. It is important to consider the uncertainty these labels may cause users, even though their meanings are clear to library staff. Participants in the library's user studies found action-based language much easier to understand. Words such as get, find, search, and ask let users know what they will be able to accomplish on the other end of a link. The library's new website incorporated more action-based language, particularly when linking to catalogs, database lists, and journal lists, and users demonstrated a much clearer understanding of these labels.

One final piece of advice is to cut down on vendor branding in links and service names. The landscape of library vendors can be confusing even for professionals. Casual users will likely not be familiar with more than one or two database vendors, so the inclusion of those vendor names may not be helpful. Again, action-based language or clear, topical labels are preferable. We found this was particularly important for our link resolver and journal list. While students did not recognize the vendor's name for the product, they understood buttons that said “Find Full Text” or “Journal Link.”

The Order Matters

In any interface or list, the order is a very important concern. Participants in our studies often gravitated to the first link in a list and scanned only the first few search results. It may be unrealistic to change this behavior, so arranging in order of prominence or popularity may be key to users’ success. Along with the clearer labels already mentioned, this minimizes the amount of reading that users must do in order to navigate resources quickly and effectively.

We found this was particularly important for lists of links on a page. After our initial website redesign, users were having trouble with questions that asked them to find a particular journal. Our journal list was at the bottom of a list of four links, and it seemed that this made it difficult for users to spot. It was moved to the top of the list in a subsequent redesign, and students were able to find it much more easily.

Be Familiar

Most library users spend more time searching for information with Google or Amazon than with the library's catalog or databases. Because of this, it may be helpful to emulate certain aspects of popular websites' user experience. This is not to say that the library's website should look exactly like Google, but some conventions of wording and the placement of elements on the page are worth noting.

During our OPAC user studies, we found that very few participants narrowed their searches. It seemed as though many participants were not aware of the narrowing options, despite their fairly prominent placement. However, the investigators noticed that Google, Amazon, and WorldCat all place their narrowing options on the left side of the screen, while our OPAC places them on the right. The inclusion of the narrowing options in a location often reserved for ads may have thrown off many users.

Let Users Help You

The most important lesson we learned during this process was to seek user input as guidance for website design and instruction. The idea that we could think like users or anticipate their behavior proved to be false. In order to develop useful services, it is crucial to see how people will actually use them. There are many ways of soliciting user input, including surveys, focus groups, observational studies, feedback forms, and blogs. Reliance on blog comments or feedback forms may limit input to a vocal minority of users. Direct observational studies offer the possibility of seeing actual use, rather than relying on possibly inaccurate reports of use.

Problem reports are another valuable form of feedback, and they can reveal quite a bit about the obstacles users encounter. It is best to use reporting mechanisms that provide the recipient with as much information as possible without requiring the user to supply it. For example, when a student encounters a problem with our proxy server, they see a message that allows them to enter their e-mail address and send a report. The form itself records the problematic Uniform Resource Locator (URL), rather than relying on the user to copy and paste it. Most link resolvers include a similar reporting mechanism.
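As a rough illustration of this idea, the browser-side TypeScript sketch below captures the failing URL automatically, so the user only has to supply an e-mail address. The reporting endpoint (/report-problem) and field names are hypothetical examples, not part of any particular proxy product.

// A minimal sketch of a problem-report form that records the
// problematic URL itself instead of asking the user to paste it.
function buildProblemReport(): void {
  const form = document.createElement("form");
  form.method = "POST";
  form.action = "/report-problem"; // hypothetical reporting endpoint

  // Capture the failing URL automatically from the page the user is on.
  const urlField = document.createElement("input");
  urlField.type = "hidden";
  urlField.name = "problem_url";
  urlField.value = window.location.href;

  // The only thing the user must supply is a contact e-mail address.
  const emailField = document.createElement("input");
  emailField.type = "email";
  emailField.name = "email";
  emailField.placeholder = "Your e-mail address";

  const submit = document.createElement("button");
  submit.type = "submit";
  submit.textContent = "Report this problem";

  form.append(urlField, emailField, submit);
  document.body.appendChild(form);
}

buildProblemReport();

Because the URL is recorded by the form itself, the report arrives complete even when the user cannot describe where the failure occurred.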

Search Boxes

While no search tool can include everything, users will behave as if it does. We observed several participants using the site search to look for articles or books. In general, we saw users approaching search boxes in ways we had not expected. Library researchers from North Carolina State University provide an interesting look at the difficulties of anticipating user behavior, even when the scope of a search box has been made as broad as possible.[1] Search boxes with a narrow scope (e.g., a search of database titles) are particularly prone to misuse and should be very clearly labeled.

Libraries seeking to bring article searching to their front pages may already be able to do so without purchasing a separate tool. If the goal is simply to provide researchers with a starting point, then several options are available. Most libraries have one or two heavily used cross-disciplinary databases. Many databases have an open application programming interface (API) that allows libraries to bring the database's search box to their front page. Libraries that use WorldCat Local quick start can do the same thing. A tabbed search box could even be used to provide direct access to multiple databases from the front page.
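As a rough sketch of the tabbed approach, the browser-side TypeScript below routes a single query to whichever target the user has selected. The target URLs and query parameters are hypothetical placeholders; each real catalog or database publishes its own search URL format.

// A minimal tabbed search box: one input, several possible targets.
// All URLs below are invented placeholders for illustration only.
const targets: Record<string, string> = {
  Catalog: "https://ufind.example.edu/Search/Results?lookfor=",
  Articles: "https://database.example.com/search?q=",
  Journals: "https://journals.example.edu/search?title=",
};

function renderTabbedSearch(container: HTMLElement): void {
  let active = "Catalog"; // which tab's target receives the query

  const tabs = document.createElement("div");
  for (const name of Object.keys(targets)) {
    const tab = document.createElement("button");
    tab.textContent = name;
    tab.onclick = () => { active = name; };
    tabs.appendChild(tab);
  }

  const input = document.createElement("input");
  input.type = "search";
  input.placeholder = "Search…";

  const go = document.createElement("button");
  go.textContent = "Search";
  // On submit, hand the query to whichever target's tab is active.
  go.onclick = () => {
    window.location.href = targets[active] + encodeURIComponent(input.value);
  };

  container.append(tabs, input, go);
}

renderTabbedSearch(document.body);

Because this version simply redirects to each target's own results page, it requires no API key or server-side component; a database's open API could replace the redirect if results are wanted on the library's own page.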

Work Together

Another important lesson from our experience is to reach out to many colleagues when evaluating or improving the library's Web presence or discovery tools. Even at institutions that have a group in charge of the website, it may be useful to reach beyond that group. If the Web committee does not include a broad range of perspectives, it may work for months before hitting a stumbling block that could have been anticipated from the beginning.

WRAP UP

The discovery process includes many steps and often requires the user to work through several different interfaces. As such, ensuring efficient discovery requires a holistic approach that addresses the many potential problems along the way. Importantly, user expectations and the landscape of websites and search tools will continue to change over time. As a result, libraries must focus on continual improvement, rather than making one set of changes and resting on their laurels. By heavily involving users along the way, libraries will enable their communities not just to seek, but to find, the information they are looking for.

Notes

1. Cory Lown, Tito Sierra, and Josh Boyer, “How Users Search the Library from a Single Search Box,” College & Research Libraries (forthcoming), http://crl.acrl.org/content/early/2012/01/09/crl-321.full.pdf+html (accessed May 5, 2012).
