
Finding the gaps: A survey of electronic resource management in Alma, Sierra, and WMS

ABSTRACT

The objective of this study was to determine whether libraries that have implemented a next-generation library system are able to complete electronic resource management (ERM) workflows entirely within that system. A survey of librarians received 299 responses from users of Alma, WorldShare Management Services (WMS), and Sierra. Responses indicate that there are gaps in workflows and that many libraries are still performing core ERM tasks outside these three systems. The study concludes that these systems may require further development before they are able to fully support complex ERM processes.

The rapid growth of electronic resources over the past 2 decades has fundamentally transformed how libraries acquire and manage their collections. To support increasingly complex electronic resource management (ERM) workflows, libraries have had to supplement their traditional integrated library system (ILS) with multiple additional software products such as ERM systems (ERMS), link resolvers, proxy servers, and knowledge bases. The lack of interoperability between these systems has also forced librarians to devise manual workarounds such as storing data in spreadsheets, shared drives, or e-mails. For ERM processes to be optimally streamlined and efficient, more sophisticated applications—designed to support every aspect of ERM workflows—are needed. One measure of success for any application designed to support ERM workflows should be the elimination of the patchwork of systems and workarounds that has plagued electronic resources librarians for years.

In the past several years many academic libraries have begun retiring traditional print-resource-based ILSs in favor of next-generation systems that promise to more fully integrate electronic resource workflows. These new systems, dubbed "library services platforms" or LSPs by library automation consultant Marshall Breeding (2011), are designed to provide improved system functionality that meets the current realities of complex collection management across all content formats. These new platforms can replace not only the library's ILS but also the ERMS and link resolver. As of December 2014, three LSPs had been fully implemented by U.S. libraries: Ex Libris' Alma, OCLC's WorldShare Management Services (WMS), and Innovative's Sierra (Breeding, 2015). Two other systems, Kuali OLE and ProQuest's Intota, are partially developed but not yet fully implemented (Breeding, 2015).

Although these new LSP systems have been implemented in academic libraries for more than 3 years (Breeding, 2015), little has been written to evaluate their impact on ERM processes. In particular, no study has measured whether the external tools needed in a traditional ILS/ERMS environment—such as homegrown systems, spreadsheets, e-mails, and so on—are still needed. The objective of this study was to determine whether libraries that have implemented an LSP (Alma, Sierra, or WMS) are still using external tools or whether they are able to complete their ERM workflows entirely within the library services platform. It is hoped that the results of the study will be of use to any library considering LSP products, and that they will also provide LSP vendors with valuable information they can use to continue to refine their products.

The study seeks to answer the specific research question: Do LSPs provide libraries the ability to manage electronic resources completely within the system without the need for additional tools? By surveying staff at libraries that have implemented Alma, Sierra, or WMS, the study was designed to measure what specific ERM tasks are performed inside or outside the LSP systems, and thereby capture valuable data that can help librarians determine how these newest library systems affect ERM.

Literature review

ERM in library services platforms

Little has yet been written on how LSP systems are used by libraries for ERM. Much of the available literature consists of accounts of library migrations to LSPs or reviews of LSPs. Both of these sources, however, can provide some insight into how ERM differs in the new systems.

Wilson's (2012) survey of LSP vendors includes descriptions of electronic resource functionality within Alma, Sierra, and WMS. All three vendors claim that their systems support robust ERM functionality: Alma functions include providing cost-per-use data and usage statistics, license management, and the ability to create multiple accounts per vendor; and Sierra functions include license management, usage statistics, and streamlined workflows. Wilson notes that WMS requires purchase of a separate License Manager module to manage electronic resources but that the module includes license management, vendor management, and an integrated knowledge base. While these descriptions provide some insight into how electronic resources could be managed within LSPs, they were obtained from vendor representatives and may or may not represent the actual experience of users.

There have been multiple studies published on library migrations to Alma, Sierra, and WMS that offer limited but interesting insight into the benefits and limitations of ERM in LSPs. The most detailed accounts are from WMS libraries—not surprising considering that WMS came to market a full 2 years before Alma or Sierra.

Several case studies written specifically about the migration process to WMS provide some useful detail on ERM in the system. Erlandson and Kuskie (2015) believe that WMS's unified framework allowed their library to streamline workflows and reduce redundancy. Except for realizing a reduction in the number of products and vendors required, however, no additional details are provided. They do note the inability of the system to harvest and store statistics or to link usage data to cost data.

Dula and Ye (2012) provide a detailed account of Pepperdine University's WMS migration, complete with screenshots of the old versus the new system. They state that WMS has improved their acquisitions workflow and describe two improvements to their electronic resource processes specifically: Librarians no longer need to manually load records but can just mark titles as "owned" in the WorldCat Knowledge Base, and they are now using OCLC's PubGet service to automatically update serials holdings. They do note, however, that they are still "ironing out some wrinkles" with the PubGet service (p. 132).

Bordeianu and Kohl (2015) also describe their library's migration to WMS in detail and note that OCLC provided an automated matching process for migration of electronic resource records. They describe, however, the order functionality in WMS as "clumsier and less user-friendly" (p. 287) than that of their previous Innovative system. The authors also explain that the new blended acquisitions and cataloging workflow caused "discomfort and confusion among staff" (p. 288). As with Erlandson and Kuskie (2015), the article notes the rudimentary report feature for collection analysis and statistics.

The literature on Alma implementations yields very little detail about ERM within the system. In his discussion of Purdue's experience as development partners for Alma, Bracke (2012) notes that they were able to have input into the design of Alma's ERM functions and that many of the ERM functions that had been problematic in their previous systems (Voyager/SFX) were improved. Details on specific ERM functions are not provided, however, and the conclusions were drawn prior to Purdue's production launch of Alma.

Branch (2014) provides an acquisitions department's perspective on Virginia Commonwealth's Alma migration and notes that Alma has streamlined library operations, integrated print and electronic resource processes, created efficiencies, and eliminated communication silos. She also discusses how the department's premigration workflow analysis revealed many inefficiencies, such as use of paper forms, gaps in workflows, and lack of system integration. No detailed analysis of the post-migration workflow, however, is provided.

Although Persing and Moon (2014) do not explicitly discuss electronic resources in their conference proceeding on print serial workflows in Alma, they do state that Alma can streamline workflows using task lists and work orders and also point out that Alma integrates resource formats into a single database.

Two reports of Sierra implementations focus primarily on the system's ability to use application programming interfaces (APIs) to integrate external services and do not address ERM functions. Atkinson (2012) discusses how the Orange County Library System took advantage of Sierra's API functionality to connect with external systems, and Padgett and Hooper (2015) explain how Sierra's Database Navigator Application (SierraDNA) can be leveraged to create custom APIs to improve library services.
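Although neither article includes sample code, a minimal sketch may help readers picture this style of integration. The host, credentials, and record ID below are placeholders, and the endpoint paths are assumptions based on Innovative's published Sierra REST API documentation; details may differ by API version.

```python
# Illustrative sketch of an external script calling a Sierra-style REST API.
# HOST, credentials, and the bib record ID are placeholders; endpoint paths
# are assumptions based on Innovative's published Sierra REST API docs.
import requests

HOST = "https://library.example.edu"         # placeholder Sierra server
KEY, SECRET = "client-key", "client-secret"  # placeholder API credentials

# Request an OAuth-style bearer token (Sierra uses client-credential auth).
token_resp = requests.post(f"{HOST}/iii/sierra-api/v6/token", auth=(KEY, SECRET))
token_resp.raise_for_status()
token = token_resp.json()["access_token"]

# Use the token to fetch a bibliographic record by its ID.
bib_resp = requests.get(
    f"{HOST}/iii/sierra-api/v6/bibs/1000001",
    headers={"Authorization": f"Bearer {token}"},
)
bib_resp.raise_for_status()
print(bib_resp.json().get("title"))
```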

Workflow gaps in traditional ERM systems

Although there are no studies that identify whether LSPs fully support ERM workflows, the gaps within the traditional ILS/ERMS environment are well documented.

A seminal survey of librarians and vendors conducted by Collins and Grogg (2011) identifies areas for improvement in ERM. Over a third of the librarians surveyed indicate that workflow and communications management are critically important yet poorly managed within their ERM system, if managed at all. Usage data and statistics management also fall short because silos within the ERM system prevent cost and usage data from being integrated, captured at both the individual title and package levels, and tracked over time.

Feather (2007), in her analysis of Ohio State University Libraries' communication network, demonstrates that complexity exists not only in the management of the electronic resources themselves, but also in the ways library staff communicate with each other about acquiring, accessing, and maintaining these resources. She notes the flurry of e-mail exchanges when problems arise and the staff's tendency to "deluge those who manage these resources with communications" (p. 204). Her analysis reveals that traditional ILSs are unable to capture and file important internal communications (e-mail, text files, and so on) or incorporate and internalize library workflows.

Rathmel, Mobley, Pennington, and Chandler's (2015) survey of electronic resource troubleshooting asked 226 participants from a wide variety of libraries to identify the tools they use to track e-resource problems. Most respondents (96%) identify e-mail as the tool they use most often when recording, tracking, and archiving problems. Of the survey respondents using an ERMS, only 23% are using it to track electronic resource problems to some or to a great extent, while a small number of respondents who "were about to implement an ERM hoped that the ERM would help with tracking incidents" (p. 96). The authors conclude that the next-generation library services platforms are still too new to be effectively assessed for their data, tracking, and information-management capabilities.

Use of multiple systems and tools for ERM

The complexity and ever-changing nature of ERM has made it necessary for libraries to invest in multiple software systems as well as use manual workarounds to support ERM workflows. This proliferation of ERM tools is evidenced in the literature. As recently as 2014, a review of electronic resource software systems shows that multiple systems—including the ILS, ERMS, and knowledge bases—are required to accomplish the full range of electronic resource tasks (Anderson, 2014).

Two surveys find widespread use of supplemental manual tools for ERM. England's (2013) survey finds that librarians store a wide range of data in different places: Administrative data supporting collection management, e-resource management, and licensing is stored in a combination of spreadsheets, e-mails, shared drives, and even paper files, depending on the type of resource and the librarian's preferences. A similar survey of academic librarians by Branscome (2014) finds that although electronic resources are predominantly managed within commercial library software systems, librarians also use more than 30 additional tools.

Blake and Collins (2010) describe the complexities of ERM as akin to "a sequence from a Kafka novel" (p. 242). Their in-depth interviews with 10 academic librarians who manage electronic resources found that all 10 use multiple systems, including knowledge bases, the ILS, link resolvers, A–Z lists, MARC record services, subscription agents, and ERMS. They also note that these commercial tools and services are not enough: librarians sometimes need to perform redundant or manual processes, such as loading records into multiple systems or manually updating records title by title.

Research method

For this study, an online survey was created using the Qualtrics survey tool and e-mailed directly to 445 library staff in academic libraries that had adopted one of the target library systems (Alma, Sierra, or WMS). Libraries were selected by searching the Library Technology Guides website (http://librarytechnology.org) for U.S. academic libraries that have licensed one of the target systems. E-mails were sent to 110 Sierra libraries, 150 WMS libraries, and 185 Alma libraries. Highly specialized institutions (such as seminaries, technical institutes, or culinary schools) were omitted, as were libraries with electronic resource budgets of less than $10,000 annually. Library staff e-mails were obtained from public-facing library websites, and job titles were used to identify staff involved in ERM tasks (e.g., “Electronic Resources Librarian”). E-mails were sent to professional and paraprofessional library staff. The survey was e-mailed on March 10, 2016, and remained open for 4 weeks, closing on April 7, 2016.

In addition to the direct mailings, the survey was posted to library listservs during the same time period. Listservs included ERM-related lists ([email protected], [email protected], http://www.eril-l.org) as well as technology and academic library lists ([email protected], [email protected]). Vendor/user group lists were also included for both Ex Libris and Innovative Interfaces, the vendors for Alma and Sierra, respectively.

Survey responses were entirely anonymous, and no attempt was made to identify either the library or the respondent. The only information solicited from the respondents was the library system they used. Because of this anonymity, as well as the broad audience of some library listservs, respondents may represent both academic and nonacademic libraries.

A test survey was presented at the ACRL New England Chapter Electronic Resources Management Interest Group (ACRL NEC ERMIG) meeting on November 18, 2015. The participation of ERM librarians at the ERMIG meeting resulted in major revisions to the survey instrument, including making the survey shorter and more focused on specific ERM workflow areas.

The survey consisted of an IRB consent form, two preliminary questions to identify the participant's library system, and 18 task-related questions designed to identify whether ERM tasks were performed using the target LSP systems. The study was approved by the Boston College Institutional Review Board, and all participants were required to read a consent form at the beginning of the survey and press a button indicating their consent. Participants were then asked to identify their library system by selecting one of the target LSPs from a dropdown menu. The dropdown menu also included an option for “other,” and participants who selected “other” were asked to provide their system name in a write-in field. Each subsequent question consisted of a brief description of an ERM task (e.g., “Record beginning and end dates of subscription”) followed by four possible radio-button responses:

a. We perform this task in our library system.
b. We perform this task outside our library system.
c. We perform this task both in and outside our library system.
d. I don't know or not applicable.

Participants could select only one response per task. The complete survey instrument is provided in Appendix A.

ERM tasks were constructed using the TERMS (Techniques for Electronic Resource Management) framework (Emery & Stone, n.d.) as well as the workflows of both Boston College and Tufts University. TERMS was used to provide a standard ERM workflow not specific to any institution or software system. The survey focused on the following three areas of the ERM workflow:

Section 1: Assessing a resource for purchase (TERMS 1)—7 questions

Section 2: Acquiring and implementing a resource (TERMS 2 & 3)—8 questions

Section 3: Assessing a resource for renewal (TERMS 4)—3 questions

Survey tasks were developed without any prior evaluation of the actual functionality of Alma, Sierra, or WMS. Instead, the survey was developed from the standpoint that there are certain core ERM functions, as outlined by TERMS, that an ERM system should be able to support. Because of this focus on non-system-specific ERM tasks, and the deliberate lack of prior knowledge of existing system functions, it is possible that some survey tasks are simply impossible to perform within an LSP. There is no way of knowing how participants responded in this situation: they may have chosen the "not applicable" response, indicated the function was performed "outside our library system," or skipped the question.

As tasks for different types of e-resources (databases, journal packages, single titles, and so on) can vary significantly, the survey asked participants to consider only one type of resource—a journal package. An example of a commonly licensed journal package (Oxford Journals Online) was provided for clarification.

Survey responses were analyzed using both Qualtrics and Microsoft Excel. Results were filtered within the Qualtrics survey tool to include only the target systems. Any response from participants indicating use of a library system other than Alma, Sierra, or WMS was removed prior to analysis. Results were then exported to Microsoft Excel, and totals and percentages were calculated for each of the four previously described responses. Results were generated for each task, each library system, each section area, and all library systems combined.
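As an illustration of this tabulation step, the following minimal sketch shows how per-task, per-system percentages could be computed from a raw CSV export. It is not the authors' actual procedure; the file name and column names are invented for the example.

```python
# Minimal sketch (not the authors' actual script) of the described analysis:
# tally the four response options per task per system, then compute percentages.
import csv
from collections import Counter, defaultdict

RESPONSES = ["in_system", "outside_system", "both", "dont_know_na"]
TARGET_SYSTEMS = {"Alma", "Sierra", "WMS"}

# (system, task) -> Counter of the four response options
counts = defaultdict(Counter)

with open("survey_export.csv", newline="") as f:  # hypothetical Qualtrics export
    for row in csv.DictReader(f):
        if row["library_system"] not in TARGET_SYSTEMS:
            continue  # responses naming other systems were discarded, as in the study
        counts[(row["library_system"], row["task_id"])][row["response"]] += 1

for (system, task), tally in sorted(counts.items()):
    total = sum(tally.values())
    pcts = {r: round(100 * tally[r] / total) for r in RESPONSES}
    print(system, task, pcts)
```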

Results

The number of completed survey responses received for Alma, Sierra, and WMS library systems totaled 299. Alma libraries made up 43% of responses, while Sierra and WMS made up 40% and 16%, respectively. Additionally, 55 responses were discarded because they did not indicate one of the three target systems.

Assessing a journal package for purchase

The first section of the survey consisted of seven questions about ERM tasks performed when assessing a journal package for purchase. The task-specific questions were preceded by the prompt: “When evaluating a journal package (e.g., Oxford Journals Online) for purchase, where do you perform the following tasks?” Six of the questions asked participants where they recorded information about the journal package, including access type, number of users, cost, library fund, vendor contact information, and trial dates. One question asked where participants performed their evaluation of the journal package title list and coverage dates.

Responses from Alma users showed that many perform prepurchase assessment tasks within the system. Question 1.5, “Record the library fund used to pay for the resource,” had the highest percentage (54%) of respondents indicating they perform this task within Alma. The questions about recording vendor-contact information (Q1.6) and recording the type of access (Q1.1) also had high percentages performing the tasks within Alma: 41% and 39%, respectively. Recording trial start and end dates (Q1.7) had the highest percentage performing this task outside of Alma, at 49%. The task of evaluating the title lists and coverage dates (Q1.3) received the highest percentage of users (37%) needing to use Alma as well as an outside tool.

For Sierra, "recording the library fund for purchase" (Q1.5) also received the highest percentage of users performing the task within the system (48%). Unlike Alma users, most Sierra users (67%) evaluate title lists entirely outside the system; only 8% indicate they can evaluate title lists within Sierra. Tasks often requiring both Sierra and another tool are recording the type of access (Q1.1) and recording the cost of the resource (Q1.4), with 27% and 28% of users, respectively, indicating they perform these tasks both in and outside Sierra.

Prepurchase journal package assessment tasks are less frequently performed within WMS than within either Alma or Sierra: none of these tasks received more than 28% of users indicating they are performed within WMS. As with Alma, the task most often performed outside WMS is recording trial start and end dates (Q1.7), with 64% of users indicating this is done outside the system. Recording the cost of the resource (Q1.4) and the type of access (Q1.1) are also tasks many users perform outside the system; 47% of respondents chose "outside the library system" for each of these tasks. As with Alma, the task for which the most users rely on multiple tools is evaluating title lists (Q1.3), with 25% of WMS users choosing this option.

Alma had the highest percentage of assessment tasks performed within the library system (34%), while WMS had the lowest (20%). WMS had the highest percentage of assessment tasks performed outside the system (48%). WMS also had a relatively high percentage (17%) of "don't know or not applicable" responses, which could indicate that some of these tasks are not applicable to the WMS system (see Table 1).

Table 1. Section 1: Assessing a journal package for purchase.

Acquiring and implementing a journal package

The second section of the survey consisted of eight questions in the area of acquiring and implementing a journal package. As in Section 1, task-specific questions were preceded by a prompt: "When licensing and implementing a journal package, where do you perform the following tasks?" This section included two questions identical to those asked in the first section: "record the type of access—e.g., perpetual, rolling, one-time purchase, open access" (Q2.6) and "record the number of users—e.g., single user, multiple users, site license" (Q2.7). These tasks were included in both sections because they may be performed either during the prepurchase assessment stage or during the implementation stage, depending on a given library's workflow. Questions unique to this section cover licensing; recording information such as subscription dates, title lists, and administrative data; and activating journal package titles.

As with the prepurchase assessment tasks, many Alma users indicate they perform implementation tasks within the system. Journal title activation (Q2.8) had the highest percentage of any question in the survey: 91% of Alma users perform this task within the system and 0% activate journal titles outside the system. Other tasks that generated high in-system percentages for Alma were recording subscription dates (66%) and recording type of access (54%). At 40%, the task most performed by the Alma users outside the system is recording administrative data (Q2.3).

Sierra had the lowest in-system percentages for this section. In stark contrast to Alma, Sierra users generally do not activate journal titles within the system: only 9% indicated they perform this task in Sierra, while 47% perform the task outside Sierra. At 46%, the implementation task most performed within Sierra is recording subscription dates (Q2.4), and the task performed most often outside the system (62%) is storing or linking to the journal package license (Q2.2).

As with Alma, the implementation task most performed within WMS is activating journal titles (Q2.8), at 69%. Also as with Alma, recording administrative data is the task most performed outside the system (63%). Percentages for tasks performed both within and outside WMS all fall within a fairly close range, from 19% to 28%.

Alma had the highest percentage of implementation tasks performed in the library system (49%), while Sierra had the lowest (27%). WMS had the highest percentage of implementation tasks performed outside the library system (44%; see Table 2).

Table 2. Section 2: Acquiring and implementing a journal package.

Assessing a journal package for renewal

The third and final section of the survey consisted of three questions about tasks performed when assessing a journal package for renewal. The prompt read, "When determining whether to renew a journal package, where do you perform the following tasks?" The three assessment tasks were calculating cost per use of journal titles, comparing usage statistics with those of other packages, and viewing usage statistics.
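For context on the first of these tasks, cost per use is the ratio of a package's cost to its usage, typically computed outside the LSP from invoice figures and COUNTER usage reports. A worked illustration, with invented numbers:

```python
# Worked illustration of the cost-per-use calculation behind Q3.1.
# The cost and usage figures are invented for the example; in practice they
# would come from invoice data and COUNTER usage reports.
annual_cost = 12500.00   # package subscription cost for the year (USD)
annual_uses = 4800       # e.g., COUNTER full-text downloads for the year

cost_per_use = annual_cost / annual_uses
print(f"Cost per use: ${cost_per_use:.2f}")  # -> Cost per use: $2.60
```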

These three assessment tasks are overwhelmingly performed outside the system for all three LSPs, and percentages are comparable across systems. Cost-per-use assessment (Q3.1) is performed in Alma by 5% of users, in Sierra by 4%, and in WMS by 6%. Results for comparing usage statistics with other packages, titles, or both (Q3.2) are almost identical for Alma and Sierra: 6% of Alma users and 5% of Sierra users perform this task within their respective systems. WMS results were higher, with 13% of users indicating they can accomplish this task in the system. At 16%, WMS also had the highest percentage of users indicating they view usage statistics within the system (Q3.3), with Alma and Sierra trailing at 6% and 4%, respectively.

The low in-system functionality for these renewal assessment tasks may indicate that usage statistics functionality is either unavailable in the systems or available but too problematic to be useful. Indeed, Erlandson and Kuskie (2015) noted that usage statistics were as yet unavailable in WMS. Despite this, WMS had the highest percentage of renewal assessment tasks performed inside the library system (12%), while Sierra had the lowest (4%); 85% of Sierra respondents indicated these tasks are performed outside the system. Alma had a relatively high percentage of "don't know or not applicable" responses (17%), which may also indicate a lack of usage statistics functionality within Alma (see Table 3).

Table 3. Section 3: Assessing a journal package for renewal.

Summary of all survey responses

There is little correlation among the three systems as to which ERM tasks are most frequently performed within the library systems. Only one task ranks within the top five most frequently performed tasks across all three systems: "record package details, including full list of titles and coverage dates." Other similarities are limited to two of the three systems. For example, "activate all journal titles" is the task most frequently performed within both Alma and WMS, but it ranks 12th of 18 for Sierra. The task most frequently performed within Sierra is "record the library fund that will be used to pay for the resource"; this same task ranks fourth for Alma users and 12th for WMS users (see Table 4).

Table 4. Tasks performed entirely within systems.

The tasks most frequently performed outside the LSPs are much more consistent. As previously noted, the tasks associated with assessing a journal package for renewal rank within the top five tasks performed outside the library system across all three LSPs. In addition, a number of renewal assessment tasks are performed outside the library system by Sierra and WMS users more than 50% of the time (see Table 5).

Table 5. Tasks performed entirely outside systems.

Conclusion and future study

This study explored whether libraries can manage electronic resources completely within library services platforms or whether additional tools are still required. The results clearly indicate that many Alma, Sierra, and WMS libraries are still performing core ERM tasks outside their systems. The results also provide insight into which ERM workflow areas and tasks are fairly well supported by these systems and which are not.

While Alma and Sierra had comparable percentages for in-system tasks associated with assessing a journal package for purchase, Alma appeared to outperform the other systems in the area of acquiring and implementing a journal package, scoring the highest percentages on seven of the eight questions. All three systems performed equally poorly in the area of assessing a journal package for renewal. These results indicate that usage statistics and cost-per-use data are not yet useful within LSPs and that this is the workflow area with the greatest need for improvement.

The gaps remaining within LSP systems become especially evident when results for "we perform this task outside our library system" are combined with those for "we perform this task both in and outside our library system." Both responses indicate that some additional tool or system is being used to perform the task. Combined results indicate that many libraries still rely on outside systems, either fully or in conjunction with their LSP, when performing ERM tasks (see Table 6).

Table 6. Gaps in LSPs—Tasks performed either partially or entirely outside systems.
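The combined figures in Table 6 are a straightforward sum of the two response categories, as the brief sketch below illustrates; the percentages shown are invented for illustration, not values from the survey.

```python
# How the "gap" figures combine two response categories; the percentages
# here are invented for illustration, not values from the survey.
outside_pct = 49  # "we perform this task outside our library system"
both_pct = 27     # "we perform this task both in and outside our library system"

gap_pct = outside_pct + both_pct  # share relying at least partly on an external tool
print(f"{gap_pct}% of respondents use an external tool for this task")  # -> 76%
```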

Future study

As this survey assessed only a subset of ERM workflows, it provides only a partial look at system gaps. The survey questions were narrowly focused on journal package tasks and only on tasks that take place during prepurchase evaluation, implementation, and assessment for renewal. Further study is needed to determine what gaps exist for other types of resources (e.g., electronic books, databases, open-access resources, and so on) as well as other areas of the ERM workflow.

Another possible area for further exploration is examining the gaps identified in this study in greater detail. This study discovered which tasks are or are not being performed in LSP systems but did not attempt to learn why. Additional research could include in-depth interviews with the electronic resource librarians who use these three systems to address questions such as the following: Which specific additional tools are being used and why? Are tasks performed externally due to a complete lack of functionality within the systems, or are functions available but in need of improvement?

This study also did not compare the gaps in LSPs with gaps that had previously existed in the traditional ILS/ERMS environments. There is need for a comparative study that explicitly measures any improvements gained by moving to these new systems.

This research suggests that ERM remains a complex process that is, as yet, too daunting to encompass within any one software system. It hints that electronic resources workflows may still involve convoluted manual workarounds and a patchwork of tools and that LSPs need further development. It is possible that LSPs, as single, monolithic systems, are not the answer, and that truly streamlined, customized, and integrated workflows will not be possible until the next iteration of library systems. Until either LSPs address the gaps in ERM workflows, or new systems are developed to replace LSPs, however, we will not know whether ERM can be truly automated or whether it will remain a complex, clumsy, and arduous endeavor.

References

  • Anderson, E. K. (2014). Elements of electronic resource management. Library Technology Reports, 50(3), 11–22.
  • Atkinson, W. E. (2012). The Orange County Library System environment: Connecting Sierra with custom applications on the Web. Information Standards Quarterly, 24(4), 27–32.
  • Blake, K., & Collins, M. (2010). Controlling chaos: Management of electronic journal holdings in an academic library environment. Serials Review, 36(4), 242–250.
  • Bordeianu, S., & Kohl, L. (2015). The voyage home: New Mexico libraries migrate to WMS, OCLC's cloud-based ILS. Technical Services Quarterly, 32(3), 274–293. doi:10.1080/07317131.2015.1030267
  • Bracke, P. J. (2012). Alma at Purdue: The development partnership experience. Information Standards Quarterly, 24(4), 16–20.
  • Branch, D. (2014). Alma in the cloud: Implementation through the eyes of acquisitions. Proceedings of the Charleston Library Conference. Retrieved from http://dx.doi.org/10.5703/1288284315322
  • Branscome, B. A. (2014). Management of electronic serials in academic libraries: The results of an online survey. Serials Review, 39(4), 216–226. doi:10.1080/00987913.2013.10766402
  • Breeding, M. (2011). A cloudy forecast for libraries. Computers in Libraries, 31(7), 32–34.
  • Breeding, M. (2015). Library services platforms: A maturing genre of products. Library Technology Reports, 51(4), 1.
  • Collins, M., & Grogg, J. E. (2011). Building a better ERMS. Library Journal, 136(4), 22.
  • Dula, M. W., & Ye, G. (2012). Case study: Pepperdine University Libraries' migration to OCLC's WorldShare. Journal of Web Librarianship, 6(2), 125–132. doi:10.1080/19322909.2012.677296
  • Emery, J., & Stone, G. (n.d.). TERMS: Techniques for electronic resource management. Retrieved from https://library3.hud.ac.uk/blogs/terms/
  • England, D. (2013). We have our ERM system, it's implemented: Why am I still going here and there to get the information I need? The Serials Librarian, 64(1–4), 111–117. doi:10.1080/0361526x.2013.760148
  • Erlandson, R. J., & Kuskie, J. (2015). To boldly go where few have gone before: Global e-resource management in the cloud. The Serials Librarian, 68(1–4), 215–222. doi:10.1080/0361526X.2015.1017421
  • Feather, C. (2007). Electronic resources communications management: A strategy for success. Library Resources & Technical Services, 51(3), 204–211, 228.
  • Padgett, J., & Hooper, J. (2015). SierraDNA: Demonstrating the usefulness of direct ILS database access. Code4lib Journal, 30. Retrieved from http://journal.code4lib.org/articles/10924
  • Persing, B., & Moon, Y. J. (2014). The end of Nostradamus: Killing predictive check-in without feeling guilty. The Serials Librarian, 66(1–4), 115–122.
  • Rathmel, A., Mobley, L., Pennington, B., & Chandler, A. (2015). Tools, techniques, and training: Results of an e-resources troubleshooting survey. Journal of Electronic Resources Librarianship, 27(2), 88–107. doi:10.1080/1941126X.2015.1029398
  • Wilson, K. (2012). Introducing the next generation of library management systems. Serials Review, 38(2), 110–123. doi:10.1080/00987913.2012.10765438

Appendix A. Survey Instrument

A. What is your CURRENT library system?

    Ex Libris Alma
    OCLC WorldShare Management Services
    Innovative Interfaces Sierra
    Other

B. If you selected OTHER, please write the name of your CURRENT library system below:

Section 1: Assessing a journal package for purchase

When evaluating a journal package (e.g., Oxford Journals Online) for purchase, where do you perform the following tasks?

Section 2: Acquiring and implementing a journal package

When licensing and implementing a journal package, where do you perform the following tasks?

Section 3: Assessing a journal package for renewal

When determining whether to renew a journal package, where do you perform the following tasks?