
Competency N

Evaluate Programs & Services Using Measurable Criteria

“Evaluation data can help you better understand your community, strengthen your program design & engage in continuous learning…”

– California State Library, Evaluation and Continuous Improvement

Evidence

INFO 210 – Evaluating E-mail Reference Services

For this assignment, I emailed the Los Angeles Public Library through its website and used the Reference and User Services Association (RUSA) guidelines to evaluate the reference service provided. I began by filling out a form that requested my name, telephone number, email address, library card number, street address, and reference question. I then set out to assess the approachability and interest of the library worker, but the first response turned out to be automated. I had asked about the origins of Robin Hood and whether he was based on a real person. The automated email informed me that, due to the special nature of my request, my message had been referred to the Central Library’s Literature & Fiction Department, which would respond as soon as possible.

Since it was an email inquiry, I was not kept in the loop about the search tactics or methods used. The next day, I received a follow-up email from a librarian. He typed up a couple of paragraphs of information and sent two links to related books. He stated that there was no easy answer to my question but did his best to explain the books he had retrieved and how they were relevant, which I very much appreciated. He ended the email by saying that if I wanted to pursue my question further, I could use the ‘Library To Go’ (curbside pick-up) system to place a hold on one of the books, or copy and paste “Robin Hood (Legendary Character)” into the library catalog search to find more titles on the subject. He then thanked me for using the library and ended the reference interview. This exercise made me appreciate the value of human interaction and the time library workers spend on patron inquiries. It also helped me become more familiar with the RUSA guidelines, which can be used to identify strengths and weaknesses within a library service and to suggest improvements.

INFO 284 – Evaluating Programs & Services in Archives 

In this discussion, I considered which data are most important to gather and share with internal stakeholders and the public, and who, within and outside the institution, would be interested in those data. Griffin (2020) describes how the SAA-ACRL/RBMS standard allowed their institution to focus on the process of gathering information rather than getting caught up in perceived value judgments about previous practices, a trap it is easy to imagine professionals falling into. Delegating evaluation tasks also appears to work well for some institutions: one staff member is responsible for entering statistics into LibInsight, while Griffin analyzes the data for approximately five hours a month (Griffin, 2020). This division of labor could be effective at short-staffed institutions.

After pondering the initial question, I concluded that the most important data depend on the archive’s setting. An archive in an academic setting, for example, would likely focus on user demographics, instruction, and reference transactions, whereas a museum archive might focus on user demographics, events, exhibitions, and online interactions. For the public and internal stakeholders generally, however, user demographics, online transactions, reference transactions, collection use, and reading room visits seem the most useful.

INFO 284 – Analyzing Patron Services 

For this assignment, I selected an institution and analyzed its online patron services. I chose the Beinecke Rare Book & Manuscript Library, part of the Yale University Library system. I first examined the appearance of the website’s home page, where the search bar caught my eye. Underneath the ‘quick search’ bar are additional search tools and limiters, including articles, archival collections, Orbis (run by Ex Libris), databases, digital collections, and journals. In the upper right-hand corner of the home page is a “Chat with Us” tab that, when clicked, prompts the user to ask a question. During the hours that chat is unavailable, a link takes the user to a searchable FAQ page that also lists additional ways to make contact by phone, text, or email, along with an organized list of subject specialists and personal librarians. There is, however, no indication of how long it will take to receive an answer once a reference question is submitted.

According to the website, the Beinecke Rare Book and Manuscript Library is one of the world’s largest buildings devoted to rare books and manuscripts. Clicking the Beinecke Rare Book link under the “Visit and Study” heading on the home page navigation bar takes the user to a well-organized page full of information: hours of operation, a Google Maps location, a telephone number, and an email address. Accessibility and access are discussed, as are the amenities provided during a visit (computers, cell phone service, outlets, etc.). Overall, the services are easy to locate, and location, hours, and reading room protocols are all available. The only improvement I would make is stating how long a patron can expect to wait to hear back about a reference question.

Conclusion

In closing, information professionals should regularly assess their institution’s programs, resources, and services to ensure they are meeting the needs of their user communities. Possessing these skills is crucial when leading a task force, project, or committee that involves planning, developing, and improving services. As a professional, I will use evaluation guidelines and standards to collect data, analyze and publish results, and contribute to decision-making. Following these best practices will help keep an archival or library collection and its services relevant, funded, and utilized.

Introduction

A unique challenge for information professionals is the need to constantly change and adapt as library and archival services evolve. Information professionals in the 21st century must be flexible enough to respond to political, health, climate, and industry-related changes, in addition to learning new processes and procedures to better support the communities they serve (Hirsh, 2022). Evaluating information services is crucial for enhancing them and tailoring them to the communities an organization serves. This process includes determining the results, costs, outcomes, impacts, successes, and other factors related to a library’s collections, programs, services, or resource use (Matthews, 2018).

In 2023, the Reference and User Services Association (RUSA) formed a task force to revise the Guidelines for Behavioral Performance of Reference and Information Services. The update removed the distinction between in-person and virtual reference, incorporated equity, diversity, inclusion, and access (EDIA) principles and behaviors, and added an evaluation section that addresses the growing problem of misinformation and disinformation. These guidelines give professionals best practices both for serving information-seeking patrons and for improving services and programs.

The Importance of Evaluating Programs & Services

Richard Orr suggests that a manager has four major responsibilities: to define the goals of the organization, to obtain the resources needed to reach those goals, to identify the programs and services needed to achieve them, and to see that the resources allocated to a particular activity are used wisely (Matthews, 2018). These concepts also apply to information professionals who wish to conduct program, resource, and service evaluations.

In 2018, the Society of American Archivists approved the Standardized Statistical Measures & Metrics for Public Service in Archives and Special Collections Libraries. The standard was developed to provide archivists and special collections librarians with a set of precisely defined, practical measures, based upon commonly accepted professional practices, that can be used to establish statistical data collection supporting the assessment of public services and their operational impacts at the local institutional level (SAA-ACRL/RBMS Joint Task Force, 2018). Its purpose is to help these organizations establish statistical data collection, use those data to assess their public service operations, and improve the policies and practices of those services.

Reporting the Results 

Once the data have been gathered and analyzed, a report should be prepared that describes the evaluation effort and the results achieved (Matthews, 2018). A professional evaluation report usually contains an executive summary, an introduction, a literature review, the data collection methodology, an analysis of the data, conclusions, recommendations, limitations of the study, appendices, and a bibliography or reference list. Reports should be short and concise and should use tables, charts, and graphics to highlight key findings (Matthews, 2018). Another consideration is how the report will be shared with stakeholders, coworkers, and administrators: will it be printed, sent as an email attachment, or given as a presentation? Gathering feedback from the committee, team, or task force before submitting the final version can also be beneficial.

Program, resource, and service evaluations should be conducted on an ongoing basis; libraries that do not engage in evaluation risk becoming irrelevant, underfunded, and underutilized (Matthews, 2018). It is also recommended that evaluation reports be submitted for publication in professional journals, since sharing evaluation procedures, ideas, and experiences benefits other professionals and encourages them to develop their own evaluation skills. One example is the City of Vancouver Archives annual report, which breaks down the profile of users not only at the archive but on social media as well (City of Vancouver Archives, 2021, p. 6). Having 1.1 million impressions on social media is an impressive number! The information is presented in plain language, free of dense archival jargon, which may encourage interest from the general public as well as current and potential stakeholders.

References

City of Vancouver Archives. (2021). Annual Report. https://vancouver.ca/files/cov/vancouver-archives-annual-report.pdf

Griffin, M. (2020). A methodology for implementing the Standardized Statistical Measures and Metrics for Public Services in Archival Repositories and Special Collections Libraries. Journal of Contemporary Archival Studies, 7(1), Article 14. https://elischolar.library.yale.edu/jcas/vol7/iss1/14

Hirsh, S. (Ed.). (2022). Information services today: An introduction. Rowman & Littlefield Publishers, Incorporated. ProQuest Ebook Central. http://ebookcentral.proquest.com/lib/sjsu/detail.action?docID=6891082

Matthews, J. R. (2018). Evaluation: An introduction to a crucial skill. In K. Haycock & M. J. Romaniuk (Eds.), The portable MLIS: Insights from the experts. Libraries Unlimited.

Reference and User Services Association. (2023). Guidelines for behavioral performance of reference and information service providers.  http://www.ala.org/rusa/resources/guidelines/guidelinesbehavioral 

SAA-ACRL/RBMS Joint Task Force. (2018). Standardized Statistical Measures & Metrics for Public Service in Archives and Special Collections Libraries. Association of College & Research Libraries. https://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/statmeasures2018.pdf