Beyond the Celebrations
The disturbing things LIS students find out about real YA programming, and professionals’ ethical obligation to improve the YA experience
It becomes more difficult each year to convince LIS students that they need to demonstrate service impact on YAs.
It is especially difficult when they see so little professional commitment to it coming from practitioners in the field. Why do so few YA librarians exhibit curiosity about the outcomes their users derive from their professional interventions? Rarely, at any rate, do practicing librarians model this curiosity for their future colleagues.
Before you get too angry at this revelation, however, let’s establish a few basic definitions. First and foremost, let’s differentiate library inputs and outputs from outcomes.
- Inputs count the resources libraries offer their users: number of public access hours, number of computers and staff and programs, size of collections, and so on.
- Library outputs are things libraries traditionally count that occur as a consequence of inputs: visitor or “gate” counts, log-in hours on library computers, program attendance or “head-counts,” circulation statistics, etc.
Inputs and outputs help the library keep track of the resources it uses – something comparable year after year, for example.
Neither, however, addresses or reflects . . .
- The degree to which actual users value their visits, benefit from computer use, enjoy programs, or grow from borrowed materials. These are outcomes: measures that reflect impact and illustrate what users themselves found useful. Outcomes document the quality of the YA experience.
During the more than ten years I’ve been teaching graduate LIS students interested in youth services, I have assigned a field experience in which students evaluate a YA or youth services department or program. Students from all over the country take these classes – so we’re not targeting a particular region. What students find during these examinations of “real-life” YA services is that in only very rare circumstances are young adults asked about the degree to which they value, benefit from, enjoy, or grow from what libraries offer.
How, libraries are increasingly asked, do they know they’re throwing “strikes” (offering successful services)?
Yes, students collect, review, assess, and incorporate input and output measures into their overall analysis. Not all libraries willingly share these normally public statistics with LIS students, which is itself a concern. But most do. Thus, students analyze such things as the number of professional staff hours a library assigns to YA services, the number of summer reading program sign-ups the library records, and perhaps YA-specific circulation. These are not unimportant details, particularly when compared meaningfully across several years.
Nor does ignoring the degree to which young adults enjoy, value, appreciate, and grow from their experiences necessarily mean they do not enjoy, value, appreciate, and grow from them. They very well may do so.
But it does mean that few libraries exhibit curiosity about these questions, and that libraries seldom collect evidence about them. To be blunter, it means that, during the past ten years, overwhelmingly few libraries have even asked their YA users about their experience. How do libraries know they’re throwing strikes?
Moreover, it means that when next year’s planning comes around, those same libraries possess no YA user data with which to modify and improve the services they offer. Are these the best services the library can offer its YA community? How would it know? Are the appropriate resources (staff, skills, time, funding, space, planning, etc.) aligned to support the success of those services? How would it know? If they are not, what steps are necessary to better align resources with programming objectives? What other service or programming options might be better? Again, how would it know?
These questions cohere into one: To what degree do young adults value library offerings?
This is the question libraries must constantly engage to deliver professional-level YA services. Professionals must exhibit curiosity about the degree to which their interventions are valued. They must produce evidence of that value, and the evidence must be easily understood by the library’s many constituencies – especially the public.
Are there obstacles to asking/answering these questions? Yes, of course. But if YA services are ever to be regarded as a professional specialization and add meaningful public value to the library and to LIS, then professionals must pursue them.
My students must assess the obstacles their example libraries face when confronting their lack of user-centric evaluation. These obstacles are familiar from any paradigm or organizational shift – such as the change from an institutional reliance on input and output measures to user evaluation through outcomes.
Students properly assess, for instance, that libraries do not align sufficient time for such evaluations during their service/program planning stages. They assess that staff do not possess sufficient skill to evaluate their own offerings – a skill set these students will certainly not lack. Another common assessment is that organizational culture would reject user-centric evaluations.
In most instances, I would suggest that these issues form common information needs or requests. How do I/we better align planning stages to demonstrate program value and effectiveness for our users? What skills do I/we need to evaluate professional services? How can I/we change the organizational structure to adopt professional-level service evaluation?
Although this is a column, not a workshop, I’d like to serve temporarily as a reference librarian. In addition to several books and articles on outcomes, the Public Library Association currently offers Project Outcome (https://www.projectoutcome.org/) in response to these issues. The Project, free to all U.S. and Canadian libraries, addresses many of the outcome-related obstacles practitioners face in everyday practice: how measuring outcomes can help demonstrate community impact; a Project Outcome Toolkit; and examples of how to apply the toolkit in the library.
PLA’s Project Outcome is not intended simply for the otherwise privileged service profiles of adult and children’s services but is as applicable (and urgent) for YA services. During this period in which public services of all kinds find themselves under ideological and fiscal attack from so many quarters, YA services (already limping from a weak basis in research) must find a way to document the value YA users find in what libraries offer.
Libraries can begin by developing curiosity about what our YA users experience in library services, by developing evidence about those experiences, and by using that evidence to improve library offerings and sharpen our strike zones.