Landscape of Library Service Quality Tools
Over the past year, I have had the chance to speak with many assessment librarians, library deans, and others in academic libraries about the tools they are using, or considering, for their planning and assessment projects. We typically connect because they have fielded, or are thinking about fielding, one or more of the Ithaka S+R Local Surveys, which focus primarily on the ways in which students and faculty members learn, teach, and perform research. Many are also weighing other assessment approaches, perhaps chief among them library service quality surveys, which generally center on the importance of, and satisfaction with, various aspects of the library. We are often asked what these library service quality tools cover and how they are administered, and to that end I have undertaken a high-level, side-by-side comparison of these tools.
The tools outlined below — LibQUAL+, LibSat, and MISO — are some of the most frequently used within academic libraries in the United States. The table below summarizes each tool; I have fact-checked each summary with its provider.
| | LibQUAL+ | LibSat | MISO |
| --- | --- | --- | --- |
| Parent organization | ARL | Counting Opinions | Bryn Mawr |
| Core thematic areas covered | Minimum service levels, desired service levels, and perceived service performance across three library dimensions (affect of service, information control, and library as place) | Satisfaction with and importance of aspects of in-library and online library services, policies, facilities, equipment, and resources | Importance of, satisfaction with, and frequency of use of library (place-based, online, and in-person) and computing services |
| Additional topics covered | Information literacy outcomes, library use, and general satisfaction | Likelihood to recommend, services used, and information-seeking preferences | Campus communications, tools used, and levels of skill |
| Ability to customize | Participants can add up to five additional local questions; participants can field a “lite” version with a reduced number of questions | Participants can localize questions and prompts to reflect local terminology and can remove or add questions; respondents may be able to select survey length (regular or in-depth) upon beginning the survey | Participants can include or exclude any items in the survey; participants can include additional locally developed questions |
| Survey administration | Administration handled by the participating institution | Continuous feedback gathered via the library website, email distribution, staff intercepts, and/or paper-based responses; respondents may also volunteer to receive invitations for annual survey follow-ups | All participants follow the same administration timeline |
| Response rates | Not reported (ARL does not collect this information, and libraries are not required to provide these data to ARL) | If a sample is defined, response rates can be generated for identified time periods, since responses are gathered continuously over an extended period (see the sketch below the table) | Most institutions see rates of 50% or higher |
| Institutional participants | 3,000+ surveys fielded to date by 1,390 institutions; 109 fielded in 2018; participation within and outside of North America; nearly all participants are higher education institutions | All participants within North America; many participants are not higher education institutions | 149 institutions to date; 40 participated in 2018-19; all participants within North America; all participants are higher education institutions |
| Pricing | $3,200 to participate, with discounts for repeat participation within one or two years | Annual subscription scaled to the size of the institution; quotes available on request | $2,200 for three core populations (faculty, undergraduates, and staff) |
| Platform | Independently hosted platform | Independently hosted platform | Qualtrics for survey administration; independently hosted platform for reporting |
| Deliverables for standard participation | PDF report with aggregate and stratified results by user group; raw data in CSV and SPSS formats; real-time access to comments; radar charts within the platform | Reports available within the platform, which can be viewed, segmented, and exported in real time as responses are collected; open-ended comments can be routed to responsible staff; raw data in XML, tab-delimited, or CSV formats | Aggregate results provided in Excel and PDF formats for each campus population; raw data in CSV and SPSS formats |
| Ability to benchmark and compare results | Can compare results with other institutions that participated in the same year, analyze results by user group and discipline, and download data tables and radar charts; an annual subscription is available for expanded access to data from all participating institutions from 2003 to the present | Can compare results within the platform with aggregate pooled results from all other institutions and across respondents from multiple libraries | Can compare results within the platform with any individual or combination of other participating institutions and across populations (e.g., faculty versus students) |
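A quick note on the LibSat response-rate row above: because responses accumulate continuously rather than during a single administration window, a rate is only meaningful relative to a defined sample and a chosen time period. Below is a minimal sketch of that calculation, assuming a defined sample size and illustrative field names (`respondent_id`, `submitted_at`) that are hypothetical, not drawn from any vendor's actual export format.

```python
from datetime import date

# Hypothetical response records from a continuously fielded survey;
# in practice these would come from the tool's raw data export, and
# the field names here are illustrative only.
responses = [
    {"respondent_id": 101, "submitted_at": date(2019, 2, 3)},
    {"respondent_id": 102, "submitted_at": date(2019, 3, 17)},
    {"respondent_id": 103, "submitted_at": date(2019, 9, 1)},
]

def windowed_response_rate(responses, sample_size, start, end):
    """Response rate for a defined sample over one time period.

    Counts distinct respondents whose submissions fall within
    [start, end] and divides by the size of the defined sample.
    """
    in_window = {
        r["respondent_id"]
        for r in responses
        if start <= r["submitted_at"] <= end
    }
    return len(in_window) / sample_size

# Example: a spring-semester rate against a defined sample of 250 invitees.
rate = windowed_response_rate(
    responses, sample_size=250,
    start=date(2019, 1, 1), end=date(2019, 5, 31),
)
print(f"Spring 2019 response rate: {rate:.1%}")  # 0.8% with this toy data
```

The point is simply that with continuous collection, the numerator is a filtered count of distinct respondents and the denominator is whatever sample the library defined for that window; with purely anonymous intake and no defined sample, there is no meaningful rate to compute.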
I recognize that many libraries are developing their own in-house surveys. While these cannot be compared in the same fashion as the tools above, institutions may find it useful to compare, on an individual basis, the scope of their home-grown instruments against these options. I hope this summary provides useful information for transparent, evidence-based decision-making, and I look forward to hearing how it is used as libraries consider their options for measuring service quality.
Comments
It's a few years old now, but I wrote an article, "Library Assessment and User Surveys in Academic Librarianship in the United States," that included overviews of and commentary on LibQUAL, MISO, Ithaka, and the HEDS Research Practices surveys. It was originally published in Italian (thank you, translator!), but the English version is in our institutional repository: http://hdl.handle.net/2142/78077
Thanks, Lisa! I appreciate the depth of examination of topics covered within each survey instrument in your article -- really interesting to see how some of the coverage has shifted and evolved just over the past few years.
Thanks for this helpful overview, Christine! Frankie Wilson and I have recently conducted a similar review in our (in press) book 'Putting Library Assessment Data to Work'. The ones you've listed are predominantly US focused, although LibQUAL+ has been widely adopted internationally. Another commonly used tool outside the States is the Insync Survey (formerly Rodski). It was developed in Australia around the same time as LibQUAL+ and has been used predominantly by the CAUL community. Hope that's helpful for your research.
You are absolutely right, Selena -- though there is some engagement with these tools outside of the United States, this list has very much been shaped by academic libraries within the US. I've amended the post to more clearly reflect this. Looking forward to when your book is available!
There is indeed a very large number of publications on LibQUAL+, which probably obscures the distinctions; however, it is worth pointing out that this is primarily because LibQUAL+ has added in a substantive way to the concept of library service quality as well as to the practical transformation of our libraries.
A primary driver in LibQUAL+ becoming established as a service by ARL was the support provided by the Fund for the Improvement of Postsecondary Education at the U.S. Department of Education; the effort is grounded in the academic rigor of scholarly and scientific communication.
As one of the co-principal investigators on LibQUAL+, I am delighted to see the impact half a million dollars in federal funds has had on the library profession, academic libraries, and higher education! The international reach and global impact of LibQUAL+ is one example of the kind of impact US higher education projects may achieve.
Research on library service quality as it relates not only to higher education but also to our broader social, educational, and civic infrastructures continues! QualityMetrics continues this research and has produced a rich record of how libraries improve the quality of our lives through 22 LSTA evaluation reports posted on the IMLS website.
It is my hope that we continue to see rigorous research, meaningful evaluation studies, and robust scholarly communication outputs on these issues in the future as well! Transforming libraries can transform our world for the better!
Martha Kyrillidou, PhD MLS MEd
Co-PI on LibQUAL+ and Principal for QualityMetrics LLC
http://www.qualitymetricsllc.com