The Library Assessment Conference took place last week in Seattle, offering a valuable forum for those gathering and using evidence in support of library management and planning. I attended with my colleague Alisa Rod, Ithaka S+R’s surveys coordinator. The program included a diverse set of presentations on topics ranging from information literacy to space planning, and Ithaka S+R’s local surveys were featured in a number of sessions.

Developing the Ithaka S+R student survey

Alisa and Heather Gendron, head of the Sloane Art Library and coordinator of assessment at UNC Chapel Hill, spoke about how the Ithaka S+R local student survey was developed and tested. Heather spearheaded a cognitive interview process to ensure that the language of the questionnaire is well suited to the population of students who take it, a process that resulted in a variety of modifications. Alisa reviewed other aspects of the development and testing of the questionnaire and methodology, including her use of factor analysis to quantitatively evaluate the questionnaire following pilot testing at six higher education institutions. This testing and improvement process is a real advantage of using a community survey such as the Ithaka S+R instrument, and I was glad to see the lively interest in the methodological techniques used to develop and test our questionnaire.

Maximizing response rates

Later in the program, Alisa spoke with Debbie McMahon of Baylor University about techniques for maximizing response rates to surveys of faculty members. They shared practices and tips for achieving buy-in from key library staff such as liaisons, for structuring invitations and reminder messages, and for crafting incentives that help to achieve project objectives. We work closely with each local survey participant on techniques for maximizing responses, and we are constantly adapting our implementation guidelines to provide best practices tailored to reaching various groups of faculty members and students.

Surveys and strategic planning

Lisa Hinchliffe, coordinator for Information Literacy Services and Instruction at the University of Illinois at Urbana-Champaign, organized a panel that focused on how to “close the loop” and establish real impact from survey projects. Eric Ackerman, head of Reference Services and Library Assessment at Radford University, spoke about LibQUAL+ and some of the service adjustments that its findings made possible. Heather Gendron spoke about running both the Ithaka S+R faculty and student surveys in the context of a major strategic planning process that has refocused the UNC Chapel Hill libraries on supporting the research lifecycle. Finally, Elizabeth Edwards, assessment librarian at the University of Chicago, described the Herculean task of performing a kind of meta-analysis of ten years’ worth of assessment in preparation for a new library director and in anticipation of fielding the Ithaka S+R survey of graduate students. All three institutions have translated survey findings into real impact of various kinds.

Comparative data

Heather also spoke on a panel with Andrew Asher, the assessment librarian at Indiana University Bloomington, that looked at the Ithaka S+R local surveys in different institutional contexts. Chapel Hill and Indiana, two large research-intensive public universities, have both now run the local student and faculty surveys. This gave them the opportunity to compare findings from students and faculty members both within and across the two universities, giving each library a clearer view of its own campus in light of campus-specific strategic priorities.

In an earlier post, I wrote about my own presentation at the conference on the potential role of assessment in library decision making.

There were many other terrific presentations across a range of topics, and overall it was interesting to see a real focus on the outcomes of assessment, which I believe will be a vitally important issue for our community in the coming years.