At ALA Midwinter in Boston, we attended several valuable sessions on assessment, evaluation, and data visualization. Here’s a roundup of what we heard.

ARL Library Assessment Forum

Kenning Arlitsch, dean of the library at Montana State University, reported on his grant project, Measuring Up: Assessing Accuracy of Reported Use and Impact of Digital Repositories. While at the Marriott Library at the University of Utah, Arlitsch observed that the reported use of the digital collections at the university’s three libraries was widely divergent. While some of this variation could be explained by the scope and mission of the libraries, Arlitsch discovered that they were using different methods to count usage. Log files count PDF usage very differently than analytics services such as Google Analytics. Google Analytics, for instance, does not track non-HTML file downloads via direct external links, resulting in serious undercounting. Log file stats, by contrast, are more likely to overcount. Because of these significant differences, Arlitsch is concerned that when libraries compare the usage of their digital resources, and especially their institutional repositories, to usage at peer institutions, they aren’t comparing apples to apples.
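To illustrate the kind of discrepancy Arlitsch described, here is a toy sketch (the log lines, filenames, and bot filter below are all hypothetical, not drawn from his project) of how a raw log-file count of PDF downloads shrinks once crawler traffic is filtered out. Note also that a direct PDF request never executes the JavaScript tag that page-based analytics such as Google Analytics rely on, so those downloads go uncounted on that side:

```python
import re

# Hypothetical Apache-style access log lines (illustrative only).
LOG_LINES = [
    '66.249.66.1 - - [10/Jan/2016:10:00:01] "GET /repo/item/123/thesis.pdf HTTP/1.1" 200 51200 "-" "Googlebot/2.1"',
    '198.51.100.7 - - [10/Jan/2016:10:02:14] "GET /repo/item/123/thesis.pdf HTTP/1.1" 200 51200 "http://scholar.example.edu/" "Mozilla/5.0"',
    '203.0.113.5 - - [10/Jan/2016:10:05:30] "GET /repo/item/123 HTTP/1.1" 200 8192 "-" "Mozilla/5.0"',
    '198.51.100.9 - - [10/Jan/2016:10:07:45] "GET /repo/item/456/article.pdf HTTP/1.1" 200 20480 "-" "Mozilla/5.0"',
]

# A minimal (deliberately incomplete) crawler signature.
BOT_PATTERN = re.compile(r"bot|crawler|spider", re.IGNORECASE)

def count_pdf_downloads(lines, exclude_bots=True):
    """Count GET requests for PDF files, optionally filtering known crawlers."""
    count = 0
    for line in lines:
        if '"GET' not in line or ".pdf" not in line:
            continue  # only count direct requests for PDF files
        if exclude_bots and BOT_PATTERN.search(line):
            continue  # crawler traffic inflates raw log counts
        count += 1
    return count

raw = count_pdf_downloads(LOG_LINES, exclude_bots=False)
filtered = count_pdf_downloads(LOG_LINES, exclude_bots=True)
print(raw, filtered)  # prints: 3 2
```

The raw log count (3) includes the Googlebot hit, while the last request — a human downloading a PDF directly, with no referring page — would be invisible to page-tag analytics entirely. Neither number is simply “right,” which is Arlitsch’s point about cross-institution comparisons.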

Bob Fox, dean of the University Libraries at the University of Louisville, discussed both the space assessments that have led, and are still leading, to major renovations of the Ekstrom Library and how he uses assessment of library resources to advocate for more funding from the university. His pie chart of LibGuide usage was particularly resonant: arts and sciences guides receive roughly half of the usage, while health sciences guides account for almost one third of the total. This type of visualization is a simple way to demonstrate the value of library resources to an important constituency. Clear visualizations of how Louisville’s University Libraries allocate their budget to collections as compared to peer institutions will also (he hopes) help him make the case for more funding.

Data Visualization for the Rest of Us: A Beginner’s Guide

Linda Hofschire, research analyst at the Library Research Service of the Colorado State Library, presented on strategies for designing across the “data visualization continuum,” from simple Excel charts to complex infographics. She shared tips on making data more accessible, stressing the importance of ensuring that the data displayed are meaningful and tell the story you want to convey without relying on additional text. She led attendees in useful exercises in critiquing charts and infographics and generating suggestions for improvements. Hofschire wrapped up the session by sharing her steps for success, noting that it is critical to understand your goals for presenting the data and the audience you hope to reach prior to designing any visualization. Slides from Hofschire’s presentation, as well as other relevant resources, can be found here: http://www.lrs.org/ala-mw-2016-data-visualization-for-the-rest-of-us-a-beginners-guide/.

ACRL Assessment Discussion Group

Convened by Nancy Turner, assessment and organizational performance librarian at Temple University, the ACRL Assessment Discussion Group generated meaningful discussion and debate on how to balance student privacy with assessment needs. Several librarians are experimenting with swiping student ID cards during every reference transaction and matching those transactions with student data, including major, GPA, and year. How much information about each reference transaction is appropriate to log? Should this include specific research questions, or might that raise privacy concerns? As one attendee noted, librarians should continually ask whether the value of the assessment data outweighs the privacy of the student. This issue was again highlighted as the group turned to library space usage. If we use transmitters or other means to see where students congregate in the library, do we get more useful data than from observations or interviews with students about their activities? Or, is it worthwhile to have an automatic system to know where students are so that library staff can bring resources directly to those students?

The group also discussed strategies for assessing library instruction. Most attendees are unhappy with the widespread practice of asking students to self-report what they learned at the end of a “one-shot” instruction session. One librarian reported that they are experimenting with surveying students at the end of the semester instead, asking what aspects of the instruction helped them fulfill the requirements of the course.

Evidence-Based Practice Discussion Group

At the Evidence-Based Practice Discussion Group, Jennifer Elsik, research and liaison services librarian at the University of Texas, Austin, discussed recent and ongoing changes to library spaces to meet the evolving needs of users. Elsik emphasized the importance of gathering input from students, both to obtain useful feedback for designing these spaces and to develop relationships and communities with those students. Nancy Adams, assistant librarian at the College of Medicine at Penn State, Hershey, spoke about a recent project that examined the ways librarians have partnered with instructors in the field of education to inform, provide resources, and develop materials about teaching evidence-based practices. Adams explained that in the field of education, the “way of knowing” is often based on the professional wisdom of educators themselves, which can undermine the perceived usefulness and importance of evidence-based practices. Finally, Adams led a discussion on whether evidence-based practice means making decisions based on data or making decisions based on the best available, scientifically rigorous evidence; attendees expressed that the two are on a continuum, and that while the latter is ideal, the former may sometimes be the best that we can do with the information available.

Hot Topics in Assessment Discussion Group

At the Hot Topics in Assessment Discussion Group, hosted by the LLAMA Measurement, Assessment, and Evaluation Section (MAES), Lisa Horowitz, assessment librarian at MIT, discussed her library’s assessment of liaison librarian outreach efforts to new faculty. Every liaison librarian was asked to take certain steps to interact with their faculty members and to document these interactions and outcomes on a standardized worksheet. In instituting this procedure, the library had several goals:

  • To increase new faculty members’ knowledge of library services
  • To create bridges between the resources the faculty members had access to at their previous institutions and the resources available at MIT
  • To build and strengthen relationships between faculty and the library
  • To create greater awareness of library resources and services for students by building awareness for faculty members

Maurini Strub, user experience and assessment librarian at the University of Louisville, discussed a recent project designed to better assess events hosted in the library. Now, everyone wanting to host an event needs to submit a proposal that outlines the event’s goals and expected outcomes. Each proposal also needs to describe the plan for assessing the success of the event.


Several of these topics are central to Ithaka S+R’s research program, including group study, the research support needs of faculty, and privacy. The questions and topics raised by librarians in these ALA sessions help us see where further investigation might be most fruitful.