Don’t File It Away: Creating Actionable Survey Results
You’ve developed a thoughtful questionnaire, gathered responses, and generated meaningful findings, and now you want to share these results to more widely inform decision-making. How do you ensure that these findings reach the right individuals and are communicated effectively?
In spring 2016, Penn State University fielded the Ithaka S+R Undergraduate Student Survey and then faced the challenging task of communicating findings from a survey that covered the perspectives of 2,000+ students using 37 libraries across 20 campuses. The Ithaka S+R local surveys, which have now been fielded at 80+ institutions, enable academic libraries to better understand the changing needs of key campus communities and allow participating institutions to compare findings across their own campuses as well as with peer institutions.
Steve Borrelli, Head of Library Assessment at Penn State University, led this effort with great success and has graciously agreed to share some of those strategies.
–Christine Wolff-Eisenberg
We received the results of the Ithaka S+R Undergraduate Survey, conducted across 20 Penn State campuses in April 2016, during my third week in a new position as head of library assessment. I definitely hadn’t yet figured out how to operate in the organization, and I knew that this project would establish the reputation of our newly formed department. I wanted it to go well, so I collaborated with colleagues who understood the organization to develop a number of ways to share the results internally and to prepare products that would enable colleagues across the state to put the data to use locally.
Along with the full dataset of survey responses, we received a summary report of aggregate findings, which we made available internally. But because our 20 participating campuses varied in size from a few hundred students to over 47,000, it was important to start by breaking the results apart by campus and creating an individual report for each one. These reports graphically present the results of each question for a particular campus and contrast them with the results of similar campuses and with the overall results. This was no small task: with almost 200 questions in our administration of the survey, each report ran between 70 and 85 pages. The approach provided granular data that proved valuable in discussions with the head librarian of each campus. During these conversations, I discussed key findings for each campus and, in the process, armed the head librarians with new data about their students or facilities that could benefit their interactions with library or campus administrators.

These conversations were followed by a presentation to all non-University Park head librarians on trends in the data at the campus grouping level. After sharing the results with head librarians, we wanted to communicate the findings more broadly across the organization. We did so through internal blog posts of key findings, which we used to promote a libraries-wide forum. At the forum, we presented key findings alongside statistical analyses and demographic breakdowns of the results of key questions.
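To make the per-campus breakdown concrete, here is a minimal sketch of how a report table contrasting each campus with its campus grouping and with the overall results might be generated from the raw response file. This is an illustration, not our actual reporting pipeline: the file name and the identifier columns (respondent_id, campus, campus_group) are assumptions about the export’s schema.

```python
import pandas as pd

# Load the raw response file; file and identifier column names here are
# hypothetical, not the actual Ithaka S+R export schema.
df = pd.read_csv("undergraduate_survey_responses.csv")

# Treat every non-identifier column as a survey question.
id_cols = ["respondent_id", "campus", "campus_group"]
question_cols = [c for c in df.columns if c not in id_cols]

# Overall and campus-grouping averages, computed once for reuse.
overall = df[question_cols].mean(numeric_only=True)
by_group = df.groupby("campus_group")[question_cols].mean(numeric_only=True)

# One summary table per campus, contrasting the campus with its grouping
# and with the overall results (assumes each campus belongs to one group).
for campus, responses in df.groupby("campus"):
    group = responses["campus_group"].iloc[0]
    report = pd.DataFrame({
        "campus": responses[question_cols].mean(numeric_only=True),
        "campus_group": by_group.loc[group],
        "overall": overall,
    })
    report.to_csv(f"report_{campus}.csv")
```

The full reports Penn State produced added per-question graphics on top of tables like these; the sketch covers only the underlying comparison data.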
The dataset resulting from this survey can be intimidating to work with: it spreads across some 220+ columns (columns “A” through “HO” in the CSV we received). We knew that in order to encourage change as a result of what we’d learned, we needed to package the results in more digestible formats. First, we developed a summary dataset, which presents the results of each question at the campus, campus grouping, and overall level. Examining and comparing results within and across campuses was particularly informative for head librarians looking for evidence to support facility enhancements.

Next, we mapped the questions to functional areas (Instruction, Reference, PR, Development, IT, Collections, etc.), taking a first pass at identifying the subset of questions that relate to each area, and shared those lists, along with the summary dataset, with the respective leadership. I also presented the subset of questions we identified as related to library instruction to the local Instruction Community of Practice, discussing how different sets of questions could be used to inform practice. The survey asks about the types of assignments and resources students typically use; for instance, over 75% of respondents from the DuBois Campus reported that they regularly use responses to assigned readings in their current courses, whereas less than 30% of respondents from the Fayette Campus reported doing so in their coursework or research projects. Data like this provides insight when considered at the campus level. So once again, the aim was to arm colleagues with data they could use to improve local practices.

Our libraries also benefit from having strong Development and Public Relations and Marketing units. To make the survey data useful for them, we developed a series of key-findings statements and made them available for their use. Many of these statements have been integrated into library annual reports and other communication and outreach products. For instance, 89% of respondents agreed that the library’s main responsibility is to support student learning by helping them develop research skills and find, access, and make use of books, articles, data, images, and other resources.
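As an illustration of the mapping step, a hedged sketch of slicing the summary dataset by functional area follows. The area names come from the list above, but the summary file name and the individual question column names are hypothetical placeholders for the real question identifiers.

```python
import pandas as pd

# Load the campus-level summary dataset; the file name and question
# column names are hypothetical stand-ins for the real identifiers.
summary = pd.read_csv("summary_by_campus.csv", index_col="campus")

# First-pass mapping of question columns to functional areas.
area_questions = {
    "Instruction": ["received_library_instruction", "uses_assigned_reading_responses"],
    "Collections": ["uses_print_books", "uses_ejournals"],
    "IT": ["uses_library_computers", "uses_library_wifi"],
}

# Emit one slice per functional area, so each unit sees only the
# questions relevant to its own work.
for area, cols in area_questions.items():
    summary[cols].to_csv(f"summary_{area.lower()}.csv")
```

Keeping the mapping in one place also makes the first pass easy to revise as colleagues in each area suggest questions that should be added to, or dropped from, their lists.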
The approaches and products described above have also facilitated communicating the results outside of the Libraries. Administrators and colleagues have presented findings across the campuses. For instance, they relayed two important findings about World Campus students to the University Online Coordinating Committee: (1) these students reported receiving library instruction at significantly lower rates than on-campus students, and (2) these students do not value core library services as highly as on-campus students do. In addition, a subset of facility-related results was used as evidence in a space study at University Park, integrated into the program plan for an upcoming renovation of the Pattee and Paterno Libraries, and presented to the Library Development Board. Key findings have also been valuable to administrators and head librarians at the campuses in conversations with other institutional leaders.
In an organization as large and complex as the Penn State Libraries, our role as the Assessment Department is to make sense of a dataset like this and to communicate findings to colleagues by packaging them for broad consideration and use. We try to anticipate who can make use of the data and organize it accordingly, so that colleagues can use the findings to inform their daily practice. We’re over a year out from conducting the survey and are still actively analyzing it from different angles to learn more about our students.