Five strategies for humanely conducting surveys in higher ed during a global crisis
The world has changed drastically in the last few months, and so have the challenges facing our communities. Decision-making informed by evidence, gathered and acted upon quickly, is as important as it has ever been for higher education leaders, if not more so.
These are not normal circumstances for conducting research, let alone working or living. Under normal circumstances, my colleagues and I might start the development of a major survey by building an advisory board to brainstorm thematic areas of coverage before starting to draft the instrument. We might take a few weeks to iterate on the instrument prior to conducting cognitive interviews with a dozen pre-testers. We might need a few weeks, if not months, to analyze and synthesize the results into a final report.
Over the past month, my team at Ithaka S+R has unexpectedly led and collaborated on the development of three surveys (with a fourth to be announced tomorrow) for different communities affected by the global pandemic. These include projects that focus on tracking key steps that academic libraries are taking related to their physical buildings, services, collections, and staffing; helping college and university leaders as they grapple with challenges related to moving instruction online, ensuring that students are supported holistically, and beginning to forecast retention; and informing what our ITHAKA colleagues need after four weeks of working from home.
Certain principles have guided the development of all of these surveys. These principles, detailed below, generally boil down to minimizing harm and maximizing possible benefits—which has always been at the heart of our survey development process, but which we must interpret differently in light of today’s circumstances.
…Maybe don’t field a survey at all
To contradict everything you’re about to read: maybe now is not the right time to field a survey. As my colleagues and I have considered fieldwork over the past month, we’ve first reflected on the extent to which a particular survey is related to the pandemic—and, if it is not, we have forgone or suspended fieldwork. We’ve also asked ourselves, “Is this a survey that people will have the bandwidth to respond to right now? Is there existing data we can consult instead?” It’s safe to assume that the individuals we’re seeking to survey are already stretched thin by other professional and personal demands at the moment—so we’re doing whatever we can to avoid burdening them with yet another request.
To this end, there are multiple survey initiatives that we have shuttered for the time being. We launched a national survey of art museum directors in late February—when there were roughly a dozen cases of COVID-19 in the US—intending to leave the survey open through April. We decided to cut our fieldwork short and close the survey after only a few weeks of gathering responses. Similarly, in early March, we strongly recommended that our academic library local survey partners suspend or hold off on fielding their surveys on the information practices and preferences of faculty members and students.
But just because it may not make sense to field certain surveys at this time, we shouldn’t give up on using evidence to inform decision-making. We do, however, need to take into account certain principles given the current climate.
Acknowledge the circumstances
At the outset of a survey, it’s critical to disclose certain information—what the survey covers, how long it will take to complete, how the data will be used, and how respondents can consent to participate. Now, however, additional steps need to be taken to acknowledge the very difficult circumstances that many individuals are facing, both professionally and personally.
For our COVID-19 Student Survey, the invitation and reminder emails emphasize at the outset the ways that the survey results will help support students during this complicated and disruptive time. And the survey focuses not just on how students are doing with their coursework but on their lives more broadly, including their physical and mental well-being. Similarly, our internal ITHAKA employee survey starts with an explanation of how the survey is but one part of the organization’s ongoing efforts to put the health, safety, and well-being of employees first while continuing to advance the organization’s mission and support our partners, participants, and users.
And it’s worth noting that people often want the chance to offer feedback right now. For our internal ITHAKA staff survey, we saw unprecedented participation: 70 percent of all employees completed the survey within an eight-hour period. If you are fielding a survey that can be beneficial during the crisis, you may see greater levels of response than expected.
Make it easy
An important principle of survey design is ensuring that those who are taking the survey consistently understand what is being asked of them—both in terms of the survey questions themselves and also the commitment involved in completing the survey. This is especially important during a crisis.
We know that this isn’t the time for overly complicated, theoretical, or perhaps even open-ended survey questions. We’ve needed to think critically about what is best suited for a survey (and what’s not) and examine every question to assess the ease of responding. In the COVID-19 Student Survey, for example, we made a deliberate decision to focus most questions on current and very recent past behavior. Several questions ask students about the activities they have performed in the past week or how they are currently doing. We’ve deliberately avoided asking respondents to speculate about future behavior or preferences right now.
And, we have quickly iterated on draft instruments based on community feedback. For the survey of US academic libraries, which Lisa Janicke Hinchliffe and I developed and launched in roughly 10 hours, we solicited the library community to conduct self-paced cognitive walk-throughs of the questionnaire once a draft was developed—and many offered their help in this process over a period of about 90 minutes.
Focus on equity
Certain groups have already been, and are likely to continue to be, disproportionately affected by the pandemic. Students from historically underserved backgrounds are disproportionately struggling to gain sufficient access to technology for completing remote coursework. Institutions that serve higher proportions of these students are disproportionately affected in turn.
In our COVID-19 Student Survey, we have a series of demographic questions—including items on socioeconomic status, first-generation college status, gender, and race/ethnicity—that will help capture this information so that we can categorize different subgroups of respondents when analyzing results. For the survey of US academic libraries, we have asked participants to include a unique identifier (IPEDS UnitID) at the outset of their response so that we can tie in institutional descriptors such as sector, Carnegie Classification, or geographic region. If we focused only on aggregate findings, we might miss the parts of the sample most in need of additional support.
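For readers who handle this kind of data themselves, the linking step can be sketched in a few lines. This is a minimal, hypothetical illustration—the unit IDs, descriptor fields, and survey item names below are invented, not drawn from our instruments—of how a unique identifier lets you roll responses up by an institutional descriptor rather than only in aggregate:

```python
# Hypothetical sketch: joining survey responses to institutional descriptors
# via a unique identifier (a stand-in for IPEDS UnitID), then tallying one
# survey item by institutional sector. All IDs and field names are invented.
from collections import Counter

# Lookup of institutional descriptors, keyed by unit ID.
institutions = {
    "100654": {"sector": "Public 4-year", "region": "Southeast"},
    "110635": {"sector": "Public 4-year", "region": "Far West"},
    "217156": {"sector": "Private 4-year", "region": "New England"},
}

# Survey responses, each tagged with the respondent's unit ID.
responses = [
    {"unit_id": "100654", "building_closed": True},
    {"unit_id": "110635", "building_closed": True},
    {"unit_id": "217156", "building_closed": False},
]

def tally_by(descriptor: str, item: str) -> Counter:
    """Count affirmative answers to `item`, grouped by an institutional descriptor."""
    counts = Counter()
    for response in responses:
        inst = institutions.get(response["unit_id"])
        if inst and response.get(item):
            counts[inst[descriptor]] += 1
    return counts

print(tally_by("sector", "building_closed"))
```

The same pattern supports grouping by region, Carnegie Classification, or any other descriptor carried in the lookup, which is exactly what makes the subgroup analysis described above possible.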
Make it actionable
Above all else, we’re aiming to make sure that our surveys are actionable. Asking questions that are “interesting” or “nice to know” is a luxury right now that we cannot afford.
In developing our recent surveys, we have seriously considered each question and asked, “How will we (or our partners) act on the findings? How quickly? Are we offering a false promise by asking about difficulties on this particular topic?” If there is an area where you know that you cannot offer support based on responses, and you know that you cannot connect with others who can provide support, it may not be worth asking.
In an early draft of the ITHAKA staff survey, for example, we included questions on the general workloads and productivity of staff. While the initial objective was simply to understand patterns across the organization, after some reflection, we recognized that this information is (1) too general to be actionable; (2) potentially intrusive; and (3) likely best collected through a relationship between managers and their employees. Those questions were cut and replaced with a more direct statement on the flexibility that managers are giving to their staff at this time; staff can register their perceptions based on the past four weeks of working from home. If employees respond negatively to this statement, we already have next steps brainstormed—for example, direct intervention with managers.
Lastly, ensuring that results are made available as quickly as possible for decision-making has been essential. For these three surveys, we have either made the results available in real time or taken a first pass at the data after just a few days of fieldwork.
Conducting survey research over the last month has presented us with many unanticipated challenges. At times, it has been difficult to make the decision to wind down or postpone important research initiatives—and, frankly, it has sometimes been an adjustment to embrace some of the new practices outlined above. However, this has also been a time for further honing skills and capacities that are sorely needed—being flexible and responsive to community needs, quickly iterating on approaches, and developing new ways to get results back to key stakeholders in record time—and much of what we’ve learned will be infused into our regular and future research initiatives. We know that this is no time for maintaining the status quo, and we are committed to continuing to respond to community needs as they evolve.