Since 2000, Ithaka S+R has run the US Faculty Survey, which tracks the evolution of faculty members’ research and teaching practices against the backdrop of increasing digital resources and other systemic changes in higher education.  Starting in 2012, Ithaka S+R has offered colleges and universities the opportunity to field the faculty survey, and a newly added student survey, at their individual institutions to gain better insight into the perceptions of their faculty members and students.  More than 70 local faculty and student surveys have been fielded thus far and have enhanced Ithaka S+R’s expertise in higher education survey administration.

This post is the fifth in a series on survey administration best practices.  Ithaka S+R is often asked about our survey administration processes, and this series of blog posts explores both our experience in fielding surveys and current research from email marketers.  Before reading this blog post, you may want to start from the beginning of the series by learning how to ensure your survey invitation is received and opened.

After reading my earlier posts in this series, you now know how to craft effective invitation and reminder messages, determine when you will send your messages and what incentives, if any, you will offer, and ensure that your survey invitation is received and opened.

This post focuses on steps you can take in designing your survey to make it easy for respondents to complete.  These recommendations should be interpreted in the context of your organization, the population you are surveying, and the type of survey you are conducting, as your experience may vary depending on these factors.

The number of participants who complete your survey depends on all of these steps.

Provide essential information on the survey landing page

Once a respondent has clicked on your survey link and is redirected to the survey, it's important to provide them with information on the purpose of the survey, who is conducting it, how long it will take to complete, whether their responses will be confidential or anonymous, and, if applicable, any other information required by your IRB.  Colleges and universities that Ithaka S+R works with often encourage participants to print the text on this page, as they may not be able to return to it once they have started the survey.

Use survey logic

If certain questions in the survey are only relevant to a subset of respondents, use the logic tools within your survey platform to display these questions only to that audience.  This will make taking the survey more pleasant for respondents and will increase the likelihood that they complete it.  You also don't want respondents answering questions that aren't relevant to them, as this would affect the quality of the responses and the conclusions drawn from them. When it comes time for interpretation and analysis, make sure to take into account that such questions were only asked of a certain subset of overall respondents.
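To make the idea concrete, skip logic amounts to attaching a display condition to each question and showing only the questions whose condition is met.  The sketch below is a minimal, platform-independent illustration; the question names and conditions are hypothetical, and real survey platforms configure this through their own interfaces rather than code.

```python
# Minimal sketch of skip (display) logic, independent of any survey platform.
# Question IDs, text, and conditions are hypothetical examples.

questions = [
    {"id": "role", "text": "What is your primary role?", "show_if": None},
    {"id": "teaching_load", "text": "How many courses do you teach per term?",
     "show_if": lambda answers: answers.get("role") == "faculty"},
    {"id": "research_area", "text": "What is your primary research area?",
     "show_if": lambda answers: answers.get("role") == "faculty"},
]

def visible_questions(answers):
    """Return only the questions whose display condition is satisfied."""
    return [q for q in questions
            if q["show_if"] is None or q["show_if"](answers)]

# A staff respondent never sees the faculty-only questions,
# while a faculty respondent sees the full path.
staff_path = visible_questions({"role": "staff"})
faculty_path = visible_questions({"role": "faculty"})
```

The key design point is that the respondent's earlier answers, not the questionnaire as a whole, determine which questions appear next.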

Be mindful of sensitive questions

Respondents may answer sensitive questions with answers that they believe to be more aligned with the norm in order to avoid embarrassment or out of concerns around privacy or confidentiality.  Alternatively, they may skip a question they deem sensitive or stop the survey altogether when they arrive at one of these questions.  With this respondent behavior in mind, it's important to reassure respondents of the confidentiality or anonymity of their responses and to place these questions towards the end of the survey.  Even demographic questions, such as age, faculty status, or department, can at times be uncomfortable for respondents to answer, so think carefully about which questions your respondents might consider sensitive when designing your questionnaire.

Include important demographic questions

While some demographic questions can be considered sensitive in nature, it is important to include them if you want to stratify your results.  For example, if you want to analyze how respondents differ based on their departmental affiliation, you will need to include a question to this effect.  Alternatively, many survey platforms allow you to embed metadata as variables within the dataset, so instead of asking respondents for certain demographic information, you can tie this data to their responses.  Embedding metadata not only helps to cut down on the number of questions within a survey, which increases response rates, but is often more accurate and complete than self-reported information.  However, keep in mind that whether you are asking for demographic information or embedding it, there is a possibility that respondents could be re-identified, depending on the size and characteristics of the population you are surveying.
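In practice, embedding metadata is a join between response records and an institutional data source, keyed on a respondent identifier.  Here is a small sketch of that idea; the field names, IDs, and records are all hypothetical, and most survey platforms perform this join for you when you upload a contact list with extra columns.

```python
# Sketch: attaching institutional metadata to survey responses by respondent ID,
# rather than asking for that information in the questionnaire itself.
# All field names, IDs, and values below are hypothetical.

metadata = {  # e.g., drawn from an institutional directory export
    "r001": {"department": "History", "status": "Tenured"},
    "r002": {"department": "Biology", "status": "Pre-tenure"},
}

responses = [
    {"respondent_id": "r001", "q1": "Agree"},
    {"respondent_id": "r002", "q1": "Disagree"},
]

# Merge each response with its metadata record for later stratified analysis;
# respondents missing from the directory simply get no extra fields.
enriched = [
    {**r, **metadata.get(r["respondent_id"], {})}
    for r in responses
]
```

Because the join happens behind the scenes, the questionnaire stays shorter and the demographic fields come from the authoritative source rather than self-report, which is exactly the trade-off described above.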

Use a progress bar – maybe

Generally, respondents appreciate the appearance of a progress bar, which indicates what percentage of the survey has been completed and allows respondents to track their progress.  In many cases, it makes sense to include a progress bar for that reason.  However, if many questions in your survey are contingent on logic that you have set up, a progress bar may not always be accurate.  If the total number of questions used to calculate the progress bar is much higher than the number of questions that will actually appear for a given respondent, it may appear as though respondents are progressing much more slowly through the survey than they actually are.
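The distortion is easy to see with a quick back-of-the-envelope calculation.  The numbers below are hypothetical, but they illustrate how a progress bar computed against the full instrument understates a respondent's true position when skip logic hides most of the questions.

```python
# Hypothetical example: why a naive progress bar misleads under skip logic.

total_questions = 60     # all questions defined in the instrument
respondent_path = 20     # questions this respondent will actually see
answered = 10            # questions answered so far

# Progress measured against the whole instrument looks discouragingly slow...
naive_progress = answered / total_questions    # about 17%

# ...but measured against the respondent's actual path, they are halfway done.
actual_progress = answered / respondent_path   # 50%
```

Some platforms recalculate the bar as logic resolves; if yours does not, it may be better to omit the bar than to show respondents a number that understates their progress.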