Introduction

Higher education researchers need to employ effective outreach methods in order to connect with the populations they study. For surveys in particular, low response rates can lead to non-response error, decreasing generalizability and representativeness. To combat these issues, Ithaka S+R has developed and tested a suite of outreach strategies that we have employed over the past two decades in our long-running national faculty survey as well as our local surveys of faculty and students.[1]

In fall 2018, we surveyed students across seven community colleges to assess the value of and demand for proposed services designed to address student goals, challenges, and needs.[2] The present report describes the administration of this survey and what we learned from this experience. We discuss strategies and recommendations for creating survey communications, distributing surveys via email, and determining effective incentives. We also share existing research on these topics, as well as new information on how the use of different student email types and incentives can bolster engagement.

Reaching Students

Connecting with students is a critically important first step in successful survey implementation. Without effective channels to contact students, participation rates are likely to be low, potentially limiting the representativeness and generalizability of findings. For survey research in particular, many higher education institutions rely heavily on email to reach their student body. How effective is this form of communication—for both survey dissemination and broader messaging—given the increasing number of channels for reaching students?

A 2015 study conducted by Bowling Green State University, which surveyed 315 students from various majors about their email, social media, and texting habits, found that using email to connect with students was not without its pitfalls. Researchers found that 85 percent of students checked their university email every day, and that they were highly likely to open messages from faculty members.[3] However, over one-third of students (39 percent) said that they did not always read emails from academic advisors, and over half of students (54 percent) reported that they did not always open emails from the university or from academic departments. Emails from student organizations fared even worse, with 72 percent of students reporting that they avoided such messages.

In another study, conducted at Purdue University, students downloaded a computer program that tracked their computer habits.[4] The study found that, on average, students spent only six minutes of dedicated time per day on email. Because students were recruited for this computer usage study via email invitation, the actual amount of time students spend on email each day could be even lower. Conversely, it is also possible that students check their email more frequently than this study found, since email usage was tracked only on computers and not on smartphones or other devices.

Many students now arrive at college with personal email accounts and may not want to switch to a “temporary” university account. To help mitigate issues with students not using their institutional email accounts, some colleges have begun offering alternative approaches. Boston College, for example, stopped providing new students with email accounts and instead offered incoming first-year students an email forwarding service that delivers messages to a personal account.[5] Many universities have adopted similar email forwarding services and have created web pages instructing students on how to forward their university email to their personal accounts, ensuring that students have multiple ways to access messages from their institution.

Some higher education professionals have devised strategies beyond email to facilitate reliable communication, since students have been found to be more active on other channels, such as social media platforms, than on email. One study found that 35.2 percent of students use social media as their primary means of communication versus 12.1 percent for email.[6] As a result, some professors provide students with the option to contact them via text or social media as an alternative.[7]

Several possible methodological adaptations can be considered based on limited student email use. Surveys can be distributed via social media, typically with an open-access survey link. However, it may be advisable to avoid using open-access survey links, regardless of the channel used for their distribution, given that it becomes extremely challenging, if not impossible, to control for who participates in an open-access survey.[8] Using an open-access survey link also limits the ability to calculate a response rate.

The widespread use of text messaging means that distributing survey links by text may also provide an alternative to email. It is also possible to send individual survey items via text message, with the participant responding via return text. However, this strategy raises potential privacy concerns, since colleges need to explicitly receive permission from students to contact them this way. Additionally, administering a survey by text message can prove especially complicated if researchers opt to send individual survey questions rather than a link to a survey. This approach is limiting because questions cannot be longer than 160 characters, including both the question text and answer options.[9] Messages may also be received out of order, which could confuse respondents. Furthermore, an institution choosing this method would only be able to send one question at a time, which may cause respondents to lose interest more quickly.[10] The number of questions asked in this format should therefore be kept to a minimum.

Another approach beyond sending emails directly to students may be connecting with instructors of specific courses and requesting that the instructor send messages to their students. Given that students are highly likely to read emails from faculty members when compared to other institutional staff, this approach could help facilitate participation in a survey.[11]

Crafting Communications and Distributing Surveys

There are a number of evidence-based strategies that can be employed in crafting survey communications to increase engagement. The content of the messaging, the number of messages sent to invitees, and the day of the week and time of day that messaging is distributed can all have an impact on response rates.

Previous research has shown that personalization, such as tailored salutations, titles, signatories, and college names, has a significant positive impact on response rates.[12] For example, two studies conducted among students at the Katholieke Universiteit Leuven, Belgium, examined whether personalizing email invitations by using the intended recipient’s name in the salutation had an impact on response rates.[13] Response rates in the treatment group (those receiving personalized salutations) were significantly higher than in the control group. This approach, however, requires the availability of names in a sample file, and researchers are not always able to gain access to these data.

Research on the most effective time and day of the week for deploying surveys has been mixed. One study outside of the higher education sector, which examined the response rates of 100,000 customer feedback surveys by the day of the week and time of day of distribution, found that survey invitations sent on Mondays saw a significantly higher response rate than those sent on other days.[14] Conversely, Friday was the least effective day to deploy survey invitations. Another study examined the email open and click-through rates of over 300 million marketing messages and found that Tuesday had the highest open and click-through rates.[15] The same study found that weekends had the lowest open rates, and that very few messages were sent on these days. While these results vary across studies and industries, the broader research indicates that weekdays are a more favorable time to distribute email messaging than weekends.

The time of day at which survey invitations are distributed can also influence response rates. A 2014 study conducted by a marketing automation platform found that 10:00 am in the email recipient’s time zone was the optimal time to distribute messaging.[16] It is therefore generally advisable to avoid sending messages to a survey population whose local time is much earlier (e.g. 10:00 am on the east coast is 7:00 am on the west coast), as those respondents will be unlikely to participate. Additionally, while 10:00 am was found to be the peak send time, engagement remained relatively high throughout normal business hours and decreased more substantially after 6:00 pm. The peak send time also varied among different populations. For example, among college-age recipients specifically, the peak time to send email messages shifted to 1:00 pm; however, the study did not specify the age range of “college-age” recipients.
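
As a practical illustration, the short Python sketch below (the dates, time zones, and send window are illustrative assumptions, not drawn from the study) converts a single scheduled send time into several recipients' local times so that distributions landing outside late morning or early afternoon can be flagged:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical example: a 10:00 am Eastern send checked against other US time zones.
send_time_east = datetime(2018, 10, 23, 10, 0, tzinfo=ZoneInfo("America/New_York"))
recipient_zones = ["America/New_York", "America/Chicago", "America/Denver", "America/Los_Angeles"]

for zone in recipient_zones:
    local = send_time_east.astimezone(ZoneInfo(zone))
    in_window = 9 <= local.hour <= 14  # rough late morning/early afternoon window
    note = "" if in_window else "  <- outside the preferred window"
    print(f"{zone:20s} {local.strftime('%I:%M %p')}{note}")
```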

Distributing multiple messages, such as an initial survey invitation and one or more reminders, has also been shown to increase response rates. In a study conducted at the University of Antwerp, students received one initial survey invitation and two reminder messages.[17] Reminder messages were distributed only to those who had not yet completed the survey, and an experimental group was sent a third reminder. The results revealed that response rates increased with each message: the initial invitation yielded a 6.2 percent response rate, the first reminder added another 10.3 percentage points, and the second reminder added another 8.6 percentage points. The extra reminder increased the response rate by a further 6.1 percentage points, yielding a final response rate of 31.2 percent. While each message increased the response rate, sending too many messages could have the opposite effect. Striking the right balance on message frequency is especially important for building a long-term relationship with participants in any study.
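
The cumulative arithmetic behind those figures is simple but easy to misread in prose; a minimal sketch, using the values reported in the study cited above:

```python
# Response-rate gains by message wave, in percentage points,
# as reported in the University of Antwerp study cited above.
gains = [
    ("initial invitation", 6.2),
    ("first reminder", 10.3),
    ("second reminder", 8.6),
    ("extra (third) reminder", 6.1),
]

cumulative = 0.0
for wave, gain in gains:
    cumulative += gain
    print(f"after {wave:<22s}: {cumulative:.1f}%")
# Final cumulative response rate: 31.2%
```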

Incentivizing Participation

One important predictor of whether a student will complete a survey is the presence or absence of an incentive. While there is little research on the use of incentives to boost response rates on community college campuses specifically, broader research suggests that incentives do have a positive impact on response rates by offering compensation for time and effort spent engaged in the study. When determining what incentives, if any, to offer, there are a number of factors worth considering.

First, while previous research often indicates that incentives are associated with higher response rates, external motivations like incentives are not the only source of motivation to consider when designing a study. Some respondents may feel intrinsically or altruistically motivated to complete a survey because they want to be helpful, are interested in the topic, or are invested in the outcomes of the research.[18] Beyond these internal motivations, leverage-salience theory argues that monetary incentives can encourage individuals who might not have otherwise been motivated to participate in a survey.[19] Likewise, social-exchange theory posits that pre-paid incentives may be seen as a symbol of trust; since researchers are providing a “token of appreciation,” demonstrating hope that the prospective participant will complete the survey, individuals are then motivated to respond as a form of reciprocation.[20]

In deciding how to use incentives most effectively in a student survey specifically, it is important to consider several factors including the campus culture, type of incentive offered, and dollar amount or value. For instance, if it is typical for an institution to offer students an incentive to participate in a survey, there could be ramifications for not doing so. Likewise, if an institution has historically offered a particular type of incentive, providing a different incentive for a study could weaken response rates. While it is possible for researchers to deviate from what their colleagues elsewhere at the institution have offered historically and still be successful, it is worthwhile to at least take the precedent into consideration.

The National Survey of Student Engagement (NSSE), a large-scale study conducted at hundreds of colleges within the United States, has historically offered incentives to students to increase participation.[21] Among those institutions participating in NSSE, the use of incentives increased from 35 percent in 2010 to 54 percent in 2014, suggesting that offering an incentive to students has aided in the success of gathering responses for this survey. Lottery incentives were most frequently employed, and lotteries for technology devices like tablets and iPads and general gift cards (e.g. Amazon, Walmart) led to the greatest increase in response rates when compared with other incentive options.[22]

Additional research also suggests that incentives are particularly useful in encouraging survey participation among college students. A series of web-based surveys conducted among students at a four-year public research university examined the impact of incentives on response rates.[23] Students were randomly assigned to a treatment or control group; both groups received identical email invitations except that the treatment group’s messaging described a lottery incentive for participating. Across four different surveys, three different lottery incentives were offered to students in the treatment group: a raffle for a 4 GB iPod Nano, an 8 GB iPod Touch, or one of ten $50 gift cards for on-campus dining. In all four surveys, students in the treatment group responded at higher rates than those in the control group, and the 8 GB iPod Touch lottery proved the most effective of the three incentives. Lottery incentives, whether for a technology item or a gift card raffle, proved to be an effective strategy for boosting survey participation.

The specific incentive offered can, of course, skew results if it is not valued equally across invitee subgroups, creating an imbalance between survey invitees and respondents, an incentive bias that can negatively affect the quality of responses.[24] When implementing the 2013 NSSE study, researchers found that lottery incentives attracted more women respondents than men overall, but that technology lotteries, specifically for iPods, particularly attracted men.[25] This suggests that a high-ticket technology prize such as an iPad is more likely to draw in men, whereas lottery incentives as a whole are more likely to attract women. Exploring which incentives are most likely to resonate with your survey population, both overall and across subgroups, is crucial, since this directly affects whether the survey sample will reflect the overall population.

Further, it is important to consider the monetary value of the incentive. According to a 2001 study that measured the impact of lottery incentives on high school students who had expressed interest in a specific college but did not apply, “increasing the size of the prize did not result in a linear increase in response rates.”[26] In this study, respondents were offered gift card lotteries in $50 increments, from $50 to $200, to participate in an online survey, and were compared against a control group that was not offered an incentive. The researchers found that the $100 gift card lottery yielded the highest response rate. Response rates also did not appear to increase after a certain monetary value was surpassed, as the incentives that exceeded $100 were no more effective.

Lastly, it is critical to consider respondents’ confidentiality and/or anonymity when collecting contact information for incentives. Given that eligibility for an incentive generally requires collecting some form of identifying information, it is crucial to ensure that respondents’ identities are protected and that, if anonymity has been promised, personally identifiable information is not directly tied to their responses. Respondents may be less likely to answer questions, especially those of a sensitive nature, if they are concerned about re-identification.[27]

Our Approach

In fall 2018, we administered an online survey to students within each of the seven partner community colleges in the final phase of the Community College Libraries & Academic Support for Student Success (CCLASSS) project. Each of the colleges fielded their own customized instance of the survey that contained their institutional logos, colors, and other associated branding. These surveys were launched between mid-October and early November of 2018, and closed to new responses by mid-December.

The survey population at each college included up to approximately 15,000 students who were 18 years of age or older, comprising both credit students and non-degree/workforce students. Topics covered within the questionnaire included students’ personal and professional goals and objectives, as well as the unique challenges they face in achieving them. A series of service concepts was also tested within the questionnaire to determine how valuable a variety of services might be for students, based on unmet needs uncovered in an earlier qualitative phase of the project. For more information on the survey results and on the CCLASSS survey itself, please refer to our recent publication of findings.[28]

In addition to gathering results on the above topics, we tested the effectiveness of different student email address types used in outreach, including both institutional email addresses and non-institutional email addresses.[29] We also A/B tested Amazon and Visa gift cards in a lottery incentive to assess which incentive was more likely to resonate with students and therefore maximize response rates. In the following sections we share results from this testing as well as general strategies and lessons learned throughout the survey administration period.

The survey was deployed directly from a third-party survey platform that sent out all survey communications via email. Through this platform, we distributed invitation and reminder messages using unique links for each survey respondent to ensure that all were within the survey population, that none could take the survey multiple times, and that all received customized messaging. Further, by distributing the survey using unique links through the survey platform, we were able to send reminder messages to only those students who had not yet fully completed the survey.
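
The survey platform handled the unique links and reminder targeting for us, but the underlying logic can be sketched as follows (a simplified illustration with hypothetical addresses and URLs, not the platform's actual API): each invitee receives a hard-to-guess token tied to their record, and reminders go only to invitees whose token has not yet appeared in a completed response.

```python
import secrets

# Hypothetical sample records; in practice these come from the cleaned contact list.
invitees = [
    {"email": "student1@example.edu", "name": "Alex"},
    {"email": "student2@example.com", "name": "Jordan"},
]

# Assign each invitee a unique token and build their personal survey link.
for person in invitees:
    person["token"] = secrets.token_urlsafe(16)
    person["link"] = f"https://survey.example.org/s/cclasss?t={person['token']}"

# Tokens associated with completed responses, as reported back by the survey platform.
completed_tokens = set()

# Reminders are sent only to invitees who have not yet fully completed the survey.
reminder_recipients = [p for p in invitees if p["token"] not in completed_tokens]
print(len(reminder_recipients), "invitees will receive a reminder")
```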

Project Planning

We set out with a goal to leave each of the surveys open for a period of about four to six weeks. This allowed us to space out all messaging by at least one week so that students were less likely to feel overwhelmed or bombarded with messaging. We also varied the time of day that messages were distributed so that we could capture students’ perspectives during different times when they might be more likely to engage with the survey. All survey communications were distributed before 3:00 pm in the respondents’ local time zone, as we and other researchers have found late morning and early afternoon to be the most effective time for reaching respondents.

During fieldwork, we collaborated with each of the seven community colleges according to their individual survey implementation timelines, based largely on their academic calendars, to distribute the surveys at optimal times within their local context. We also sent follow-up messaging on days and at times that we hypothesized would attract the highest level of engagement. For example, we would rarely encourage an institution to send out messaging on a Friday, as we have found that this day of the week tends to yield a relatively lower number of responses. We would especially caution an institution against launching its survey on a Friday and instead recommend another day earlier in the week. As a result, all of the surveys launched between Monday and Thursday.

We also encouraged each of the partner institutions to promote their instance of the survey at their library and throughout the institution at large. The partners used a number of strategies, including hanging physical posters around campus, distributing flyers at key locations, creating library computer screensavers, e-signage, and digital displays, running announcements in student newspapers, and placing messages about the survey in student portals, college calendars, and the library and college homepages. By employing all of these strategies, we gradually increased response rates each week that we were in the field.

Developing Messaging

Each of the seven community colleges distributed one initial survey invitation and three subsequent reminder messages. Ithaka S+R developed standardized language for both the survey invitation and the reminder messages that allowed for customization of specific fields, including the name of the community college, the signatory information, the name of the survey itself (which was attached to the survey link), and the type of incentive the student was being offered to participate. The signatories from each of the colleges were generally not directly affiliated with the library, as part of a deliberate strategy to avoid influencing responses toward the library. The messages also allowed for customization of various features, such as the “from” name and email address, the use of invitee first names (e.g. “Dear Elizabeth,”), and an opt-out link for any students who did not wish to participate in the survey or receive further messaging. We have historically found with other higher education surveys that these layers of personalization result in higher response rates.[30]
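
A minimal sketch of this kind of field-level customization, using Python's string.Template (the field names and wording are illustrative, not the text we actually sent):

```python
from string import Template

# Illustrative invitation template with per-invitee and per-college fields.
invitation = Template(
    "Dear $first_name,\n\n"
    "$college_name invites you to take the $survey_name. "
    "Complete it for a chance to win a $incentive!\n\n"
    "Take the survey: $survey_link\n"
    "If you prefer not to receive further messages, opt out here: $opt_out_link\n"
)

message = invitation.substitute(
    first_name="Elizabeth",
    college_name="Example Community College",
    survey_name="Student Survey",
    incentive="$100 Amazon gift card",
    survey_link="https://survey.example.org/s/cclasss?t=abc123",
    opt_out_link="https://survey.example.org/optout?t=abc123",
)
print(message)
```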

We intentionally kept the invitation and reminder messages brief and to the point, but included key information: the topics the survey covered, the estimated time to complete it, incentive details, an explanation of the partnership among the seven community colleges and Ithaka S+R, and how the findings would be used to better understand and support the needs of students like them.

We also employed a variety of different subject lines for each of the messages, which were designed at the outset of the project with the intent to capture the attention of potential respondents. In particular, we have found that using “RE:” at the beginning of the subject line, to indicate that the reminder message is following up on an earlier message, has been effective in garnering responses.[31] The subject lines employed were displayed as follows:

  • Survey invitation: [name of college] Student Survey: complete for a chance to win [incentive]!
  • Reminder 1: [name of college] Student Survey needs your input!
  • Reminder 2: RE: [name of college] Student Survey
  • Reminder 3: [name of college] Student Survey: Final opportunity to participate to shape future resources and services

While we aimed to standardize the subject lines of all messaging for the survey across colleges, we also were flexible in the language that was used to meet each individual college’s unique context, at times customizing subject lines for individual colleges.

Preparing for Deployment

A key challenge in engaging with students via email is cleaning and processing lists of contact information. Among the seven partner colleges, there were a number of unexpected challenges in obtaining this contact information for students and in deciding what contact information was best to use (i.e. institutional vs. non-institutional email addresses).

We asked each of the seven colleges to provide both an institutional email address (e.g. studentname@collegename.edu) and an alternative email address that was not from the institution (e.g. @gmail.com, @hotmail.com, @outlook.com) if this information was available for each student within their sample. In order to obtain contact lists of their students, each of the colleges had to reach out to the appropriate parties (usually their institutional research office) well in advance of the survey administration period.

It was not uncommon for credit student and non-degree/workforce student contact information to be stored in separate databases, and obtaining the latter often required additional effort. When credit and non-degree/workforce student information was stored in more than one place, there was often significant overlap between the two databases; that is, students were classified simultaneously as credit students and workforce students, with inconsistent email addresses across the two records. This reinforces both the often non-linear paths of these students and the fact that the databases containing their contact information may not be sufficiently coordinated. There were also a number of spelling errors and inconsistencies between databases. A substantial number of individual contacts within nearly all of the contact lists had email domain names that were invalid or spelled incorrectly (e.g. gnail.com, yahooo.com), and these lists often required extensive cleaning before the associated survey could go into the field.
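
The kind of cleaning these lists required can be sketched with pandas (the records, column names, and domain corrections below are illustrative):

```python
import pandas as pd

# Illustrative contact list with the kinds of problems we encountered
# (typo domains, inconsistent name formatting, duplicate records).
contacts = pd.DataFrame({
    "name": ["Alex Rivera", "JORDAN LEE", "Jordan Lee", "Sam Cole"],
    "email": ["arivera@gnail.com", "jlee@college.edu", "jlee@college.edu", "scole@yahooo.com"],
})

# Normalize addresses and correct a handful of common domain-name typos.
domain_fixes = {"gnail.com": "gmail.com", "yahooo.com": "yahoo.com", "hotnail.com": "hotmail.com"}
contacts["email"] = contacts["email"].str.strip().str.lower()
local = contacts["email"].str.split("@").str[0]
domain = contacts["email"].str.split("@").str[-1].replace(domain_fixes)
contacts["email"] = local + "@" + domain

# Normalize name casing and drop records with duplicate email addresses.
contacts["name"] = contacts["name"].str.strip().str.title()
contacts = contacts.drop_duplicates(subset="email")

# Flag anything that still fails a basic address pattern for manual review.
valid = contacts["email"].str.match(r"^[^@\s]+@[^@\s]+\.[a-z]{2,}$")
print(contacts[valid])    # cleaned list
print(contacts[~valid])   # records needing manual follow-up
```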

The decision of whether to use the institutional or non-institutional email address for each college also posed unique challenges and opportunities. For example, Monroe Community College has developed a practice of contacting students only through their institutional email addresses and does not keep records of non-institutional addresses. On the other hand, some of the institutions primarily use non-institutional email addresses to contact their students, or their students often use email forwarding services to route messages from their institutional accounts to their personal accounts. Further, the majority of non-degree/workforce students did not have an institutional email address at all, and thus we contacted those students using their non-institutional email addresses only.

Decisions regarding which email address to use in contacting students depend on several conditions, such as the length of time available prior to launching a given survey, list availability, list accuracy, and campus culture around the use of institutional and non-institutional email addresses. These factors, and the subsequent decisions, varied significantly among the individual colleges, and in making these decisions it was important to obtain feedback from others who have conducted internally facing research studies at each institution, as well as from their institutional research offices. Developing sample processes and protocols well in advance of the survey launches significantly increased the success of the project.

Choosing Incentives

For this survey, we aimed to employ the most effective incentive options possible given the available resources allocated for the study. Based on our own and others’ previous research, as well as budgetary limitations and New York State lottery guidelines, we chose to move forward with general gift card incentives. Because we were unable to locate any previous research on the effectiveness of using an incentive that closely resembles actual monetary value (such as a Visa or AMEX gift card) within a student survey, we elected to A/B test the effectiveness of both Amazon gift cards (a gift card for a specific retailer) and Visa gift cards (a gift card that more closely resembles cash) as lottery incentives among students. We hypothesized that the amount associated with the gift cards ($100) would provide a sufficient incentive to participate without being coercive, and that the types of gift cards offered would not disproportionately influence any particular respondent subgroup since recipients could use these gift cards for a broad range of purchases.

Each student was randomly assigned to one of the two prizes, either a $100 Amazon or Visa gift card, and this assignment was tied to their individual contact information. We were then able to pipe the incentive information into each respondent’s survey invitation and subsequent messaging, as well as into the introductory language that respondents saw on the survey landing page.
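
A minimal sketch of this random assignment, with hypothetical records (in practice the assignment was made while preparing each college's sample file and carried through to the messaging and landing-page text):

```python
import random

random.seed(2018)  # fixed seed so the assignment is reproducible

# Hypothetical invitee records drawn from the cleaned contact list.
invitees = [
    {"email": "student1@example.edu"},
    {"email": "student2@example.com"},
    {"email": "student3@example.edu"},
    {"email": "student4@example.com"},
]

# Shuffle, then alternate prize types so the two arms stay nearly balanced.
random.shuffle(invitees)
for i, person in enumerate(invitees):
    person["prize"] = "$100 Amazon gift card" if i % 2 == 0 else "$100 Visa gift card"

# The assigned prize string is later piped into that invitee's messaging
# and into the introductory text on their survey landing page.
for person in invitees:
    print(person["email"], "->", person["prize"])
```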

When respondents reached the end of the main survey and submitted their response, they were automatically redirected to a separate form that was not connected to their responses, in which they could enter their contact information to be entered into the prize drawing. This separate form for the prize drawing contained piped-in text for the institution name and the prize type (either an Amazon or Visa gift card) in the instructions. In employing this method, we were able to collect information for the prize drawing in a single survey, rather than create multiple forms for each of the partner institutions and each of the prize types.[32] This also allowed respondents to remain completely anonymous, since the prize drawing was entirely separate from their survey responses.

Upon the close of each of the surveys, we randomly selected from each college five winners for the Amazon gift cards, five winners for the Visa gift cards, and a number of backup contacts to be used as needed. We then notified the recipients via email that they were selected as a winner for participating in the survey, and began to distribute the gift cards. Those who won the Amazon gift card were sent a unique code that they could use to digitally access their prize. Conversely, Visa gift card winners were required to provide their full mailing address so that we could physically mail them their Visa gift card. Visa gift cards were sent through Amazon to ensure an easy purchasing process and automatic package tracking.
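
The drawing itself amounts to sampling without replacement from each prize pool; a sketch with illustrative entry lists and counts:

```python
import random

# Entries collected through the separate prize-drawing form for one college.
amazon_entries = [f"amazon_entry_{i}" for i in range(200)]
visa_entries = [f"visa_entry_{i}" for i in range(200)]

def draw(entries, n_winners=5, n_backups=3):
    """Randomly select winners plus ordered backup contacts, without replacement."""
    picks = random.sample(entries, n_winners + n_backups)
    return picks[:n_winners], picks[n_winners:]

amazon_winners, amazon_backups = draw(amazon_entries)
visa_winners, visa_backups = draw(visa_entries)
print("Amazon winners:", amazon_winners)
print("Visa winners:", visa_winners)
```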

Results

Email Preferences

We found minimal differences in the effectiveness of institutional versus non-institutional email addresses in the aggregate results (about a 12 percent response rate for institutional email addresses and 11 percent for non-institutional email addresses), but this substantially varied at the individual college level. An overview of the sample breakout and response rates by email address type can be found below (see Table 1).

Table 1: Invitees and Respondents by Email Address Type

| Institution | Invitees: Institutional email | Invitees: Non-institutional email | Invitees: Total | Completed: Institutional email | Completed: Non-institutional email | Completed: Total |
| --- | --- | --- | --- | --- | --- | --- |
| Bronx Community College | 7,133 (52.8%) | 6,376 (47.2%) | 13,509 (100.0%) | 522 (7.3%) | 547 (8.6%) | 1,069 (7.9%) |
| Borough of Manhattan Community College | 7,298 (49.0%) | 7,596 (51.0%) | 14,894 (100.0%) | 580 (7.9%) | 1,184 (15.6%) | 1,764 (11.8%) |
| LaGuardia Community College | 15,678 (99.6%) | 61 (0.4%) | 15,739 (100.0%) | 1,140 (7.3%) | 3 (4.9%) | 1,143 (7.3%) |
| Monroe Community College | 11,571 (100.0%) | 0 (0.0%) | 11,571 (100.0%) | 2,252 (19.5%) | 0 (0.0%) | 2,252 (19.5%) |
| Northern Virginia Community College | 901 (6.0%) | 14,099 (94.0%) | 15,000 (100.0%) | 101 (11.2%) | 1,486 (10.5%) | 1,587 (10.6%) |
| Pierce College | 3,431 (35.0%) | 6,369 (65.0%) | 9,800 (100.0%) | 1,161 (33.8%) | 355 (5.6%) | 1,516 (15.5%) |
| Queensborough Community College | 4,996 (37.8%) | 8,221 (62.2%) | 13,217 (100.0%) | 470 (9.4%) | 1,131 (13.8%) | 1,601 (12.1%) |
| Total | 51,008 (54.4%) | 42,722 (45.6%) | 93,730 (100.0%) | 6,226 (12.2%) | 4,706 (11.0%) | 10,932 (11.7%) |

Percentages for invitees represent each email type’s share of the college’s total sample; percentages for completed responses represent response rates.

For Monroe Community College specifically, we were only provided with institutional email addresses, yet they concluded the survey with the highest overall response rate among the partner colleges (19 percent versus 7-16 percent for the other six partner colleges). Similarly, while approximately two-thirds of Pierce College’s sample contained non-institutional email addresses, those who we contacted via their institutional email address responded at substantially higher rates (34 percent response rate for institutional email versus 6 percent for non-institutional email). Pierce College had the second highest overall response rate among the college partners at about 16 percent. LaGuardia Community College, which provided almost exclusively institutional email addresses, also saw higher response rates among those contacted via their institutional email address as compared to other addresses (7 percent versus 5 percent), though their response rate overall was relatively lower than other colleges in the project.

One possible reason for these outcomes is that each of the community colleges was required to whitelist the Ithaka S+R IP addresses used to distribute the survey messaging, which helped prevent the messages from being sent to spam folders or blocked by IT administrators. This can only be controlled at the institutional level and would not apply to non-institutional email addresses. In cases where we distributed messaging to non-institutional email addresses, it may have been more likely that the messaging was routed to spam or junk mail folders. Another explanation for this result could be campus culture around the use of email. At some of the colleges, there is a greater inclination to use institutional email addresses, and thus rates of response were higher among these groups. For example, Monroe Community College’s policy is to contact students exclusively via their institutional email addresses, and we saw the highest rate of response among this group. Similarly, at LaGuardia Community College, where a much higher share of student records have institutional email addresses, we saw a higher response rate for those students compared to those contacted through non-institutional email addresses.

On the other hand, Borough of Manhattan Community College had an even distribution of institutional and non-institutional email addresses within its sample (49 percent institutional/51 percent non-institutional), yet students contacted via their non-institutional email addresses responded at twice the rate of those contacted using their institutional email (16 percent versus 8 percent). Queensborough Community College, whose sample was nearly two-thirds non-institutional email addresses, also saw higher rates of response among those contacted via non-institutional addresses (14 percent versus 9 percent).

The Northern Virginia Community College sample was unique in containing predominantly non-institutional email addresses (94 percent), yet it received relatively even rates of response between the two groups (roughly 11 percent each). Bronx Community College was the only college with a nearly even sample distribution of institutional and non-institutional email addresses that also saw relatively balanced rates of response among both groups (about a one percentage point difference between the two types).

Given the substantial degree of variation in results across colleges, decisions regarding an institutional versus non-institutional email approach require consideration at the institutional level. There is no “one size fits all” equation that can be applied to ensure that a sample is adequate and effective at gaining sufficient response rates for any given study. Each of the seven community colleges had specific limitations and internal processes for surveying their students, and this led to unique outcomes for each of the individual surveys.

Incentive Effectiveness

In comparing rates of response for the two incentives offered (Amazon or Visa gift cards), we found few meaningful differences. Respondents were about equally likely to partially complete (start but not finish) and to finish the survey regardless of whether they were assigned to the Amazon or the Visa gift card prize. There was typically only a one to two percentage point difference in the prize distribution among surveys started (48.87 percent for Amazon vs. 51.13 percent for Visa in the aggregate) as well as among surveys completed (49.11 percent for Amazon vs. 50.89 percent for Visa). This holds true at both the aggregate and the individual institutional level. An overview of the response rates by incentive is below (see Table 2).

Table 2: Response Rates by Prize Type

| Institution | Invitees: Amazon | Invitees: Visa | Invitees: Total | Completed: Amazon | Completed: Visa | Completed: Total |
| --- | --- | --- | --- | --- | --- | --- |
| Bronx Community College | 6,752 (50.0%) | 6,753 (50.0%) | 13,505 (100.0%) | 503 (7.4%) | 562 (8.3%) | 1,065 (7.9%) |
| Borough of Manhattan Community College | 7,445 (50.0%) | 7,442 (50.0%) | 14,887 (100.0%) | 863 (11.6%) | 902 (12.1%) | 1,765 (11.9%) |
| LaGuardia Community College | 7,869 (50.0%) | 7,870 (50.0%) | 15,739 (100.0%) | 569 (7.2%) | 572 (7.3%) | 1,141 (7.2%) |
| Monroe Community College | 5,785 (50.0%) | 5,786 (50.0%) | 11,571 (100.0%) | 1,107 (19.1%) | 1,148 (19.8%) | 2,255 (19.5%) |
| Northern Virginia Community College | 7,474 (50.0%) | 7,472 (50.0%) | 14,946 (100.0%) | 808 (10.8%) | 784 (10.5%) | 1,592 (10.7%) |
| Pierce College | 4,899 (50.0%) | 4,898 (50.0%) | 9,797 (100.0%) | 685 (14.0%) | 737 (15.0%) | 1,422 (14.5%) |
| Queensborough Community College | 6,597 (50.0%) | 6,596 (50.0%) | 13,193 (100.0%) | 791 (12.0%) | 813 (12.3%) | 1,604 (12.2%) |
| Total | 46,821 (50.0%) | 46,817 (50.0%) | 93,638 (100.0%) | 5,326 (11.4%) | 5,518 (11.8%) | 10,844 (11.6%) |

Percentages for invitees represent each prize type’s share of the college’s total sample; percentages for completed responses represent response rates. The sample figures by prize type exclude emails that bounced, were invalid, or were duplicates; as a result, the distribution of contacts assigned to Amazon and Visa gift cards is not perfectly even.

At the aggregate level, just over 16 percent of respondents assigned to the Amazon prize either partially or fully completed the survey, while the rate for those assigned to the Visa prize was just under 17 percent. The same pattern emerged among completed responses, where those assigned to either prize type displayed a response rate of about 11-12 percent. While Visa gift cards were slightly more popular among respondents, the difference in overall response rates between the two incentive options is marginal. We see this at the institutional level as well, where response rates for the two options were generally within approximately one percentage point of one another.
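
For readers who want to formalize this comparison, a two-proportion z-test on the aggregate counts from Table 2 is one option; the sketch below uses statsmodels, and it is worth noting that with roughly 47,000 invitees per arm even a practically small gap can register as statistically detectable, so the effect size (about 0.4 percentage points here) matters more than the p-value alone.

```python
from statsmodels.stats.proportion import proportions_ztest

# Completed responses and invitees by prize type, aggregated across colleges (Table 2).
completes = [5326, 5518]    # Amazon, Visa
invitees = [46821, 46817]   # Amazon, Visa

stat, p_value = proportions_ztest(count=completes, nobs=invitees)
diff = completes[1] / invitees[1] - completes[0] / invitees[0]
print(f"difference = {diff * 100:.2f} percentage points, z = {stat:.2f}, p = {p_value:.3f}")
# With samples this large, interpret the p-value alongside the (small) effect size.
```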

We also analyzed the percentage of respondents from each prize type who completed the survey against key demographic groups and found minimal differences among respondents. Those who indicated that they were a parent or a guardian (52 percent for Visa), those who indicated that they were within the ages of 31-40 years old (53 percent for Visa), and those who indicated that they were serving on active duty or were a veteran of the U.S. Armed Forces, National Guard, or Reserves (about 54 percent for Visa) preferred Visa gift cards slightly more than their peers in other subgroups.[33] Differences among other demographic groups (gender, age, Pell Grant eligibility, etc.) were even smaller, and within 1-2 percentage points of the aggregate results. While our overall sample generally mirrors the demographic characteristics of the student population, we do in particular see overrepresentation of women; however, the incentives offered did not influence participation by gender, or any other demographic variable, substantially.

Recommendations

The implementation of this large-scale survey has yielded a number of recommendations for surveying community college students and college students at large. We offer the following conclusions and reflections on our approach for others to employ in future research.

Sample Viability

When preparing to survey students, it is essential to consider what information is required from institutional databases as soon as possible to enable fieldwork within an anticipated timeframe. This process often entails communicating with multiple departments to obtain student contact information, so it is worth informing corresponding parties of the project timeline and sample parameters well in advance of fieldwork. Taking a proactive approach in the early stages of the project will help ensure successful survey administration.

Another key reason for obtaining sample information as soon as possible is to ensure that the contact list itself contains all necessary fields and appropriate contacts for the study. Contact lists often require cleaning for spelling, formatting, duplicates, and accuracy, and may require ongoing communication with the parties that provided the list in order to receive revised versions. When working with such lists, it is very important to thoroughly check the contact information for errors, as invalid and duplicate records can substantially reduce the usable sample; catching them early also helps prevent delays in launching the survey. Without these quality checks, our student survey would have yielded less successful outcomes, particularly among smaller subgroups of important student populations.

Lastly, beyond cleaning student email addresses, it is important to conduct quality checks on other contact fields (e.g. student names). Particularly in survey research, it might look strange to a student to receive an email addressed to their full name, their name in all capital letters, or their first name plus their middle name(s), though lists are sometimes provided in these formats. It would also be unfortunate for someone to receive an email in which their name was spelled incorrectly, or where they were not even the intended recipient. Such errors can discourage participation and lead students to question the validity of the research study in which they are being invited to participate.

Student Contact Preferences

One of the most substantial decisions we made in administering this student survey was selecting the email address type we hypothesized would be most actively used, so that our survey communications had the best chance of being received. The results outlined above indicate that there is no clear answer across institutions as to whether it is better to use a student’s institutional or non-institutional email address.

The most significant factor to consider when deciding which email address to use may be campus culture. If an institution typically contacts students via institutional email, and this practice is widely encouraged and embraced by faculty and administrators, it is likely advisable to move forward with this method. In addition, using an institutional email address means having more control over the deliverability of the survey communications. At many institutions, it is possible to work with the IT department to employ various measures to ensure that the survey communications are received (e.g. IP address whitelisting, unblocking the survey on web browsers). However, if it is more common at an institution for students to be contacted via their non-institutional email addresses, it is worth considering that approach instead, especially if teaching faculty at the institution actively work with students to meet them on their preferred channel of communication.

If there are channels outside of email that are regularly used to contact students, consider using those methods instead of or in addition to email. For example, if an institution has a system in which students can be contacted en masse via text message (for university wide alerts and announcements), it may be possible to distribute the survey via text message. As students increasingly communicate in a variety of digital formats outside of email, it may be worthwhile to consider this option to yield greater response rates.

When an institution does not have a policy or established practice regarding the use of institutional and non-institutional email addresses, and it is possible to obtain more than one email address per student, students could be randomly assigned to be contacted at either their institutional or non-institutional address. Alternatively, a subset of the sample could be randomly split between the two address types and sent survey invitations before the survey fully goes live. This pilot can help determine which channel is most effective for reaching students, and the more successful approach can then be applied to the remainder of the sample.
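
A minimal sketch of that pilot split, assuming both address types are on file for each student (the record structure and pilot fraction are illustrative):

```python
import random

random.seed(42)  # reproducible pilot assignment

# Students for whom both an institutional and a non-institutional address are available.
students = [
    {"id": i, "institutional": f"s{i}@college.edu", "personal": f"s{i}@example.com"}
    for i in range(1000)
]

# Reserve a pilot subset and split it evenly between the two address types.
random.shuffle(students)
pilot, remainder = students[:200], students[200:]
for j, s in enumerate(pilot):
    s["contact_address"] = s["institutional"] if j % 2 == 0 else s["personal"]

# After the pilot, whichever address type yields the higher response rate
# can be applied to the remainder of the sample.
print(len(pilot), "pilot invitees;", len(remainder), "held back pending pilot results")
```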

A final factor to consider when obtaining student contact information is the validity of the email addresses the institution provides. If a sample that contains multiple email addresses for each student includes some that are inaccurate or invalid, using an alternative email address if available may allow for the delivery of a greater number of valid surveys.

Incentives

Given that the results from our multi-site student survey show little difference between the effectiveness of Amazon and Visa gift card incentives, our recommendation under most circumstances would be to offer Amazon gift cards, due to the additional labor and cost associated with purchasing and distributing Visa gift cards. Amazon is generally accessible to most students in the US, and an Amazon gift card is about as likely to resonate with students as a Visa gift card. We found that the Amazon gift cards were much easier to obtain and distribute because the entire dissemination process was conducted digitally. Amazon also allows the purchase of as many digital gift cards as necessary and can send the digital gift code to students via email or another online method. This made acquiring and distributing the Amazon gift cards hassle-free and an efficient use of time.

If providing an Amazon gift card is not feasible, Visa gift cards do resonate with students at the same level as Amazon gift cards, but there may be additional steps and fees associated with this method. For example, obtaining Visa gift cards involves paying a small purchasing fee for each individual card, purchasing physical gift cards (either online or in person), and allocating funds for the delivery of the prizes. This also requires obtaining an additional piece of information from the prize winners; since we did not collect the mailing address of our survey participants at the point of prize lottery entry, we needed to follow up with the winners after they were randomly selected. Of course, we could have collected this information when students provided us with their contact information to be entered into the prize drawing, but collecting such information in the lottery entry—especially for so many respondents who would not later receive the Visa gift card—could have also raised privacy concerns among students.

With physical Visa gift cards, it is important to allow sufficient time before the end of the semester or term for students to collect their prize if provided at an on-campus location, which could help limit expenses associated with mailing physical gift cards. However, students may be traveling or unable to collect their prize in person after the period of instruction has ended. This also poses complications for students who take online courses and do not live nearby, or those students who do not typically visit a physical campus location. Overall, both prize types are about equally as likely to encourage participation, but there are additional costs and labor associated with using Visa gift cards when compared with Amazon gift cards.

Final Thoughts

Implementing this large-scale survey allowed us to both utilize existing best practices developed for other higher education communities and to gain greater insight into newly tested approaches. In this project, we closely examined both survey distribution practices and incentive selections, and uncovered the extent to which certain strategies are likely to resonate with students.

Our administration processes and outcomes have only reinforced the importance of meticulous attention to detail in generating contact lists. Beginning the survey administration process by developing thoughtful, personalized messaging, and strategically mapping out the date and time of each individual message, plays a key role in the success of a research project.

Selecting the appropriate email address to send a survey invitation to a student can vary on an institution-by-institution basis. The email address that has the greatest likelihood to be used to successfully contact a student, whether institutional or non-institutional, has the potential to substantially increase response rates, likely increasing representativeness and generalizability of findings.

Identifying an incentive that resonates with students without influencing responses in a biased fashion is important in bolstering results. In this study, we found that a gift card for a specific retailer, Amazon, affected response rates in a similar fashion to a Visa gift card that more closely represented cash, and that response rates for subgroups were largely unaffected by the differentiated approaches.

Our aim in publishing this report is to encourage discussion on this topic and to continue to uncover and improve upon survey administration practices with community college students, students more broadly, and the greater academic community. We look forward to seeing how others employ and build on these findings in their own practices.

Endnotes

  1. For more on the Ithaka S+R US Faculty Survey: Melissa Blankstein and Christine Wolff-Eisenberg, “Ithaka S+R US Faculty Survey 2018,” Ithaka S+R, 12 April 2019, https://doi.org/10.18665/sr.311199. For more on the Ithaka S+R local surveys: http://www.sr.ithaka.org/services/surveys/.
  2. Through the Community College Libraries & Academic Support for Student Success (CCLASSS) project, we have (1) examined student goals, challenges, and needs from the student perspective, (2) developed a series of services that target these expressed goals, challenges, and needs, and (3) tested the demand for these service prototypes. This project, co-led by Ithaka S+R and Northern Virginia Community College, along with six other community college partners and with support from the Institute of Museum and Library Services (IMLS) [RE-96-17-0113-17], focuses on strengthening the position of the community college library in serving student needs. For the most recent report in the project, see Melissa Blankstein, Christine Wolff-Eisenberg, and Braddlee, “Student Needs Are Academic Needs,” Ithaka S+R, 30 September 2019, https://doi.org/10.18665/sr.311913.
  3. Carl Straumsheim, “Read and Unread,” Inside Higher Ed, 2 March 2016, https://www.insidehighered.com/news/2016/03/02/study-explores-impact-social-media-texting-email-use.
  4. Reynol Junco, “iSpy: Seeing What Students Really Do Online,” Learning, Media and Technology, 26 November 2012, https://www.tandfonline.com/doi/full/10.1080/17439884.2013.771782.
  5. Jeffery R. Young, “Boston College Will Stop Offering New Students E-Mail Accounts,” The Chronicle of Higher Education, 19 November 2008, https://www.chronicle.com/blogs/wiredcampus/boston-college-will-stop-offering-new-students-e-mail-accounts/4390.
  6. Carl Straumsheim, “Read and Unread,” Inside Higher Ed, 2 March 2016, https://www.insidehighered.com/news/2016/03/02/study-explores-impact-social-media-texting-email-use.
  7. For further reflections on the alternative approach of texting students see Karen Acosta, “The Desire Path of Texting,” Inside Higher Ed, 18 September 2015, https://www.insidehighered.com/views/2015/09/18/essay-why-faculty-members-should-text-their-students.
  8. For further reflections on the use of open-access survey links: Nicole Betancourt, “Concerned About Bots Taking Over Your Survey? Reflections on Maintaining Data Integrity,” Ithaka S+R, 23 September 2019, https://sr.ithaka.org/blog/concerned-about-bots-taking-over-your-survey/.
  9. For more on the use of text messages for survey distribution see Jenny Marlar, “Using Text Messaging to Reach Survey Respondents,” Methodology Blog, Gallup, 1 November 2017, https://news.gallup.com/opinion/methodology/221159/using-text-messaging-reach-survey-respondents.aspx.
  10. Ibid.
  11. Reynol Junco, “iSpy: Seeing What Students Really Do Online,” Learning, Media and Technology, 26 November 2012, https://www.tandfonline.com/doi/full/10.1080/17439884.2013.771782.
  12. Fan Weimiao and Tan Zheng, “Factors Affecting Response Rates of the Web Survey: A Systematic Review,” Computers in Human Behavior, Elsevier, 23 November 2009, https://www.sciencedirect.com/science/article/pii/S0747563209001708.
  13. Dirk Heerwegh, Tim Vanhove, Koen Matthijs and Geert Loosveldt, “The Effect of Personalization on Response Rates and Data Quality in Web Surveys,” International Journal of Social Research Methodology, 8:2, 85-99, DOI: 10.1080/1364557042000203107.
  14. Jill Zheng, “What Day of the Week Should You Send Your Survey?” Survey Monkey, 2019, https://www.surveymonkey.com/curiosity/day-of-the-week/.
  15. Magdalena Pietras, “New Infographic: Best Day to Send Email 2013.” GetResponse, 7 October 2013, https://www.getresponse.com/blog/new-infographic-best-day-to-send-email-2013.
  16. John, “Insights from MailChimp’s Send Time Optimization System,” MailChimp, 14 June 2014, https://mailchimp.com/resources/insights-from-mailchimps-send-time-optimization-system/.
  17. Christof Van Mol, “Improving Web Survey Efficiency: The Impact of an Extra Reminder and Reminder Content on Web Survey Response,” International Journal of Social Research Methodology, 20:4, 317-327, DOI: 10.1080/13645579.2016.1185255.
  18. Eleanor Singer and Cong Ye, “The Use and Effects of Incentives in Surveys,” Annals of the American Academy of Political and Social Science, vol. 645, 2013, pp. 112–141, JSTOR, www.jstor.org/stable/23479084.
  19. For more on theories on the use of incentives see Erica Ryu, Mick P. Couper, and Robert W. Marans, “Survey Incentives: Cash vs. In-Kind; Face-to-face vs. Mail; Response Rate vs. Nonresponse Error,” International Journal of Public Opinion Research, Oxford University Press on behalf of the World Association for Public Opinion Research, Vol. 18, No. 1, 1 July 2005.
  20. Ibid.
  21. For information on the NSSE Survey: “NSSE Survey Instruments” NSSE, FSSE, BCSSE, last modified 2018, http://nsse.indiana.edu/html/survey_instruments.cfm.
  22. Shimon Sarraf, “Improving Student Participation Rates: What We’ve Learned about Incentives and Promotions,” (presentation, A NSSE Webinar, 2 October 2014).
  23. Jerold S. Laguilles, Elizabeth A. Williams and Daniel B. Saunders, “Can Lottery Incentives Boost Web Survey Response Rates? Findings from Four Experiments,” Research in Higher Education, Springer, Vol. 52, No. 5, August 2011, pp. 537-553.
  24. For more information on incentive bias see Alli Whalen, “What’s in it for Me? How to Use Survey Incentives Correctly,” Cvent Blog, Cvent, 22 July 2015, https://blog.cvent.com/events/feedback-surveys/whats-use-survey-incentives-correctly/.
  25. Shimon Sarraf, “Improving Student Participation Rates: What We’ve Learned about Incentives and Promotions,” (presentation, A NSSE Webinar, 2 October 2014).
  26. Stephen R. Porter et al., “The Impact of Lottery Incentives on Student Survey Response Rates,” Research in Higher Education, Vol. 44, No. 4, August 2003, http://stephenporter.org/surveys/Lottery%20incentives%20RHE%202003.pdf.
  27. “Using Survey Incentives to Improve Response Rates,” Survey Monkey, https://www.surveymonkey.com/mp/using-survey-incentives-to-improve-response-rates/.
  28. Melissa Blankstein, Christine Wolff-Eisenberg and Braddlee, “Student Needs Are Academic Needs,” Ithaka S+R, 30 September 2019, https://doi.org/10.18665/sr.311913.
  29. “Institutional email addresses” are defined as those that are provided by the institution and contain the institution domain name (i.e. @collegename.edu), whereas “non-institutional email addresses” are not provided to students from the institution and do not have an institutional domain affiliation (i.e. @gmail.com, @yahoo.com). Non-institutional email addresses may be a personal email address, an alternative email address that the institution has on file, a preferred email address with which the student has indicated they wish to receive communications, or even an old email address from a previous institution or high school. These definitions will be used throughout to describe the implementation and results of this project.
  30. For further reflections from Ithaka S+R on constructing survey communications see Christine Wolff-Eisenberg, “Crafting Effective Communications: Survey Administration Best Practices,” Ithaka S+R, 14 October 2015, https://sr.ithaka.org/blog/survey-administration-best-practices/.
  31. For reflections from Ithaka S+R on survey administration practices see Christine Wolff, “Survey Administration Best Practices: Lessons Learned from the 2015 Ithaka S+R Faculty Survey,” Library Assessment Conference 2016, pp. 54-56, http://old.libraryassessment.org/bm~doc/11-wolff-2016.pdf.
  32. By using piped text for the incentive form, we had the ability to customize the prize type and institution name in the form. If we had been unable to customize this text for each individual student, we would have needed to create two separate incentive forms for each of the seven partner colleges, one for those assigned to the Amazon prize and one for those assigned to the Visa prize, resulting in a total of 14 incentive forms.
  33. Please note that these percentages represent the share of prize types awarded out of the total sample and do not represent response rates. Response rates by demographic subgroup are not possible to calculate given that we did not have access to this demographic data in the sample files for this study.