Through the Community College Libraries & Academic Support for Student Success (CCLASSS) project, we examined student goals, challenges, and needs from the perspective of the students themselves. The project’s goal was to provide community colleges and their libraries with strategic intelligence on how to adapt their services to most effectively meet student needs.
The project was led by Northern Virginia Community College and Ithaka S+R, with support from the Institute of Museum and Library Services (IMLS) and partnership from six community colleges: Borough of Manhattan Community College, Bronx Community College, LaGuardia Community College, Monroe Community College, Pierce College, and Queensborough Community College. Our work, which spanned 2017 to 2019, centered on three high-level research questions:
- How do students define success?
- What challenges are they facing?
- What might help them succeed?
The project was broken into three phases:
- Qualitative Discovery. Through in-depth, in-person qualitative interviews conducted on campus, we learned about the practices, preferences, and needs of community college students and their relationship to success.
- Service Concept Development. We developed a series of concepts for services that colleges and their libraries might seek to offer based on the discovery phase.
- Survey Assessment. We assessed the potential value of the service concepts through a survey of students.
We have developed the following toolkit of resources so that others can apply the methodologies employed in this project at their own colleges and universities. The toolkit contains the key information required to conduct similar studies, as well as the results from each phase of the project, including:
- Institutional Review Board (IRB) application template, based on the application developed by Jean Amaral from Borough of Manhattan Community College. To conduct your own research, you will likely need to obtain formal permission from your institution’s IRB.
- Qualitative interview script used for our interviews with students
- Survey questionnaire employed for evaluating service concepts with students
- Survey communication templates including text for email invitations, reminders, and incentive information
- Dataset deposit coming soon. We have deposited the survey dataset with the online data repository ICPSR, where it is currently being processed.
Phase I: Qualitative Discovery
In the first phase of the project, we conducted semi-structured interviews with students at each of the seven partner community colleges. The interviews were conducted primarily in person, on campus at each partner college, in order to learn about student practices, preferences, and unmet needs. The population for this phase of the project was currently enrolled students, including credit and workforce students; non-credit, degree, credential, and continuing education students; and in-person and online students. Students under the age of 18 were excluded from the study. Approximately 500 students from each college were randomly selected and invited via email to participate in the interviews. For institutions with multiple campuses, the sample remained proportional to the enrollment on each campus.
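For those adapting this sampling approach, the sketch below illustrates one way to draw a random invitation sample that stays proportional to enrollment across campuses. It is a minimal illustration only: the record format (a "campus" and "email" field per student) and the 500-invitation target are assumptions for the example, not a prescription of how the project's samples were actually drawn.

```python
import random
from collections import defaultdict

def proportional_sample(students, total_invites=500, seed=None):
    """Randomly select invitees so that each campus's share of the sample
    roughly matches its share of enrollment.

    `students` is a list of dicts with (hypothetical) keys "email" and "campus".
    """
    rng = random.Random(seed)

    # Group currently enrolled students by campus.
    by_campus = defaultdict(list)
    for student in students:
        by_campus[student["campus"]].append(student)

    total_enrolled = len(students)
    invitees = []
    for campus, group in by_campus.items():
        # Campus quota = campus share of enrollment * total invitations.
        # Rounding means the final count may differ slightly from the target.
        quota = round(len(group) / total_enrolled * total_invites)
        invitees.extend(rng.sample(group, min(quota, len(group))))
    return invitees
```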
As an incentive for participating, students who completed an interview received a $50 Amazon gift card. We recorded the interviews with an Olympus WS-853 voice recorder and worked with a third-party company, Transcript Divas, to transcribe the recordings. The transcripts were then anonymized and coded in NVivo using a grounded theory approach.
It is important to note that our interview script intentionally did not mention the library or any other specific academic department, in an effort to minimize bias toward particular units or services at each college. Additionally, when possible, the email invitations were sent under the signature of the college provost, vice president of academic affairs, or equivalent, and most interviews were held outside of the library, in an office, classroom, or other academic space elsewhere on campus.
Our findings from this phase of the project are described in Amplifying Student Voices: The Community College Libraries and Academic Support for Student Success Project.
Phase II: Service Concept Development
In the second phase of the project, we collaborated with each of the partner colleges to develop a series of concepts for services that community colleges and their libraries might seek to offer. The service concepts were deeply grounded in findings from the interviews in order to help meet student-identified needs, and were then assessed via a survey in the third and final phase of the project. The service concepts we developed include: the aid of a social worker; the ability to borrow and access various technologies; child care services; civic engagement opportunities; workshops on digital privacy issues; access to a single point of contact for navigating college services broadly; access to a personal librarian; and opportunities to display and share coursework.
What Do Our Users Need? An Evidence-Based Approach for Designing New Services outlines how we developed the service concepts.
Phase III: Survey Assessment
In the third phase of the project, we surveyed students at each of the seven partner colleges to examine their personal and professional goals, needs, and challenges. We also assessed the value of and demand for each of the proposed service concepts, which could later be implemented at the partner colleges. The findings from this stage of the project are included in Student Needs Are Academic Needs: Community College Libraries and Academic Support for Student Success. The population for this survey, like that of the earlier interviews, comprised community college credit and non-degree/workforce students who were 18 years of age or older. The survey was pre-tested to ensure that it was understood clearly and consistently across respondents; eight in-depth cognitive interviews were conducted with students from two of the colleges involved in the project. We describe this process in Employing Cognitive Interviews for Questionnaire Testing.
The finalized survey was fielded during the fall semester via the survey platform Qualtrics, with each of the partner colleges working closely with us to distribute a customized instance of the survey that carried their institutional logos, colors, and other branding. Using a tool that enables this level of customization helps the survey appear authentically associated with the institution.
Survey administration
For each of the surveys, we developed standardized messages, including introductory text, incentive form text, and survey invitation and reminder text, that allowed for the customization of certain fields, such as the institution name, incentive offering, and signatory. The participating colleges worked with us to send the survey invitations under the signature of the college provost, vice president of academic affairs, or equivalent.
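For readers preparing their own communications, the sketch below shows one way to keep a single standardized message with substitutable fields for the institution name, incentive, and signatory. The wording and field names are illustrative placeholders, not the project's actual invitation text.

```python
from string import Template

# Illustrative invitation template; the wording and fields are hypothetical,
# not the message text used in this project.
INVITATION = Template(
    "Dear $institution student,\n\n"
    "You are invited to participate in a brief survey about your goals and "
    "needs as a student. Participants will be entered to receive $incentive.\n\n"
    "Sincerely,\n$signatory"
)

# Fill in the customizable fields for one institution.
message = INVITATION.substitute(
    institution="Example Community College",
    incentive="a $50 gift card",
    signatory="Jane Doe, Vice President of Academic Affairs",
)
print(message)
```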
Surveying Community College Students: Strategies for Maximizing Engagement and Increasing Participation provides guidance on administering surveys to this population. This report contains an overview of effective practices for surveying college students, with a special focus on methods for contacting students, distributing messaging, and incentivizing participation. When conducting similar studies, keep the following in mind:
- Take a proactive approach in the early stages of a project to help ensure successful survey administration.
- Gather information on whether students primarily use an institutional or an alternative email address; participation may vary based on which approach is employed.
- Determine an incentive that resonates with students without biasing their responses; this is important for boosting response rates. We found that Amazon and Visa gift cards resonated with students at about the same level.
Technical survey requirements
We used the third-party survey platform Qualtrics to implement these surveys, which allowed for sophisticated display and skip logic so that respondents did not receive every question in the survey. Of the eight service concepts, each respondent was shown four randomly selected concepts in order to reduce survey duration. If the respondent indicated that a service concept would be of any value, they were presented with a series of follow-up questions about that concept. If the respondent indicated that the concept was not valuable at all, the additional questions were skipped and the next randomly ordered concept was displayed. By limiting the number of questions displayed to respondents and maximizing the relevance of follow-up questions, we reduced the risk of survey fatigue and drop-outs. If you are considering another platform for a survey like ours, keep these functionality requirements in mind when deciding on a specific tool; a sketch of the underlying logic follows.
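The following sketch restates that randomization and skip logic outside of any particular survey platform, which may help when evaluating whether another tool can reproduce it. The concept labels, response options, and the `ask` callable are stand-ins for illustration; they do not reflect the exact question wording or Qualtrics configuration used in the project.

```python
import random

# Abbreviated labels for the eight service concepts described above.
CONCEPTS = [
    "social worker", "technology lending", "child care", "civic engagement",
    "digital privacy workshops", "single point of contact",
    "personal librarian", "showcasing coursework",
]

def administer_concept_block(ask, concepts=CONCEPTS, per_respondent=4, rng=random):
    """Show each respondent a random subset of concepts; ask follow-ups only
    when a concept is rated as having any value.

    `ask(question)` is a stand-in for however the survey platform collects a
    response; it should return the respondent's answer as a string.
    """
    responses = {}
    # Randomly select (and thereby randomly order) the concepts this
    # respondent will see, to limit survey duration.
    shown = rng.sample(concepts, per_respondent)
    for concept in shown:
        value = ask(f"How valuable would '{concept}' be to you? (none/some/high)")
        responses[concept] = {"value": value}
        if value == "none":
            # Skip logic: no follow-ups for a concept rated not valuable at all.
            continue
        responses[concept]["follow_up"] = ask(
            f"How likely would you be to use '{concept}'?"
        )
    return responses
```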
This toolkit is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. To view a copy of the license, please see http://creativecommons.org/licenses/by-nc/4.0/.