Executive Summary

For many years, higher education data collection and funding efforts have focused on student success metrics like enrollment, graduation, retention, and course completion rates. At the same time, higher education leaders have become increasingly aware—in part because of the COVID-19 pandemic—of the vast array of challenges outside of the classroom that prevent college students from fully succeeding.

To shed light on the challenges and opportunities associated with the collection and prioritization of a broader set of student success metrics, especially those focused on a more holistic set of student experiences and challenges like food and housing security, we surveyed community college provosts across the country in fall 2020. Our report examines national provost perspectives on college priorities and influencing factors, traditional data collection practices, emerging data collection processes on student basic needs, and the role of data disaggregation for advancing equity.

Key Findings

  • Provosts emphasize academic and business priorities in determining institutional success metrics. The most important objectives for community college provosts are increasing retention, graduation, course completion, and enrollment. Data collection efforts for measuring and evaluating progress toward these most commonly held objectives are fairly universal.
  • Social justice imperatives have become increasingly important over the last several years as institutions look to close achievement gaps. The share of provosts that see addressing social justice imperatives as highly important has more than doubled since a prior survey in 2019. Institutions are often looking to disaggregate data by race/ethnicity, socioeconomic status, enrollment status, and first-generation status to support equitable student outcomes.
  • Provosts see a strong connection between improving student well-being and bolstering institutional performance and funding. Provosts believe that their institutions should play a role in supporting students with basic needs insecurity through social services provision. Many would like to know more about holistic student needs, as colleges are not currently collecting these data uniformly.
  • Given limited resources and infrastructure, most community colleges have limited capacity to collect additional data beyond what is currently required by funding and compliance agencies. When data are collected about basic needs fulfillment, student engagement, and student health, it is largely the student affairs office that does so. Data collection related to curricular outcomes, like graduation and enrollment rates, is typically shared between student and academic affairs departments.

Introduction

Across the higher education sector, there is growing interest in better understanding and serving students holistically to attend to their entire experience—both as students inside the classroom and as people with complex lives outside of their involvement with the college. Higher education institutions, in particular community colleges, are increasingly focused on far more than the academic objectives and needs of their students and are building supports to increase well-being, engagement, and basic needs security.

These relatively newer priorities map directly to the many challenges that students face throughout their academic experience, which in recent years have been increasingly documented through both student testimonies and mounting empirical research. We have found via interviews and surveys of community college students that their most pressing challenges are often related to their ability to balance academic responsibilities with family, household, and work responsibilities, as well as the ability to pay for their most basic needs like food, housing, and technology.[1] Although many students faced these challenges prior to the pandemic, they were amplified in the face of campus closures, job losses, and personal grief and loss. For instance, almost 44 percent of students at two-year colleges were affected by food insecurity in the months following March 2020, and 11 percent experienced homelessness during the pandemic.[2]

While these holistic student needs—which we define as those that reflect the student and their experience as a whole—are increasingly recognized as important for higher education leaders to address, there is still much to be done to move the needle towards greater collection and prioritization of metrics that support holistic student success. Community college leaders typically use metrics like graduation, course completion, and retention rates to quantify success, as these data points are most often tied to external mandates and funding incentives. Data on holistic student needs and success are less likely to be collected at the institutional level and subsequently are not prioritized for action to the same degree.

To provide the higher education community with insights on how data regarding holistic student needs—in particular, basic needs like food and housing security—have been collected and prioritized across the country, we launched a national study of provosts in late 2020. This survey is part of the broader Holistic Measures of Student Success initiative at Ithaka S+R through which we have inventoried community college data collection processes and reporting mandates and interviewed leaders in institutional research and effectiveness offices to gather perspectives on challenges and opportunities associated with new data collection practices.[3] Later this year we will conclude the Holistic Measures of Student Success project with a report containing a number of recommendations for community colleges, and the organizations to which they report data, on how student basic needs data can be more effectively gathered and prioritized.

Methodology

The population for this survey included provosts, chief academic officers (CAOs), vice presidents of academic affairs (VPAAs), and those in equivalent positions at not-for-profit, two-year colleges and associate’s dominant four-year institutions across the United States.[4] These individuals were surveyed because of their role in shaping and bolstering positive student outcomes at their college and their responsibility for overseeing data collection processes and the departments that determine institutional and student success. In an effort to streamline our reporting, we will refer to respondents within this report as provosts, recognizing that these titles vary across institutions.

Contact information for the survey sample was gathered by the research team through an iterative process that involved creating a list of applicable colleges and combing through institutional websites to identify the official in the most senior role within each college’s academic affairs department or equivalent. Overall, 70 percent of the sample indicated that their title aligns with provost, CAO, VPAA, or equivalent; 17 percent are vice presidents of both academic and student affairs; and three percent are the presidents of their college.[5]

Prior to the survey launch, the questionnaire was tested via five in-depth cognitive interviews with provosts in September and October 2020 to ensure that the survey instrument was understood in a clear and consistent fashion across respondents.[6] In an effort to incentivize and thank respondents for their participation, Ithaka S+R donated two dollars to the emergency student relief fund Believe in Students for each survey response and offered participants an invitation to a pre-release webinar of survey findings.[7]

The survey was fielded between October and December 2020 and distributed under the signatures of Ithaka S+R personnel and community college presidents. Overall, the sample for the survey included 1,080 community college provosts, with 128 completing the survey for a response rate of 12 percent.[8]
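
As a quick check on these figures, the response rate is 128/1,080 ≈ 12 percent, and the margin of error reported in endnote 8 can be approximated with the standard simple random sampling formula; this is a sketch that assumes maximum variability (p = 0.5) and a 90 percent confidence level (z ≈ 1.645), neither of which is stated explicitly in the report:

\[
\mathrm{MOE} \;=\; z\,\sqrt{\frac{p(1-p)}{n}} \;=\; 1.645\,\sqrt{\frac{0.5 \times 0.5}{128}} \;\approx\; 0.073 \;\approx\; 7\%.
\]

Applying a finite population correction for the sample of 1,080 would lower this only slightly (to roughly 6.8 percent), so the approximately seven percent figure holds either way.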

We employed a variety of descriptive techniques to analyze the data for this report, such as frequency analyses, cross-tabulations, and relevant correlations. Due to the limited sample size, inferential analyses were not conducted, and we stratified the sample only by variables for which group sizes across categories were roughly equal. On average, the colleges represented by the respondents have a graduation rate within 200 percent of normal time to degree of 34 percent (SD = 13), a full-time retention rate of 62 percent (SD = 9), a part-time retention rate of 44 percent (SD = 9), and a transfer-out rate of 18 percent (SD = 9).[9] Results throughout the report are occasionally broken down by institutional size: 32 percent of the sample are from small or very small colleges, 38 percent from medium-sized colleges, 23 percent from large or very large colleges, and the remaining seven percent from four-year, primarily nonresidential colleges.[10]
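
For readers who want to see what these descriptive techniques look like in practice, the sketch below illustrates them in Python with pandas. It is purely illustrative: the file name and column names (e.g., inst_size, retention_importance, disaggregation_agreement) are hypothetical placeholders, not the actual variables in the survey dataset.

```python
import pandas as pd

# Hypothetical survey extract; the file and column names below are
# illustrative placeholders, not the actual Ithaka S+R survey variables.
df = pd.read_csv("provost_survey_2020.csv")

# Frequency analysis: share of provosts selecting each importance rating
freq = df["retention_importance"].value_counts(normalize=True).mul(100).round(1)
print(freq)

# Cross-tabulation: interest in basic needs data, broken down by institutional size
xtab = pd.crosstab(
    df["inst_size"],                   # small / medium / large
    df["interest_basic_needs_data"],
    normalize="index",
).mul(100).round(1)
print(xtab)

# Correlation: agreement that the college robustly disaggregates data vs. the
# number of basic needs areas on which it collects data (cf. endnote 19)
r = df["disaggregation_agreement"].corr(df["basic_needs_data_count"])
print(f"Pearson r = {r:.3f}")
```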

Acknowledgements

We thank ECMC Foundation, in particular our program officer Angela Sanchez, for supporting this project as well as the learning community established by Education Northwest across the Basic Needs Initiative cohort of grant recipients.

We also thank our external advising committee for their expert guidance and support throughout the project:

  • Kimberly Cacciato, Student, The College of New Jersey
  • Cara Crowley, Vice President of Strategic Initiatives, Amarillo College
  • Linda García, Executive Director, Center for Community College Student Engagement
  • Pam Eddinger, President, Bunker Hill Community College
  • Wayne Taliaferro, Strategy Officer for Student Success at Community Colleges, Lumina Foundation
  • Paula Umaña, Director of Institutional Transformation, Hope Center for College, Community, and Justice

We are immensely grateful to Ithaka S+R colleagues Nicole Betancourt, Kimberly Lutz, and Maya Godbole for their input on this report. This project would not be possible without their substantive contributions.

College Priorities and Influencing Factors

Provosts are very much focused on increasing student retention, graduation, course completion, and learning, consistent with prior study findings from Ithaka S+R in 2019 (see Figure 1).[11] These objectives considerably align with traditional outcome-based metrics—like retention rates, graduation rates, and course completions—that colleges collect and report on for funding, benchmarking, and accreditation.[12] In recent years, these outcomes have been further prioritized by the growing number of performance-based funding (PBF) models within state systems, which typically use traditional outcome metrics in their funding formulas.[13] The power of these funding models to shape data collection is also evident in the high degree of influence provosts attribute to accreditors and state departments of education and college systems in determining success metrics, as discussed later in this section.

Although most college priorities have not changed substantially since our last survey of provosts in 2019 (see Figure 1), we did see a notable increase in the prioritization of social justice imperatives. In fact, the share of provosts who indicated that social justice imperatives are extremely important more than doubled in the course of roughly a year; in 2019, less than a quarter rated them as extremely important, compared to nearly six in ten in 2020. This significant increase may be a response to the ways in which the pandemic has revealed longstanding and worsening inequities within higher education. Renewed demands for racial justice led by the Black Lives Matter movement and heightened calls to dismantle systemic racism within higher education most likely also played a role in this increased prioritization of social justice imperatives.

Community colleges, intuitively, are collecting and prioritizing data aligned with their most important objectives. All or almost all respondents rated traditional metrics of success—graduation rates, retention rates, and course completions—as highly important for determining student success (see Figure 2). Despite this strong focus on traditional metrics, it is notable that more than two-thirds of provosts consider more holistic metrics—like those centered on housing or food security and well-being—to be highly important in determining student success. Eight in ten rated basic needs fulfillment as highly important, 75 percent rated student physical and mental health as highly important, and 63 percent said the same of student engagement. Of course, there is a relationship between many of these holistic metrics and how they contribute to traditional indicators of student success; how provosts view this relationship will be discussed later in this report.

As mentioned previously, data collection agencies frequently influence the metrics collected by community colleges because of the impact these metrics have on funding and accreditation outcomes. Almost nine in ten provosts rated accreditors as highly influential in shaping metrics used in determining student success, followed by 81 percent who consider their state’s department of education and/or college system similarly influential (see Figure 3). National data collection organizations are less often influential; 69 percent of provosts rated the Integrated Postsecondary Education Data System (IPEDS) as highly influential, and 57 percent rated the National Student Clearinghouse (NSC) as highly influential. While accreditors and states are likely to use many of the same metrics for which IPEDS and NSC set standards, accreditors and states clearly have greater influence over how institutions shape their data collection processes because the decisions made with those data affect funding and accreditation.

As data collection is predominantly driven by incentives tied to funding and compliance, making changes to college-wide, and especially cross-institutional, data collection practices may require substantial extrinsic motivators. In the following sections, we explore perspectives on current and emerging data collection practices and what it might take—behaviorally and attitudinally—to shift colleges toward greater collection and prioritization of holistic student success metrics.

Institutional Data Collection Processes

It has been well-documented that many, if not most, institutional data are maintained and analyzed by institutional research and institutional effectiveness (IR/IE) departments.[14] These departments were at times informally referenced by our provost survey respondents in open-ended comments as the “champion” of their college’s data. IR/IE departments have historically most often reported directly to the college’s provost, though these reporting lines tend to be less consistent in the community college sector.[15] Forty percent of our community college provost respondents indicated that their IR/IE department reports directly to the provost, followed by 30 percent to their president’s office or chief executive officer (CEO), and the remaining portion to student affairs, business affairs, information technology (IT), or to a vice president of strategic initiatives or equivalent.

However, while student data are frequently centralized and maintained through IR/IE offices, many different departments generally contribute to the collection of these data. Individuals in academic and student affairs alike share data collection responsibilities, especially for metrics associated with funding, benchmarking, and compliance (see Figure 4). Metrics that are collected most often across the two are student demographic data, student enrollment, graduation rates, and retention rates.

Academic affairs, which generally includes both the provost and the IR/IE office, tends to lead data collection on course completions and post-graduation employment, though there is a good deal of shared responsibility with student affairs for these efforts as well. For instance, 40 percent of provosts indicated course completion metrics are collected by academic affairs alone, compared to 46 percent who indicated these data are collected by both academic and student affairs.

Substantial data collection is also led by student affairs, especially for metrics related to holistic student success. Although 80 percent of provosts rated metrics of student basic needs as highly important for determining student success, student affairs more often collects these data than the academic affairs enterprise, which reports to the provost (see Figure 4). We also found that nearly eight in ten provosts agree that improving student well-being, through basic needs fulfillment and engagement, will lead to tangible funding incentives. Thus, greater centralization of these holistic data points—which most often are collected by student affairs alone—may be merited for moving the needle on more traditional outcomes linked to funding.

When collecting and consulting data to enhance student success, colleges rely most often on surveys of current students and evaluations of the impact and value of current support services. However, some provosts expressed concerns with these data collection efforts given the impact of the pandemic. For instance, some provosts commented that they are concerned about survey response rates, sharing “It is tougher to get student participation in surveys,” and “Overwhelmed students [are] not completing surveys.” The pandemic has also made it more difficult to assess the impact of current services, as another provost mentioned: “We are no longer able to collect student engagement with our services areas in the same way. More challenging because students are not visiting offices, but call centers.”

Overall, surveys of current students are considered relatively more important than surveys of prospective students or students who have left the college. A little under 90 percent of provosts rated current student surveys as highly important sources of data for enhancing student success, followed by 52 percent for surveys of students who have left the college and 44 percent for surveys of incoming students. While it is generally much easier to survey current students than students who have not yet arrived or who have left, these pre- and post-college surveys could serve as an important method for documenting student goals early in their college experience and measuring the attainment of those goals after they leave.

Student Goal Setting

Defining, tracking, and measuring students’ self-defined goals—that is, goals that students have created for themselves—can help ensure students are not only meeting institutional objectives, but their own as well. As gleaned from a previous Ithaka S+R study comprising dozens of interviews and a survey of over 10,000 students across several community colleges, students enroll in college with a vast array of goals—from attaining a degree to gaining knowledge to building professional and personal experiences and relationships.[16] Given the ways in which these goals tend to be broader and more varied than how institutional leadership has traditionally defined student success, we decided to ask provosts in this survey a series of questions about how student goals are captured at their institutions.

Overall, many colleges are not currently documenting students’ self-defined goals to the same degree that they have measured other traditional indicators of success. A little under three in ten provosts somewhat or strongly agreed that their college has developed a comprehensive process for documenting student goals as defined by students themselves, and 35 percent agreed that their college comprehensively documents the extent to which students achieve their self-defined goals. Given that gathering and tracking student-defined goals requires a fairly nuanced, labor-intensive, and likely qualitative approach, it is perhaps unsurprising that most colleges have not found a way to scale this practice.

However, respondents who indicated their college does comprehensively track student goals were asked to describe this process further. Many mentioned that student goals are agreed upon when a student starts at the college, whether through admissions intake forms, first-year experiences, and/or meetings with academic or faculty advisors. These goals are then managed through iterative meetings with those advisors, or through virtual, licensed resources like degree audit programs and Tableau dashboards. One respondent commented that their “students have 24/7 access to a degree audit that identifies the courses needed to complete their programs and advisors use Pharos 360 to document qualitative student meetings and goal attainment.” Another provost shared that their college has “Friday goal calls and Friday college-wide town halls,” and yet another mentioned their students have an “extensive meeting with [an] advisor upon enrollment and follow up meetings each term.” Lastly, through its partnership with Achieving the Dream, a non-profit community college membership organization, one college developed a working student success center where students create an individual career plan upon enrollment. While documenting and tracking student goals may seem like a fairly straightforward process based on these examples, it can be challenging to scale given the small number of personnel available to track these goals relative to the number of students. However, measuring attainment of these self-defined goals can supplement and contextualize institutionally prioritized outcome metrics of success.

Identifying and Meeting Needs Holistically

As discussed previously in this report, data on holistic student needs and outcomes, such as having adequate food, housing, health, and technology, are not collected as centrally or prioritized to the same degree as other more traditional variables like graduation and course completion. And yet, according to a large-scale survey of community college students, their biggest challenges are often those outside of the classroom—these include balancing their personal and professional responsibilities with their academic ones and financially meeting their basic needs in conjunction with their educational expenses.[17] The pandemic has certainly amplified these challenges, and as a result, provosts have displayed an increased interest in understanding and gathering data on non-curricular needs. For instance, as one provost in the survey described, “COVID has brought about a newly heightened sense of urgency across the college to collect, understand, and respond to student data, there is more understanding of the influence of life outside of the classroom to learning inside of the classroom.”

Mitigating these challenges can help to ensure that students will be more successful at both achieving traditional performance outcomes and fulfilling their self-defined goals. For these reasons, it can be beneficial to both the student and college to identify and address students’ unmet needs outside of the classroom. To that end, we asked provosts about the holistic data their college is currently collecting and the extent to which they are meeting holistic student needs.

In general, provosts recognize the benefits of meeting students’ basic needs and see meeting these needs as part of their institution’s responsibility. Three-quarters of respondents recognize that higher education institutions should play a role in helping students meet their basic needs through social service provision; in the survey, the overwhelming majority strongly disagreed with the statement that it is not higher education’s responsibility to provide these services to students. Additionally, almost eight in ten somewhat or strongly agreed that enhancing student well-being can lead to tangible funding incentives. This signals that provosts see the value of incorporating holistic metrics of student success into their data collection. But, as we discuss later in this report, their inability to collect these data may be due to inadequate resources and infrastructure.

Although large shares of provosts agree that meeting student basic needs will lead to tangible results, a smaller share is currently tracking these needs. The holistic student data most often collected by colleges include disability needs and technology/Wi-Fi access, followed by short-term or emergency financial aid needs, student engagement, and physical safety (see Figure 5).

Most institutions provide services and resources for disability needs, as they must make reasonable accommodations for students with disabilities in accordance with the Americans with Disabilities Act of 1990.[18] As data on students with disabilities are also required by IPEDS, state data collection systems, and compliance organizations, it is unsurprising that colleges are collecting considerable data on students with disabilities (see Figure 5). It is notable that 48 percent and 44 percent are collecting data on technology and Wi-Fi needs and short-term/emergency financial aid respectively, which may reflect how the pandemic amplified these needs through severe job losses around the country and the shift to digital instruction. Additionally, colleges collect financial data through the Free Application for Federal Student Aid (FAFSA) as well as through other financial assistance programs such as the Supplemental Nutrition Assistance Program (SNAP) and TRIO programs.

The amount of data currently collected on different student needs tends to align with the services and resources most often offered for those needs—though larger shares of colleges are generally providing support for these needs than are collecting data on them (see Figure 5). This signals that perhaps colleges are looking to external data sources to determine the extent of student needs and have chosen to provide services and resources before more formalized data collection processes have been implemented. It is also possible, given the limited infrastructure and funding to expand data collection, that colleges are allocating resources towards service provision rather than data collection processes.

Overall, colleges most often provide resources and services to support students with disabilities and to meet student technology and Wi-Fi needs. More than 70 percent of provosts indicated they provide services in these areas (see Figure 5). Meanwhile, around half of colleges provide services geared towards short-term/emergency financial aid, engagement, safety, and food security. Less than a quarter provide resources and services for transportation, belonging, housing security, and caregiving.

Additionally, larger colleges tend to collect more data on student basic needs compared to small or medium sized colleges—potentially due to the additional infrastructure, funding, and resources these colleges have in relation to their smaller counterparts (see Figure 6). Greater shares of large colleges specifically collect data on student disability, short-term or emergency financial aid, as well as food and housing security needs compared to small and medium sized colleges. However, greater shares of smaller colleges collect data on student technology access and Wi-Fi connectivity needs than medium or large colleges.

Disaggregating to Advance Equity

Disaggregating student data is an important way to understand student needs and promote equitable student outcomes, and it has become an increasingly important component of institutional data collection processes. This practice can bolster equity across students by identifying student subgroups with greater needs and challenges in specific areas, allowing institutions to make data-informed decisions to better serve these students. As such, we asked provosts which demographic areas are the most important to disaggregate and their interest in knowing more about different student subgroups.

Two-thirds of provosts either somewhat or strongly agree that their college robustly disaggregates student data to pinpoint subgroups of students with differing levels of need. Further, colleges that collect greater shares of data on student basic needs are also generally more likely to disaggregate their data, particularly for student housing needs, food security, mental health and well-being, caregiving needs, and belonging.[19] Thus, the collection of data on holistic student needs is related to more equitable analysis practices.

Overall, when asked to select the five most important characteristics by which their colleges can disaggregate data, three-quarters of provosts identified socioeconomic status (SES) and students’ race/ethnicity, followed by 64 percent who selected part-time and full-time enrollment status, and 52 percent for parent education level or first-generation status (see Figure 7). In the comments to the survey, some provosts mentioned a heightened interest in disaggregating data to focus on racial equity given recent national movements for racial justice. One provost mentioned their college “furthered all disaggregation efforts, centering the experience of our students of color,” and another commented their college has “been disaggregating for a while, [and are] acting more on the data now, confronting faculty with it.” As SES and race/ethnicity are the most important variables for colleges to disaggregate, it makes sense that colleges are also the most highly interested in knowing more about student needs in these areas.

Provosts want to know even more about many different student subgroups, with over 80 percent highly interested in race/ethnicity and SES, 79 percent in part-time/full-time enrollment status and first-generation status, and 76 percent in students at different levels of credit accumulation (Figure 7). However, there are a number of student characteristics where there is relatively less existing effort toward disaggregation and little interest in expanding these efforts, including parental/caregiver status, gender identity, and sexual orientation. It is notable, however, that students identified through these demographics have often been in greater need of support for their basic needs. For instance, at the beginning of the pandemic, a survey of around 16,000 students revealed that transgender and non-binary students, as well as students who are caregivers, reported greater concern about their mental and physical health compared to their peers.[20] Other studies of student parents and caregivers have also observed high rates of challenges with securing food and housing.[21] Providing greater support to these student subgroups will require identifying relevant students and their needs through targeted data collection or disaggregation.

Expanding Holistic Student Success Metrics

While current data collection efforts support more traditional, institutionally focused metrics, college leaders are interested in knowing about more holistic, student-centered metrics as well. In the survey, we asked provosts about the types of holistic student needs they are most interested in and the primary constraints on their ability to expand current data collection processes.

Overall, large shares of provosts are highly interested in a number of holistic student needs. Around three-quarters of respondents are extremely or very interested in knowing more about student engagement, food security, mental health and well-being, need for short-term or emergency financial aid, and technology access and Wi-Fi connectivity needs (see Figure 8). Broken down by institutional size, greater shares of provosts at medium and large colleges are interested in a range of holistic student needs than at small colleges (see Figure 9). This could be due to the already limited infrastructure at smaller colleges to expand their data collection processes, as discussed later in this section.

Provosts are unsure about how to adjust their data collection processes, particularly if this requires collecting more data without additional resources. Given the volume of data already collected by institutions for funding and compliance, additional data collection may overwhelm the limited resources of institutional research or institutional effectiveness (IR/IE) departments and create additional strain if there is insufficient data infrastructure to collect these metrics. As such, expanding data collection efforts towards greater inclusion of holistic metrics may necessitate streamlining and/or reducing the number of metrics currently collected. However, it is notable that provosts are not sure if they want to reduce the number of different metrics their college collects on student success—31 percent somewhat or strongly agree they are interested in reducing the number of student success metrics collected, 30 percent neither agree nor disagree, and the remaining 40 percent strongly or somewhat disagree.

Accordingly, when asked about the primary challenges for expanding data collection toward these holistic student metrics, provosts most often select lack of human resources within IR/IE departments, followed by inadequate digital data infrastructure like new platforms and software, and a lack of capacity to incorporate new metrics (see Figure 10). About 13 percent of respondents described additional constraints, citing lack of time, the “lack of funding to enact some of these measures,” and the proliferation of metrics to choose from.

Additionally, while similar shares of provosts from colleges of different sizes selected lack of IR/IE resources as a primary constraint, 54 percent of provosts at small colleges selected inadequate digital data infrastructure as a primary constraint, compared to 45 percent of provosts at large colleges and 35 percent at medium-sized ones. To mitigate these challenges at smaller institutions, one provost mentioned their college “collect[s] a lot of data at our system level and it is shared back to our campuses, this helps with our lack of personnel to work on data collection as we are a very small campus.” Thus, data collection infrastructure appears to be a relatively more significant challenge for smaller institutions.

A possible step in expanding data collection processes, given these limitations, is to review currently collected indicators of holistic student success, reduce duplicative data collection across different departments, centralize where data are stored, and make these unified data available to faculty and staff across the college. Indeed, seven in ten provosts already strongly or somewhat agree that their college has a well-developed culture of regularly sharing data on students with faculty and staff.

Final Thoughts

Collecting data on unmet needs is often an important step in addressing those needs—and large-scale surveys, especially in the community college sector, indicate that students are facing food insecurity and housing insecurity, and are especially struggling to meet these and additional basic needs in light of the global pandemic.[22]

Through this national survey of community college provosts, we see that higher education leaders want to expand and adjust their institutional data collection to incorporate more holistic metrics of student success. However, while interest is high, and provosts are aware of the benefits that tracking and meeting these needs bring to their institutions, the challenge now lies in prioritizing these metrics and developing centralized data collection processes in the face of limited infrastructure and limited external incentives to do so.

Currently, traditional metrics of student success are typically collected across both academic and student affairs and housed by institutional research and effectiveness departments, yet the majority of holistic metrics are siloed within student affairs departments. As many colleges are attempting to meet holistic needs, especially considering the amplification of these needs throughout the pandemic, it may be beneficial to centralize holistic data alongside more traditional metrics. Disaggregating these data to identify particular subgroups facing increased challenges and sharing these data openly with faculty and staff can help lead to streamlined holistic measurement while managing limited resources, capacity, and infrastructure. These steps are important for local change to take place.

And while there is leadership buy-in for local data expansion efforts to support holistic student needs, significant change across higher education—change that involves taking these communications of commitment and turning them into action—will also likely require external mandates and incentives. As one provost mentioned in a comment to the survey, “So much is required at the state level that policy change could drive/push us to more holistic metrics.” As the metrics that colleges use to determine student success are most often influenced by their states and accreditors, there needs to be greater collection and prioritization at the state, regional, and national level.

Building on these findings, later this year we will produce a set of recommendations for higher education institutions and data collection organizations on local and large-scale strategies for greater prioritization of holistic student success metrics. We look forward to learning from and sharing with the community as our work continues.

Endnotes

  1. Melissa Blankstein, Christine Wolff-Eisenberg, and Braddlee, “Student Needs are Academic Needs: Community College Libraries and Academic Support for Student Success,” Ithaka S+R, September 2019, https://doi.org/10.18665/sr.311913.

  2. Sara Goldrick-Rab, Vanessa Coca, Gregory Kienzl, Carrie R. Welton, Sonja Dahl, and Sara Magnelia, “#RealCollege During the Pandemic: New Evidence on Basic Needs Insecurity and Student Well-Being,” The Hope Center for College, Community, and Justice, 2020, https://www.luminafoundation.org/wp-content/uploads/2020/07/evidence-of-basic-needs-insecurity.pdf.

  3. Melissa Blankstein and Christine Wolff-Eisenberg, “Measuring the Whole Student: Landscape Review of Traditional and Holistic Approaches to Community College Student Success,” Ithaka S+R, September 2020, https://doi.org/10.18665/sr.313888; Melissa Blankstein, “Student Success, Basic Needs, and the COVID-19 Pandemic: Institutional Research Perspectives on Holistic Student Success Metrics,” Ithaka S+R, October 2020, https://sr.ithaka.org/blog/student-success-basic-needs-and-the-covid-19-pandemic/.

  4. The list of colleges included within the sample was determined by Carnegie Classification and pulled from the Integrated Postsecondary Education Data System based on data from the 2017 academic year. The following Carnegie Classifications are those that make up our sample: Associate’s Colleges: High Transfer-High Traditional; Associate’s Colleges: High Transfer-Mixed Traditional/Nontraditional; Associate’s Colleges: High Transfer-High Nontraditional; Associate’s Colleges: Mixed Transfer/Career & Technical-High Traditional; Associate’s Colleges: Mixed Transfer/Career & Technical-Mixed Traditional/Nontraditional; Associate’s Colleges: Mixed Transfer/Career & Technical-High Nontraditional; Associate’s Colleges: High Career & Technical-High Traditional; Associate’s Colleges: High Career & Technical-Mixed Traditional/Nontraditional; Associate’s Colleges: High Career & Technical-High Nontraditional; Special Focus Two-Year: Health Professions; Special Focus Two-Year: Technical Professions; Special Focus Two-Year: Arts & Design; Special Focus Two-Year: Other Fields; and Baccalaureate/Associate’s Colleges: Associate’s Dominant.

  5. The remaining respondents in the sample wrote in their own unique titles that align with the highest-ranking administrator within academic affairs, such as Dean of Academic and Student Services.

  6. Christine Wolff-Eisenberg, “Employing Cognitive Interviews for Questionnaire Testing: Preparing to Field the US Faculty Survey,” Ithaka S+R, June 2018, https://sr.ithaka.org/blog/employing-cognitive-interviews-for-questionnaire-testing/.

  7. Following the closure of the survey, each completed and partial response was included in our donation to Believe in Students, resulting in a total donation of $346.00 to the emergency student fund.

  8. Margin of error is 7 percent for n = 128 at the 90 percent confidence interval.

  9. Data were taken from the Integrated Postsecondary Education Data System (IPEDS) as part of the fall 2019 reporting period.

  10. Institutional size categories were determined by the IPEDS variable Carnegie Classification 2015/2018: Size and Setting (beginning 2015-16). Within the analysis, some size and setting categories were merged: “very small, two-year” and “small, two-year” were merged into a “small” category, and “large, two-year” and “very large, two-year” were merged into a “large” category. Additionally, “primarily nonresidential” includes colleges with fewer than 25 percent of degree-seeking students who live on campus and/or fewer than 50 percent who attend full time; these colleges were excluded from stratified analyses due to their low number of respondents compared to other subgroups.

  11. Melissa Blankstein and Christine Wolff-Eisenberg, “Organizing Support for Success: Community College Academic and Student Support Ecosystems,” Ithaka S+R, December 2019, https://doi.org/10.18665/sr.312259.

  12. Melissa Blankstein and Christine Wolff-Eisenberg, “Measuring the Whole Student: Landscape Review of Traditional and Holistic Approaches to Community College Student Success,“ Ithaka S+R, September 2020, https://doi.org/10.18665/sr.313888.

  13. Despite increased implementation of performance-based funding models nationwide, the intended results of these models, such as bolstering student performance and completion, are not yet emerging in early evaluations of these policies. See Robert Kelchen, “Performance-Based Funding Produces Mixed Results,” Education Next: Forum 20, no. 1 (2019), https://www.educationnext.org/performance-based-funding-produces-mixed-results-forum-should-congress-link-higher-ed-funding-graduation-rates/.

  14. Randy L. Swing, Darlena Jones, and Leah Ewing Ross, “National Survey of Institutional Research Offices,” Association for Institutional Research, 2016, https://www.airweb.org/docs/default-source/documents-for-pages/national-survey-of-ir-offices-report.pdf?sfvrsn=1ab5100b_4.

  15. Ibid.

  16. Melissa Blankstein, Christine Wolff-Eisenberg, and Braddlee, “Student Needs are Academic Needs: Community College Libraries and Academic Support for Student Success,” Ithaka S+R, September 2019, https://doi.org/10.18665/sr.311913.

  17. Ibid.

  18. Americans with Disabilities Act of 1990, Public Law 101-336, 42 U.S.C. 12111, 12112.

  19. Agreement that their college robustly disaggregates data to determine subgroup needs is significantly and positively correlated with the amount of data collected on each of the following areas of need: housing security and homelessness, r = .327, p < .001; food security, r = .247, p < .05; mental health and well-being, r = .270, p < .05; caregiving, r = .206, p < .05; and belonging, r = .261, p < .05.

  20. Melissa Blankstein, Jennifer K. Frederick, and Christine Wolff-Eisenberg, “Student Experiences During the Pandemic Pivot,” Ithaka S+R, June 2020, https://doi.org/10.18665/sr.313461.

  21. Sara Goldrick-Rab, Carrie R. Welton, and Vanessa Coca, “Parenting While in College: Basic Needs Insecurity Among Students with Children,” The Hope Center for College, Community, and Justice, May 2020, https://hope4college.com/wp-content/uploads/2020/05/2019_ParentingStudentsReport.pdf.

  22. Sara Goldrick-Rab, Vanessa Coca, Gregory Kienzl, Carrie R. Welton, Sonja Dahl, and Sara Magnelia, “#RealCollege During the Pandemic: New Evidence on Basic Needs Insecurity and Student Well-Being,” The Hope Center for College, Community, and Justice, 2020, https://www.luminafoundation.org/wp-content/uploads/2020/07/evidence-of-basic-needs-insecurity.pdf.