Acknowledgements

This project is generously funded by a US Department of Education First in the World validation grant with additional support from Arnold Ventures. [1] We thank the project principal investigator, Dr. Timothy Renick of Georgia State University, for inviting Ithaka S+R to serve as the independent evaluator and for being an invaluable thought and project partner throughout. We would like to acknowledge the key role of the University Innovation Alliance (UIA), which inspired the project by virtue of its collaborative nature and the commitment of its member institutions to innovate together by being early testers and adopters of new ideas. We thank the leadership and staff of the 11 institutions that participated in the study: Arizona State University (ASU), Georgia State University (Georgia State), Iowa State University (Iowa State), Michigan State University (MSU), The Ohio State University (Ohio State), Oregon State University (Oregon State), Purdue University (Purdue), University of California Riverside (UCR), University of Central Florida (UCF), University of Kansas (KU), and University of Texas at Austin (UT Austin). In particular, we thank the MAAPS team members at each institution for committing to the project and overseeing its different aspects, and for working with us to facilitate data collection around student outcomes and program implementation. This includes the principal investigators, advising leads, data leads, data analysts, and MAAPS advisors at each site, as well as the UIA fellows and other institutional staff members who assisted the MAAPS teams on project implementation and data collection as needed, including all members of the MAAPS data teams for their valued collegiality and support.

We extend a special thanks to Dr. Rayane Alamuddin, former associate director at Ithaka S+R, whose leadership and research expertise were integral to the success of this project. We also thank C. Lockwood Reynolds of Kent State University for his technical assistance during the early years of the project.

Last but not least, we are grateful to the thousands of students who participated in the MAAPS project and were willing to contribute their data to the evaluation study, and to those who also completed surveys and participated in focus groups with us. We wish them much success in their educational endeavors and beyond.

Summary

Monitoring Advising Analytics to Promote Success (MAAPS) is a large-scale randomized-controlled trial designed to validate the effectiveness of intensive, proactive, technology-enhanced advisement in increasing achievement, persistence, and completion of historically underserved students. MAAPS is a multi-institutional project of the University Innovation Alliance (UIA) funded by a US Department of Education First in the World Grant to Georgia State with additional support and funding from Arnold Ventures.

The MAAPS advising intervention and accompanying impact and implementation studies began at the 11 public research universities that were members of the UIA at the start of the Fall 2016 term. Over 5,000 low-income and/or first-generation students were randomly assigned to the treatment group and received proactive outreach, degree-planning activities, and targeted interventions from their assigned MAAPS advisors in addition to business-as-usual advisement at their institution, while over 5,000 students were assigned business-as-usual advisement only at their institution. The intervention concluded at the end of the Spring 2019 term, after three years of implementation, at ten out of 11 participating institutions. Georgia State was the only institution to provide the intervention to its original cohort of treatment group students through the 2021-2022 academic year, the students’ sixth year. The second half of the fourth year and the entirety of students’ fifth and sixth years occurred during the COVID-19 pandemic, which, among many other challenges for students and the institutions, disrupted the delivery of advising services. For example, national research on the impact of the pandemic on student outcomes has found increases in DFW (D, F, and withdrawal) rates, especially at institutions that serve racially diverse student bodies.

This final report on the project presents impact findings for the intent-to-treat effect of MAAPS advisement on participating students’ outcomes for the final analytic sample of 10,037 students and at each participating institution after six academic years. The study’s two primary outcomes are based on data from the National Student Clearinghouse (NSC), which provides graduation and persistence information after six years on MAAPS students at their original institutions, as well as those who left their original MAAPS institution and enrolled and graduated elsewhere. This report also summarizes findings from the implementation study conducted during the first three years of the study, which consisted of interviews with project staff, student advising surveys, the completion of an implementation form on the extent to which each institution implemented the key components of the intervention to date, focus groups of participating students, and analysis of advising interaction data logged by MAAPS advisors.

For the full study sample, assignment to the MAAPS advising intervention had no significant impact on either of the study's two primary outcomes—graduation and persistence—six years after random assignment.

However, exploratory analyses of student subgroups revealed that first-generation students in the treatment group had a persistence rate that was two percentage points lower than first-generation students in the control group, driven in large part by enrollment differences in the Spring 2022 term.

Secondary analyses revealed significant positive impacts on the study’s primary outcome measures at Georgia State, the only institution to run the intervention for six years. After six academic years, treatment group students had a graduation rate that was seven percentage points higher than control group students. Follow-up analyses revealed that this impact was driven by the graduation rate of Black students in the treatment group at Georgia State, who had a graduation rate that was 15 percentage points higher and a persistence rate that was 11 percentage points higher than their counterparts in the control group. Additional analyses using data collected through the implementation study found that Black students received a relatively high dosage of the intervention, compared to other students in the treatment group, even though the intervention design was blind to race, and Georgia State does not use race as a factor in its early alert models. This may at least partially explain why they benefitted from the intervention.

In contrast to the findings at Georgia State, after six academic years, treatment group students at UCR had a persistence rate that was five percentage points lower than control group students, also driven in large part by enrollment differences in the Spring 2022 term.

Institutions faced a number of challenges implementing the MAAPS protocol, preventing some institutions from offering all components. These challenges, combined with only three years of implementation at ten out of 11 institutions and the disruption brought on by the pandemic, were likely responsible for finding no positive impacts in the aggregate and at ten of the 11 participating institutions.

Despite these challenges, results from student advising surveys and focus groups suggest that students gained valuable skills and information and had a more favorable perception of their advising experience as a result of the intervention. In addition, most institutions reported that the MAAPS project will have a lasting positive impact on their institutional programming, policies, and practices, especially around supporting historically underserved students through academic and financial advising. Over the last few years, MAAPS institutions have simplified certain curricula and degree plans; filled gaps in the type of information being collected on student success indicators; developed and deployed new tools to collect and share information between advisors; reassessed and modified their approach to advising, including tailoring advising services to close equity gaps; and brought together different parts of the institution to better support students.

Project Background

Monitoring Advising Analytics to Promote Success (MAAPS) is a multi-year, multi-institutional project of the University Innovation Alliance (UIA) initially funded and supported by a US Department of Education First in the World Grant to Georgia State University. The project is a large-scale, randomized-controlled trial designed to validate the effectiveness of intensive, proactive, technology-enhanced advisement in increasing achievement, persistence, and completion for historically underserved students. The study included more than 10,000 low-income and/or first-generation students enrolled at the 11 large public universities that constituted the membership of the UIA at the time: Arizona State University (ASU), Georgia State University (Georgia State), Iowa State University (Iowa State), Michigan State University (MSU), The Ohio State University (Ohio State), Oregon State University (Oregon State), Purdue University (Purdue), University of California Riverside (UCR), University of Central Florida (UCF), University of Kansas (KU), and University of Texas at Austin (UT Austin). Ithaka S+R has served as the independent evaluator of the study since the project’s inception in 2015.

This report presents: 1) a brief overview of the MAAPS intervention and its key activities; 2) an update on the final study and analytic samples; 3) findings from the impact analyses on the outcome measures after six academic years for the full sample and institutional subsamples; 4) implementation study findings; and 5) a discussion of the results and avenues for future research.

Overview of the MAAPS Intervention

The MAAPS intervention is grounded in empirical research findings on the positive impacts of intensive, proactive, technology-enhanced advisement and degree planning,[2] and in the dramatic improvements in student success associated with Georgia State’s advising redesign.[3] Attempting to address documented obstacles to college persistence and completion that disproportionately affect historically underserved students, the intervention includes the following activities: (1) regular and individualized degree planning activities; (2) real-time and early alerts prompted in part through an analytics-based system; and (3) timely, targeted advising interventions informed by degree planning activities and early alerts.

The MAAPS advising intervention and accompanying impact and implementation studies began at each participating institution at the start of the Fall 2016 term, after a year of planning and preparation. The advising intervention was offered to a randomly selected group of eligible students at each institution (i.e., “treatment group”) who were assigned a dedicated MAAPS advisor by their institution. The majority of MAAPS advisors were given a caseload of 150 students or fewer at the outset, significantly lighter than the average caseload of traditional academic advisors. MAAPS advisors provided advisement that included the intervention components (i.e., “MAAPS advisement”) to randomly selected treatment group students, in addition to business-as-usual advisement from their institution. Randomly selected control group students received business-as-usual advisement at their institution only.[4] The advising intervention concluded at the end of the Spring 2019 term at most participating institutions, after three years of implementation. Georgia State was the only institution to provide MAAPS advisement to its original cohort of treatment group students through the 2021-2022 academic year, students’ sixth year.[5]

The original First in the World grant, initially slated to end following the Spring 2019 term, was extended to allow for the collection of student administrative data through the 2019-20 academic year. In addition, a new grant from Arnold Ventures funded the collection and analysis of data on MAAPS students from the National Student Clearinghouse (NSC) through 2020 and 2022–study participants’ fourth and sixth years after initial enrollment. These data supplement existing MAAPS study data by providing graduation and persistence information after the 2019-20 and 2021-22 academic years on MAAPS students at their original institutions, as well as those who left their original MAAPS institution and enrolled and graduated elsewhere. This report is the second to include results based on NSC data, following the publication in 2021 of findings on the impact of the MAAPS intervention on student outcomes after four academic years.

The MAAPS Cohort: Final Study and Analytic Samples

In the summer of 2016, the 11 participating institutions identified more than 20,000 Pell-eligible and/or first-generation students who met the study’s eligibility criteria. Four weeks before the start of the Fall 2016 term, Ithaka S+R randomly selected a subset of 10,946 eligible students, stratified by institution and based on each institution’s desired sample size, and then randomly assigned them to either the treatment or control group, also stratified by institution. Each eligible student had an equal chance of being selected into the study, conditional on their institution, and each selected student had an equal chance of being assigned to either the treatment or control group, conditional on their institution.[6]
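For illustration, the selection and assignment logic can be sketched in a few lines of code. The sketch below is not the study's actual procedure or code; the data frame columns and per-institution targets are hypothetical.

```python
# Illustrative sketch of stratified random selection and assignment by institution.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2016)  # fixed seed so the draw is reproducible

def select_and_assign(eligible: pd.DataFrame, targets: dict) -> pd.DataFrame:
    """eligible: one row per eligible student with 'student_id' and 'institution' columns;
    targets: desired sample size per institution (hypothetical values)."""
    selected = []
    for inst, group in eligible.groupby("institution"):
        n = min(targets[inst], len(group))
        draw = group.sample(n=n, random_state=rng)            # random selection within institution
        arms = np.array(["treatment", "control"] * ((n + 1) // 2))[:n]
        draw = draw.assign(assignment=rng.permutation(arms))  # random assignment within institution
        selected.append(draw)
    return pd.concat(selected, ignore_index=True)
```

Because selection and assignment both happen within institution, each eligible student's probabilities depend only on their institution, as described above.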

A total of 457 students were subsequently identified as not eligible for the study and were removed from the sample and study. These students either turned out to have baseline characteristics that rendered them ineligible to participate in the study (e.g., were neither Pell-eligible nor first-generation) or had not matriculated at the participating institution for which they were selected. This resulted in a final study sample of 10,489 students, including 5,239 in the treatment group who were assigned to receive MAAPS advisement and 5,250 in the control group. Cohort sizes varied by institution, ranging from 391 to 1,162 students. Students who opted out of the study and deceased students were included in the study sample but excluded from analytic samples, and thus considered attriters. There was a total of 452 attriters, resulting in a 4.3 percent overall attrition rate.[7]

Table 1 provides a breakdown of final study and analytic samples at each institution and across the study.

Table 1. Final Study and Analytic Samples by Assigned Group and Institution

| Institution | MAAPS-eligible at selection | Randomly selected & assigned | Study sample: C | Study sample: T | Study sample: Total | Analytic sample: C | Analytic sample: T | Analytic sample: Total |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ASU | 3,845 | 1,037 | 507 | 504 | 1,011 | 501 | 494 | 995 |
| Georgia State | 1,998 | 1,040 | 492 | 502 | 994 | 476 | 488 | 964 |
| ISU | 1,520 | 1,230 | 584 | 578 | 1,162 | 561 | 532 | 1,093 |
| KU | 1,173 | 1,173 | 565 | 559 | 1,124 | 546 | 536 | 1,082 |
| MSU | 1,830 | 930 | 456 | 456 | 912 | 434 | 434 | 868 |
| Ohio State | 2,615 | 1,024 | 494 | 499 | 993 | 486 | 448 | 934 |
| Oregon State | 920 | 920 | 437 | 430 | 867 | 434 | 420 | 854 |
| Purdue | 964 | 964 | 472 | 469 | 941 | 472 | 436 | 908 |
| UCR | 3,534 | 1,112 | 544 | 544 | 1,088 | 507 | 488 | 995 |
| UCF | 1,203 | 1,100 | 503 | 503 | 1,006 | 489 | 485 | 974 |
| UT Austin | 416 | 416 | 196 | 195 | 391 | 186 | 184 | 370 |
| Total | 20,018 | 10,946 | 5,250 | 5,239 | 10,489 | 5,092 | 4,945 | 10,037 |

Key Outcome and Baseline Measures

The measures used to capture students’ outcomes at the conclusion of the Spring 2022 term, after six academic years, were derived from student data collected by NSC and submitted directly to Ithaka S+R. NSC data supplement existing MAAPS study data by collecting persistence and completion information on students who left their initial MAAPS institution and continued college elsewhere. We generated one academic achievement variable and one persistence/credit accumulation variable using NSC data, as detailed below. Per the study’s pre-analysis plan, these measures are the study’s primary outcomes. This final report on the MAAPS study is the first to not include analysis and findings based on student administrative data submitted directly by participating institutions to Ithaka S+R, which previously served as secondary outcomes.

Outcome Measure: Academic Achievement

Graduation: Whether the student earned a bachelor’s degree by the end of the Spring 2022 term from their initial MAAPS institution or elsewhere,[8] or not.[9] Binary variable.

Outcome Measure: Persistence/Credit Accumulation

Persistence: Whether the student earned a bachelor’s degree by the end of the Spring 2022 term from their initial MAAPS institution or elsewhere, or was enrolled at least half time,[10] at the end of the Spring 2022 term at their initial MAAPS institution or a degree-granting institution, or not.[11] Binary variable.
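For illustration, the two binary outcomes might be derived from NSC-style enrollment and degree records along the following lines; the column names are placeholders rather than actual NSC field names.

```python
# Minimal sketch of deriving the two binary outcome measures from NSC-style records.
import pandas as pd

def derive_outcomes(nsc: pd.DataFrame) -> pd.DataFrame:
    out = pd.DataFrame(index=nsc.index)
    # Graduation: earned a bachelor's degree by the end of Spring 2022, at the
    # initial MAAPS institution or elsewhere.
    out["graduation"] = nsc["graduated_by_spring_2022"].astype(int)
    # Persistence: graduated, OR enrolled at least half time at the end of the
    # Spring 2022 term at a degree-granting institution.
    still_enrolled = (
        nsc["enrolled_spring_2022"] & nsc["half_time"] & nsc["degree_granting"]
    )
    out["persistence"] = (nsc["graduated_by_spring_2022"] | still_enrolled).astype(int)
    return out
```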

Due to the nature of the selected study outcomes, which are specific to students’ postsecondary experiences and performance, we rely on one baseline measure of high school achievement for all outcomes.

Baseline Measure: Academic Achievement and Persistence/Credit Accumulation

High School Achievement and College Readiness: Student’s highest composite ACT score recorded by the participating institution where the student enrolled. For students who submitted SAT scores, concordance tables provided by the College Board were used to convert SAT composite scores to ACT composite scores.
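The conversion step could look roughly like the following sketch, which loads a concordance table from a file (values not shown) and keeps each student's highest available ACT-scale score; the file and column names are hypothetical.

```python
# Hypothetical sketch of the SAT-to-ACT concordance conversion step.
import pandas as pd

concordance = pd.read_csv("sat_to_act_concordance.csv")  # hypothetical file with 'sat' and 'act' columns
sat_to_act = dict(zip(concordance["sat"], concordance["act"]))

def baseline_act(row: pd.Series) -> float:
    converted = sat_to_act.get(row["sat_composite"])      # None if no SAT composite on file
    scores = [s for s in (row["act_composite"], converted) if pd.notna(s)]
    return max(scores) if scores else float("nan")        # highest available ACT-scale score
```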

Analyses and Impact Findings After Six Years

This section provides an update on the final analytic samples, describes the analytic approach, and presents the impact analysis results for the full sample of 10,000+ students remaining in the study after six academic years, as well as for each institutional subsample.

Analytic Samples: Full Sample

Table 2 presents attrition information for the full sample reported in this section.

Table 2. Analytic Sample and Attrition Information for all Outcome Measures: Full Sample

| Outcome Measure | Control: # original sample | Control: # analytic sample | Treatment: # original sample | Treatment: # analytic sample | Diff. Attrition (pp) | Overall Attrition |
| --- | --- | --- | --- | --- | --- | --- |
| Graduation | 5,250 | 5,092 | 5,239 | 4,945 | 2.6 | 4.3% |
| Persistence | 5,250 | 5,092 | 5,239 | 4,945 | 2.6 | 4.3% |

The combination of overall and differential attrition between the treatment and control groups is considered low according to What Works Clearinghouse (WWC) standards, yielding a tolerable (low) level of potential bias under cautious assumptions.[12]
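The arithmetic behind these attrition figures is straightforward and can be reproduced from the counts in Table 2; the snippet below is a sketch using those reported counts, not a WWC tool.

```python
# Overall and differential attrition computed from the Table 2 counts.
orig_control, analytic_control = 5250, 5092
orig_treatment, analytic_treatment = 5239, 4945

attrition_control = (orig_control - analytic_control) / orig_control          # ~3.0%
attrition_treatment = (orig_treatment - analytic_treatment) / orig_treatment  # ~5.6%
overall_attrition = 1 - (analytic_control + analytic_treatment) / (orig_control + orig_treatment)
differential_attrition = abs(attrition_treatment - attrition_control)         # ~2.6 pp

print(f"overall = {overall_attrition:.1%}, differential = {differential_attrition * 100:.1f} pp")
# overall = 4.3%, differential = 2.6 pp
```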

Analytic Approach: Full Sample

We employed linear regression analyses to assess the intent-to-treat effect of the MAAPS intervention after six academic years on the specified outcomes for the full sample. The primary model includes baseline demographic covariates collected at the start of the study in 2016 (high school achievement scores as determined by composite ACT score,[13] low-income status as determined by expected family contribution (EFC) at baseline, and the number of college-level credit hours transferred into the institution before the start of the Fall 2016 term).[14] Full sample analyses also include institutional fixed effects to account for idiosyncrasies across the 11 institutions in samples, implementation, and duration of the intervention, and policies regarding enrollment deadlines, credit accrual, and GPA calculations. We addressed missing baseline data in accordance with WWC standards by replacing missing values with a constant of zero and adding a missing data indicator for the given baseline measure in the analysis.[15]
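A minimal sketch of this missing-data adjustment, with placeholder covariate names, is shown below; the resulting indicators then enter the regression models alongside the zero-imputed covariates.

```python
# Sketch of the WWC-style missing-data handling: zero imputation plus an indicator.
import pandas as pd

def add_missing_indicators(df: pd.DataFrame, covariates: list) -> pd.DataFrame:
    """Return a copy of df with zero-imputed covariates plus missingness indicators."""
    df = df.copy()
    for col in covariates:
        df[f"{col}_missing"] = df[col].isna().astype(int)  # 1 if the baseline value was missing
        df[col] = df[col].fillna(0)                        # impute a constant of zero
    return df
```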

For the analysis included in the report, significant impact is defined as p-values<0.10. Unlike past analyses, we did not use the Benjamini-Hochberg method to adjust for multiple outcomes within a given domain, since WWC categorizes graduation within the academic achievement domain and persistence within the persistence/credit accumulation domain. However, all p-values<0.10 associated with subgroups of interest are corrected to adjust for multiple comparisons using the Benjamini-Hochberg method. In addition, all adjusted p-values<0.10 associated with subgroups of interest are tested for heterogeneity of impact by regressing the outcome against the treatment variable indicating whether the student was in the treatment group, a binary variable indicating whether the student is in the relevant subgroup, and an interaction between those two variables. Where relevant, we conducted additional exploratory analyses to further examine or explicate certain results. Regression tables for the full sample are presented in Appendix A.
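As an illustration of the Benjamini-Hochberg step, the adjustment can be applied to a set of subgroup p-values as follows; the p-values shown are placeholders, not results from the study.

```python
# Benjamini-Hochberg adjustment of subgroup p-values at the 0.10 threshold.
from statsmodels.stats.multitest import multipletests

subgroup_pvalues = [0.004, 0.030, 0.210, 0.470]  # hypothetical, one per subgroup of interest
reject, p_adjusted, _, _ = multipletests(subgroup_pvalues, alpha=0.10, method="fdr_bh")
print(list(zip(p_adjusted, reject)))             # adjusted p-values and reject decisions at 0.10
```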

The primary model for the full sample is estimated as follows:

$Y_{ij} = \delta + \beta \cdot \mathrm{TREATMENT}_i + \alpha X_i + \gamma \cdot \mathrm{INST}_j + \varepsilon_{ij}$

Where Y is an outcome for individual i at institution j, TREATMENT indicates whether the student was in the treatment group, X is a vector of control variables, and INST represents the institutional fixed effects.
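Under these definitions, the primary model could be estimated as a linear probability model along the following lines; the data frame and variable names are placeholders for the study data, and the missing-data indicators come from the adjustment described above.

```python
# Sketch of estimating the primary full-sample model with institutional fixed effects.
import statsmodels.formula.api as smf

# analysis_df: placeholder for the study analysis data set
formula = (
    "graduation ~ treatment"
    " + act_composite + act_composite_missing"
    " + efc + efc_missing"
    " + transfer_credits + transfer_credits_missing"
    " + C(institution)"  # institutional fixed effects
)
fit = smf.ols(formula, data=analysis_df).fit()
print(fit.params["treatment"], fit.pvalues["treatment"])
```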

Impacts: Full Sample

Table 3 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on academic achievement.

Table 3. Intent-To-Treat Effect of MAAPS Advisement on Academic Achievement Outcomes: Full Sample

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean[16] (SD) | T - C diff. | Std. diff. | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Graduation | 5,092 | 0.70 (0.46) | 4,945 | 0.70 (0.46) | -0.00 | -0.01 | 0.692 |

For the full sample, assignment to the MAAPS advising intervention had no significant impact, on average, on academic achievement after six academic years.

Exploratory analyses revealed that there also were no significant impacts on academic achievement for any of the four student subgroups of interest (i.e., Pell-eligible students, first-generation students, students from underrepresented ethnic or racial minority groups, and Black students specifically).[17]

Table 4 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation.

Table 4. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: Full Sample

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Persistence | 5,092 | 0.77 (0.42) | 4,945 | 0.76 (0.43) | -0.01 | -0.01 | 0.113 |

For the full sample, assignment to the MAAPS advising intervention had no significant impact, on average, on persistence/credit accumulation after six academic years.

However, while there were no significant impacts for three of the four student subgroups of interest in the full sample, we did observe a significant impact among first-generation students as an exploratory finding. Table 5 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation among first-generation students.

Table 5. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: Full Sample, First-Generation Students

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p[18] |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Persistence | 2,691 | 0.76 (0.43) | 2,624 | 0.74 (0.44) | -0.02 | -0.02 | 0.005 |

After six academic years, first-generation students in the treatment group had a persistence rate that was two percentage points lower than first-generation students in the control group. However, there were no differences in graduation rates between the two groups.

Table 6 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation by first-generation status.

Table 6. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: Full Sample, by First-Generation Status

| Persistence | (1) | (2) |
| --- | --- | --- |
| Treatment | <-0.01 (0.02) | <0.01 (0.01) |
| First-Generation x Treatment | -0.02 (0.02) | -0.03 (0.02) |
| Observations | 9,704 | 9,704 |
| Baseline Covariates | NO | YES |
| Institutional FE | YES | YES |

The interaction terms do not reach the 0.10 significance threshold, which suggests that the significant finding among first-generation students may have arisen by chance.

Follow-up analyses revealed that differences in persistence rates were primarily driven by enrollment differences in the Spring 2022 term. Specifically, 353 control group students were enrolled in the Spring 2022 term and had not yet graduated, compared to 307 in the treatment group. It is not clear what factors were responsible for this difference. It is worth noting that an enrollment snapshot may provide an incomplete picture of persistence if there are students who temporarily stopped out in the Spring 2022 term but return at a later term. Regression tables for the full sample are presented in Appendix A.

Institutional Subsamples

Secondary analyses were conducted to determine whether the MAAPS intervention may have had differential impacts on graduation and persistence at the 11 participating institutions. Among institutional subsamples, significant impacts on the study’s primary outcome measures—graduation and persistence—were observed at Georgia State, the lead institution and the only institution to offer MAAPS advisement to its students after the third year of the intervention, and UCR. No significant impacts were observed on either outcome measure at the remaining nine participating institutions after six academic years. Regression tables for the other nine institutions are presented in Appendix D.

Analytic Samples: Georgia State and UCR Subsamples

Table 7 presents attrition information for the Georgia State sample reported in this section.

Table 7. Analytic Sample and Attrition Information for all Outcome Measures: Georgia State Subsample

| Outcome Measure | Control: # original sample | Control: # analytic sample | Treatment: # original sample | Treatment: # analytic sample | Diff. Attrition (pp) | Overall Attrition |
| --- | --- | --- | --- | --- | --- | --- |
| Graduation | 492 | 476 | 502 | 488 | 0.5 | 3.0% |
| Persistence | 492 | 476 | 502 | 488 | 0.5 | 3.0% |

The combination of overall and differential attrition between the treatment and control groups is considered low according to WWC standards, yielding a tolerable (low) level of potential bias under cautious assumptions.

Table 8 presents attrition information for the UCR sample reported in this section.

Table 8. Analytic Sample and Attrition Information for all Outcome Measures: UCR Subsample

| Outcome Measure | Control: # original sample | Control: # analytic sample | Treatment: # original sample | Treatment: # analytic sample | Diff. Attrition (pp) | Overall Attrition |
| --- | --- | --- | --- | --- | --- | --- |
| Graduation | 544 | 507 | 544 | 488 | 3.5 | 8.5% |
| Persistence | 544 | 507 | 544 | 488 | 3.5 | 8.5% |

The combination of overall and differential attrition between the treatment and control groups is considered low according to WWC standards, yielding a tolerable (low) level of potential bias under cautious assumptions.

Analytic Approach: Georgia State and UCR Subsamples

We employed linear regression analyses to assess the intent-to-treat effect of the MAAPS intervention after six academic years on the specified outcomes at each institution. The primary model includes the same baseline demographic covariates. We addressed missing baseline data in accordance with WWC standards by replacing missing values with a constant of zero and adding a missing data indicator for the given baseline measure in the analysis.

Unlike past analyses, we did not use the Benjamini-Hochberg method to adjust for multiple outcomes within a given domain, since WWC categorizes graduation within the academic achievement domain and persistence within the persistence/credit accumulation domain. However, all p-values<0.10 associated with subgroups of interest are corrected to adjust for multiple comparisons using the Benjamini-Hochberg method. In addition, all adjusted p-values<0.10 associated with subgroups of interest as well as p-values<0.10 associated with institutional subsamples are tested for heterogeneity of impact by regressing the outcome against the treatment variable indicating whether the student was in the treatment group, a binary variable indicating whether the student is in the relevant subgroup or institutional subsample, and an interaction between those two variables. Where relevant, we conducted additional exploratory analyses to further examine or explicate certain results. Regression tables for the Georgia State and UCR subsamples are presented in Appendix B.

The primary model for each institutional subsample is estimated as follows:

$Y_i = \delta + \beta \cdot \mathrm{TREATMENT}_i + \alpha X_i + \varepsilon_i$

Where Y is an outcome for individual i, TREATMENT indicates whether the student was in the treatment group, and X is a vector of control variables.
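The heterogeneity test described above can be sketched as follows, using a Georgia State indicator as the example subsample; for simplicity, this version omits the institutional fixed effects (which would absorb the subsample's main effect), and all names are placeholders.

```python
# Sketch of the heterogeneity test: treatment, subsample indicator, and their interaction.
import statsmodels.formula.api as smf

# analysis_df: placeholder for the study analysis data set, with a 0/1 'gsu' indicator
het_fit = smf.ols(
    "graduation ~ treatment * gsu + act_composite + efc + transfer_credits",
    data=analysis_df,
).fit()
# The 'treatment:gsu' coefficient captures how the treatment effect at Georgia State
# differs from the treatment effect at the other ten institutions.
print(het_fit.params["treatment:gsu"], het_fit.pvalues["treatment:gsu"])
```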

Impacts: Georgia State Subsample

Table 9 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on academic achievement at Georgia State.

Table 9. Intent-To-Treat Effect of MAAPS Advisement on Academic Achievement Outcomes: Georgia State Subsample

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Graduation | 476 | 0.61 (0.49) | 488 | 0.68 (0.47) | 0.07 | 0.07 | 0.018 |

Assignment to the MAAPS advising intervention resulted in a significant positive impact on academic achievement for the sample of 964 students enrolled at Georgia State. After six academic years, treatment group students had a graduation rate that was seven percentage points higher than control group students.

Table 10 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on academic achievement by the Georgia State subsample.

Table 10. Intent-To-Treat Effect of MAAPS Advisement on Academic Achievement Outcomes: Full Sample, by Georgia State Subsample

| Graduation | (1) | (2) |
| --- | --- | --- |
| Treatment | -0.02** (0.01) | -0.01** (<0.01) |
| GSU x Treatment | 0.09*** (0.01) | 0.08*** (<0.01) |
| Observations | 10,037 | 10,037 |
| Baseline Covariates | NO | YES |
| Institutional FE | YES | YES |

These results indicate that the treatment effect was significantly greater for students at Georgia State than it was for students enrolled at the other ten participating institutions, further supporting the findings in Table 9.

Exploratory analyses revealed that Pell-eligible students experienced similar results: Pell-eligible students in the treatment group at Georgia State had a graduation rate that was eight percentage points higher than Pell-eligible students in the control group at Georgia State.

Table 11 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation at Georgia State.

Table 11. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: Georgia State Subsample

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Persistence | 476 | 0.72 (0.45) | 488 | 0.77 (0.42) | 0.04 | 0.05 | 0.122 |

Assignment to the MAAPS advising intervention had no significant impact on persistence/credit accumulation for the sample of 964 students enrolled at Georgia State.

Secondary analyses included examining the impact of the MAAPS intervention on Black students at Georgia State. Prior exploratory analyses revealed an impact on interim outcomes for Black students after four academic years, so we followed up to see if those effects translated to differences in graduation and persistence after six academic years. Table 12 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on academic achievement among Black students and students who are not Black at Georgia State.

Table 12. Intent-To-Treat Effect of MAAPS Advisement on Academic Achievement Outcomes: Georgia State Subsample, Black and Not Black Student Subgroups

| Subgroup | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p[19] |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Black | 208 | 0.55 (0.50) | 210 | 0.70 (0.46) | 0.15 | 0.15 | 0.002 |
| Not Black | 250 | 0.68 (0.47) | 270 | 0.69 (0.47) | 0.01 | 0.00 | 0.823 |

After six academic years, Black students in the treatment group at Georgia State had a graduation rate that was 15 percentage points higher than their counterparts in the control group. In contrast, assignment to the MAAPS advising intervention had no significant impact on academic achievement for students at Georgia State who are not Black.

Table 13 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation among Black students and students who are not Black at Georgia State.

Table 13. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: Georgia State Subsample, Black and Not Black Student Subgroups

| Subgroup | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p[20] |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Black | 208 | 0.66 (0.47) | 210 | 0.77 (0.41) | 0.11 | 0.12 | 0.011 |
| Not Black | 250 | 0.79 (0.41) | 270 | 0.78 (0.42) | -0.01 | -0.02 | 0.851 |

After six academic years, Black students in the treatment group at Georgia State had a persistence rate that was 11 percentage points higher than their counterparts in the control group. Assignment to the MAAPS advising intervention had no significant impact on persistence/credit accumulation for students at Georgia State who are not Black. Table 14 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on both academic achievement and persistence/credit accumulation at Georgia State by the Black student subgroup.

Table 14. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Georgia State Subsample, by Black Student Subgroup

| | Graduation (1) | Graduation (2) | Persistence (1) | Persistence (2) |
| --- | --- | --- | --- | --- |
| Treatment | <-0.00 (0.04) | 0.01 (0.04) | -0.01 (0.04) | -0.01 (0.04) |
| Black x Treatment | 0.16** (0.06) | 0.14** (0.05) | 0.13** (0.06) | 0.12** (0.06) |
| Observations | 938 | 938 | 938 | 938 |
| Baseline Covariates | NO | YES | NO | YES |

These results indicate that the treatment effect was significantly greater for Black students at Georgia State than for students at Georgia State who are not Black, further supporting the findings in Tables 12 and 13.

Some commentators have expressed concern that early alert and predictive analytic tools steer students from underrepresented minority groups–and Black students in particular–away from STEM majors and towards majors that are perceived to be easier.[21] And one recent study found that introductory STEM courses were responsible for disproportionately driving underrepresented minority students out of STEM pathways.[22] At Georgia State, however, there is no evidence of either finding. Black students at Georgia State were just as likely to have a STEM major in the Spring 2020 term (the last year in which data on students’ majors are available for this study) as other Georgia State students (16 percent), whether looking at the entire study sample or limiting it to the treatment group only. Regression tables for the Georgia State subsample are presented in Appendix B.

Impacts: UCR Subsample

Table 15 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on academic achievement at UCR.

Table 15. Intent-To-Treat Effect of MAAPS Advisement on Academic Achievement Measures: UCR Subsample

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Graduation | 507 | 0.78 (0.42) | 488 | 0.77 (0.42) | -0.01 | -0.01 | 0.722 |

Assignment to the MAAPS advising intervention had no significant impact, on average, on academic achievement after six academic years for the sample of 995 students enrolled at UCR.

Table 16 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation at UCR.

Table 16. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: UCR Subsample

| Outcome measure | Control Group n | Control Group Mean (SD) | Treatment Group n | Treatment Group Adj. mean (SD) | T - C diff. | Std. diff. | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Persistence | 507 | 0.85 (0.35) | 488 | 0.80 (0.42) | -0.05 | -0.08 | 0.060 |

Assignment to the MAAPS advising intervention resulted in a significant negative impact on persistence/credit accumulation for the sample of 995 students enrolled at UCR. After six academic years, treatment group students at UCR had a persistence rate that was five percentage points lower than control group students.

Table 17 presents the impact analysis results estimating the intent-to-treat effect of MAAPS advisement on persistence/credit accumulation by the UCR subsample.

Table 17. Intent-To-Treat Effect of MAAPS Advisement on Persistence/Credit Accumulation Outcomes: Full Sample, by UCR Subsample

| Persistence | (1) | (2) |
| --- | --- | --- |
| Treatment | -0.01 (0.01) | -0.01 (0.01) |
| UCR x Treatment | -0.03*** (0.01) | -0.03*** (0.01) |
| Observations | 10,037 | 10,037 |
| Baseline Covariates | NO | YES |
| Institutional FE | YES | YES |

These results indicate that the treatment effect was significantly more negative for students at UCR than for students enrolled at the other ten participating institutions, further supporting the findings in Table 16.

Follow-up analyses revealed that differences in persistence rates were primarily driven by enrollment differences in the Spring 2022 term, though the number of students driving those differences was relatively small. Specifically, 38 control group students at UCR were enrolled in the Spring 2022 term and had not yet graduated, compared to 20 treatment group students. It is not clear what factors were responsible for this difference. No significant impacts were observed on either outcome measure for the four subgroups of interest after six academic years. Regression tables for the UCR subsample are presented in Appendix C.

Implementation Study Findings

The implementation study conducted by Ithaka S+R between 2016 and 2019 consisted of the following activities:

  • Yearly phone interviews with advising lead staff in the fall of 2016, 2017, 2018, and 2019
  • Yearly student advising surveys in the spring of 2017, 2018, and 2019
  • An implementation form completed by each advising team in the Fall 2018 term on the extent to which each institution had implemented the key components of MAAPS to date
  • A site visit to each of the 11 participating institutions that included interviews with MAAPS staff and focus groups with participating students
  • Logged advisement interactions by MAAPS advisors in a common secure database called REDCap that documented the reason, format, and type of intervention provided through each interaction with treatment group students

The implementation study concluded for most institutions in 2019 to coincide with the end of the intervention. MAAPS advisors at Georgia State continued to log interventions through the end of the Spring 2020 term when the intervention ended at their institution.[23]

Logged Advisement Interactions

In the last report, we used advisement interaction data logged by Georgia State MAAPS advisors to test whether Black students received a higher dosage of the intervention, which may explain the greater magnitude of their outcomes. To measure this, we compared the prevalence of the following key advising metrics between various subgroups of Georgia State students in the treatment group:

  • Share who experienced at least one in-person contact (year 1)
  • Share who experienced at least one degree planner review per year (years 1-3)
  • Number of interactions (years 1-4)
  • Number of interventions (years 1-4)
  • Number of advising triggers that were student-initiated (years 1-4)
  • Number of advising triggers that were not student-initiated (years 1-4)

Table 18 presents differences in the prevalence of select advisement metrics between Black students and students who are not Black in the Georgia State treatment group.

Table 18. Prevalence of Select Advisement Metrics: Georgia State Subsample, Treatment Group

| Advising Metric | Black Students n | Black Students Mean | Not Black Students n | Not Black Students Mean | p-value |
| --- | --- | --- | --- | --- | --- |
| Share who experienced at least one in-person contact (year 1), % | 210 | 93.8 | 270 | 95.9 | 0.29 |
| Share who experienced at least one degree planner review per year (years 1-3), % | 210 | 56.7 | 270 | 47.4 | 0.04 |
| Number of interactions (years 1-4) | 210 | 23.3 | 270 | 21.2 | <0.01 |
| Number of interventions (years 1-4) | 210 | 47.0 | 270 | 42.2 | 0.01 |
| Number of advising triggers that were student-initiated (years 1-4) | 210 | 0.89 | 270 | 0.74 | 0.18 |
| Number of advising triggers that were not student-initiated (years 1-4) | 210 | 22.5 | 270 | 20.4 | <0.01 |

Black treatment group students at Georgia State were more likely to participate in at least one degree planner review per year, experienced more interactions and interventions, and prompted more triggers per student, on average, than treatment group students at Georgia State who are not Black. Black students triggered and experienced more advising interactions and interventions, including key events like degree planner reviews, irrespective of their academic preparation, at least as it was measured for this analysis.[24] There were no differences, for example, when disaggregating the same set of advising metrics by students' academic preparation rather than race. Further, the differences in advising interactions between Black treatment group students at Georgia State and other students in the Georgia State treatment group persisted when the sample was limited to less academically prepared students only, as well as when it was limited to more academically prepared students only.[25] In short, these findings offer evidence that Black students did in fact receive a relatively high dosage of the intervention, compared to other students in the treatment group, which may at least partially explain why they benefited from the intervention.
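For illustration, a dosage comparison of this kind could be run as a two-sample test on a logged advising metric; the data frame, column names, and the choice of Welch's t-test are assumptions for the sketch, not the study's documented procedure.

```python
# Sketch of comparing an advising dosage metric between two subgroups of treatment students.
from scipy import stats

# advising_logs: placeholder data frame of logged advisement interactions per student
treated = advising_logs[advising_logs["assignment"] == "treatment"]
black = treated.loc[treated["is_black"] == 1, "n_interactions_y1_y4"]
not_black = treated.loc[treated["is_black"] == 0, "n_interactions_y1_y4"]

t_stat, p_value = stats.ttest_ind(black, not_black, equal_var=False)  # Welch's t-test
print(f"mean difference = {black.mean() - not_black.mean():.1f}, p = {p_value:.3f}")
```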

It is worth noting that Georgia State does not use race as a factor in its early alert models; its alerts identify and notify advisors of students veering off path according to historical data. Despite differences in the number of interactions and interventions, analyses of student survey data did not reveal any differences in how Black survey respondents at Georgia State perceived and rated their advising experiences and support compared to other survey respondents at Georgia State, both for the full Georgia State sample and for treatment group students at Georgia State only.

Student Advising Survey Findings

The implementation study also included a 10-minute student advising survey administered to all MAAPS students in both the treatment and control groups in the spring of 2017, 2018, and 2019. The surveys explored how treatment and control group students experienced and perceived advising at their institution, and analyses included investigating whether their experiences were associated with their academic progress and achievement.[26]

The number of students across the 11 participating institutions who completed the annual survey decreased each year, with 1,137 students completing the 2017 survey (11.3 percent response rate), 942 students completing the 2018 survey (9.4 percent response rate), and 788 students completing the 2019 survey (7.9 percent response rate). However, the assessments of their advising experiences by students who did complete the survey were consistent across all three years.

Table 19 presents average responses of select items included in the 2019 student advising survey by assigned group.

Table 19. Average Responses of Select 2019 Student Advising Survey Items by Assigned Group: Full Sample[27]

| Survey Item/Scale | Control Group n | Control Group Mean | Treatment Group n | Treatment Group Mean | p-value |
| --- | --- | --- | --- | --- | --- |
| Institutional Know-How Scale | 316 | 3.7 | 447 | 3.9 | <0.01 |
| Advisor Support Scale | 287 | 3.6 | 398 | 4.0 | <0.01 |
| Proactive Scale | 299 | 3.2 | 417 | 3.7 | <0.01 |
| Overall Satisfaction with Advisement | 300 | 3.7 | 418 | 4.1 | <0.01 |
| Multiple Advisor Contact | 277 | 0.65 | 409 | 0.89 | <0.01 |

In all three surveys, those in the treatment group reported significantly higher levels of institutional know-how and reported experiencing higher levels of academic support and proactive advising than those in the control group, each assessed through a multi-item scale.[28] Treatment group students also reported higher overall satisfaction with advisement than control group students, and were more likely to report having an advisor contact them at least twice to set up an in-person meeting.

It is important to note a few significant limitations to the survey findings. First, the findings are based on low response rates that hover around the 10 percent level. Second, the survey subsamples are not representative of the larger MAAPS sample. For instance, female students and high-performing students were more likely to respond.[29] To address this, we calculated the average responses by treatment group after correcting for any response bias related to the gender or academic performance of the student.[30] However, the weighting procedure did not alter the results, so we report the unweighted responses. Although the survey subsamples are not representative, the results are aligned with findings from focus groups of both treatment and control group students across the participating institutions. Despite the lack of significant impacts of the MAAPS intervention on key outcome measures, these findings suggest that at least a subset of treatment group students experienced the key features of MAAPS advisement, including proactive and personalized advisement, and gained information and skills that they perceive as increasing their ability to navigate the complexities of a large, public university.
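One plausible way to implement such a correction is with inverse-probability-of-response weights estimated from gender and baseline academic performance, as sketched below; the variable names are placeholders, and this is not necessarily the exact weighting procedure used.

```python
# Sketch of a nonresponse weighting adjustment for the survey subsample.
import numpy as np
import statsmodels.formula.api as smf

# sample_df: placeholder data frame for the full MAAPS sample, with a 0/1 'responded'
# flag, baseline characteristics, and survey scale scores for respondents.
response_model = smf.logit("responded ~ female + act_composite", data=sample_df).fit()
sample_df["weight"] = 1.0 / response_model.predict(sample_df)  # inverse probability of response

respondents = sample_df[sample_df["responded"] == 1]
weighted_means = respondents.groupby("assignment").apply(
    lambda g: np.average(g["advisor_support_scale"], weights=g["weight"])
)
print(weighted_means)
```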

Implementation Challenges, Responses, and Successes

As discussed in much greater detail in previous MAAPS reports,[31] participating institutions experienced a range of implementation challenges, preventing some institutions from offering all components of the MAAPS protocol. This likely reduced the potential impact of the MAAPS intervention on student outcomes.

In an effort to account for idiosyncrasies across the 11 large, public universities participating in the study, the grant gave institutions flexibility in deciding how to implement the intervention on their own campuses. For example, institutions had to decide at the outset of the program how to offer MAAPS advising within their current advising system. Prior to MAAPS, most participating institutions relied on departmental academic advisors to serve as students’ primary advisor in a decentralized advising model. In an attempt to offer MAAPS without disrupting or redesigning their advising model and infrastructure, nine institutions chose to provide treatment students with MAAPS advising and business-as-usual advising through multiple advisors. Eight of these institutions had treatment group students retain a primary advisor from their department, with MAAPS advisors offering supplemental advisement. However, this increased the complexity of implementation and undermined delivery of the MAAPS intervention. Primary advisors questioned the role and need of MAAPS advisors, and students reported preferring to go to their primary advisor over their MAAPS advisor for academic support and guidance. The institutions that engaged with the broader advising community and advisors before the start of the intervention in the summer of 2016 to explain the study and the goal of MAAPS advisors—to support and complement, not hinder or duplicate, the work of primary advisors—were most successful in gaining their approval and trust.

Another issue a few institutions faced was the inability to implement an early alert data system to inform proactive and early advisement in the first half of the intervention. Other institutions had early alert systems that were not effective in facilitating the type of proactive outreach described in the MAAPS protocol. For instance, some systems did not automatically push out information to advisors, so advisors had to sift through the data and identify relevant information themselves. Advisors who failed or were unable to do this on a regular and frequent basis were not able to offer timely information to their students in a proactive manner. Other institutions had multiple early alert systems for different units and offices, which forced advisors to merge and synthesize information across sources. To overcome this series of challenges, institutions worked collaboratively with their MAAPS data team to produce data reports and dashboards that provided advisors with key information on the status and progress of their students.

At four institutions, students responded to advisor outreach at lower-than-expected levels, leading to low rates of in-person interaction between MAAPS advisors and students, which served as another hurdle to providing MAAPS advisement to students.[32] Findings from focus groups revealed that at institutions that used the supplemental advising approach, this was driven in part by students who preferred engaging with their primary advisor or were confused by the role that their MAAPS advisor played. Some institutions were able to overcome this by placing holds on students' registration accounts until they met with their MAAPS advisor. Six sites also faced advisor and staff loss or turnover by the end of the second year of the intervention, which resulted in caseloads much higher than the 150:1 student-to-advisor ratio initially conceived and intended for this project, making it more difficult for advisors to provide personalized and proactive advisement. At institutions that had to replace advisors and/or staff, new advisors had trouble developing relationships with students, while new project leads found it difficult to guide project staff because they were not as familiar with the MAAPS protocol.

Other than Georgia State, no participating institution offered MAAPS advisement to their students after the third year of the intervention, diluting the treatment and disrupting relationships and momentum that MAAPS advisors developed with their students in the first three years. The COVID-19 pandemic brought further disruption to the final two and a half years of the study period, as institutions shifted from face-to-face interaction to online delivery of courses and services in emergency fashion. Among the many effects of the pandemic was an increase in DFW rates, especially at institutions that serve racially diverse student bodies, which likely stymied students’ academic progress.[33]

Georgia State, in addition to offering the intervention to its students for an additional three years, stood out for its implementation, facilitated by a set of institutional conditions that existed prior to the intervention. Georgia State already had a centralized advising system in place, so the primary model, in which the MAAPS advisor was the sole advisor, was a natural extension. Moreover, it had cultivated a culture of proactive advisement and degree planning after years of doing similar activities with well-documented success. While Georgia State MAAPS advisors did not interact and intervene with their students more frequently than MAAPS advisors at other institutions, Georgia State set itself apart with the quality of its early alert tools and student support systems.

Even with these challenges, most institutions shared that the MAAPS project will have a lasting positive impact on their institution’s programming, policies, and practices. Some institutions noted that the project exposed policies that were adversely impacting students, policies they have been working on addressing. For example, according to the UIA’s playbook on proactive advising published after the completion of our implementation study, UT Austin developed its own degree maps that are formatted in a consistent manner across all academic majors and colleges after the project revealed the complexity and difficulty of navigating certain curricula and degree plans.[34] In addition, the university worked with the Texas state legislature to pass Texas Senate Bill 25, which requires that institutions in the state develop at least one recommended course sequence for each undergraduate degree program that they offer.[35] Similarly, at MSU, four-year degree maps are much more commonly used across campus as a result of their use in the MAAPS project.[36] Institutions have also been working to fill gaps revealed by the MAAPS project in the type of information that institutions are collecting on student success indicators. For example, Iowa State added information related to holds and low-income status to their EAB platform, which they learned through the project are key predictors of success.[37] Other institutions are deploying dashboards similar to the ones created for MAAPS so advisors can easily reference and pull up-to-date information on their students or are using new tools to collect and share information between advisors. For example, ASU adopted the REDCap system to log notes from advising interactions for all advising across the institution and experimented with dynamic, real-time program maps updated based on student course progress.[38] The project has also prompted institutions to reassess their approach to advising. UCF has adopted a more centralized advising system and has expanded professional development services and opportunities for advisors.[39] In addition, the project’s focus on historically underserved students and particular subgroups of interest revealed disparate outcomes some institutions were previously unaware of. For instance, UCR tailored its advising services to better meet the needs of its Black students in an effort to close persistence and completion gaps.[40] More generally, the MAAPS project brought together different parts of the institution that historically operated in silos, and as a result, has prompted conversations on how to work together to better support students.

Discussion

Key Takeaways

Assignment to the MAAPS advising intervention had no significant impact, on average, on either of the study’s two primary outcomes—graduation and persistence—after six academic years for the full sample. However, based on exploratory analyses among student subgroups of interest, first-generation students in the treatment group had a persistence rate that was two percentage points lower than first-generation students in the control group. Secondary analyses revealed significant impacts on the study’s primary outcome measures at Georgia State and UCR. At Georgia State, after six academic years, treatment group students had a graduation rate that was seven percentage points higher than control group students. Follow-up analyses revealed that impacts were driven by Black students at Georgia State, who had a graduation rate that was 15 percentage points higher and a persistence rate that was 11 percentage points higher than their counterparts in the control group. Additional analyses using data collected through the implementation study found that Black students received a relatively high dosage of the intervention, compared to other students in the treatment group, which may at least partially explain why they benefited from the intervention. After six academic years, treatment group students at UCR had a persistence rate that was five percentage points lower than control group students.

Implementation challenges during the three-year intervention period at most participating institutions may have been responsible for these differential findings. These challenges included providing treatment students with MAAPS advising and business-as-usual advising through multiple advisors, which increased the complexity of implementation and undermined delivery of the MAAPS intervention; the inability to implement early alert data systems to inform proactive and early advisement in the first half of the intervention; low student take-up of MAAPS advisement; advisor and staff turnover; and only three years of implementation. By contrast, Georgia State, in addition to having six years of implementation, benefited from having designed the MAAPS intervention as an extension and enhancement of its pre-existing advising approach, and therefore already had in place the institutional infrastructure, culture, and data tools and systems that eased implementation. These distinctions suggest the importance of institutional readiness for the advising intervention, including aligning staffing and technological resources to it.

Despite these implementation challenges, a subset of treatment group students across all participating institutions reported a positive experience, greater institutional know-how, and higher levels of academic support and proactive advising than students in the control group in each of the three years they were surveyed.

Avenues for Future Research

As one of the first studies examining the causal impact of technology-enhanced proactive advisement on student achievement, persistence, and completion, the MAAPS project and findings from both the impact and implementation studies will make a significant contribution to the higher education community’s understanding of such interventions and the conditions under which they are most effective. With that said, we have identified a few areas for future research.

Implement and evaluate the MAAPS protocol using the primary model at institutions other than Georgia State. The supplemental model, in which MAAPS advisors supplement the work of existing primary advisors rather than replace them, in many ways undermined the delivery of the MAAPS intervention. Georgia State, by contrast, was one of only two institutions to have MAAPS advisors serve as students' primary and sole advisor (i.e., primary model), which contributed to its high fidelity of implementation.[41] Because Georgia State was the only institution where we observed positive impacts, this raises the question of whether similar results could be replicated at other institutions that employ a primary model. On the one hand, it is possible that the positive outcomes of the MAAPS intervention at Georgia State underrepresent the potential positive impacts of similarly designed and implemented data-based proactive advising at other institutions. At Georgia State, even the control group students were receiving standard advising supports based on data and proactive outreach—just not with the intensity of the MAAPS cohort. On the other hand, it may be the case that Georgia State's institutional conditions, culture, and proactive and early alert tools are truly unique. Examining the use of the primary model at other institutions would be a fruitful area of exploration for researchers.

Investigate which advising interventions and interactions are most beneficial to students. The intervention at Georgia State had an exceptionally positive impact for Black students on achievement, persistence, and graduation, which has the potential to inform institutional efforts around promoting equity and closing racial graduation gaps. Black students at Georgia State triggered and experienced more advising interactions and interventions than other students in the Georgia State treatment group, which may explain why they benefited from the intervention. Additional research is needed to better understand the types of advising interactions and interventions that are most impactful for students and the extent to which they vary by student demographics, including race, socioeconomic status, and gender, as well as institutional characteristics and settings.

Design and study interventions that apply features of the MAAPS intervention in other contexts. Some of the core components of the MAAPS intervention have already been applied in contexts outside of MAAPS as part of efforts to better support students. For example, Georgia State’s Panther Retention Grant program regularly monitors students’ financial data and status and proactively acts on that information by automatically awarding up to $2,500 to clear students’ unpaid balances and allow them to remain enrolled. Ithaka S+R’s evaluation of the program suggests that the program has benefited grant recipients, concluding that it was responsible for decreasing students’ time to degree, which in turn decreased the amount of debt incurred post-receipt.[42] There are other areas in higher education that may benefit from interventions that consist of key MAAPS features. For instance, degree planning activities have the potential to help community college students who want to eventually earn a bachelor’s degree map out courses in a multi-year plan to minimize time to degree and maximize the number of credits that will be accepted by the receiving institution. Designing, implementing, and evaluating similar types of interventions deserve further attention.

Appendix A. Results Tables: Full Sample

For the full sample, we present four regression models for each analysis, each including a different or additional set of control variables. Model 1 does not include control variables; model 2 includes baseline demographic covariates only (high school achievement scores as determined by composite ACT score, low-income status as determined by expected family contribution (EFC) at baseline, and the number of college-level credit hours transferred into the institution before the start of the Fall 2016 term); model 3 includes institutional fixed effects only; and model 4 includes both baseline demographic covariates and institutional fixed effects.[43] For follow-up analyses, we present the results of model 4 only.
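To make the model specifications concrete, the sketch below shows one way model 4 could be estimated as a linear probability model with heteroskedasticity-robust standard errors. It is a minimal illustration under stated assumptions, not the project's actual analysis code: the data frame and column names (graduated, treatment, act_composite, efc, transfer_credits, institution) are hypothetical stand-ins for the study's variables.

```python
# Illustrative sketch of the model 4 specification (treatment indicator,
# baseline covariates, institutional fixed effects, robust standard errors).
# Column names are hypothetical stand-ins for the study's variables.
import pandas as pd
import statsmodels.formula.api as smf

def estimate_itt(df: pd.DataFrame, outcome: str = "graduated"):
    """Intent-to-treat linear probability model for a 0/1 outcome."""
    formula = (
        f"{outcome} ~ treatment"   # 1 = assigned to MAAPS advising, 0 = control
        " + act_composite"         # high school achievement (ACT composite score)
        " + efc"                   # expected family contribution at baseline
        " + transfer_credits"      # college-level credits transferred in before Fall 2016
        " + C(institution)"        # institutional fixed effects
    )
    # HC1 covariance gives heteroskedasticity-robust standard errors.
    return smf.ols(formula, data=df).fit(cov_type="HC1")

# Example usage, given a student-level DataFrame named `students`:
# results = estimate_itt(students, outcome="graduated")
# print(results.params["treatment"], results.bse["treatment"])
```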

Table 20. Descriptive Statistics for Academic Achievement Outcome

             Control Group         Treatment Group       Total Sample
             Mean (SD)     n       Mean (SD)     n       Mean (SD)     n        Range
Graduation   0.70 (0.46)   5,092   0.69 (0.46)   4,945   0.70 (0.46)   10,037   0 – 1

Table 21. Descriptive Statistics for Persistence/Credit Accumulation Outcome

             Control Group         Treatment Group       Total Sample
             Mean (SD)     n       Mean (SD)     n       Mean (SD)     n        Range
Persistence  0.77 (0.42)   5,092   0.76 (0.43)   4,945   0.76 (0.43)   10,037   0 – 1

Table 22. Intent-To-Treat Effect of MAAPS Advisement on Graduation[44]

(1) (2) (3) (4)
VARIABLES
Treatment -0.01 <-0.01 -0.01 <-0.01
(0.01) (0.01) (0.01) (0.01)
Control Mean 0.70 0.70 0.70 0.70
Observations 10,037 10,037 10,037 10,037
R-squared <0.01 0.06 <0.01 0.06
Baseline Covariates NO YES NO YES
Institutional FE NO NO YES YES

Table 23. Intent-To-Treat Effect of MAAPS Advisement on Persistence

(1) (2) (3) (4)
VARIABLES
Treatment -0.01* -0.01 -0.01* -0.01
(0.01) (0.01) (0.01) (0.01)
Control Mean 0.77 0.77 0.77 0.77
Observations 10,037 10,037 10,037 10,037
R-squared 0.00 0.04 0.00 0.04
Baseline Covariates NO YES NO YES
Institutional FE NO NO YES YES

Student Subgroups of Interest

Table 24. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Pell-Eligible Student Subgroup[45]

Graduation Persistence
VARIABLES
Treatment <0.01 0.01
(0.01) (0.01)
Control Mean 0.69 0.76
Observations 8,071 8,071
R-squared 0.06 0.05
Baseline Covariates YES YES
Institutional FE YES YES

Table 25. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: First-Generation Student Subgroup

Graduation Persistence
VARIABLES
Treatment -0.01 -0.02***
(0.01) (0.01)
Control Mean 0.69 0.76
Observations 5,315 5,315
R-squared 0.06 0.04
Baseline Covariates YES YES
Institutional FE YES YES

Table 26. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Underrepresented Racial and Ethnic Minority Student Subgroup

Graduation Persistence
VARIABLES
Treatment 0.01 -0.01
(0.01) (0.01)
Control Mean 0.64 0.73
Observations 3,724 3,724
R-squared 0.05 0.04
Baseline Covariates YES YES
Institutional FE YES YES

Table 27. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Black Student Subgroup

Graduation Persistence
VARIABLES
Treatment 0.03 0.02
(0.05) (0.04)
Control Mean 0.58 0.68
Observations 1,264 1,264
R-squared 0.08 0.08
Baseline Covariates YES YES
Institutional FE YES YES

Additional Analyses

Table 28. Intent-To-Treat Effect of MAAPS Advisement on Enrollment in Spring 2022 Term: First-Generation Student Subgroup

Enrollment in Spring 2022 Term
VARIABLES
Treatment -0.02*
(0.01)
Control Mean 0.20
Observations 5,315
R-squared 0.01
Baseline Covariates YES
Institutional FE YES

 

Appendix B. Results Tables: Georgia State Subsample

For the Georgia State subsample, we present two regression models for each primary analysis. Model 1 does not include control variables, while model 2 includes baseline demographic covariates only (high school achievement scores as determined by composite ACT score, low-income status as determined by expected family contribution (EFC) at baseline, and the number of college-level credit hours transferred into the institution before the start of the Fall 2016 term). For follow-up analyses, we present the results of model 2 only.
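As an illustrative continuation of the sketch in Appendix A (same hypothetical column names), the single-institution specification simply restricts the sample to one institution and drops the institutional fixed effects, which are not identified within a single campus:

```python
# Illustrative sketch of the subsample specification (model 2): treatment
# indicator plus baseline covariates, no institutional fixed effects.
# Column names and the institution label are hypothetical stand-ins.
import pandas as pd
import statsmodels.formula.api as smf

def estimate_subsample_itt(df: pd.DataFrame, institution: str, outcome: str = "graduated"):
    sub = df[df["institution"] == institution]
    return smf.ols(
        f"{outcome} ~ treatment + act_composite + efc + transfer_credits",
        data=sub,
    ).fit(cov_type="HC1")

# e.g. estimate_subsample_itt(students, "Georgia State", outcome="persisted");
# the UCR subsample in Appendix C would be estimated the same way.
```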

Table 29. Descriptive Statistics for Academic Achievement Outcome: Georgia State Subsample

             Control Group       Treatment Group     Total Sample
             Mean (SD)     n     Mean (SD)     n     Mean (SD)     n     Range
Graduation   0.61 (0.49)   476   0.68 (0.47)   488   0.65 (0.48)   964   0 – 1

Table 30. Descriptive Statistics for Persistence/Credit Accumulation Outcome: Georgia State Subsample

             Control Group       Treatment Group     Total Sample
             Mean (SD)     n     Mean (SD)     n     Mean (SD)     n     Range
Persistence  0.72 (0.45)   476   0.77 (0.42)   488   0.75 (0.44)   964   0 – 1

Table 31. Intent-To-Treat Effect of MAAPS Advisement on Graduation: Georgia State Subsample[46]

(1) (2)
VARIABLES
Treatment 0.07** 0.07**
(0.03) (0.03)
Control Mean 0.61 0.61
Observations 964 964
R-squared 0.01 0.04
Baseline Covariates NO YES

Table 32. Intent-To-Treat Effect of MAAPS Advisement on Persistence: Georgia State Subsample

(1) (2)
VARIABLES
Treatment 0.04 0.04
(0.03) (0.03)
Control Mean 0.72 0.72
Observations 964 964
R-squared <0.01 0.03
Baseline Covariates NO YES

Student Subgroups of Interest

Table 33. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Georgia State Subsample, Pell-Eligible Student Subgroup

Graduation Persistence
VARIABLES
Treatment 0.08** 0.05*✝
(0.03) (0.03)
Control Mean 0.60 0.72
Observations 860 860
R-squared 0.04 0.03
Baseline Covariates YES YES

Table 34. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Georgia State Subsample, First-Generation Student Subgroup

Graduation Persistence
VARIABLES
Treatment 0.06 0.01
(0.05) (0.04)
Control Mean 0.61 0.74
Observations 429 429
R-squared 0.03 0.01
Baseline Covariates YES YES

Table 35. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Georgia State Subsample, Underrepresented Racial and Ethnic Minority Student Subgroup

Graduation Persistence
VARIABLES
Treatment 0.06 0.04
(0.04) (0.04)
Control Mean 0.59 0.71
Observations 562 562
R-squared 0.02 0.01
Baseline Covariates YES YES

Additional Analyses

Table 36. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Georgia State Subsample, Pell-Eligible Students who are not Black

Graduation Persistence
VARIABLES
Treatment 0.01 0.01
(0.04) (0.04)
Control Mean 0.68 0.78
Observations 441 441
R-squared 0.06 0.05
Baseline Covariates YES YES

 

Appendix C. Results Tables: UCR Subsample

For the UCR subsample, we present two regression models for each primary analysis. Model 1 does not include control variables, while model 2 includes baseline demographic covariates only (high school achievement scores as determined by composite ACT score, low-income status as determined by expected family contribution (EFC) at baseline, and the number of college-level credit hours transferred into the institution before the start of the Fall 2016 term). For follow-up analyses, we present the results of model 2 only.

Table 37. Descriptive Statistics for Academic Achievement Outcome: UCR Subsample

             Control Group       Treatment Group     Total Sample
             Mean (SD)     n     Mean (SD)     n     Mean (SD)     n     Range
Graduation   0.78 (0.42)   507   0.77 (0.42)   488   0.77 (0.42)   995   0 – 1

Table 38. Descriptive Statistics for Persistence/Credit Accumulation Outcome: UCR Subsample

             Control Group       Treatment Group     Total Sample
             Mean (SD)     n     Mean (SD)     n     Mean (SD)     n     Range
Persistence  0.85 (0.35)   507   0.81 (0.39)   488   0.83 (0.37)   995   0 – 1

Table 39. Intent-To-Treat Effect of MAAPS Advisement on Graduation: UCR Subsample[47]

(1) (2)
VARIABLES
Treatment -0.01 -0.01
(0.03) (0.03)
Control Mean 0.78 0.78
Observations 995 995
R-squared <0.01 0.01
Baseline Covariates NO YES

Table 40. Intent-To-Treat Effect of MAAPS Advisement on Persistence: UCR Subsample

(1) (2)
VARIABLES
Treatment -0.05* -0.05*
(0.02) (0.02)
Control Mean 0.85 0.85
Observations 995 995
R-squared <0.01 0.01
Baseline Covariates NO YES

Student Subgroups of Interest

Table 41. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: UCR Subsample, Pell-Eligible Student Subgroup

Graduation Persistence
VARIABLES
Treatment 0.01 -0.03
(0.03) (0.03)
Control Mean 0.76 0.84
Observations 877 877
R-squared 0.01 0.01
Baseline Covariates YES YES

Table 42. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: UCR Subsample, First-Generation Student Subgroup

Graduation Persistence
VARIABLES
Treatment <-0.01 -0.04
(0.03) (0.03)
Control Mean 0.76 0.83
Observations 657 657
R-squared 0.01 0.01
Baseline Covariates YES YES

Table 43. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: UCR Subsample, Underrepresented Racial and Ethnic Minority Student Subgroup

Graduation Persistence
VARIABLES
Treatment <0.01 -0.03
(0.04) (0.03)
Control Mean 0.74 0.82
Observations 615 615
R-squared 0.01 0.01
Baseline Covariates YES YES

Table 44. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: UCR Subsample, Black Student Subgroup

Graduation Persistence
VARIABLES
Treatment -0.13 -0.12
(0.17) (0.14)
Control Mean 0.89 0.94
Observations 31 31
R-squared 0.21 0.27
Baseline Covariates YES YES

Appendix D. Results Tables: Institutional Subsamples

Regression results for the remaining nine participating institutions did not reach statistical significance (p<0.10). We present the results of model 2 only.

Table 45. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 1[48]

Graduation Persistence
VARIABLES
Treatment -0.04 -0.02
(0.03) (0.03)
Control Mean 0.69 0.74
Observations 995 995
R-squared 0.05 0.04
Baseline Covariates YES YES

Table 46. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 2

Graduation Persistence
VARIABLES
Treatment -0.02 -0.03
(0.03) (0.03)
Control Mean 0.68 0.73
Observations 1,093 1,093
R-squared 0.05 0.04
Baseline Covariates YES YES

Table 47. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 3

Graduation Persistence
VARIABLES
Treatment -0.02 -0.01
(0.03) (0.03)
Control Mean 0.62 0.68
Observations 1,082 1,082
R-squared 0.07 0.06
Baseline Covariates YES YES

Table 48. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 4

Graduation Persistence
VARIABLES
Treatment 0.01 -0.01
(0.03) (0.03)
Control Mean 0.72 0.79
Observations 868 868
R-squared 0.06 0.06
Baseline Covariates YES YES

Table 49. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 5

Graduation Persistence
VARIABLES
Treatment <-0.01 <-0.01
(0.03) (0.03)
Control Mean 0.68 0.77
Observations 854 854
R-squared 0.06 0.05
Baseline Covariates YES YES

Table 50. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 6

Graduation Persistence
VARIABLES
Treatment -0.02 -0.01
(0.03) (0.03)
Control Mean 0.59 0.66
Observations 934 934
R-squared 0.21 0.17
Baseline Covariates YES YES

Table 51. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 7

Graduation Persistence
VARIABLES
Treatment -0.01 -0.03
(0.02) (0.02)
Control Mean 0.82 0.88
Observations 908 908
R-squared 0.11 0.02
Baseline Covariates YES YES

Table 52. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 8

Graduation Persistence
VARIABLES
Treatment -0.01 -0.01
(0.03) (0.02)
Control Mean 0.77 0.83
Observations 974 974
R-squared 0.02 0.01
Baseline Covariates YES YES

Table 53. Intent-To-Treat Effect of MAAPS Advisement on all Outcomes: Institutional Subsample 9

Graduation Persistence
VARIABLES
Treatment 0.02 -0.02
(0.04) (0.03)
Control Mean 0.83 0.90
Observations 370 370
R-squared 0.04 0.04
Baseline Covariates YES YES

Endnotes

  1. Grant Number: P116X150015.
  2. Eric P. Bettinger and Rachel B. Baker, “The Effects of Student Coaching: An Evaluation of a Randomized Experiment in Student Advising,” Educational Evaluation and Policy Analysis 36, no. 1 (March 2014): 3–19, doi:10.3102/0162373713500523.
  3. Martin Kurzweil and D. Derek Wu, “Building a Pathway to Student Success at Georgia State University,” Ithaka S+R, 23 April 2015, https://doi.org/10.18665/sr.221053.
  4. Most commonly, business-as-usual advisement at the participating institutions involved a larger student-to-advisor ratio, fewer communications from advisors, shorter advisor-student meetings, and lower levels of proactive outreach to students based on in-term and end-of-term student information. Through business-as-usual, students were also less likely to work on personalized and dynamic four-year degree plans with their advisors.
  5. For detailed descriptions of the MAAPS advisement activities and additional background information on the study and intervention, please see earlier MAAPS reports: Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Monitoring Advising Analytics to Promote Success (MAAPS): Evaluation Findings from the First Year of Implementation,” Ithaka S+R, 4 April 2018, https://doi.org/10.18665/sr.307005, and Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Interim Findings Report: MAAPS Advising Experiment,” Ithaka S+R, 27 June 2019, https://doi.org/10.18665/sr.311567.
  6. Selected students were informed of their selection into the study, but not of their selection into the treatment or control group, via email, on the third day of the Fall 2016 term. This ensured that student matriculation at the participating institutions was not impacted by the study and allowed students to opt out of the study at that time irrespective of their assigned study group.
  7. For additional technical details on student sampling procedures, please see an earlier MAAPS report, Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Technical Supplement – Interim Findings Report: MAAPS Advising Experiment.” Ithaka S+R, 27 June 2019, https://doi.org/10.18665/sr.311566.
  8. In 2020, there were 48 cases in which a student was categorized as a graduate in the student administrative data but not categorized as such in the NSC data because their administrative records were not successfully matched to records in the NSC database. These 48 students were not categorized as a graduate in the NSC data submitted in 2022 either. As with the 2020 analysis, to address this set of discrepancies, we recoded the 48 students’ NSC records to indicate that they earned a bachelor’s degree from their home MAAPS institution by the end of the Spring 2020 term. In addition, there were five cases in which a student was categorized as a graduate in the NSC data submitted in 2020 and not as a graduate in the NSC data submitted in 2022. We confirmed with NSC that the data submitted in 2020 was correct, so we recoded the five students’ 2022 NSC records to indicate that they graduated to match the records submitted in 2020.
  9. A total of 590 students graduated with a bachelor’s degree by the end of the Spring 2022 term from an institution other than their initial MAAPS institution.
  10. A student is considered enrolled if they were enrolled through the end of the Spring 2022 term. Students who withdrew from their institution(s) before the conclusion of the Spring 2022 term are considered not enrolled for that term. Half-time status is determined by the institution, but typically is considered at least six credits enrolled per term.
  11. A total of 962 students persisted (either graduated with a bachelor’s degree or were enrolled at least half-time) by the end of the Spring 2022 term at a degree-granting institution other than their initial MAAPS institution.
  12. What Works Clearinghouse Procedures and Standards Handbook, Version 5.0, Institute of Education Sciences, p.150, https://ies.ed.gov/ncee/wwc/Docs/referenceresources/Final_WWC-HandbookVer5_0-0-508.pdf.
  13. For students who submitted SAT scores, we used concordance tables provided by the College Board to convert to ACT composite scores.
  14. Controls also include a dummy variable capturing whether the student is one of 22 enrolled in Purdue University’s Doctor of Pharmacy (PharmD) Program. These students are no longer considered undergraduates once they enter the program. Unlike students in traditional graduate programs, they typically enter the program two years into their college career, at which point their accumulated credits and cumulative GPA are frozen. In addition, they do not earn an undergraduate degree despite pursuing a doctoral degree in pharmacy. We decided to control for these students’ outcomes due to the unusual nature of the program and the fact that their measured progress is misleadingly low.
  15. What Works Clearinghouse Procedures and Standards Handbook, Version 5.0, Institute of Education Sciences, p.206, https://ies.ed.gov/ncee/wwc/Docs/referenceresources/Final_WWC-HandbookVer5_0-0-508.pdf.
  16. The adjusted mean is calculated by adding the coefficient estimate on the treatment variable in the regression output to the control group mean. For example, using Table 31, the adjusted treatment group graduation rate at Georgia State is 0.61 + 0.07 = 0.68.
  17. As in the previous report, student subgroups of interest included Black students. Black students were initially added as a subgroup in the previous report, published in 2021, after we were informed that the UIA was working on a student success initiative focused on supporting Black students and were asked to explore whether MAAPS had benefitted this student subgroup.
  18. Persistence result remains statistically significant at the p<0.10 level after correcting for multiple comparisons. The Benjamini-Hochberg corrected p-value is 0.02.
  19. Graduation result remains statistically significant at the p<0.10 level after correcting for multiple comparisons. The Benjamini-Hochberg corrected p-value is 0.008.
  20. Persistence result remains statistically significant at the p<0.10 level after correcting for multiple comparisons. The Benjamini-Hochberg corrected p-value is 0.044.
  21. Todd Feathers, “Major Universities Are Using Race as a ‘High Impact Predictor’ of Student Success,” The Markup, 2 March 2021, https://themarkup.org/news/2021/03/02/major-universities-are-using-race-as-a-high-impact-predictor-of-student-success.
  22. Neil Hatfield, Nathanial Brown, and Chad M Topaz, “Do Introductory Courses Disproportionately Drive Minoritized Students Out of STEM Pathways?” PNAS Nexus, no. 4, September 2022, pgac167, https://doi.org/10.1093/pnasnexus/pgac167.
  23. For additional details on the methodology used in the implementation study, please see Appendix A of an earlier MAAPS report, Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Technical Supplement – Interim Findings Report: MAAPS Advising Experiment,” Ithaka S+R, 27 June 2019, https://doi.org/10.18665/sr.311566, and for a more comprehensive description of findings from the implementation study, please see Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Interim Findings Report: MAAPS Advising Experiment,” Ithaka S+R, 27 June 2019, https://doi.org/10.18665/sr.311567.
  24. For this analysis, less academically prepared is defined as having an ACT composite score in the lower half of the distribution.
  25. Among those more academically prepared, Black treatment group students at Georgia State also experienced more student and other initiated advising triggers.
  26. For additional details on the administration of the surveys and the survey items, please see earlier Ithaka S+R reports on the MAAPS study, including Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Monitoring Advising Analytics to Promote Success (MAAPS): Evaluation Findings from the First Year of Implementation” Ithaka S+R, 4 April 2018, https://doi.org/10.18665/sr.307005, and Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Technical Supplement – Interim Findings Report: MAAPS Advising Experiment,” Ithaka S+R, 27 June 2019, https://doi.org/10.18665/sr.311566.
  27. We excluded students who were not enrolled according to their institution’s Spring 2019 census.
  28. The proactive scale was not introduced until the 2018 survey, so for that scale the findings were only consistent across two years.
  29. For this analysis, a high-performing student is defined as entering the spring term of the given year with a cumulative GPA in the upper half of the distribution.
  30. To do this, we tallied the number of students enrolled in the given spring term who had the following characteristics: female and relatively high cumulative GPA; female and relatively low cumulative GPA; male and relatively high cumulative GPA; and male and relatively low cumulative GPA. All enrolled students with the same gender and relative cumulative GPA were assigned a weight equal to the ratio of the total number of enrolled students of that gender and relative cumulative GPA to the number of enrolled students of that gender and relative cumulative GPA who responded to the survey in the given spring term. For example, if there were ten female students with a relatively high GPA enrolled in the Spring 2018 term, and five responded to the Spring 2018 survey, then all five students would be assigned a weight of two (10/5).
  31. See, for example, Rayane Alamuddin, Daniel Rossman, and Martin Kurzweil, “Interim Findings Report: MAAPS Advising Experiment,” Ithaka S+R, 27 June 2019, https://doi.org/10.18665/sr.311567.
  32. At these institutions, between a third and half of treatment group students had not interacted with their MAAPS advisor in person by the end of the second year of the intervention.
  33. “Hitting their Stride: Equity, Outcomes, and the Impact of COVID,” Tyton Partners, 2021, https://tytonpartners.com/hitting-their-stride-2021-equity-outcomes-and-the-impact-of-covid/.
  34. “Proactive Advising: A Playbook for Higher Education Innovators,” University Innovation Alliance, 2021, https://proactiveadvising.theuia.org/assets/uia-proactive-advising-playbook.pdf. See also Dr. Cassandre Alvarado’s presentation on what UT Austin learned from the MAAPS project: https://www.youtube.com/watch?v=9EMhcI1X4g4.
  35. Senate Bill 25, https://capitol.texas.gov/tlodocs/86R/billtext/html/SB00025F.htm.
  36. See Kristen Renn’s presentation on what MSU learned from the MAAPS project: https://www.youtube.com/watch?v=ilNvYP9JHsM.
  37. “Proactive Advising: A Playbook for Higher Education Innovators” University Innovation Alliance, 2021, https://proactiveadvising.theuia.org/assets/uia-proactive-advising-playbook.pdf.
  38. See Lisa McIntyre’s presentation on what ASU learned from the MAAPS project: https://www.youtube.com/watch?v=dhKJxmlMtK8.
  39. Ibid.
  40. Ibid.
  41. The other institution that offered the primary model to its MAAPS students did not observe significant positive impacts. However, that institution failed to implement and offer other key features of the MAAPS intervention. As a result of advisor turnover, the institution had only two MAAPS advisors for most of the project, as opposed to the three advisors that were originally planned, which significantly increased advisors’ caseload and made it more difficult for them to offer proactive outreach. In addition, the institution was not able to implement an early alert system in a timely manner.
  42. Daniel Rossman, Julia Karon, and Rayane Alamuddin, “The Impacts of Emergency Micro-Grants on Student Success: Evaluation Study of Georgia State University’s Panther Retention Grant Program,” Ithaka S+R, 31 March 2022. https://doi.org/10.18665/sr.316611.
  43. Controls also include a dummy variable capturing whether the student is one of 22 enrolled in Purdue University’s PharmD Program.
  44. For all regression tables in Appendix A, robust standard errors are in parentheses, with *** indicating p<0.01, ** indicating p<0.05, and * indicating p<0.10.
  45. Regressions restricted to Pell-eligible students do not include low-income status at baseline as a control.
  46. For all regression tables in Appendix B, robust standard errors are in parentheses, with *** indicating p<0.01, ** indicating p<0.05, and * indicating p<0.10. ✝ indicates that the result is not statistically significant at the p<0.10 level after adjusting for multiple comparisons.
  47. For all regression tables in Appendix C, robust standard errors are in parentheses, with *** indicating p<0.01, ** indicating p<0.05, and * indicating p<0.10.
  48. For all regression tables in Appendix D, robust standard errors are in parentheses, with *** indicating p<0.01, ** indicating p<0.05, and * indicating p<0.10.