Introduction

The American Talent Initiative (ATI) seeks to substantially expand access and success for low-income students at the nation’s colleges and universities with the highest graduation rates.[1] One member of ATI, the University of Texas at Austin (UT Austin), is piloting a bold, data-informed strategy aimed at removing key barriers to access. This new initiative combines automatic admissions with proactive financial aid guarantees, with the goal of encouraging more students from low-income backgrounds to both apply and enroll.

I recently spoke with Dr. Matt Giani, a lead investigator on the UT Austin research team, to learn more about the design and early outcomes of the pilot study.[2] This approach builds on earlier research from the University of Michigan on the power of guaranteed tuition commitments.[3] Key findings from the pilot include:

  • Proactive financial aid guarantees significantly boosted enrollment for automatically admissible students. Among students in the top 6 percent of their class (eligible for automatic admission), the intervention nearly doubled the likelihood of enrolling at UT Austin—from 23 percent in the control group to 43 percent in the treatment group.
  • Application rates increased broadly, but enrollment gains were concentrated among admissible students. While the intervention increased application rates for students outside the automatic admission threshold (class rank of 7–10 percent), it did not lead to statistically significant differences in admission or enrollment for this group—highlighting that financial aid offers alone are not sufficient when admission chances are low.
  • Strategic targeting is key: aid guarantees are most effective when combined with admissions eligibility. The findings underscore the importance of pairing financial incentives with admissions strategies—like automatic or direct admission—to effectively reduce barriers and support the enrollment of high-achieving, low-income students.

This published interview shares highlights from our conversation, including how the UT Austin team designed the pilot study and what their findings could mean for other institutions seeking to expand opportunity for low-income students.


Let’s begin by discussing the concept I’ve been referring to as “stacked student initiatives.” The idea is to combine multiple supports, with the goal of improving student outcomes. We know, for example, that programs aimed at increasing financial aid to low-income students can positively impact college enrollment.

What motivated your team to explore the idea of combining proactive financial guarantees with automatic admission policies?

Dr. Matt Giani

As you mentioned, there have been some really impactful studies demonstrating that proactive financial aid guarantees can really move the needle on students’ application and enrollment, particularly at selective institutions. But one question we had was just, are those types of proactive guarantees necessary in contexts where students have admission certainty?

Texas has had guaranteed admissions for decades now. So, there was a question of, if students are automatically admissible to a particular university, do you even need to proactively guarantee financial aid to them? One hypothesis is that if they’re automatically admissible, they’re going to apply anyway if they want to go. And then they’ll see what financial aid they get. So that’s part of the reason we wanted to test that.

Additionally, we’ve seen in some studies that admission certainty can increase the likelihood that students apply to colleges. But it doesn’t necessarily increase enrollment, at least not as much as we would like. And there have been some really great experimental studies concluding that direct admissions alone might not be sufficient to move the needle if you’re not also addressing the financial barriers.

So that really led to the idea of how we combine the admission certainty with the financial certainty. And of course, there’s the tuition certainty, which had been covered before. But we know that the non-tuition costs of college often exceed the tuition costs. So part of our question was, well, can we also address some of the uncertainty of those non-tuition costs that might stymie students’ enrollment in selective institutions? Which is why we also tried to address housing, both through guaranteed on-campus housing and through a housing scholarship for students who were eligible for the free tuition plan at UT Austin.

One of the things that stood out to me in the white paper was the range of what “financial aid guarantees” can actually mean. There’s full coverage, but also packages that leave out important costs—like housing, transportation, or other essentials. That really highlighted the kind of cognitive work students and families have to do to fully understand what’s being offered.

In your work, you explored this in the context of administrative burden. Could you explain what that concept entails?

Absolutely. The administrative burden framework was developed among public policy and public administration scholars. The broad question they were asking was, why aren’t folks who are eligible for specific government policies or programs taking up those services?

The explanation that led to the development of this framework was that there are these administrative burdens that might prevent folks from accessing benefits they’re eligible for. The three costs that comprise administrative burdens are learning costs, psychological costs, and compliance costs. So the learning costs are just the amount of time and effort it takes to learn about your eligibility. Determining whether you’re eligible for admission to a university or for a specific financial aid program can be really taxing just to figure out, and when you’re searching through the dozens of different colleges you’re considering, trying to sort out your eligibility for each one is a lot of effort in and of itself.

On top of that, you have the compliance costs of all the paperwork, the materials, the steps you need to take to claim your eligibility and to maintain it over time. And of course, the psychological costs are the stress, the anxiety, and sometimes even the shame that can come with accessing benefits. We tried to take that framework and apply it to the process of applying to and enrolling in selective colleges. We took those learning costs, compliance costs, and psychological costs and applied them to the three critical components of selective college admission: admissions, tuition, and the non-tuition components, such as housing.

Our question was, how do we reduce the learning costs, compliance costs, and psychological costs for all three of those components? That led us to design an intervention where we sent letters directly to students with really clear, concise language. We put it on one page: you’re eligible for free tuition for four years plus guaranteed on-campus housing and a housing scholarship. That was one paragraph in the letter that we sent to students. We tried to make it as clear and concise as possible.

We also tried to make it kind of celebratory, emphasizing you’ve earned this and we are thrilled to offer you this scholarship. So there’s no shame: based on your financial need and your inability to pay for college, we’re going to cover the cost of tuition. We wanted it to be positive and celebratory.

And we tried to reduce the compliance costs as much as possible, but students still needed to apply for housing. We couldn’t just say, we’re going to give you housing no matter what, regardless of your application. The university was like, no, all students have to apply for housing, and it’s separate from their application to the college because we don’t know if they want to live on campus or not. So there were still some compliance costs, of course, that students incurred, but we at least used this framework to try to think through minimizing those costs as much as possible to increase the effectiveness of the outreach.

Could you talk a bit more about how those choices shaped your research methods? Our readers will find it especially compelling that this was a randomized controlled trial (RCT). What does that mean for how you were able to structure the study and, ultimately, how confidently you can speak to your findings?

In Texas, there is no statewide repository that includes students’ contact information, their class rank, and their eligibility for free and reduced-price lunch. Only the districts had access to that information. So out of necessity, we had to reach out to districts and partner with them.

We identified schools and school districts that were high poverty, meaning they served at least 50 percent low-income students, and that had a limited history of sending kids to UT Austin. So we looked at historical data and identified districts that didn’t send a lot of kids to UT, despite the fact that we have the automatic admission plan where students can attend UT from any high school in the state as long as they’re in that top six percent (at the time of our study).[4]

So from that, we excluded really large districts and really small districts. The really large districts, such as Houston, Austin, Dallas, and Fort Worth, were excluded because those are often the districts where there’s already the most recruitment effort. And we excluded the really small districts mainly for pragmatic reasons: we didn’t want to focus a lot of our recruitment effort on schools that might have zero eligible students.

So we ended up in the kind of medium-sized, high-poverty school district range, which was still a whole lot of districts. From that group, we randomly selected 100 school districts. Of the hundred districts that we emailed, we were able to establish data sharing agreements and partnerships with 20 of them within a few months of reaching out to them.

We wanted to partner with districts to have ongoing relationships so we could continue to test different strategies and different ideas moving forward. So we didn’t want it to be a one-time research project. We wanted it to be like a new form of relationship between universities and school districts, particularly those that had received less recruitment historically.
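To make the district-sampling procedure described above concrete, here is a minimal sketch in Python. The field names, enrollment cutoffs, and “limited history” threshold are hypothetical stand-ins; only the 50 percent low-income criterion and the random draw of 100 districts come from the description above.

```python
import random

def draw_district_sample(districts, n=100, seed=0):
    """Filter to medium-sized, high-poverty districts with little history of
    sending students to UT Austin, then randomly draw n districts to contact.

    districts: list of dicts with hypothetical keys 'name', 'pct_low_income',
    'enrollment', and 'recent_ut_enrollees'.
    """
    eligible = [
        d for d in districts
        if d["pct_low_income"] >= 0.50            # high poverty: at least 50% low-income students
        and 1_000 <= d["enrollment"] <= 25_000    # drop the very small and very large districts (illustrative cutoffs)
        and d["recent_ut_enrollees"] <= 5         # limited history of sending students to UT (illustrative threshold)
    ]
    rng = random.Random(seed)                     # fixed seed so the draw is reproducible
    return rng.sample(eligible, k=min(n, len(eligible)))
```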

You mentioned the outreach materials and wanting to make sure that you were accessing a diverse range of low-income students. Can you talk a little bit about how you made sure that those materials were culturally and linguistically accessible to student populations that you were targeting?

One of the greatest assets of Texas is that it is incredibly diverse: students of color have been the majority of the K-12 population for many years now. It was also very important that we ensured that everything about the intervention and the research was compliant with all state and federal policy. And Texas is one of many states that has passed anti-DEI legislation that generally prevents universities from targeting students based on race or ethnicity.

So we did not target students based on race or ethnicity. We defined district and school eligibility based on income. We randomly selected the sample. We did not consider students’ race or ethnicity when drawing the sample, and we did not change the materials based on students’ race or ethnicity.

That said, we still wanted the materials to be linguistically and culturally accessible. So we did a couple of things. One is that we had letters to students and we also had letters to parents or guardians. We tried to ensure that the language of that letter was very inclusive to respect all the different types of households that students are coming from. We also translated those letters into Spanish and we did that for all students.

Let’s dive into the results. Your study found significant impacts on both college application and enrollment rates among students who were automatically admissible.

Could you elaborate on those findings? Were there any outcomes that surprised you or challenged your initial expectations?

The top-line finding is that this intervention, where you directly guarantee financial aid to students before they have applied to college, resulted in significant increases in students’ likelihood of applying to, being admitted to, and enrolling at UT Austin. For students who were automatically admissible, those in the top six percent, the intervention almost doubled the likelihood that they would enroll at UT Austin.

The likelihood for the control group was about 23 percent. For the treatment group, it was about 43 percent. That is a 20 percentage-point increase in their likelihood of enrolling, which is huge.

And once again, with the randomized controlled trial, we had 32 schools from those 20 districts in the sample. We randomly assigned 16 schools to the treatment group that got the proactive guarantee. The 16 schools in the control group also got letters from UT Austin signed by the VP of admissions saying, you’re an amazing student, we’d love for you to come to UT, along with a glossy burnt orange informational packet about how incredible the University of Texas at Austin is.

It is important to note that getting a personalized letter with your name on it from the vice president of admissions mailed to your home, with the glossy information packet, is atypical. So even the control group is really a no-guarantee treatment condition; it is still different from business-as-usual recruitment practice.

Another point to note: because our sample included students in the top 10 percent, we have students in the 1 to 6 percent range who are eligible for automatic admission and students in the 7 to 10 percent range who are not eligible for automatic admission to UT Austin.

So what we find for the 7 to 10 percent group is that the intervention did have a significant impact on application, but it didn’t have much effect on admission and enrollment. It was only about a one percentage-point increase in enrollment, and that wasn’t a statistically significant difference.

About 8 percent of the control group in the non-automatic-admission range got into UT, versus about 9 percent of the treatment group. So once again, that’s not that surprising, given that it’s very hard to get into UT Austin if you’re not eligible for automatic admission.

So simply guaranteeing financial aid alone does not mean you’re more likely to get in. If you’re targeting students who have a very low likelihood of getting in, then just giving them financial aid probably isn’t going to help the fact that they’re unlikely to be admitted.

So I’ll just say that’s important because, once again, it underscores the need to combine admission efforts, whether that’s direct admission or at least very strategic targeting of students who are likely eligible for admission, with financial aid guarantees and with addressing those administrative burdens that might stymie admissible students’ transition into selective colleges.
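As a rough illustration of the cluster-randomized design and the comparisons described above, here is a minimal sketch in Python of school-level random assignment and a simple treatment-versus-control difference in enrollment rates. It is not the study’s analysis code, which would account for clustering and student covariates; the data structures and function names are made up.

```python
import random

def assign_schools(school_ids, seed=0):
    """Randomly split schools into a treatment arm (proactive guarantee letter)
    and a control arm (standard recruitment letter) of equal size."""
    rng = random.Random(seed)
    shuffled = rng.sample(school_ids, k=len(school_ids))
    half = len(shuffled) // 2
    return set(shuffled[:half]), set(shuffled[half:])  # (treatment, control)

def enrollment_rate(students, arm_schools):
    """students: list of dicts with hypothetical keys 'school_id' and 'enrolled' (bool)."""
    arm = [s for s in students if s["school_id"] in arm_schools]
    return sum(s["enrolled"] for s in arm) / len(arm)

# With rates like those reported above (roughly 0.43 for treatment schools and
# 0.23 for control schools among automatically admissible students), the
# estimated effect is the difference: about a 20 percentage-point increase.
```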

These are really strong findings, and they offer a lot for policymakers and institutions to think about—especially when it comes to combining programs to better support low-income students. That said, could you talk a bit about the study’s limitations? What should readers keep in mind when looking at the results?

Yeah, well, many. One really important limitation is that the intervention was targeted at automatically admissible students, or at least students in the top 10 percent, many of whom were automatically admissible. And the intervention bundled the guaranteed tuition, guaranteed on-campus housing for the first year, and a housing scholarship. We don’t know which of those components moved the needle, or if it was the combination of all three.

I’ll say that we had always conceptualized this as a pilot study because this was our first time establishing these partnerships with districts, the first time collecting this data, the first time doing an RCT like this. So we wanted to kind of put all of these components into the intervention just to kind of maximize the likelihood that we would find significant and positive effects. Now that we’ve done that, we really need to explore the mechanisms and the underlying workings of which of these guarantees matter most.

How might these findings apply to other schools like UT Austin—highly selective institutions with strong graduation rates? What would it take to adapt this kind of approach elsewhere?

We sent letters to students’ homes. That was the strategy that we used. However, there are lots of other ways to disseminate information to students. You could go through schools and teachers, or school counselors might be a really great approach. You could use social media. You could email students.

There’s probably a lot of students who never got our letters at all. We don’t know exactly what that percentage is, but I would estimate maybe 10 percent to a third of the sample might just not have received any letter at all. And we still found these effect sizes despite that fact. So I think there’s also a lot you can do to think about, like, how do you ensure that you’re getting the communications to students in a way that they pay attention to, in a way that they respond to, and in a way that leads them to actually apply to the institution that’s recruiting them.

And beyond that, what advice would you offer to other universities that are interested in exploring similar research efforts? What should they be thinking about—particularly when it comes to communicating with students and building the kinds of partnerships that are essential for broader implementation?

I think one is just to be really thoughtful about things like, what are the school districts, schools, and students that you’re targeting for recruitment? How can you establish partnerships with those entities? Of course, purchasing lists of students and trying to recruit them is still an important part of enrollment management and admissions. But perhaps through research-practice partnerships like this, you’re able to come up with more informed strategies and produce better evidence of the efficacy of the strategies that you use.

I think the other thing that I would suggest is generating the type of evidence that will allow us to say something definitive about how effective and how cost-effective a strategy was.

It’s kind of odd to me that we highlight the incredible research that our universities are doing, but when it comes to recruitment and supporting student success, we don’t design and implement the programs in a way that allows us to generate that evidence. So this effort was very new for the university, where it’s like, let’s do a randomized controlled trial to see if the strategy works or not. We hope that additional universities will be interested in partnering with us or other research teams that can produce rigorous evidence of the effectiveness of their strategies.

And then I think I would just say, continue listening to the voices of students and figuring out the anxieties, the uncertainties, and the burdens that they’re facing. Really trying to center student voices and then designing strategies aimed at addressing the barriers that students are facing is, of course, really critical as well.

For anyone considering these types of strategies in the future, I think the question is, what are the socio-cultural and linguistic backgrounds of the students that you’re recruiting? And how can you work with those students and families to create content and language that is maximally accessible? I’ll say there’s a lot more we hope to do in future research, including putting the letters in front of students and having them mark them up and say, how would you phrase this differently? How would you revise this? How could we say this in a way that resonates with you? And we’d like to do the same thing with parents as well.

Thanks so much for sharing all of this. Just one final question to wrap up.

Looking ahead, we imagine there’s a lot more you’d like to explore in this area. What are your thoughts on building a stronger evidence base for the adoption of stacked or bundled services in higher education? And how do you see opportunities for collaboration between researchers and practitioners fitting in to advance this work?

As I alluded to previously, despite how excited I was about the findings from our study, it’s still just a single study. And there’s still a lot more research to do to figure out if findings can be replicated in other contexts. Where does this intervention work? For whom does it work? Under what conditions does it work? So, the more institutions we can partner with, the more we’ll learn about those contours of the effectiveness of an intervention like this.

We’re very interested in other contexts, with other state or institutional policies about automatic admission or automatic financial aid guarantees: how effective would these types of interventions be in those contexts? So we have a lot to learn there.

At the end of the day, what we want to understand is, how do these interventions support students’ long-term success? Beyond looking at enrollment, what we’d like to see are some of those mediators that might be predictive of future success.

For example, are students more likely to live on campus? Are they more likely to take higher numbers of semester credit hours because they don’t have to work off campus or live off campus, and they can be more engaged? Are they more connected to community life? Are they more involved in student organizations? Are they more a part of their community because we’ve addressed housing and some of the housing costs and other financial costs that might reduce the amount they need to work in order to cover college, even though we know that the majority of college students do work in this day and age? So we have a lot to learn about how an intervention like this might link to subsequent postsecondary outcomes.

Great. Thanks for your time today, Matt. We greatly appreciated your insights and look forward to following along with you as you move this important research forward.

Endnotes

  1. The American Talent Initiative (ATI) is a Bloomberg Philanthropies-supported collaboration between the Aspen Institute’s College Excellence Program, Ithaka S+R, and a growing alliance of colleges and universities. For more information, visit the ATI website at https://americantalentinitiative.org/.
  2. Matt S. Giani, Richard Murphy, Stella M. Flores, Jori Barash, Brian Dixon, and Julio Mena Bernal, “From Passive Promises to Proactive Guarantees: The Efficacy of Financial Certainty Interventions Among Automatically (In-)Admissible Students,” EdWorkingPaper: 25-1158 (Annenberg Institute at Brown University, March 2025), https://doi.org/10.26300/bk34-s137.
  3. Susan Dynarski, CJ Libassi, Katherine Michelmore, and Stephanie Owen, “Closing the Gap: The Effect of Reducing Complexity and Uncertainty in College Pricing on the Choices of Low-Income Students,” American Economic Review 111, no. 6 (June 2021), https://doi.org/10.1257/aer.20200451.
  4. “Top 10 Percent Law,” UT News, https://news.utexas.edu/topics-in-the-news/top-10-percent-law/.