Abstract
States provide merit-based financial aid to enhance access to and success at in-state colleges. Over time, however, many recipients fall short of scholarship renewal requirements—community college attendees and low-income students in particular. Using a regression discontinuity design, we examine how losing eligibility for the Tennessee HOPE scholarship differentially affects community college students’ academic outcomes across income levels. Losing HOPE eligibility increased stopout and reduced vertical transfer and associate degree completion among all community college students. The lowest-income students experienced the strongest negative effects on transfer and associate degree completion. Merit-aid renewal criteria ultimately preclude many students from persisting and realizing the benefits of degree completion. States with similar policies should consider interventions that help more students maintain scholarship eligibility or reforms to existing renewal criteria.
Introduction
States adopt merit aid programs—postsecondary financial aid based on academic criteria instead of demonstrated need—to improve high-school students’ performance and increase postsecondary access and success at in-state colleges (Doyle, 2010; Dynarski, 2004; Nguyen et al., 2019). Merit aid programs often have initial eligibility criteria such as thresholds on high school grade point averages (HSGPA) or college entrance examination scores (e.g., ACT/SAT), and those standards vary substantially across states (Domina, 2014; Education Commission of the States, 2021). Researchers have exploited these eligibility criteria to evaluate the effects of merit aid on postsecondary enrollment, persistence, completion, and labor market outcomes (e.g., Bettinger et al., 2019; Bruce & Carruthers, 2014; Castleman, 2014; Gurantz & Odle, 2022; Scott-Clayton, 2011; Scott-Clayton & Zafar, 2019). The effects of merit aid on access and success are mixed across settings (Herbaut & Geven, 2020; Nguyen et al., 2019).
Researchers have questioned why merit aid does not have a stronger influence on student outcomes. Some point to differences in initial eligibility criteria, award generosity, and related program design elements as factors that might contribute to the effectiveness of merit aid, and criticize merit aid programs for awarding funds to students who would have enrolled in college and completed degrees regardless of aid receipt (Domina, 2014; Dynarski, 2000; Herbaut & Geven, 2020; Nguyen et al., 2019). However, another potential explanation is that, once enrolled in college, recipients of state merit aid must often meet additional academic criteria to maintain scholarship eligibility. Merit aid recipients frequently lose scholarship eligibility over time due to such maintenance criteria (Henry et al., 2004). Community college attendees and students from economically disadvantaged backgrounds—among other subgroups of students—experience disproportionate negative effects from merit aid renewal policies (Gross et al., 2016; Ribar & Rubenstein, 2023; Tennessee Higher Education Commission [THEC], 2016b). The loss of financial aid represents a significant shock to students’ finances, and students who are disproportionately subjected to renewal criteria (e.g., low-income students) are also more likely to adjust their enrollment in response to changes in price (DesJardins & Bell, 2006). The overall effectiveness of state merit aid in facilitating positive outcomes for students may thus be undermined by renewal policies, especially if students who are sensitive to changes in their finances are also more likely to face scholarship loss.
Relative to the literature on the effects of merit aid receipt, less research examines the consequences of merit aid loss on measures of student success. The few studies investigating merit aid loss generally find negative effects on persistence and completion (Carruthers & Özek, 2016; Cummings et al., 2022; LaSota et al., 2021). This emerging literature, however, has not focused on community college students, nor has it empirically tested for heterogeneous effects based on income. This paper fills these gaps by studying Tennessee’s merit aid program, the HOPE scholarship (HOPE), which provides scholarships to more than 30,000 first-time freshmen annually. Recipients can receive the award for 5 years, provided they maintain a minimum college GPA (CGPA) at each of four renewal “checkpoints.” Importantly, HOPE award amounts vary across the income distribution. Policymakers designed the program to provide additional assistance to low-income students, with a supplementary award (HOPE Aspire) for students with an adjusted gross income (AGI) of $36,000 or less. Over 40% of students lose their scholarship before the second year of college (THEC, 2016b), with scholarship loss rates disproportionately high for community college students, particularly those from low-income backgrounds.
We use a regression discontinuity (RD) design to evaluate whether and the extent to which merit aid eligibility loss differentially affects community college students’ stopout, transfer, and completion outcomes across income levels. Among the full sample of students, we find that losing HOPE eligibility increases community college students’ stopout and negatively affects cross-sector transfer and associate degree completion within 4 years of entry. The adverse effects of HOPE eligibility loss are strongest among the lowest-income community college attendees, reducing cross-sector transfer within 4 years by 14 percentage points and associate degree completion within 4 years by 17 percentage points. HOPE eligibility loss also causes an 11-percentage point decrease in the probability of associate degree completion within 2 years among higher-income community college students. However, there is no effect on associate degree completion within 4 years among this subgroup of students. Our results suggest that merit aid renewal criteria preclude many community college students from persisting and realizing the benefits from credential attainment, especially among students from the lowest-income backgrounds.
The Tennessee HOPE Scholarship
HOPE Eligibility and Generosity
The Tennessee HOPE scholarship, launched in 2004, awards financial aid to graduates of Tennessee high schools who enroll in a 2- or 4-year Tennessee public or private postsecondary institution within 16 months of high school graduation. Students must file the Free Application for Federal Student Aid (FAFSA) and attain either a score of 21 (1060) or higher on the ACT (SAT) or a minimum of a 3.0 cumulative HSGPA to obtain the HOPE scholarship. Given the state’s emphasis on providing additional support to students from low-income backgrounds, HOPE recipients with an adjusted gross income (AGI) of $36,000 or less are eligible for a supplementary award called HOPE Aspire. 1
The maximum HOPE award for community college students was $1,000 per semester (up to $3,000 per academic year, including summer) in the time period under study. Students eligible for the Aspire supplement could receive an additional $750 per term. The maximum award represented between 50% and 73% of tuition and fees at community colleges. 2 HOPE awards are applied after the student’s federal Pell Grant has been allocated, and are guaranteed up to the cost of attendance, with students receiving a refund check if their HOPE award exceeds the cost of tuition and fees. Students therefore do not forfeit HOPE funding when the Pell Grant covers all of their community college tuition and fees, as they would under many last-dollar scholarships that only cover costs up to tuition and fees.
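The award-stacking rule above (Pell first, HOPE next, any excess over tuition and fees refunded) can be sketched as follows. This is an illustrative Python sketch, not the state’s disbursement logic; the function name and the dollar figures in the example are hypothetical, while the $1,000 HOPE and $750 Aspire per-term maximums come from the text.

```python
def semester_award(tuition_fees, pell, aspire_eligible):
    """Return (hope_paid, refund) for one community college semester.

    HOPE is applied after the Pell Grant, is guaranteed up to the cost
    of attendance (not modeled here), and any amount beyond remaining
    tuition and fees is refunded rather than forfeited.
    """
    HOPE_MAX = 1_000          # per-semester HOPE maximum in the study period
    ASPIRE_SUPPLEMENT = 750   # additional per-term award if AGI <= $36,000

    award = HOPE_MAX + (ASPIRE_SUPPLEMENT if aspire_eligible else 0)
    remaining_tuition = max(tuition_fees - pell, 0)
    refund = max(award - remaining_tuition, 0)
    return award, refund

# Example (hypothetical figures): Pell covers all tuition and fees, so
# the full HOPE + Aspire award is refunded, unlike a last-dollar design.
award, refund = semester_award(tuition_fees=1_900, pell=2_800, aspire_eligible=True)
print(award, refund)  # 1750 1750
```

Under a strict last-dollar design, the same student would receive nothing once Pell exhausted tuition and fees; the refund mechanism is what preserves the award’s value for the lowest-income recipients.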
HOPE Renewal Criteria
Once enrolled, recipients of the HOPE scholarship must meet specific criteria to maintain scholarship eligibility. The primary criterion is a minimum cumulative CGPA, which the state evaluates at four “checkpoints” along students’ educational trajectories. Each checkpoint is based on cumulative credits attempted. Students reach the first checkpoint in the semester in which they attempt 24 credits. Subsequent checkpoints occur once students have attempted 48, 72, and 96 credits. This study focuses on the first checkpoint, in which a cumulative CGPA under 2.75 triggers future ineligibility for the HOPE scholarship (THEC & Tennessee Student Assistance Corporation [TSAC], 2023a). 3 The first checkpoint is more consequential than the others because 67% of HOPE recipients who lose their scholarship do so by this juncture (Cummings, 2020).
In addition to the CGPA requirement, HOPE recipients must continuously enroll in a HOPE-eligible Tennessee institution and not change their enrollment intensity within a given semester (i.e., moving from full- to part-time enrollment status). The penalty for not meeting the renewal criteria is that students lose eligibility for all types of HOPE awards (Aspire included) beginning in the subsequent semester. As long as HOPE recipients maintain their eligibility from checkpoint to checkpoint, they can receive the award until they either complete a bachelor’s degree or exhaust their eligibility by, among other limits, attempting a total of 120 credit hours.
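As a compact restatement, the first-checkpoint renewal rule described above can be encoded as a small sketch. The function names are assumptions for illustration; the continuous-enrollment and enrollment-intensity requirements, and any thresholds at later checkpoints, are deliberately not modeled.

```python
CHECKPOINTS = (24, 48, 72, 96)  # cumulative credits attempted at each checkpoint
CGPA_MIN_CP1 = 2.75             # renewal threshold at the first checkpoint
CREDIT_CAP = 120                # attempted-hours eligibility limit

def at_first_checkpoint(credits_attempted):
    """True in the semester a student has attempted 24 or more credits."""
    return credits_attempted >= CHECKPOINTS[0]

def renews_at_cp1(cum_cgpa, credits_attempted):
    """Apply the first-checkpoint CGPA criterion for HOPE renewal."""
    if credits_attempted > CREDIT_CAP:
        return False               # eligibility exhausted by the credit limit
    if not at_first_checkpoint(credits_attempted):
        return True                # checkpoint not yet reached; aid continues
    return cum_cgpa >= CGPA_MIN_CP1

print(renews_at_cp1(2.80, 24))  # True: meets the 2.75 threshold
print(renews_at_cp1(2.74, 24))  # False: loses HOPE (and Aspire) next semester
```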
Prevalence and Consequences of HOPE Loss
Many students lose the HOPE scholarship before their second year of college. In Fall 2015, 42% of HOPE recipients did not meet the scholarship renewal criteria (THEC, 2016b). Renewal rates varied by the type of HOPE award and sector. Compared to the overall population of HOPE recipients, loss rates were notably higher for community college students (55%) and were higher yet for the low-income Aspire recipients from community colleges (60%; THEC, 2016b). The pronounced loss rates for Aspire students, coupled with the larger award amounts they stand to lose, are key factors motivating our interest in exploring heterogeneous effects across the income distribution.
The CGPA renewal requirement is the principal reason why community college students lose their HOPE scholarship at the first checkpoint. Nearly 60% of the community college students in our sample who reached the first checkpoint with an intact HOPE scholarship and then lost it in the subsequent term (N = 12,510) had CGPAs below 2.75. 4 Losing HOPE eligibility due to an insufficient CGPA at the first checkpoint is also correlated with student success (Table 1). Among students who exceeded the CGPA threshold at the first checkpoint, only 7% (12%) stopped out in the next (second) semester after the checkpoint. In contrast, 33% (45%) of students who lost HOPE scholarship eligibility stopped out in the next (second) term. Students whose CGPA exceeded the renewal threshold were more likely than their ineligible peers to transfer to a 4-year institution (42% vs. 16%, respectively) as well as earn an associate degree within 2 years (25% vs. 4%) or 4 years (54% vs. 19%).
Student Outcomes by GPA Eligibility Status at First Checkpoint
Note. The full sample (Fall 2010–Fall 2015) includes 27,272 students. To allow the requisite time for transfer (Fall 2010–Fall 2014) and 4-year completion (Fall 2010–Fall 2013), those samples include 20,150 and 15,219 students, respectively. All outcomes measured after the first checkpoint.
Tennessee introduced an additional statewide aid program in 2015—the Tennessee Promise (TN Promise)—which covers tuition and fees at public community colleges. Of relevance to our study, TN Promise may backfill aid for community college enrollees who lose Tennessee HOPE—but only if students satisfy the additional eligibility criteria for TN Promise (e.g., community service hours, mandatory meetings with an advisor). Only one of our cohorts overlaps with the introduction of Tennessee Promise, and our results are robust to excluding this cohort. 5 Informed by prior research (e.g., Carruthers & Özek, 2016; Welch, 2014) and conversations with partners in Tennessee, we are unaware of any other policies or interventions (e.g., satisfactory academic progress) that are triggered at the 2.75 CGPA threshold used to determine HOPE eligibility. Additionally, Tennessee HOPE operates in a similar fashion today as it did during the timeframe under study. At present, the initial eligibility and renewal criteria are identical to those in place between 2010 and 2015, and scholarship renewal rates for current HOPE recipients are likewise comparable to those of the cohorts in our analysis. 6
Relevant Literature
Merit Aid and Student Success
Many states implemented merit-based aid programs over the last 3 decades (Domina, 2014; Doyle, 2006), resulting in an extensive literature investigating the impact of these programs on student enrollment, persistence, completion, and, more recently, labor market outcomes. Evidence has been mixed regarding how merit-based aid programs affect students’ academic outcomes (for a review, see Nguyen et al., 2019). Some evidence points to significant positive effects of merit aid on persistence, completion, and labor market outcomes (Bettinger et al., 2019; Dynarski, 2004; Scott-Clayton & Zafar, 2019). Other research demonstrates less pronounced effects on these outcomes but shows that merit aid induces some students to enroll at a 2-year rather than a 4-year institution, although the size of this shift is relatively small (Bruce & Carruthers, 2014; Gurantz & Odle, 2022). In particular, using a regression discontinuity design, Welch (2014) found that initial eligibility for Tennessee HOPE had a null impact on academic performance, persistence, completion, and labor market outcomes for Tennessee community college students.
The mixed evidence discussed above has led researchers to question why the impacts of merit aid were inconsistent and less pronounced than other forms of financial aid. Some of the most compelling responses pointed to differences in program design, award generosity, and the criteria required to maintain eligibility (Domina, 2014; Herbaut & Geven, 2020; Nguyen et al., 2019). Regarding renewal criteria, Henry et al. (2004) showed that over half of the students who initially qualified for merit-based aid in Georgia eventually lost their eligibility due to academic or administrative requirements. The authors found a small, positive relationship between merit aid and student outcomes, although this relationship was concentrated almost exclusively among students who maintained their eligibility. Additional research demonstrated that renewal criteria attached to merit aid often disproportionately affected certain groups, including students from economically disadvantaged backgrounds (Gross et al., 2016) as well as Black and Hispanic students (Ribar & Rubenstein, 2023). This line of inquiry led some to argue that merit aid generally benefitted students whose educational outcomes were less contingent upon the receipt of aid, whereas students who relied upon aid to enter college were often those who faced the most barriers to completion, making their outcomes more reliant upon continued receipt of merit aid (Carruthers & Özek, 2016). These findings, coupled with the fact that spending on merit-based aid programs is outpacing spending on need-based aid programs (Cummings et al., 2021), raise equity concerns (Baum, 2008; Knox, 2023).
Merit Aid Loss and Student Success
Recent work has focused on the impact of losing merit-based aid on persistence and completion, particularly in Tennessee. 7 Carruthers and Özek (2016) investigated what happened to students who did not meet the minimum CGPA criterion for maintaining their HOPE scholarship in Tennessee. Using a sample of 2- and 4-year students, the authors found that losing HOPE caused a 2.9 percentage point decrease in persistence and a $245 increase in biannual earnings. They did not, however, find any significant differences in on-time completion. Persistence rates among 2-year students fell by as much as 8.6 percentage points, and low-income students (defined as having an annual family income below $60,000) experienced more pronounced decreases in their credit load relative to their higher-income peers. However, these results for low-income students are generated using a pooled sample of 2- and 4-year students, meaning the authors cannot parse specific outcomes for low-income students at 2-year institutions. Using more recent data, Cummings et al. (2022) also estimated the impact of losing Tennessee HOPE among students at 4-year institutions, finding a 2.5 percentage point decrease in persistence and similarly null results for degree completion. However, the authors found heterogeneous effects across income and racial subgroups of these 4-year HOPE recipients: Higher-income and White students experienced an increase in next-semester stopout as a result of HOPE eligibility loss, whereas low-income (defined as receiving an Aspire supplement) and Black students were more likely to transfer to a 2-year institution.
While the primary contribution of this work is investigating whether renewal criteria diminish the overall effectiveness of merit aid programs, past research has highlighted two additional areas for further investigation on this topic: understanding how HOPE loss uniquely affects student outcomes at community colleges, and further investigating heterogeneous effects for subgroups of students. Our work pursued both objectives. First, we expanded upon Carruthers and Özek’s (2016) initial analysis by including more—and more recent—cohorts in our sample, which, in turn, enabled us to focus explicitly on community college students, who lose eligibility for merit-aid scholarships at especially high rates (Gross et al., 2016; Ribar & Rubenstein, 2023; THEC, 2016b). Carruthers and Özek (2016) were limited to investigating two cohorts (between 2004 and 2005) and thus pooled 2- and 4-year students in their analysis. Our sample includes up to six cohorts (between 2010 and 2015), offering sufficient statistical power to estimate effects exclusively for students at a community college (see the Appendix in the online supplemental material for minimum detectable effect sizes). Moreover, the comparative recency of our sample allowed us to sidestep a meaningful limitation in the Tennessee administrative data that Carruthers and Özek (2016) faced regarding systematic missingness in the CGPA running variable. 8
Second, we operationalized a more granular categorization of income status to observe the effect of losing HOPE on multiple subgroups in alignment with policy priorities in Tennessee. We disaggregated income levels based on Aspire status (family income below $36,000), Pell-eligible with no Aspire (family income above $36,000 but still eligible to receive the Pell Grant), and non-Pell-eligible students. This approach allowed us to create subgroups that interest policymakers in Tennessee, while also disaggregating the relatively coarse measures employed in prior work (e.g., above/below $60,000) to observe heterogeneous effects for more precisely operationalized income bands. Unlike prior work on 4-year students (Cummings et al., 2022), we could not analyze race/ethnicity-based subgroups in this study due to insufficient sample size.
Need-Based Aid Loss and Student Success
While our study is centered on state merit aid, other research has explored the consequences of losing eligibility for federal and state need-based aid (e.g., Bettinger, 2015; Kim, 2025; Mabel, 2020; Schudde & Scott-Clayton, 2016; Scott-Clayton & Schudde, 2020; Sparks, 2025a). LaSota et al. (2021) ably review this literature, generally finding that “reductions in grant aid . . . likely reduce college enrollment and persistence and, when aid is awarded based on financial need, these effects will, by definition, be greater for students from low-income families” (p. 166). A handful of these analyses also explicitly focus on community college students (Schudde & Scott-Clayton, 2016; Scott-Clayton & Schudde, 2020; Sparks, 2025a).
Some studies examine the Satisfactory Academic Progress (SAP) criteria for the federal Pell Grant, which primarily involve enrolled students maintaining a CGPA of 2.0 or higher (Kim, 2025; Schudde & Scott-Clayton, 2016; Scott-Clayton & Schudde, 2020). Scott-Clayton and Schudde (2020), for example, studied the effects of SAP on community college students’ academic outcomes using both RD and difference-in-differences designs. For students who did not meet SAP in their first year, the authors found negative effects after 6 years on attempted and completed credits, persistence, and associate degree and certificate completion in some or all specifications. Of note, the CGPA threshold of 2.0 for SAP is much lower than the CGPA threshold of 2.75 for renewing HOPE.
Others investigate a change to the Pell Grant program involving a reduction in lifetime eligibility from nine to six full-time equivalent (FTE) years (Mabel, 2020; Sparks, 2025a). Sparks (2025a), for instance, studied whether the threat of losing Pell eligibility affected students’ academic outcomes once they had expended 5 years of Pell eligibility. Using individual fixed effects models, the author found that the threat of aid loss discouraged enrollment, reduced enrollment intensity, and lowered academic performance, and led students to substitute earnings and loans for the foregone grant aid. These effects were more pronounced for community college students. For reference, the community college students in Sparks (2025a) had maintained enrollment for a much longer time span in order to exhaust five FTE years of Pell eligibility, relative to the students in this analysis, who faced academic renewal requirements for HOPE after cumulatively attempting 24 credits.
Conceptual Framework
We draw from economic theory to hypothesize how students might respond to the loss of merit-based aid. We also draw upon the emerging literature on administrative burden to conceptualize additional nonpecuniary costs accompanying the HOPE renewal requirement that may influence students’ reactions to losing financial aid. Concepts from both literatures predict differential effects of scholarship loss on persistence, transfer, and completion based on income levels.
Shocks to the Direct Costs of College
Economic theory stipulates that rational agents make choices that maximize their utility after assessing the benefits and costs of the available alternatives (Becker, 1975). In this setting, community college students must decide whether to persist into subsequent semesters (and eventually transfer or complete a degree). The benefits primarily consist of future earnings attributable to community college education beyond what students would have earned had they not enrolled. The direct costs include the price paid to attend community college net of all financial aid received. 9 All else equal, losing eligibility for the HOPE scholarship will increase the direct costs of remaining enrolled in college. We expect that losing HOPE will result in some students discontinuing their enrollments because they determine that the costs now exceed the expected returns.
Some groups of students exhibit greater sensitivity to college price changes (DesJardins & Bell, 2006), especially students from low-income backgrounds, who have less financial flexibility and fewer resources to respond quickly to price changes (Hurwitz, 2012; Kim, 2012). Indeed, prior work in Tennessee has shown that low-income students are significantly more likely than their more affluent peers to indicate that HOPE scholarship funds are important in their college decisions (Ness & Tucker, 2008). Low-income students also disproportionately attend community colleges, often because these institutions are more affordable at baseline (McDonough & Calderone, 2006). Thus, relative to their higher-income peers, we expect low-income, community college students to experience a more pronounced negative effect on their academic outcomes after scholarship loss due to heightened price sensitivity.
There are additional reasons to expect community college students’ responses to losing HOPE eligibility will vary in intensity along the income gradient. HOPE recipients with family incomes below $36,000 receive an additional $750 per semester due to the Aspire supplement. If these students lose HOPE, they also lose the Aspire supplement. The lowest-income students—who are likely more price sensitive than higher-income students—may face even larger increases in the direct costs of college due to scholarship loss. For this reason, we expect losing HOPE to have the strongest negative effects on the lowest-income students. The effects of losing HOPE may also vary depending on the extent to which students are eligible for other forms of need-based aid. Students may also differentially compensate for lost HOPE dollars through work, (additional) loans, out-of-pocket funds, parental transfers, and other means (Carruthers & Özek, 2016).
Administrative Burden
Scholarship loss is also likely to produce nonpecuniary costs for students, which the administrative burden literature articulates as the learning, compliance, and psychological costs experienced by individuals when interacting with government programs (Herd & Moynihan, 2018). Learning costs stem from information-gathering processes about a program. Compliance costs encompass the time and effort individuals expend following programmatic requirements. Psychological costs include disempowerment and stigmas associated with program participation (Moynihan et al., 2015). Administrative burden is commonplace across state financial aid programs (Dynarski et al., 2022; Everett et al., n.d.; Rosinger et al., 2021). Examples in the HOPE context include the lack of knowledge of rules and requirements among potential HOPE recipients (THEC & TSAC, 2021); annual FAFSA submission for continued eligibility (and income verification for Aspire recipients); regular monitoring of academic standing relative to renewal criteria; and the absence of mechanisms to notify at-risk students. When scholarship loss occurs, students face additional burdens including meetings with advisors and financial aid counselors, appeals processes, and time spent searching for alternative financial support.
Given its roots in behavioral science, the administrative burden literature theorizes how individuals interpret and respond to costs during these interactions (Herd & Moynihan, 2018; Rosinger et al., 2021). Behavioral science stresses the limitations in time and attention, psychological biases, and contextual factors that constrain individuals’ decision-making processes, which may result in them making choices or appraising options in a different manner than strict cost-benefit analyses would suggest (Simon, 1982; Tversky & Kahneman, 1974). Temporal biases, for example, may result in individuals avoiding courses of action due to minor costs in the present, even if those alternatives are likely to translate to substantial benefits in the future (Moynihan et al., 2015). Contextual factors, such as financial strain, can exacerbate such biases (Christensen et al., 2020; Mullainathan & Shafir, 2013). The prevalence and influence of biases may vary across individuals. Prior research on the differential effects of financial aid shows that students from low-income backgrounds consider different time horizons when responding to discount rates than their more affluent peers (Goldrick-Rab et al., 2009), suggesting that temporal biases may be more prevalent among low-income students who give greater consideration to present costs than future benefits.
Exchanges between individuals and governments may also be more or less costly for different subpopulations (Christensen et al., 2020; Herd & Moynihan, 2018). Programmatic requirements are often more onerous for some subgroups by design. Everett et al. (n.d.) find that Tennessee’s Aspire program is more burdensome than other aid programs in the state, meaning low-income students bear more costs because of requirements in the program targeted explicitly to them. Additionally, some individuals have greater capacities to absorb or avoid program participation burdens. The learning and compliance costs associated with retaining HOPE awards are likely higher for students from low-income backgrounds, as they historically have less access to information about financial aid opportunities and programmatic requirements (Goldrick-Rab et al., 2009). When facing scholarship eligibility loss, they may also have fewer available resources to accommodate the resulting financial shock or pursue avenues to regain eligibility.
Taken together, economic theory and behavioral science provide insights into community college students’ possible responses to losing merit-aid eligibility. Community college students—particularly those from lower-income backgrounds—are predicted to be more likely to stop out (and, in turn, less likely to graduate or transfer) because of the direct financial shock of the lost HOPE scholarship, the nonpecuniary costs accompanying scholarship loss, and conditions of financial strain engendered by scholarship loss that may negatively affect their education plans. The latter two factors may also explain why the effects of losing HOPE eligibility on students’ academic outcomes may be larger than what would be implied by examining the pecuniary effects of such a financial shock alone.
Empirical Strategy
We used a sharp regression discontinuity (RD) approach to estimate the local average treatment effect (LATE) of losing eligibility for the HOPE scholarship on the stopout, transfer, and degree completion outcomes of HOPE recipients attending Tennessee community colleges. Herein, we describe our data, sample, key variables, models, and the validity of the models’ identifying assumptions.
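To make the estimand concrete, the following sketch illustrates a sharp RD on simulated data: a local-linear fit on each side of the 2.75 CGPA cutoff recovers the jump in an outcome at the threshold. The bandwidth, the data-generating process, and the plain OLS implementation are assumptions for illustration only; they are not the paper’s actual specification (which would typically involve data-driven bandwidths and robust inference).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
cgpa = rng.uniform(2.0, 3.5, n)                # running variable
treated = (cgpa < 2.75).astype(float)          # below cutoff: lost eligibility
# Simulated outcome: stopout probability jumps by 0.10 at the cutoff
# and declines gently as CGPA rises.
stopout = (rng.uniform(size=n) <
           0.10 + 0.10 * treated - 0.05 * (cgpa - 2.75)).astype(float)

h = 0.25                                       # illustrative bandwidth
in_bw = np.abs(cgpa - 2.75) <= h
x = cgpa[in_bw] - 2.75                         # centered running variable
d = treated[in_bw]
y = stopout[in_bw]

# y = a + tau*D + b1*x + b2*(D*x): separate slopes on each side of the cutoff.
X = np.column_stack([np.ones(x.size), d, x, d * x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
tau = coef[1]                                  # local average treatment effect
print(round(tau, 3))                           # should be near the true 0.10
```

Because treatment is fully determined by the running variable (CGPA below 2.75), this is a sharp rather than fuzzy design, and tau is interpreted as the LATE for students at the margin of the cutoff.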
Data
We used data from THEC and TSAC on all first-time freshmen who first enrolled at a Tennessee public 2- or 4-year postsecondary institution with a HOPE scholarship in the fall semester of academic years (AY) 2010–11 through 2015–16 (N = 150,569). The semester-level dataset included information on student enrollment (e.g., term-specific and cumulative CGPA, credits attempted and earned), credentials awarded (e.g., associate and bachelor’s degrees), demographic characteristics (e.g., sex, race/ethnicity, age), and financial aid disbursements (e.g., Tennessee scholarship and FAFSA records) through the end of AY 2016–17.
Sample
We restricted the overall sample described above in several ways (see the online Appendix). Our final analytical sample included 27,272 HOPE recipients who first enrolled in a fall semester between 2010 and 2015 and faced scholarship renewal at a community college after reaching the first checkpoint with an intact scholarship. 10 There were more females than males in our sample, and females were slightly overrepresented among students who retained their HOPE scholarship (Table 2). Our sample overwhelmingly consisted of White students (85.7%), with Black students as the next largest racial group (5.4%). The overall distribution of race/ethnicity was relatively consistent across renewal status, with slightly more White students (0.9 percentage points) and slightly fewer Black students (1.1 percentage points) retaining eligibility compared to the overall sample. Students from low-income backgrounds, as measured by Pell eligibility and Aspire status, were slightly underrepresented in the kept-HOPE category, suggesting these students retain their scholarships at lower rates. Students in a health-related field of study were slightly less likely to retain HOPE, whereas there was a slight increase in the proportion of students majoring in education among HOPE-retainers. In addition, we note that students who kept their HOPE scholarship had higher cumulative CGPAs and credit success rates in their checkpoint term. These patterns aligned with our expectations, as students who retained HOPE mechanically had higher CGPAs, and therefore, we expected to see higher credit success rates.
Table 2. Characteristics of Community College Students Who Reach First Checkpoint
Note. Due to rounding, totals may not equal 100%. HS = high school; GAMS = general assembly merit scholarship; CP1 = check point 1 for HOPE renewal; CGPA = college grade point average.
Treatment
The cumulative CGPA threshold for HOPE scholarship renewal was 2.75. A student was classified as being treated (i.e., lost eligibility) if their cumulative CGPA was less than 2.75 after the semester in which they attempted their 24th credit (i.e., the first checkpoint semester). We did not use all students in our models, as we restricted the sample of community college HOPE recipients to those near the CGPA threshold, meaning our RD design only estimated the local effect of losing eligibility for HOPE.
Outcomes
We examined the effect of losing scholarship eligibility on five academic outcomes: next-semester stopout; stopout the semester after that (dubbed “next-next”); cross-sector transfer; and associate degree completion within 2 and 4 years. Stopout was a dichotomous measure (= 1) indicating whether a student was not enrolled at a Tennessee 2- or 4-year public college in the relevant semester, regardless of their enrollment in previous or subsequent semesters. Next-semester stopout was measured in the semester directly after the student reached the first checkpoint. If a student reached the first checkpoint in the fall semester, next-semester stopout was measured in the subsequent spring semester. Students who reached the checkpoint in the spring semester were not considered to have stopped out if they had an enrollment record in the subsequent summer or fall semester. 11 Next-next semester stopout was measured in the second semester after the student reached the first checkpoint. For students who reached the first checkpoint in the fall semester, this variable was measured over the subsequent summer and fall semesters. For students who reached the first checkpoint in a spring semester, next-next semester stopout was measured in the following spring. Students were not considered to have stopped out if they completed a degree from a Tennessee public institution in the time between the checkpoint semester and when this outcome was measured. Our results were not sensitive to alternative constructions of these outcomes in which the summer term was not included. In addition, our “next-next semester stopout” outcome was analogous to the “enrolled t+1” outcome from Carruthers and Özek (2016).
Cross-sector transfer was a dichotomous measure (= 1) signifying whether community college students ever enrolled at a public, in-state, 4-year institution after reaching the checkpoint but within 4 years of initial college enrollment. Associate degree completion was a dichotomous measure (= 1) if students completed an associate degree from an in-state community college. 12 We measured associate degree completion within 2 (i.e., “on-time” degree completion) and 4 years (i.e., “200%-of-time” degree completion) from the student’s initial postsecondary enrollment. 13
Our five academic outcomes only encapsulated student behavior within Tennessee’s public, postsecondary sector. We could not observe enrollment in or degrees awarded by Tennessee private or out-of-state institutions; we only observed associate degrees awarded by Tennessee community colleges. Our transfer and completion outcomes also covered different cohorts of students due to the structure of the data. We had data on graduation within 2 years for six cohorts of students (Fall 2010–Fall 2015), cross-sector transfer for five cohorts (Fall 2010–Fall 2014), and graduation within 4 years for four cohorts (Fall 2010–Fall 2013).
Subgroups
We segmented our sample into three income categories to examine the differential effects of scholarship loss with more granularity than prior analyses (Carruthers & Özek, 2016; Cummings et al., 2022; LaSota et al., 2021). The first group (Aspire students) was defined based on whether students received the supplemental Aspire grant during their checkpoint semester (N = 8,648). Students were eligible for Aspire if their adjusted gross income (AGI) on the FAFSA was $36,000 or lower. The average reported AGI for Aspire students at the checkpoint semester was $26,477, with an expected family contribution (EFC) of $670 (Table 3). 14 Nearly all Aspire recipients were Pell eligible by definition. 15 The second group of students (non-Aspire, Pell students) did not receive the Aspire supplement, but they were still Pell eligible based on their AGI ($52,352, on average; N = 5,020). The third group of students (non-Aspire, non-Pell students) neither received Aspire nor were Pell eligible (N = 13,604) and had higher incomes than the other two groups ($105,477).
Table 3. Descriptive Statistics for Each Income Subgroup at the First Checkpoint
Note. FAFSA = free application for federal student aid. HOPE Aid received includes the HOPE scholarship, Aspire supplement, and General Assembly Merit Scholarship supplement.
Due to the Aspire supplement, Aspire students received more financial support in their checkpoint semester ($1,716) than other students in the sample. We expected lower-income groups to exhibit more price sensitivity to the loss of aid than higher-income groups. We also expected the effects of scholarship loss to vary depending on whether students were eligible for other forms of need-based aid (i.e., the Pell Grant). To test these hypotheses, we conducted t tests comparing the effects for the Aspire subgroup relative to the two other subgroups (non-Aspire, Pell and non-Aspire, non-Pell).
We could not examine heterogeneous effects by race/ethnicity or the intersection of income and race/ethnicity due to insufficient sample size. Combining race/ethnicity categories may, however, mask meaningful racial/ethnic differences in students’ experiences within higher education. This is a limitation to be addressed through follow-up studies when additional cohorts are available, and through qualitative work that explores the mechanisms through which loss affects students differentially.
Covariates and Fixed Effects
The RD design assumes that treatment assignment is essentially randomized for students with cumulative CGPAs close to the eligibility threshold. If true, no covariates are needed to identify treatment effects. Nonetheless, we included a set of student-level covariates to improve the precision of our estimates and to make the results comparable to prior research on this topic (Cummings et al., 2022). Covariates included race, sex, whether a student graduated from a high school in an economically distressed or at-risk county, whether they graduated from a rural high school, whether they received a supplemental award for low-income recipients (i.e., Aspire), whether they received a separate, supplemental award for students with high academic achievement (i.e., GAMS), whether the student was Pell Grant eligible, and the highest education level of their parents. We also included institution (based on initial college entrance) and cohort fixed effects to account for (respectively) any college- and time-based heterogeneity. The covariates we operationalized are commonly cited in community college research as determinants of student success (e.g., Dougherty et al., 2017; Goldrick-Rab, 2010), and many of them are incorporated in financial-aid evaluations in community college settings (e.g., Broton et al., 2016; Park & Scott-Clayton, 2018; Sparks, 2025b).
We included three additional controls corresponding to students’ academic behaviors: declared major in the first checkpoint semester, cumulative CGPA in the term prior to the checkpoint semester, and credit success rate (i.e., total credits earned divided by total credits attempted) in the checkpoint semester. Prior research suggested that students might engage in strategic behavior, such as changing majors or withdrawing from courses, to improve their CGPA and avoid losing their merit aid scholarship (e.g., Carruthers & Özek, 2016; Cornwell et al., 2006; Sjoquist & Winters, 2013). Many of these hypothetical behaviors were unobservable in our semester-level dataset, but we included these three controls to reduce any potential bias in the estimates from these behaviors. Prior work on community college students has also demonstrated that these measures are predictive of students’ future academic outcomes (e.g., Jenkins & Cho, 2013; Turk, 2017; Yanagiura, 2020).
Empirical Model
Our preferred specification estimated the effect of losing HOPE on the aforementioned outcomes via nonparametric local linear regression models (Calonico et al., 2017; Cattaneo & Titiunik, 2022; Gelman & Imbens, 2019). After segmenting the sample into groups with CGPA values at or above (+) and below (−) the cutoff point c (i.e., 2.75), selecting the triangular kernel function (K), which upweights students with CGPA values closer to c, and determining mean squared error-optimal bandwidths (h), we fit separate weighted least squares regressions for each outcome of interest (y) on a first-order polynomial expansion of the centered running variable (CGPA − c) within the bandwidth on each side of the cutoff (Equation 1).
The LATE (τ) was the difference between the two fitted intercepts at the cutoff, capturing the effect of losing HOPE eligibility for students whose checkpoint CGPAs fell just below 2.75.
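The estimation step can be illustrated with a minimal sketch (our own construction, not the authors’ code): a triangular-kernel local linear fit on each side of the cutoff with a fixed bandwidth, omitting the MSE-optimal bandwidth selection, bias correction, covariates, fixed effects, and clustered variance estimation used in the actual analysis.

```python
import numpy as np

def rd_late(cgpa, y, c=2.75, h=0.35):
    """Sharp-RD estimate of the LATE of losing HOPE eligibility:
    triangular-kernel local linear regression fit separately on each
    side of the cutoff c, with a fixed bandwidth h for simplicity."""
    x = np.asarray(cgpa, dtype=float) - c      # centered running variable
    y = np.asarray(y, dtype=float)
    in_bw = np.abs(x) < h                      # restrict to the bandwidth
    xs, ys = x[in_bw], y[in_bw]
    w = 1 - np.abs(xs) / h                     # triangular kernel weights

    def intercept(side):
        # weighted least squares of y on [1, x]; intercept = fit at the cutoff
        X = np.column_stack([np.ones(side.sum()), xs[side]])
        W = w[side]
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * ys[side]))
        return beta[0]

    below = xs < 0                             # below cutoff = lost eligibility
    return intercept(below) - intercept(~below)
```

With noise-free outcomes that are linear on each side of the cutoff, this estimator recovers the simulated jump exactly; in practice, packaged implementations such as rdrobust handle bandwidth selection and inference.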
We estimated the main effects of HOPE scholarship loss on the full sample (N = 27,272) and separate models for each of the three income subgroups: Aspire students (N = 8,648), non-Aspire/Pell students (N = 5,020), and non-Aspire/non-Pell students (N = 13,604). Bandwidths varied across outcomes and subgroups (ranging from 0.255 to 0.552 grade points across specifications). We reported bias-corrected estimates and robust variance estimators clustered by postsecondary institution (Cattaneo & Titiunik, 2022). Given that we produced estimates for five outcomes across three income subgroups, p values were also adjusted for multiple hypothesis testing to reduce the chance of reporting false positive results (Porter, 2018). Specifically, we used the Holm-Bonferroni correction (Holm, 1979), a sequential procedure that controls the family-wise error rate (FWER) while preserving statistical power relative to alternative test corrections.
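For reference, the Holm-Bonferroni adjustment is simple enough to sketch directly (an illustrative implementation, not the code used in the analysis):

```python
def holm_adjust(pvals):
    """Holm (1979) step-down adjustment controlling the family-wise error
    rate: multiply the k-th smallest p value by (m - k + 1) for k = 1..m,
    enforce monotonicity across the sorted p values, and cap at 1."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])   # indices, ascending p
    adjusted, running_max = [0.0] * m, 0.0
    for k, i in enumerate(order):                      # k = 0 for smallest p
        running_max = max(running_max, (m - k) * pvals[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted
```

Because only the smallest p value is multiplied by the full number of tests m (the next by m − 1, and so on), the procedure rejects more often than plain Bonferroni, which multiplies every p value by m.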
We used a sharp RD design even though the sample had a limited degree of noncompliance. 16 All estimated treatment effects represented intent-to-treat effects of losing HOPE eligibility because not all students complied with their treatment status. A fuzzy RD design—often employed by researchers to address imperfect treatment compliance—was not optimal in this context. It would involve predicting treatment receipt (i.e., a non-renewed HOPE scholarship) in the first-stage equation using students’ observed cumulative CGPA in the checkpoint semester relative to the cutoff of 2.75. In this set-up, the treatment would have to be measured using forward-looking information on HOPE payments in the next semester’s data. Yet, the presence or absence of information on future HOPE payments would be partially contingent upon possible outcomes of the assigned treatment (e.g., stopping out of community college, transferring to a private or out-of-state postsecondary institution), which would thus invalidate the causality of the design (Cummings et al., 2022). Nevertheless, CGPA strongly predicted treatment status for all samples used—approximately 90% of the full sample and of each income subgroup within 0.35 grade points of the cutoff (2.75) complied with the treatment assignment (Figure 1)—and results from our preferred model are comparable to those from a fuzzy RD model and those from a model excluding non-compliers below the CGPA cutoff (see online Appendix).
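The compliance calculation behind this check can be sketched as follows (a hypothetical illustration; the field names are ours, and actual treatment receipt was measured from HOPE payment records in the administrative data):

```python
def compliance_rate(cgpa, hope_paid_next_term, c=2.75, window=0.35):
    """Share of students near the cutoff whose observed HOPE receipt agrees
    with their assigned status (CGPA below c should mean no renewed payment).
    Field names are illustrative, not the actual THEC/TSAC schema."""
    near = [(g, paid) for g, paid in zip(cgpa, hope_paid_next_term)
            if abs(g - c) < window]
    # agreement: below the cutoff and unpaid, or at/above the cutoff and paid
    agree = sum((g < c) == (not paid) for g, paid in near)
    return agree / len(near)
```

A rate near 0.9, as reported for the full sample and each subgroup, indicates that assignment strongly (though imperfectly) determined treatment receipt.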

Figure 1. Regression Discontinuity Plots, by Subgroup.
Results were robust to additional, alternative specifications as well, including models without covariates and fixed effects (see the online Appendix).
Validity of the RD Design
The core identifying assumption of our RD design was that the relationship between stopout, transfer, or completion would have varied continuously across the CGPA threshold of 2.75 but for the existence of the HOPE renewal criteria. To interpret discontinuities in those outcomes at the threshold as the LATE of HOPE eligibility loss, we investigated threats to the smoothness assumption: precise manipulation of the running variable, heaping on the running variable, and systematic differences between groups near the threshold (Barreca et al., 2016; Imbens & Lemieux, 2008; Lee & Lemieux, 2010).
No precise manipulation
First, students must not have deliberately manipulated their CGPA values to avoid losing their HOPE scholarship (i.e., no precise manipulation), which was conceivable in this setting. By design, the renewal policy strove to improve students’ academic performance. The CGPA criterion for losing HOPE eligibility was well established in Tennessee at the time under study and, to some extent, known in advance by students and institutions. Students at risk of scholarship loss may have worked harder in their coursework or, conversely, reduced academic effort through strategic behaviors (e.g., enrolling in easier coursework, changing majors, withdrawing from courses, requesting grade changes) to retain the scholarship (Cornwell et al., 2006; Sjoquist & Winters, 2013). 17 Indeed, we observed a disproportionate concentration of students with CGPAs of exactly 2.75 (Figure 2). However, the bunching at 2.75 was less pronounced than at other CGPA values (e.g., 2.5, 3.0), suggesting that the increased concentration was a mechanical function of how CGPAs were measured (i.e., on a 4-point scale with fixed values) rather than student manipulation. Importantly, formal tests for manipulation of the running variable (Cattaneo, Idrobo, & Titiunik, 2019; Cattaneo, Titiunik, & Vazquez-Bare, 2019), conducted for the full sample and each subgroup under study, failed to reject the null hypothesis of no difference in the density of students around the cutoff, suggesting no threat from precise manipulation. 18
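The formal tests rely on local-polynomial density estimation, but a crude back-of-the-envelope analogue (an exact binomial test of whether observations within a narrow window split evenly across the cutoff) can be sketched as follows; the window width here is our own illustrative choice:

```python
from math import comb

def binomial_density_check(cgpa, c=2.75, window=0.10):
    """Back-of-the-envelope manipulation check: among observations within
    +/- window of the cutoff, run an exact two-sided binomial test of
    whether the share falling below the cutoff differs from one half.
    Only a crude analogue of the formal local-polynomial density tests."""
    near = [g for g in cgpa if abs(g - c) < window]
    k = sum(g < c for g in near)               # count just below the cutoff
    n = len(near)
    p_k = comb(n, k) / 2 ** n                  # probability of observed count
    # two-sided p: sum probabilities no larger than that of the observed count
    p = sum(comb(n, j) for j in range(n + 1)
            if comb(n, j) / 2 ** n <= p_k) / 2 ** n
    return k, n, min(1.0, p)
```

A large p value is consistent with no discontinuity in the density at the cutoff; a small p value would suggest bunching on one side.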

Figure 2. Cumulative GPA at First Checkpoint (with and without Heaped Values).
Bias from non-random heaping
Heaped observations on the running variable away from the threshold (as observed at CGPA values of 2.5 and 3.0 in Figure 2) could bias RD estimates if students with particular attributes related to the outcomes of interest were disproportionately represented at those points (Barreca et al., 2016). We investigated this matter by regressing each observed covariate on the running variable, the treatment indicator, and an indicator variable (= 1) jointly representing all heaped values of the running variable at increments of 0.125 grade points (Table 4). 19 The heaped indicator was significant at the 5% level for several student-level controls. In response, we incorporated flexible controls corresponding to each 0.125-increment value of the running variable along the distribution of CGPA in our preferred local linear nonparametric regression model (Equation 1; Barreca et al., 2016). Which heap-value controls entered a given model depended upon the size of the selected bandwidth. All results under this preferred specification were robust to various alternative specifications for addressing heaping (see the online Appendix). 20
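As an illustration of how such a heap indicator might be constructed (the 0.125-increment grid comes from the text; the floating-point handling is our assumption about how CGPAs could be stored as imprecise quotients of quality points over credits):

```python
def is_heaped(cgpa, step=0.125):
    """Flag CGPAs sitting exactly on the 0.125-grade-point grid
    (e.g., 2.500, 2.625, 2.750), where observations heap mechanically.
    Rounding guards against CGPAs stored with floating-point error."""
    return abs(cgpa / step - round(cgpa / step)) < 1e-9
```

Interacting this flag with dummies for each grid value within the bandwidth yields the flexible heap controls described above.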
Table 4. Estimates for Having Heaped CGPA Values at 0.125 Increments
Note. Robust standard errors are reported (SE), clustered by postsecondary institution. HS = high school; GAMS = general assembly merit scholarship; CP1 = check point 1 for HOPE renewal; CGPA = college grade point average; BW = bandwidth.
*p < 0.1, **p < 0.05, ***p < 0.01.
Smoothness of student-level covariates
A final condition for internal validity was that any discontinuities in outcomes observed at the eligibility threshold must be solely attributable to HOPE eligibility loss and not due to systematic differences between students below and above the threshold. We tested this condition in several ways, all providing evidence against violations of the smoothness assumption (see the online Appendix). 21 As one example, we regressed each of the observed covariates on the running variable and treatment indicator. All but one of the 23 controls were statistically insignificant at the 5% level; the lone exception was majoring in Health during the checkpoint semester (Table 5). In sum, these findings suggested that our preferred RD specification did not violate the underlying identifying assumptions.
Table 5. Effect of Losing HOPE Eligibility at the First Checkpoint on Student-Level Observable Characteristics
Note. Robust standard errors are reported (SE), clustered by postsecondary institution. All models use data-driven optimal bandwidths (BW). As in the main text, losing HOPE eligibility is defined as having a cumulative HOPE GPA below 2.75 at the first renewal checkpoint. HS = high school; GAMS = general assembly merit scholarship; CP1 = check point 1 for HOPE renewal; CGPA = college grade point average; BW = bandwidth.
*p < 0.1, **p < 0.05, ***p < 0.01.
Results
We present evidence of the effect of losing HOPE eligibility on student stopout, transfer, and associate degree completion for both the full sample of community college students and each income subgroup. To establish priors for our point estimates, we note that one rigorous meta-analysis of the effects of merit aid receipt found null effects on year-to-year persistence, small and imprecise increases in on-time completion, and a significant 2.7 percentage point increase in delayed completion across studies (Nguyen et al., 2019). Carruthers and Özek (2016) found substantial and significant decreases in fourth-term persistence and null effects on on-time associate degree completion for community college students losing HOPE. They also found small and statistically insignificant effects on persistence and on-time completion for low-income students from 2- and 4-year colleges in response to HOPE loss. Cummings et al. (2022) found that Tennessee HOPE eligibility loss caused an increase in next-semester stopout for 4-year recipients and in cross-sector transfer for 4-year Aspire recipients.
Stopout
Losing HOPE eligibility caused a 7-percentage-point increase in the probability of stopping out two semesters after the checkpoint (Table 6). For context, approximately 9% and 18% of above-threshold students stopped out in the first and second post-checkpoint semesters, respectively, demonstrating significant negative consequences of HOPE eligibility loss. Our estimates aligned with those of Carruthers and Özek (2016), who found that losing HOPE at the first checkpoint reduced within-sector persistence among community college students two semesters later by 8.6 percentage points, although Cummings et al. (2022) found no effect of HOPE eligibility loss on next-semester stopout among 4-year students.
Table 6. Effect of Losing HOPE Eligibility at the First Checkpoint
Note. Robust standard errors are reported in parentheses, clustered by postsecondary institution. All models include student-level controls, institutional and cohort fixed effects, and use data-driven bandwidths (BW). BW = bandwidth.
*p < 0.1, **p < 0.05, ***p < 0.01, where tests are adjusted using the Holm-Bonferroni correction for multiple hypothesis testing.
Examining the effects of scholarship loss for each income subgroup, we observed no statistically significant effects on stopout one or two semesters after the checkpoint. There was a relatively large, albeit statistically insignificant, 12-percentage-point increase in stopping out two semesters after the checkpoint for Aspire students who lost HOPE eligibility. For non-Aspire/Pell and non-Aspire/non-Pell students, stopout increased by 6.2 and 4.0 percentage points, respectively. However, these point estimates were not statistically significant after implementing corrections for multiple hypothesis testing. Further, the t statistics in the first two columns of Table 7 do not reach conventional thresholds for statistical significance, suggesting no differential effects across subgroups for either of our stopout measures. For reference, Carruthers and Özek (2016) observed an imprecise 3.3 percentage point reduction in persistence for low-income students, though this estimate was based upon a pooled sample of students from both the 2- and 4-year sectors in their analysis. 22
Table 7. Significance Tests of Heterogeneous Treatment Effects by Income Group
*p < 0.1, **p < 0.05, ***p < 0.01.
Transfer
Losing HOPE eligibility also reduced the likelihood of community college students later transferring into Tennessee’s public, 4-year sector (Table 6). Below-threshold students experienced a statistically significant 5.3 percentage point reduction in cross-sector transfer within 4 years of college entry. 23 Roughly 36% of above-threshold students transferred in that timeframe, meaning that losing HOPE eligibility had a noteworthy relative effect on below-threshold students.
The negative effect of losing HOPE eligibility observed for cross-sector transfer for the full sample of community college students was again exacerbated for Aspire students but not for the other, higher-income groups. Aspire students experienced a 14 percentage point reduction in the likelihood of transfer to a public, in-state, 4-year institution. At baseline, 28% of above-threshold Aspire students transferred to in-state, public 4-year colleges, resulting in a 50% reduction in the probability of transfer for those who lost eligibility. We did not observe any precisely estimated reductions in transfer for either non-Aspire/Pell or non-Aspire/non-Pell community college students. Here, we observe differential effects for Aspire students that point to statistically significant negative differences in transfer outcomes relative to the other subgroups (Table 7).
Associate Degree Completion
Losing HOPE eligibility reduced on-time associate degree completion for community college students, but that effect did not hold for completion within 4 years (Table 6). HOPE eligibility loss lowered on-time degree completion by 5.4 percentage points, with no statistically significant effect on associate degree completion within 4 years. 24 Our results for on-time completion differed from Carruthers and Özek (2016), who did not find a statistically significant relationship between losing HOPE and on-time completion for earlier cohorts of community college students in Tennessee. 25
The effects of lost eligibility on degree completion outcomes varied somewhat between the subgroups and the full sample. 26 For Aspire recipients, there were small and imprecise effects on 100%-of-time associate degree completion. That may be expected, as on-time completion occurred infrequently for above-threshold Aspire recipients (7.6%). As with stopout and transfer, losing HOPE eligibility affected the lowest-income students most severely with respect to 4-year completion; there was a substantial, negative effect on completion within 4 years for this group. Aspire recipients experienced a statistically significant 17.2 percentage point decrease in 200%-of-time associate degree completion in response to losing HOPE eligibility. For context, 36% of above-threshold Aspire students completed associate degrees within 4 years, underscoring the magnitude of this result for the lowest-income community college students.
In contrast, non-Aspire/non-Pell students experienced steeper declines in on-time completion due to HOPE loss than either Aspire students or the full sample of community college students. The higher-income subgroup experienced a nearly 11 percentage point reduction in associate degree completion in this time frame, with a baseline completion rate of 18.6%. Higher-income students did not experience a statistically significant decrease in associate degree completion within 4 years, although the estimated effect was somewhat larger in magnitude (7.7 pp; on a base of 45.5%) than that of the full sample (5.1 pp; 41.4% base). Considering the completion effect alongside the small and imprecise effects observed for stopout, there is reason to believe that losing HOPE eligibility generally prolonged associate degree completion for higher-income students rather than precluding attaining the degree altogether.
Discussion
State-level merit aid programs are commonplace across the country, as are evaluations of their effects on access and student success (Bettinger et al., 2019; Bruce & Carruthers, 2014; Castleman, 2014; Gurantz & Odle, 2022; Scott-Clayton, 2011; Scott-Clayton & Zafar, 2019). One common provision of many state merit aid programs is continued eligibility requirements involving academic performance (Education Commission of the States, 2021). Researchers are just starting to accumulate evidence on the consequences of losing merit aid (Carruthers & Özek, 2016; Cummings et al., 2022; Jones et al., 2022; LaSota et al., 2021), even though many enrolled students fail to renew their merit aid scholarships (Henry et al., 2004).
Our results demonstrate the negative effects of losing eligibility for the HOPE scholarship on community college students’ stopout, transfer, and completion outcomes. The increased rate of stopping out in the short term, as well as the negative effects on longer-term transfer and completion outcomes, suggests that HOPE eligibility loss not only increases student departure, as prior studies suggest (Carruthers & Özek, 2016), but also alters longer-run outcomes. More students likely would have completed an associate degree or transferred to a 4-year institution had they not lost their scholarship eligibility.
Our analysis also underscores the importance of disaggregating results to explore differences in treatment effects. HOPE eligibility loss differentially affects students across income levels, with the lowest-income students experiencing the most detrimental effects. Aspire recipients experience large, negative effects on cross-sector transfer and associate degree completion within 4 years, and the magnitude of these relationships is obscured in the full sample results. Aspire recipients are likely more price-sensitive than higher-income students and are subject to larger financial shocks due to losing eligibility for the baseline HOPE award and the Aspire supplement. Aspire recipients may also disproportionately experience administrative and nonpecuniary costs that accompany scholarship loss as well as conditions of financial strain that constrain their educational decision-making processes. Losing eligibility for HOPE (plus the Aspire supplement) is an explicit roadblock for the lowest-income community college students—at least those local to the CGPA cutoff—subverting associate degree attainment as well as some measure of bachelor’s degree attainment by way of cross-sector transfer.
Higher-income students—the non-Aspire/non-Pell students in our analysis—do not experience the same degree of adverse effects from losing eligibility for their HOPE scholarship. These findings align with our expectations, as higher-income students are typically less sensitive to fluctuations in the price of college because they often have access to other sources of funds to fill gaps in unmet financial need. Higher-income students may avoid or have greater capacities to absorb the administrative and nonpecuniary burdens resulting from scholarship loss as well. We do not observe clear evidence of increases in stopout or reductions in 200%-of-time credential completion for this group. However, we see declines in on-time associate degree completion. So, unlike for Aspire recipients, losing eligibility for the HOPE scholarship appears to be a speedbump for higher-income students, delaying rather than reducing degree completion.
Policy Implications
The estimated HOPE loss impacts for the lowest-income subgroup warrant further discussion. Aspire recipients exhibit the most pronounced detrimental effects in response to losing eligibility for the HOPE scholarship, with negative effects on cross-sector transfer and associate degree completion within 4 years. These adverse effects compound earlier disparities in who receives and loses HOPE. Compared to their higher-income peers, Aspire and Pell-eligible students are less likely to receive HOPE; are more likely to lose HOPE eligibility; lose the largest amount of money (since they receive an additional supplement); and, as shown in our analysis, experience the most detrimental effects upon losing eligibility. The confluence of these negative trends necessitates concentrated attention on how best to support low-income students and reduce any unnecessary administrative burdens associated with participation in the HOPE scholarship program (Herd & Moynihan, 2018).
Implications for low-income students
States or individual institutions could proactively target additional support towards Aspire recipients before HOPE eligibility loss. Providing supplemental student services alongside financial aid has been found to be crucial to helping students successfully navigate college (Dickason et al., 2023). Kennesaw State University in neighboring Georgia is a model of what this program could look like with its Thrive Scholars Program (Kennesaw State University, 2023). The program provides newly matriculated Georgia HOPE recipients with shared courses, academic coaching, and learning communities to increase the number of Thrive Scholars who maintain their Georgia HOPE scholarship. Implementing a similar program in Tennessee could help low-income students avoid scholarship loss and improve their completion outcomes. Additionally, community college programs that serve low-income students may already exist in Tennessee (e.g., TRIO-funded centers). States or institutions could tap into those programs and build capacity or connect Aspire recipients with those services with little additional funding needed.
Institutions could also implement an early warning system to identify students at risk of losing their HOPE scholarship before the checkpoint semester. Given the gap between the Satisfactory Academic Progress threshold at community colleges (often around 2.0) and the HOPE renewal eligibility threshold (2.75), students on the margin of HOPE eligibility loss may not get automatically placed into existing academic support. States and/or institutions could identify scholarship recipients at risk of losing their award based on their GPA in pre-checkpoint semesters and proactively react before the loss occurs. Institutions could feasibly target resources and student success interventions such as advising, tutoring, and/or financial supports (Weiss & Bloom, 2022) toward these students before the HOPE renewal criteria set in during the first checkpoint semester. At a minimum, institutions could ensure that the renewal criteria are clearly communicated to students at risk of scholarship loss to avoid students unexpectedly losing a substantial portion of their financial aid. These measures would address potential learning and compliance costs that students face when participating in the HOPE program, and they could simultaneously improve student outcomes, the efficiency of state financial aid expenditures, and institutional performance on state or federal accountability measures.
Implications for merit aid renewal criteria
Tennessee may also reflect on whether the renewal criteria are operating as intended, and our analysis provides relevant evidence for this discussion. First, we find strong adverse effects resulting from HOPE eligibility loss on key outcomes, with these effects concentrated among the lowest-income HOPE recipients. Low-income students are among those who have the most to gain from financial aid, yet the eligibility criteria appear to be a roadblock to their collegiate success. Second, if the renewal criteria are a strong motivator for students to improve academic performance—arguably one of the primary intentions behind the criteria—we would expect to observe manipulation in CGPAs at the first checkpoint as students try to retain their financial aid. However, as we explain above, there is no evidence of this type of CGPA manipulation behavior at the first checkpoint. If the renewal criteria are not operating as intended and, in fact, are disproportionately harmful to the lowest-income students, Tennessee may consider whether reforms to the criteria are justified.
We propose two potential changes to the renewal structure: lowering the CGPA threshold and introducing a probationary period for students who fall below the threshold. First, the state could lower the CGPA renewal threshold, either creating a community college-specific threshold for the first checkpoint or making this adjustment across both 2- and 4-year institutions. Doing so would likely decrease the number of students who lose eligibility, thereby decreasing the likelihood of stopout and increasing credential attainment—at least for those students whose checkpoint CGPA falls near the current 2.75 renewal threshold. There are positive economic returns to community college credential attainment, especially in career and technical fields of study (Belfield & Bailey, 2017; Carruthers & Jepsen, 2021). Not meeting the renewal criteria imposes costs on students—who may forgo credential attainment and subsequent earnings gains—and on the state—by way of a less educated population and potentially higher social service spending. If Tennessee wishes to maintain the merit scholarship aspect of the program, it could keep the initial eligibility criteria but remove or lower the renewal CGPA requirements. Although our analysis does not identify the optimal threshold, one option could be to use the Satisfactory Academic Progress criteria at the student's institution. One consequence, however, is the potential downstream impact on students interested in transferring to a 4-year institution if the renewal threshold is lower than the 4-year threshold. Furthermore, the current renewal criteria require a higher CGPA of 3.0 at the third and fourth checkpoints (72 and 96 credits, respectively), adding further complexity. Any evaluation or alteration of the renewal criteria should involve cross-sector collaboration and engagement from postsecondary institutions in the state to ensure consideration of the complexities outlined above.
Rather than alter the renewal criteria, the state could also soften the consequences when students do not meet them. The effects of HOPE eligibility loss for Aspire students do not appear immediately, as demonstrated by the statistically insignificant estimated effect of HOPE eligibility loss on the probability of stopout in the following semester. This delayed effect presents an opportunity to intervene before students depart the institution. There is already a mechanism for students to regain their lost HOPE scholarship, specifically if they meet the renewal criteria at a subsequent checkpoint (i.e., achieve a CGPA of at least 2.75 in the semester in which they reach 48 attempted credits). Of course, this provision requires these students to raise their grades while also grappling with how to fill the financial gap left by losing HOPE support. Given Carruthers and Ozek's (2016) finding that students substitute work for their lost HOPE dollars, Tennessee could consider providing a temporary, reduced scholarship to below-threshold students as they work to raise their grades, increasing the amount of time students have to improve their CGPA. Georgia State University's Keep HOPE Alive program (Georgia State University, 2023) provides $500 per semester alongside workshops, academic coaching, and advising to students who lost their Georgia HOPE scholarship. Ribar and Rubenstein (2023) find that students at Georgia State University are more likely to regain their scholarship than below-threshold students at other Georgia institutions, and the authors attribute this to the program's additional supports and financial cushioning.
The responses and interventions discussed in this section represent a few of the many levers available to Tennessee in addressing the adverse effects on students resulting from HOPE eligibility loss. While we present various options that the state may consider, our analysis does not reveal the underlying mechanisms behind our findings, meaning that we lack sufficient evidence in favor of any specific reform. As explained in more detail below, qualitative work with students and administrators that provides insight into how students experience the HOPE program and subsequent scholarship loss could reveal possible paths forward and the policy changes most beneficial to students. However, in choosing whether to pull any of these levers, the state must consider college access goals, accountability issues, and balanced budget requirements. If the criteria are modified or additional supports are provided to increase the number of students who reach and pass the first checkpoint, this may have downstream financial ramifications. The state must weigh the higher short-term costs of providing additional student services and extending HOPE scholarship eligibility to more students against the potential long-term benefits of increased state tax revenues (for those states with a state income tax) and reduced social service spending that result from helping more students complete an associate degree and/or transfer to a 4-year institution.
Implications for other states’ merit aid programs
Several other states—Georgia (Georgia Student Finance Commission, 2023), New Mexico (New Mexico Higher Education Department, 2023), and South Dakota (South Dakota Opportunity Scholarship, 2023), to name a few—have merit aid programs with CGPA criteria as conditions for continued eligibility (ECS, 2021). Our results can thus inform other states with comparable requirements about the consequences of losing eligibility for merit aid and how merit aid loss differentially affects students with different levels of financial need. One caveat is that results on the effects of postsecondary financial aid programs in one state do not always generalize to other state contexts, given differences in the intricacies of program design and implementation and student populations served (Domina, 2014). At the very least, other states should investigate whether and how their existing merit aid programs serve their lowest-income student populations.
Future Research
Several extensions of this work could be addressed in future research. For one, there are additional academic and labor-market outcomes that would more fully illustrate the consequences of losing HOPE eligibility. Using transcript-level data, researchers could investigate enrollment intensity outcomes, exploring whether scholarship loss induces students to take additional courses to finish more quickly or, alternatively, to reduce course loads in response to the short-term increase in college costs. Using workforce data, researchers could evaluate students' earnings and employment outcomes, assessing whether students increase labor-market participation to compensate for scholarship loss in the near term and the extent to which scholarship loss affects students' long-run earnings potential. Using state or institutional financial aid data, analysts might further investigate the extent to which students differentially mitigate scholarship loss by taking up student loans. We were unable to analyze any of these outcomes due to data constraints.
Our analysis incorporates the AY 2010-11 through AY 2015-16 cohorts of HOPE recipients at community colleges, but researchers could incorporate additional cohorts of students from more recent years. Including additional cohorts would result in a larger analytic sample of HOPE recipients. In our analysis, after restricting our sample to community college students who had persisted to the first HOPE renewal checkpoint, we lack sufficient statistical power to investigate differential effects for other demographic characteristics such as race/ethnicity and any interaction effects between income status and other student characteristics. There is reason to believe one might observe differences in effects based on results in the 4-year context: low-income Black students experience increased transfer to 2-year institutions, and Black students overall experience a decline in 150% graduation rates due to HOPE eligibility loss (Cummings et al., 2022). Likewise, the mechanisms identified in the conceptual framework may function differently across race/ethnicity, as previous research has found (Allen & Wolniak, 2019; Boatman et al., 2017). Indeed, if there are differences by income and race/ethnicity, this could influence the subsequent policy implications. Incorporating additional cohorts in future studies would allow for additional sub-group analyses.
Finally, qualitative research might examine how students perceive the HOPE renewal criteria before and during the first checkpoint semester and the effects of scholarship loss after its imposition. This work would increase understanding of the student-level mechanisms underlying the observed effects of scholarship loss and why student responses differ according to student income, which, in turn, could inform future quantitative work on this topic (e.g., DeLuca et al., 2021). Embedded in our discussion are the assumptions that HOPE recipients are broadly familiar with the timing of and requirements for renewal and that the lowest-income HOPE recipients have comparatively less access to or knowledge of alternative funding sources to offset scholarship loss than their higher-income peers. Hearing from scholarship recipients at different points in their educational trajectories would enhance our understanding of the role that aid receipt and loss play in their decision-making. Financial aid is a notoriously complex system for students to navigate (Dynarski et al., 2022), and there may be differences in how scholarship renewal information is communicated to students within and across Tennessee institutions. Additional qualitative investigations of colleges' formal and informal practices—when and how eligibility criteria are communicated before reaching checkpoints and when students are informed of eligibility loss—could unearth insights that explain observed results and inform future program design and implementation reforms.
Supplemental Material
sj-docx-1-ero-10.1177_23328584251393688 – Supplemental material for "Examining How Income Influences the Effects of Losing Merit Aid Among Tennessee Community College Students" by Rooney Columbus, Kristen M. Cummings, KC Deane, Joshua Skiles, Stephen L. DesJardins, and Brian P. McCall in AERA Open.
Acknowledgements
We thank the Tennessee Higher Education Commission (THEC) for providing access to data used in this study. We also thank Amanda Klafehn, Dominique Baker, Rajashri Chakrabarti, Laura Perna, and members of the DesJardins-McCall research team for helpful comments, as well as attendees of the CIERS seminar at the University of Michigan, the SHEEO 2021 Public Investment in Higher Education webinar series, and the AEFP 2023 conference. Cummings, Deane, and Skiles gratefully acknowledge the funding support of the Institute of Education Sciences (IES), U.S. Department of Education under Grants R305B200011 and R305B150012. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of THEC or IES. Any errors are our own.
Open Practices Statement
Although we are not authorized to make our data available through a data archive, we describe how to obtain the data needed to reproduce the results in our article.
The student-level administrative data used for this analysis were obtained through a data use agreement with the Tennessee Higher Education Commission / Tennessee Student Aid Commission (THEC/TSAC).
Most of the data used in this analysis are also available through the P20Connect TN data system. Researchers must submit a "Data Request Application for External Research Partners" to the P20 Data Governance Coordinator. This application includes a research proposal and is reviewed by representatives from each state agency that contributes data to the P20 system. These data require a signed data use agreement, and data are accessed via a remote desktop.
Columbus, R., Cummings, K. M., Deane, K., Skiles, J., DesJardins, S. L., & McCall, B. P. Examining how income influences the effects of losing merit aid among Tennessee community college students. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2025-10-09.
Authors
Dr. ROONEY COLUMBUS is the principal and founder of E&E Analytics, a research consultancy.
Dr. KRISTEN M. CUMMINGS is a postdoctoral fellow at Harvard University's Center for Education Policy Research.
Dr. KC DEANE is the Associate Director of Research and Program Evaluation at the Washington Student Achievement Council, the state's higher education agency.
JOSHUA SKILES is a doctoral candidate at the University of Michigan's Center for the Study of Higher and Postsecondary Education and an IES predoctoral fellow at the University of Michigan.
Dr. STEPHEN L. DESJARDINS is the Marvin W. Peterson Collegiate Professor Emeritus at the University of Michigan's Marsal Family School of Education.
Dr. BRIAN P. MCCALL is a Professor of Education, Economics and Public Policy at the University of Michigan.
References