Abstract
This study examined predictors of persistence and fadeout across multiple cluster randomized controlled trials that evaluated a preschool mathematics curriculum. We used meta-analytic methods to explore how impacts on student mathematics achievement faded between post-test (i.e., endline) and 1-year follow-up. We found that the magnitude of the impact at post-test was a strong predictor of the 1-year follow-up impact. Contrary to popular theory, we found that intervention impacts faded faster when students attended sites that showed more learning in the year following the end of the intervention. Factors related to intervention fidelity and dosage did not strongly predict fadeout patterns after considering the magnitude of the post-test effect. Results suggest that educational program evaluators can use immediate impacts to forecast follow-up effects.
