Abstract
Research design and greater attention to social context have figured prominently in efforts to improve evaluation utility. This essay continues a second, related line of inquiry: ways to analyze data to increase the precision with which program effects, or the lack of them, can be understood. The heuristics suggested treat the issues of magnitude of effects, attribution of causality, and statistical reliability. These are illustrated by examining interpretations of data from an evaluation of the Career Intern Program, a project serving high-risk, low-income youth.