Abstract
The case of a health careers program evaluation illustrates some aspects of evaluating very small and rapidly evolving programs. Such programs can change quickly, since there is no massive bureaucracy to restrain them. Rigid evaluation research methods would frustrate both researchers and program staff. The authors suggest that flexible design, the use of specialized interviewing and analytical experiments (in this case, telephone surveys testing program response), and a recognition that a consultative relationship exists will result in outcomes useful to the program and rich in evaluation data.