Abstract
This paper describes a framework for educating future evaluators and users of evaluation through community-engaged, experiential learning courses and offers practical guidance about how such a class can be structured. This approach is illustrated via a reflective case narrative describing how an introductory, undergraduate class at a mid-size, public university in the northwest partnered with a community agency. In the class, students learned and practiced evaluation principles in the context of a Parents as Teachers home visiting program, actively engaged in course assignments designed to support the program's evaluation needs, and presented meta-evaluative findings and recommendations for future evaluation work to the community partner to conclude the semester. This community-engaged approach to teaching evaluation anchors student learning in an applied context, promotes social engagement, and enables students to contribute to knowledge about effective human action, as outlined in the American Evaluation Association's Mission.
Keywords
Introduction
This paper describes a framework for educating future evaluators and users of evaluation through community-engaged, experiential learning courses in partnership with community agencies. Community-engaged learning (CEL) is a high-impact educational practice consistent with the field's understanding of best practices in teaching evaluation; however, there is little guidance available on how to implement it in an undergraduate, introductory evaluation course (Alkin & Christie, 2002; Trevisan, 2004). This paper addresses that gap by sharing a reflective case narrative (RCN) of an undergraduate class at the University of Alaska Anchorage (UAA), describing the instructor's rationale for incorporating a CEL component into the course and providing details on how the project unfolded. The RCN framework was selected to organize and provide a structure for systematic reflection on this course's transformation and the lessons learned from the change (Becker & Renger, 2017).
The goal of the RCN structure proposed by Becker and Renger (2017) is to make information shared in a case study more engaging and encourage reflection by the audience. The RCN framework builds on adult learning theory and the strengths of a case study methodology to drive practice improvement. RCNs are explicitly “evaluative in nature,” providing a lens to both describe and reflect on the facts within the case study (Misra et al., 2021, p. E367). Becker and Renger’s (2017) suggested RCN structure to guide systematic reflection includes five sections: (1) Background and Context, (2) Motivation and Scope of the Evaluation Strategy, (3) Review of Subjectivity, (4) Evaluation Strategy in Context and Indicators of Success, and (5) Lessons Learned (Becker & Renger, 2017). This article follows the suggested structure, with minor modifications (e.g., “Strategy in Context” vs. “Evaluation Strategy in Context”), and offers a case study describing an applied way to integrate CEL with program evaluation for undergraduate students, which is not well-represented in the existing literature.
CEL is the umbrella term for structured learning experiences for students that rely on university-community partnerships (Chupp & Joseph, 2010). CEL builds upon reciprocity between a university and a community partner, recognizing that each offers the other something of value (Kellogg Commission on the Future of State and Land-Grant Universities, 1999). Courses designed to foster CEL, such as the
CEL is a transdisciplinary pedagogical strategy used with students enrolled in undergraduate, graduate, and professional programs. It appears across the curriculum, in courses teaching public health (e.g., Comeau et al., 2019), library and information science (e.g., Johnson & Lasher, 2020), economics (e.g., Schraner & Hayward-Brown, 2010), engineering (e.g., Benitz & Yang, 2020), and in medical and pharmacy schools (e.g., Fan et al., 2021; Meurer et al., 2011), as well as in many other disciplines. Research to date confirms that this type of experiential learning is assessed positively by students and supports their learning and perceived connection to the community (Astin et al., 2000; Berard & Ravelli, 2021).
In the case presented, novice students learned and practiced evaluation principles in the context of a Parents as Teachers home visiting program, actively engaged in course assignments designed to support the program's evaluation needs, and presented meta-evaluative findings and recommendations for future evaluation work to the community partner at the end of the semester. This community-engaged approach to teaching evaluation anchors student learning in an applied context, promotes social engagement, and enables students to contribute to knowledge about effective human action, as outlined in the American Evaluation Association's Mission (AEA, 2016; Brandon, 2014).
Background and Context
Course Characteristics
Health Sciences 420
Instructor Background
It is widely understood that program evaluation is “a field with a can-do attitude,” which welcomes individuals from a wide variety of backgrounds (Stevahn et al., 2005, p. 43). Many evaluators build their skill in the field through on-the-job experiences and training, as well as formal educational experiences. A survey of the American Evaluation Association's
My background is consistent with this “can-do” pattern: I came to the university after 5 years performing evaluation work in a tribal health context. My graduate training in organizational and developmental psychology provided me with a rigorous methodological background, but my evaluation-specific skills and evaluative judgment were built through performing program evaluation work. I identify as an evaluator and am active in my professional community, yet I found myself frustrated teaching evaluation to students. Over time I realized that teaching program evaluation in the absence of a program left me unmoored and my students confused, so I was determined to find a way to anchor our learning in a real-world context. Revising the course to partner with a local early childhood service provider interested in evaluation consulting offered a solution to this dilemma.
Community Partner Characteristics
Our community partner was the Rural Alaska Community Action Program (RurAL CAP) Parents as Teachers (PAT) early childhood home visiting program. RurAL CAP is a non-profit, statewide organization dedicated to improving the quality of life of low-income Alaskans (RurAL, 2017a). RurAL CAP offers early-childhood parent education and family support in 19 Alaska communities through the PAT program, using home visits and group socialization (RurAL, 2017b). The goal of PAT programs is to support all children so that they will learn, grow, and develop to realize their full potential. Our designated partner was the PAT statewide program manager.
At the time of our collaboration, RurAL CAP's PAT was a mature program, operating in Alaska for more than 15 years. The PAT program is funded in three-year cycles, and it was in the final year of a cycle, making it a useful time to review existing evaluation reports to identify information for future competitive grant applications and to consider the scope of the existing evaluation contract. The PAT manager and I agreed that the
Motivation and Scope
Motivation for the Course Revision & CEL Projects
The first time I taught the undergraduate program evaluation course, I followed the pattern set by former instructors: we had a textbook, which the students read each week. Students took quizzes on vocabulary and created needs assessments and evaluation plans for programs they invented, serving populations they identified. It was a fairly typical undergraduate evaluation course, but we joked that it was 10% fiction writing, since each assignment required students to respond to fictional programs I shared, invent details about existing programs, or devise an imaginary program in order to, for example, demonstrate their ability to craft evaluation questions.
Teaching program evaluation without a program felt impossible to do well, which left me grappling with pedagogical challenges similar to those faced by other instructors of stand-alone, introductory evaluation courses (Davies & MacKay, 2014). To address these challenges, I revised the course to incorporate CEL experiences focused on addressing the evaluation needs of a real program. The course assignments were altered to align with the community partner's requests, while fulfilling course learning objectives.
I was committed to providing students with hands-on practical experiences, as is widely recommended as a best practice in the field (i.e., Alkin & Christie, 2002; Trevisan, 2004). Trevisan’s (2004) literature review identified four types of hands-on training for evaluation students: simulation, role play, team or individual evaluation projects for clients, or practicum experiences. My students were not ready to complete independent evaluation projects either individually or in teams, eliminating the last two approaches. This left the classroom-only options of simulation and role play, or the pursuit of a new tactic that would offer students an experiential, community-engaged learning opportunity.
High-impact educational practices, including CEL, have been found to reliably increase rates of student retention and engagement, making them a natural fit for the evaluation classroom (Kuh, 2008). CEL is consistent with my pedagogical emphasis on promoting students’ social engagement and real-world competence. This alignment encouraged me to partner with a local agency to develop a project that harnessed our class's collective energy and commitment to learning without over-promising on what we could deliver.
Introductory Evaluation Course Context
The
Scope of the Course Revision
The revision of this course focused on anchoring student learning in the context of an existing community program by (a) using what we knew about PAT to illustrate evaluation concepts (i.e., developing a logic model for their program), (b) changing course assignments to reflect PAT program needs, and (c) offering students the chance to interact with the “client” as learners (during the program manager's presentations to the class) and evaluators (during students’ presentations to PAT program staff). The course learning objectives, content, and focus on professionalism were held consistent through this change.
During this revision, I also chose to replace the commercially published textbook with zero cost materials by curating a “reader” of evaluation materials for students from many sources, including CDC-created materials, Kellogg Foundation publications, AEA's
Course Teaching Methods
To teach this
Scope of Partnership With Community Organization
I approached RurAL CAP leadership approximately 4 months before the course began, asking if there was a program under their organizational umbrella that would be interested in partnering with our class. The RurAL CAP PAT program was identified as a potential partner, and I met with the program manager in person twice over the summer before class began in August. During those meetings we brainstormed about what our class could realistically offer, and the program manager shared what they would find useful: meta-evaluative work identifying the strengths and limitations of their existing evaluation strategy, and any support we could offer to link the outputs they track to outcomes that would be of interest to their funder. These products were tied to the program's upcoming proposal for continuation funding, in which they needed to strengthen their evaluation plan and link the data they had (mostly outputs) to the outcomes of interest to the funder. We also discussed what the class needed from program staff in order to provide the products they requested.
After I (as the instructor) and the program manager reached agreement on what our partnership would look like, we drafted a one-page description of the project and partnership, which the RurAL CAP Advisory Board overseeing the PAT program and the Chair of the UAA Department of Health Sciences both signed before the semester began. This document described the overall goals of the project—(1) to improve the training of the future healthcare workforce by grounding student learning around evaluation of a real program, and (2) to provide RurAL CAP with an outside perspective on their current evaluation materials, with an eye toward strengthening future grant funding applications to support PAT—and a list of activities, which was considered sufficient for review.
The PAT program is run out of a small building and provides the majority of services in communities outside the local area, which led the program manager and me to decide against a site visit; instead PAT program staff met with students on campus. Together the program manager and I decided that three visits—in weeks 2, 7, and 16 of the semester—would be adequate for program staff to provide students with necessary context and program details and for students to present their recommendations to program staff. The program staff also provided background documents that were shared with students via the course learning management system.
At our fourth class meeting, in week 2 of the semester, the PAT program manager presented a detailed, 75-minute summary of PAT's history, goals, and services and shared program description documents, marketing materials, and existing reports with the students. Bringing PAT into the course early enabled me to use that program's context to provide examples as students learned about program theory, logic models, evaluation design, and other related concepts.
In week 7 of the semester, after we began discussing measurement, the PAT program manager returned to provide a 75-minute “Data Tour,” focused specifically on
Students’ relative naiveté about evaluation concepts was a challenge to implementing CEL in this course. Many students encountered evaluation as a discipline, and the systematic thinking associated with it, for the first time in this undergraduate course, limiting the expertise they could offer a community partner. Students were interested in providing useful material; however, their skills were not adequate to design, conduct, or report on evaluation activities during the approximately 4-month semester (acknowledging that by the end of the semester they had more skill, but less time remaining to complete these activities). This reality shaped the partnership between our class and the community partner.
Student Assignments
The products the community partner requested shaped the assignments students completed in the course. I cross-referenced the PAT program's requested products with the course learning objectives, designing assignments that would assess students’ progress toward learning objectives while simultaneously supporting the PAT program needs.
The students completed five assignments based on the PAT program. The first four assignments were completed individually by each student:
1. An Evaluation Question and Indicator Brief asked students to take on the role of the program's evaluator to develop new evaluation questions and identify outcome and process indicators to answer those questions. This assignment fulfilled a specific learning objective for the course, built students’ skills, and helped them understand the PAT program.
2. A Data Quality Assessment asked students to carefully examine the data collection tools and methods in order to answer several questions: How reliable and valid are the measures being used? How is implementation fidelity accounted for? What additional questions (beyond those in the evaluation plan) could data collected using these instruments answer? What gaps exist between the data being collected and what is needed to address the questions identified earlier? What data are being collected but not used (i.e., appearing in instruments but not reports), and should they be incorporated or no longer collected? What longitudinal outcomes could be examined?
3. A Meta-Evaluation asked students to identify the strengths and limitations of the existing evaluation plan and its implementation, highlight cultural and ethical issues relevant to that implementation, and make recommendations for future iterations of the project.
4. A Literature Review asked students to look at what has already been done by others to glean insight that can be applied locally, with a particular focus on linking
The fifth assignment invited teams of students working together to draw from their previous work in the course to aggregate recommendations for the RurAL CAP PAT staff and present them in an executive summary and multimedia presentation. Each team had a distinct task: Team 1 reviewed the program's existing evaluation reports and data to highlight strengths and limitations of current work and assess data quality; Team 2 provided an evidence-based crosswalk linking child-focused outputs PAT already tracked to outcomes of interest in the literature; Team 3 provided a similar crosswalk linking family-focused outputs tracked by PAT to outcomes of potential interest to funders; and Team 4 proposed five operationalized longitudinal evaluation questions and provided a rationale for each, since the PAT staff had mentioned they were interested in adding a longitudinal component to their evaluations going forward.
Students used the 2-hour “final exam period” for the course to present these meta-evaluative conclusions and recommendations for future linkages and evaluation planning to three members of the PAT staff in person. The PAT manager and staff members engaged in lively conversation with students for an hour after the presentations concluded.
While this approach incorporated every student's work into the final recommendation assignments, the deliverables were a shared project. Individual student projects were not pursued, for both pedagogical and practical reasons. Pedagogically, working in teams allowed students to delve deeply into a more narrowly scoped assignment and avoided creating 20 “competing” narratives for the community partner to consider. The pedagogical design also encouraged each student to engage fully in CEL tasks, since the final, team-based assignment built directly on the four individually graded assignments. Practically speaking, the instructor did not have the capacity to identify, scope, and adequately oversee 20 separate projects conducted by novice evaluators.
It is worth noting that students did not collect, review, or analyze data; instead they reviewed data collection instruments and aggregate project reports. Neither students nor their instructor had access to program data or participants. According to the UAA Institutional Review Board (IRB), the students’ coursework activities did not meet the definition of human subjects research and therefore did not require IRB review. I also confirmed that IRB review was not needed to describe within this paper some of the content from student course evaluations.
Review of Subjectivity
I identify as an evaluator and am active in my professional community, which contributes to my assumption that students benefit from learning evaluation in the context of a program. In this course, I strove to support my students’ development of their own identities as evaluators by explicitly incorporating AEA's
My past experiences conducting evaluation of a similar early childhood home visiting program in a tribal health context and my role as the mother of young children helped me establish smooth working relationships with RurAL CAP staff, who are focused on serving families with young children and parents themselves. Our class's university affiliation was a symbolic presence that facilitated the relationship with the agency, fostering trusting interactions with their staff. This collaboration was greatly supported by a shared assumption that undergraduate evaluation students would be able to provide helpful perspective on questions program staff had about their evaluation needs going forward (Becker & Renger, 2017).
Course Strategy in Context and Indicators of Success
Community-Engaged and Experiential Learning
The community-engaged course framework provides a much-needed option for experiential learning in evaluation, providing students with the opportunity to build understanding via participation and learn “through doing, rather than sitting and listening” (Alkin & Christie, 2002, p. 210). Building upon this goal, the community-engaged evaluation course outlined here offered experiential, participatory learning for students who would otherwise be limited to traditional, classroom-only learning. This course provided students with the opportunity to connect evaluation theory and practice in a way that accrued benefits to themselves and their community, while approaching a new program as external evaluators would and applying their evaluation learning in that context.
Process Indicators of Success
In Becker and Renger’s (2017) RCN framework, indicators are signs that point to success or failure as a project occurs. These indicators are subtle, often process-focused signals likely to be tacitly understood by experienced practitioners. Specifying them in an RCN is valued because it makes that tacit understanding explicit and thus accessible to newer practitioners. In their description of these indicators, Becker and Renger (2017) provide six guiding questions to identify indicators, which are answered below.
The design for this class changed rapidly over the course of our summer meetings and email exchanges, as the community partner floated their needs, I explained our class's expertise and limitations, and we negotiated a win-win course plan. At that point the design became static, and it remained so throughout the semester. Decisions about how this community-classroom partnership would look were originally made by me as the instructor and by the PAT program manager, then—at a high level—endorsed by both university and agency supervisors. These decisions were communicated in a one-page executive summary both leadership teams reviewed and approved, then adapted into course assignments by the instructor and included in the semester syllabus.
Signs that our collaboration was on track included quick responses confirming that the meeting times I proposed fit the PAT speaker's schedule, the voluntary sharing of cell phone numbers by program staff who wanted to be reachable, and high levels of student interest in the PAT material. Student enthusiasm for the partnership was conveyed via high attendance on days PAT staff visited the classroom, engaged body language, and multiple students asking thoughtful questions at the end of RurAL CAP presentations.
Indicators that suggested a change might be needed included two (of 20) students complaining that they planned to work with a different priority population (e.g., Veterans or people living unhoused) after graduation, making the PAT early childhood program irrelevant to their learning. To address this frustration, I revised the fourth assignment, offering students the option to complete a literature review on a topic relevant to the PAT program
Outcome Indicators of Success
In addition to enjoying the opportunity to work directly with the PAT program, students found the PAT-specific evaluation teaching very engaging, in part because it offered concrete examples of what evaluation components can look like. They noted that the instructor's use of PAT-specific examples for new concepts shared in class helped anchor the material and demonstrate the applied value of the course content (Student Course Evaluations, 2016).
The one negative theme that emerged from student feedback concentrated on the individual program selected: while many students were eager to engage with an early childhood program, a small number reported they were frankly bored with the selected program because their career plans did not involve work with children (Student Course Evaluations, 2016).
In addition, the RurAL CAP program and the Department of Health Sciences both benefited from the development of a functional, working partnership and are eager to find additional opportunities to work together. The PAT program manager has expressed a strong willingness to work together on similar projects in the future and encouraged me to connect with other RurAL CAP programs that could benefit from similar meta-evaluation work. His desire to introduce me to colleagues and foster new, related partnerships is the sincerest endorsement possible for the utility and strength of the community-engaged approach.
Lessons Learned
Benefits to Instructor & Students
As an instructor, I found it rewarding and, frankly, easier to teach program evaluation within the CEL framework. Designing the course outline and materials in partnership with the agency created a rich context that enabled students and me to build a shared understanding of programs and illustrate abstract evaluation terms and concepts with specific, grounded examples. The students benefited from the opportunity to learn about evaluation in a real-world context, which equipped them to be competent and confident when they encounter that type of complexity as practitioners, and from the opportunity to create and present meaningful evaluation products for a real-world client.
Based on feedback highlighting students’ preference to work on a program aligned with their existing interests, in the future I plan to incorporate at least two distinct programs from different public health domains into CEL courses. Students interested in child health and family well-being responded positively to PAT-related projects; students interested in—for example—substance abuse treatment and Veterans’ health were more likely to report they found the PAT-related assignments boring. Adding an additional program will provide students with a choice about which community partner they engage with, making it possible for them to select the program better aligned with their interests.
Preparation for Effective Partnership
This CEL course relied on a strong and effective partnership between the Parents as Teachers program staff and the Health Sciences department. The project was designed to emphasize the reciprocal nature of CEL, with both Health Sciences students and the community partner (RurAL CAP's PAT program) benefiting from the collaboration (Kellogg Commission on the Future of State and Land-Grant Universities, 1999).
The keys to this mutually beneficial outcome were scoping projects appropriately and setting clear and realistic expectations for one another during the project development phase. The partnership benefited from clear communication before and during the semester, which enabled us to calibrate both the class's and the agency's expectations for one another. For example, the community partner understood that the students were novice evaluators who would be learning by doing, under the oversight of an instructor who is an experienced professional evaluator.
Implications for Future Community Engaged Evaluation Courses
A community-engaged approach to teaching evaluation anchors student learning in an applied context, promotes students’ social engagement, and enables the course to contribute to knowledge about effective human action, as outlined in the American Evaluation Association's Mission statement (AEA, 2016; Brandon, 2014). The community-engaged evaluation course framework described in this article offers a useful approach for instructors interested in providing novice evaluation students with hands-on, practical opportunities to link learning with practice, before they are able to produce useful evaluation products individually or in teams. Information on the scope of our semester's work and on specific assignments was included to provide a useful starting point for other instructors interested in pursuing similar projects with partners in their own communities, since much of the available literature on community-engaged teaching focuses on its benefits and utility, not on how it is done.
Shifting pedagogy to build on the strengths of this framework requires purposeful effort. Unlike more well-known simulation or role-play strategies, CEL requires instructors to build community-campus partnerships and redesign syllabi to address specific partners’ needs each semester, increasing complexity during the course planning phase. While this process requires additional work by the instructor up front, it also creates an opportunity for faculty to strengthen their connections to the community and build relationships that will facilitate other types of work—including research—and improve the likelihood of student job placement after graduation. In my experience, the benefits clearly outweigh the additional demand, particularly because they promote students’ community connectedness and real-world competence.
CEL at the graduate level, where students have sufficient evaluation skills to design and conduct (supervised) evaluations, would necessarily look different. Similarly, some undergraduate students who completed this community-engaged introductory course later conducted more detailed evaluation work with community partners during their senior practicum placements, which built on this earlier learning and offered more focused time for students and community partners to collaborate.
RCN Framework
Becker and Renger’s (2017) suggested RCN guidelines provided a helpful organizing framework for this case study. The five recommended sections (Background and Context, Motivation and Scope of the Evaluation Strategy, Review of Subjectivity, Evaluation Strategy in Context and Indicators of Success, and Lessons Learned) offered a useful structure and encouraged me to include components that may otherwise have been left out, such as the review of my own subjectivity and the process indicators of success. The questions offered by Becker and Renger to guide systematic reflection within each of the five sections are concrete and specific, eliciting detailed descriptions of vital components. I encourage others writing case studies to consider this framework as a tool to guide deep reflection on the inputs, interactions, and processes that affect an evaluation or project.
Footnotes
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Ethics Statement
This project was reviewed by the University of Alaska Anchorage Institutional Review Board (project # 1815721-1), which confirmed that, under federal regulations, it did not meet the definition of human subjects research falling within the purview of the IRB.
