Developing an Evidence Base to Support Service Delivery

Abstract: 

Determining whether and how your program has an impact is not just a matter of counting the number of people served or recording the number of veterans and military family members you engage as national service participants; rather, it means understanding how the service you've provided has made a difference over time, and being able to illustrate that difference through performance measures and supporting data.

Issue:

Funders — not just CNCS, but veterans' organizations and mental health organizations — want to know that the organizations they support are providing services or engaging a population with a proven strategy, one backed by supporting data. Often programs can share only anecdotal evidence about the success of their services, or they "just know it works." While they may in fact be correct, without concrete evidence to back these assertions, it can be hard to convince others that the service strategy is truly having the desired effect in the community.

Action:

Reference any external evaluations of your program that have been conducted.
Positive third-party evaluations lend credibility to your program and help funders build confidence in your ability to create change through service.  

Program example: Montana Conservation Corps was among the 21 Corps selected to participate in the 2007 CNCS-funded National Evaluation of Youth Corps, the largest experimental design evaluation ever conducted of national service programs.

Reference successful replication of the proposed program or program component.
When your program's strategy has been proven successful elsewhere, it is easier for a funder to justify their support.

Program example: CalVet Corps' California Action Plan for Reintegration has been successfully replicated in Florida.

Include a comprehensive evaluation plan.
Evaluation methods and the data they produce are grouped into two basic categories — quantitative and qualitative. In general, quantitative methods produce "hard numbers," while qualitative methods capture more descriptive data. By putting a comprehensive evaluation plan in place, you can collect all of the data that supports your commitment to ongoing improvement and excellence.

Program example: CalVet Corps' plan includes both quantitative and qualitative data collection and analysis. 

Describe any continuous improvement processes in place at your program.
There are no guarantees that a program will get it right the first time, even with a well-thought-out strategy; but if you're tracking the data and monitoring your processes, you can figure out where improvements are needed and make changes mid-course, continually striving for strong program outcomes.

Program example: Family Services of Butler Memorial Hospital uses the Plan-Do-Study-Act (PDSA) model, which involves planning and implementing activities, studying and reviewing the results/outcomes, and then using those results to act and revise the plan as needed for ongoing success. The PDSA process is used to implement changes or to try new procedures as appropriate for the continued provision of excellent service to the community.

Highlight existing research that supports specific interventions or activities.
Citing external resources that support (a) the benefits of participation in service, (b) the benefits of receiving your program's services, or (c) the efficacy of your service approach strengthens your fundraising pitch and, ultimately, your program's mission.

Program example: To support the engagement of returning veterans as AmeriCorps members, Habitat for Humanity cites research indicating that physical, outdoor, voluntary activity has beneficial effects on post-traumatic stress disorder in younger veterans.

For more information:

Related Resources: 

Making a Difference: What We Know About Using an Evidence-Based Approach

Evidence: What It Is and Where to Find It
