Utilization-Focused Evaluation: Finding the Value in Evaluation
Sarah Gill, Evaluation Technical Advisor (contractor), CDC's Air Pollution and Respiratory Health Branch
Session Objectives
- Introduce CDC's approach to Utilization-Focused Evaluation and CDC's Framework for Program Evaluation in Public Health
- Explain the strategic evaluation planning process
- Encourage support for the implementation of the Ohio Asthma Program's strategic evaluation plan
- Demonstrate the need for broad-based participation throughout the evaluation cycle
- Describe how to focus an evaluation on the information needs of decision makers
CDC's Framework for Program Evaluation in Public Health
Steps:
- Engage stakeholders
- Describe the program
- Focus the evaluation design
- Gather credible evidence
- Justify conclusions
- Ensure use and share lessons learned
Standards: Utility, Feasibility, Propriety, Accuracy
Research seeks to prove; evaluation seeks to improve.
Evaluation is...
A systematic process for generating specific information that is useful and valuable to a specific set of intended users.
Strategic Evaluation Planning Process
- Encompasses the five-year cooperative agreement
- Reserves one year for planning
- Calls for state partners to:
  - Convene a planning team and engage a broad array of stakeholders
  - Evaluate three components: partnerships, surveillance system, interventions
- CDC provides funding for a half-time evaluator
Evaluation Standards: Utility, Feasibility, Propriety, Accuracy
Case Study: Asthma Education for Adults
- Urban area with many small, tight-knit communities based on ethnicity
- Group sessions; initial session is 3 hours
- 6-month follow-up is a group session focusing on problem solving
- Delivered in a community setting
- Pharmacist as trainer
- Patients must be referred by a provider, have an asthma action plan (AAP), and bring a buddy
Step 1: Engage Stakeholders
Potential Stakeholders
- Program staff
- Patients
- Pharmacies/pharmacists
- MDs
- Neighboring state interested in replicating the program
- Host organizations
- Employers
Step 2: Describe the Program
Adult Asthma Education Intervention: Logic Model
INPUTS: Staff (AE-C); Funding; Partner pharmacies; Partner MDs; Asthma patients with AAP & buddy; Community hosts; Connections to local communities / good reputation
ACTIVITIES: Find & adapt curriculum; Recruit pharmacies, MDs, patients, hosts; Train pharmacists; Obtain demonstration equipment; Schedule trainings and follow-ups (trainer, community host, patients); Evaluate
OUTPUTS: Appropriate curriculum; Prepared trainers; Equipment on hand; Appropriate referrals; Meeting spaces; Trainings scheduled; Trainings held; Evaluation findings
OUTCOMES: Improved patient knowledge re self-management; Greater patient sense of control; Informed buddy; Findings disseminated and used; Less patient exposure to triggers; Better patient adherence to plan; Better program; More referrals; Better quality of life; Fewer missed work days
Step 3: Focus the Evaluation Design
Focus the Evaluation Design
- Evaluation purpose
- Users
- Uses
- Questions
- Methods
Types of Evaluation Questions
Process:
- Were program activities accomplished?
- Were activities implemented as planned?
Outcome (effects):
- Does the program exert its intended effect or impact?
- Is there progress toward larger program goals?
Most evaluations include both types of questions. They may also include questions about lessons learned or future-focused questions.
Prioritizing Questions
Weigh candidate questions against the evaluation standards (Utility, Feasibility, Propriety, Accuracy) and practical criteria:
- Information need
- Centrality
- Disparities
- Challenges
- Focus
- Maturity of program
- Prior evaluation
- Cost (money and time)
- Reach
Feasibility: Selecting Your Evaluation Design
- Experimental designs: random assignment to compare effects of an intervention on equivalent groups
- Quasi-experimental designs: comparisons are made among non-equivalent groups
- Observational designs: comparisons are made within groups (e.g., comparative case studies or cross-sectional surveys)
Generating Evaluation Questions
Assume a stakeholder role:
- Program staff
- Pharmacist
- Employer
- Person with asthma
- Program staff from a neighboring state
Generate potential evaluation questions.
Step 4: Gather Credible Evidence
Start Answering Your Questions
- Develop indicators
- Collect data
Developing Indicators
Specific, observable, and measurable signs of a program's performance that measure:
- Activities (process)
- Results (outcomes)
Indicators help tell the program's story. They can complement evaluation, but they can't replace it!
Collecting the Data
Data collection methods:
- Surveys
- Interviews
- Focus groups
- Document review
- Observation
- Secondary data analysis
Use multiple methods whenever possible.
Answering the MD's Question
"If my patient participates in this class, will she know when to use her controller meds vs. her rescue meds?"
Step 5: Justify Conclusions
Analyzing Data
Assess data as appropriate for each method.
Qualitative data:
- Content analysis
- Domain analysis
- Discourse analysis
- Policy analysis
Quantitative data:
- Frequencies or simple counts
- Statistical tests for differences
- Multivariate modeling
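As a small illustration of the simplest quantitative options above (frequencies, simple counts, and a test for differences), the sketch below tabulates hypothetical pre/post knowledge scores for a group of class participants. The data and variable names are invented for demonstration; a real evaluation would use the program's actual instrument and an appropriate statistical test.

```python
# A minimal sketch of "frequencies or simple counts" on hypothetical
# pre/post knowledge scores (0-10 scale), using only the standard library.
from collections import Counter
from statistics import mean, stdev

pre  = [4, 5, 3, 6, 4, 5, 2, 6, 5, 4]   # invented baseline scores
post = [7, 8, 6, 9, 6, 8, 5, 9, 7, 6]   # invented follow-up scores

# Simple count: how many participants improved?
diffs = [b - a for a, b in zip(pre, post)]
improved = sum(d > 0 for d in diffs)
print(f"{improved}/{len(diffs)} participants improved")

# Frequency table of score gains
print("gain frequencies:", dict(Counter(diffs)))

# Paired mean difference and a rough standardized effect size
print(f"mean gain = {mean(diffs):.1f}, d = {mean(diffs)/stdev(diffs):.2f}")
```

A formal significance test (e.g., a paired t-test) would normally follow these descriptive counts, but even simple tabulations like this can answer many process and outcome questions credibly.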
Interpreting Data
- Facts alone are not enough to draw conclusions.
- Different stakeholders will judge the same facts differently.
- A process for building consensus on conclusions may be needed.
Step 6: Ensure Use and Share Lessons Learned
Potential Uses of Evaluation Findings
- Assess process and practice
- Target areas for improvement
- Develop standardized tools
- Strategize changes to operations
- Prioritize activities and resources
- Identify practices for replication
- Train staff and others
- Garner political support
- Identify areas for future evaluation
Even More Uses
Use your results to meet other needs!
- Progress reports: use logic models, outcome reporting, analysis
- Stakeholder groups: help you implement interventions
- Advocacy: show off areas of effectiveness
- Justify funding: point to areas needing improvement; ask for more resources
Mechanisms for Sharing Evaluation Information
- Written reports
- Presentations (formal or informal)
- Articles in newsletters and on websites
- Graphs, pictures, and illustrations
- Stories
Utilization-Focused Evaluation
USE the process, then USE the findings, then start over wiser. Yay!
Questions?
Sarah Gill (sgill@cdc.gov)
Learning and Growing Through Evaluation: www.cdc.gov/asthma/program_eval/guide.htm