Utilization-Focused Evaluation: Finding the Value in Evaluation

Utilization-Focused Evaluation: Finding the Value in Evaluation. Sarah Gill, Evaluation Technical Advisor (contractor), CDC's Air Pollution and Respiratory Health Branch

Session Objectives
- Introduce CDC's approach to Utilization-Focused Evaluation and CDC's Framework for Program Evaluation in Public Health
- Explain the strategic evaluation planning process
- Encourage support for the implementation of the Ohio Asthma Program's strategic evaluation plan
- Demonstrate the need for broad-based participation throughout the evaluation cycle
- Describe how to focus an evaluation on the information needs of decision makers

Session Objectives Yikes! Yay!

CDC's Framework for Program Evaluation in Public Health
Steps: (1) Engage stakeholders; (2) Describe the program; (3) Focus the evaluation design; (4) Gather credible evidence; (5) Justify conclusions; (6) Ensure use and share lessons learned.
Standards: Utility, Feasibility, Propriety, Accuracy.

Research seeks to prove, evaluation seeks to improve

Evaluation is a systematic process for generating specific information that is useful and valuable to a specific set of intended users.

Strategic Evaluation Planning Process
- Encompasses the five-year cooperative agreement
- Reserves one year for planning
- Calls for state partners to convene a planning team, engage a broad array of stakeholders, and evaluate three components: partnerships, the surveillance system, and interventions
- CDC provides funding for a ½-time evaluator

CDC's Framework for Program Evaluation in Public Health (steps and standards as above)

Evaluation Standards: Utility, Feasibility, Propriety, Accuracy

Case Study: Asthma Education for Adults
- Urban area with many small, tight-knit communities based on ethnicity
- Group sessions; the initial session is 3 hours
- 6-month follow-up is a group session focusing on problem solving
- Delivered in a community setting
- Pharmacist as trainer
- Patients must be referred by a provider, have an asthma action plan (AAP), and bring a buddy

Step 1: Engage Stakeholders

Potential Stakeholders
- Program staff
- Patients
- Pharmacies/pharmacists
- MDs
- Neighboring state interested in replicating the program
- Host organizations
- Employers

Step 2: Describe the Program

Adult Asthma Education Intervention: Logic Model

INPUTS
- Staff (AE-C)
- Funding
- Partner pharmacies
- Partner MDs
- Asthma patients with an AAP and a buddy
- Community hosts
- Connections to local communities / good reputation

ACTIVITIES
- Find and adapt curriculum
- Recruit pharmacies, MDs, patients, and hosts
- Train pharmacists
- Obtain demonstration equipment
- Schedule trainings and follow-ups (trainer, community host, patients)
- Evaluate

OUTPUTS
- Appropriate curriculum
- Prepared trainers
- Equipment on hand
- Appropriate referrals
- Meeting spaces
- Trainings scheduled
- Trainings held
- Evaluation findings

OUTCOMES
- Improved patient knowledge of self-management
- Greater patient sense of control
- Informed buddy
- Findings disseminated and used
- Less patient exposure to triggers
- Better patient adherence to plan
- Better program
- More referrals
- Better quality of life
- Fewer missed work days

Step 3: Focus the Evaluation Design

Focus the Evaluation Design: evaluation purpose, users, uses, questions, methods

Types of Evaluation Questions
- Process: Were program activities accomplished? Were activities implemented as planned?
- Outcome (effects): Does the program exert its intended effect or impact? Is there progress toward larger program goals?
Most evaluations include both types of questions; they may also include questions about lessons learned or future-focused questions.

Prioritizing Questions
Standards: Utility, Feasibility, Propriety, Accuracy
Considerations: information need, centrality, disparities, challenges, focus, maturity of the program, prior evaluation, cost (money and time), reach

Feasibility: Selecting Your Evaluation Design
- Experimental designs: random assignment to compare effects of an intervention on equivalent groups
- Quasi-experimental designs: comparisons are made among non-equivalent groups
- Observational designs: comparisons are made within groups (e.g., comparative case studies or cross-sectional surveys)
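To make the contrast concrete, here is a minimal Python sketch using a hypothetical list of referred patients; the names and group sizes are illustrative only, not part of the program. It shows the key difference in how groups are formed under an experimental design (random assignment) versus a quasi-experimental design (pre-existing, non-equivalent groups).

```python
# Minimal sketch (hypothetical data): how comparison groups are formed
# under an experimental vs. a quasi-experimental design.
import random

participants = [f"patient_{i}" for i in range(20)]  # hypothetical referral list

# Experimental design: random assignment creates statistically equivalent groups.
random.seed(42)
shuffled = random.sample(participants, k=len(participants))
treatment_group = shuffled[:10]   # offered the asthma education class
comparison_group = shuffled[10:]  # wait-listed

# Quasi-experimental design: groups are defined by circumstance, not by chance,
# e.g., communities where the class is offered vs. communities where it is not.
community_a = participants[:10]   # class offered here (non-equivalent group)
community_b = participants[10:]   # no class offered here

print(len(treatment_group), len(comparison_group), len(community_a), len(community_b))
```

Randomization is what allows differences in outcomes to be attributed to the intervention; with non-equivalent groups, pre-existing differences between communities have to be considered when interpreting results.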

Generating Evaluation Questions
Assume a stakeholder role (program staff, pharmacist, employer, person with asthma, program staff from a neighboring state) and generate potential evaluation questions.

Step 4: Gather Credible Evidence

Start Answering Your Questions: develop indicators and collect data

Developing Indicators
- Specific, observable, and measurable signs of a program's performance that measure activities (process) and results (outcomes)
- Help tell the program's story
- Can complement evaluation but can't replace it!
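As an illustration, here is a minimal sketch of how a process indicator and an outcome indicator for the case-study class might be computed. The records, field names, and scores are hypothetical, invented for this example.

```python
# Minimal sketch (hypothetical records): one process indicator and one
# outcome indicator for the adult asthma education class.
sessions = [  # hypothetical attendance and quiz records
    {"patient": "p1", "attended_followup": True,  "pre_score": 4, "post_score": 8},
    {"patient": "p2", "attended_followup": False, "pre_score": 5, "post_score": 7},
    {"patient": "p3", "attended_followup": True,  "pre_score": 6, "post_score": 6},
]

# Process indicator: % of enrolled patients who completed the 6-month follow-up session.
followup_rate = sum(s["attended_followup"] for s in sessions) / len(sessions) * 100

# Outcome indicator: % of patients whose self-management knowledge score improved.
improved_rate = sum(s["post_score"] > s["pre_score"] for s in sessions) / len(sessions) * 100

print(f"Follow-up completion: {followup_rate:.0f}%")
print(f"Knowledge improved:   {improved_rate:.0f}%")
```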

Collecting the Data
Data collection methods: surveys, interviews, focus groups, document review, observation, secondary data analysis.
Use multiple methods whenever possible.

Answering the MD's Question: "If my patient participates in this class, will she know when to use her controller meds vs. her rescue meds?"

Step 5: Justify Conclusions

Analyzing Data
Assess data as appropriate for each method.
- Qualitative data: content analysis, domain analysis, discourse analysis, policy analysis
- Quantitative data: frequencies or simple counts, statistical tests for differences, multivariate modeling
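For the quantitative side, a minimal sketch follows. It assumes SciPy is available and uses hypothetical pre/post quiz scores; it shows a simple count and a statistical test for differences (a paired t-test, since the same patients are measured before and after the class).

```python
# Minimal sketch (hypothetical scores): simple quantitative analysis of
# pre/post knowledge scores from the asthma education class.
from collections import Counter
from scipy import stats  # assumes SciPy is installed

pre  = [4, 5, 6, 3, 7, 5, 4, 6, 5, 4]   # hypothetical pre-class quiz scores
post = [7, 6, 8, 5, 9, 7, 6, 8, 7, 6]   # same patients after the class

# Frequencies or simple counts
print("Post-class score distribution:", Counter(post))

# Statistical test for differences (paired, because each patient is measured twice)
result = stats.ttest_rel(post, pre)
print(f"Paired t-test: t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```

Multivariate modeling would follow the same logic with additional explanatory variables (e.g., age, baseline severity), but a simple paired comparison is often enough to answer a stakeholder's question.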

Interpreting Data
- Facts are not enough to draw conclusions
- Different stakeholders will judge facts differently
- A process for building consensus on conclusions may be needed

Step 6: Ensure Use and Share Lessons Learned

Potential Uses of Evaluation Findings
- Assess process and practice
- Target areas for improvement
- Develop standardized tools
- Strategize changes to operations
- Prioritize activities and resources
- Identify practices for replication
- Train staff and others
- Garner political support
- Identify areas for future evaluation

Even more uses: use your results to meet other needs!
- Progress reports: use logic models, outcome reporting, analysis
- Stakeholder groups: help you implement interventions
- Advocacy: show off areas of effectiveness
- Justifying funding: point to areas needing improvement; ask for more resources

Mechanisms for Sharing Evaluation Information
- Written reports
- Presentations (formal or informal)
- Articles in newsletters and on websites
- Graphs, pictures, and illustrations
- Stories

Utilization-Focused Evaluation: USE the process, then USE the findings, then start over wiser. Yay!

Questions?

Sarah Gill, sgill@cdc.gov. Learning and Growing Through Evaluation: www.cdc.gov/asthma/program_eval/guide.htm