Quality Improvement Methodology Next Steps


This breakout session is for teams who have some experience of testing using the Model for Improvement and would like support to take their work beyond testing and on to implementation, scale-up and spread. If you are stuck, or have made some improvements but are wondering what to do next, then this session is for you.

Participants will be supplied with a template prior to the learning session and asked to bring a brief description of their current work, which will form the basis of the practical elements of the session.

Purpose of this Session: consider the components of a learning system in your own improvement activity, in particular: 4. sequential testing of new theories; 6. planning for spread at scale. What stage are you at in relation to these aspects of the improvement journey? What do you need to do next to ensure that your tests are scaled up and spread correctly?

Aim, Measures, Changes, Testing & Implementation. (The Improvement Guide, API)

Cycles of Tests Build Confidence: repeated PDSA cycles take you from proposals, theories, hunches and intuition, through learning from data, to changes that will result in improvement.

Start small: 1 person, 1 day, 1 family, 1 setting. Move to 3, 5, 7... as confidence grows.

You can only learn as quickly as you test! Time scales: years, quarters, months, weeks, days, hours, minutes. Drop down the next two levels to plan your test cycle!

Components of a Learning System
1. System level measures
2. Explicit theory or rationale for system changes
3. Segmentation of the population
4. Learn by testing changes sequentially
5. Use informative cases: act for the individual, learn for the population
6. Learning during scale-up and spread, with a production plan to go to scale
7. Periodic review
8. People to manage and oversee the learning system
From Tom Nolan PhD, IHI

Sequential Testing & Scale Up (worked example)
Aim: achieve an improved communication process for HV and SW handovers using a standardised format.
Hunch/theory: a structured handover will ensure accurate information sharing between professional teams.
Cycle 1: test the developed handover form, 1 HV / 1 SW, 1 family (Mon)
Cycle 2: test with 3 family handovers, same HV (Wed)
Cycle 3: test with 5 more families, same HV (Mon/Tue)
Cycle 4: test with all handovers, 1 HV, 1 week
Cycle 5: other team HVs/SWs, 1 day
Cycle 6: test all handovers, 1 week
Then implement.

Learning Through Sequential Testing (discussion time at your tables). Where are you currently? Can you describe an example of the following: multiple tests with different people or under different conditions? Learning/data captured to describe your testing journey?

Testing v Implementation
Testing: trying and adapting existing knowledge on a small scale; learning what works in your system; multiple tests in a variety of conditions.
Implementation: making the change a part of the day-to-day operation of the system; a permanent change. Would the change persist even if its champion were to leave the organisation? Avoid implementation until you are confident that processes are robust.

[Video clip on testing]

Components of a Learning System (revisited)
1. System level measures
2. Explicit theory or rationale for system changes
3. Segmentation of the population
4. Learn by testing changes sequentially
5. Use informative cases: act for the individual, learn for the population
6. Learning during scale-up and spread, with a production plan to go to scale
7. Periodic review
8. People to manage and oversee the learning system
From Tom Nolan PhD, IHI

A Framework for Spread (Institute for Healthcare Improvement)
Leadership: topic is a key strategic initiative; goals and incentives aligned; executive sponsor assigned; day-to-day managers identified.
Better Ideas: develop the case; describe the ideas.
Set-up: target population; adopter audiences; successful sites; key partners; initial spread plan.
Social System: key messages; communities; technical support; transition issues.
Measurement and Feedback.
Knowledge Management.

Things to Consider: checklists for spread covering leadership, better ideas and set-up; general communication and knowledge transfer; and developing measurement, feedback and knowledge management systems.

Take a strategic approach to scaling up: for each level of testing (1, 5, 25, 125, 250), define your strategy.

Example: increasing uptake of Healthy Start Vitamins. At testing level 1, one EY practitioner gives vitamins to one mum; at level 250, free vitamins are offered in Asda with every pregnancy kit; the strategies for the intermediate levels (5, 25, 125) sit in between.

Are You Ready for Scale Up & Spread? Table discussion time. In the context of your current improvement activity: have you been testing a theory so that it could be considered ready for implementation in the area where you are working? And/or do you have a strategy for moving to scale up and spread to other sites/teams?

Seven Spreadly Sins

Seven Spreadly Sins
1. Start with a large pilot
2. Find one person to do it all
3. Be assured vigilance and hard work will solve the problems
4. Do one small test and then spread everywhere
5. Get the pilot team to spread the improvement system-wide
6. Only look at the process/quality measures on a quarterly basis (or less frequently)
7. Expect marked improvement in outcomes early on without attention to process reliability


Institute for Healthcare Improvement, The Seven Spreadly Sins.

Sustaining Improvement using Measurement: having the correct measures to provide assurance that new processes are reliable; measuring compliance or satisfaction through regular and random sampling of the population; understanding the variation that exists in your data.

What measures? Outcome measures relate directly to the overall aim: what is the result? How is the system performing? Process measures: are the processes that contribute to the aim performing as planned? Balancing measures assess the system from different dimensions: unanticipated consequences and other factors influencing the outcome.

VARIATION. If we don't understand the variation that lives in our data, we will be tempted to: deny the data (it doesn't fit with our view of reality); see trends where there are none; try to explain natural variation as special events; blame and give credit to people for things over which they have no control; distort the process that produced the data; or kill the messenger!

[Chart: a quality measure tracked over time (e.g. % of pregnant women offered a CO monitor) with action taken on every occurrence; quality swings from better to worse. Source: Robert Lloyd, Ph.D.]

Measurement of Improvement. Define measures that will show the impact of the improvement work over time; they will guide your progress through and beyond testing to implementation and monitoring for continuous improvement. There are different ways of measuring, e.g. per cent compliance with a process; a count of correct attempts divided by the number of attendances; verbal feedback or surveys.
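As an illustration of the "count of correct attempts / number of attendances" idea, here is a minimal Python sketch (the monthly counts are hypothetical, purely for illustration) of turning raw counts into a per cent compliance measure that can later be plotted on a run chart:

```python
# Hypothetical monthly counts: these numbers are illustrative only.
correct_attempts = [18, 21, 30, 30, 33, 36]   # e.g. handovers done with the new form
attendances = [72, 78, 81, 80, 84, 80]        # all handovers in the same months

# Per cent compliance = correct attempts / number of attendances x 100
percent_compliance = [
    round(100 * correct / total, 1)
    for correct, total in zip(correct_attempts, attendances)
]
print(percent_compliance)  # one value per month, ready to plot over time
```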

Common Cause Variation

Guiding Principles for Creating Charts. If you have fewer than 10 data points, make a simple line graph to see where the data points are going. If you have 10 to 12 data points, you can convert the simple line graph to a run chart (place the median on the line graph and apply the run chart rules). When you have 12-15 data points you can calculate a control chart, but you should note that the control limits are trial control limits.
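A small sketch of this rule of thumb as code; the cut-offs on the slide overlap slightly, so the thresholds below are one reasonable reading rather than a definitive rule:

```python
def suggest_chart(n_points: int) -> str:
    """Suggest a chart type from the number of data points available."""
    if n_points < 10:
        return "simple line graph"
    if n_points < 12:
        return "run chart (add the median centre line and apply the run chart rules)"
    return "control chart (treat the limits as trial limits until more data accumulate)"

for n in (6, 11, 14):
    print(n, "->", suggest_chart(n))
```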

Elements of a Run Chart: the measure (e.g. LOS in minutes) is plotted on the vertical axis against time (weeks) on the horizontal axis; the centreline (CL) on a run chart is the median; annotate the chart with the changes you make. Four simple run rules are used to determine if special cause variation is present.

Calculating the Median (odd number of values). Suppose we have the numbers 4, 52, 64, 32, 56, 33, 78, 100, 12. Arrange them in ascending order: 4, 12, 32, 33, 52, 56, 64, 78, 100. Repeatedly cross off the lowest and highest remaining values until a single value is left: 52. Therefore, 52 is the median in this case. (Keep ALL values in your data set; the crossing off is only a way of finding the middle value.)

Calculating the Median (even number of values). Suppose we have the numbers 4, 52, 64, 32, 56, 33, 78, 12. Arrange them in ascending order: 4, 12, 32, 33, 52, 56, 64, 78. Repeatedly cross off the lowest and highest remaining values until two values are left: 33 and 52. Find the average of these two centre values: (33 + 52) / 2 = 42.5, so 42.5 is the median in this case. (Again, keep ALL values in your data set.)
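The crossing-off procedure gives the same answer as any standard median routine; for example, Python's statistics module reproduces both results above:

```python
import statistics

odd_set = [4, 52, 64, 32, 56, 33, 78, 100, 12]
even_set = [4, 52, 64, 32, 56, 33, 78, 12]

print(statistics.median(odd_set))   # 52   (middle value of the 9 sorted numbers)
print(statistics.median(even_set))  # 42.5 (average of the two centre values, 33 and 52)
```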

Rule 1 - Shift. A shift in the process is six or more consecutive points either all above or all below the median. Values that fall on the median do not add to nor break a shift: skip values that fall on the median and continue counting. [Example chart: a measure or characteristic plotted over 25 points, with a shift marked.]
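A minimal sketch of Rule 1 as code, assuming the data are already in time order and the median has been calculated; the six-point threshold and the treatment of points on the median follow the rule as stated above:

```python
def has_shift(values, median, run_length=6):
    """Rule 1: return True if run_length or more consecutive points are all
    above or all below the median. Points on the median are skipped: they
    neither add to nor break the shift."""
    count, side = 0, 0
    for v in values:
        if v == median:              # skip values that fall on the median
            continue
        current = 1 if v > median else -1
        count = count + 1 if current == side else 1
        side = current
        if count >= run_length:
            return True
    return False
```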

Rule 2 - Trend. Five or more consecutive points all going up or all going down. If the value of two or more successive points is the same, ignore one of the points when counting: like values do not make or break a trend. [Example chart: 25 points with median 11, showing a trend.]
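A matching sketch for Rule 2, again assuming time-ordered data; repeated (like) values are skipped so that they neither make nor break the trend:

```python
def has_trend(values, run_length=5):
    """Rule 2: return True if run_length or more consecutive points are all
    going up or all going down. Repeated values are ignored when counting."""
    if not values:
        return False
    count, direction, prev = 1, 0, values[0]
    for v in values[1:]:
        if v == prev:                # like values do not make or break a trend
            continue
        current = 1 if v > prev else -1
        count = count + 1 if current == direction else 2  # a new direction restarts at 2 points
        direction, prev = current, v
        if count >= run_length:
            return True
    return False
```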

Rule 3 - Runs: too few or too many runs. A run is a series of consecutive points on the same side of the median; count the runs by counting the number of times the data line crosses the median and adding one. [Example chart: 10 points with median 11.4; the data line crosses the median only once, giving a total of 2 runs, which is too few.]

Rule 3 - Runs: too many runs. What is this data telling you? [Example chart: monthly data, Jan-08 to Sep-09, showing too many runs.]

Too few or too many runs
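A sketch of the counting step for Rule 3: the number of runs is the number of times the data line crosses the median, plus one (points on the median are ignored). Whether that count is "too few" or "too many" is then checked against the published table for your number of data points, which is not reproduced here:

```python
def count_runs(values, median):
    """Count runs about the median: crossings of the median, plus one."""
    sides = [1 if v > median else -1 for v in values if v != median]
    if not sides:
        return 0
    crossings = sum(1 for a, b in zip(sides, sides[1:]) if a != b)
    return crossings + 1
```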

Rule 4 - Astronomical Value. For detecting unusually large or small numbers: a data point with a blatantly obvious different value, which everyone studying the chart agrees is unusual. Remember: every data set will have a high and a low; this does not mean the high or the low is astronomical. [Example chart: 24 points with one astronomical value.]
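Rule 4 is a judgement call rather than a calculation, so code cannot decide it for you; at most, a small helper like the hypothetical one below can point the team at the value furthest from the median so they can discuss whether everyone agrees it is blatantly different:

```python
def largest_deviation(values, median):
    """Return the single value furthest from the median, as a prompt for
    discussion only; it is not automatically an astronomical point."""
    return max(values, key=lambda v: abs(v - median))
```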

Why use charts? They are one of the most powerful but simple tools for improvement; they describe a process captured over time (as opposed to a single sample); they reveal any trends a process might be experiencing; and when combined with careful annotation they track the impact of change.

Let's Build a Run Chart

How to Make a Run Chart
1. Identify the question you would like to answer using a run chart
2. Develop the horizontal scale (x-axis)
3. Develop the vertical scale (y-axis)
4. Plot the data points
5. Label the graph
(Note: be sure you have gone out and collected the relevant data before trying to construct your chart.)
6. Calculate and place a median centre line on the chart
7. Add any additional information which will communicate a more complete picture to the intended audience (including annotations of change efforts)
Murray and Provost, pg. 3-4

Run Chart Example (month: percentage, notes)
Jan-12: 25
Feb-12: 27
Mar-12: 37 (Event 1)
Apr-12: 37 (Event 2)
May-12: 39
Jun-12: 45 (Events 3 & 4)
Jul-12: 38
Aug-12: 8 (Event 5)
Sep-12: 33
Oct-12: 34 (Event 6)
Nov-12: 30
Dec-12: 33
Jan-13: 29
Feb-13: 40 (Implementation start)
Mar-13: 44
Apr-13: 47
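Putting the steps and the example data together, here is a sketch of building the run chart in Python, assuming matplotlib is available; the median centre line and the event annotations follow the "How to Make a Run Chart" steps above:

```python
import statistics
import matplotlib.pyplot as plt

months = ["Jan-12", "Feb-12", "Mar-12", "Apr-12", "May-12", "Jun-12", "Jul-12",
          "Aug-12", "Sep-12", "Oct-12", "Nov-12", "Dec-12", "Jan-13", "Feb-13",
          "Mar-13", "Apr-13"]
percentage = [25, 27, 37, 37, 39, 45, 38, 8, 33, 34, 30, 33, 29, 40, 44, 47]
notes = {2: "Event 1", 3: "Event 2", 5: "Events 3 & 4", 7: "Event 5",
         9: "Event 6", 13: "Implementation start"}

median = statistics.median(percentage)          # centre line for the run chart

fig, ax = plt.subplots(figsize=(9, 4))
ax.plot(range(len(months)), percentage, marker="o")
ax.axhline(median, linestyle="--", label=f"Median = {median}")
for i, text in notes.items():                   # annotate the change efforts
    ax.annotate(text, (i, percentage[i]), textcoords="offset points", xytext=(0, 8))
ax.set_xticks(range(len(months)))
ax.set_xticklabels(months, rotation=45)
ax.set_xlabel("Month")
ax.set_ylabel("Percentage")
ax.set_title("Run chart example")
ax.legend()
plt.tight_layout()
plt.show()
```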

The example run chart is then annotated step by step, marking a trend, an astronomical point, a shift, and runs 1 to 4, and checking for too few or too many runs.

Types of Variation
Common cause variation is inherent in the design of the process; is due to regular, natural or ordinary causes; affects all the outcomes of a process; and results in a stable process that is predictable. It is also known as random or unassignable variation.
Special cause variation is due to irregular or unnatural causes that are not inherent in the design of the process; affects some, but not necessarily all, aspects of the process; and results in an unstable process that is not predictable. It is also known as non-random or assignable variation.

Point: common cause does not mean "good" variation; it only means that the process is stable and predictable. For example, if the percentage of pregnant women offered CO monitoring averaged around 30% and was usually between 25% and 35%, this might be stable and predictable, but unacceptable.

Point: similarly, special cause variation should not be viewed as "bad" variation. You could have a special cause that represents a very good result (e.g. a high % of pregnant women being offered CO monitoring in some particular weeks), which you would want to emulate. Special cause merely means that the process is unstable and unpredictable.

Point: you have to decide whether the output of the process is acceptable!

Measurement Principles: develop aims before measuring; design measures around aims (how good, by when); develop process and balancing measures; be clear on your operational definitions.

Measurement Principles: establish a reliable baseline; track progress over time using annotated run charts. Teams need measures to give them feedback that the changes they are making are resulting in improvement, and need to understand common cause and special cause variation so that we don't over-react or under-react to situations.

Deming's System of Profound Knowledge: appreciation of a system, theory of knowledge, psychology, understanding variation.

Appreciation of a System: a complex system of interaction between people, procedures and equipment; success depends on integration, not the performance of individual parts.
Theory of Knowledge: change is a prediction of improvement based on knowledge of the system; learning from theory and experience; operational definitions are the basis for improvement, with PDSA cycles for learning.
Psychology: the interaction of people with systems; motivation and will of individuals and teams; situation awareness and decision making; managing stress and fatigue; helps planning for change management.
Understanding Variation: variation is to be expected, everything we measure varies; we make decisions based on interpretation; data over time tells the story of what has been happening.

What's the Scope of Change? The system targeted for implementation is defined by the aim: you move from a single-unit prototype, through segments, to spread across the total system (additional units, sites, organisations). As you move from pilot testing to implementation to spread, your population of interest will need to be adjusted.

Thank You