An evaluation of the Institute for Healthcare Improvement (IHI) Accelerated Patient Safety Officer programme using the Kirkpatrick Four Level Model of evaluation

Sarah Leng, Programme Manager, Kent Surrey Sussex Patient Safety Collaborative, October 2017

Introduction

The Kent Surrey and Sussex (KSS) Patient Safety Collaborative (PSC), in partnership with Health Education England KSS, hosted the IHI in August 2016 to provide the Accelerated Patient Safety Officer programme to 74 local improvers. Recent assessments of regional capability had suggested there was both demand and need for this training to enhance the effective delivery of safety and quality improvement projects in the region. The training was targeted at two localities within the region and was offered to all provider and commissioning organisations in these areas; this strategy avoided the potential dilution of targeting individuals across the whole KSS region. The vision was to align the quality improvement projects supported by this enhanced capability with both the local strategic transformation plans and the ongoing work of the PSC and other quality improvement programmes in the KSS Academic Health Science Network. To establish whether this proof of concept was effective, it was important to evaluate the training. Kirkpatrick's Four-Level Training Evaluation Model [1] was used to analyse objectively the effectiveness and impact of the training. The results from our study are presented below within the relevant sections of the model.

The Kirkpatrick Four-Level Training Evaluation Model

Donald Kirkpatrick, Professor Emeritus at the University of Wisconsin and past president of the American Society for Training and Development (ASTD), first published his Four-Level Training Evaluation Model in 1959 in the US Training and Development Journal. The model was updated in 1975 and again in 1994, when he published his best-known work, "Evaluating Training Programs." The four levels are: reaction, learning, behaviour and results.

Level 1: Reaction

Method

This level measures how the trainees react to the training: specifically, whether the training was a valuable experience. It gathers feedback about the instructor, the topics covered, the material and its presentation, and the venue. Understanding how well the training was received by the audience helps to improve it for future trainees, including by identifying important areas or topics that are missing.

At this level we captured feedback on a daily basis during the training: feedback forms were completed and analysed at the end of each day. This allowed for immediate clarification of topics that had not been understood, slight alterations to the agenda to suit the needs of the trainees, and checks that all delegates were happy with the venue. We also captured feedback from those who did not attend the training after initially agreeing to do so (no-shows at the start of the week), as we felt it was important to understand why they did not attend. Finally, using the telephone interview at 6 months, we asked the trainees to reflect on what they had found most useful from the training.

Results

For a full analysis of the Level 1 Reaction results see Appendix 1.

Overall, the feedback for the course, trainers and venue was very positive. Most of the delegates found the course content relevant, interesting and delivered at the right pace. They were very complimentary about the IHI faculty and gave the venue and food a very high rating on the survey. We had a high percentage of completed feedback forms, showing good engagement with and commitment to the training. Delegates were pleased that concerns highlighted in the daily feedback form were addressed the following day (e.g. using more pre-hospital examples, moving tables and ensuring the terminology used was understood). Some delegates were surprised at the pace and length of each training day, although the agenda had been distributed in advance and it was an accelerated course. Not all delegates had seen information on how to access the IHI site to download the full course and presentations; in future this needs to be clearly explained at the beginning of the course. Due to high numbers, not all delegates were able to stay on site. This was managed as well as possible, but we did receive some negative feedback. However, the offsite delegates were shown to network well with the tool we used. In future, a venue large enough to accommodate all the delegates will be chosen. The senior leaders attending on the last day gave positive feedback, and delegates felt their attendance was important in ensuring the leaders would appreciate their role in supporting staff to deliver change. One delegate commented: "as participants were from the same health locality I think it would have been good to work on a real challenge as multiple organisations". This would be interesting to do if the training is repeated. Prior to the event, several organisations changed which staff would attend, but they kept the number of places booked. Three delegates had to cancel the week before the training due to work commitments, and only one delegate failed to attend on the day, due to ill health. All senior leaders booked to attend the Gala Dinner and last-day training attended. These results show a fantastic commitment to the training, possibly due to an understanding of its worth; this was made clear in the joining information, and we advised that we would charge for no-shows.

Everyone interviewed at 6 months would recommend the training to colleagues and felt it was worthwhile.

Level 2: Learning

Method

This level measures what trainees have learned and how much their knowledge has increased as a result of the training. At this level, we captured pre- and post-training evaluations of the delegates' self-assessment of their quality improvement capability using the Quality 2020 Supporting Leadership for Quality Improvement and Safety tool [2]. This assists individuals in assessing their current attributes (knowledge, skills and attitudes) in relation to leadership for quality improvement and safety, and their learning and development needs for their current role or for future roles. The aim of the training was to see an improvement in their perceived ability following completion of the course.

We also captured the number of new contacts the trainees networked with during the residential course. We felt this would be of benefit because one of the challenges to succeeding in quality improvement projects across the whole system is that staff do not have personal contacts in external organisations. We also used the 6-month telephone interview to ask whether the training had supported the delegates in doing QI work.

Results

For a full analysis of the Level 2 Learning results see Appendix 2.

Overall, the evaluation shows that delegates' self-assessment of their quality improvement capability had improved following the training. Post-training there was a decrease in those stating they needed lots of development and an increase in trainees feeling they were well developed against the following six attributes:

- I understand how the culture in my workplace influences the quality and safety of care and services
- I recognise my responsibility to question the way we work in order to improve care and services
- I am able to work with a team to achieve small-step change
- I can explain and use PDSA cycles to make small-step change to care and services
- I can identify where teamwork could be more effective and I work with others to improve team performance
- I work to involve patients/service users and their carers/families in planning care and in quality improvement activities

Across each of the questions there were consistent improvements in self-assessment of capability, shown by a reduction in training requirements. In total (across the six key questions on delivering improvement):

Table 1: Overall changes in self-assessment pre- and post-training

                                      Pre-training   Post-training
Staff needing lots of development          20               5
Staff needing some development            131              56
Staff feeling well developed               95             178

The chi-square statistic for this 3 x 2 contingency table is 64.2 (2 degrees of freedom). The p-value is < 0.00001, significant at the 95% level. For the individual questions, the largest improvements in capability were seen in the following:

- I can explain and use PDSA cycles to make small-step change to care and services
- I can identify where teamwork could be more effective and I work with others to improve team performance

In addition, an increase was seen in those stating post-training that they regularly use QI methodology. The results from analysing the pre- and post-training surveys were consistent with the findings from the 12 telephone interviews. In all cases staff reported they felt the training had increased their improvement knowledge and capability.
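For transparency, the short sketch below recomputes the chi-square statistic from the counts in Table 1. It is an illustrative check only, not part of the original analysis; it assumes Python with the scipy library available, and the variable names are ours.

```python
# Recompute the chi-square test for Table 1 (pre- vs post-training
# self-assessment counts). Illustrative verification only.
from scipy.stats import chi2_contingency

# Rows: lots of development, some development, well developed
# Columns: pre-training, post-training
table = [
    [20, 5],
    [131, 56],
    [95, 178],
]

chi2, p_value, dof, _expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p_value:.1e}")
# Prints chi-square = 64.2 with dof = 2 and p on the order of 1e-14,
# consistent with the reported result (p < 0.00001).
```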

There were many instances where they reflected that they enjoyed their role more because they had more confidence in their ability to deliver improvement. When asked what stood out as the three most important things they had learned, 75% cited PDSA training; this is also reflected in the answers to the post-training survey above. There were excellent results for improved networking/social contacts (from the 61 all-week delegates): over the week, 355 new contacts were established. Delegates enjoyed the networking opportunities and understood the benefits of making new contacts for supporting and enhancing their QI projects. When asked at the 6-month evaluation telephone call whether they felt the training had helped them run QI projects, all the delegates agreed it had supported them by giving them the skills and tools to run a project.

Level 3: Behaviour

Method

This level evaluates how far the trainees have changed their behaviour based on the training they received. Specifically, it looks at how trainees apply the information. It is important to realise that behaviour can only change if conditions are favourable. If behaviour has not changed, it does not mean that trainees have not learned anything. It is possible their organisation will not let them apply the new knowledge (due to external pressures or limited capacity). It is also possible that they learned everything but have no desire to apply the knowledge themselves. Alternatively, trainees might not receive support, recognition or reward for their behaviour change from their boss, so that over time they disregard the skills or knowledge they have learned and go back to their old behaviours. To capture whether the trainees had applied the training and used it to undertake quality improvement projects, we conducted interviews at 6 months following the training, recording successes in the number of projects undertaken and also the reasons (including lack of senior support or capacity) if no projects had been undertaken. We considered these questions:

- Did the trainees put any of their learning to use?
- Are trainees able to teach their new knowledge, skills or attitudes to other people?
- Are trainees aware that they have changed their behaviour?

Results

For a full analysis of the Level 3 Behaviour results see Appendix 3.

We completed 12 telephone interviews at 6 to 10 months following the training. We were advised to conduct as many interviews as possible until saturation was reached (the point at which no new findings are discovered). The call was structured around key questions designed to understand how participants had integrated learning from the IHI training into their practice and practice settings (thus changing their behaviour and applying their knowledge).

Key themes

In all cases the trainees felt they had put their learning to use, thus applying the information from the training and thereby changing their behaviour. This took a variety of forms: training and influencing other staff by using and sharing the skills and materials from the course, initiating projects, or continuing with ongoing projects but with a better understanding of methodology, which made participants feel the project was more successful than it would have been without the training. All stated they had had senior support. The majority of staff were aware of, and pleased to report, having changed their behaviour as a result of the training. Many felt they were better at their job, which had increased their confidence. Others reported a shift in focus towards delivering more improvement, or an increase in their belief in the importance of implementing change. Notably, a significant number of trainees felt the training had made them more confident in speaking up about any patient safety concerns: they stated that the training gave them the confidence to articulate the concern clearly and the skills required to make a positive change.

Level 4: Results

Method

This level analyses the final results of the training, including the outcomes the trainee has determined in their quality improvement project. The biggest challenges are identifying which outcomes, benefits or final results are most closely linked to the training, and devising an effective way to measure these outcomes over the long term. Potential outcomes to be reviewed include:

- Increased production
- Higher morale
- Higher quality ratings (measured by fewer serious incidents, fewer falls/pressure ulcers, or another quality measure for the chosen project)
- Increased patient and carer satisfaction

To capture final results, we have monitored activity on the LIFE platform. This bespoke QI platform, based on the IHI Model for Improvement, enables users to capture aims, drivers, change ideas and Plan, Do, Study, Act (PDSA) cycles, and to present results in statistical process control (SPC) charts. Delegates were shown the LIFE platform during the training, were all given access to the system, and were actively encouraged to use it for their improvement projects. We have monitored the number of KSS users of LIFE (and the number of active users), the number of projects registered, the number of PDSA cycles and the number of page views on LIFE (showing interest in other projects). None of the projects is recorded as complete, and it should be noted that not all the projects on LIFE can be attributed to those on the training.
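As background to the SPC charts mentioned above, the sketch below shows how control limits for an individuals (XmR) chart, a common SPC chart in improvement work, are typically derived. It is illustrative only: the data are invented weekly counts for a hypothetical PDSA measure, and this is a generic textbook calculation, not the LIFE platform's implementation.

```python
# Illustrative XmR (individuals) control chart limits, as commonly used
# in SPC for improvement work. Hypothetical data; not LIFE platform code.

measurements = [12, 15, 11, 14, 18, 13, 16, 12, 17, 14]  # invented weekly counts

mean = sum(measurements) / len(measurements)  # centre line

# Moving ranges: absolute differences between consecutive points
moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard XmR constant 2.66 places the limits about the centre line
upper_control_limit = mean + 2.66 * mr_bar
lower_control_limit = mean - 2.66 * mr_bar

print(f"centre line = {mean:.2f}")
print(f"UCL = {upper_control_limit:.2f}, LCL = {lower_control_limit:.2f}")
```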

Through the telephone interview at 6 months, we were also able to capture whether, in the opinion of those questioned, any tangible results had been seen as a result of the training (although in some cases it was still too early, and some had not collected data or used LIFE for a variety of reasons, with lack of time given as the main one). We also asked the senior leaders who attended the training on the last day whether they had seen evidence of successful QI projects within their organisation.

Results

For a full analysis of the Level 4 Results see Appendix 4.

All staff had been involved in QI projects following the training and felt that the training was supporting them in doing the project.

Increased production

Following the 12 telephone calls, it is clear that many of the 18 QI project examples given will have supported more efficient working (e.g. patient flow, discharge and transfer projects, electronic GP referrals), thus increasing production.

Higher morale

All those interviewed stated they had enjoyed the training, were pleased to have been invested in, were more confident in their roles and were sharing the knowledge and training (so the full ripple effects of the training cannot easily be quantified). This leads us to the assumption that if staff are more confident in their roles and feel they are doing a better job, their morale will increase. Although impossible to attribute solely to the training, it is also likely that where the QI projects they initiated have been successful (due to the skills learnt at the training) and they have engaged and taught other staff, thereby providing better patient care, the morale of the teams working with them will also have increased.

Higher quality ratings

Not all the projects were completed, and as many of the projects started with a small PDSA cycle, results may not yet show in quality ratings across a whole organisation. However, some of those interviewed did feel they had measurable quality outcomes (e.g. pressure ulcer reduction and reducing length of stay to care homes).

Increased patient and carer satisfaction

Many of the projects, by focusing on flow, referrals and safety, will have a positive impact on patient and carer experience, and some projects involved patients in co-designing and co-working, which is shown to improve patient and carer satisfaction [3].

The measures captured from the LIFE platform have shown an increase in the number of users, the number of active users, the number of projects registered, the number of PDSA cycles and page views. However, these results cannot be fully attributed to the IHI training, as the LIFE platform has been shared and presented at a number of KSS collaborative events and at the Improvers Network launch.

Senior Leader Feedback

Two senior leaders responded, and both were positive about the impact of the training. One could see that the training had supported a very successful project: "Their project is making a real difference to our planned care programme and QIPP. She has used the methodology and learning to address the issues we had in our referral support service and implement an electronic referral service (eRS). She developed a business case that attracted investment in staff and software from the CCG and it has been deployed, impacting on referral rates, with improved compliance with thresholds and, as a by-product, improved relationships with the GP practices."

The other senior leader felt the training was supporting the establishment of their improvement hub: "I do not think we have robust evidence of a successful QI project at the moment as a result of this training; however, the staff who attended have a far better understanding of what is required for improvement and they are feeding this into their everyday work when supporting other teams. We are currently establishing an improvement hub and have been training staff on improvement, which will hopefully start to take effect. I think the course has had a positive effect and we are building on this. Workforce issues are definitely hampering our efforts but the course has also helped people to try and simplify improvement wherever possible."

Conclusion

The Kirkpatrick Four Level Model was a useful tool for evaluating the IHI Accelerated Patient Safety Officer training. It gave structure to the evaluation and made us look beyond the usual post-training feedback. Following up at 6 to 10 months was very valuable: speaking to delegates and hearing their experiences following the training was insightful, as all had used the training to some extent and felt it had made a difference to the quality of their QI work and enhanced their job satisfaction. We therefore feel the IHI training was worthwhile and would recommend running further courses in the future.

Considerations

The model assumes that each level is more important than the last, and that all levels are linked. For instance, it implies that Reaction is ultimately less important than Results, and that reactions must be positive for learning to take place. In practice, this may not be the case. Most importantly, organisations change in many ways, and behaviours and results change in response to these as well as to training. For example, measurable improvements in areas like retention and productivity could result from the arrival of a new boss or a new computer system rather than from training. Kirkpatrick's model is valuable for trying to evaluate training in a "scientific" way; however, so many variables can be changing in fast-moving organisations that analysis at Level 4 can be limited in usefulness.

Acknowledgements

Tony Kelly, Clinical Lead for Leadership, Culture & Capability, Kent Surrey Sussex Patient Safety Collaborative (KSS PSC)
Kate Cheema, Head of Patient Safety Measurement Unit, NHS South, Central and West Commissioning Support Unit
Jo Hughes, Programme Coordinator, KSS PSC

References

1. Kirkpatrick, D. Four-Level Evaluation Model, first published as a series of articles in 1959 in the Journal of the American Society of Training Directors (now T+D Magazine); later compiled and published as "Techniques for Evaluating Training Programs" in Kirkpatrick, D. (ed.), Evaluating Training Programs (1975).

2. Quality 2020: Supporting Leadership for Quality Improvement and Safety. An Attributes Framework for Health and Social Care, November 2014.
3. Bayliss K, Prince R, Dewhurst H, Parsons S, Holmes L, Brown P. Working with public contributors to improve the patient experience at the Manchester Clinical Research Facility: an evaluation of the Experience Based Design approach, April 2017.