Evaluating Knowledge Transfer


Evaluating Knowledge Transfer Annual NICE Knowledge Exchange 2008 June 4-6, 2008, Toronto, Ontario Michael A. Saini, Ph.D., M.S.W. Research Associate Research Institute for Evidence-Based Social Work Factor-Inwentash Faculty of Social Work University of Toronto Email: michael.saini@utoronto.ca

Objectives Consider the goals and purposes of initiating evaluation of knowledge transfer (EKT) Describe a process for evaluating the effectiveness and impact of knowledge transfer Participate in EKT exercise relevant to NICE Consider benefits and limitations of EKT

Knowledge Transfer To ensure research has maximum impact To strengthen the relevance of research Funders and institutions require it Can provide a direct link to consumers

Consumers of KT Clients recognize they need more information regarding issues/problems affecting them Knowledge transfer has the potential to create more effective consumers of knowledge (Tugwell et al., 2007) KT puts the results of rigorous research into the hands of clients

Three KT Models 1. Producer-push model: Researcher responsible for transferring and facilitating the uptake of knowledge 2. User-pull model: Decision makers responsible for identifying and making use of research knowledge 3. Exchange model: Researchers and decision makers jointly responsible for the uptake of knowledge (Lavis et al., 2003)

Evaluating Knowledge Transfer Evaluation provides evidence of the effectiveness of the initial and ongoing knowledge investment (money, time, effort, reputation, client faith) Tests the suitability of current strategies Justifies future expenditure on knowledge transfer Provides accountability Provides evidence of impact (Debowski, 2006)

Evaluating Knowledge Transfer Rapid growth of KT activities with a wide range of strategies Little has been documented about KT effectiveness Programs that incorporate ongoing evaluation are more aware of what works and what does not

Evaluating Knowledge Transfer Framework has three main phases

Knowledge Creation Primary or secondary research Synthesizing the research Systematic Reviews (e.g. meta-analysis, qualitative synthesis) Rapid Evidence Assessments / Scoping / Conceptual Maps Refining research into relevant products / tools for knowledge transfer Evidence-Based Guidelines Consumer-friendly summaries Booklets

Knowledge Action Address the barriers and supports needed to implement the products / tools with the target audience Assess whether products / tools are the best format for the target audience Facilitate the awareness and uptake of products / tools by consumers Monitor knowledge use

Knowledge Evaluation Create goals and objectives for the evaluation Decide on desired outcomes Identify how to measure desired outcomes Complete analysis according to the pre-determined evaluation plan

Evaluating Knowledge Transfer (Adapted from Graham, Logan, Harrison, et al., 2006; Tugwell, Santesso, O'Connor, et al., 2007)

Getting Started Determine if systematic reviews have been performed Identify stakeholders and include them in all steps Create a support system to facilitate giving and receiving feedback Identify all potential uses of knowledge Plan for evaluation at the onset of knowledge transfer (CHEO, 2006)

Identify Knowledge for KT /EKT Interview relevant stakeholders about the kinds of knowledge consumers need Identify sources of knowledge based on rigorous accumulation of empirical research Consider implications of systematic reviews for target audience

Guiding Questions 1. What message do you want to transfer? 2. To whom should the message be delivered? 3. By whom should the message be delivered? 4. How should the message be delivered? 5. With what effect? www.researchtopolicy.ca

Adapt KT to Local Context Integrate scientific evidence with practice / policy experiences and the views, preferences, characteristics and situations of the target audience Develop a tailored product by evaluating KT with key stakeholders and then making modifications based on feedback Interviews Focus groups Pilot test

Assess Barriers to Knowledge Use Planning and executing strategies to actively promote awareness and knowledge uptake Barriers may include: Difficult or no access to products / tools Limited time among helping professionals to promote products / tools Cultural values, preferences, awareness and resources of relevant audiences (Tugwell et al., 2006)

Assess Barriers to Knowledge Use Methods to assess facilitators and barriers to knowledge use include: Interviews Focus groups Questionnaires Direct observation Analysis of administrative data

Implement Products / Tools Assess evidence to design feasible, targeted products and/or tools, including evidence-based actionable messages tailored for relevant audiences (Tugwell et al., 2006, p. 646) Tailor products / tools for maximum uptake by the target audience Consider style, design, look, user-friendliness, portability, etc.

Implement Products / Tools (Barnardo's, 2002)

Evaluate KT Process Monitor the uptake of products / tools Count the number of visits to the website Ratio between products created and products distributed Use stakeholder groups to provide assessment of uptake and use by consumers Interviews with consumers to compare intended audience with actual audience
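Several of these uptake indicators reduce to simple counts and ratios. A minimal sketch in Python of how they might be tabulated; the function name and all figures are hypothetical, not drawn from the presentation:

```python
# Hypothetical uptake metrics for a KT product; the figures below are
# invented for illustration only.

def uptake_ratio(distributed: int, created: int) -> float:
    """Proportion of created products that actually reached consumers."""
    if created <= 0:
        raise ValueError("at least one product must have been created")
    return distributed / created

# Example: 500 booklets printed, 380 handed out; 1240 website visits logged.
booklets_created = 500
booklets_distributed = 380
website_visits = 1240

print(f"Website visits: {website_visits}")
print(f"Distribution ratio: {uptake_ratio(booklets_distributed, booklets_created):.0%}")
# Distribution ratio: 76%
```

Tracking such a ratio over time, alongside interview data, shows whether products are merely produced or actually taken up.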

Evaluate Outcomes Relatively new field of research for Knowledge Transfer (Graham et al., 2006) Should be connected to the original goals of KT activities Research design should fit the intended goals of the evaluation Temporal order: RCT / Quasi-experimental Correlation: Survey In-depth analysis: Qualitative

Outcomes of Knowledge Use Conceptual knowledge use: changes in understanding Instrumental knowledge use: changes in behaviour Strategic knowledge use: use of knowledge for personal power (Graham et al., 2006)

Outcomes of Knowledge Use Outcomes need to be chosen at the onset and in combination with the development of the product / tool Outcomes need to be chosen on the basis of validity, reliability and sensitivity to change Outcomes may need to be tested and modified prior to implementation (Tugwell et al., 2006)

Evaluating KT Depends on the purpose and access to resources Tugwell et al. (2006) distinguish three purposes of evaluating KT activities: For internal quality improvement For external accountability To research KT effectiveness

For Internal Quality Improvement Non-randomized pre-experimental study design Pre-Post KT or time series Qualitative research for rich description Explore process variables and changes in understanding

For External Accountability Non-randomized observational study design Pre-Post KT or time series Assess outcomes related to process, changes in understanding and changes in behaviour Qualitative research for rich description
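For the pre-post designs mentioned in the last two slides, the core computation is the change in some outcome score between measurement points. A minimal sketch using only the Python standard library; the scores are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical knowledge-test scores for the same seven participants,
# measured before and after a KT activity (invented data).
pre = [52, 60, 48, 55, 63, 50, 58]
post = [61, 66, 55, 60, 70, 57, 64]

# Per-participant change, then the average improvement and its spread.
diffs = [after - before for before, after in zip(pre, post)]
print(f"Mean change: {mean(diffs):.1f} points (SD {stdev(diffs):.1f})")
# Mean change: 6.7 points (SD 1.3)
```

A quality-improvement or accountability evaluation would report such a change descriptively; attributing it to the KT activity requires the experimental designs discussed next.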

To Research KT Effectiveness Experimental study designs are preferred RCT or Quasi-experimental designs with parallel cohorts and similar baselines Assess both process (how, why and what setting) and outcomes specific to the goals of the KT activities (e.g. reduction of falls) Qualitative research to provide additional insight

Sustain Knowledge Use Maintaining knowledge transfer activities Reassessment of facilitators and barriers Tailoring products and intervention strategies Monitoring use Evaluating impact Sustaining knowledge use

Evaluation Challenges Justifying knowledge transfer impact Long-term and short-term perspectives Defining strategic knowledge performance Choosing data collection methods to get at preferred answers (Debowski, 2006)

Discussion Knowledge transfer needs to be rigorously evaluated and monitored Qualitative and quantitative measures are suited to exploring the inputs and outcomes associated with knowledge transfer Evaluating Knowledge Transfer should reflect the nature of knowledge creation, action and evaluation

Activity In groups, you will be provided with a target audience and a mock fact sheet The mock fact sheet is based on a systematic review of depression in the elderly after traumatic brain injury (TBI): Menzel, J. (2008). Depression in the elderly after traumatic brain injury: A systematic review. Brain Injury, 22(5), 375-380

Activity 1. List the primary goals of developing the KT strategy, based on audience, message and desired effect 2. List potential barriers to implementing KT based on the intended audience 3. Brainstorm the various products / tools to transfer knowledge and then pick the best 4. Develop an evaluation protocol by choosing a research design, type of outcomes and how you plan to measure the outcomes

References

Barnardo's R&D (2000). What works? Making connections: Linking research and practice. Barkingside: Barnardo's.

Debowski, S. (2006). Knowledge management. Sydney, Australia: John Wiley & Sons.

Graham, I. D., Logan, J., Harrison, M. B., et al. (2006). Lost in knowledge translation: Time for a map? Journal of Continuing Education in the Health Professions, 26(1), 13-24.

Lavis, J., Ross, S., McLeod, C., & Gildiner, A. (2003). Measuring the impact of health research. Journal of Health Services Research & Policy, 8(3), 165-170.

Menzel, J. (2008). Depression in the elderly after traumatic brain injury: A systematic review. Brain Injury, 22(5), 375-380.

Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Oxford, UK: Blackwell Publishing.

Provincial Centre of Excellence for Child and Youth Mental Health at CHEO (2006). Doing more with what you know. Ottawa, Ontario. www.onthepoint.ca

Tugwell, P., Robinson, V., Grimshaw, J., & Santesso, N. (2006). Systematic reviews and knowledge translation. Bulletin of the World Health Organization, 84(8), 643-651.

Tugwell, P., Santesso, N., O'Connor, A., & Wilson, A. (2007). Knowledge translation for effective consumers. Physical Therapy, 87(12), 1728-1738.

www.researchtopolicy.ca