Souraya Sidani Canada Research Chair Ryerson University

Introduction: Use of mixed (quantitative and qualitative) methods in intervention evaluation research. Lewin et al. (2009) reviewed 100 randomly selected trial reports and found that 30 of these trials had qualitative work associated with them.

Reasons: Limitations of the RCT design are related to inattention to: issues that may arise in the implementation of complex interventions delivered by different professionals in the real world; understanding of how and why an intervention works, and of the complex interrelationships among factors (inherent in the physical and social context) that influence outcomes along with the intervention; and participants' experiences with and responses to the intervention (O'Cathain et al., 2013; Midgley et al., 2014).

Pragmatic Approach to Research: Recognizes the strengths and limitations of each research method. Advocates use of more than one method in an intervention study, so that biases inherent in one method are counterbalanced by biases associated with the other method (Fleming et al., 2008). Values the equal and complementary contribution of quantitative and qualitative methods in enhancing validity. Translates into application of any approach to qualitative research; the aim is to explore and describe participants' perspectives on aspects of the intervention and of the study.

Intervention Research: Concerned with determining the extent to which treatments or therapies are successful or effective in addressing health problems that individual patients or communities may experience.

Successful interventions are:
1. acceptable to members of the target population
2. feasible within the context of practice
3. implemented with high fidelity by interventionists and patients
4. able to trigger the mechanism responsible for producing expected beneficial outcomes
5. associated with minimal risks or adverse effects

Phases of intervention evaluation:
Phase 1: focuses on exploring (a) acceptability, feasibility and preliminary effects of the intervention, and (b) acceptability and feasibility of the research methods planned for phase 2 (e.g. would participants agree to randomization?).
Phase 2: concerned with determining the efficacy of the intervention (i.e. does the intervention produce hypothesized outcomes under controlled conditions?).
Phase 3: aimed at evaluating the effectiveness of the intervention (i.e. replication of effects in different subgroups of the target population and in different contexts).
Phase 4: related to dissemination, implementation and evaluation of the intervention when delivered in the real world of practice.

Qualitative research methods: Can be used alongside quantitative methods in all phases of an intervention evaluation. Commonly applied in phase 1 to examine acceptability and feasibility of the intervention, trial design and research methods; used to a lesser extent in phases 3 and 4 to monitor fidelity of intervention implementation in less controlled contexts (Lewin et al., 2009; O'Cathain et al., 2013). Reason for frequent use of qualitative research methods in phase 1: the flexibility of these methods allows exploration of participants' and research staff's perspectives on the intervention and on the utility of specific procedures, process and outcome measures; findings inform refinement of the intervention and of the study methods planned for phase 2.

Mixed (quantitative and qualitative) methods: Useful to assess acceptability, feasibility, processes and outcomes in all phases of intervention evaluation research. Assessment of these characteristics contributes to validity by identifying what exactly produced beneficial or unfavorable outcomes. Consistent with current trends acknowledging multiple causality, whereby a range of factors influence implementation and outcomes of interventions. Contrary to what is often assumed in the literature, instruments have been developed to quantitatively assess acceptability and process. Concurrent use of qualitative methods is valuable to expand, extend, or complement quantitative data and to interpret findings.

Focus of Presentation: Describe strategies for integrating quantitative and qualitative methods to examine an intervention's acceptability, feasibility, processes and outcomes within the RCT context.

Intervention Acceptability Definition: patients' favorable perception of the intervention as: appropriate and reasonable in addressing the health problem; convenient (i.e. the intervention is suitable to people's lifestyle and easy to apply in daily life); effective in addressing the health problem in the short and long term; and associated with minimal risks or side effects.

Intervention Acceptability Importance: Patients who perceive an intervention as acceptable are likely to initiate, engage in and adhere to it, leading to the hypothesized improvement in outcomes. Patients who view the intervention unfavorably may not carry it out, or may be selective in applying it, and thus may not exhibit the hypothesized improvement in outcomes (Craig et al., 2008; Eckert & Hintz, 2000).

Intervention Acceptability Assessment: use different indirect and direct, quantitative and qualitative strategies at different points in an intervention trial: 1. At enrollment into the trial: brief interview to explore reasons for enrollment or non-enrollment (Rengerin et al., 2015); open-ended questions to inquire about reasons (related to trial methods and/or intervention).

Intervention Acceptability Assessment: use different indirect and direct, quantitative and qualitative strategies at different points in an intervention trial: 2. At baseline: direct assessment using available treatment acceptability measures (e.g. Tarrier et al., 2006; Sidani et al., 2009), which provide a description of the interventions, followed by items to rate their appropriateness, effectiveness, convenience and severity of risks. 3. After exposure to the first intervention session: administer measures that include only the items rating appropriateness, effectiveness, convenience and severity of side effects.
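
To make the quantitative side concrete, the sketch below computes domain and total acceptability scores from patients' item ratings. It is illustrative only: the item names, the 5-point (0-4) scale, and the four-domain structure are assumptions loosely modeled on treatment acceptability measures such as Sidani et al. (2009), not the actual instrument.

```python
# Illustrative sketch: scoring a treatment acceptability measure.
# Assumptions (not from the presentation): items rated 0-4, grouped
# into four domains; risk items are reverse-scored so higher scores
# always indicate greater acceptability.
from statistics import mean

SCALE_MAX = 4  # assumed 5-point scale scored 0-4

# Hypothetical item-to-domain mapping
DOMAINS = {
    "appropriateness": ["appr_1", "appr_2"],
    "effectiveness": ["eff_1", "eff_2"],
    "convenience": ["conv_1", "conv_2"],
    "risks": ["risk_1", "risk_2"],  # reverse-scored
}
REVERSED = {"risks"}

def score_acceptability(ratings: dict[str, int]) -> dict[str, float]:
    """Return mean score per domain plus an overall mean."""
    scores = {}
    for domain, items in DOMAINS.items():
        values = [ratings[i] for i in items]
        if domain in REVERSED:
            values = [SCALE_MAX - v for v in values]
        scores[domain] = mean(values)
    scores["total"] = mean(scores[d] for d in DOMAINS)
    return scores

# Example: one patient's post-session ratings
patient = {"appr_1": 3, "appr_2": 4, "eff_1": 3, "eff_2": 2,
           "conv_1": 4, "conv_2": 3, "risk_1": 1, "risk_2": 0}
print(score_acceptability(patient))
```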

Intervention Acceptability Assessment of acceptability (mixed methods): measures can be administered by a research staff member or completed by patients in individual or group sessions; open-ended questions invite elaboration on what made the intervention appealing or not appealing. Example: this approach was helpful in delineating additional factors that influenced smokers' perception of pharmacological, educational and behavioral interventions for smoking cessation, and in interpreting their quantitative ratings of the interventions (Sidani et al., 2016).

Intervention Acceptability Assessment: use different indirect and direct, quantitative and qualitative strategies at different points in an intervention trial: 4. Throughout the treatment period and following treatment completion: exit interview using open-ended questions, with patients who withdraw from treatment and/or the trial, to inquire about reasons for withdrawal.

Intervention Acceptability Data analysis: Thematic or content analysis findings corroborate and extend quantitative ratings of acceptability; highlight aspects of the intervention that were viewed unfavorably, guiding improvement of intervention design and/or implementation; and assist in delineating subgroups of patients that vary in intervention perception and in outcomes.

Intervention Feasibility Definition: Feasibility refers to the practicality or logistics of delivering the intervention.

Intervention Feasibility Importance: Identify potential challenges in carrying out intervention components and activities. Challenges relate to the adequacy of the human and material resources required for providing the intervention. Challenges may interfere with the proper, smooth and/or prompt implementation of the intervention and reduce interventionists' and patients' enthusiasm for it.

Intervention Feasibility Assessment of adequacy of human resources: 1. Helpfulness of interventionists' training: formally evaluated upon completion of all training sessions. Quantitative rating of the extent to which didactic and hands-on aspects of the training were useful in understanding the conceptualization and operationalization of the intervention and in gaining the cognitive and technical skills for delivering it. Qualitative, open-ended questions (added to the training evaluation questionnaire, or administered in a group interview) to explore issues with the training.

Intervention Feasibility Assessment of adequacy of human resources: 2. Cognitive and technical skills: tested toward the end of training, using a) a formal self-report questionnaire measuring interventionists' understanding of the theory underlying the intervention, and b) performance tests during which interventionists demonstrate particular technical procedures. Use: closed and open-ended questions, often developed for the particular intervention, and structured or unstructured observation. Interpretation: converging findings guide the design of remedial strategies to prepare interventionists for proper implementation of the intervention, or the decision to dismiss those showing poor skills.

Intervention Feasibility Assessment of adequacy of human resources: 3. Interpersonal skills: examined during or upon completion of intervention implementation. Done formally, by having patients complete a measure of therapeutic alliance or relationship. Note: evidence clearly supports the influence of therapeutic alliance on patients' engagement in and adherence to treatment, and on achievement of outcomes.

Intervention Feasibility Assessment of adequacy of material resources: a) properly functioning equipment; b) required number of printed materials (e.g. booklets) and general supplies (e.g. items to demonstrate a skill); c) suitable context in which the intervention is provided (i.e. contextual features are appropriate to facilitate implementation of the intervention).

Intervention Feasibility Assessment of adequacy of material resources: Done informally by reviewing: a) complaints made by patients, either in writing or verbally, to research staff; b) issues raised by interventionists and research staff during regularly scheduled meetings. Done formally prior to intervention delivery: a) develop a checklist of the contextual features needed to facilitate intervention implementation; b) visit potential sites to inspect for the presence of these features; c) document the features on the checklist to inform decisions about site selection.

Intervention Feasibility Importance: Interventionists facing difficulties may become frustrated and quit the study, creating the need to find and train others within a short time frame to maintain patient flow, which adversely impacts the quality of training or the preparedness of interventionists. Interventionists' frustration negatively affects their interactions with patients; patients may react unfavorably by withdrawing from treatment or by not engaging in it, reducing benefits. Interventionists facing inadequate material resources are forced to modify the intervention to make it fit with what is available and feasible, producing deviations in implementation of the intervention.

Intervention Feasibility Adequacy of human and material resources may have a subtle influence on implementation of the intervention and improvement in patient outcomes. Assessment of resources, using relevant methods, provides a meaningful explanation of findings and highlights points to consider when disseminating the intervention to practice.

Intervention Process Definition: Concerned with fidelity of implementation and with the mechanism underlying intervention effects (Nelson et al., 2015; O'Cathain et al., 2013; Spillane et al., 2010). Fidelity = extent to which 1) interventionists implement the intervention as planned and in a consistent manner across all patients, and 2) patients carry out treatment recommendations correctly and as prescribed.

Intervention Fidelity - Interventionist Importance: Deviation in intervention implementation results in variability in patients' exposure to the active ingredients of the intervention, which influences their understanding of, enactment of, adherence to, and satisfaction with the intervention; this yields variability in their level of outcome improvement and reduced statistical power to detect significant intervention effects (Durlak & DuPre, 2008; Sidani, 2015).

Intervention Fidelity - Interventionist Assessment: 1. Quantitative fidelity checklist: derived from the intervention protocol to identify the activities to be performed or behaviors to be exhibited by interventionists when delivering each intervention session, as explained by Stein et al. (2007). Rating scales are used to assess the occurrence of activities or behaviors, or the frequency and quality of performance. Completed by research staff/interventionists at the end of each session, or by patients (short version), which reduces reporting bias.
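
As a minimal sketch of such a checklist score (not a published instrument; the activity list and the simple proportion scoring rule are assumptions for illustration), session-level fidelity can be computed as the proportion of protocol-prescribed activities the observer marked as performed:

```python
# Illustrative sketch: session-level fidelity score from an observer
# checklist. The activities and the proportion score are assumptions
# for illustration, not a validated instrument.

# Hypothetical activities prescribed by the intervention protocol
PRESCRIBED = [
    "review_previous_session",
    "present_session_content",
    "demonstrate_skill",
    "assign_home_practice",
]

def fidelity_score(observed: set[str]) -> float:
    """Proportion of prescribed activities observed in the session."""
    performed = sum(1 for a in PRESCRIBED if a in observed)
    return performed / len(PRESCRIBED)

# Example: observer checked off 3 of the 4 prescribed activities
session = {"review_previous_session", "present_session_content",
           "assign_home_practice"}
print(f"Fidelity: {fidelity_score(session):.0%}")  # Fidelity: 75%
```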

Intervention Fidelity - Interventionist Assessment: 2. Qualitative component: to obtain complementary fidelity data on the nature of deviations and the factors that affect interventionists' ability to implement the intervention. Add a column to the checklist for interventionists or observers to document the type of and reason for each deviation, or conduct individual or group interviews with interventionists, scheduled upon completion of the first intervention wave and at regular intervals thereafter.

Intervention Fidelity - Patient Assessment: A similar mix of quantitative and qualitative methods is used to assess patients' engagement in and adherence to treatment. Strategies to measure adherence: single self-report items (e.g. overall adherence), multiple items (e.g. frequency of carrying out intervention components), daily diary (e.g. performance of specific treatment activities); see the scoring sketch below. Open-ended questions, added to the adherence scale or posed during individual/group interviews with patients, explore reasons for non-adherence (Lutge et al., 2014).
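
For the diary-based strategy, one plausible scoring approach is sketched below; the diary structure and the 80% adherence cut-off are assumptions (a common convention), not taken from the presentation:

```python
# Illustrative sketch: diary-based adherence. Each entry records
# whether the patient performed the prescribed treatment activity
# that day; adherence is the proportion of prescribed days on which
# the activity was performed. The 80% cut-off for classifying a
# patient as adherent is an assumed convention for illustration.
ADHERENT_CUTOFF = 0.80  # assumed threshold

def adherence_rate(diary: list[bool]) -> float:
    """Proportion of prescribed days on which the activity was done."""
    return sum(diary) / len(diary) if diary else 0.0

# Example: a 14-day diary (True = activity performed that day)
diary = [True, True, False, True, True, True, False,
         True, True, True, True, False, True, True]
rate = adherence_rate(diary)
print(f"Adherence: {rate:.0%}; adherent: {rate >= ADHERENT_CUTOFF}")
```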

Intervention Fidelity Note: Measures of fidelity and adherence are administered to the comparison group in order to assess the extent of contamination, or dissemination of any of the intervention's active ingredients, which is a threat to validity (Spillane et al., 2010).

Intervention Mechanism Definition: Illustrates the series of changes or events that take place during or following the intervention and that lead to outcomes (i.e. mediate the intervention's effects on ultimate outcomes). The theory underlying the intervention identifies the mediators, defines them at the conceptual level, and informs selection of relevant measures. Examples of mediators of behavioral interventions promoting healthy behaviors: increased knowledge and self-efficacy.

Intervention Mechanism Assessment: 1. Quantitative measures: used if mediators are well specified and defined; they enable quantitative path analysis to test the hypothesized relationships.
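
A minimal sketch of one such path (mediation) analysis follows, using the product-of-coefficients (a*b) approach with a percentile bootstrap confidence interval. The data are simulated, and the variable names and bootstrap settings are assumptions for illustration, not the presenter's method:

```python
# Illustrative sketch: simple mediation analysis
# (intervention -> mediator, e.g. self-efficacy, -> outcome)
# via the product-of-coefficients (a*b) approach. Simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
treat = rng.integers(0, 2, n)                   # randomized arm (0/1)
mediator = 0.5 * treat + rng.normal(0, 1, n)    # a-path built in
outcome = 0.4 * mediator + 0.1 * treat + rng.normal(0, 1, n)

def ols_coefs(y, X_cols):
    """Least-squares coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y))] + list(X_cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

def indirect_effect(t, m, y):
    a = ols_coefs(m, [t])[1]      # a-path: treat -> mediator
    b = ols_coefs(y, [t, m])[2]   # b-path: mediator -> outcome, adjusting for treat
    return a * b

est = indirect_effect(treat, mediator, outcome)

# Percentile bootstrap CI for the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(treat[idx], mediator[idx], outcome[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {est:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```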

Intervention Mechanism Assessment: 2. Qualitative methods: used when a) mediators are not clearly known, and b) there is interest in corroborating hypothesized mediated relationships and in exploring the contribution of additional factors responsible for mediating the intervention's effects. Interviews with interventionists and/or patients, held in individual or group format, aim at exploring: aspects of the intervention that were helpful; factors that facilitated or hindered patients' engagement in treatment and contributed to improvement or lack of improvement in patients' condition; and changes in patients' condition experienced as a result of the intervention that led to improvement in outcomes (Lewin et al., 2009; Midgley et al., 2014).

Intervention Mechanism Qualitative data analysis: Thematic or content analysis of responses to open-ended questions, done independently of the quantitative analysis results, identifies what contributed to changes (e.g. methodological issues or non-specific elements). Results delineate the inter-relationships among elements of the intervention and the changes that lead to ultimate outcomes, as experienced by different subgroups of participants; they corroborate or expand the hypothesized mechanism responsible for intervention effects and support internal validity in intervention research.

Intervention Outcomes Mixed methods are often used to develop and evaluate the content of outcome measures that are or will be administered in intervention evaluation trials (e.g. Drabble et al., 2014; Lewin et al., 2009; Spillane et al., 2010).

Intervention Outcomes Few researchers have applied qualitative methods to assess outcomes (due to the traditional view of these methods as providing a low level of evidence on causality). Midgley et al. (2014) used structured qualitative interviews before and after treatment to assess outcomes (e.g. the Expectation of Therapy Interview at pretest to explore patients' hopes for change; the Experience of Therapy Interview at post-test to elicit patients' experience of therapy and of changes in their condition, as well as perspectives on factors that affected outcomes). O'Cathain et al. (2014) recognized the value of qualitative research in identifying unintended outcomes (i.e. changes in patients' condition not hypothesized but experienced post intervention).

Intervention Outcomes Open-ended questions added to a questionnaire or asked in individual interviews would be very useful in identifying unintended outcomes.

Final Remarks Value of mixing quantitative and qualitative methods in RCTs: corroborating, explaining, and expanding findings, which strengthens the validity of conclusions regarding intervention effectiveness. Challenge: the need to find appropriate strategies to analyze and synthesize quantitative and qualitative findings in a meaningful way.