Problem Formulation
Jeffrey C. Valentine, University of Louisville
Center for Evidence-Based Crime Policy / Campbell Collaboration Joint Symposium
George Mason University, August 15, 2011


Overview of Today
- Brief overview of systematic reviews
- Problem formulation
  - Broad vs. narrow questions
  - PICOS
  - Characteristics of a good question (SAMPLE)
- Exercise

Brief Overview of Systematic Reviews
- Systematic reviews are a form of secondary research in which the unit of analysis is the study
- They follow the basic steps of the research process
- They aim to minimize bias and error
- But SRs are not immune to bias and error (they are not a panacea)

Stages of a Research Synthesis (Cooper, 1982)
The seminal article outlining five stages of a research review is by Harris Cooper:
- Cooper, H. M. (1982). Scientific guidelines for conducting integrative research reviews. Review of Educational Research, 52, 291-302.
- Cooper, H. M. (2009). Research synthesis and meta-analysis: A step-by-step approach. Thousand Oaks, CA: Sage.

Stages of a Research Synthesis (Cont.)
- Problem formulation
  - Clarifying your questions and writing a protocol
  - Setting explicit inclusion/exclusion criteria
- Data collection
  - Literature search
  - Information gathering from studies
- Data evaluation
  - Criteria for including and excluding studies
  - Assessing study quality
- Data analysis and interpretation
  - Integrating the effects from collected studies
  - Interpreting analysis results
- Report preparation
  - Narrative, statistics, graphs, tables

What's Required to Do a Good SR?
- A team with:
  - Substantive expertise
  - Methodological expertise
  - Statistical expertise
  - Information retrieval expertise
- Time and money
  - SRs are labor intensive: $50K-$150K+ depending on scope, complexity, and number of studies

SRs Should Be Governed by a Protocol
- A detailed protocol (plan) for the SR should be developed and made available to readers (Higgins & Green, 2008; Moher et al., 2009)
- Protocols increase transparency and limit ad hoc decisions
- The review process is iterative, and plans may change during the process
- The final report should document and explain any changes made (deviations from the protocol)

SRs Should Be Thoroughly Reported
- One goal of an SR is to increase transparency, so a great deal of methodological detail needs to be reported
- Reporting systems exist that provide suggestions, e.g., PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses)
- The need to document the research process creates the need for more up-front decision making and explication

Final Notes
- Most of the work involved in conducting a review is not spent on statistical analysis
- The scientific contribution of the final product depends on all stages of the review, not just the statistical analysis stage

Problem Formulation
Two primary questions:
- What is the specific hypothesis of interest in this synthesis?
- What evidence is relevant to this hypothesis?
A well-formulated problem will define the variables of interest so that relevant and irrelevant studies can be distinguished from one another.

SRs Vary in Scope
- Specific, narrow questions
  - Useful for testing the effects of specific treatments
- Broad, global questions
  - Useful for generating new knowledge
  - Identify common elements of effective programs (Lipsey, 2008)
  - Build better intervention theories to guide program development and evaluation design (Lipsey, 1997)
- Differences in scope can lead to differences in conclusions across reviews

Scope of SRs
- Not limited to questions about the effects of interventions
  - Can address trends, epidemiology, the accuracy of diagnostic and prognostic tests, and more
- Not limited to randomized controlled trials or quantitative data
  - Qualitative synthesis (e.g., meta-ethnography, narrative analysis of qualitative research reports)
  - Mixed/multiple methods synthesis (e.g., Thomas, Harden, et al. on programs to combat childhood obesity)

What Kinds of Research Questions?
- Questions about intervention effects: What are the effects of intervention x on outcomes y for population/problem z?
- Variations on this theme (e.g., differences in the effects of interventions x1 vs. x2)

Questions (cont'd)
- Questions about associations
  - How does x1 relate to x2 for population z? (direction and strength of correlation)
  - Variations on this theme (e.g., differences in the relation of x1 and x2 between populations z1 and z2)
- Diagnostic/prognostic questions
  - Which test (A vs. B) is a better predictor of y?
  - Which test (A vs. B) is a better predictor of y for population z1 vs. z2?

Example Research Area: Mentoring
To guide the discussion, we will use Dubois et al.'s (in press) work on mentoring: "How effective are mentoring programs for youth? A systematic review of the evidence" (to appear in Psychological Science in the Public Interest).

Steps in Problem Formulation
- Determine the conceptual definitions that are relevant to the research
- Determine the operational definitions that are relevant to the research
- Set the review parameters in terms of PICOS:
  - Populations/Participants
  - Interventions (if applicable)
  - Comparison group (counterfactual condition)
  - Outcomes (what classes of outcomes? what specific operations?)
  - Study designs (should fit the purpose)
- Note that the conceptual and operational definitions will also inform the literature search (next session)

Conceptual and Operational Definitions
- It is critical that operational definitions fit their intended constructs (or concepts)
  - Sometimes this is easy to spot; sometimes it is not
- One difficulty is that different fields may use different terminology to refer to the same thing
- Other times, different fields can use the same term to refer to different things
- As such, one has to be careful about relying on the labels provided in the studies, and refer instead to their operational definitions (if given!)
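To make these ideas concrete, a review team might record each PICOS element with paired conceptual and operational definitions in a structured form. The following Python sketch is purely illustrative: the class and field names are hypothetical, and the values only paraphrase the mentoring example used in these slides; they are not Dubois et al.'s actual protocol.

```python
from dataclasses import dataclass, field

@dataclass
class PicosElement:
    """One PICOS element, pairing a conceptual and an operational definition."""
    conceptual: str   # what the construct means
    operational: str  # how it will be recognized in candidate studies

@dataclass
class ReviewProtocol:
    """Skeleton of the PICOS portion of a systematic review protocol."""
    question: str
    population: PicosElement
    intervention: PicosElement
    comparison: PicosElement
    outcomes: list[PicosElement] = field(default_factory=list)
    study_designs: list[str] = field(default_factory=list)

# Illustrative values loosely based on the mentoring example (assumptions, not
# the published Dubois et al. protocol)
protocol = ReviewProtocol(
    question="How effective are mentoring programs for youth?",
    population=PicosElement(
        conceptual="youth",
        operational="study participants under age 19"),
    intervention=PicosElement(
        conceptual="ongoing, positive relationship with an extrafamilial adult",
        operational="one-to-one pairing with a community volunteer, "
                    "without specific instrumental goals"),
    comparison=PicosElement(
        conceptual="no mentoring",
        operational="wait-list or no-treatment control"),
    outcomes=[PicosElement(
        conceptual="academic achievement",
        operational="GPA, achievement test scores, graduation")],
    study_designs=["randomized experiment", "quasi-experiment"],
)
```

Writing the parameters down in a structured form like this forces the up-front explication the protocol stage calls for, and it doubles as documentation for the people who will later screen studies.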

Example
A study used the term "mentoring" to describe the following activity:
- An adult was paired with 2-4 children
- They met one hour a week for four weeks
- They met in school
- Sessions focused on improving math achievement
This probably fits the definition of tutoring better than the definition of mentoring.

More on Concept-to-Operations Fit
- Systematic reviews are sometimes subject to the "apples and oranges" critique: combining things that are actually different from one another
  - This parallels problems in primary studies (there is no average person; all people are different)
- The strong version of this argument, and the strong response, rests on identifying whether things that are different are conceptually similar enough to be combined
- In other words, if you think that two measures are tapping the same underlying construct, it is OK to combine them; if they are tapping different underlying constructs, then don't combine them

The PICOS Elements
"How effective are mentoring programs for youth? A systematic review of the evidence"
What can we learn from the title?
- Participants = youth. Operationally defined as???
- Interventions = mentoring programs. Operationally defined as???
- Comparison group = ???
- Outcomes = nothing specified, so???
- Study design = "effective," therefore experiments
We can see some of the concepts of interest just from the title. Other concepts, and all of the operational definitions, need to be specified.

Examining the PICOS Elements
- Population/Participants
  - Clearly interested in youth
  - Included studies of youth under age 19
- Study design
  - Because the interest is in effects, experimental evidence is clearly needed
  - Dubois et al. included both randomized and quasi-experiments

Examining the PICOS Elements (cont'd)
- Interventions
  - Mentoring (conceptual): an ongoing, positive relationship with an extrafamilial adult, with or without specific instrumental goals
  - Mentoring (more operational): "In the typical program, each youth is paired on a one-to-one basis with a volunteer from the community with the aim of cultivating a relationship that will foster the young person's positive development and well-being."
- What can we learn about the interests of Dubois et al. from this statement?

Types of Interventions to Include
"In the typical program, each youth is paired on a one-to-one basis with a volunteer from the community with the aim of cultivating a relationship that will foster the young person's positive development and well-being."
- One-to-one: excludes studies in which multiple youth are matched with a single mentor
- Volunteer: excludes studies that used paid professionals (e.g., therapists)
- Cultivating a relationship ... positive development and well-being: excludes studies that have instrumental goals (e.g., tutoring to improve achievement)

Examining the PICOS Elements (cont'd)
- Comparison group
  - For studies of interventions, you have to decide what you want to compare to:
    - Other presumably effective interventions? (mentoring vs. tutoring)
    - Other presumably ineffective interventions? (mentoring vs. bibliotherapy)
    - Usual treatment? (not really applicable here, but it often is, and often cannot be avoided)
    - No treatment? (wait-list control, no-treatment group, etc.)
  - Comparison to another (possibly effective) intervention is quite different from the other types of comparisons, and this has implications for interpreting the results of the review
- Outcomes
  - Use theoretical and practical considerations to determine which outcomes should be included in the review
  - Usually include only outcomes that the intervention ought to impact
  - It is important to have both conceptual and operational definitions
  - Example, academic achievement:
    - Conceptual: a person's level of knowledge in a specific academic domain
    - Operational: many, including GPA, achievement test scores, homework scores, project grades, graduation, etc.

Outcomes in Dubois et al.
Dubois et al. examined a broad array of youth outcomes:
- Attitudes toward school (including academic motivation)
- Academic achievement
- Socio-emotional outcomes (e.g., self-esteem, levels of depression)
- Conduct problems
- Physical activity levels

Inclusion/Exclusion Criteria
- All systematic reviews should have explicit, operational inclusion/exclusion criteria
- Most of these are addressed in the PICOS
  - E.g., if the study is on at-risk students, then population = at-risk students
  - Note that you still have to define what you mean by "at-risk," though
- Others involve contextual factors: geographic/political boundaries, language restrictions, the time period in which a study was conducted
- These should be justified
- The coding session will show you how to screen studies based on inclusion/exclusion criteria
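One payoff of making criteria explicit and operational is that screening becomes nearly mechanical. The sketch below is a hypothetical illustration (not the symposium's actual coding materials): the study fields and rules are assumptions loosely modeled on the Dubois et al. criteria discussed above.

```python
# Hypothetical study record; field names are illustrative, not from any real
# coding manual.
study = {
    "max_participant_age": 16,    # oldest participants in the study
    "mentor_type": "volunteer",   # "volunteer" vs. "paid professional"
    "pairing": "one-to-one",      # "one-to-one" vs. "group"
    "design": "randomized",       # "randomized", "quasi-experiment", ...
}

def screen(study: dict) -> tuple[bool, list[str]]:
    """Apply explicit inclusion/exclusion criteria; return decision + reasons."""
    reasons = []
    if study["max_participant_age"] >= 19:
        reasons.append("participants not all under age 19")
    if study["mentor_type"] != "volunteer":
        reasons.append("mentor was a paid professional")
    if study["pairing"] != "one-to-one":
        reasons.append("not a one-to-one pairing")
    if study["design"] not in ("randomized", "quasi-experiment"):
        reasons.append("not an experimental or quasi-experimental design")
    return (len(reasons) == 0, reasons)

include, why_excluded = screen(study)
print("include" if include else f"exclude: {why_excluded}")
```

Recording the reasons for each exclusion, as this sketch does, supports the transparency goal discussed earlier and provides the counts needed for a PRISMA-style flow of studies through the review.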

Examples of How Dubois et al. Could Have Been Narrower in Scope
- Restrict the types of interventions/comparisons
  - A narrower definition of mentoring
  - Limited to a specific program (e.g., Big Brothers / Big Sisters)
  - Compared mentoring to another intervention (e.g., tutoring)
- Limited to specific populations (e.g., children who are perceived to be "at risk")
- Focused on particular outcomes
- Included only randomized experiments
- Etc.
Generally there are no right answers to questions of scope. The onus is on the reviewer to be explicit about the choices and to defend them.

Using Logic Models to Help Problem Formulation
- Logic models (see Anderson et al., 2011) describe the connections between different determinants of outcomes and how these are related to a specific intervention
- Developing your own logic model, or carefully studying someone else's, can help formulate a problem for a systematic review

Zief et al. (2006) Logic Model for After-School Programs (Partial)
[Figure: a flow from the intervention through immediate and intermediate changes to longer-term outcomes]
- Intervention: program activities, implementation, resources, staff, etc.
- Immediate change: supervision, academic support, enriching activities
- Intermediate outcomes: behavior, homework, television, attendance, socio-emotional, safety
- Longer-term outcomes: grades, test scores, educational attainment, self-esteem

Using Logic Models to Formulate a Problem
- A logic model can be used to narrow the question
  - Could focus on academic outcomes and not socio-emotional outcomes
  - Could focus on the more immediate outcomes (e.g., do students in after-school programs have more supervised time than students not in after-school programs?)
- Can also use the logic model to diagnose a failure to find an effect on longer-term outcomes
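A logic model is essentially a directed graph from intervention components to outcomes, so even a toy representation can help with narrowing. The sketch below is a hedged illustration: the edges only loosely follow the partial Zief et al. model above and are my assumptions, not the published model.

```python
# Toy directed graph of a logic model; edges are illustrative assumptions.
logic_model = {
    "after-school program": ["supervision", "academic support", "enriching activities"],
    "supervision": ["behavior", "safety"],
    "academic support": ["homework", "attendance"],
    "enriching activities": ["socio-emotional"],
    "homework": ["grades", "test scores"],
    "behavior": ["attendance"],
    "attendance": ["educational attainment"],
    "socio-emotional": ["self-esteem"],
}

def downstream(node: str, model: dict) -> set:
    """All nodes reachable from `node`: candidate outcomes for a narrower review."""
    reached, stack = set(), list(model.get(node, []))
    while stack:
        nxt = stack.pop()
        if nxt not in reached:
            reached.add(nxt)
            stack.extend(model.get(nxt, []))
    return reached

# A review narrowed to the "academic support" pathway would consider:
print(sorted(downstream("academic support", logic_model)))
# ['attendance', 'educational attainment', 'grades', 'homework', 'test scores']
```

Tracing paths this way also supports the diagnostic use noted above: if a program shows no effect on grades, check whether the immediate and intermediate links upstream of grades actually held.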

Elements of a Good Research Question: SAMPLE
To determine whether you have a good research question, ask yourself:
- Is it Specific?
- Is it Answerable?
- Is it Measurable? Are there measurable constructs?
- Is it Practical? Is it relevant for policy/practice?
- Is it Logical? Is it based on theory or a logic model?
- Is it Empirical? Can answers be attained using observable evidence?

When Should a Research Question Be Subject to a Systematic Review?
- Generally, scholars undertaking a systematic review ultimately expect to find multiple studies that will meet their inclusion criteria
- If a good review already exists on the topic, you might be able to (a) update the literature search, (b) test new relations, and/or (c) improve the methodology
- If a state-of-the-art review does not exist, you can provide it!
- It can sometimes be informative to conduct an SR when it is believed that no studies exist
  - A thorough literature search might find some!
  - If a practice is widespread but has never been tested, it might be good to make this point (e.g., drug testing in high schools for after-school activity participation)

Exercise
With those around you, come up with a specific research question that could be the focus of an SR:
- What intervention(s) are being considered?
- What outcomes are of importance?
- What is the key population or context?
- What purposes could be served by synthesizing the knowledge in this area (the objectives of the review)?
- Remember SAMPLE
- One person reports for the group

The Campbell Collaboration
P.O. Box 7004, St. Olavs plass, 0130 Oslo, Norway
E-mail: info@c2admin.org
http://www.campbellcollaboration.org