Developmental Evaluation


Developmental Evaluation with CES June 1, 2009 1

Evolving Understandings "I keep changing what I said. Any person who is intellectually alive changes his ideas. If anyone at a university is teaching the same thing they were teaching five years ago, either the field is dead, or they haven't been thinking." Noam Chomsky, "The Professor Provocateur," The New York Times Magazine, Nov. 2, 2003: 13. 2

Interpretive Frameworks In the May 2003 Harvard Business Review article "The High Cost of Accuracy," Kathleen Sutcliffe and Klaus Weber concluded that "the way senior executives interpret their business environment is more important for performance than how accurately they know their environment." 3

They further concluded that it is a waste of resources to spend heavily on increasing the marginal accuracy of data available to senior executives compared with the value of enhancing their capacity to interpret whatever data they have. Executives were more limited by a lack of capacity to make sense of data than by inadequate or inaccurate data. In essence, they found that interpretive capacity, or "mind-sets," distinguishes high performance more than data quality and accuracy. 4

Original Primary Options Formative and Summative Evaluation (Mid-term and End-of-Project Reviews) 5

Improvement versus Development 6

Evidence-based Practice Evaluation grew up in the projects, testing models under a theory of change that pilot testing would lead to proven models that could be disseminated and taken to scale: the search for best practices and evidence-based practices. 7

Fundamental Issue: How the World Is Changed Top-down dissemination of proven models versus bottom-up adaptive management 8

Models vs. Principles Identifying proven principles for adaptive management (bottom-up approach) versus identifying and disseminating proven models (top-down approach) 9

Conditions that challenge traditional model-testing evaluation: high innovation; development; high uncertainty; dynamic; adaptive management; emergent; systems change. 10

Mintzberg on Strategy Two types of strategy: intended and emergent. Intended strategy that is carried out becomes deliberate strategy; intended strategy that is dropped becomes unrealized strategy; deliberate and emergent strategies together produce realized strategy. 11

Re-conceptualizing Use Use is a process, not an event. Use involves an interaction, not just a report. Use involves training for use, not just delivery of results. Use is a leadership function. 12

Some Evaluation Premises: Evaluation is part of initial program design, including conceptualizing the theory of change. The evaluator's role is to help users clarify their purpose, hoped-for results, and change model. Evaluators can and should offer conceptual and methodological options. Evaluators can help by questioning assumptions. Evaluators can play a key role in facilitating evaluative thinking all along the way. Interpretive dialogue is critical. Designs can be emergent and flexible. 13

Contingency-based Evaluation: situational analysis and responsiveness; context sensitivity; clarifying and focusing on intended users (stakeholder analysis); clarifying and focusing on intended uses; methodological appropriateness; criteria for evaluating the evaluation (credibility, meaningfulness). 14

Seeing Through A Complexity Lens "You don't see something until you have the right metaphor to let you perceive it." Thomas Kuhn 15

Getting to Maybe: How the World Is Changed? Frances Westley, Brenda Zimmerman, Michael Q. Patton. Random House Canada, 2006. 16

Conceptual Options: Simple, Complicated, Complex 17

Types of Community Issues The Certainty/Agreement Matrix. [Matrix with two axes, each running from close to far from: degree of certainty and degree of agreement.] 18

Know When Your Challenges Are in the Zone of Complexity. Simple (close to certainty and agreement): plan, control. Technically complicated (far from certainty): experiment, coordinate expertise. Socially complicated (far from agreement): build relationships, create common ground. Zone of complexity: systems thinking, relationship building, collaboration, good-enough vision, chunking around drivers, minimum specifications, multiple actions, adaptability and organic approaches. 19

Simple (Known arena of action) Tight, centralized connections. Can identify and make sense of patterns. Linear cause and effect. Best practices identifiable within the current context (which of course may not be self-evident or known to others, hence the importance of context). 20

Simple, Complicated, Complex: Following a Recipe, A Rocket to the Moon, Raising a Child. Simple (following a recipe): The recipe is essential. Recipes are tested to assure replicability of later efforts. No particular expertise required; knowing how to cook increases success. Recipes produce standard products. Certainty of same results every time. 21

Complicated (Knowable arena) Relationships are looser but still clustered around a central core. Cause and effect is dynamic, multidimensional, and enmeshed in system relationships. System relationships can be modelled and understood. Expertise and coordination needed. 22

Simple (following a recipe): The recipe is essential. Recipes are tested to assure replicability of later efforts. No particular expertise required; knowing how to cook increases success. Recipes produce standard products. Certainty of same results every time. Complicated (a rocket to the moon): Formulae are critical and necessary. Sending one rocket increases assurance that the next will be OK. High level of expertise in many specialized fields plus coordination. Rockets similar in critical ways. High degree of certainty of outcome. Complex (raising a child). 23

Socially complicated: Implementing human rights agreements, like gender equity or outlawing child labor; environmental initiatives. Many different and competing stakeholders, diverse vested interests, high stakes. 24

Socially complicated situations pose the challenge of coordinating and integrating many players 25

Stakeholder Mapping: high interest/high power: the players; high interest/low power: the involved; low interest/high power: context setters; low interest/low power: the crowd. 26

Complex Centre is loosely connected to network. Cause and effect difficult to track; nonlinear, interdependent relationships. Highly context dependent. Outcomes emergent, not predictable. 27

Simple (following a recipe): The recipe is essential. Recipes are tested to assure replicability of later efforts. No particular expertise required; knowing how to cook increases success. Recipes produce standard products. Certainty of same results every time. Complicated (a rocket to the moon): Formulae are critical and necessary. Sending one rocket increases assurance that the next will be OK. High level of expertise in many specialized fields plus coordination. Rockets similar in critical ways. High degree of certainty of outcome. Complex (raising a child): Formulae have only a limited application. Raising one child gives no assurance of success with the next. Expertise can help but is not sufficient; relationships are key. Every child is unique. Uncertainty of outcome remains. 28

Complex Nonlinear Dynamics Nonlinear: small actions can have large reactions (the butterfly wings metaphor). Emergent: self-organizing, attractors. Dynamic: interactions within, between, and among subsystems and parts within systems can be volatile and changing. Getting to maybe: uncertainty, unpredictable, uncontrollable. 29

Simple (following a recipe): The recipe is essential. Recipes are tested to assure replicability of later efforts. No particular expertise required; knowing how to cook increases success. The recipe notes the quantity and nature of the parts needed. Recipes produce standard products. Certainty of same results every time. Complicated (a rocket to the moon): Formulae are critical and necessary. Sending one rocket increases assurance that the next will be OK. High level of expertise in many specialized fields plus coordination. Separate into parts and then coordinate. Rockets similar in critical ways. High degree of certainty of outcome. Complex (raising a child): Formulae have only a limited application. Raising one child gives no assurance of success with the next. Expertise can help but is not sufficient; relationships are key. Can't separate parts from the whole. Every child is unique. Uncertainty of outcome remains. 30

"A Leader's Framework for Decision Making" by David J. Snowden and Mary E. Boone, Harvard Business Review, November 2007: "Wise executives tailor their approach to fit the complexity of the circumstances they face." 31

Example: The McGill-McConnell Leadership Program, with simple elements, complicated elements, and complex elements. 32

Simple outcomes: Increase knowledge and skills of participants. Evaluation: pre-post data and documentation of learning. 33

Complicated impacts: Change participants' organizations. Evaluation: case studies of organizational change. 34

Complex Vision Infuse energy into the moribund not-for-profit (voluntary) sector. Make the sector more dynamic. Create a network of leaders who actively engage in change. 35

Evaluating the Complex Real-time follow-up of network connections and actions. Follow-up is an intervention. Rapid feedback of findings permits infusion of resources in support of emergent outcomes. 36

Process Use Infusing evaluative thinking as a primary type of process use. Capacity-building as an evaluation focus of process use. 37

Paradigms and Lenses The importance of interpretive frameworks Complexity as an interpretive framework 38

Complex Situations Highly emergent (difficult to plan and predict) Highly dynamic, rapidly changing Relationships are interdependent and non-linear rather than simple and linear (cause-effect) 39

Contingency-based Developmental Evaluation 40

DEVELOPMENTAL EVALUATION DEFINED Evaluation processes, including asking evaluative questions and applying evaluation logic, to support program, product, staff and/or organizational development. The evaluator is part of a team whose members collaborate to conceptualize, design and test new approaches in a long-term, on-going process of continuous improvement, adaptation and intentional change. The evaluator's primary function in the team is to elucidate team discussions with evaluative questions, data and logic, and facilitate data-based decision-making in the developmental process. 41

Other names: real-time evaluation, emergent evaluation, action evaluation, adaptive evaluation. 42

CONTRASTS Traditional evaluation: testing models. Complexity-based, developmental evaluation: supporting innovation and adaptation. 43

Traditional Evaluation: Render definitive judgments of success or failure. Developmental Evaluation: Provide feedback, generate learnings, support direction or affirm changes in direction in real time. 44

Traditional Evaluation: Render definitive judgments of success or failure; measure success against predetermined goals. Developmental Evaluation: Provide feedback, generate learnings, support direction or affirm changes in direction; develop new measures and monitoring mechanisms as goals emerge and evolve. 45

Traditional Evaluation: Evaluator is external, independent, objective. Developmental Evaluation: Evaluator is part of a team, a facilitator and learning coach bringing evaluative thinking to the table, supportive of the organization's goals. 46

Traditional Evaluation: Evaluator determines the design based on the evaluator's perspective about what is important; the evaluator controls the evaluation. Developmental Evaluation: Evaluator collaborates with those engaged in the change effort to design an evaluation process that matches philosophically and organizationally. 47

Traditional Evaluation: Design the evaluation based on linear cause-effect logic models. Developmental Evaluation: Design the evaluation to capture system dynamics, interdependencies, and emergent interconnections. 48

Traditional Evaluation: Aim to produce generalizable findings across time and space. Developmental Evaluation: Aim to produce context-specific understandings that inform ongoing innovation. 49

Traditional Evaluation: Accountability focused on and directed to external authorities and funders. Developmental Evaluation: Accountability centered on the innovators' deep sense of fundamental values, commitments, and learning. 50

Traditional Evaluation: Accountability to control and locate blame for failures. Developmental Evaluation: Learning to respond to lack of control, stay in touch with what's unfolding, and thereby respond strategically. 51

Traditional Evaluation: Evaluation is often a compliance function delegated down in the organization. Developmental Evaluation: Evaluation is a leadership function: reality-testing, results-focused, learning-oriented leadership. 52

Traditional Evaluation: Evaluation engenders fear of failure. Developmental Evaluation: Evaluation supports hunger for learning. 53

Conditions: high innovation; development; high uncertainty; dynamic; emergent. 54

Challenge: Matching the evaluation process and design to the nature of the situation: Contingency-based Evaluation 55

References Getting to Maybe: How the World Is Changed? Frances Westley, Brenda Zimmerman, Michael Q. Patton. Random House Canada, 2006. Utilization-Focused Evaluation, 4th ed., Sage, 2008. Developmental Evaluation, Guilford Press, forthcoming. 56

Resources The J.W. McConnell Foundation on Developmental Evaluation http://www.mcconnellfoundation.ca/default.aspx?page=139 57