Clear Horizon Workbook Quick Start Guide (Jan 09): MSC Design


Designing a Most Significant Change (MSC) system

Programme/project name:

The aim of this guide is to help groups design an MSC system for their programme or project. The guide splits MSC design into four core parts. Each part is described, then some questions are asked in relation to that step. Answering these questions should help your group develop an appropriate MSC process.

This document was designed to be used as part of a one-day training workshop in MSC; it was not intended to be a stand-alone document. But please feel free to copy this document, just acknowledge Clear Horizon as the author, and please let us know how it can be improved by contacting Tracey at tracey@clearhorizon.com.au

Overview of MSC

MSC involves the collection and systematic participatory interpretation of stories of significant change. Unlike conventional approaches to monitoring, MSC does not employ quantitative indicators; it is a qualitative approach. The MSC approach was originally developed by Rick Davies through his work with a participatory rural development project in Bangladesh in 1994. It has since been adapted and widely promoted by Jess Dart in Australia. Information about the MSC approach has also been made available globally through an MSC internet discussion group set up in 2000, which now has more than 120 members. Access to the mailing list and papers concerning the work of Rick, Jessica and others can be found at: http://groups.yahoo.com/group/mostsignificantchanges

In 2000 the name 'Most Significant Change' was settled on, as it embodies one of the most fundamental aspects of the approach: the collection and systematic selection of reported changes.

Overview of steps to design an MSC process:

1. Clarifying the purpose of MSC
2. Working out the process
   - Establishing 'domains of change'
   - Collecting significant change stories
   - Determining a structure to select significant change (SC) stories
   - Determining a process for reviewing the SC stories
   - Feedback
   - Verification
3. How to use the stories
   - Adapting the program (closing the loop)
   - Secondary analysis
   - Quantification
   - Communication and promotion
   - Reporting
4. Thinking about starting

1. Clarifying the purpose of MSC

The purpose of MSC in your project

Monitoring and evaluation in an organisation may serve several purposes, and MSC addresses some purposes more than others. In our experience, MSC is suited to monitoring that focuses on learning rather than just accountability. It is also an appropriate tool when you are interested in the effect of the intervention on people's lives and keen to include the words of non-professionals. In addition, MSC can help staff to improve their capabilities in capturing and analysing the impact of their work.

Monitoring or evaluation?

Deciding whether to use MSC for monitoring or evaluation depends on the resources you have available and how often you intend to run the MSC cycle. Building MSC into ongoing monitoring usually means that staff take a more active role and the MSC cycle happens on a more regular basis (e.g. quarterly). If, however, you are more interested in using MSC strategically and at critical points during an intervention's life, then you may consider MSC for evaluation.

The type of program where MSC works best

MSC is better suited to some program contexts than others. In a simple program with easily defined outcomes (such as vaccination, perhaps), quantitative monitoring may be sufficient and would certainly consume less time than MSC. In other program contexts, however, conventional monitoring and evaluation tools may not provide sufficient data to make sense of program impacts and foster learning. The types of programs that are not adequately catered for by orthodox approaches, and can gain considerable value from MSC, include programs that are:

- complex, producing diverse and emergent outcomes
- focused on social change
- participatory in ethos
- designed with repeated contact between field staff and participants
- struggling with conventional monitoring systems
- highly customised services to a small number of beneficiaries (such as family counselling).

There are also some instances where MSC costs may not justify the benefits. While MSC can be used to address the following, there may be other less time-consuming ways to achieve the same objectives:

- developing good news stories for public relations (PR)
- conducting a retrospective evaluation of a program that is complete
- understanding the average experience of participants
- producing an evaluation report for accountability purposes.

Is MSC right for your organisation/program context?

Some program contexts are more conducive to the successful implementation of MSC. In our experience, some of the key enablers for MSC are:

- an organisational culture that encourages learning, and commitment by management
- champions (i.e. people who can promote the use of MSC) with good facilitation skills
- a willingness to try something different
- time to run several cycles of the approach
- infrastructure to enable regular feedback of the results to stakeholders.

Questions about fit and purpose

- Why have you decided to use MSC?
- Will you use it for monitoring or evaluation?
- In what ways does your project lend itself to MSC?
- Is there management support and an organisational learning culture?
- How will it complement your existing M&E system?
- What are the main benefits you hope to get from using the technique?

2. Working out the process

2.1 Establishing domains of change

What are domains?

Domains are broad and often fuzzy categories of possible changes. For example, participants in MSC could be asked to look for significant changes in four domains:

- changes in the quality of people's lives
- changes in the nature of people's participation in development activities
- changes in the sustainability of people's organisations and activities
- any other changes.

A domain of change is not an indicator. Indicators are almost the opposite: they are supposed to be SMART (Specific, Measurable, Achievable, Relevant and Time-bound), and they need to be defined in such a way that everyone has the same interpretation of what they mean. Domains of change, on the other hand, are deliberately fuzzy, enough to allow people to have different interpretations of what constitutes a change in that area.

Why use domains?

Dividing SC stories up into domains can make the story selection process easier to manage. If you have domains, SC stories from each domain can be considered separately, so that you are not comparing apples to pears. This helps if you are going to collect and select among many SC stories.

There are two main ways of determining domains: the first distinguishes SC stories by their content, the second by stakeholder groups.

1) If domains are to refer to the content of the SC stories, many organisations base the domains on their pre-existing higher-order objectives. This allows them to track whether they are achieving their objectives. Alternatively, new categories can be developed.

2) Domains can be used to help describe SC stories from different stakeholders, e.g. significant changes from beneficiaries, from programme staff, from partners.

Tips:

- You can start without domains and allow them to emerge.
- We suggest you don't have more than 4-5.
- You can have a 'lessons learned' domain to pick up on the negatives.

Questions about domains

- Will you use domains?
- Who will select the domains? Top down or bottom up?
- How will the domains be selected?
- How will a 'lessons learned' domain be handled?
- Any ideas at this stage about what they should be?

2.2 Collect the significant change stories

There are many ways to collect SC stories, such as by interview or through group discussion. Ideally SC stories will be 1-2 pages long, and will be documented at some stage.

Questions about story collection:

- Who will collect the SC stories?
- How often will SC stories be collected?
- How will they be collected? (Method)
- Who will tell the SC stories?
- How will you ensure that the collection process is ethical?
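If you document SC stories electronically, it can help to agree up front on what each story record should capture. The following is a minimal sketch of such a record; the field names (and the `SCStory` class itself) are illustrative assumptions, not part of the MSC guide.

```python
# A minimal sketch of a documented SC story record.
# All field names here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SCStory:
    title: str
    storyteller: str            # who told the story
    collector: str              # who recorded it
    date_collected: str         # e.g. "2009-01-15"
    domain: str = ""            # may be assigned later, during selection
    narrative: str = ""         # the 1-2 page story itself
    why_significant: str = ""   # the storyteller's view of why it matters
    consent_given: bool = False # ethics: record that informed consent was obtained

# Example record; the domain can stay blank until the selection stage.
story = SCStory(
    title="Farmer gains confidence",
    storyteller="A. Participant",
    collector="Field officer",
    date_collected="2009-01-15",
    consent_given=True,
)
print(story.title)
```

Keeping consent as an explicit field is one simple way to make the ethics question above visible in the record itself.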

2.3 Selection

Determine a structure to select significant change stories

A central idea in MSC is the use of a hierarchy of selection processes. This helps reduce a large volume of locally important SC stories down to a small number of widely valued SC stories. The use of multiple levels of selection enables this to happen without overburdening any individual or group with too much work, despite the participatory nature of the selection process. Therefore, in designing an MSC process for your project/programme you need to consider who should be involved in the selection process.

Questions:

- How often will you select stories?
- Who would benefit from reading and selecting SC stories?
- What related reporting requirements do you have, and how often?

Map out a possible structure:

Determining a process for reviewing the SC stories

Often story selection begins with a group of people sitting together with a pile of documented SC stories, which may or may not be assigned to domains. The task is to reduce the pile of stories down to one story per domain. So if there are four domains, in each domain the participants will select the story that they believe represents the most significant change of all. If the SC stories have not yet been designated against domains, this is one of the first jobs to be done.

The selection process invariably begins with reading some or all of the SC stories, out loud or individually. We tend to prefer reading the SC stories aloud, as it brings them to life, but the effectiveness and practicality of this may vary with context. If the SC stories have already been allocated to domains, then stories from each domain are considered as separate groups. From here, various facilitated and unfacilitated processes can be used to help participants decide which are the most significant stories. Whatever process you use to select the stories, it is most important to remember to document the reasons why certain stories were selected over the others. We encourage you to experiment with different selection processes to find one that best suits your cultural context. While various processes can be used, the key ingredients of story selection are:

- Everybody reads the SC stories.
- Hold an in-depth conversation about which ones should be chosen.
- Come to a decision about which stories everyone feels to be most significant, trying to reflect all views (e.g. choose two if necessary, or add a caveat if you cannot reach agreement in the time available).
- Document the reasons for the choice.

We think it is quite important to do this in an inductive way, that is, to choose the story first and then discuss the criteria afterwards. This tends to be more productive than setting the criteria in advance.
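The per-domain reduction described above has a simple structure: group the pile of stories by domain, then let the group's deliberation pick one story per domain. The sketch below only illustrates that structure; the choice itself is a human judgement, so the `choose` callback merely stands in for the group discussion. All names here (`select_per_domain`, the sample stories) are illustrative assumptions.

```python
# Structural sketch of per-domain MSC selection: group stories by domain,
# then reduce each pile to one story. The actual choice is made by group
# deliberation; `choose` is just a placeholder for that discussion.
from collections import defaultdict

def select_per_domain(stories, choose):
    """stories: list of (domain, story) pairs; choose: picks one story per pile."""
    piles = defaultdict(list)
    for domain, text in stories:
        piles[domain].append(text)
    # one most-significant story per domain (document the reasons separately)
    return {domain: choose(domain, pile) for domain, pile in piles.items()}

stories = [
    ("Quality of life", "Story A"),
    ("Quality of life", "Story B"),
    ("Participation", "Story C"),
]

# Stand-in for the group's deliberation: here we simply take the first story.
result = select_per_domain(stories, lambda domain, pile: pile[0])
print(result)  # {'Quality of life': 'Story A', 'Participation': 'Story C'}
```

In a real MSC cycle the same function shape applies at each level of the selection hierarchy: field-level groups feed their chosen stories up to the next level's pile.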

Questions:

- How will you select the SC stories? (i.e. facilitated, negotiated, mixed groups)
- How will you make sure that everyone in the group is happy with the choice?
- How, and by whom, will the reasons for choosing the story be recorded?

2.4 Feedback

It is really important to feed back your selected SC stories, and the reasons for the choice, to the relevant people, so that the next round of SC stories can benefit from the feedback. Different ways to provide feedback include:

- in an evaluation summit
- via fax or email
- verbal feedback, via phone or face-to-face
- newsletters
- formal reports
- creative dissemination: song, theatre, etc.

Questions:

- Who needs to get feedback on selected SC stories and the reasons for selection?
- How will feedback be communicated?
- What will the feedback cover? (Comments on all SC stories, or just those selected?)

2.5 Verification

There are two ways in which reported changes can be selected for verification:

- Making random checks of reported changes. This method is not advocated, and we don't know of any organisation that has made use of random checks.
- Making checks on those changes that have been selected as most significant of all, i.e. those selected as most significant at all levels, from the field level, through middle management, up to senior management. Given the weight of meaning attached to these reported changes, it makes sense to make sure that the foundations are secure, in the sense that the basic facts of what happened are correct.

Questions:

- Will you verify any of the SC stories?
- If yes, which SC stories?
- If yes, what aspects will be verified?
- Who will verify them, and how and when?

2.6 Meta-monitoring

Often a spreadsheet is used to help record where the SC stories come from. Things that might be included are:

- gender/region of storytellers
- outcome of the selection process
- frequency of SC stories.

Questions about meta-monitoring:

- How will you monitor and store the SC stories?
- If you use a spreadsheet, what information will you record?
- Who will do this?
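The meta-monitoring spreadsheet mentioned above can be as simple as a CSV file with one row per story. This is a hedged sketch of what such a file might look like; the column names are our own assumptions based on the items listed (storyteller gender/region, selection outcome), not a prescribed MSC format.

```python
# Sketch of a meta-monitoring "spreadsheet" kept as CSV, one row per SC story.
# Column names are illustrative assumptions, not a prescribed MSC format.
import csv
import io

FIELDS = ["story_id", "storyteller_gender", "region", "date_collected",
          "domain", "selection_outcome"]

rows = [
    {"story_id": "SC-001", "storyteller_gender": "F", "region": "North",
     "date_collected": "2009-01-15", "domain": "Quality of life",
     "selection_outcome": "selected at field level"},
    {"story_id": "SC-002", "storyteller_gender": "M", "region": "South",
     "date_collected": "2009-01-20", "domain": "Participation",
     "selection_outcome": "not selected"},
]

# Write to an in-memory buffer here; in practice this would be a file
# that doubles as the story register for frequency counts.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Because each row records where the story came from and what happened to it, the same file answers the frequency question (how many stories per region, per domain, per period) with a simple count.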

3. Use of stories

3.1 Secondary analysis

Some organisations choose to analyse all the SC stories together, both those selected and those not selected. This can be done in a variety of ways, such as thematic analysis. For an example of this analysis, look at the Target 10 evaluation stories publication on www.clearhorizon.com.au. Often it involves looking for key themes and quantifying how many times each theme has occurred across all the stories.

Questions about secondary analysis:

- Will you do this?
- Who is best placed to do it?
- What types of things might you look for?
- When?

3.2 Quantification

One further step in MSC is to quantify the emergence of a particular theme across a random sample of the population. For example, in the Target 10 implementation, several stories explored the way dairy farmers were feeling more confident to challenge the feedstock agents as a result of the workshop on cow nutrition. This was an unexpected outcome for the project team. They could have done a short phone survey to determine how widespread this change was. In the MSC guide we refer to this usage as developing 'dynamic indicators'.

Questions about quantification:

- Will you do this, and when?
- What things might you want to quantify?
- How?
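The counting step that underlies both secondary analysis and quantification (how many stories touch each theme) can be sketched mechanically. The stories and keyword lists below are invented for illustration; in practice the themes would come out of your own reading of the stories, as described above.

```python
# Minimal sketch of counting how many SC stories touch each theme.
# Stories and theme keywords are invented for illustration only.
from collections import Counter

stories = [
    "After the cow nutrition workshop I felt confident to challenge our agent.",
    "The workshop changed how we plan feed budgets.",
    "I now feel confident negotiating with the feedstock agent.",
]

# Each theme is matched by any of its keywords appearing in a story.
themes = {
    "confidence": ("confident", "confidence"),
    "feed planning": ("feed budget", "nutrition"),
}

counts = Counter()
for story in stories:
    text = story.lower()
    for theme, keywords in themes.items():
        if any(k in text for k in keywords):
            counts[theme] += 1  # count each story at most once per theme

print(dict(counts))
```

A keyword count like this is only a first pass; themes that emerge from careful reading (and a follow-up survey, as in the Target 10 example) carry far more weight than string matching.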

3.3 Communication and promotion

Stories provide useful information about what is happening on the ground. This information, particularly where things are working well or going wrong, could be useful learning for others involved in similar initiatives. Similarly, stories are a good way of promoting what you are doing to interested parties.

Questions about communication and promotion:

- Who would decide which stories to share?
- Who might be interested in reading the stories?
- How best could they be used to communicate information about your project?
- Who would do this?

3.4 Reporting

MSC stories are a useful way of reporting on outcomes.

Questions about reporting: What reporting requirements do you have? For each report, consider:

- Report name
- Compiled by (who writes it)
- Audience (for whom)
- What's included?
- How would stories be used/presented in the report?

4. Starting

One of the most daunting steps is getting started. Often there is scepticism about the validity of the technique, and fears that it will take too much time. It often takes an enthusiastic individual or small group to raise interest in MSC by visiting key people/groups and presenting the methodology. It can help to present SC stories from other projects and show example reports; these can be downloaded from the website. It is also worth presenting the technique simply: presenting all the theory at the start only confuses people. Many practitioners will not be interested in, nor need to understand, all the theory behind MSC. The message that needs to be conveyed at the start is that MSC is simple and straightforward to implement.

In order to raise interest in MSC, it also helps if there is clarity about the purpose of MSC and the role it will play in an organisation. It cannot be stressed too often that MSC is not intended to be a stand-alone technique in monitoring and evaluation.

Another really important lesson we have learned from experience is to start small. It is a risky exercise to implement a huge and complicated MSC system without first piloting it on a smaller scale. Every organisational context is different, so MSC will have to be moulded somewhat to your particular context. Because of this, it pays to conduct a pilot and learn what works and what does not. When piloting MSC, look for the people and sections of your organisation that are most interested in and enthusiastic about the potential of MSC.

Questions

1. How will you get buy-in from the people who will be involved in creating/selecting SC stories?
2. How will you expose people to MSC? What training, if any, is needed?
3. Where can you begin? Is there a small pilot that you can test first?
4. Who are the best people to capture the first SC stories from?

Key resources for MSC

- MSC User Guide: can be downloaded at www.clearhorizon.com.au
- Quick Start Guide: can be downloaded at www.clearhorizon.com.au
- Training available through Clear Horizon twice a year
- E-group, and repository of many papers from all over the world on MSC: http://groups.yahoo.com/group/mostsignificantchanges

Key papers:

Dart, J. J. & Davies, R. J. (2005). The Most Significant Change user guide. Available at www.clearhorizon.com.au

Dart, J. J. & Davies, R. J. (2003). A dialogical story-based evaluation tool: the most significant change technique. American Journal of Evaluation, 24, 137-155.

Dart, J. J. (2000). Stories for change: A new model of evaluation for agricultural extension projects in Australia. PhD thesis, Institute of Land and Food Resources. Melbourne: University of Melbourne.

Dart, J. J. (1999a). A story approach for monitoring change in an agricultural extension project. Proceedings of the Association for Qualitative Research (AQR) international conference. Melbourne: AQR. [online]: http://www.latrobe.edu.au/www/aqr/offer/papers/jdart.htm

Dart, J. J. (1999b). The tale behind the performance story approach. Evaluation News and Comment, 8(1), 12-13.

Davies, R. J. (1996). An evolutionary approach to facilitating organisational learning: An experiment by the Christian Commission for Development in Bangladesh. Swansea, UK: Centre for Development Studies. [online]: http://www.swan.ac.uk/cds/rd/ccdb.htm. This paper has also been published, with some variations, in Mosse, D., Farrington, J., and Rew, A. (1998), Development as process: concepts and methods for working with complexity. London: Routledge/ODI (pp. 68-83); and in Impact Assessment and Project Appraisal, 16(3), September 1998, 243-250.

Davies, R. J. (1998). Order and diversity: Representing and assisting organisational learning in non-government aid organisations. PhD thesis. Swansea, UK: Centre for Development Studies. [online]: www.swan.ac.uk/cds/rd/thesis.htm

Delaney, T. (2006). A story of influence: an integrated theory of influence for the most significant change technique. Paper presented at the Australasian Evaluation conference, Darwin, 2006.