Using Responsive Evaluation in Strategic Management

Bobby Thomas Cameron

Abstract

Program and policy evaluation is a key component of strategic management. Managers and other leaders need to know what works to determine whether strategic priorities are being achieved. In the 21st century, managers are required to have a broad range of knowledge and skill sets to support business development and to establish and maintain strategic priorities. In our efforts to ensure that our evaluative work is evidence-based and scientific, we may overlook those evaluative approaches and models which do not align with the traditional, positivist approach to program and policy evaluation. Managers should be aware of alternative approaches to program and policy evaluation to ensure that methods are able to capture the complexity of the social phenomena inherent to every program and policy. This paper makes a case for using Responsive Evaluation and provides an overview of its development, application, and strengths.

Program evaluation methods and techniques are not a new phenomenon; throughout the 19th century, government commissions were established in the United Kingdom and the United States to reform education and to evaluate other human services programs (Madaus and Stufflebeam, 2000). From 1964 to 1972, under U.S. President Lyndon Johnson, new policies on social programs in health, education, housing, and unemployment were implemented to support his social agenda of enhancing opportunities for all citizens (Fitzpatrick, Sanders, and Worthen, 2011). Yet this social spending lacked proper mechanisms to measure and examine the effectiveness and impacts of programs. As the need for evaluation grew, academics were hired by the government to evaluate programs against their objectives (Kalman, 1976). Evaluations were largely pre-ordinate; that is, they were oriented around measuring whether a program met its intended goals.
Throughout the 1960s, evaluation scholars began to critique this goal-oriented evaluation method as being too simplistic and as ignoring the complexity of social phenomena (Abma and Stake, 2001). As such, Robert Stake and others began to discuss ways of broadening the scope of evaluation: essentially, becoming more responsive to the needs of stakeholders and including them in all stages of the evaluation. In looking beyond the stated goals of the program to evaluate its effectiveness, what

emerged is what is now known as responsive evaluation. This new approach was accompanied by a paradigmatic shift in the philosophical understanding of evaluation and the methods best used to complete an evaluation. Postmodernist Ian Stronach (2001) wrote that responsive evaluation continues to be important because it "holds open a set of possibilities for educational meaning (a quasi-descriptive space), in the face of the normative closures of effectiveness discourses" (p. 59). Responsive evaluation, although located on the margins of traditional evaluation approaches, continues to be used by scholars and practitioners today and is supported by 30 years of scholarship demonstrating its utility and philosophy (see Abma, 2000; 2006; Abma and Stake, 2001; Howe, 1978; Kalman, 1976; Klintberg, 1976; Rakel, 1976; Spiegel, Bruning, and Giddings, 1999; Stake, 1972; 1976; 2004; Stronach, 2001; Wadsworth, 2002).

The Philosophy of Responsive Evaluation

Philosophically, the responsive approach is grounded in a constructivist ontology and nonpositivist epistemology (Abma, 2006). The approach posits that there are multiple perspectives on what constitutes truth, and therefore evaluation designs ought to be conscious of individuals' and groups' multiple interpretations of reality (Finne, Levin, and Nilssen, 1995; Sorcinelli, Parsons, and Halpern, 1984). Reality is considered a result of an individual's construction of meaning (Sorcinelli, Parsons, and Halpern, 1984). Proponents of the responsive approach contend that people naturally examine, record, and draw conclusions about the value of phenomena and what they mean to them (Howe, 1978). Participatory approaches generally follow this philosophy, which in practice entails that "[e]valuations are carried out as constructive dialogues between stakeholders, with the evaluator acting primarily as communication agent" (Finne, Levin, and Nilssen, 1995, 14; see also Abma, 2006).
The program under evaluation is also viewed through a constructivist lens. As noted by Abma (2000: 463), a program is "a phenomenon which is complex and dynamic; it has different meanings for different stakeholders." Howe (1978) inferred that the evaluator therefore does not rely solely on the pre-ordinate objectives of the program; rather, the evaluator focuses more directly on the program's activities than on its intents. Sorcinelli, Parsons, and Halpern (1984) classified responsive evaluation as a form of "naturalistic evaluation" (5). Following Guba and Lincoln, the authors stated that "... the naturalistic evaluator seeks to acquire and present thick description" (Sorcinelli, Parsons, and Halpern, 1984, 6) of the program under study. As such, the naturalistic evaluator relies on methods from field research, including interviews, observations, and document analysis (Sorcinelli, Parsons, and Halpern, 1984). Stake (1976) noted that the responsive approach thus provides a more naturalistic and humanistic approach to program evaluation. In expanding beyond the goal-oriented or pre-ordinate evaluation design, responsive evaluation takes into consideration the program's background (history), conditions, and transactions among stakeholders (Stake, 1972). It is largely emergent: the design unfolds as contact is made with stakeholders. Yet it is important to note that the responsive approach does not disregard the program's stated goals entirely. Rather, objective goals come to be evaluated alongside the other components of the program as identified by the stakeholders (Stake, 1974, in Kalman, 1976). Thus, the responsive design evaluates what stakeholders believe ought to be evaluated, and part of this can include the program's intended goals. The truth which is sought in the responsive approach is the knowledge which is generated by stakeholders and interpreted by the evaluator.
Philosophically, evaluation in this context is not exclusively focused on judging policies and programs on the basis of their effectiveness; rather, it is based on engaging stakeholders so that the evaluator can better understand the meaning of their interaction with the policy or program (Abma, 2006). The experiences of stakeholders, including feelings and emotions, are thus of concern to the responsive evaluator (Abma, 2006).

Responsive Design

The essential design of the responsive approach is Stake's 12 Prominent Events:

A) Identify program scope;
B) Overview program activities;
C) Discover purposes, concerns;
D) Conceptualize issues, problems;
E) Identify data needs;
F) Select observers, judges, and instruments (if any);
G) Observe designated antecedents, transactions, and outcomes;
H) Thematize and prepare portrayals and case studies;
I) Winnow, match issues to audiences;
J) Format for audience use;
K) Assemble formal reports (if any);
L) Talk with clients, program staff, and audiences (Stake, 1976).

These events are carried out in no specific order, and an evaluator can repeat an event, or skip one, if necessary (see, for example, Kalman, 1976; Klintberg, 1976). Responsive evaluation "is a general perspective in the search for quality and the representation of quality in a program. It is an attitude more than a model or recipe" (Stake, 2004, 86). An early example of the application of this approach is Klintberg's (1976) responsive evaluation of a medical family practice residency program. Klintberg carried out events A, B, and C, speaking with the chairman of the program and faculty to identify the nature, scope, and purpose of the residency program. Materials relating to the program's philosophy and the literature on best practices for residency programs were also reviewed. Klintberg used standardized interviews to solicit information from stakeholders on the identification of problems, performance and program deficiencies, and recommendations. Klintberg also used questionnaires to obtain factual data. Participants included enrolled students, office staff, hospital staff, volunteers, and patients. Questions were also asked relating to the acceptance of the evaluation approach itself.
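As a loose illustration only (the data structure and function names below are this author's, not Stake's), the 12 Prominent Events can be modeled as a checklist from which an evaluator draws in any order, repeating or skipping events as the evaluation unfolds:

```python
# Illustrative sketch: Stake's 12 Prominent Events as a flexible checklist.
# The codes mirror Stake (1976); the modeling choices are hypothetical.

PROMINENT_EVENTS = {
    "A": "Identify program scope",
    "B": "Overview program activities",
    "C": "Discover purposes, concerns",
    "D": "Conceptualize issues, problems",
    "E": "Identify data needs",
    "F": "Select observers, judges, and instruments",
    "G": "Observe designated antecedents, transactions, and outcomes",
    "H": "Thematize and prepare portrayals and case studies",
    "I": "Winnow, match issues to audiences",
    "J": "Format for audience use",
    "K": "Assemble formal reports",
    "L": "Talk with clients, program staff, and audiences",
}

def plan_evaluation(sequence):
    """Expand an evaluator-chosen sequence of event codes into descriptions.

    Events may be repeated or skipped and follow no fixed order, mirroring
    Stake's point that the events are a repertoire, not a recipe.
    """
    return [PROMINENT_EVENTS[code] for code in sequence]

# Klintberg's (1976) study opened with events A, B, and C:
opening_steps = plan_evaluation(["A", "B", "C"])
```

The point of the sketch is simply that the sequence is an input chosen (and revised) by the evaluator, not a property of the model itself.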
Klintberg found that participants overwhelmingly accepted the responsive approach, that there was an outstanding willingness to respond, and that all participants were genuinely enthusiastic about engaging in the evaluation. In regards to specific methods, although Stake argued that traditional scientific methods alone could not properly evaluate a program, and Abma (2006: 32) likewise stated that responsive evaluation emerged in response to an overreliance on experimental methods, evaluation scholars who have employed this approach have indeed integrated traditional methods. Howe (1978) stated that "the interviews which averaged 40 minutes in duration were conducted in an open manner with unstructured responses" (21). She went on to identify themes from the interviews, although she did not specify the way in which she analyzed the data. Abma (2006: 33) stated that if one is interested in the meaning of a program, it is important to conduct in-depth interviews.

Strengths of Responsive Evaluation

The responsive approach is applicable to both summative and formative evaluations. Stake (1972) noted that the approach is particularly useful during formative evaluation when the project staff needs help in monitoring the program and when no one is sure what problems will arise. It will be particularly useful in summative evaluation "when audiences want an understanding of the activities and of the strengths and shortcoming[s] of the program" (2). It is preferred over pre-ordinate evaluation when one wants to know the extent to which set objectives have changed (Stake, 1972). The responsive approach has been shown to generate large amounts of data (Klintberg, 1976), allowing for a thick description of a program (Sorcinelli, Parsons, and Halpern, 1984). Klintberg (1976) concluded that the approach allows for evaluations of programs which are either limited or broad in scope and is especially suited for programs

which are in transition. Rakel (1976) also stated that responsive evaluation is best suited for programs which are not yet fully established and structured. Kalman (1976) showed the utility of the responsive approach at the state level by evaluating a migrant program. The purpose of the evaluation was to identify areas of concern that could impact state and local administrators. The study found that participants were concerned with the lack of structure and pre-ordinate objectives. Kalman (1976) concluded with policy recommendations on how state-level administrators ought to interact with local administrators, based on the interviews she conducted. Further, Kalman (1976) noted that "[s]tate administrators can take the information gleaned from interviewing local staff in their natural environment and make broad changes that can improve the quality of programs statewide" (17). Critiques from the positivist school of evaluation, which posits that an objective reality does exist and can be understood by isolating variables, would suggest that the responsive approach is not scientifically rigorous enough. However, responsive evaluator Yoland Wadsworth (2002) noted that the approach takes into account "ever-broader cultural, organizational, physical, social, political, and economic environment contexts, as well as ever-widening circles of stakeholders..." (51). As such, the opportunity to incorporate traditional qualitative methods is useful in this type of research (see, for example, Patton, 1990) and can provide more validity for research findings for both the positivist and nonpositivist schools.

Implications for Future Research

The literature surveyed here did not discuss in detail the methods evaluators used to analyze data. For example, Howe (1978) stated that she identified themes, but did not articulate the way in which these themes emerged from the data.
Although the articles adequately discussed the philosophy underlying their evaluation preference (namely, that it was constructivist), it is unknown whether or not evaluators employed this same philosophy during analysis. For the responsive approach to be holistically constructivist, analytical methods should reflect the constructivist philosophy. Knowing more about the analysis of data in the responsive approach would allow for further academic inquiry and knowledge generation. Responsive evaluation seemingly deviates from basic academic research. Whereas basic research "advances fundamental knowledge about the social world. It focuses on refuting or supporting theories that explain how things happen, why social relations are a certain way, and why society changes" (Neuman, 2006, 24), applied research entails conducting a study to identify a specific concern or to offer a solution to a problem (Neuman, 2006). Incorporating a basic qualitative research design into (or alongside) the responsive evaluation and analyzing data with methods from the nonpositivist school (such as grounded theory) allows for knowledge generation which suits academic standards and also aligns with the approach's constructivist philosophy. Sorcinelli, Parsons, and Halpern (1984) noted that the responsive approach indeed requires skill and does not simply imply that the evaluator "hangs out" with stakeholders. Rather, disciplinary skills learned from academia can be, and should be, incorporated into the study. If the literature surveyed here had been more explicit about the way in which data were collected and analyzed, it would have presented a stronger case to skeptics that the results of such research are valid. This paper argues that a responsive approach can be both beneficial to a program and contribute to a better understanding of the social world. A strength of responsive evaluation is that it brings one into

contact with a diverse range of individuals who are interacting with the same phenomenon. Responsive evaluation gives the researcher or evaluator the opportunity to generate knowledge as well as evaluate a program. Indeed, the methods employed may be drawn from the nonpositivist tradition, such as hermeneutics and other phenomenological approaches, but knowledge production can nevertheless be achieved that provides answers about the social world beyond the scope of the program.

About the Author

Bobby Thomas Cameron is a graduate of Ryerson University's Master of Arts in Public Policy and Administration program in Toronto, Canada. He is currently working as a Policy Analyst at the intersection of health and management in Prince Edward Island, Canada. His website is here: http://www.bobbycameron.net.

References

Abma, T. (2000). Responding to ambiguity, responding to change: the value of a responsive approach to evaluation. Evaluation and Program Planning, 23(4), 461-470.

Abma, T. (2006). The practice and politics of responsive evaluation. American Journal of Evaluation, 27(1), 31-43.

Abma, T., & Stake, R. (2001). Stake's responsive evaluation: core ideas and evolution. New Directions for Evaluation, 92, 7-21.

Finne, H., Levin, M., & Nilssen, T. (1995). Trailing research. Evaluation, 1(1), 11-31.

Howe, C. (1978). A responsive evaluation of a graduate seminar. Studies in Educational Evaluation, 4(1), 19-23.

Kalman, M. (1976). Use of responsive evaluation in statewide program evaluation. Studies in Educational Evaluation, 2(1), 9-18.

Klintberg, I. (1976). A responsive evaluation of two programs in medical education. Studies in Educational Evaluation, 2(1), 23-30.

Madaus, G., & Stufflebeam, D. (2000). Program evaluation: a historical overview. In D. L. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation Models (pp. 3-18). Boston: Kluwer Academic Publishers.

Neuman, W. (2006). Social research methods. Boston: Pearson.

Patton, M. (1990). Qualitative evaluation and research methods. California: Sage Publications.

Rakel, R. (1976). A summary: responsive evaluation and family practice. Studies in Educational Evaluation, 2(1), 35-36.

Sorcinelli, G., Parsons, M., & Halpern, E. (1984). Naturalistic responsive evaluation: a new methodology for evaluating health and safety in education. Lifelong Learning, 8(1), 4-6.

Spiegel, A., Bruning, R., & Giddings, L. (1999). Using responsive evaluation to evaluate a professional conference. American Journal of Evaluation, 20(1), 57-67.

Stake, R. (1972). Responsive evaluation. U.S. Department of Health, Education, and Welfare.

Stake, R. (1976). A theoretical statement of responsive evaluation. Studies in Educational Evaluation, 2(1), 19-22.

Stake, R. (2004). Standards-based & responsive evaluation. California: Sage Publications.

Stronach, I. (2001). The changing face of responsive evaluation: a postmodern rejoinder. New Directions for Evaluation, 92, 59-72.

Wadsworth, Y. (2002). Becoming responsive and some consequences for evaluation as dialogue across distance. New Directions for Evaluation, 92, 45-58.