COMMONLY USED ASSESSMENT METHODS AND TOOLS Brooklyn College Assessment Day Spring 2017

Derived from Goals and Outcomes. Many factors dictate what kind of evidence will be gathered in an assessment effort: time constraints, budgets, sampling limitations, best practices, etc. But the primary factor should always be the outcomes being investigated. Evidence is only as good as the questions that generate it, and questions should flow from goals and outcomes.

Academic vs Administrative. While there will often be overlap in the tools used in academic and administrative departments, the latter often have less obvious choices for what data to collect. Most obviously, academic departments already regularly collect student artifacts and assign them grades, an advantage administrative units don't have.

Direct vs Indirect - What's the difference? Direct assessment attempts to measure progress by examining evidence of success directly, such as through counting metrics or percentages. Common types of direct assessment include counting the number of constituents helped or services performed and adjudicating whether performance goals have been met. Indirect assessment attempts to better understand the attitudes and observations of people involved in the educational process by soliciting their feedback in a variety of forms. Common types of indirect assessment include focus groups, surveys, and testimonials.

We Should Gather As Many Types of Evidence as We Reasonably Can. Colleges and universities have a variety of functions, and we should use various means to assess them. Student learning and growth occur in many different facets, and not all of them are easily investigated with direct measures. The more that we rely on individual assessment metrics, the more that we are vulnerable to their various limitations and biases.

Direct and Indirect Assessments Can Work in Concert With Each Other. Direct assessments provide evidence for whether grades and graduation rates are reflections of actual student growth. Direct assessments can guide pedagogical and administrative decisions by identifying areas of strength and areas of need. Drawback: direct assessment can be reductive and prompt teaching to the test. Indirect assessments help to make our institutions more ethical, more fair, and more humane by treating student and instructor attitudes as an essential guide to our practices. Indirect assessments can fill in the gaps, helping us to see important dynamics that might otherwise get lost in the data. Drawback: they can contribute to a service philosophy, where students are customers to be placated.

Academic Institutional Level Direct vs Indirect. The Collegiate Learning Assessment Plus: a standardized test of student learning, developed by the Council for Aid to Education. It includes a written Performance Task designed to assess various academic strengths, such as quantitative reasoning and critical reading, in concert, as well as a multiple-choice section. It provides individual student scores, institutional averages, and scores for student growth over time. The Gallup-Purdue Index: a large-scale survey that solicits self-reported information from college graduates at regular intervals after their graduation. Survey questions ask about economic outcomes like employment and income, but also about life satisfaction and satisfaction with the college experience.

Administrative Institutional Level Direct vs Indirect. Evaluating the Strategic Plan: specific evidence-based benchmarks are developed to ascertain how well the Strategic Plan is being implemented. Data is presented in such a way that progress can be measured in subsequent evaluations. Constituent Satisfaction Survey: a large-scale survey that solicits self-reported information from college stakeholders such as students, faculty, administrators, and staff. Survey questions ask about quality of service, accessibility of service, timeliness of service, etc.

Academic Departmental Level Direct & Indirect. Rating of Student Essays: a representative sample of students in a given major produce essays at the beginning and end of the semester. Essays are rated by a faculty committee based on faculty-defined standards of success. Comparisons between pre- and post-essay ratings can help establish whether students are learning essential course material. Student Focus Groups: a convenience sample of students willing to participate in focus groups is gathered. An assessment coordinator guides students through a semi-structured focus group designed to solicit information about issues such as clarity of learning goals, accessibility of resources, and student satisfaction with technology. Representative quotes from students can help deepen understanding of real-world student attitudes.
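To illustrate the pre/post essay comparison described above, here is a minimal sketch of how average rating gains might be computed; the scores and the rubric scale are purely hypothetical, and a spreadsheet would serve equally well:

from statistics import mean

# Hypothetical paired rubric scores (1-5 scale) for the same students,
# rated at the beginning (pre) and end (post) of the semester.
pre_ratings = [2, 3, 2, 4, 3, 2, 3]
post_ratings = [3, 4, 3, 4, 4, 3, 4]

# Per-student gain, pairing each pre score with its post score.
gains = [post - pre for pre, post in zip(pre_ratings, post_ratings)]

print(f"Mean pre rating: {mean(pre_ratings):.2f}")
print(f"Mean post rating: {mean(post_ratings):.2f}")
print(f"Mean gain: {mean(gains):.2f}")
print(f"Students who improved: {sum(g > 0 for g in gains)} of {len(gains)}")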

Administrative Departmental Level Direct & Indirect. Document Review: key documents used in the ordinary workflow of a given unit are analyzed for efficiency. The analysis can be highly structured, semi-structured, or freeform. Potential conflicts between the intended uses of documents and their actual use in day-to-day operations can be identified. Constituent Focus Groups: a convenience sample of constituents willing to participate in a focus group is gathered. An assessment coordinator guides constituents through a semi-structured focus group designed to solicit information about issues such as quality of service, accessibility of resources, and ease of scheduling appointments. Representative quotes from constituents can help deepen understanding of real-world attitudes.

Counting. Simply tracking numbers (of constituents served, of reports filed, of hours worked, of goals achieved, etc.) can function as an important form of assessment. Pros: numbers are often the coin of the realm in administrative contexts, counting is usually low-resource or no-resource and merely involves pulling numbers generated in ordinary operations. Cons: raw numbers can rarely convey necessary details and complications such as the context of a given service, quantitative data can create a false sense of objectivity or certainty.

Surveys, Externally Developed. Survey instruments developed externally to the institution. Can be one-size-fits-all or commissioned specifically for one institution. Pros: much of the hard work is farmed out, survey instruments have (presumably) been validated, offer rare breadth of information. Cons: external surveys can be expensive, can lack the kind of fine-grained and contextual questions of internally developed surveys, getting an adequate response rate can be difficult.

Surveys, Internally Developed. Survey instruments developed within the institution, often using software like SurveyMonkey or Qualtrics. Can address specific institutional questions. Pros: questions can be targeted to find out specific information of interest, highly flexible in their format and administration, can provide more generalizable data than most assessment tools. Cons: developing a high-quality survey instrument is a major undertaking, institutions typically lack the means to validate such surveys, sample sizes are frequently disappointing.

Focus Groups: a group of people whose reactions are studied in guided or open discussions to determine the reactions that can be expected from a larger population. Can be highly structured, semi-structured, or freeform. Pros: inexpensive, easy to interpret, provides deep insight into a given unit's performance. Cons: inability to generalize responsibly, can be dominated by strong personalities/extreme opinions.

Observations, Internal. A stakeholder in the institution, whether in the given unit or not, observes parts of normal operations in that unit and provides feedback. The observation can be highly structured, proceeding according to prewritten questions, or freeform. Pros: inexpensive or free to implement, can provide deeper insight than possible through other means, highly adaptable to different contexts and purposes. Cons: often appears invasive to those observed, observed subjects may change their behavior due to observation, may not be generalizable.

Observations, External. An expert in a given unit's area or subject is invited to campus, observes parts of normal operations in that unit, and provides feedback. The observation can be highly structured, proceeding according to prewritten questions, or freeform. Pros: can provide deep insight into a given unit's operations, provides a fresh set of eyes that can see dynamics those internal to the institution might miss. Cons: often considered even more invasive than internal observation, may prompt observed subjects to change their behavior while observed, observer will typically require travel funds and/or lodging.

Time Use Diaries. Personnel in a given unit are tasked with keeping a diary of how they spend their time at work for a given period, typically ranging from one day to one week. Pros: deliberately tracking how time is used often shows surprising patterns, can prompt more efficient and effective use of time by constituents, free to implement. Cons: participants must be willing to track how they use their time, the time period covered by the diary may not necessarily reflect a given subject's typical use of time.

Interviews. One-on-one interviews of different stakeholders in a given unit, whether those that work in that unit or the constituents that the unit serves. May be highly structured, semi-structured, or freeform. Pros: rich qualitative data can be generated about a unit's practices, requires little time or resources. Cons: participants may not be representative of the unit as a whole.

Acknowledge and Accept Real-World Constraints. We'll always face pragmatic limitations in what we can do, so use the tools that are most practically useful and realistic for your context. Assess what you can, where you are, with what you have.

Thanks! Questions? Fredrik deBoer, Academic Assessment Manager, fredrik.deboer@brooklyn.cuny.edu, 718-951-5280, Boylan 1216