2006 Pacific Crest
There are certain values that we, as educators, strive to support:
Process Education
Student Success
Undergraduate Research
Service Learning
Active Learning
Learning College
Commitment to Excellence
Student-Centeredness
Any process we undertake or goal we strive to meet MUST take place WITHIN the context of these values.
This is our LEARNING ENVIRONMENT.
And because all we do is in support of these values, we engage in specific processes:
Instructional Design
Assessment
Teaching
Mentoring
Facilitation
Learning
Evaluation
Communities of Practice
All of this is part of an Enriched Learning Environment.
What is our ultimate goal within this Enriched Learning Environment?
EMPOWERMENT
EMPOWERMENT for Students
EMPOWERMENT for Faculty
EMPOWERMENT for the Institution.
How do we get there from here?
First, we must recognize that as educators, we all perform in multiple roles, each of which allows us to work towards empowerment in different ways:
In the role of: We work towards empowerment through:
So where are we now?
Something interesting happens on the way to empowerment:
There is:
But there's more to these five roles; they aren't independent of one another!
Which brings us full-circle and gives us a dynamic concept map showing how we, as educators, can best perform in our various roles within an enriched learning environment, to the betterment of the institution, our students and ourselves.
20 Years in Higher Education Consulting
one of the important practices for increasing student success.
Evaluation determines whether a standard was met: success or failure.
Assessment provides feedback on performance: Strengths, Areas for Improvement, and Insights (SII).
The same measured performance can be judged either way: using Standard 1, the measured performance was a failure; using Standard 2, it was a success.
[Figure: a high-to-low measurement scale showing one measured performance falling below Standard 1 but above Standard 2.]
The assessor provides feedback on what the strengths of the performance were and why, feedback on how the performance can be improved, and clarification and meaning through insights about the performance.
(S)trength: what was strong in the performance and how it was produced
(I)mprovement: what can be made better in the performance and how to produce it
(I)nsight: what was learned from assessing the performance and why it is significant
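The contrast above — the same measured performance failing Standard 1 but passing Standard 2, while SII feedback stays the same — can be sketched in code. This is a minimal illustrative model, not Pacific Crest's instrumentation; the function names and the SII placeholder strings are made up.

```python
# Illustrative sketch: evaluation judges one measurement against a
# standard, so the verdict depends on which standard is applied;
# assessment returns SII feedback aimed at improving future performance.

def evaluate(measurement: float, standard: float) -> str:
    """Evaluation: compare the measured performance to a standard."""
    return "success" if measurement >= standard else "failure"

def assess(measurement: float) -> dict:
    """Assessment: SII feedback (placeholder text; a real assessor
    would write performance-specific observations here)."""
    return {
        "strengths": "what was strong in the performance and why",
        "improvements": "how the performance can be improved",
        "insights": "what was learned from the performance and why it matters",
    }

performance = 70.0
print(evaluate(performance, standard=80.0))  # failure under Standard 1
print(evaluate(performance, standard=60.0))  # success under Standard 2
print(sorted(assess(performance)))           # the three SII categories
```

Note that only the evaluation verdict changes with the standard; the assessment feedback is standard-independent, which is exactly the distinction the figure draws.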
Key things everyone should know:
Quality assessment or evaluation needs quality performance measures
Assessment is used more by quality performers than by low performers.
Self-growers use self-assessment to self-mentor.
Mentors help to produce self-growers by assessing their self-assessments.
Program assessment is the systematic process of integrating assessment and evaluation to continuously improve program quality, and of documenting for others that the program exceeds the expectations of key stakeholders.
Finally, the most important aspect of implementing quality assessment and self-assessment is shifting from an evaluation mindset to an assessment mindset.
Classification of Learning Skills: Cognitive, Social, and Affective
Conference for Research-Based Practitioners in PE, Elmhurst College, June 28-29, 2004
Steven Beyerlein and Dan Cordon, University of Idaho; Cy Leise, Bellevue University; Denny Davis, Washington State University; Wendy Duncan-Hewitt, Auburn University; Ann Hall, Sinclair Community College; Daniel Apple, Pacific Crest
Introduction
Goal for Higher Education: develop a comprehensive set of life-long learning skills in the cognitive, social, and affective domains
Challenges: learning skill development transcends course boundaries (SCANS 1991); a shared language is needed to promote learning skill development across the curriculum
Research Needs: provide specificity in learning skill definitions so they can be a useful tool for daily teaching/learning; demonstrate effective use of cognitive, social, and affective domain learning skills in curriculum design, delivery, and assessment
Assumptions about Learning
Learning involves multiple types of knowledge that separate into different domains, i.e., cognitive, social, and affective (Bransford et al., 2000)
Learning involves measurable, transferable skills that underlie performance across a spectrum of disciplines (Dewey, 1936)
Skill development, like subject matter mastery, can be planned, cultivated, and assessed (Anderson and Krathwohl, 2001)
It is best to focus on a small skill set at one time (Covey, 1989)
Domains within the Classification
Cognitive Domain: learning skills primarily related to mental (thinking) processes
Social Domain: learning skills primarily related to interpersonal interaction
Affective Domain: learning skills primarily related to emotional and value development
Learning Skills
Definition: discrete entities embedded in everyday behavior; they operate in conjunction with specialized knowledge and can be consciously improved and refined
Examples (from the Cognitive Domain):
Skimming - inventorying using key prompts
Simplifying - representing only primary features
Generalizing - transferring knowledge to new contexts
Defining the problem - articulating a problem and the need for solutions
Selecting tools - finding methods to facilitate solution
Stating research questions - asking empirically answerable questions
Rubric for Skill Development (example skill: Listening)
Transformative Use - purposefully listens and observes nuances and contextual details that deepen understanding of information and its application to a clearly stated need
Self-Reflective Use - carefully listens and reflects on success to gain maximum understanding relevant to a specific need
Consistent Performance - carefully listens to understand key points useful to address a specific need
Conscious Use - actively listens; identifies information thought important to a general need
Non-Conscious Use - passively listens; notes only information highlighted by others
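The five rubric levels form an ordered scale, so a numeric rating can be mapped to a level name. A minimal sketch follows; only the level names come from the rubric above, and the `level_of` helper is hypothetical.

```python
# Five rubric levels, ordered from lowest (1) to highest (5).
LEVELS = [
    "Non-Conscious Use",
    "Conscious Use",
    "Consistent Performance",
    "Self-Reflective Use",
    "Transformative Use",
]

def level_of(rating: int) -> str:
    """Map a 1-5 rating to its rubric level name."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    return LEVELS[rating - 1]

print(level_of(3))  # Consistent Performance
```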
Organization in Skill Clusters within Process Areas
Generic structure: a Process Area (A) contains Skill Clusters (A1, A2, ...), each of which contains individual Skills (A1a, A1b, A1c, ...)
Example - Processing Information:
Collecting Data: Observing, Listening, Skimming, Memorizing, Recording, Measuring
Generating Data: Predicting, Estimating, Experimenting, Brainstorming
Organizing Data: ...
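The Process Area, Skill Cluster, Skill hierarchy lends itself to a nested mapping. This sketch uses the "Processing Information" example from the slide; the `skills_in` helper is a hypothetical convenience for flattening one area.

```python
# Nested mapping: process area -> skill cluster -> list of skills.
classification = {
    "Processing Information": {
        "Collecting Data": ["Observing", "Listening", "Skimming",
                            "Memorizing", "Recording", "Measuring"],
        "Generating Data": ["Predicting", "Estimating",
                            "Experimenting", "Brainstorming"],
    },
}

def skills_in(area: str) -> list:
    """Flatten every skill under one process area."""
    return [skill
            for cluster in classification[area].values()
            for skill in cluster]

print(len(skills_in("Processing Information")))  # 10
```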
Classroom Implementation
Tips for Effective Use:
Select a small set of skills
Relate skills to process areas emphasized in the course
Assess the present level of skill development
Make interventions to elevate skills to the desired level
Promote metacognition of skill development
Areas of Application: activity design, facilitation planning, assessment of learning outcomes
Insights about the Classification
The Classification of Learning Skills supports:
Instructional design for skill development
Facilitation for learner growth
Measurement and documentation of growth
The Classification is a tool to help:
Identify key skills associated with a course
Elevate specific skills that limit performance in important disciplinary processes
Balance attention to content and process in the design and delivery of learning experiences
Promote metacognition among learners about their personal and professional development
Distinguishing and Elevating Levels of Learning in Engineering and Technology Instruction Daniel K. Apple, Kip P. Nygren, Molly W. Williams, and Daniel M. Litynski
Overview of Presentation
Outline of the Problem
Bloom's Taxonomy
Characteristics of Level 3 Knowledge
Characteristics of Level 4 Knowledge
Problem-Solving Capacity versus Complexity
Issues in Growing Problem-Solving Capacity
Techniques to Grow Capacity
Issues in Building Higher Levels of Learning
Making sure knowledge skill has been developed, i.e., transferable knowledge
Producing the appropriate level of problem difficulty for current capability
Dealing with diversity among the students' skill levels
Strengthening the problem-solving process
Allocating time for doing problem solving
Bloom's Taxonomy and PE
Level | Bloom's Taxonomy | Alternate Term | Process Skill Area
1 | Knowledge/facts | Information base | Information processing
2 | Comprehension | Knowledge | Critical thinking
3 | Application | Knowledge skill | Higher-order critical thinking
4 | Analysis | Problem solution | Problem solving
5 | Synthesis | New knowledge | Research
6 | Evaluation | Peer-reviewed knowledge | Assessment
Characteristics of Level 3 Knowledge
The learner has the skill to apply or transfer knowledge to new, unpracticed contexts
The learner has generalized ways to use the knowledge while recognizing its limitations and constraints
The learner can recognize situations in which to apply the knowledge appropriately
The learner has the ability to teach others
Learners know they know, versus think they know
Characteristics of Level 4 Knowledge
The ability to integrate new knowledge skill with prior knowledge to produce new problem solutions
The ability to combine knowledge skills with transferable performance skills like critical thinking, communication, and teamwork to produce quality problem solutions
The use of working expertise to generalize solutions that can be reused in new contexts
[Figure: Effect of capacity and complexity on problem solving. As problem complexity grows from simple to difficult relative to the learner's problem-solving capacity, the activity shifts from automatic skill, to exercise, to problem solving, to research, and finally becomes overwhelming.]
[Figure: Effect of capacity and context on problem solving. As the problem context moves from familiar to new relative to the learner's problem-solving capacity, the activity shifts through the same levels: automatic skill, exercise, problem solving, research, overwhelming.]
Issues in Growing Problem-Solving Capacity in Students
Having solid Level 2 knowledge before building transferable knowledge
Making sure students' knowledge is transferable before doing problem solving
Selecting the degree of problem difficulty based upon problem-solving capacity
Determining problem-solving capacity
Diversity in problem-solving capacity
Techniques to Grow Capacity
Exercises to generalize understanding
Categorize problems and develop a rating scale
Develop an index of problem-solving level
Use a problem-solving methodology
Use self-assessment of problem solving
Practice generalizing a problem solution
Grow supporting holistic skills
Research Questions
What are the key factors contributing to problem complexity, and what are their relative levels of contribution?
What are the key ingredients of problem-solving capacity, and what are their relative levels of contribution?
How can faculty be trained to measure levels of learning so that problem-solving efforts become more effective?
Conclusions
As faculty improve their understanding and measurement of levels of learning, learning will become more effective and efficient
Problem-solving capacity can increase when faculty use appropriate problems at the appropriate time in students' development
The integrated performance desired by ABET 2000 can progress faster through quality problem-solving experiences
Overview of Instructional Design: Program Design for QEP
Analysis: Learning-Outcome-Driven Instructional Design
Step 1: Identify professional behaviors (for students)
Step 2: Identify program intentions
Step 3: Construct measurable learning outcomes
Step 4: Construct a meta-knowledge table
Design: Activities and Knowledge to Support Learning Outcomes
Step 5: Choose themes
Step 6: Create the appropriate methodologies
Step 7: Identify a set of experiences
Step 8: Identify a set of specific learning skills for the program
Development: Construction and Selection
Step 9: Identify experience preference types
Step 10: Match the experience types with the chosen experiences
Step 11: Choose the formal course and extracurricular experiences
Step 12: Allocate time across the themes
Step 13: Sequence the experiences across the program
Development (continued)
Step 14: Create individual experiences from the prioritized list
Step 15: Enhance experiences by using technology
Step 16: Ask peers to review the experiences you create
Step 17: Produce key performance criteria for learners
Step 18: Locate or build key performance measures
Implementation
Provide professional development for the instructors to implement the design effectively
Evaluation and Assessment: Instruction that Learns from Itself
Step 19: Design a program assessment system
Step 20: Design a program evaluation system
Step 21: Design a program description and schedule
Summary of Key Steps for Accreditation
Step 1: Identify professional behaviors
Step 3: Construct measurable learning outcomes
Step 8: Identify a set of specific learning skills for the program
Step 17: Produce key performance criteria for learners
Step 18: Locate or build key performance measures
Step 19: Design a program assessment system
Thank you. 2005 Pacific Crest
Creating and Using a Performance Measure for the Design Process
Andy Kline, Chemical Engineering - WMU; Edmund Tsang, Mechanical Engineering - WMU; Betsy Aller, Industrial Engineering - WMU; Johnson Asumadu, Electrical Engineering - WMU; Jim Morgan, Civil Engineering - TAMU; Steven Beyerlein, Mechanical Engineering - UI; Denny Davis, Biological Engineering - WSU
ASEE 2003 Annual Meeting, Nashville, TN
NEED: Shared Framework for Discussing and Measuring Development of Design Skills
[Figure: a freshman-through-senior timeline running from Intro to Engineering and Design, through discipline-specific courses (including some projects), to in-depth analysis and design, progressing from foundational team design capabilities to high-level design and professional capabilities.]
NEED: Integrated Measures (TIDEE: Transferable Integrated Design Engineering Education)
Personal Profiles and matching Educational Goals:
Project Engineer (BS graduate + 5 years) - Long-term Educational Objective: professional-level knowledge, skills, and attitudes (technical & leadership)
Entry-Level Engineer (BS graduate) - Program Educational Outcomes: productive & emergent knowledge, skills, and attitudes (engineering, social, business)
Engineering Intern (associate degree graduate) - Mid-Program Outcomes: foundational knowledge, skills, and attitudes (technical & non-technical)
Engineering Recruit (high school graduate) - Qualifications for Entry: interests & aptitudes (math, science, technology, social)
METHOD: Shared Language (TIDEE)
Performance - individual/team actions that can be measured, monitored, and improved over time
Performance Criteria - expected behaviors associated with a particular group of performers
Performance Task - integrated learning challenge that provides opportunities to demonstrate core knowledge, skills, and attitudes
Performance Factor - element of performance which can be directly observed and measured
Performance Measure - instrument that integrates relevant data for the purpose of rating performance
Methodology for Creating a Performance Measure for a Process Area
1) Form a team with diverse training and perspectives
2) Recruit a facilitator familiar with the process area
3) Define the boundaries of the skill set for the process area
4) Analyze expert behavior in the process area
5) Identify key sources of variability in performance
6) Craft a holistic rubric from novice to expert
7) Propose scales for measuring key factors
8) Test by reflecting on a variety of performances
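Step 7's factor scales ultimately get combined into a single rating. Here is a minimal sketch, assuming 1-5 factor scores and relative weights; the factor names and weights are illustrative, not taken from the TIDEE measure itself.

```python
# Weighted combination of per-factor scores into one overall rating.
def rate_performance(scores: dict, weights: dict) -> float:
    """Combine per-factor 1-5 scores into a weighted average rating."""
    total_weight = sum(weights.values())
    return sum(scores[f] * w for f, w in weights.items()) / total_weight

# Hypothetical design-process factors and weights.
scores = {"problem_definition": 4, "concept_generation": 3, "solution_evaluation": 5}
weights = {"problem_definition": 2, "concept_generation": 1, "solution_evaluation": 1}
print(rate_performance(scores, weights))  # 4.0
```

Keeping the factors and weights as explicit data makes the measure easy to revise after step 8's testing reveals which factors actually drive variability.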
TESTING: Three Contexts (TIDEE)
Measurement is the foundation for quality assessment as well as quality evaluation.
Assessment is low-stakes, friendly feedback: it measures knowledge, skill, or ability in order to improve future performance, and gives the assessee an indication of strengths, areas for improvement, and ways to improve.
Evaluation is high-stakes, judgmental feedback: it judges the merit or worth of something against a set of standards, and produces a grade or score that becomes part of a permanent, public record.
RESULT: Broad Applicability (TIDEE)
[Figure: assessment instruments provide formative feedback that links performance tasks and performance criteria to course outcomes, program outcomes, survey outcomes, FE Exam outcomes, and the engineer profile; evaluation instruments produce student grades and ABET documentation.]