Navigating a pathway between the academic standards and a framework for authentic, collaborative, outcomes-focused thinking in engineering education

Assessing student learning against the Engineering Accreditation Competency Standards: A practical approach

Workshop Program and Workbook

The University of Adelaide, Adelaide, Australia, October 2012

Assessing student learning against the Engineering Accreditation Competency Standards: A practical approach

Workshop Table of contents

ALTC National Teaching Fellowship
Workshop Summary and Objectives
Workshop Program
1. Developing an Assessment Design Process
1.1 Guiding Questions for Assessment Design
1.2 My Assessment Design
2. Assessment methods for Stage 1 Competency Standards
2.1 Exercise 1: Knowledge and Skill Base
2.2 Exercise 2: Engineering Application Ability
2.3 Exercise 3: Professional and Personal Attributes
3. SMART Planning
3.1 The SMART Planning Matrix, an example
3.2 My SMART Plan
Appendix A
Stage 1 Competency Standards for Professional Engineer
Resources
Assessment Design
Creating Rubrics
Collecting Evidence of Learning
Use of Evidence in Learning
Classroom Assessment Techniques (CATs)
WORKSHOP EVALUATION

ALTC National Teaching Fellowship 1

Navigating a pathway between the academic standards and a framework for authentic, collaborative, outcomes-focused thinking in engineering education

Program summary (from the fellowship nomination)

The pending roll-out of Threshold Learning Outcomes (TLOs) devised by the National Learning and Teaching Academic Standards project will significantly impact curriculum design, pedagogy and assessment in all Australian tertiary institutions. By developing a transferable framework for collaborative, outcomes-focused thinking, this Fellowship program will enable academic staff to constructively engage with the imperative for universities to enhance student learning outcomes.

First, during a Fellow-in-residence program at five Australian universities, an action research approach will be used to support engineering academics in designing and implementing assessment tasks that provide evidence of students' attainment of learning outcomes. This will be informed by the outcomes of an overseas study component to identify and evaluate relevant international initiatives for effective teacher engagement and curriculum reform, in the light of research findings and in the context described above.

Second, the Fellow will work as a collaborator and change agent with Heads of Schools and academic leaders, including at least one course coordinator, at five different types of Australian universities to support academics' effective teaching and assessment practices. The Director of the Learning and Teaching Development Unit in each institution will be informed of the Fellow-in-residence program and will be included in the dissemination and feedback cycle as a conduit to the broader institution and the Council of Australian Directors of Academic Development (CADAD).

The major outcomes of the program will be a shift to an authentic, outcomes-focused approach to teaching, demonstrated in the design and evaluation of assessment tasks, and a set of guides and resources for mentoring, with an emphasis on supporting early- and mid-career academics.

This workshop is part of the fellowship engagement and dissemination program of activities. Further information on the Fellowship program may be obtained by email from: w.boles@qut.edu.au or h.beck@qut.edu.au

1 Support for this fellowship has been provided by the Australian Government Office for Learning and Teaching (formerly the Australian Learning and Teaching Council, ALTC). The views expressed in this workbook do not necessarily reflect the views of the Australian Government Office for Learning and Teaching.

Workshop Summary and Objectives

Assuring the quality and standards of the learning experience and outcomes for engineering students has national and international significance. The development of academic standards for higher education in Australia is consistent with this renewed drive for excellence. It is therefore critical that high standards are maintained and that graduates are able to demonstrate their knowledge, skills, abilities and attitudes. This affects not only the design of engineering programs, but also the preparation of academics to engage with these standards and implement them through their day-to-day teaching practice. It is therefore important to design assessments capable of providing concrete, observable and measurable evidence of student learning.

This workshop aims to engage participants in a systematic process for designing assessment tasks targeting specific learning outcomes. This will be reinforced by practical approaches for assessing students' learning against the Stage 1 Competency Standards. Resources will be suggested for the different elements of the Stage 1 Competency Standard for Professional Engineer 2. At the workshop, after each exercise in which participants have considered their options for gathering evidence related to the different elements of the Stage 1 Competency Standards, the workshop facilitators might offer a page with suggested resources. Participants will leave with a draft assessment plan for their engineering programs.

Aims: The workshop aims to assist academics to:
Develop a systematic assessment design process,
Examine academic standards and evidence of learning,
Devise practical approaches for assessing students against the Stage 1 Competency Standards, and
Describe an assessment plan for engineering programs.

Outcomes:
Hands-on experience in systematically designing assessment items capable of providing evidence of student learning and of meeting standards.
Practical ways of implementing appropriate assessment tasks.

Participants will be provided with a workbook containing activities and resources for use during and beyond the workshop.

2 See Appendix A for details.

Assessing student learning against the Engineering Accreditation Competency Standards: A practical approach

Workshop Program

Part 1
Introduction: Context, intended outcomes, and outline.
Presentation: Assessment, what is it good for? Assessment design process.
Group work: Starting from program and subject objectives, explore and share ideas about a process for designing evidence-based assessment. The What, Why and How of assessment.
Reporting: Group reporting.

Break

Part 2
Presentation: Assessment methods for Stage 1 Competency Standards.
Group work: Exercises 1, 2 and 3.
Reporting: Different groups report after each exercise.
Take-away messages/actions: SMART planning. Participants nominate their own personal actions as a result of the workshop. Complete and return feedback forms.

Workshop Facilitators:
Professor Wageeh Boles, ALTC National Teaching Fellow, Electrical Engineering and Computer Science School, Science and Engineering Faculty, QUT, Brisbane, Australia
Professor Jeff Froyd, Director of Academic Development in the Dwight Look College of Engineering, Project Director for the NSF Foundation Coalition, Texas A&M University, USA

1. Developing an Assessment Design Process

In Assessment for Learning, teachers use assessment as an investigative tool to find out as much as they can about what their students know and can do, and what confusions, preconceptions, or gaps they might have. The wide variety of information that teachers collect about students' learning processes provides the basis for determining what they need to do next to move student learning forward. It provides the basis for providing descriptive feedback for students and deciding on groupings, instructional strategies, and resources. Rethinking Classroom Assessment with Purpose in Mind, page 29.

Assessment of Learning is the assessment that becomes public and results in statements or symbols about how well students are learning. It often contributes to pivotal decisions that will affect students' futures. It is important, then, that the underlying logic and measurement of assessment of learning be credible and defensible. Rethinking Classroom Assessment with Purpose in Mind, page 55.

Western and Northern Canadian Protocol for Collaboration in Education, 2006

Your Task

1. Complete the worksheet given on the next page. Select one member of the group to present to the workshop attendees. Start by working individually in your own workbook, then share your ideas with the group. Think about how you might structure your teaching and assessment activities to help your students learn and make judgments about their performance. In Part 2 of this workshop, you will be asked to apply the ideas developed here to examples from the units/courses you teach, so please keep a specific assessment item in mind while you shape and share your ideas.

2. Examine the decisions you made while completing the worksheet. Did you focus on achieving students' deep understanding of big ideas?

3. Exchange your completed worksheet with another group, then compare and contrast the responses. Comparing your process with that of other individuals or groups: Are there patterns that emerge across the groups? What are they? Why have they emerged here? How will this activity enhance my assessment practices?

1.1 Guiding Questions for Assessment Design 3

Why am I assessing?
What is its main purpose? What are students given and what are they expected to achieve? What are the relationships to other course- or program-level objectives?

What assessment method should I use?
What are the most appropriate assessment instruments? Should I use a portfolio? A performance? An oral presentation? How about a checklist? Open-ended questions? An exam?

What am I assessing?
What are the main targeted knowledge, skills and attitudes? What do the students currently not know or understand, what misconceptions might they have, and what can they do or use to address this deficiency or advance towards the next level? What don't I need to assess?

How can I ensure quality in this assessment?
How will I design the rubric or assessment criteria? What is the evidence I will use to substantiate achievement levels? How is this linked to the Accreditation criteria (Stage 1 Competency Standards)?

How can I use the information from this assessment?
Where can I use the information to shape my teaching program? Is this for all students or just this one student/group? Where can I change the curriculum? My assessment practices? How will these influence decision makers?

3 This activity has been modelled on a resource provided at http://www.education.vic.gov.au/. Some questions are adapted from Rethinking Classroom Assessment with Purpose in Mind at http://www.edu.gov.mb.ca/k12/assess/wncp/index.html

1.2 My Assessment Design

Why am I assessing?

What assessment method should I use?

What am I assessing?

How can I ensure quality in this assessment?

How can I use the information from this assessment?

Comments:

2. Assessment methods for Stage 1 Competency Standards

In this part, working in groups, participants will consider a number of assessment methods and determine: possible approaches for collecting data on the elements of competency; when (year and calendar time) these would be implemented; and how the data would be analysed, interpreted, and evaluated. Prus and Johnson (1994) provided a list of methods that can be used to assess students' performance against the Stage 1 Competency Standards (Appendix A). These are shown below.

Twelve Assessment Methods (Prus & Johnson, 1994)
Commercial Norm-Referenced, Standardized Exams
Locally Developed Exams
Oral Examination
Performance Appraisals
Simulation
Written Surveys and Questionnaires
Third Party Reports
Exit Interviews and Other Interviews
Portfolios
Capstone Courses
Archival Data
Behavioral Observations

Prus, Joseph S., & Johnson, Reid. (1994). A Critical Review of Student Assessment Options. New Directions for Community Colleges, 88, 69-83.

2.1 Exercise 1: Knowledge and Skill Base

Consider the Stage 1 Competency Standards and Table 1. For each of the twelve (12) assessment methods, what would be applicable approaches for collecting data on the six (6) elements of competency related to the knowledge and skill base?

STAGE 1 COMPETENCIES: 1. KNOWLEDGE AND SKILL BASE
1. Comprehensive, theory based understanding of the underpinning natural and physical sciences and the engineering fundamentals applicable to the engineering discipline.
2. Conceptual understanding of the mathematics, numerical analysis, statistics, and computer and information sciences which underpin the engineering discipline.
3. In-depth understanding of specialist bodies of knowledge within the engineering discipline.
4. Discernment of knowledge development and research directions within the engineering discipline.
5. Knowledge of contextual factors impacting the engineering discipline.
6. Understanding of the scope, principles, norms, accountabilities and bounds of contemporary engineering practice in the specific discipline.

When (year and calendar time) would these be implemented? How would the data be analysed, interpreted, and evaluated? Who will decide on actions to be taken on the basis of the evaluation?

Table 1. Knowledge and Skill Base: Elements and Indicators (each element of competency is followed by its indicators of attainment)

1.1 Comprehensive, theory based understanding of the underpinning natural and physical sciences and the engineering fundamentals applicable to the engineering discipline.
a) Engages with the engineering discipline at a phenomenological level, applying sciences and engineering fundamentals to systematic investigation, interpretation, analysis and innovative solution of complex problems and broader aspects of engineering practice.

1.2 Conceptual understanding of the mathematics, numerical analysis, statistics, and computer and information sciences which underpin the engineering discipline.
a) Develops and fluently applies relevant investigation analysis, interpretation, assessment, characterisation, prediction, evaluation, modelling, decision making, measurement, evaluation, knowledge management and communication tools and techniques pertinent to the engineering discipline.

1.3 In-depth understanding of specialist bodies of knowledge within the engineering discipline.
a) Proficiently applies advanced technical knowledge and skills in at least one specialist practice domain of the engineering discipline.

1.4 Discernment of knowledge development and research directions within the engineering discipline.
a) Identifies and critically appraises current developments, advanced technologies, emerging issues and interdisciplinary linkages in at least one specialist practice domain of the engineering discipline.
b) Interprets and applies selected research literature to inform engineering application in at least one specialist domain of the engineering discipline.

1.5 Knowledge of contextual factors impacting the engineering discipline.
a) Identifies and understands the interactions between engineering systems and people in the social, cultural, environmental, commercial, legal and political contexts in which they operate, including both the positive role of engineering in sustainable development and the potentially adverse impacts of engineering activity in the engineering discipline.
b) Is aware of the founding principles of human factors relevant to the engineering discipline.
c) Is aware of the fundamentals of business and enterprise management.
d) Identifies the structure, roles and capabilities of the engineering workforce.
e) Appreciates the issues associated with international engineering practice and global operating contexts.

1.6 Understanding of the scope, principles, norms, accountabilities and bounds of contemporary engineering practice in the engineering discipline.
a) Applies systematic principles of engineering design relevant to the engineering discipline.
b) Appreciates the basis and relevance of standards and codes of practice, as well as legislative and statutory requirements applicable to the engineering discipline.
c) Appreciates the principles of safety engineering, risk management and the health and safety responsibilities of the professional engineer, including legislative requirements applicable to the engineering discipline.
d) Appreciates the social, environmental and economic principles of sustainable engineering practice.
e) Understands the fundamental principles of engineering project management as a basis for planning, organising and managing resources.
f) Appreciates the formal structures and methodologies of systems engineering as a holistic basis for managing complexity and sustainability in engineering practice.

Exercise 1: Assessing knowledge and skill base
Commercial Norm-Referenced, Standardized Exams
Locally Developed Exams
Oral Examination
Performance Appraisals
Simulation
Written Surveys and Questionnaires
Third Party Reports
Exit Interviews and Other Interviews
Portfolios
Capstone Subjects
Archival Data
Behavioural Observations

2.2 Exercise 2: Engineering Application Ability

Consider the Stage 1 Competency Standards and Table 2. For each of the twelve (12) assessment methods, what would be applicable approaches for collecting data on the four (4) elements of competency related to engineering application ability?

STAGE 1 COMPETENCIES: 2. ENGINEERING APPLICATION ABILITY
1. Application of established engineering methods to complex engineering problem solving.
2. Fluent application of engineering techniques, tools and resources.
3. Application of systematic engineering synthesis and design processes.
4. Application of systematic approaches to the conduct and management of engineering projects.

When (year and calendar time) would these be implemented? How would the data be analysed, interpreted, and evaluated? Who will decide on actions to be taken on the basis of the evaluation?

Table 2. Engineering Application Ability: Elements and Indicators (each element of competency is followed by its indicators of attainment)

2.1 Application of established engineering methods to complex engineering problem solving.
a) Identifies, discerns and characterises salient issues, determines and analyses causes and effects, justifies and applies appropriate simplifying assumptions, predicts performance and behaviour, synthesises solution strategies and develops substantiated conclusions.
b) Ensures that all aspects of an engineering activity are soundly based on fundamental principles by diagnosing, and taking appropriate action with data, calculations, results, proposals, processes, practices, and documented information that may be ill-founded, illogical, erroneous, unreliable or unrealistic.
c) Competently addresses engineering problems involving uncertainty, ambiguity, imprecise information and wide-ranging and sometimes conflicting technical and non-technical factors.
d) Partitions problems, processes or systems into manageable elements for the purposes of analysis, modelling or design and then recombines to form a whole, with the integrity and performance of the overall system as the paramount consideration.
e) Conceptualises alternative engineering approaches and evaluates potential outcomes against appropriate criteria to justify an optimal solution choice.
f) Critically reviews and applies relevant standards and codes of practice underpinning the engineering discipline and nominated specialisations.
g) Identifies, quantifies, mitigates and manages technical, health, environmental, safety and other contextual risks associated with engineering application in the designated engineering discipline.
h) Interprets and ensures compliance with relevant legislative and statutory requirements applicable to the engineering discipline.
i) Investigates complex problems using research-based knowledge and research methods.

2.2 Fluent application of engineering techniques, tools and resources.
a) Proficiently identifies, selects and applies the materials, components, devices, systems, processes, resources, plant and equipment relevant to the engineering discipline.
b) Constructs or selects and applies from a qualitative description of a phenomenon, process, system, component or device a mathematical, physical or computational model based on fundamental scientific principles and justifiable simplifying assumptions.
c) Determines properties, performance, safe working limits, failure modes, and other inherent parameters of materials, components and systems relevant to the engineering discipline.
d) Applies a wide range of engineering tools for analysis, simulation, visualisation, synthesis and design, including assessing the accuracy and limitations of such tools, and validation of their results.
e) Applies formal systems engineering methods to address the planning and execution of complex, problem solving and engineering projects.
f) Designs and conducts experiments, analyses and interprets result data and formulates reliable conclusions.
g) Analyses sources of error in applied models and experiments; eliminates, minimises or compensates for such errors; quantifies significance of errors to any conclusions drawn.
h) Safely applies laboratory, test and experimental procedures appropriate to the engineering discipline.
i) Understands the need for systematic management of the acquisition, commissioning, operation, upgrade, monitoring and maintenance of engineering plant, facilities, equipment and systems.
j) Understands the role of quality management systems, tools and processes within a culture of continuous improvement.

2.3 Application of systematic engineering synthesis and design processes.
a) Proficiently applies technical knowledge and open-ended problem solving skills as well as appropriate tools and resources to design components, elements, systems, plant, facilities and/or processes to satisfy user requirements.
b) Addresses broad contextual constraints such as social, cultural, environmental, commercial, legal, political and human factors, as well as health, safety and sustainability imperatives as an integral part of the design process.
c) Executes and leads a whole systems design cycle approach including tasks such as:
- determining client requirements and identifying the impact of relevant contextual factors, including business planning and costing targets;
- systematically addressing sustainability criteria;
- working within projected development, production and implementation constraints;
- eliciting, scoping and documenting the required outcomes of the design task and defining acceptance criteria;
- identifying, assessing and managing technical, health and safety risks integral to the design process;
- writing engineering specifications that fully satisfy the formal requirements;
- ensuring compliance with essential engineering standards and codes of practice;
- partitioning the design task into appropriate modular, functional elements that can be separately addressed and subsequently integrated through defined interfaces;
- identifying and analysing possible design approaches and justifying an optimal approach;
- developing and completing the design using appropriate engineering principles, tools, and processes;
- integrating functional elements to form a coherent design solution;
- quantifying the materials, components, systems, equipment, facilities, engineering resources and operating arrangements needed for implementation of the solution;
- checking the design solution for each element and the integrated system against the engineering specifications;
- devising and documenting tests that will verify performance of the elements and the integrated realisation;
- prototyping/implementing the design solution and verifying performance against specification;
- documenting, commissioning and reporting the design outcome.
d) Is aware of the accountabilities of the professional engineer in relation to the design authority role.

2.4 Application of systematic approaches to the conduct and management of engineering projects.
a) Contributes to and/or manages complex engineering project activity, as a member and/or as leader of an engineering team.
b) Seeks out the requirements and associated resources and realistically assesses the scope, dimensions, scale of effort and indicative costs of a complex engineering project.
c) Accommodates relevant contextual issues into all phases of engineering project work, including the fundamentals of business planning and financial management.
d) Proficiently applies basic systems engineering and/or project management tools and processes to the planning and execution of project work, targeting the delivery of a significant outcome to a professional standard.
e) Is aware of the need to plan and quantify performance over the full life-cycle of a project, managing engineering performance within the overall implementation context.
f) Demonstrates commitment to sustainable engineering practices and achievement of sustainable outcomes in all engineering project work.

Exercise 2: Assessing Engineering Application Ability
Commercial Norm-Referenced, Standardized Exams
Locally Developed Exams
Oral Examination
Performance Appraisals
Simulation
Written Surveys and Questionnaires
Third Party Reports
Exit Interviews and Other Interviews
Portfolios
Capstone Subjects
Archival Data
Behavioural Observations

2.3 Exercise 3: Professional and Personal Attributes

Consider the Stage 1 Competency Standards and Table 3. For each of the twelve (12) assessment methods, what would be applicable approaches for collecting data on the six (6) elements of competency related to professional and personal attributes?

STAGE 1 COMPETENCIES: 3. PROFESSIONAL AND PERSONAL ATTRIBUTES
1. Ethical conduct and professional accountability.
2. Effective oral and written communication in professional and lay domains.
3. Creative, innovative and pro-active demeanour.
4. Professional use and management of information.
5. Orderly management of self, and professional conduct.
6. Effective team membership and team leadership.

When (year and calendar time) would these be implemented? How would the data be analysed, interpreted, and evaluated? Who will decide on actions to be taken on the basis of the evaluation?

Table 3. Professional and Personal Attributes: Elements and Indicators (each element of competency is followed by its indicators of attainment)

3.1 Ethical conduct and professional accountability.
a) Demonstrates commitment to uphold the Engineers Australia Code of Ethics, and established norms of professional conduct pertinent to the engineering discipline.
b) Understands the need for due-diligence in certification, compliance and risk management processes.
c) Understands the accountabilities of the professional engineer and the broader engineering team for the safety of other people and for protection of the environment.
d) Is aware of the fundamental principles of intellectual property rights and protection.

3.2 Effective oral and written communication in professional and lay domains.
a) Is proficient in listening, speaking, reading and writing English, including:
- comprehending critically and fairly the viewpoints of others;
- expressing information effectively and succinctly, issuing instruction, engaging in discussion, presenting arguments and justification, debating and negotiating - to technical and non-technical audiences and using textual, diagrammatic, pictorial and graphical media best suited to the context;
- representing an engineering position, or the engineering profession at large to the broader community;
- appreciating the impact of body language, personal behaviour and other non-verbal communication processes, as well as the fundamentals of human social behaviour and their cross-cultural differences.
b) Prepares high quality engineering documents such as progress and project reports, reports of investigations and feasibility studies, proposals, specifications, design records, drawings, technical descriptions and presentations pertinent to the engineering discipline.

3.3 Creative, innovative and pro-active demeanour.
a) Applies creative approaches to identify and develop alternative concepts, solutions and procedures, appropriately challenges engineering practices from technical and non-technical viewpoints; identifies new technological opportunities.
b) Seeks out new developments in the engineering discipline and specialisations and applies fundamental knowledge and systematic processes to evaluate and report potential.
c) Is aware of broader fields of science, engineering, technology and commerce from which new ideas and interfaces may be drawn and readily engages with professionals from these fields to exchange ideas.

3.4 Professional use and management of information.
a) Is proficient in locating and utilising information - including accessing, systematically searching, analysing, evaluating and referencing relevant published works and data; is proficient in the use of indexes, bibliographic databases and other search facilities.
b) Critically assesses the accuracy, reliability and authenticity of information.
c) Is aware of common document identification, tracking and control procedures.

3.5 Orderly management of self, and professional conduct.
a) Demonstrates commitment to critical self-review and performance evaluation against appropriate criteria as a primary means of tracking personal development needs and achievements.
b) Understands the importance of being a member of a professional and intellectual community, learning from its knowledge and standards, and contributing to their maintenance and advancement.
c) Demonstrates commitment to life-long learning and professional development.
d) Manages time and processes effectively, prioritises competing demands to achieve personal, career and organisational goals and objectives.

e) Thinks critically and applies an appropriate balance of logic and intellectual criteria to analysis, judgment and decision making.
f) Presents a professional image in all circumstances, including relations with clients, stakeholders, as well as with professional and technical colleagues across wide ranging disciplines.

3.6 Effective team membership and team leadership.
a) Understands the fundamentals of team dynamics and leadership.
b) Functions as an effective member or leader of diverse engineering teams, including those with multilevel, multi-disciplinary and multicultural dimensions.
c) Earns the trust and confidence of colleagues through competent and timely completion of tasks.
d) Recognises the value of alternative and diverse viewpoints, scholarly advice and the importance of professional networking.
e) Confidently pursues and discerns expert assistance and professional advice.
f) Takes initiative and fulfils the leadership role whilst respecting the agreed roles of others.

Exercise 3: Assessing Professional and Personal Attributes
Commercial Norm-Referenced, Standardized Exams
Locally Developed Exams
Oral Examination
Performance Appraisals
Simulation
Written Surveys and Questionnaires
Third Party Reports
Exit Interviews and Other Interviews
Portfolios
Capstone Subjects
Archival Data
Behavioural Observations

3. SMART Planning

Once you have decided what you need to do and how you may begin going about it, you need concrete plans that you can follow and track your progress by. The acronym SMART is used here to remind us that successful plans need to be Specific (so that we know exactly what we are aiming to do), Measurable (so we can specify how much of what kind of result will count as success), Achievable (so that we are not setting ourselves too hard a task), Relevant (so that we are in tune with the actual situation we are dealing with and not just pursuing a hobby or an interest) and Timely (so that we know after a given period of time whether we are on track or not).

The tool provided is intended for you to record your SMART plan. It is best used as a work-in-progress that you can return to and revise as often as necessary. If you get to your first review date and you haven't achieved what you set out for yourself, don't despair. There may be very good reasons, including a need to readjust goals or deal with more difficult barriers than you expected.

In addition, we will ask you to record the outline of your plans on a postcard and place it in a self-addressed envelope. We will post that card to you in a few weeks' time as a reminder to review your plan. You are very welcome to get in touch with the research team at that time if you would like to.

Example: The SMART Planning Matrix, e.g. introducing more visual materials 4 to improve learning outcomes.

GOAL vs OBJECTIVE: These can both work at various levels, but for the purposes of this exercise it's most useful to think of the GOAL as introducing more visual materials and the OBJECTIVES as components of that, such as replace 13 weeks of dot-point PowerPoint slides with visual stimuli. Here you should be thinking about planning for OBJECTIVES. One change may necessitate others. Make sure you build this into your planning.

4 This change was informed by reading Tufte, E. (2003) The Cognitive Style of PowerPoint. Cheshire, Conn: Graphics Press; Johnson, S. (2005) Everything Bad is Good for You. London: Allen Lane; Atkinson, C. (2005) Beyond Bullet Points. Microsoft Press, Washington.

3.1 The SMART Planning Matrix, an example

S - Specific: what change/development do you have in mind? Use concrete terms.
Example: Improve pass rates and retention in my classes.

M - Measurable: what is going to be the measure of achievement? Think about degree of (what kind of) change and numbers affected.
Example: Improved rates of passing the subject (current 70%, goal 85%). Reduced dropouts (current 10%, goal 5%).

A - Achievable: what are the helpful and restraining factors? Think of possible external factors that may have to be dealt with or activated.
Example. Helpful: What can I modify to improve retention of material? What can I modify to improve pass rates? What are students expected to do in the course, e.g., on exams, on papers? What are aspects of the course with which students struggle significantly? What aspects of the course can I change? What aspects of student behaviour should be modified? What can I do to encourage these modifications? What are student expectations at the beginning of the course?
Restraining factors: How much time will I invest to make changes? Time: need to delay work on chapter 5 of book until slides are ready. Student expectations: must manage through information in the course outline, on the web, etc.

R - Relevant: is it congruent with what is known about the existing situation? What does the literature tell us can be achieved in this area and how?
Example. Relevance: In what ways can the course be made more relevant to students taking the course? What have other faculty members tried in similar courses?
Progress: How can I collect evidence throughout the term/semester about the progress students are making toward learning the material? Classroom Assessment Techniques can help. How can I provide feedback to students about their progress in learning?

T - Timely: what will be the timeframe for measurement? (You may have intermediate goals as well as final ones.)
Example: If part of the action is improving lecture slides, the timeline could be: By Jan 31, 8 weeks of slides etc. prepared, subject outline; Feb 23, rest of slides, workbook, AV organised; Apr 1, first 2-minute survey; Apr 15, make adjustments where necessary; May 7, second 2-minute survey; end of term, student evaluations; July, compare with last year's results.
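To make the Measurable and Timely rows concrete, the short sketch below (not part of the original workbook) shows one way the example pass-rate and dropout targets above could be checked against end-of-term results. The data shape, target values and function names are assumptions for illustration only, not a prescribed tool.

```python
# Minimal illustrative sketch: checking the example SMART targets
# (pass rate 70% -> 85%, dropouts 10% -> 5%) against end-of-term data.
# The cohort figures and target values are hypothetical.

def rate(count, total):
    """Return a proportion as a percentage, guarding against an empty cohort."""
    return 100.0 * count / total if total else 0.0

def check_smart_targets(enrolled, passed, withdrawn,
                        pass_target=85.0, dropout_target=5.0):
    """Compare measured pass and dropout rates with the SMART targets."""
    pass_rate = rate(passed, enrolled - withdrawn)   # pass rate among completing students
    dropout_rate = rate(withdrawn, enrolled)         # withdrawals as a share of enrolment
    return {
        "pass_rate": round(pass_rate, 1),
        "pass_target_met": pass_rate >= pass_target,
        "dropout_rate": round(dropout_rate, 1),
        "dropout_target_met": dropout_rate <= dropout_target,
    }

if __name__ == "__main__":
    # Hypothetical cohort: 120 enrolled, 8 withdrew, 98 of the remaining 112 passed.
    print(check_smart_targets(enrolled=120, passed=98, withdrawn=8))
    # -> {'pass_rate': 87.5, 'pass_target_met': True,
    #     'dropout_rate': 6.7, 'dropout_target_met': False}
```

The same comparison could of course be done by hand or in a spreadsheet; the point is simply that a Measurable objective should be stated so that such a check is possible at the review date.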

3.2 My SMART Plan

S - Specific: what change/development do you have in mind? Use concrete terms.

M - Measurable: what is going to be the measure of achievement? Think about degree of (what kind of) change and numbers affected.

A - Achievable: what are the helpful and restraining factors? Consider the force field analysis. What external factors may have to be dealt with or activated?

R - Relevant: is it congruent with what is known about the existing situation? What does the literature tell us can be achieved in this area and how?

T - Timely: what will be the timeframe for measurement? (You may have intermediate goals as well as final ones.)

Notes

This page is left blank for you to record your notes, thoughts and ideas for future action.

Appendix A: Stage 1 Competency Standards for Professional Engineer

STAGE 1 COMPETENCIES

The three Stage 1 Competencies are covered by 16 mandatory Elements of Competency. The Competencies and Elements of Competency represent the profession's expression of the knowledge and skill base, engineering application abilities, and professional skills, values and attitudes that must be demonstrated at the point of entry to practice. The suggested indicators of attainment in Tables 1, 2 and 3 provide insight into the breadth and depth of ability expected for each element of competency and thus guide the competency demonstration and assessment processes as well as curriculum design. The indicators should not be interpreted as discrete sub-elements of competency mandated for individual audit. Each element of competency must be tested in a holistic sense, and there may well be additional indicator statements that could complement those listed.

STAGE 1 COMPETENCIES and ELEMENTS OF COMPETENCY

1. KNOWLEDGE AND SKILL BASE
1.1. Comprehensive, theory based understanding of the underpinning natural and physical sciences and the engineering fundamentals applicable to the engineering discipline.
1.2. Conceptual understanding of the mathematics, numerical analysis, statistics, and computer and information sciences which underpin the engineering discipline.
1.3. In-depth understanding of specialist bodies of knowledge within the engineering discipline.
1.4. Discernment of knowledge development and research directions within the engineering discipline.
1.5. Knowledge of contextual factors impacting the engineering discipline.
1.6. Understanding of the scope, principles, norms, accountabilities and bounds of contemporary engineering practice in the specific discipline.

2. ENGINEERING APPLICATION ABILITY
2.1. Application of established engineering methods to complex engineering problem solving.
2.2. Fluent application of engineering techniques, tools and resources.
2.3. Application of systematic engineering synthesis and design processes.
2.4. Application of systematic approaches to the conduct and management of engineering projects.

3. PROFESSIONAL AND PERSONAL ATTRIBUTES
3.1. Ethical conduct and professional accountability.
3.2. Effective oral and written communication in professional and lay domains.
3.3. Creative, innovative and pro-active demeanour.
3.4. Professional use and management of information.
3.5. Orderly management of self, and professional conduct.
3.6. Effective team membership and team leadership.

Resources

Assessment Design

A sample process: Assessment Design Process 5 (each item is followed by its description)

Title: Name of the assessment item (can be used for identification).

Overview: What students are given and what they are expected to achieve.

Main Knowledge, Skills, and Abilities (KSAs): What are the main targeted KSAs? Specific unit/subject/course/program objectives.

Rationale: Why is this assessment item designed? What do the students currently not know/understand, what misconceptions might they have, and what can they do/use... to address this deficiency or advance towards the next level? Relation to other course- or program-level objectives.

Additional Knowledge, Skills, and Abilities: Any pre-requisite KSAs that may be required to achieve the main KSAs. These would be implicit and may not be the target of assessment.

Potential observations: These are the possible observables that can reveal learning: what students can do accurately or correctly, or what students can say, draw, write, or analyse.

Potential work products: What constitutes the evidence of learning; for example: drawings, oral presentations, written descriptions, reports, lab test results, documented processes, progressive work, portfolios, performance assessed over a period of time (e.g. a semester), etc.

Potential rubrics: How will student performance be evaluated? For example: quality of content; quality of process followed; scores of MCQ; instances of correct conceptions, or misconceptions remaining. Also, determined scoring: what constitutes a 7, 6, 5..., or a 1.

Characteristic features: These are descriptions of what and how students might present, as reasoning or analysis. They are the overall features that can be set at defined minimum threshold levels.

Variable features: What can be varied, while using similar assessment tasks with associated evaluations, e.g.: at what depth (introductory, intermediate or advanced level; these may correspond to semester or year level); presentation format; auditory information (enthusiasm, confidence, etc.); correct use of symbols and notations; language, expression and grammar; background knowledge; cognitive complexity; use of checklists; goal-setting and execution; monitoring progress of tasks; motivation, relevance, value, authenticity; self-regulation; etc.

Educational standards: What Accreditation criteria/standard(s) or TLOs does this assessment address? How are these linked to program and course level objectives?

Exemplar tasks: These can be provided to students to set expectations, or to learn from practising assessment using the rubrics; they can be provided to the academics to use for assessment design.

Online resources: Provided, or obtained and utilised.

References: Provided or identified, used and cited.

5 Adapted from Evidence-Centred Assessment Design: Layers, Structures, and Terminology, R. Mislevy and M. Riconscente, July 2005, http://padi.sri.com Last accessed 27/3/2012.

Creating Rubrics

What is a Rubric? http://stone.web.brevard.k12.fl.us/html/comprubric.html#rubric

Recommendations for Rubric Construction (with appreciation to Donna Szpyrka and Ellyn B. Smith)
1. Divide the overall task into distinct subtasks that evidence student skills or comprehension/application of knowledge.
2. Identify both cognitive and performance components that can be assessed.
3. Determine whether qualitative or quantitative descriptors are going to be used for each subtask; this may be influenced by the purpose of the assessment, i.e. primarily for student feedback or for incorporating into a numerical average.
4. Allow for the full range of skill/knowledge with clear indicators of each level of performance.
5. Use the rubric for an evaluation of a presentation or activity.
6. Revise the rubric as necessary. (A small illustrative sketch of a rubric follows the links below.)

Creating a Rubric: Tutorial http://health.usf.edu/publichealth/eta/rubric_tutorial/default.htm

Build a Rubric Online - Rubistar website: http://rubistar.4teachers.org/index.php
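The sketch below (not part of the original workbook) illustrates recommendations 1-4 in structured form: a hypothetical rubric for an oral presentation, with distinct subtasks, a descriptor for each performance level, and a simple quantitative scoring helper. All subtask names and level descriptors are invented for illustration.

```python
# Illustrative only: a tiny rubric structure following the recommendations above.
# Subtasks and level descriptors are hypothetical examples, not a prescribed rubric.

RUBRIC = {
    "Technical content": {
        3: "Accurate, well-justified engineering argument",
        2: "Mostly accurate; minor gaps in justification",
        1: "Significant errors or unsupported claims",
    },
    "Structure and clarity": {
        3: "Logical flow; key points easy to follow",
        2: "Generally clear; some sections hard to follow",
        1: "Disorganised; main message unclear",
    },
    "Use of visual material": {
        3: "Visuals directly support the argument",
        2: "Visuals present but only loosely connected",
        1: "Visuals missing or distracting",
    },
}

def score(ratings):
    """Convert per-subtask level ratings into a simple numerical average (quantitative use)."""
    levels = [ratings[subtask] for subtask in RUBRIC]  # raises KeyError if a subtask is unrated
    return sum(levels) / len(levels)

def feedback(ratings):
    """Return the qualitative descriptor awarded for each subtask (feedback use)."""
    return {subtask: RUBRIC[subtask][ratings[subtask]] for subtask in RUBRIC}

if __name__ == "__main__":
    ratings = {"Technical content": 3, "Structure and clarity": 2, "Use of visual material": 3}
    print(score(ratings))      # about 2.67 - suitable for a numerical average
    print(feedback(ratings))   # descriptors - suitable for student feedback
```

Whether the rubric is reported as the numerical average or as the per-subtask descriptors corresponds to the qualitative/quantitative choice in recommendation 3.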

Collecting Evidence of Learning

Collecting Evidence of Learning, http://vels.vcaa.vic.edu.au/support/tla/evidence.html#intro (accessed 25 May 2012). This is the first page only; the full paper is available from the address above.

Sections: Introduction; Negotiated tasks; Reflection, peer and self-assessment; Group assessment; Portfolios; Observations; Presentations, demonstrations and interviews.

Introduction

In planning activities and managing assessment, teachers should ensure that assessment is based on a variety of tasks and is inclusive of the learning needs of all students. Multiple sources of information should be used to make judgments about specific skills and depth of understanding. Assessment tasks need to be developed with the goals and objectives of the unit in mind and must reflect the learning objectives outlined. These sources include:
negotiated tasks with negotiated assessment criteria
self assessment and reflection
group assessment
portfolios
learning journals
observations
presentations
demonstrations
peer evaluations.
It is important that unexpected outcomes, both positive and negative, are also acknowledged.

Negotiated tasks and assessment

A collaborative approach to developing assessment criteria for different purposes and audiences can enable students to become better focused and engaged in learning. In relating assessment criteria to clearly developed learning expectations within a given task, students think carefully about what is being assessed and the kinds of evidence that would need to be provided to show their understanding. The negotiation of assessment tasks is central to contract work, and teachers need to maintain accurate records of the tasks being undertaken to ensure that students are demonstrating their skills and knowledge across a wide range of options. Teachers lead the discussion by presenting students with options for decision-making about the kinds of evidence that might be provided to assess negotiated tasks. (For example, see options under Group assessment.)

(Please see website above for full paper)

Use of Evidence in Learning

Use of evidence in learning and teaching. Paper by Patrick Griffin, Director of the Assessment Research Centre, Melbourne, Australia (http://www.edfac.unimelb.edu.au/arc/), http://web.ceomelb.catholic.edu.au/uploads/publications/lmatters/griffin.pdf (accessed 25 May 2012). This is the first page only; the full paper is available from the address above.

Use of Evidence in Learning and Teaching

I started writing about how assessment evidence could inform strategic approaches for planning in teaching and learning intervention and lead to a culture of continuous improvement. But this begged so many questions. As I wrote about the niceties of assessment design and development, I could see that there was a serious gap in the reality. No matter how good the instrument was and how good the intention was to collect data, it mattered little if the process of collection, analysis and interpretation was flawed.

Evidence is more than the assessment data. Gathering evidence is a complete package that involves a clear design, purpose and process of collection and interpretation and finally a way of telling the stakeholders about the results. It is not often that all these parts come together. Assessment evidence is often derived from a mixture of system-designed test data and local observations mixed in a cocktail of local tests and projects, together with intuition and folk wisdom. These are then applied to a set of data analyses, which in many cases are difficult to interpret and impossible to report to key players. So a specialist is sometimes employed to visit schools to interpret results and to offer teachers a brief and surface interpretation. The specialists are then left to their own devices to conflate the data into evidence and provide reports to parents, teachers, the administration and the system of the successful outcomes of the learning and teaching program. This is neither an elegant nor a suitable use of data, nor is it the appropriate accumulation of evidence of learning and teaching to be used for continuous improvement. Rather, it relies to a great extent on serendipity. Sadly, education policy is also often based on evidence of a similar quality and, when this is the case, it is almost always flawed.

Evidence-based education policy and practice are now being promoted around the world. Remarkably, they are also coming under fire, not because they lack rigor or are not sufficiently scientific, but because they are too hard, too rigorous and too scientific for the art of education. But unless evidence is properly collected, adequately analyzed and correctly interpreted, it cannot be used appropriately.

(Please see website above for full paper)

Classroom Assessment Techniques (CATs)

Thomas A. Angelo and K. Patricia Cross (1993) compiled the sourcebook, Classroom Assessment Techniques: A Handbook for College Teachers, which describes methods and techniques for use as models for evaluating two fundamental questions: "What are your students learning?" and "How effectively are you teaching?" From the time they wrote their first handbook in 1988, classroom assessment techniques (CATs) have been widely used in all levels of education and training.

Recently, I attended a CATs presentation facilitated by Drs. Jean Runyon and Tom Gorecki of the College of Southern Maryland, which I found insightful and practical. Further research on CATs led me to Northwest Missouri State University, University of North Carolina and other schools who promote the use of CATs for both online and face-to-face environments. This article discusses the benefits of CATs as both online and face-to-face teaching tools: quick and easy methods for providing immediate feedback on student learning that allow instructors to make changes on the fly.

"Classroom Assessment is a simple method faculty can use to collect feedback, early and often, on how well their students are learning what they are being taught."

The reason CATs work so well online as well as face-to-face is that the assessment techniques are simple to conduct, formative in nature, and easily tailored to the concerns of the instructor. CATs effectively evaluate three critical areas in a just-in-time format:
Course-related knowledge and skills involving prior knowledge, recall and understanding, critical thinking, and problem solving skills.
Student attitudes and self-awareness, specifically awareness of values, attitudes, and the learning process.
Reactions to instruction methods such as student reactions to instructors/instructing, class activities, assignments, and materials.

"College instructors use feedback gleaned through Classroom Assessment to inform adjustments in their teaching."

The benefits for faculty are that CATs provide feedback from students while the learning is in process. Using CATs takes away the element of surprise for both the student and the instructor by closely monitoring and modifying the teaching/learning process within the classroom. Instructors can assess the status of student learning in less time than waiting for formal assessment results such as tests, papers, and end-of-class evaluations. With just-in-time feedback, faculty can make immediate changes in their teaching as needed. Faculty can also share the feedback with students to help them improve their learning and study strategies.

For further details visit: http://deoracle.org/online-pedagogy/assessment-feedback-rubrics/cats-in-the-classroomclassroom-assessment-techniques.html

For other very useful CATs resources see: http://www.flaguide.org/cat/cat.php