KnowledgeAdvisors White Paper Informal Learning Measurement

Table of Contents

Informal Learning Background
Learning Delivery Taxonomy
Informal Learning Measurement Business Case
Informal Learning Measurement Model
  What to Measure
  When to Measure It
  How to Measure It
Informal Learning Measurement Solutions
About KnowledgeAdvisors
Contact Us
References
Appendix A: Learning Delivery Taxonomy Definitions

Informal Learning Background

The concept of learning in unstructured ways, without formal curricula or learning objectives, has given rise to the term informal learning. The term has many definitions. It can refer to learning that is free from constraint and not officially recognized; to learning that is ad hoc, accidental, or unplanned; or to learning that can happen anywhere, at any time, by anyone, intentionally or unintentionally. For the purposes of this paper, KnowledgeAdvisors defines informal learning as knowledge transfer that occurs without the assistance of structured curriculum. A subset of informal learning can be referred to as social learning, which has elements of collaboration or community. Regardless of what it is called, informal learning arose from the need to learn on demand, rather than wait for conventional methods of learning, so that workers can stay knowledgeable and productive in a dynamic work environment.

Informal learning existed long before formal learning. People learned through observation, apprenticeships, and experience. Formal learning emerged as instructional design principles emerged. New technologies now enable new ways of learning informally, and the classic methods of informal learning are actively embraced within these technologies.

What is known about informal learning is that it has a strong presence in organizations, will continue to grow, and contributes to the performance of the workforce. KnowledgeAdvisors' recent research found only a minority of organizations allocating significant, formal budget toward it, but more is likely to come. Measurement methods so far have been basic and sparse, culture is a critical enabler of uptake, and technology and social networking tools are common in its adoption and growth.

This paper focuses on measuring informal learning. Measurement has become important in formalizing informal learning. Formalizing it does not mean that learners feel constrained by structure, rules, or definitions; many informal learning tools can be learner- or community-maintained and self-sufficient in their growth. Rather, formalization means that informal learning programs can garner greater attention for budgets, so that they receive care and feeding and continue to evolve. Measurement can provide reasonable indicators of informal learning's usability, availability, accessibility, sense of community, quality, and effectiveness.

Learning Delivery Taxonomy

Before discussing measurement of informal learning, it is important to articulate where informal learning fits into the learning delivery taxonomy, shown in Exhibit I below; definitions for the taxonomy are in Appendix A. The learning delivery taxonomy is a list of the ways individuals can learn, structured into cleanly defined categories using a mutually exclusive, collectively exhaustive approach.

Exhibit I. Learning Delivery Taxonomy

Formal - Structured learning, curriculum required
- Live/Interactive (learning requiring interaction between at least two individuals): Open Enrollment, ILT Private, ILT Custom, Tutoring, Facilitated Classroom Learning, Virtual Classroom Learning
- Just-In-Time/Self-Paced (learning without direct interaction with another individual): Self-Study Guide, Computer-Based Training (CBT), Video-Based Training (VBT), Web-Based Training (WBT), Audio-Based Training (ABT), Distance Learning

Informal - Knowledge transfer without the assistance of structured curriculum
- Live/Interactive (learning requiring interaction between at least two individuals): Help Desk, Coaching/Mentoring, Collaboration, Communities of Practice, Presentations, Virtual Knowledge Sharing, Desk-side Support
- Just-In-Time/Self-Paced (learning without direct interaction with another individual): Publications, Reference Guides, Job Aids, Electronic Performance Support Systems, On-Line Self Help

The taxonomy begins by identifying whether the learning is delivered with structured curriculum (formal) or without it (informal). It then breaks each of these into Live/Interactive (requiring interaction between two or more individuals) or Just-In-Time/Self-Paced (no direct interaction with other individuals). The third layer of the taxonomy is the delivery methods themselves within these classifications. The taxonomy serves as a guide to learning delivery given the audience, learning style, and organizational culture.

Some of the most common informal learning deliveries are illustrated in the taxonomy. They include high-touch, low-tech elements such as a community of practice or a coaching or mentoring program, which sit in the interactive section because of their high involvement with others. The taxonomy also encompasses virtual knowledge sharing (VKS), collaboration on an electronic platform (for example, a blog, a wiki, or even Twitter), which places it in the interactive but electronic section. Note that all of these are support-oriented deliveries: they are unstructured in the sense that they have no formal curriculum, the key feature of informal learning. Finally, there are technology-based electronic support systems, self-paced performance support tools that, like their interactive counterparts, have no curriculum.
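For readers who want to work with the taxonomy programmatically (for example, to tag evaluation records by delivery type), the sketch below represents Exhibit I as a nested Python dictionary with a small lookup helper. The category and delivery names come from the exhibit; the data structure and the classify helper are illustrative assumptions, not part of the KnowledgeAdvisors methodology.

```python
# Minimal sketch: Exhibit I as a nested dictionary. Categories follow the paper;
# the structure and helper function are illustrative.
LEARNING_DELIVERY_TAXONOMY = {
    "Formal": {  # structured learning, curriculum required
        "Live/Interactive": [
            "Open Enrollment", "ILT Private", "ILT Custom", "Tutoring",
            "Facilitated Classroom Learning", "Virtual Classroom Learning",
        ],
        "Just-In-Time/Self-Paced": [
            "Self-Study Guide", "Computer-Based Training (CBT)",
            "Video-Based Training (VBT)", "Web-Based Training (WBT)",
            "Audio-Based Training (ABT)", "Distance Learning",
        ],
    },
    "Informal": {  # knowledge transfer without structured curriculum
        "Live/Interactive": [
            "Help Desk", "Coaching/Mentoring", "Collaboration",
            "Communities of Practice", "Presentations",
            "Virtual Knowledge Sharing", "Desk-side Support",
        ],
        "Just-In-Time/Self-Paced": [
            "Publications", "Reference Guides", "Job Aids",
            "Electronic Performance Support Systems", "On-Line Self Help",
        ],
    },
}

def classify(delivery: str) -> tuple[str, str] | None:
    """Return (formality, interaction mode) for a named delivery, if present."""
    for formality, modes in LEARNING_DELIVERY_TAXONOMY.items():
        for mode, deliveries in modes.items():
            if delivery in deliveries:
                return formality, mode
    return None

print(classify("Communities of Practice"))  # ('Informal', 'Live/Interactive')
```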

Now that we can see how informal learning and traditional learning coexist in the learning delivery taxonomy, both can also be viewed in three dimensions that take design, delivery, and consumption into account. Exhibit II presents the Multi-Dimensional Learning Taxonomy (Parskey 2010). In this model, Design occupies the Y-axis; a delivery can be Ad Hoc (user created), Structured but not instructionally designed (such as a presentation or communication), or built through an Instructional Design process with formal curricula (such as instructor-led training). The X-axis is the Delivery approach, which is either Scheduled by the delivery organization or content provider, or Unscheduled, with content created and available anytime. Consumption is the third dimension and can be As Needed (such as a performance support tool) or Prescribed, defined by the creator (such as e-learning).

Exhibit II. Multi-Dimensional Learning Taxonomy (axes: Design/Development Approach, Delivery Approach, Consumption Approach; example deliveries positioned in the model include user-created content such as videos, reports, and white papers; spot coaching; all-hands meetings; community meetings; communications; mentoring and formal coaching programs; reference guides; best practices; expert locators; VKS content; podcasts and VODs; webinars; performance support; self-study guides; job aids; face-to-face training; virtual synchronous training; and asynchronous e-learning)

Thinking about the use of multiple deliveries across these three dimensions can help create a learning experience that more closely resembles a true learning program. A learning program is a set of carefully crafted content, information, and experiences that build skills and capabilities in a specific audience. Exhibit III illustrates programs along the horizontal axis and learning deliveries, both formal and informal, along the vertical axis; together, this illustration represents the measurement angles that visualize individual learning (courses and events), programs and initiatives, and systems and methodologies. The example shows a new-hire program, Welcome to Acme, that spans multiple learning deliveries and must measure the systems, initiatives, and courses that are components of the overall program. Measurement is discussed in the sections that follow.

Exhibit III. Learning Program Delivery

Informal Learning Measurement Business Case

KnowledgeAdvisors research suggests that only about 9% of the learning budget is spent on informal learning today, but that share is expected to rise. The information technology organization has historically funded the technology infrastructure that learning organizations then use to build informal learning enablers. In addition, 21% of executives now ask for measurement of informal learning, and this too is expected to rise. A 2009 study by the American Society for Training and Development (ASTD) found that the presence of informal learning is positively correlated with performance. In fact, 39% of respondents to the ASTD study said informal learning enhanced performance to a high extent. Interestingly, only 15% of organizations surveyed felt they used it to reduce the cost of formal learning. So there is a business case for informal learning, even if it is not measured.

Relatively little measurement takes place today because traditional measurement methodologies are difficult to apply, so most evidence is anecdotal. However, the ASTD research produced a significant finding: measuring the effectiveness of informal learning has a statistically significant correlation with market performance. This is based on a statistical analysis correlating the measurement of informal learning with the broader organization's market value, and the positive correlation is strong. This is the single most important business case for measuring your informal learning: measure it because it may indeed show performance gains.

Finally, if obtaining more direct funding for informal learning is part of your business case, KnowledgeAdvisors research revealed that organizations that systematically measure their informal learning have budgets twice those of counterparts that do not measure. This illustrates that you manage what you measure. Measure informal learning and it becomes more visible and strategic, and more budget is allocated its way.

Informal Learning Measurement Model

Informal learning measurement can and should be done. Done properly, it can determine whether investments in informal learning should grow, and whether to invest in technology tools or in more personalized initiatives. Measurement of informal learning can also be made comparable to more traditional models, so that the right mix, one that optimizes impact relative to the investment, can be found; this comparability is important for governance and stewardship. Measurement of informal learning should be approached through three lenses: What to Measure, When to Measure It, and How to Measure It. Each is discussed below.

What to Measure

Organizations should measure an informal learning program across the Solution, Experience, and Benefits constructs. Exhibit IV illustrates the key components within each construct. The Solution construct answers the question, "Is this the right delivery type to fulfill the stated need?" To this end, it measures whether the delivery was easy to access and use and whether it met its objectives. The Experience construct answers the question, "Did it have the right level of community and involvement?" The Benefits construct answers the question, "Did it meet or exceed individual and organizational performance goals and outcomes?"

Traditional learning measurement can be seen within the constructs. Quality, for example, is like Level 1 in the Kirkpatrick Model; Effectiveness is similar to Level 2 (knowledge transfer); Application to Job is a Level 3 behavior change; Performance Impact is a Level 4 results link; and Business Results/ROI is similar to the fifth level of the ROI Process Model. Other traditional measurement concepts include operational metrics one might track for e-learning, such as availability, accessibility, and usage. But there are also measurement constructs newly important to informal learning: value and belonging, as well as engagement. Many informal learning deliveries thrive on a sense of community to create and grow the content and keep the learning fresh and active. Support systems also matter, because culture is a huge factor in informal learning uptake. Indeed, support systems must be strong in any learning delivery: without manager support and engagement, learning quickly becomes scrap learning.

Exhibit IV. Informal Learning Constructs
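To help operationalize the three constructs, the sketch below pairs each construct with its guiding question, example metrics drawn from the discussion above, and the roughly analogous traditional evaluation level. The dictionary layout and the specific metric groupings are illustrative assumptions, not KnowledgeAdvisors' standard instrument.

```python
# Illustrative mapping of the three measurement constructs to example metrics.
# Construct names and level analogies follow the paper; groupings are assumptions.
CONSTRUCTS = {
    "Solution": {
        "question": "Is this the right delivery type to fulfill the stated need?",
        "example_metrics": ["accessibility", "usability", "availability", "usage"],
        "traditional_analogy": "Kirkpatrick Level 1 (quality/satisfaction)",
    },
    "Experience": {
        "question": "Did it have the right level of community and involvement?",
        "example_metrics": ["sense of community", "engagement", "value and belonging"],
        "traditional_analogy": "Kirkpatrick Level 2 (knowledge transfer / effectiveness)",
    },
    "Benefits": {
        "question": "Did it meet or exceed performance goals and outcomes?",
        "example_metrics": ["application to job", "performance impact", "business results/ROI"],
        "traditional_analogy": "Kirkpatrick Levels 3-4 and ROI (Level 5)",
    },
}

# Print a one-line summary per construct, e.g. for a measurement-plan review.
for name, spec in CONSTRUCTS.items():
    print(f"{name}: {spec['question']} -> {', '.join(spec['example_metrics'])}")
```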

When to Measure It

After knowing what to measure, the logical next question is when to measure it. KnowledgeAdvisors suggests measuring at several points in the process, as illustrated in Exhibit V.

Exhibit V. When to Measure Informal Learning

The first three elements in Exhibit V identify measurement at the point of interaction. For example, a user goes into a virtual knowledge sharing system to reach an expert in her organization who is experienced in processing invoices. Upon her interaction with the system, an opportunity exists to measure the Solution, Experience, and Benefits. At this point the emphasis is on the Solution construct: was it accessible, usable, and effective?

Next is periodic measurement: a major milestone during a program that has not yet concluded and where there have been multiple interactions. At this point the emphasis is on the Experience construct: is it a high-quality experience, was there a strong sense of community, and is it the right solution for the need? Benefits can be measured here as well, although the result may be a forecast or prediction of impact on performance because the program has not yet concluded. Even so, forecasting is a powerful predictor that aids resource allocation and continuous improvement.

Finally, there is measurement at conclusion. The program has ended, and there will be no further interaction specific to it. At this point the emphasis is on the Benefits construct: was it effective, and did it produce the desired value, behavior change, and return as expected?
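One way to encode this schedule is a simple lookup that pairs each measurement point with the construct to emphasize and an example instrument (the instruments anticipate the "How to Measure It" discussion that follows). This is a hedged illustration; the enumeration and function names are assumptions introduced here, not part of the paper.

```python
from enum import Enum

class MeasurementPoint(Enum):
    INTERACTION = "at each interaction"
    PERIODIC = "at a major milestone before the program concludes"
    CONCLUSION = "after the program has ended"

# Emphasis per measurement point, per the discussion above; instrument names are examples.
EMPHASIS = {
    MeasurementPoint.INTERACTION: ("Solution", "quick poll or content rating"),
    MeasurementPoint.PERIODIC: ("Experience", "sampled survey or assessment (may forecast Benefits)"),
    MeasurementPoint.CONCLUSION: ("Benefits", "smart-sheet evaluation covering impact, value, and ROI"),
}

def instrument_for(point: MeasurementPoint) -> str:
    construct, instrument = EMPHASIS[point]
    return f"Emphasize the {construct} construct using a {instrument}."

print(instrument_for(MeasurementPoint.PERIODIC))
```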

How to Measure It

In general, the time spent on measurement should be proportional to the time spent on the learning. Because informal learning requires measuring different constructs at different points in the process, there are multiple measurement enablers for doing the measurement itself. Exhibit VI illustrates these, broken out by Systems, Tools, and People enablers.

Exhibit VI. How to Measure Informal Learning

On the left are the informal learning deliveries with high technology enablement, such as virtual knowledge networks and electronic performance support systems. System analytics can measure the Solution in terms of access, availability, and usage. On the opposite end are the high-personal-touch deliveries, such as coaching and mentoring and communities of practice. Focus groups, interviews, and observation can help measure the solution's usability and effectiveness.

Across any informal learning program, regardless of technology or personal touch, measurement tools apply. Conducting a needs analysis or usability study before making a resource-intensive investment is sound business practice. At each interaction, a quick poll of the user, delivered as a feedback link or content rating, is a good way to gauge the Solution. Conducting a more formal survey or assessment at periodic milestones with a sample of participants is effective for measuring the Experience, and surveys and assessments are practical and reasonable ways to measure Benefits. At this point the instruments can be consistent and comparable with traditional evaluation, allowing the performance results of the deliveries to be benchmarked. Using a smart-sheet evaluation (one that asks questions about impact, value, and ROI) rather than a smile sheet (one that evaluates only quality or satisfaction) accomplishes this nicely. In addition, statistically correlating usage data with results data (for example, correlating sales personnel's use of a sales knowledge portal with their sales) measures Benefits with more precision and credibility; however, it is not as practical or scalable as a survey or assessment, so it should be reserved for the more strategic, visible, or costly informal learning investments.
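The sales-portal example can be made concrete with a small analysis script. The sketch below computes a Pearson correlation between portal usage and sales per representative; the variable names and data are fabricated for illustration, and a real study would also control for confounders such as tenure and territory.

```python
# Hedged sketch: correlate knowledge-portal usage with sales results.
# The data below is fabricated; in practice it would come from the portal's
# usage logs and the CRM, joined on a sales-rep identifier.
from statistics import correlation  # Python 3.10+

portal_sessions = [2, 5, 8, 3, 12, 7, 1, 9]         # sessions per rep, last quarter
quarterly_sales = [40, 52, 61, 45, 75, 58, 38, 66]  # sales per rep, in $ thousands

r = correlation(portal_sessions, quarterly_sales)
print(f"Pearson r between portal usage and sales: {r:.2f}")

# A strong positive r supports the Benefits construct, but it is evidence of
# association, not proof of causation; pair it with survey data and, where
# possible, a comparison group of non-users.
```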

Informal Learning Measurement Solutions

KnowledgeAdvisors focuses on learning and talent measurement. In a rapidly changing workplace, learners are changing how they learn. The business case for measurement shows that measuring informal learning is correlated with market performance and leads to more budget for informal learning. KnowledgeAdvisors can help your organization determine what to measure, when to measure it, and how to measure it. Whether the need is a learning measurement strategy or the design of a periodic measurement instrument, our experts can consult with your organization to mitigate risk and ensure success. Further, to minimize the administrative burden of measurement, KnowledgeAdvisors offers Metrics that Matter, our proprietary on-demand measurement technology. It can automate and simplify how your informal learning data is collected, processed, reported, and analyzed. With Metrics that Matter, the abundance of data originating from informal learning becomes information for intelligent decision-making. The technology includes standard, customer-validated evaluation instruments for measuring informal learning.

About KnowledgeAdvisors

KnowledgeAdvisors is the world's largest provider of learning and talent measurement solutions. Leading organizations access our measurement expertise and on-demand software to ensure a high-performing workforce. As the thought leader in human capital analytics, KnowledgeAdvisors provides the most comprehensive analytics solutions available.

Contact Us

If you would like to learn more about how we can help you meet your measurement needs, please contact us. www.knowledgeadvisors.com, sales@knowledgeadvisors.com, (800) 561-3341, +1 (312) 676-4400

References

Parskey, Peggy. Learning Taxonomy, v1, 2010.
KnowledgeAdvisors. KnowledgeAdvisors Research: Executive Summary, Informal Learning, 2009.
KnowledgeAdvisors. Informal Learning Study, v11, May 13, 2009.
Bersin & Associates. The Enterprise Learning Framework: A Modern Approach to Corporate Training, April 2009.
American Society for Training and Development. Tapping the Potential of Informal Learning: An ASTD Research Study, 2008.
KnowledgeAdvisors. Learning Delivery Taxonomy, 2003.

Appendix A: Learning Delivery Taxonomy Definitions

Audio-Based Training (ABT) - Learning via audio (e.g., a podcast).
Coaching - A defined method of directing, instructing, and training a person or group to achieve a goal or develop specific skills, in one session or over time.
Collaboration - Interactive learning that occurs face-to-face with two or more individuals (without curriculum).
Communities of Practice (CoP) - Groups that form to share what they know, to learn from each other about some aspect of their work, and to provide a social context for that work.
Computer-Based Training (CBT) - Traditional computer-based training delivered via CD-ROM or flash drive.
Curriculum - Materials developed using instructional techniques.
Desk-side Support - A support professional responding to a request for help at the individual's desk.
Distance Learning - Any learning that occurs separately from live interaction.
EPSS (Electronic Performance Support System) - An integrated Web-based environment that provides access to personalized information, tools, and guidance to enable optimal job performance with minimal support and intervention from others.
Facilitated Classroom Learning - Collaborative live learning in a face-to-face environment that leverages self-paced instruction.
Formal Learning - Knowledge transfer that occurs with the assistance of structured curriculum.
Help Desk - A support professional responding to a request for help via phone or the Web.
ILT Custom - Custom training course open to individuals from the same company (customized curriculum).
ILT Private - Standard training course open to individuals from the same company (not customized curriculum).
Informal Learning - Knowledge transfer that occurs without the assistance of structured curriculum.
Job Aids - Short, concise quick-reference tools.
Just-In-Time/Self-Paced Learning - Any learning that occurs without direct interaction with another individual.
Knowledge - The sum of what has been learned.
Learning - The acquisition of knowledge through either formal or informal formats.
Live/Interactive Learning - Any learning that occurs with interaction between two or more individuals.
Mentoring - A developmental relationship between a more experienced mentor and a less experienced mentee.
On-Line Self Help - Short, concise quick-reference tools available electronically.

Open Enrollment - Standard training course open to individuals from multiple companies (not customized curriculum).
Presentation - Live, one-way overview without curriculum.
Publications - Documented materials without curriculum.
Reference Guides - Detailed support materials.
Self-Study Guide - Self-paced curriculum.
Tutoring - One-on-one, face-to-face training with curriculum.
Video-Based Training (VBT) - Traditional instruction via video (e.g., a YouTube video).
Virtual Classroom Learning - Collaborative live learning that uses an electronic interface and leverages self-paced instruction.
Virtual Knowledge Sharing (VKS) - A collaborative learning space within an electronic (likely Web-based) platform.
Web-Based Training (WBT) - Learning via the Web (with curriculum).

KnowledgeAdvisors, Inc. www.knowledgeadvisors.com, (800) 561-3341, +1 (312) 676-4400, solutions@knowledgeadvisors.com. Copyright 2010 by KnowledgeAdvisors. Metrics that Matter is a registered trademark of KnowledgeAdvisors. All rights reserved.