Multiple Measures Assessment Project - FAQs

(This is a working document which will be expanded as additional questions arise.)

Common Assessment Initiative

How is MMAP research related to the Common Assessment Initiative?
The MMAP research is an extension of the Student Transcript Enhanced Placement Study conducted by the RP Group to evaluate the effectiveness of using high school transcript data to predict students' ability to pass college-level English and/or math coursework. The research being conducted by the MMAP research team supports the integration of meaningful, statistically validated multiple measures with the new placement test being developed by the Common Assessment Initiative (CAI). In addition to the MMAP research team, there is a multiple measures Work Group that provides advice and guidance to the Common Assessment Initiative Steering Committee and the MMAP research team regarding multiple measures. MMAP research team members have also been invited to present directly to the CAI Steering Committee on three occasions, which has been effective in keeping the CAI apprised of progress on the development of valid multiple measures for assessment. Ultimately, multiple measures, based upon the MMAP research and models and the data warehouse managed by Cal-PASS Plus to support multiple measures, will be an integral part of the new Common Assessment system deployed to all California Community Colleges to improve local placement, counseling, and instructional decisions.

How will results from the MMAP Pilot Colleges be used to inform the Common Assessment Initiative?
The MMAP research team has conducted extensive analyses to identify multiple measures that can be used most effectively to predict student success in community college English and math courses. The primary data source validated thus far is high school transcript and test data, and the primary analytical tool was a statistical technique known as decision trees. The initial analysis covered over 380,000 students who took English and math at the community college and who were also matched with high school transcript and test information. The results of these analyses have been condensed into a decision rule document that is validated for the state. This document, the Decision Tree Matrix, can be found on the RP Group MMAP Pilot College resource page. Though the statewide rule set should work well as an initial rule set for colleges in the system, the MMAP research team recommends that multiple measure decision rules be validated locally by replicating the state-level research with assistance from Cal-PASS Plus and the MMAP research team.

The multiple measure rule sets are designed to be used disjunctively with the CAI test. That is, a student's test results produce an initial placement recommendation and then, if any of the decision rules result in a higher placement, the student is moved into the higher level. Thus, the recommended process uses the higher of the two results from the test and from the multiple measure decision rule set. This allows the two methods (test and multiple measures) to play to their strengths: the test is suitable for a comprehensive placement of all incoming students, while the multiple measures rule sets are very good at recognizing students who have been underplaced and determining the highest level at which they are highly likely to succeed.

Data

How are data being analyzed in MMAP research when colleges and K-12 schools have different data coding and reporting practices?
To the extent possible, the data being used are those that reflect common data coding and reporting practices. The Chancellor's Office Management Information Systems (CO-MIS: http://extranet.cccco.edu/divisions/techresearchinfosys/mis.aspx) provides data on community college enrollment and performance. For K-12, data that districts are required to report to the California Department of Education for the California Longitudinal Pupil Achievement Data System (CALPADS: http://www.cde.ca.gov/ds/sp/cl/) are being used wherever possible. For older K-12 data, districts provide data using a standard format that remains available for districts that wish to upload legacy data: http://www.calpassplus.org/medialibrary/calpassplus/publicweb/documents/calpassk12dedv2012_1.pdf

While there can be occasional gaps in data quality and completeness, the quantity and quality of the remaining data provide a comprehensive foundation that can powerfully supplement assessment and placement methods built around a more typical single-method, single-instance standardized assessment. Further, as the project progresses, many of the gaps are closing significantly as additional districts and data sources become available and as reporting irregularities come to light and are repaired.

The MMAP research team did find it necessary to create code that analyzed course titles in order to better identify and categorize coursework, particularly high school math coursework. The team found that not all historical high school courses were reliably coded with appropriate CBEDS codes. The solution was to conduct a course title analysis that improved the accuracy of categorization so that performance in specific types of high school courses (e.g., intermediate algebra, pre-calculus, trigonometry, etc.) could be included in the statewide models. A similar approach was used to categorize courses at the community college level, again particularly for math, where there are distinct pathways with distinct predictors of success, especially at the transfer level (e.g., pre-calculus, college algebra, statistics, liberal arts math, calculus, etc.).

As pilot colleges, and additional interested colleges, plan to deploy a multiple measures approach to placement using these models, it is critically important to work with Cal-PASS Plus to reach out to the college's feeder school districts. These districts will need to agree to share and provide data through the Cal-PASS Plus system in order for local placement to benefit from high school data analysis.
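For readers who want to see what the kind of course title analysis described above can look like in practice, here is a minimal sketch in Python. The category names and keyword patterns are illustrative assumptions for demonstration only; they are not the MMAP research team's actual coding rules or category list.

```python
# Illustrative sketch only: hypothetical keyword rules for mapping raw high
# school course titles to analysis categories. The keywords and categories
# below are assumptions for demonstration, not the MMAP team's actual code.
import re

# Each category is matched against a list of regular-expression fragments.
CATEGORY_PATTERNS = {
    "pre_calculus": [r"pre[\s\-]?calc"],
    "trigonometry": [r"\btrig"],
    "intermediate_algebra": [r"alg(ebra)?\s*(2|ii)\b", r"intermediate\s+alg"],
    "statistics": [r"\bstat(istic)?s?\b"],
    "calculus": [r"\bcalc(ulus)?\b"],
}

def categorize_course_title(title: str) -> str:
    """Return the first matching category for a raw course title, else 'other'."""
    normalized = title.lower().strip()
    for category, patterns in CATEGORY_PATTERNS.items():
        if any(re.search(p, normalized) for p in patterns):
            return category
    return "other"

if __name__ == "__main__":
    for raw in ["PreCalculus Honors", "Algebra II", "AP Statistics", "Trig/Math Analysis"]:
        print(raw, "->", categorize_course_title(raw))
```

In a real analysis the pattern list would be much longer and would be reviewed against actual transcript extracts, but the basic approach of normalizing titles and matching them to a controlled category list is the same.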

How can colleges access feeder high school data to conduct analyses?
Cal-PASS Plus has developed a data infrastructure to support access to high school transcript data, along with MIS data from the CCCCO, data from the California Department of Education, and data from other testing services. These data files are currently available to MMAP pilot colleges upon request. Contact John Hetts (jhetts@edresults.org) for further information on the availability of college-level data for replicating and validating MMAP analyses locally.

How reliable/valid are K-12 grades in predicting college course success?
Analysis has shown that GPA, which accumulates and combines many indicators of student behavior and performance across disciplines, instructors, and time, is the most reliable and valid predictor of student success in college courses, substantially outperforming other predictors of student performance, including standardized testing. Typically, the next most reliable predictor is the student's grade in the most recent course in the discipline.

If students have not taken an AP course or test, are they included in the cohort?
AP test scores were not available and did not influence inclusion in the cohort.

Why is GPA used?
The goal is to reduce error in placement. The objective is to look across different types of assessments and information, across disciplines, and over time. Cumulative high school GPA measures performance across multiple years, courses, teachers, and variables, essentially triangulating on a reliable and accurate understanding of a student's overall capacity to be successful in college.

Were the high school GPAs used in the models weighted (i.e., for inclusion of AP courses and/or AP test scores) or unweighted?
The GPAs were unweighted.

How were the GPA breaks for the models determined?
The MMAP research team used decision tree analysis to create the break points. This form of statistical analysis can incorporate a wide variety of inputs (predictor variables) to create an easy-to-read set of rules that partition the student population into those more likely and those less likely to succeed at any given level of the English and math sequences. Decision trees use the most powerful predictor to make the initial split into those more likely versus less likely to succeed, based on what is known about their high school performance. More often than not, the most powerful predictor was cumulative high school GPA (with physical education/gym/athletics grades removed).

Another important factor that affects the value of the GPA breaks is the value of the success criterion. For the statewide analysis, the MMAP research team set the criterion variable at 2.2 grade points, or about a C+. That is, in order for a decision rule to be included as a rule for placement at a given level, the group of students to whom that rule applies must have an average of 2.2 grade points in the destination course. Initial models were run with a success criterion of 2.0 (a straight C). The team decided to increase the success criterion to 2.2 grade points in order to increase the face validity of the rule sets and to ensure that the rule sets were achieving the desired effect of identifying only students who would be highly likely to succeed in the target course. So, in general, a lower success criterion is associated with a lower GPA cut-off, while a higher success criterion is associated with a higher GPA cut-off.
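To make the decision tree procedure and the success criterion concrete, here is a minimal sketch assuming a hypothetical local data set. The file name, column names, and the use of scikit-learn are illustrative assumptions, not the MMAP team's actual code; the 2.2 grade point check simply mirrors the success criterion described above.

```python
# Minimal, illustrative sketch (not the MMAP team's code): fit a shallow
# decision tree to find a GPA break point, then check the resulting rule
# against the 2.2 grade point success criterion described above.
# 'local_cohort.csv' and its column names are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("local_cohort.csv")   # columns assumed: hs_gpa_no_pe, dest_course_gp
df["success"] = (df["dest_course_gp"] >= 2.0).astype(int)  # C-or-better style indicator

X = df[["hs_gpa_no_pe"]]               # a single predictor keeps the splits easy to read
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=200, random_state=0)
tree.fit(X, df["success"])

# The root split threshold is the candidate GPA break point.
gpa_break = tree.tree_.threshold[0]
above = df[df["hs_gpa_no_pe"] > gpa_break]

# Success criterion: students captured by the rule should average 2.2 or more
# grade points in the destination course before the rule is adopted.
print(f"Candidate break: HS GPA > {gpa_break:.2f}")
print(f"Mean destination-course grade points for that group: {above['dest_course_gp'].mean():.2f}")
print("Rule meets the 2.2 criterion:", above["dest_course_gp"].mean() >= 2.2)
```

A statewide model uses many more predictors (most recent course in the discipline, course grades, test scores, etc.), but the logic of reading off the split thresholds and then checking the destination-course performance of the students each rule captures is the same.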

How will colleges collect noncognitive variables to include in the assessment process?
The CAI Steering Committee is developing a standard assessment platform that will include noncognitive measures to the extent that valid and reliable noncognitive variables can be identified and assessed. Additional information collected in CCCApply will also be examined for its utility in assessment and placement. As they become available, these measures will be included in the data warehouse and will feed into subsequent MMAP models. Some MMAP Pilot Colleges are gathering data on a number of noncognitive variables from incoming fall 2015 students. These data sets will be matched with students' fall 2015 performance, and the results will be evaluated to determine the relative utility of these scales in predicting performance in community college English and math and whether these scales can improve models built on high school performance information. These analyses will be completed in winter 2016.

In the transfer-level English model, why is completion of the AP course with a grade of C+ or better included, as opposed to the score on the AP test?
AP test scores were not available, only the course grades. Test scores would be a welcome addition and, if available locally, could and should be included in a pilot model to determine efficacy. We have had some conversations with the vendor and the CDE, but it will take some time to sort out the logistics.

In the English models, how is reading treated? Do the MMAP models for English take students' reading level into account, or are they based primarily on composition?
The English rule sets are primarily focused on composition. They should track closely with reading; both reading and ESL are part of the phase two assessment we are just now starting.

What are the requirements/criteria for a placement approach to be considered multiple measures?
Title 5 Section 55522(a): The Chancellor shall establish and update at least annually, a list of approved assessment tests for use in placing students in English, mathematics, or English as a Second Language (ESL) courses and guidelines for their use by community college districts. When using an English, mathematics, or ESL assessment for placement, it must be used with one or more other measures to comprise multiple measures.

Title 5 Section 55502(i): Multiple measures are a required component of a district's assessment system and refer to the use of more than one assessment measure in order to assess the student. Other measures that may comprise multiple measures include, but are not limited to, interviews, holistic scoring processes, attitude surveys, vocational or career aptitude and interest inventories, high school or college transcripts, specialized certificates or licenses, education and employment histories, and military training and experience.

What percentage of students in the MMAP pilot colleges have missing information?
Availability of high school transcript data depends upon the participation of K-12 school districts, particularly those that feed a majority of the students into your college. Participation is accomplished through each institution signing a Cal-PASS Plus data sharing agreement and uploading student-level data into the system annually. Currently, 60 percent of all K-12 districts have an agreement, which represents approximately 70 percent of all students in terms of enrollment. However, many districts are inconsistent in their timely uploads of data. It is possible to quickly recruit data from feeder high schools, particularly when the college can make a case for how multiple measures can fundamentally change the success trajectories of a large number of their K-12 graduates at the community college.

It is not currently known how many incoming students have no high school transcript information available at all. This question will be examined as pilot colleges begin to go live with the MMAP rule sets in fall 2015. It is believed that the vast majority of recent high school students have valid data for the major high school variables included in the analyses. The initial MMAP research data files included approximately 380,000 students (each) with matched community college and K-12 performance data. Where students are missing data for one or more grade levels, data from the remaining grade levels are used to represent overall high school performance.

Missing information can negatively impact a student's placement via multiple measures, however. For example, if a student received a grade of B in an 11th grade high school pre-calculus class, that information may help the student qualify for a higher placement level. However, if that student's 11th grade transcript information is missing, the student would not be able to take advantage of any rule sets predicated on performance in a high school pre-calculus class.

For the MMAP English data set, approximately 22% of all students had data for all four years; 26% had data for three years; 27% had data for two years; and 25% had data for just one year. A similar pattern was observed in the MMAP math data set: approximately 24% of all students had data for all four years; 27% had data for three years; 28% had data for two years; and 21% had data for just one year. In addition to the MIS and CALPADS data, approximately 35% of students have English Accuplacer scores and 29% have math Accuplacer scores.
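Colleges that want to produce the same kind of breakdown locally could do so with a few lines of code. Below is a minimal sketch, assuming a hypothetical transcript extract with one row per student per high school grade level; the file and column names are illustrative placeholders, not a Cal-PASS Plus schema.

```python
# Illustrative sketch: summarize how many years of high school transcript data
# are available per student, mirroring the statewide percentages quoted above.
# 'hs_transcripts.csv' and its columns (student_id, grade_level) are hypothetical.
import pandas as pd

transcripts = pd.read_csv("hs_transcripts.csv")

# Count the distinct grade levels (9-12) observed for each student.
years_per_student = transcripts.groupby("student_id")["grade_level"].nunique()

# Share of students with 1, 2, 3, or 4 years of data.
distribution = years_per_student.value_counts(normalize=True).sort_index()
print((distribution * 100).round(1).astype(str) + "%")
```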

Who can colleges contact to get additional information about feeder high school data?
To view a list of the participating K-12 data available for each school in a region, visit the Cal-PASS Plus webpage: https://www.calpassplus.org/calpass/join/members#. Or contact Cal-PASS Plus directly at (916) 498-8980 and ask for the outreach team.

Why aren't we using demographics and other student information (e.g., zip code)?
Some of this information cannot be used under policy or law. Additionally, there is no strong evidence that the predictive value of grades differs meaningfully across high schools. The motivational or intrinsic facet of being an A student is predictive at any school. The scenario of a B student from a good school being better prepared than, or equal to, an A student from a bad school does not bear out in the research. Being an A student at any school is strongly predictive of student success at the community college level.

Are disaggregated data available (e.g., by ethnicity)? Have you looked at the impact on successful completion rates for underrepresented minority students in particular?
Yes, we can examine many subgroups. An additional group we are examining in phase two is students with identified disabilities.

What is the recency factor? I know you created separate models for students who matriculate directly from high school versus those who stopped out for a semester or more, but how does the amount of time away from school factor into the models?
A variable for the number of terms between high school and college was included in the model. Specifically, we used the number of primary terms between the last English or math class in high school and the first attempt at English or math in college, respectively. The reason for separate models for 11th versus 12th grade data is more about data availability than recency, as students going directly from 12th grade into college would only have 11th grade data available in time for placement.

Pilot Logistics

How can results from the MMAP research be implemented at the local level?
Pilot implementation is determined by the pilot colleges. Since colleges maintain local control over multiple measures and cut scores, each college will need to come to a consensus on its own. However, the research will provide information for discussion and experimentation around multiple measures assessment.

What are some methods for how multiple measures assessment and placement might be implemented?
The MMAP research team recommends that colleges use a disjunctive approach to combining traditional assessment tests and multiple measures. A disjunctive approach allows for an either/or use of the test information and the multiple measures information. Students are tentatively placed using a standardized test; the same students are also assessed under an independent rule set derived from the multiple measures research. Students are then given the higher placement of the two methods, or are given the opportunity to choose their placement. This approach is very good at increasing the overall accuracy of placement and is particularly good at reducing under-placement error, where students are placed at levels lower than those at which they are prepared to succeed.

An alternative approach that is sometimes employed is the compensatory or blended approach, where the test and the multiple measures are combined to produce a single placement for each student. The way the methods are blended can vary: the two methods can be weighted and combined, one method can be used in a supplementary way to adjust the placements of the other, or one method can be used in an advisory way to help inform the student and college faculty and staff so that the placement can be adjusted. Because the two data sources (i.e., test and multiple measures) are constrained to the same function, this approach tends to result in higher levels of under-placement than the disjunctive approach.
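As a concrete illustration of the disjunctive approach, here is a minimal sketch. The level names, their ordering, and the example inputs are hypothetical; the point is simply that the student receives whichever of the test-based and multiple-measures-based placements is higher.

```python
# Illustrative sketch of disjunctive placement: take the higher of the
# test-based placement and the multiple-measures (decision rule) placement.
# Level names and ordering are hypothetical examples, not a statewide standard.
MATH_LEVELS = ["arithmetic", "pre_algebra", "elementary_algebra",
               "intermediate_algebra", "transfer_level"]

def disjunctive_placement(test_level: str, mm_level: str) -> str:
    """Return whichever placement is higher in the course sequence."""
    return max(test_level, mm_level, key=MATH_LEVELS.index)

# Example: the test recommends intermediate algebra, but the multiple
# measures rule set qualifies the student for transfer-level math.
print(disjunctive_placement("intermediate_algebra", "transfer_level"))  # transfer_level
```

A compensatory approach would instead combine the two signals into a single score before assigning a level, which is why it cannot lift a student above what the blended score allows and tends to leave more students under-placed.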

What are some specific examples of how findings from MMAP research can be implemented at the local level?
Bakersfield College, Sierra College, Rio Hondo College, and Long Beach City College (among others) have all implemented multiple measures assessment and placement research at their institutions in a variety of ways. See the Bakersfield College Multiple Measures resource on the RP Group's website.

What are some specific examples of how MMAP can be validated at the local level?
Local replication of MMAP research can be conducted through procedures similar to those followed by the colleges that participated in the Student Transcript Enhanced Placement Study (STEPS). For a summary, please see the Student Transcript-Enhanced Placement Study project on the RP Group website.

What is expected of the pilot colleges?
At the very least, pilot colleges should be working to develop the internal capacity for collecting and reviewing data, with the goal of assessing the potential impact of using multiple measures in assessment and placement. Colleges may either directly use the statewide multiple measures rule sets as a placement tool for a pilot cohort of students or, instead, use the statewide rule sets to foster discussions on their campus and work on developing their own locally validated multiple measure rule set based on the approach developed by the MMAP research team. Pilot colleges are expected to engage in dialogue with other pilot colleges, Cal-PASS Plus, and the Common Assessment Initiative about the issues and opportunities created by the piloting process.

To facilitate their local work, pilot colleges should download a retrospective data set from Cal-PASS Plus and follow the MMAP research team's guidelines on how to develop decision trees and estimate the impact of applying statewide and/or local decision rule sets locally (a simple impact-estimation sketch follows this answer). Staff from MM pilot colleges should also plan to attend and discuss the series of webinars on multiple measures offered by the MMAP research team. These webinars are archived on the MMAP pilot college resource page.

Pilot colleges that are interested in piloting the statewide rule set in fall 2015 should follow the directions in the March 23, 2015 webinar, which provides more detail on ways that MM pilot colleges can be engaged in pilot activities; visit the How to Implement Predictive Multiple Measures Models webinar on the RP Group website. Additionally, some pilot colleges are collecting data on a set of social-psychological scales to assess the potential predictive power of traits like grit, conscientiousness, mindset, academic self-efficacy, college identity, etc. Those who are interested in piloting social-psychological scales should contact Craig Hayward (chayward@rpgroup.org).

The intention is for the work of the pilot colleges to inform the ongoing effort to integrate a valid multiple measures component into the common assessment system. As part of this system, all colleges will have access to multiple measures data through an online tool hosted by Cal-PASS Plus that will be integrated into the Common Assessment Initiative platform.
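Here is a minimal sketch of the kind of impact estimate a pilot college might run on a retrospective cohort. The GPA threshold (2.6), file name, and column names are hypothetical placeholders used for illustration, not the published statewide rule set.

```python
# Illustrative sketch: estimate the local impact of applying a statewide-style
# multiple measures rule to a retrospective cohort. The GPA threshold, file
# name, and column names below are hypothetical placeholders.
import pandas as pd

cohort = pd.read_csv("retrospective_cohort.csv")
# Columns assumed: hs_gpa, test_placed_transfer (bool),
# attempted_transfer (bool), transfer_course_grade_points (NaN if not attempted)

rule_eligible = cohort["hs_gpa"] >= 2.6                  # hypothetical GPA rule
bumped_up = rule_eligible & ~cohort["test_placed_transfer"]
print(f"Share of cohort the rule would move up to transfer level: {bumped_up.mean():.1%}")

# Among rule-eligible students who did attempt the transfer-level course,
# check performance against the 2.2 grade point success criterion.
attempted = cohort[rule_eligible & cohort["attempted_transfer"]]
mean_gp = attempted["transfer_course_grade_points"].mean()
print(f"Mean transfer-level grade points for rule-eligible students: {mean_gp:.2f} (criterion: 2.2)")
```

Results like these give a campus a concrete estimate of how many students a rule set would affect and how those students have historically performed, which is the core of the local validation discussion.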

What is the timeline for MMAP implementation for the MMAP pilot colleges?
Starting in late fall 2014/early spring 2015, MMAP pilot colleges are expected to begin working meaningfully toward being prepared to collect and analyze multiple measures data for students enrolling in fall 2015. Noncognitive variables collected in fall 2015 will be matched to performance in relevant English and math classes in December 2015/January 2016 so that the NCV analysis can proceed. Colleges that choose to implement MM in spring 2016 will be included in the analysis that will take place in summer 2016. Any colleges that would like to join after the spring implementation can still use the available resources and receive support, but their data will not be included in the statewide analysis.

What type of support will be provided to pilot colleges for implementing a multiple measures approach?
The MMAP team has developed a rule set to be used for multiple measures assessment. The research team has hosted multiple training webinars to guide colleges through the process; all webinars are archived on the RP Group website. Cal-PASS Plus is providing a data match to colleges that would like to upload a cohort of students and receive back their transcript data for analysis. The RP Group and Cal-PASS Plus staff are available for one-on-one coaching upon request.

Who will be responsible for conducting validation studies at the college?
The individuals responsible for validation studies at each college will vary depending on the type of validation being established. In most cases, faculty in the target content areas will need to be involved in the process, along with individuals in the college's institutional research office, who can provide support in collecting, analyzing, and interpreting test and course outcome data.

What is the impact on the sequence of math courses for students who are placed with the multiple measures model? (Equity impact)

The benefit is that students get to and through the course sequence earlier, which dramatically improves success and completion rates. Multiple measures also work toward closing the achievement gap and addressing equity on campuses.

Is this impacting course offerings? Or cutting remediation courses?
There is an impact on enrollment management, and there are also professional development considerations. Systems will need to be in place to support instructors, whether they are changing course levels or addressing a different dynamic within transfer-level courses. There will be challenges with the student populations in classes. With the multiple measures placement model, classroom populations will typically be more grade/course appropriate (i.e., students who would previously have been under-placed, and who often helped carry student engagement in a classroom, will now be in higher-level courses). This will be more challenging for instructors. There is also the potential to explore ideas like shifting resources from basic skills programs to transfer-level courses. Many co-requisite models are being explored across the segment (both in California and in other states).

What is the messaging that needs to happen on campuses to garner support for the multiple measures process?
It is the responsibility of individual institutions to determine how each multiple measures project will be implemented; as such, institutions will devise their own messaging. Additionally, institutions will need to consider their commitments to professional development and the allocation of resources to placement implementation. There is equity money as well as SSSP funding that might be allocated to multiple measures projects.