Different Assessments, Different Results: A Cautionary Note When Interpreting State Test Results


Issue Brief 2015: A Cautionary Note When Interpreting State Test Results
Krista D. Mattern, PhD, and Catherine Lacina, BS

A recent report by Achieve highlighted the fact that National Assessment of Educational Progress (NAEP) results and state test results often lead to incongruent findings. 1 Compared to NAEP, state tests often conclude that a much higher percentage of students are proficient in mathematics and reading. In fact, more than half of states reported proficiency rates that were more than 30 percentage points higher than their NAEP state results, despite the fact that both assessments ostensibly represented the same population. Both NAEP and state tests intend to gauge the educational progress of all students in the state, so the conclusions drawn from them should be congruent. The differences can be attributed to different content on NAEP and state tests or to different cut scores used to identify proficiency. Since both NAEP and the state tests measure proficiency in reading and mathematics, it appears that many states have significantly lower proficiency standards than NAEP.

NAEP results and state scores on the ACT test also frequently conflict. As with state tests, the discrepancy between NAEP and the ACT could arise from different content or different proficiency standards across the two assessments. As for content, both NAEP and the ACT assess reading and mathematics, so content seems unlikely to be the main source of the discrepancy. Varying proficiency standards can more plausibly explain the discrepancy, because different methodologies were applied to set cut scores on the ACT and NAEP. For instance, the ACT College Readiness Benchmarks derived cut scores by identifying the ACT score associated with a 50% chance of earning a B or higher grade in typical, relevant, credit-bearing first-year college courses.
2 In contrast, the three NAEP achievement levels (basic, proficient, and advanced) were set by conducting standard-setting studies. 3 A third possible explanation for the discrepancy is differences between the students who take NAEP and those who take the ACT. We argue this explanation is most likely. Students who take the ACT often reflect a self-selected group, whereas NAEP test takers are intended to be representative of all students in a state.

Acknowledgements
The authors would like to thank Jeff Allen, Kurt Burkum, and Wayne Camara for their feedback and helpful suggestions on earlier versions of this essay. Krista Mattern is a director in Statistical and Applied Research, specializing in the validity and fairness of assessment scores as well as broader issues in higher education such as enrollment, persistence, and graduation. Catherine Lacina is a specialist in Statistical and Applied Research, specializing in data collection and analysis.
www.act.org/research-policy. Email research.policy@act.org for more information.
© 2015 by ACT, Inc. ACT, ACT Aspire, ACT Explore, and ACT Plan are registered trademarks of ACT, Inc.

That is, state results for the ACT often do not reflect the population of students in that state but rather the

sample of students who choose to sit for the test. Students who choose to take the ACT are often bound for college and thus more academically prepared than other students in the state, which biases the state's overall results upward. However, states that have adopted statewide administration of the ACT do not have this upward bias, because all students in the state take the test regardless of their educational aspirations and prior academic performance. Therefore, to determine whether differences between NAEP and the ACT arise because different students take each assessment, we can examine whether NAEP and ACT results agree more closely for states with census testing than for states with lower ACT participation rates. Using data reported in the recent Achieve report and data on the most recent cohort of ACT-tested high school graduates, we test this hypothesis in this study.

In figure 1, we compare 2013 NAEP grade 8 Mathematics proficiency rates to the percentage of students who met the ACT College Readiness Benchmark in mathematics in 2014, by ACT state participation rate. 4 Among states that have adopted the ACT statewide, results closely mirror the NAEP results. Specifically, differences in the percentage of students considered ready in mathematics across the two assessments ranged from -4 to +6 percentage points, with a mean difference of 1 percentage point among states that administered the ACT statewide. Across all states, differences ranged from -5 to +35 percentage points. Negative values indicate a higher pass rate for NAEP; positive values indicate a higher pass rate for the ACT. For example, in Colorado, a state that administers the ACT statewide, 42% of the 2013 eighth graders scored proficient or above on NAEP Mathematics, and 39% of the 2014 high school graduating class met the ACT College Readiness Benchmark in mathematics, a difference of -3 percentage points.
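The comparison described above reduces to a simple per-state calculation. A minimal sketch follows; the numbers are the Colorado and New York figures quoted in this brief, and the two-state grouping is illustrative rather than the full 50-state dataset used in the figures.

```python
# Sketch of the ACT-vs-NAEP comparison described above.
# Data are the Colorado and New York figures quoted in the brief;
# the actual analysis uses all states (see the appendix table).

def act_naep_gap(act_pct: int, naep_pct: int) -> int:
    """ACT benchmark rate minus NAEP proficiency rate.
    Positive: the ACT reports more students ready than NAEP does;
    negative: fewer. Census-tested states should sit near zero."""
    return act_pct - naep_pct

# state: (ACT math benchmark met %, NAEP math proficient %, ACT participation %)
states = {
    "CO": (39, 42, 100),  # statewide (census) ACT testing
    "NY": (67, 32, 27),   # self-selected ACT takers
}

gaps = {name: act_naep_gap(act, naep) for name, (act, naep, _) in states.items()}
print(gaps)  # {'CO': -3, 'NY': 35}
```

The sign convention matches the figures: Colorado's small negative gap reflects agreement under census testing, while New York's +35 reflects a self-selected, college-bound testing pool.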
These results suggest that NAEP and the ACT have fairly comparable mathematics proficiency standards: when the results are based on comparable groups (i.e., statewide testing), similar conclusions are reached. For states with fairly small ACT participation rates, the ACT and NAEP results diverge, with higher pass rates for the ACT than for NAEP. This can clearly be seen for New York, Connecticut, California, Delaware, Maine, and Washington, DC. In New York, where only 27% of the 2014 graduating cohort took the ACT, only 32% of the 2013 eighth graders scored at proficient or above on NAEP Mathematics while 67% met the ACT College Readiness Benchmark in mathematics, a difference of 35 percentage points. This is in alignment with the selection bias explanation described above. These findings highlight the care that should be taken when interpreting ACT state scores and comparing them across states when the data are based on a fraction of the population and the tested populations likely differ.

Similar results were found when comparing reading performance on the ACT and NAEP (figure 2). For states administering the ACT statewide, ACT and NAEP results lead to similar conclusions. In particular, differences in the percentage of students considered proficient in reading across the two assessments ranged from -3 to +11 percentage points, with a mean difference of 4 percentage points among states that administered the ACT statewide. Across all states, differences ranged from -3 to +30 percentage points.
Again focusing on Colorado, we find that 40% of the 2013 eighth graders scored proficient or above on NAEP Reading, and 43% of the 2014 high school graduating class met the ACT College Readiness Benchmark in reading, a difference of 3 percentage points. On the other hand, in New York we find that only 35% of 2013 eighth graders scored proficient or above on NAEP Reading while 59% met the ACT College Readiness Benchmark in reading, a difference of 24 percentage points.

Figure 1. Consistency between ACT and NAEP Mathematics Performance by ACT Participation

These results are based on different cohorts of students at different points in their educational careers. Though state performance data tend to be stable over time, a more stringent test of the ACT-NAEP comparison would examine results based on the same set of students in the same grade. Ideally, we would compare readiness rates for the 2013 ACT-tested high school graduates with proficiency rates for the 2013 NAEP twelfth graders; however, twelfth-grade NAEP results are available for only thirteen states. Moreover, there are concerns about lower participation rates and motivation on NAEP at grade 12. 5 To examine agreement rates for the same cohort of students, we therefore conducted the same analyses using the 2013 ACT-tested high school graduates and the 2009 NAEP eighth-grade results. A similar pattern emerged. Among states with ACT statewide adoption, we find very similar readiness/proficiency rates, with differences ranging from -3 to +9 percentage points (M = 2 percentage points) for mathematics and -2 to +13 percentage points (M = 7 percentage points) for reading. This is not a direct comparison: some eighth-grade students included in the NAEP results would have dropped out, stopped out, or migrated out of the state during high school. Moreover, higher-achieving students are less likely to drop out of high school. 6 For this reason, we would expect higher readiness rates on the ACT than on NAEP if all else were equal.

On the other hand, comparing ACT and state assessment results by ACT participation rate showed no clear pattern. Negative values shown in figures 3 and 4 indicate that the state-reported results suggest a higher percentage of students are ready/proficient than the ACT results suggest.
In mathematics, the ACT readiness rates were 12 percentage points lower than the state-reported results, on average, with differences between the ACT and the state assessments ranging from -46 to +45 percentage points. For example, the Mississippi state test results indicated that 67% of its 2013 eighth graders were prepared in mathematics; however, only 21% of the 2014 high school graduating class in Mississippi met the ACT College Readiness Benchmark in mathematics.

Figure 2. Consistency between ACT and NAEP Reading Performance by ACT Participation

Figure 3. Consistency between ACT and State Test Mathematics Performance by ACT Participation

There is much more variability in the ACT-state test comparisons than in the ACT-NAEP

comparisons, even among ACT census-tested states. One exception is Utah, where state test results aligned with both ACT and NAEP results. 7

Figure 4 shows the consistency between state reading results based on the ACT versus state tests. Similar to figure 3, we find that state tests often conclude that a higher percentage of students are prepared (i.e., negative values) than ACT results suggest. Specifically, the ACT readiness rates were 20 percentage points lower than the state-reported results, on average, with differences between the ACT and the state assessments ranging from -54 to +26 percentage points. For example, the Georgia state test results indicated that 98% of its 2013 eighth graders were prepared in reading, but only 44% of the 2014 high school graduating class met the ACT College Readiness Benchmark in reading. This corroborates the findings of the Achieve report, in which the Georgia state test results showed the largest discrepancy with NAEP. 8

Conclusion

It is a national priority to have all high school students graduate ready for college or workforce training. Recognizing the disconnect between high school requirements and the academic demands of college-level work, ACT released the College Readiness Standards in 1997, which identified the skills required for success in entry-level postsecondary courses and described the skills associated with specific score ranges across its assessments (ACT Explore, ACT Plan, and the ACT). 9 In 2005, ACT published cut scores, the ACT College Readiness Benchmarks, which identified the minimum scores on the ACT required for college readiness in grades 8 through 12.
10 More recently, ACT Aspire, launched in 2014, was developed to assess students' mastery of mathematics, English language arts, and science in grades 3 through 10, allowing for even earlier monitoring of students' academic strengths and weaknesses on an annual basis and providing feedback at the individual student level. These efforts gained national recognition with policymakers, educational organizations, and education reformers, who had long argued that low standards and minimum-competency testing disguised large inequities across schools and states and led to complacency among parents and students who assumed that proficiency ratings on state tests were indicators of readiness to proceed to the next level. However, it was becoming increasingly evident that receiving a high school diploma and passing a state test did not ensure preparation for these postsecondary experiences. In fact, national statistics showed just the opposite: many high school graduates were in need of remediation in college. 11

Figure 4. Consistency between ACT and State Test Reading Performance by ACT Participation

By setting different standards across states and assessments, we run the risk of sending mixed messages to our students. Moreover, depending on which feedback a student receives or attends to, we facilitate a scenario in which students continue to graduate from high school unknowingly ill-prepared for postsecondary endeavors. The Achieve report and the results presented here illustrate this problem.
College readiness benchmarks that are empirically linked to important postsecondary outcomes provide useful information to students, teachers, and administrators for gauging students' academic strengths and weaknesses, as well as an opportunity to remedy students' skill deficiencies and best position them for future success. As states transition to proficiency standards tied to college and career readiness, their proficiency rates are likely to drop; communication plans can be developed to explain why the new proficiency standards are needed and the likely impact the standards will have on proficiency rates. 12

Appendix

Table A1. State-Specific Mathematics and Reading Performance on the ACT, NAEP, and State-Reported Tests

State | Graduates tested, 2014 (%) | Met ACT Reading Benchmark, 2014 (%) | Met ACT Mathematics Benchmark, 2014 (%) | NAEP 8th-grade Mathematics, 2013 (%) | State-reported 8th-grade Mathematics, 2013-14 (%) | NAEP 8th-grade Reading, 2013 (%) | State-reported 8th-grade Reading, 2013-14 (%)
Colorado | 100 | 43 | 39 | 42 | 51 | 40 | 66
Illinois | 100 | 41 | 41 | 36 | 60 | 36 | 57
Kentucky | 100 | 37 | 31 | 30 | 45 | 38 | 52
Louisiana | 100 | 32 | 27 | 21 | 64 | 24 | 66
Michigan | 100 | 36 | 35 | 30 | 35 | 33 | 73
Mississippi | 100 | 31 | 21 | 21 | 67 | 20 | 67
Montana | 100 | 44 | 39 | 40 | NA | 40 | NA
North Carolina | 100 | 30 | 33 | 36 | 41 | 33 | 53
North Dakota | 100 | 42 | 41 | 41 | 66 | 34 | 74
Tennessee | 100 | 37 | 30 | 28 | 47 | 33 | 48
Utah | 100 | 43 | 39 | 36 | 38 | 39 | 42
Wyoming | 100 | 40 | 34 | 38 | 50 | 38 | 58
Arkansas | 93 | 41 | 35 | 28 | 64 | 30 | 77
Hawaii | 90 | 26 | 27 | 32 | 60 | 28 | 71
Nebraska | 86 | 48 | 45 | 36 | 67 | 37 | 79
Florida | 81 | 38 | 33 | 31 | 48 | 33 | 57
Alabama | 80 | 43 | 31 | 20 | 29 | 25 | 68
South Dakota | 78 | 51 | 52 | 38 | NA | 36 | NA
Minnesota | 76 | 56 | 61 | 47 | 59 | 41 | 57
Missouri | 76 | 51 | 45 | 33 | 52 | 36 | 51
Kansas | 75 | 51 | 50 | 40 | NA | 36 | NA
Oklahoma | 75 | 45 | 35 | 25 | 54 | 29 | 71
Wisconsin | 73 | 51 | 54 | 40 | 46 | 36 | 33
Ohio | 72 | 52 | 50 | 40 | 80 | 39 | 87
New Mexico | 69 | 37 | 33 | 23 | 40 | 22 | 59
Iowa | 68 | 52 | 48 | 36 | 75 | 37 | 75
West Virginia | 65 | 45 | 31 | 24 | 39 | 25 | 68
South Carolina | 58 | 41 | 39 | 31 | 70 | 29 | 68
National | 57 | 44 | 43 | 35 | NA | 36 | NA
Arizona | 55 | 37 | 37 | 31 | 59 | 28 | 70
Georgia | 53 | 44 | 38 | 29 | 82 | 32 | 98
Idaho | 45 | 55 | 53 | 36 | NA | 38 | NA
Indiana | 40 | 51 | 52 | 38 | 81 | 35 | 66
Texas | 40 | 42 | 47 | 38 | 86 | 31 | 90
Alaska | 37 | 48 | 45 | 33 | 68 | 31 | 84
District of Columbia | 37 | 47 | 47 | 19 | 65 | 17 | 54
Nevada | 36 | 47 | 46 | 28 | 38 | 30 | 52
Oregon | 36 | 49 | 47 | 34 | 61 | 37 | 66
California | 29 | 51 | 57 | 28 | NA | 29 | NA
Connecticut | 29 | 65 | 69 | 37 | NA | 45 | NA
Vermont | 29 | 58 | 60 | 47 | NA | 45 | NA
Virginia | 28 | 58 | 57 | 38 | 68 | 36 | 70
New York | 27 | 59 | 67 | 32 | 22 | 35 | 33
New Jersey | 25 | 57 | 64 | 49 | 72 | 46 | 80
Massachusetts | 23 | 65 | 72 | 55 | 51 | 48 | 79
Maryland | 22 | 54 | 55 | 37 | 59 | 42 | 77
Washington | 22 | 58 | 62 | 42 | 55 | 42 | 71
New Hampshire | 20 | 66 | 69 | 47 | 64 | 44 | 78
Pennsylvania | 19 | 55 | 59 | 42 | 72 | 42 | 79
Delaware | 18 | 61 | 60 | 33 | 68 | 33 | 71
Rhode Island | 16 | 60 | 59 | 36 | 58 | 36 | 74
Maine | 9 | 61 | 65 | 40 | 56 | 38 | 71

Note: State proficiency data were not available (NA) for seven states. See the Achieve (2015) report for more details.
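Table A1 makes the paper's central claim directly checkable: as ACT participation rises, the ACT-minus-NAEP gap shrinks toward zero. A minimal sketch of that check follows, using a hand-picked subset of rows from the table (not the full dataset the figures are based on).

```python
from math import sqrt

# A few (participation %, ACT math benchmark %, NAEP math proficient %) rows
# from Table A1. Subset chosen for illustration only.
rows = {
    "CO": (100, 39, 42),
    "MS": (100, 21, 21),
    "NY": (27, 67, 32),
    "MA": (23, 72, 55),
    "ME": (9, 65, 40),
}

participation = [p for p, _, _ in rows.values()]
gap = [act - naep for _, act, naep in rows.values()]  # ACT minus NAEP

def pearson_r(x, y):
    """Plain Pearson correlation, no external dependencies."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(participation, gap)
# r is strongly negative on this subset: low-participation states
# overstate readiness when judged by ACT results alone.
```

On this subset the census-tested states (CO, MS) sit at gaps of -3 and 0, while the low-participation states show gaps of +17 to +35, which is the selection-bias pattern the brief describes.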

Notes

1 Achieve, Proficient versus Prepared: Disparities between State Tests and the National Assessment of Educational Progress (NAEP) (Washington, DC: Achieve, 2015), http://www.achieve.org/files/naepbrieffinal51415.pdf.

2 Jeff Allen and Jim Sconing, Using ACT Assessment Scores to Set Benchmarks for College Readiness, ACT Research Report No. 2005-3 (Iowa City, IA: ACT, 2005), http://www.act.org/research/researchers/reports/pdf/act_rr2005-3.pdf; Jeff Allen, Updating the ACT College Readiness Benchmarks, ACT Research Report 2013-6 (Iowa City, IA: ACT, 2013), http://www.act.org/research/researchers/reports/pdf/act_rr2013-6.pdf.

3 To set cut scores, the modified Angoff method was used. The three levels were defined as follows. Basic: this level, below proficient, denotes partial mastery of the knowledge and skills that are fundamental for proficient work at each grade (4, 8, and 12). Proficient: this central level represents solid academic performance for each grade tested (4, 8, and 12); students reaching this level have demonstrated competency over challenging subject matter and are well prepared for the next level of schooling. Advanced: this higher level signifies superior performance beyond proficient grade-level mastery at grades 4, 8, and 12. Nancy L. Allen, James E. Carlson, and Christine A. Zelenak, The NAEP 1996 Technical Report, NCES 1999-452 (Washington, DC: National Center for Education Statistics, 1999): 741, http://nces.ed.gov/nationsreportcard/pdf/main1996/1999452.pdf.

4 NAEP state assessments are administered from January through March of the academic year. Therefore, the 2013 NAEP results correspond to the 2012-13 academic year.

5 National Center for Education Statistics, Grade 12 State Program (Washington, DC: National Center for Education Statistics, 2013), https://nces.ed.gov/nationsreportcard/pdf/about/schools/grade12_stateprogramfactsheet.pdf.
6 Richard Buddin, Implications of Education Attainment Trends for Labor Market Outcomes, ACT Research Report 2012-7 (Iowa City, IA: ACT, 2012), http://www.act.org/research/researchers/reports/pdf/act_rr2012-7.pdf.

7 Achieve, Proficient versus Prepared, 2015.

8 Achieve, Proficient versus Prepared, 2015.

9 ACT, The ACT Technical Manual (Iowa City, IA: ACT, 2014).

10 Allen and Sconing, Using ACT Assessment Scores, 2005. The ACT College Readiness Benchmarks were updated in 2013 based on more current data. The new analyses revealed no change in the English and Mathematics Benchmarks of 18 and 22, respectively. The Reading Benchmark increased from 21 to 22, and the Science Benchmark decreased from 24 to 23 (Allen, Updating the ACT College Readiness Benchmarks, 2013).

11 Susan Aud, William Hussar, Grace Kena, Kevin Bianco, Lauren Frohlich, Jana Kemp, and Kim Tahan, The Condition of Education 2011, NCES 2011-033 (Washington, DC: National Center for Education Statistics, 2011), http://files.eric.ed.gov/fulltext/ed521.pdf.

12 Michelle Croft, Gretchen Guffy, and Dan Vitale, Communicating College and Career Readiness through Proficiency Standards (Iowa City, IA: ACT, 2014), http://www.act.org/research/policymakers/pdf/communicating-ccr-through-proficiency-standards.pdf.