THE PENNSYLVANIA STATE UNIVERSITY SCHREYER HONORS COLLEGE DEPARTMENT OF MATHEMATICS ASSESSING THE EFFECTIVENESS OF MULTIPLE CHOICE MATH TESTS


ELIZABETH ANNE SOMERS
Spring 2011

A thesis submitted in partial fulfillment of the requirements for a baccalaureate degree in Mathematics with honors in Mathematics

Reviewed and approved* by the following:

Stanley Smith, Associate Professor, Thesis Supervisor
Mark Levi, Professor, Honors Adviser

* Signatures are on file in the Schreyer Honors College.
Abstract

This study was conducted at Penn State University and focused on the Math 021, College Algebra, final exam results. The goal of this study was to examine the effectiveness and difficulty of multiple choice exam questions. The study looked at group data from the fall 2008 final exam (Schreyer, 2008), and based on this data, five questions were identified to appear on the spring 2010 final. These questions were chosen for their high effectiveness and appropriate difficulty level. The study found that both the effectiveness and the difficulty of these exam questions changed from the fall 2008 to the spring 2010 exam. Two of the five questions increased in difficulty, one decreased in difficulty, and the remaining two showed fairly consistent difficulty levels across the two exam years. The effectiveness of the questions decreased from the fall 2008 exam to the spring 2010 exam: the first three identical questions showed a large decrease in effectiveness, while the second two questions had more consistent effectiveness across exam years (MATH, 2010). The factors that affected the effectiveness and difficulty of the exam questions are discussed. The ALEKS program (ALEKS, 2010) used during the spring 2010 semester is one of these factors, and it appeared to have an effect on the changes in effectiveness and difficulty.
Table of Contents

Introduction
Methods and Materials
    Table 1: Fall 2008 Data
    Table 2: Distribution of items by ITEM EFFECT: Biserial Coefficient
    Table 3: Distribution of items by % Correct
    Figure 1: Questions found on both fall 2008 and spring 2010 exams
Results
    Table 4: Spring 2010 Data
    Table 5: Difficulty and Effectiveness
Discussion
Conclusion
Appendices
    Appendix A: Glossary
    Appendix B: Consent Forms
References
Acknowledgements

I would like to thank all of those who aided me in the process of completing this project. First, I would like to thank my thesis advisor, Dr. Stanley Smith, for all the time and effort he gave. I would also like to thank Mary Erickson, Coordinator of First Year Courses, and all the others in the Math Department who allowed me to complete this study, as well as Crystal Ramsay, Instructional Consultant for the Schreyer Institute for Teaching Excellence, for providing me with integral information for my study. Finally, I would like to thank my family and friends for the support and encouragement they gave me as I was completing this project.
Introduction

The use of multiple choice questions to test knowledge is prevalent in the academic institutions of the United States. Many institutions rely on multiple choice questions to test students and assess their knowledge and learning. Multiple choice questions appear on standardized tests such as the PSSA and the SAT, as well as on classroom tests in primary schools, secondary schools, and universities.

Multiple choice exams fall under the larger category of assessment, and the types and quality of assessment are being examined in the mathematics community. In 1991, teachers participating in a Garet and Mills (1995) study rated the frequency with which they implemented multiple choice tests in their classrooms. On a scale of 1 to 5, where 1 indicated "never" and 5 indicated "very frequently," teachers scored their use at a 2.5. The Thompson article that reports this study pushes the use of alternate assessments but also advocates for the improvement of tests (Thompson, 1997). Tests can be improved by improving the effectiveness of test items and by ensuring that the test items, individually and collectively, have an appropriate difficulty level.

A meeting with Crystal Ramsay, an Instructional Consultant for the Schreyer Institute for Teaching Excellence, led to the awareness of three university websites that address these issues along with other aspects of multiple choice item analysis (C. Ramsay, personal communication, October 7, 2010). A website provided by the University of Texas at Austin supplies information about how to analyze test items, addressing item discrimination and difficulty along with other measures pertinent to the analysis of a multiple choice question. This site also supplies information about the testing process, including how to write test items, how to produce data that reflects the test items, and how to analyze test items (Instructional, 2010).
The University of Wisconsin Oshkosh provides a website that discusses item discrimination and its importance in making judgments about
a test item. This site includes a method of calculating the item discrimination (University, 2005). Finally, Vassar College supports a website that calculates statistical information about a data set, including the item discrimination (Lowry, 2010). Each of these websites includes information about calculating the effectiveness of test items, and the presence of these pages indicates the importance of test effectiveness.

Penn State University is also working to improve question effectiveness on multiple choice math exams; indeed, this study was a part of that process. The purpose of this study was to investigate the effectiveness and difficulty of multiple choice items on the MATH 021, College Algebra, exam. This article will discuss the study that was conducted to generate data and use this data to help make conclusions about the effectiveness and difficulty of test items.
Methods and Materials

This study was conducted using data from exam scores in the College Algebra I class, Math 021, at Penn State University, University Park. The study consisted of two rounds of testing. The first round took place before the study began, and group data was obtained from it (Schreyer, 2008). This data was used to help shape the second round of testing, whose data includes fewer participants but a more in-depth analysis (MATH, 2010).

Data from the fall 2008 Math 021 final exam was obtained from the Schreyer Institute for Teaching Excellence at Penn State University. The sample size of this data set was 653 students, each belonging to one of four subsets. Each subset of students took a different version of the exam: versions A, B, C, and D. There were 162 students who took version A of the exam, and likewise 162 who took version B; version C had 166 participants and version D had 163. Each version of the exam was made up of the same 30 questions, appearing in the same order on all four exams. The variation between versions occurred in the order in which the answers appeared; there was not one particular pattern that characterized how the answer choices varied on each exam (Schreyer, 2010).

The data produced through the Schreyer Institute for Teaching Excellence provides the percent of students who chose each answer choice and the difficulty of each question. A key is provided along with the mean score of each form, and the data results include the item effect and the reliability of the test scores. The group data for the fall 2008 exam is displayed in Table 1 (Schreyer, 2010); the data includes the results for the five problems that appear identically on the spring 2010 exam.
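The difficulty measure used throughout this data, the percent of students answering an item correctly, can be computed directly from the recorded answer choices. The sketch below is illustrative only; the function and variable names are not from the thesis or the Schreyer Institute's software.

```python
def item_difficulty(responses, key):
    """Percent of students choosing the keyed (correct) answer.

    responses: one recorded answer choice per student, e.g. ['a', 'c', 'a', ...]
    key: the correct answer choice for this item.
    """
    n_correct = sum(1 for choice in responses if choice == key)
    return 100.0 * n_correct / len(responses)
```

On this scale a higher percent means an easier item; the ranges in Table 3 translate the percent into a verbal difficulty label.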
Table 1: Fall 2008 Data

Version A (162 students), reliability: …, mean: …
Version B (162 students), reliability: …, mean: …
Version C (166 students), reliability: …, mean: …
Version D (163 students), reliability: …, mean: …

Table 1. Extracted from the item analysis data for five of the 30 questions on the fall 2008 exam (Schreyer, 2008).

The group data from the fall 2008 exam was studied, and questions were chosen to reappear on the spring 2010 exam based on a high item effect across the four versions and a difficulty that, averaged across versions, came to roughly a C. The information provided by the Schreyer Institute for Teaching Excellence includes the ranges of values that indicate ineffective questions and questions with low, medium, and high effectiveness. These ranges appear in the item analysis produced by the Schreyer Institute for Teaching Excellence and can be viewed in Table 2 (Schreyer, 2008).

Table 2: Distribution of items by ITEM EFFECT: Biserial Coefficient

Negative: ineffective
…: low effectiveness
…: medium effectiveness
…: high effectiveness

Table 2. The ranges for item effect as displayed in the item analysis data (Schreyer, 2008).

With these ranges in mind, the cutoff item effect chosen for
this study was 0.40: every version of the exam had to show an item effect of 0.40 or greater. There were 14 questions on the fall 2008 exam that fit this criterion (Schreyer, 2008). Of these 14 questions, five were used verbatim on the spring 2010 exam; these are the questions represented in Table 1. The answer order for corresponding versions was also identical from the fall 2008 exam to the spring 2010 exam. Two of the 14 eligible questions, numbers 28 and 29, involved content that was not being tested in the spring 2010 semester of Math 021. Questions 10 and 12 were also not used on the spring 2010 exam. The remaining eligible questions appeared with some alterations on the spring 2010 final. Questions 3 and 24 had different distractors. In question 6, a negative was factored out of part of the equation being solved. Question 11 appears on the spring 2010 exam, but the answers are not shuffled between versions. Questions 4 and 5 also appeared on the spring 2010 exam, but these were not questions chosen for this study (Penn State, 2010).

The difficulty of each question was also considered; the goal was to create an exam whose overall difficulty percent corresponded to a C. The ranges for difficulty levels can be found in Table 3 (Schreyer, 2008).

Table 3: Distribution of items by % Correct

0-20: very difficult
…: difficult
…: moderately difficult
91-100: easy

Table 3. The ranges for item difficulty as displayed in the item analysis data (Schreyer, 2008).

The remainder of the exam was written by Mary Erickson, Coordinator of First Year Courses at Penn State University. The questions that will be focused on in the data collection and review are the five questions identical to those that appeared on the fall 2008 exam. The spring 2010 exam included questions chosen for their effectiveness and difficulty on the fall 2008 exam. There were 25 questions on this exam, and 4 versions of the exam. The
versions contained the same questions, but with the answer choices in different orders (Penn State, 2010). This exam was administered in May of 2010 to students taking the Penn State course Math 021. Before completing the exam, students were asked to indicate whether their scores could be used for the study. The informed consent form used for this process (Appendix B) was in compliance with the IRB Office at Penn State University (Research, 2010). There were 143 students who indicated that their scores could be used. These scores were extracted from the group data after being stripped of identifying information. There were 34 students who completed version A, 38 who completed version B, 40 who completed version C, and 31 who completed version D (MATH, 2010).

The extracted data, provided by my thesis advisor, Stanley Smith, Associate Professor and Director of Online Instruction, included the answer choice each student made for each question (MATH, 2010). The exams were graded in Excel after an answer key was made for each version. The percent of students who chose each answer choice was determined, along with the difficulty, effectiveness, overall reliability, and mean for each version. The difficulty and mean were calculated using an Excel spreadsheet. The overall reliability was calculated using the Kuder-Richardson formula 20 value (Appendix A); this is the same formula used by testing services to calculate the overall reliability (Schreyer, 2010), and the formula for this calculation was found in Psychometrics: An Introduction by Furr and Bacharach (Furr, 2008). The effectiveness of each question was determined using an online calculator provided by Vassar College (Lowry, 2010). Crystal Ramsay, an instructional consultant for the Schreyer Institute for Teaching Excellence, recommended this calculator and confirmed that Penn State University uses a similar calculation to determine the effectiveness of exam questions.
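The overall reliability calculation named above, the Kuder-Richardson formula 20, can be sketched for a matrix of dichotomous (0/1) item scores. This follows the standard textbook form of KR-20; the names are illustrative and not taken from the thesis or from Furr and Bacharach's notation.

```python
def kr20(scores):
    """Kuder-Richardson formula 20 for dichotomously scored items.

    scores: list of per-student lists of 0/1 item scores.
    KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores)),
    where p_i is the proportion answering item i correctly and q_i = 1 - p_i.
    """
    k = len(scores[0])                               # number of items
    n = len(scores)                                  # number of students
    pq_sum = 0.0
    for item in range(k):
        p = sum(student[item] for student in scores) / n
        pq_sum += p * (1 - p)
    totals = [sum(student) for student in scores]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n   # population variance of totals
    return (k / (k - 1)) * (1 - pq_sum / var)
```

Values near 1 indicate that the items rank students consistently; this is the quantity reported as "Reliability" for each exam version.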
The top 33 percent and bottom 33 percent of scorers are used to calculate the effectiveness of a
question (C. Ramsay, personal communication, October 7, 2010). When defining the top 33 percent and bottom 33 percent of scorers, there were overlapping scores between the top and middle or the middle and bottom groups. The determination was made in Excel: the scores were sorted from smallest to largest, and the top 33 percent were then determined using this order. To make sure that the process of choosing which students' scores to use did not significantly affect the effectiveness of a question, the effectiveness of the five questions studied in greater depth was recalculated. Students with tied scores were each assigned a number, and a TI-83 calculator was used to randomly generate numbers indicating which of the tied scores would and would not be used. The spring 2010 data was compared to the original group data: the difficulty and effectiveness of the five chosen questions were compared to the difficulty and effectiveness of these questions on the fall 2008 exam. The five questions are displayed in Figure 1 (Penn State, 2010).

Figure 1: Questions found on both the fall 2008 and spring 2010 exams

1. Simplify a rational expression in x and y (statement not recoverable from the extraction; the answer choices involve the factors (x + 3) and (x - 2)).
2. Simplify (3^(-2) - 2^(-2))^(-1).
3. Write a radical expression in simplest radical form (statement not recoverable from the extraction).
4. Simplify a quotient of radicals of b with indices 3 and 6.
5. Find the center of the circle x^2 + y^2 - 4x + 6y + 1 = 0.
   a) center (2, -3)   b) center (4, 9)   c) center (-2, 3)   d) center (-4, 9)

Figure 1. The five questions that appear identically on the fall 2008 and spring 2010 exams (Penn State, 2010).
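The effectiveness calculation described above, comparing the top and bottom thirds of scorers, corresponds to the standard upper-lower discrimination index. The sketch below is a hedged illustration of that index, not necessarily the exact biserial computation performed by the Vassar calculator or the Schreyer Institute; names are illustrative.

```python
def discrimination_index(students, frac=1.0 / 3):
    """Upper-lower discrimination index for one item.

    students: (total_score, item_correct) pairs, item_correct in {0, 1}.
    Ties at the group boundaries are resolved by sort order, mirroring the
    spreadsheet approach described in the thesis.
    """
    ranked = sorted(students, key=lambda pair: pair[0])
    n = int(len(ranked) * frac)              # size of each comparison group
    bottom, top = ranked[:n], ranked[-n:]
    p_top = sum(correct for _, correct in top) / n
    p_bottom = sum(correct for _, correct in bottom) / n
    return p_top - p_bottom
```

Values near 1 mean high scorers got the item right while low scorers missed it; values near 0 or below flag an ineffective item, in the spirit of the categories in Table 2.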
Results

Table 4: Spring 2010 Data

Version A (34 students), reliability: 0.76, mean: …
Version B (38 students), reliability: 0.67, mean: …
Version C (40 students), reliability: 0.75, mean: 65.2
Version D (31 students), reliability: 0.80, mean: …

Table 4. Item analysis for the spring 2010 exam based on raw data (MATH, 2010).
The results for the spring 2010 exam, based on the raw data provided by the thesis advisor of this study (MATH, 2010), are displayed in Table 4. The results are displayed for each question on the exam. The number of participants, mean score, overall reliability, difficulty, and effectiveness for each question are present in this data. The five questions that will be further examined are highlighted. The specific questions and the comparisons between the fall 2008 and spring 2010 exams are displayed in Table 5.

Table 5. Difficulty and effectiveness of the five selected problems. Values are from the item analysis (Schreyer, 2008) and calculated using the raw data (MATH, 2010).
The percent difficulty for each of the five selected questions is displayed in Table 5. This data was obtained through the item analysis for the fall 2008 exam provided through the Schreyer Institute for Teaching Excellence (Schreyer, 2008) and calculated from the raw data provided by the thesis advisor for this study (MATH, 2010). The difference in difficulty between the fall 2008 exam and the spring 2010 exam is also shown in this table.

The first question that was identical on both exams asked students to simplify an algebraic expression (Penn State, 2010). Students had a high success rate, with every version's difficulty falling between 91 and 100 and therefore being labeled easy (Schreyer, 2008), though the difficulty level was higher on the fall 2008 exam. For each version there was an increase from the fall 2008 exam to the spring 2010 exam in the percentage of students who answered correctly.

Two questions, question 2 and question 4, showed an increase in difficulty across the two exams. For question 2, the spring 2010 results fall in the very difficult and lower end of the difficult categories, while the fall 2008 results are in the middle to upper end of the difficult category (Schreyer, 2008). For question 4, the fall 2008 exam showed results in the moderately difficult category, while the spring 2010 exam had results in the very difficult and difficult categories (Schreyer, 2008).

The remaining two questions, question 3 and question 5, had an increase in difficulty on some versions and a decrease on others. For the third question, the results stayed fairly consistent across the exams, lying within the upper moderately difficult category and the easy category on both. The fifth question also had
fairly consistent results: across both exams and all four versions, the difficulty ratings remained in the moderately difficult category (Schreyer, 2008).

The effectiveness of questions also changed from the fall 2008 exam to the spring 2010 exam, with an overall trend of decreasing effectiveness across the two exam years (Table 5). The effectiveness of the first question could not be calculated on version A because every student in the sample group answered it correctly (MATH, 2010). Questions one, two, and three showed a large decrease in effectiveness, while questions four and five showed a more consistent trend in effectiveness across exam years.
Discussion

This study took place in Math 021 at Penn State University. It followed five final exam questions (Penn State, 2008) that were chosen to reappear on the Math 021 final exam based on their effectiveness and difficulty levels from their initial ratings on the fall 2008 exam (Schreyer, 2008). The results of this study show discrepancies in both the effectiveness and the difficulty of these questions across exams (Table 5).

These five questions correspond identically on the fall 2008 and spring 2010 exams, as set up by this study: the answer choices are in the same order for version A of the fall 2008 exam and version A of the spring 2010 exam, and likewise for all other corresponding versions across exam years. There are disparities, however, in the conditions between the first time these questions were administered and the second. These disparities will be discussed in order to conjecture the cause of the differences.

First, the exams contained a different number of questions, and the questions outside the identical five differed across exam years. The exams also covered a slightly different range of knowledge: the fall 2008 exam contained 30 questions and included ellipses and hyperbolas (Penn State, 2008), while the spring 2010 exam contained 25 questions and did not (Penn State, 2010). These disparities may affect the effectiveness of the test questions because effectiveness is calculated by examining how many high-scoring students and how many low-scoring students answered a question correctly; a highly effective question will have the high-scoring students answering correctly and the low-scoring students answering incorrectly (Instructional, 2010). If two exams test even slightly different knowledge or skills, the range of skills being tested changes, and perhaps a student who
was a high-scoring student on the first version of the exam would become a middle-scoring student on the second version. This would alter the effectiveness of the question.

Another difference between the fall 2008 data and the spring 2010 data was the sample size. The fall 2008 data was existing group data calculated by the Schreyer Institute for Teaching Excellence. The population included all Math 021 students taking the exam: a total of 653 students, with over 160 taking each version (Schreyer, 2008). For the spring 2010 exam, the population represented in the data was defined differently. The first factor influencing the population is that permission had to be obtained from students to use their exam data, and only 143 students granted permission for their scores to be used. This meant that there were between 30 and 40 students represented for each version (MATH, 2010). The second factor that affected the population of the spring 2010 exam was a program called ALEKS. This program is an online method of instruction (ALEKS, 2010) that gave students skill practice throughout the semester. Students needed to master each category for the class, and those who mastered all of the categories before the end of the semester took the exam at an earlier date. Only those who took the exam on the final exam day were asked to participate in this study; therefore, the early test takers are not represented in the population examined for the spring 2010 data. The difference in population could affect the effectiveness and difficulty of a question because it is unknown whether the sample of people represented in this study is a true representation of the entire group.

The ALEKS program (ALEKS, 2010) used during the spring of 2010 was another factor that could have affected question difficulty and question effectiveness.
The students who took this exam in the fall of 2008 did not use this program to help them with their studies throughout the semester (Math Department, 2008). This program has students take a pretest that determines which skills need to be practiced in order for students to master all of the content of the course. Students must master each content area by answering enough questions correctly in the topic (ALEKS, 2010). The responsibility students have to practice and master skills could affect both the difficulty and the effectiveness of a question. Exam material could be either well supported or poorly supported by the content covered in ALEKS; concepts on the exam that are well supported may show a decrease in difficulty level. The effectiveness of questions could be affected by how well the program mirrors the final exam content. Students could master the ALEKS material, but if there were discrepancies between topics on the final exam and topics included in the program, expertise in the ALEKS material would not necessarily correlate with a good score on the exam. These students could score well on the ALEKS-correlated questions but poorly on the questions not correlated with ALEKS, and the effect on effectiveness would depend on whether the question being analyzed was an ALEKS-correlated question and whether the exam consisted mostly of ALEKS-correlated questions.

Finally, the fall 2008 data was calculated by the Schreyer Institute for Teaching Excellence (Schreyer, 2008), while the spring 2010 data was presented in this study as raw data (MATH, 2010). The different measurements were calculated through Excel and also through an online calculator on the Vassar College website whose purpose is to calculate the effectiveness of a test question (Lowry, 2010). In addition, there were some scores that fell on the line between the top 33 percent and the middle 33 percent, or between the middle and the bottom, as calculated from the spring 2010 raw data (MATH, 2010). In this study, the scores in Excel were sorted from smallest to largest.
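The random tie-breaking check described in Methods (done in the thesis with a TI-83's random number generator) can be sketched in software. The function name, the seed, and the group fraction are illustrative assumptions, not details from the thesis.

```python
import random

def top_group_with_random_ties(scores, frac=1.0 / 3, seed=0):
    """Select the top `frac` of scores, breaking ties at the cutoff at random.

    Scores strictly above the cutoff are always kept; among scores equal to
    the cutoff, just enough are chosen at random to fill the group.
    """
    rng = random.Random(seed)          # fixed seed for reproducibility
    n = int(len(scores) * frac)        # size of the top group
    ordered = sorted(scores, reverse=True)
    cutoff = ordered[n - 1]            # lowest score admitted to the top group
    above = [s for s in scores if s > cutoff]
    tied = [s for s in scores if s == cutoff]
    return above + rng.sample(tied, n - len(above))
```

Recomputing the discrimination index with several different random tie-breaks is a quick way to check, as the thesis did, that the choice among tied scores does not materially change a question's effectiveness.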
The top 33 percent were then determined based on this order of scores. Testing services may use a different method for choosing which scores should be included. The disparity between the methods used
to calculate the effectiveness could have caused slight variations between the two analyses of effectiveness.

Many of these differences could not be controlled in this study, and examining their effects is not possible within the study's constraints. An ideal study would compare the results from every student in two different semesters of Math 021 where the exams were identical. The changes in the content tested on the exam were determined by Mary Erickson, the Coordinator of First Year Courses; these changes were made to reflect the content covered during the semester in the Math 021 class. The length of the exam was also determined by Mary Erickson and the other Penn State faculty who contributed to the creation of the exam. The constraints of this study do not allow the effect of the differences in content and length on the effectiveness and difficulty of the questions to be examined; however, there are only three questions on the fall 2008 exam that ask students to use their knowledge of ellipses and hyperbolas (Penn State, 2008). The difference in length between the fall 2008 and spring 2010 exams is consistent for every student taking the exam in each of those semesters. The length of the exam could present differences if time is an issue for students taking it: if students do not have adequate time to complete the exam, the difficulty of questions could be affected because students cannot spend as much time on questions as they would like. It is the belief of this study, however, that the effectiveness of a question would not be greatly affected by time constraints, because both weak students and strong students were given the same amount of time to complete the exam.

The second factor discussed in regard to the differences between the fall 2008 exam and the spring 2010 exam is the sample and the method of calculation for the data. The available data for this study caused this difference to be present.
The data given from the
population that was available for the study was examined using methods similar to those used by the Schreyer Institute for Teaching Excellence. Crystal Ramsay, an Instructional Consultant from the Schreyer Institute for Teaching Excellence, provided guiding information to help align the methods used in this study as closely as possible to the calculations for the fall 2008 exam (C. Ramsay, personal communication, October 7, 2010). The students taking Math 021 in the spring of 2010 who finished the ALEKS program early took the final exam early (Math Department, 2010), which means that their scores could not be included in the population of scores examined. These students are probably among the top students in the class, and the absence of this data could have a great effect on the difficulty and effectiveness of questions. Unfortunately, the constraints of this study, specifically the data available, do not allow this disparity to be examined.

The presence of the ALEKS program for the students taking Math 021 during the spring semester of 2010 is a factor that can be further examined. The ALEKS program looks to create a learning environment for each student based on their needs. The program contains a list of topics for a particular course, and instructors can choose which of these topics they want students to master. Students begin with a pretest that initially assesses what they know; based on how successfully students solve each pretest task, the program determines what other pretest questions a student should be asked. After the pretest is complete, ALEKS creates an individual pie chart based on the results. The pie chart includes each of the topics the instructor indicated, grouped into categories, and the topics the student has already mastered, as determined by the pretest, are indicated on the chart. Students must then work to master each skill required for the course.
Students must be able to answer questions relating to a skill correctly multiple times before ALEKS determines that
students have learned a skill. The ALEKS program does not allow students access to practice for all skills at once; instead it opens only those skills that students can successfully complete using their prior knowledge and that are needed for the later skills. As students mastered earlier skills, later skills became available to complete (ALEKS, 2010).

The ALEKS program was used in Math 021 to help students practice and learn the content knowledge, and students were required to master each of the skills in order to complete the curriculum (ALEKS, 2010). Therefore, the correlation between the content presented in ALEKS and the content presented on the exam could greatly affect the results of the exam. In particular, the difficulty of a question would be greatly affected by how well the ALEKS program covered the skills needed to complete the question.

The difference in the difficulty of the five questions that appeared on both the fall 2008 exam and the spring 2010 exam was examined. Questions two and four as stated in Table 4 had an increase in difficulty from the fall 2008 exam to the spring 2010 exam (Table 5). The skills needed to complete each of these questions were determined, along with the depth to which these skills were covered in ALEKS.

Question two of Table 4 asks students to simplify the expression (3^(-2) - 2^(-2))^(-1) (Penn State, 2010). On the fall 2008 exam, there was approximately a 50/50 right-to-wrong ratio (Schreyer, 2008); on the spring 2010 exam, there was approximately a 25/75 right-to-wrong ratio (MATH, 2010). Therefore, it is important to note how effectively ALEKS addresses the skills tested in this problem. Students must first simplify what is inside the parentheses: they must be able to rewrite the terms without negative exponents, obtaining (1/9 - 1/4)^(-1), and then combine these terms using common denominators, as calculators are not allowed on the exam, obtaining (-5/36)^(-1). Lastly, students must apply the negative exponent to the simplified expression to obtain -36/5. The students who answered this question incorrectly most often chose 5 as their
22 answer on both the fall 2008 exam (Schreyer, 2008) and spring 2010 exam (MATH, 2010). Perhaps this solution resulted from the following process: (3 ;2 2 ;2 ) ;1 = = 9 4 = 5 Students appear to have illegally distributed the exponent and then simplified the new expression. The ALEKS program provides students with practice on exponents and order of operations, and additional practice with exponents; however practice with negative exponents was not observed to be available at the base level. This category of practice also has students simplify expressions that include examples such as, 2 3 ( 2) (2 3) 2, where students must recognize that what is inside the parentheses must be simplified before the exponent can be addressed (ALEKS, 2010). Students perhaps were not given enough initial practice with negative exponents and order of operations. Question four of Table 4 asks students to simplify 3 6. (Penn State, 2010). The increase in difficulty from the fall 2008 exam to the spring 2010 exam was an average of 40% (Table 5). The skills needed to solve this problem were broken down to help assess which of these skills were present in the ALEKS program. Students should begin by writing the expression with fractional exponents. This will result in 3 6. Students must then be able to simplify this expression by writing the b terms on either the top or bottom of the fraction, and with a positive 1 exponent. Students should obtain 6. Students must recognize that this is equivalent to 6. Upon examination of the skills presented in ALEKS, there is no practice with fractional exponents when students are beginning their practice of exponents observed (ALEKS, 2010). Students perhaps require more initial practice on fractional exponents in order to be able to 18
complete exercises such as the simplification in question four (Penn State, 2010) and apply their knowledge to more complex problems. The ALEKS program used during the spring semester of the Math 021 course was a factor that could have affected students' learning and their performance on the final exam. It appears that some skills were not mastered well by students, and perhaps these skills were not represented enough in the ALEKS program. Students' performance on questions that included these skills decreased from the fall of 2008 to the spring of 2010. There were, however, questions that appeared on both the fall 2008 exam and the spring 2010 exam that had an increase in success or a constant trend of success (Table 5). The content covered in the ALEKS program most likely had some bearing on this increased or constant success. There are of course other factors that helped to determine students' success on test questions; however, the material that students must master before finishing the course plays an important role in which skills students have acquired across the course. The effectiveness of questions remained constant for some of the identical questions, but for others it showed a downward trend (Table 5). Perhaps the effectiveness of questions changed because of differences in the background knowledge students had going into the final exam. Most students answered the first identical question correctly, so if even one top student answered it incorrectly, this would greatly affect the effectiveness. Question two of the identical questions was very difficult, and, as examined, the skills needed for the question were not completely present in ALEKS. The students who studied the materials in ALEKS, believing this would prepare them for the exam, may not have been fully prepared for this question. This, however, did not seem to affect the fourth identical question, which also saw an increase in difficulty.
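The arithmetic in question two can be verified with exact rational arithmetic. The sketch below assumes the expression is $(3^{-2} - 2^{-2})^{-1}$ (a reconstruction from the steps described above) and contrasts the correct simplification with the erroneous distribution of the exponent:

```python
from fractions import Fraction

# Correct path: rewrite without negative exponents, combine over a
# common denominator, then apply the outer exponent of -1 (invert).
inner = Fraction(1, 9) - Fraction(1, 4)   # 3^-2 - 2^-2 = -5/36
correct = 1 / inner                       # (-5/36)^-1 = -36/5

# Common error path: "distributing" the outer exponent term by term.
erroneous = 3**2 - 2**2                   # 9 - 4 = 5

print(correct)    # -36/5
print(erroneous)  # 5
```

Using exact fractions avoids any floating-point rounding, mirroring the no-calculator requirement of the exam.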
There are many factors that contributed to the changes in difficulty and effectiveness from the fall 2008 exam to the spring 2010 exam. The factors that were uncontrollable could unfortunately not be assessed further to determine their effect on the results. The ALEKS program was a factor that could be examined further, and through this program it could be seen that skills needed to answer some of the questions were not present to the extent that they perhaps needed to be. A final factor worth noting is that students who finished ALEKS early took the exam early and are therefore not present in this data sample. These students are probably the top students, so their results would likely have increased the success level on each of the questions.
Conclusion

This study followed five questions that appeared identically on both the fall 2008 (Penn State, 2008) and spring 2010 (Penn State, 2010) exams. The effectiveness and difficulty of these questions across the two exam years were examined, along with the factors that could have affected the changes in these measurements. The results of this study show that there was an increase in difficulty for two of the five identical problems and a decrease in difficulty for one, while the difficulty of the remaining two problems remained consistent. This information was obtained by comparing the data analysis from the fall 2008 exam (Schreyer, 2008) with the analyzed raw data from the spring 2010 exam (MATH, 2010). The effect that the ALEKS program (ALEKS, 2010), used during the spring 2010 semester, had on the difficulty was examined, and it was determined that skills needed for the questions that saw an increase in difficulty were perhaps not covered as thoroughly as they needed to be. The effectiveness of the five identical questions showed a general decrease. The first three identical questions showed a significant decrease in effectiveness, while the fourth and fifth questions had more consistent effectiveness. This comparison was made by examining the item analysis from the fall 2008 exam (Schreyer, 2008) and the analysis of the raw data from the spring 2010 exam (MATH, 2010). There are many factors that could account for these changes, including the knowledge students gained over the semester, the sample size, and the method used for calculating this measurement. This study revealed a great deal about the effectiveness and difficulty of these five identical exam questions. Studies like this one can be done by classroom teachers on an individual basis to help improve the reliability of their exams. Universities, such as Penn State, provide the Schreyer Institute for Teaching Excellence to produce this data that professors can
examine. Calculators, such as the Vassar College calculator (Lowry, 2010), and other methods of analysis are available through university websites, so teachers can access and compute this data independently.
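The item-effect statistic that such calculators produce is typically a point-biserial correlation between success on a single item and the total exam score. The following is a minimal sketch of that computation; the function name and toy data are illustrative, not taken from the thesis or from any particular tool:

```python
import statistics

def point_biserial(correct_flags, total_scores):
    """Point-biserial correlation between answering an item correctly
    (1/0) and total exam score, a common measure of item effect."""
    n = len(total_scores)
    p = sum(correct_flags) / n                      # proportion correct
    q = 1 - p
    mean_correct = statistics.mean(
        s for s, c in zip(total_scores, correct_flags) if c)
    mean_wrong = statistics.mean(
        s for s, c in zip(total_scores, correct_flags) if not c)
    sd = statistics.pstdev(total_scores)            # population SD
    return (mean_correct - mean_wrong) / sd * (p * q) ** 0.5

# Toy data: higher scorers tend to answer the item correctly,
# so the item effect is strongly positive.
flags = [1, 1, 1, 0, 0, 0]
scores = [95, 90, 85, 70, 65, 60]
print(round(point_biserial(flags, scores), 3))      # 0.951
```

A value near 1.0 indicates that the item separates high and low scorers well, matching the glossary's description of a high item effect.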
Appendix A: Glossary

Item effect: The item effect is the biserial coefficient for the exam question (Schreyer, 2008). The biserial coefficient is described on the University of Texas website, where it is referred to as the item discrimination. The item discrimination, or item effect, measures whether students who had high scores on the exam and students who had low scores on the exam answered the question correctly. For an item with high effect, the students receiving the highest scores on the exam choose the correct answer and the students receiving the lowest scores choose one of the incorrect answers (Instructional, 2010). The item effect is a number between $-1.0$ and $1.0$, where values closer to $1.0$ indicate a higher item effect (Schreyer, 2008).

Reliability: The reliability of an exam is calculated using the Kuder-Richardson formula 20 value (Schreyer, 2008). This value is calculated using the following formula:

$KR_{20} = \frac{k}{k-1}\left[1 - \frac{\sum \sigma_i^{2}}{\sigma^{2}}\right]$

where $k$ is the number of items on the exam, $\sum \sigma_i^{2}$ is the sum of the variances of each question, and $\sigma^{2} = \frac{\sum (X - \bar{X})^{2}}{N}$ is the variance of the total scores (Furr, 2008). The reliability is a value between 0.00 and 1.00; the closer the value is to 1.00, the higher the reliability. A reliability below 0.50 is a poor reliability score (Schreyer, 2008).
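A worked example may make the Kuder-Richardson formula 20 described above concrete. The sketch below computes it for a small, made-up 0/1 response matrix; for dichotomous items the per-item variance reduces to p(1 - p):

```python
def kr20(item_responses):
    """Kuder-Richardson formula 20 for a 0/1 response matrix
    (one row per student, one column per item)."""
    n_students = len(item_responses)
    k = len(item_responses[0])                      # number of items
    # Item variance for a dichotomous item is p * (1 - p).
    item_var = []
    for j in range(k):
        p = sum(row[j] for row in item_responses) / n_students
        item_var.append(p * (1 - p))
    # Variance of the students' total scores.
    totals = [sum(row) for row in item_responses]
    mean_t = sum(totals) / n_students
    var_t = sum((t - mean_t) ** 2 for t in totals) / n_students
    return (k / (k - 1)) * (1 - sum(item_var) / var_t)

# Toy data: four students, three items.
responses = [[1, 1, 1],
             [1, 1, 0],
             [1, 0, 0],
             [0, 0, 0]]
print(round(kr20(responses), 3))   # 0.75
```

The hypothetical data above follow a clean "harder items missed by weaker students" pattern, which is why the reliability comes out well above the 0.50 threshold the glossary flags as poor.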
Appendix B: Consent Forms

Informed Consent Form for Social Science Research
The Pennsylvania State University

Title of Project: ASSESSING THE EFFECTIVENESS OF MULTIPLE CHOICE MATH TESTS
Principal Investigator: Elizabeth Somers, Undergraduate Student, 317 Simmons Hall, University Park, PA
Advisor: Associate Professor Stanley Smith, 104 G McAllister, smith_s@math.psu.edu

1. Purpose of the Study: The purpose of this study is to conduct research that will lead to the improvement of the effectiveness of multiple choice math exams. The particular test that this study will concentrate on is the MATH 021 final exam. This study is being done for research purposes and is being performed for an undergraduate honors thesis. The research for this study will aid in the improvement of the MATH 021 final exam.

2. Procedures to be followed: For this study, you will be asked to complete the MATH 021 final exam as you would in the absence of a study. At the time of the final exam, you will be asked to indicate whether or not you wish to participate in this study.

3. Duration/Time: This study will not require any additional time apart from filling out the consent form.

4. Statement of Confidentiality: Your participation in this research is confidential. The data will be stored and secured at McAllister Building in an archived file. In the event of a publication or presentation resulting from the research, no personally identifiable information will be shared. Your test results will be archived as they normally are. The student researcher will only have access to the data after all identifying information has been removed by the Mathematics Department.

5. Right to Ask Questions: Please contact Elizabeth Somers at (610) with questions or concerns about this study. If you have questions regarding the purpose, outcomes, or any other aspects of this study, please contact Elizabeth Somers.

6.
Voluntary Participation: Your decision to include your final exam responses in this research is voluntary. You may request that your final exam results be removed from the study at any time by contacting the Mathematics Department, 104 McAllister Building.
You must be 18 years of age or older to consent to take part in this research study. If you agree to take part in this research study and to the information outlined above, please be sure to sign your name and indicate the date on the informed consent form attached to your final exam. The preceding information will be provided to you at that time as well.
Informed Consent Form for Social Science Research
The Pennsylvania State University

Title of Project: ASSESSING THE EFFECTIVENESS OF MULTIPLE CHOICE MATH TESTS
Principal Investigator: Elizabeth Somers, Undergraduate Student, 317 Simmons Hall, University Park, PA
Advisor: Associate Professor Stanley Smith, 104 G McAllister, smith_s@math.psu.edu

7. Purpose of the Study: The purpose of this study is to conduct research that will lead to the improvement of the effectiveness of multiple choice math exams. The particular test that this study will concentrate on is the MATH 021 final exam. This research is being done for an undergraduate honors thesis.

8. Procedures to be followed: For this study, you will be asked to complete the MATH 021 final exam as you would in the absence of a study. You are asked to indicate whether or not you wish to participate in this study. Please check yes or no, and then sign your name to indicate this.

9. Duration/Time: This study will not require any additional time apart from filling out the consent form.

10. Statement of Confidentiality: Your participation in this research is confidential. The data will be stored and secured at McAllister Building in an archived file. In the event of a publication or presentation resulting from the research, no personally identifiable information will be shared. Your test results will be archived as they normally are. The student researcher will only have access to the data after all identifying information has been removed.

11. Right to Ask Questions: Please contact Elizabeth Somers at (610) with questions or concerns about this study. If you have questions regarding the purpose, outcomes, or any other aspects of this study, please contact Elizabeth Somers.
12. Voluntary Participation: Your decision to include your final exam responses in this research is voluntary. You may request that your final exam results be removed from the study at any time by contacting the Mathematics Department, 104 McAllister Building.

You must be 18 years of age or older to consent to take part in this research study. If you agree to take part in this research study and to the information outlined above, please be sure to sign your name and indicate the date. You will be given a copy of this form for your records.

I agree to allow my final exam results from MATH 021 to be released to the principal investigator and the research team of this study for the purpose of researching the effectiveness of the MATH 021 exam.

I DO NOT agree to allow my final exam results from MATH 021 to be released to the principal investigator and the research team of this study.

Participant Signature / Date

Person Obtaining Consent / Date
References

ALEKS. (2010). ALEKS. Retrieved November 16, 2010.
Furr, R. M., & Bacharach, V. R. (2008). Psychometrics: An introduction. Los Angeles: Sage Publications.
Instructional Assessment Resources. (2010). Assess students: Item analysis. Retrieved October 7, 2010, from report/itemanalysis.php
Lowry, R. (2010). Point biserial correlation coefficient. Retrieved October 7, 2010.
MATH 021. (2010, spring). Raw data. Unpublished raw data.
Math Department. (2008). MATH 021 syllabus. Unpublished manuscript.
Math Department. (2010). MATH 021 syllabus. Unpublished manuscript.
Penn State Department of Mathematics. (2008, fall). MATH 021 final exam, versions A, B, C, D. Unpublished manuscript.
Penn State Department of Mathematics. (2010, spring). MATH 021 final exam, versions A, B, C, D. Unpublished manuscript.
Research at Penn State. (2010). Conducting a human participant research study.
Schreyer Institute for Teaching Excellence. (2008, Dec. 18). MATH 021. Unpublished item analysis data.
Thompson, D. R., Beckmann, C. E., & Senk, S. L. (1997, January). Improving classroom tests as a means of improving assessment. Mathematics Teacher, 90(1).
University of Wisconsin Oshkosh. (2005, August 30). Testing services: Item discrimination I. Retrieved October 7, 2010, from itemdiscrimone.php
Vita

Elizabeth Somers
149 Fawn Lane
Haverford, PA

Education: Pennsylvania State University, Spring 2011
Bachelor of Science, Mathematics; Teacher Certification Option; Honors in Mathematics
Thesis: Assessing the Effectiveness of Multiple Choice Math Tests
Thesis Advisor: Dr. Stanley Smith

Honors: Dean's List, Fall through Spring 2010

Related Experience:
Pre-service student teaching in mathematics, Fall 2010
Private geometry tutoring for the SAT, Summer 2009
Math tutor with Volunteers in Public Schools at State College Area High School, Fall 2007, 2008, and Spring 2008

Experience and Activities:
ESF Summer Camps counselor, Summer 2010
ESF Summer Camps swim instructor, Summer 2009
Pre-team swim coach, Summer 2006 and Summer 2007
Penn State Natatorium lifeguard, October to December 2010
CHALLENGES FACING DEVELOPMENT OF STRATEGIC PLANS IN PUBLIC SECONDARY SCHOOLS IN MWINGI CENTRAL DISTRICT, KENYA By Koma Timothy Mutua Reg. No. GMB/M/0870/08/11 A Research Project Submitted In Partial Fulfilment
More informationOnLine Data Analytics
International Journal of Computer Applications in Engineering Sciences [VOL I, ISSUE III, SEPTEMBER 2011] [ISSN: 22314946] OnLine Data Analytics Yugandhar Vemulapalli #, Devarapalli Raghu *, Raja Jacob
More informationCreating a Test in Eduphoria! Aware
in Eduphoria! Aware Login to Eduphoria using CHROME!!! 1. LCS Intranet > Portals > Eduphoria From home: LakeCounty.SchoolObjects.com 2. Login with your full email address. First time login password default
More informationCOURSE WEBSITE:
Intro to Financial Accounting Spring 2012 Instructor 2: Jacqueline R. Conrecode, MBA, MS, CPA Office Hours: Mondays & Wednesdays: 11:00 12:15 PM, 3:30 4:45PM Office: Lutgert Hall 3333 Office Phone: 239
More information(Includes a Detailed Analysis of Responses to Overall Satisfaction and Quality of Academic Advising Items) By Steve Chatman
Report #2021/01 Using Item Correlation With Global Satisfaction Within Academic Division to Reduce Questionnaire Length and to Raise the Value of Results An Analysis of Results from the 1996 UC Survey
More informationStandardized Assessment & Data Overview December 21, 2015
Standardized Assessment & Data Overview December 21, 2015 Peters Township School District, as a public school entity, will enable students to realize their potential to learn, live, lead and succeed. 2
More informationEdexcel GCSE. Statistics 1389 Paper 1H. June Mark Scheme. Statistics Edexcel GCSE
Edexcel GCSE Statistics 1389 Paper 1H June 2007 Mark Scheme Edexcel GCSE Statistics 1389 NOTES ON MARKING PRINCIPLES 1 Types of mark M marks: method marks A marks: accuracy marks B marks: unconditional
More informationDO YOU HAVE THESE CONCERNS?
DO YOU HAVE THESE CONCERNS? FACULTY CONCERNS, ADDRESSED MANY FACULTY MEMBERS EXPRESS RESERVATIONS ABOUT ONLINE COURSE EVALUATIONS. IN ORDER TO INCREASE FACULTY BUY IN, IT IS ESSENTIAL TO UNDERSTAND THE
More informationRunning head: LISTENING COMPREHENSION OF UNIVERSITY REGISTERS 1
Running head: LISTENING COMPREHENSION OF UNIVERSITY REGISTERS 1 Assessing Students Listening Comprehension of Different University Spoken Registers Tingting Kang Applied Linguistics Program Northern Arizona
More informationProcess Evaluations for a Multisite Nutrition Education Program
Process Evaluations for a Multisite Nutrition Education Program Paul Branscum 1 and Gail Kaye 2 1 The University of Oklahoma 2 The Ohio State University Abstract Process evaluations are an oftenoverlooked
More informationThe Efficacy of PCI s Reading Program  Level One: A Report of a Randomized Experiment in Brevard Public Schools and MiamiDade County Public Schools
The Efficacy of PCI s Reading Program  Level One: A Report of a Randomized Experiment in Brevard Public Schools and MiamiDade County Public Schools Megan Toby Boya Ma Andrew Jaciw Jessica Cabalo Empirical
More informationMassachusetts Department of Elementary and Secondary Education. Title I Comparability
Massachusetts Department of Elementary and Secondary Education Title I Comparability 20092010 Title I provides federal financial assistance to school districts to provide supplemental educational services
More informationThe Singapore Copyright Act applies to the use of this document.
Title Mathematical problem solving in Singapore schools Author(s) Berinderjeet Kaur Source Teaching and Learning, 19(1), 6778 Published by Institute of Education (Singapore) This document may be used
More informationRCPCH MMC Cohort Study (Part 4) March 2016
RCPCH MMC Cohort Study (Part 4) March 2016 Acknowledgements Dr Simon Clark, Officer for Workforce Planning, RCPCH Dr Carol Ewing, Vice President Health Services, RCPCH Dr Daniel Lumsden, Former Chair,
More informationHow Residency Affects The Grades of Undergraduate Students
The College at Brockport: State University of New York Digital Commons @Brockport Senior Honors Theses Master's Theses and Honors Projects 5102014 How Residency Affects The Grades of Undergraduate Students
More informationThe Talent Development High School Model Context, Components, and Initial Impacts on NinthGrade Students Engagement and Performance
The Talent Development High School Model Context, Components, and Initial Impacts on NinthGrade Students Engagement and Performance James J. Kemple, Corinne M. Herlihy Executive Summary June 2004 In many
More informationAn Analysis of the Early Assessment Program (EAP) Assessment for English
An Analysis of the Early Assessment Program (EAP) Assessment for English Conducted by Achieve on behalf of the California Diploma Project (ADP) and Policy Analysis for California Education (PACE) October
More informationLearning Disability Functional Capacity Evaluation. Dear Doctor,
Dear Doctor, I have been asked to formulate a vocational opinion regarding NAME s employability in light of his/her learning disability. To assist me with this evaluation I would appreciate if you can
More informationField Experience Management 2011 Training Guides
Field Experience Management 2011 Training Guides Page 1 of 40 Contents Introduction... 3 Helpful Resources Available on the LiveText Conference Visitors Pass... 3 Overview... 5 Development Model for FEM...
More informationManipulative Mathematics Using Manipulatives to Promote Understanding of Math Concepts
Using Manipulatives to Promote Understanding of Math Concepts Multiples and Primes Multiples Prime Numbers Manipulatives used: Hundreds Charts Manipulative Mathematics 1 www.foundationsofalgebra.com Multiples
More informationSegmentation Study of Tulsa Area Higher Education Needs Ages 36+ March Prepared for: Conducted by:
Segmentation Study of Tulsa Area Higher Education Needs Ages 36+ March 2004 * * * Prepared for: Tulsa Community College Tulsa, OK * * * Conducted by: Render, vanderslice & Associates Tulsa, Oklahoma Project
More informationClassifying combinations: Do students distinguish between different types of combination problems?
Classifying combinations: Do students distinguish between different types of combination problems? Elise Lockwood Oregon State University Nicholas H. Wasserman Teachers College, Columbia University William
More informationCONSISTENCY OF TRAINING AND THE LEARNING EXPERIENCE
CONSISTENCY OF TRAINING AND THE LEARNING EXPERIENCE CONTENTS 3 Introduction 5 The Learner Experience 7 Perceptions of Training Consistency 11 Impact of Consistency on Learners 15 Conclusions 16 Study Demographics
More informationSusan K. Woodruff. instructional coaching scale: measuring the impact of coaching interactions
Susan K. Woodruff instructional coaching scale: measuring the impact of coaching interactions Susan K. Woodruff Instructional Coaching Group swoodruf@comcast.net Instructional Coaching Group 301 Homestead
More informationOFFICE SUPPORT SPECIALIST Technical Diploma
OFFICE SUPPORT SPECIALIST Technical Diploma Program Code: 311068 our graduates INDEMAND 2017/2018 mstc.edu administrative professional career pathway OFFICE SUPPORT SPECIALIST CUSTOMER RELATIONSHIP PROFESSIONAL
More informationSOUTHERN MAINE COMMUNITY COLLEGE South Portland, Maine 04106
SOUTHERN MAINE COMMUNITY COLLEGE South Portland, Maine 04106 Title: Precalculus Catalog Number: MATH 190 Credit Hours: 3 Total Contact Hours: 45 Instructor: Gwendolyn Blake Email: gblake@smccme.edu Website:
More informationNorms How were TerraNova 3 norms derived? Does the norm sample reflect my diverse school population?
Frequently Asked Questions Today s education environment demands proven tools that promote quality decision making and boost your ability to positively impact student achievement. TerraNova, Third Edition
More informationALEKS. ALEKS Pie Report (Class Level)
ALEKS ALEKS Pie Report (Class Level) The ALEKS Pie Report at the class level shows average learning rates and a detailed view of what students have mastered, not mastered, and are ready to learn. The pie
More information