Program Assessment: Annual Report

Program(s): B.A., B.S., M.A., and Ph.D.
Department: Mathematics and Statistics
College/School: Arts and Sciences
Date: June 29, 2018
Primary Assessment Contact: Brody Johnson (Associate Chair)

1. Which program student learning outcomes were assessed in this annual assessment cycle?

Program assessment for the academic year 2017-2018 focused primarily on the following program student learning outcomes for the B.A. and B.S. programs:

PLO #1: Demonstrate the ability to solve a variety of mathematical problems;
PLO #4: Demonstrate an ability to apply the methods of direct and indirect proof;
PLO #5: Demonstrate an ability to communicate mathematical ideas and concepts clearly in written problem solutions.

These PLOs are common to the B.A. and B.S. assessment plans.

2. What data/artifacts of student learning were collected for each assessed outcome? Were Madrid student artifacts included?

The primary source of data for this report consists of student performance on selected problems from the final exam in a range of courses that are part of the B.A. and B.S. programs. Each semester, the instructors for selected courses choose a topic that will be assessed by all instructors of the course on the final exam. The topic is chosen based on the program learning outcome being assessed and often aligns with one of the course learning outcomes. The range of courses included in this process was expanded from the previous academic year and now consists of the following courses:

MATH 1510 Calculus 1
MATH 1520 Calculus 2
MATH 2530 Calculus 3
MATH 2660 Principles of Mathematics
MATH 3120 Introduction to Linear Algebra
MATH 3550 Differential Equations

MATH 2660 and MATH 3120 are included for the first time this year. Madrid faculty members have been fully engaged in this process since Spring 2017.

3. How did you analyze the assessment data? What was the process? Who was involved?
NOTE: If you used rubrics as part of your analysis, please include them in an appendix.
The faculty member responsible for the section evaluates the final exam problems for their students, and each student is given a score on a 0-3 scale. The typical rubric for this evaluation is given below, although instructors have some flexibility to alter the rubric as necessary.

Rubric for Final Exam Problem Assessment
3 - Student shows a mastery of the relevant material.
2 - Student shows competence, but not complete mastery of the material.
1 - Student shows a limited understanding of the material.
0 - Student shows no understanding of the material.

Students who achieve a 2 or 3 have shown competence for the program learning outcome being assessed with respect to the chosen problem. Instructors tabulate the scores for their section(s) and complete a form summarizing their findings and providing some background information about the assessment measure used. In most cases, faculty members submit the problem used for the assessment. The completed forms are submitted to the associate chair.

A natural goal for this type of assessment is that scores should fall primarily into the 2 and 3 categories of the rubric. However, the difficulty level of problems in mathematics and statistics can vary substantially even when the core content is identical, so scores may at times fall short of the 2-3 range simply because the chosen problem is more difficult than many standard problems testing the same concept. This motivates considering the data in aggregate at the course level, with the goal that a high percentage of students who take a given course will receive scores of 2 or 3.

4. What did you learn from the data? Summarize the major findings of your analysis for each assessed outcome.

NOTE: If necessary, include any tables, charts, or graphs in an appendix.

The department has been collecting data on student learning for approximately four semesters and is still in the process of establishing a baseline for expectations.
Instructor participation has been reasonably good and has increased gradually each semester.

Term                     Fall 2016   Spring 2017   Fall 2017   Spring 2018
Sections Included            22          25            31          29
Sections Participating       13          20            27          25

The aggregate data for the Fall 2017 and Spring 2018 semesters are presented below. The data for MATH 1510, 1520, 2530, 3120, and 3550 apply to PLOs #1 and #5. Recall that PLO #1 focuses on the ability to solve a variety of mathematical problems, while PLO #5 deals with the effective communication of mathematical ideas in clearly written problem solutions. The data for MATH 2660 relate to PLO #4, which involves the ability to create and write proofs using a variety of techniques.
Fall 2017
Course    0    1    2    3   Total   2 or 3
1510     53   55   43  127    278    61.15%
1520     20   19   29   50    118    66.95%
2530     12   38   44   47    141    64.54%
2660      0    1    3   12     16    93.75%
3120      0    0    5    7     12   100.00%
3550      3    5   19   31     58    86.21%

Spring 2018
Course    0    1    2    3   Total   2 or 3
1510     29   21   20  101    171    70.76%
1520     19   19   18   44    100    62.00%
2530      7   15   28   28     78    71.79%
2660      4    5   10    3     22    59.09%
3120      1    3    7   13     24    83.33%
3550      2   14   21   89    126    87.30%

Program Learning Outcomes #1, #5: The percentage of students achieving a score of 2 or 3 in each of the courses 1510, 1520, 2530, 3120, and 3550 was above 60% in both semesters. Across these five courses, 66% of the Fall 2017 students received a 2 or 3, with 43% receiving a 3; the corresponding Spring 2018 figures were 74% and 55%. In comparison, 71% of the students in 1510, 1520, 2530, and 3550 received a score of 2 or 3 in the previous annual assessment cycle, with 47% receiving a 3. MATH 3120 was not included in the previous annual assessment cycle.

Program Learning Outcome #4: In each of the Fall 2017 and Spring 2018 semesters, data was collected from one section of MATH 2660 Principles of Mathematics. Combining these two data sets, 73% of the students reached the level of 2 or 3, with 39% receiving a 3. This learning outcome was not assessed during the previous annual assessment cycle.
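As a consistency check, the "2 or 3" percentages can be recomputed directly from the score tallies. The sketch below simply transcribes the Fall 2017 counts into Python and reproduces the final column; the course labels and counts are taken verbatim from the table above.

```python
# Recompute the "2 or 3" percentage from the Fall 2017 score tallies.
# Each entry maps a course to its (score-0, score-1, score-2, score-3) counts.
fall_2017 = {
    "MATH 1510": (53, 55, 43, 127),
    "MATH 1520": (20, 19, 29, 50),
    "MATH 2530": (12, 38, 44, 47),
    "MATH 2660": (0, 1, 3, 12),
    "MATH 3120": (0, 0, 5, 7),
    "MATH 3550": (3, 5, 19, 31),
}

for course, (n0, n1, n2, n3) in fall_2017.items():
    total = n0 + n1 + n2 + n3
    pct = 100 * (n2 + n3) / total  # share of students scoring 2 or 3
    print(f"{course}: {total} students, {pct:.2f}% scored 2 or 3")
```

The same computation applied to the Spring 2018 tallies reproduces that semester's column as well.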
5. How did your analysis inform meaningful change? How did you use the analyzed data to make or implement recommendations for change in pedagogy, curriculum design, or your assessment plan?

Assessment of student learning through final exam problems has been ongoing for two annual assessment cycles (four semesters). The data collected are helping to establish reasonable expectations for student learning in key courses that support our B.A. and B.S. programs, but no significant conclusions have been drawn from the data thus far.

The department engages in a wide variety of formal and informal assessment practices that do not fit naturally into the questions asked on this report. Some are short-term projects, while others are continuing assessment activities.

The department conducts exit surveys with graduating seniors who have a major or minor in the department. The survey asks students about the curriculum, their involvement in departmental activities, their advising experience, and their future plans.

The department frequently runs informal teaching seminars to provide instructors with an opportunity to exchange ideas and ask questions. In Fall 2017, the Calculunch seminar met three times over the course of the semester to support new calculus instructors and help with the transition to a new edition of the textbook and a new online homework platform. In Spring 2018, Dr. Druschel organized a teaching innovation seminar in which instructors shared their experiences with new pedagogical approaches. These seminars included graduate teaching assistants, adjuncts, and visiting and regular faculty.

The department has a separate assessment plan in place for MATH 1300 Elementary Statistics with Computers (cross-listed as STAT 1300). This assessment plan measures student learning in accordance with the ten course learning outcomes and involves a random sample of 10 students.
Instructors evaluate each of the selected students on the ten outcomes using a 0-10 scale.

Faculty from Mathematics and Statistics have met with faculty in Physical Therapy and Nursing to discuss a potential change in the STAT 1100 Introduction to Statistics curriculum that would make the class more project-based and give students experience interpreting statistics in practical contexts.

The Graduate Committee is in the midst of a deep review of the structure of qualifying exams for the Ph.D. program in mathematics. This review was motivated by our recent self-study and program review. The self-study found that the time-to-degree for doctoral students typically exceeds the length of funding, and the external reviewers pointed out that the requirements at SLU were more stringent than those at many peer and aspirant institutions.

Graduate students meet annually with the Graduate Coordinator to discuss progress in their program as well as future goals. Graduate students are frequently consulted about their interests for future course offerings at the graduate level.

The departmental assessment committee engaged in discussions during the current annual assessment cycle over a variety of topics. The program learning outcomes at both the undergraduate and graduate level are currently under review. Assessment plans and program learning outcomes from mathematics and/or statistics departments at other institutions are being examined during this process. The idea of one or more assessment tests has been discussed. These tests could be
administered at various points in the program to assess students' progress with program learning outcomes as they move through the program.

There are also plans to offer an experimental capstone course for mathematics majors in Spring 2019. One of the goals in the design of the capstone course will be the incorporation of various components that provide direct assessment of program learning outcomes for the B.A. and B.S. programs. The capstone course would also seek to contribute to the achievement of multiple program learning outcomes and could be tailored to the specific interests of the students. Discussions will continue during the next annual assessment cycle.

6. Did you follow up (close the loop) on past assessment work? If so, what did you learn? (For example, has that curriculum change you made two years ago manifested in improved student learning today, as evidenced in your recent assessment data and analysis?)

1. The Upper-Division Committee within Mathematics and Statistics conducted a review of the requirements for the B.A. in Mathematics over the last two academic years. The most significant changes involve two new requirements: students must complete MATH 3850 Foundations of Statistical Analysis, and students must complete a course in computer programming. Several driving forces led to these changes, including:

Curricular recommendations of professional organizations;
Feedback from graduating seniors through past exit surveys;
Employment information from recent graduates.

2. Feedback from graduate students indicated an interest in a forum in which they could present mathematics to their peers without any faculty involvement. Subsequently, two mathematics doctoral students took responsibility for organizing a graduate student seminar for the Academic Year 2017-2018. Several students presented in the seminar.

IMPORTANT: Please submit any revised/updated assessment plans to the University Assessment Coordinator along with this report.