OUTCOMES ASSESSMENT REPORT for BA and MA in Criminal Justice and MA in Crime Analysis

This section discusses the Department of Criminal Justice's activities in assessing student learning outcomes for all three of its programs: the BA in Criminal Justice, the MA in Criminal Justice, and the Crime Analysis option, for which we only recently began performing outcomes assessment, given its relatively recent addition to our curriculum. The Department of Criminal Justice's student learning outcomes are contained in Appendix A.

First, it should be noted that two separate Outcomes Assessment Plans (hereafter, OAP; see attached), one for the regular CJ major and one for the MA program, were approved by the University Outcomes Assessment Committee in the mid-2000s. Specifically, the OAP for the BA in Criminal Justice was approved in 2005, and the OAP for the MA in Criminal Justice was approved in late 2006. We were notified that the Crime Analysis Option for our undergraduate program did not require a separate plan because it is not its own degree but an option, and its students take most of the same required courses as regular majors to fulfill their basic requirements.

Because the Crime Analysis option is still in its early stages, very little information has been gathered about it (outside of an initial post-test given to a limited number of students). By contrast, a large amount of data has been collected for the regular BA and MA programs (which have comprised the vast majority, over 85%, of our students over the last six years). The findings from these data are discussed below, along with how the department has addressed areas shown to be effective, as well as weaknesses, in these programs.

BA in Criminal Justice

Regarding the OAP for the BA in Criminal Justice, we have implemented the assessment plan (see attachment), which involves (1) examination of the capstone paper (assigned in CJUS 598) and (2) a quantitative pre/post exam (see attached) to test whether our students have shown significant progress in five key areas, identified with the required core classes: Introduction to Criminal Justice (101), Intro to Criminal Law (102), Research Methods in Criminal Justice (311), Statistics in Criminal Justice (312), and Theories of Criminal Behavior (320). These five components are revisited and synthesized in our required capstone course, CJUS 598, and it is during this course that we take many of the assessments called for by the OAP. (This is the course in which the post-test is given in the last weeks of the term, and it is also where the 598 capstone paper comes into play; the analyses of those papers are discussed later.)

As directed by the BA OAP, the quantitative exam was administered to two different sections of the Intro to CJ class in spring 2004 and fall 2004 (both during the first week of class), as well as to two different sections of the capstone course, CJUS 598, toward the end of the spring 2004 and fall 2004 terms (in the last week of classes). In both Intro to CJ sections, students were asked to report whether they had ever taken a criminal justice class before, and only those who had never done so were included in the pre-program sample. In the two capstone (598) sections, we included only students who reported that they had finished their curriculum and were graduating; specifically, only students reporting that it was their last term in the CJ program were included in the post-program sample.

Thus, for purposes of assessment, we compared only students who had never had any experience with criminal justice courses against those who had completed all of their coursework in our curriculum. The post-assessment exams were given in the final weeks of the capstone course, so these respondents are as close to true graduating seniors as we can test.

The findings for the two groups (entering true freshmen vs. true graduating seniors) showed remarkable consistency in scores on the quantitative exam. Specifically, the spring and fall 2004 sample (N = 47) of entering students who had never taken a criminal justice course scored a mean of 21.2 (out of 50) on the exam; this score was virtually identical, to the tenth of a decimal, across the two sections. This consistency across sections suggests the test is quite reliable, in the sense that both samples scored virtually identically. The same test was given to two sections of the capstone course (CJUS 598), and these students' scores (N = 29) averaged 32.6 (again, the means for the two sections were identical to the tenth of a decimal). A t-test showed that the difference between the two groups (pre [21.2] vs. post [32.6]) was highly significant, at p < .001, indicating a significant improvement (which was not surprising, given that all five sections of the test showed significant improvement [see below]). Thus, students in our program showed an overall significant increase in learning, as measured by the quantitative exam.

Although the gain may not seem large, the difference between entering students (who had never taken a CJ course) and those graduating from our program was a significant increase in test scores, and the overall t-test showed significant improvement on all aspects of the test. Therefore, we met the goal stated in the OAP that students would show significant learning, as measured by overall scores on the quantitative exam.
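As a rough illustration of the comparison described above, the overall pre/post difference can be checked from summary statistics alone. The sketch below uses the reported means and sample sizes; the standard deviations are placeholders (the report does not list them), so this is only an illustration of the calculation, not a reproduction of the original analysis.

    # Illustrative sketch only: mirrors the style of the pre/post comparison using
    # the reported means and Ns. The standard deviations are assumed values (not
    # reported in the assessment), so the resulting t and p will not match the
    # original analysis exactly.
    from scipy import stats

    pre_mean, pre_sd, pre_n = 21.2, 6.0, 47     # entering students (SD assumed)
    post_mean, post_sd, post_n = 32.6, 6.0, 29  # graduating seniors (SD assumed)

    t_stat, p_value = stats.ttest_ind_from_stats(
        pre_mean, pre_sd, pre_n,
        post_mean, post_sd, post_n,
        equal_var=True,  # pooled-variance (Student's) t-test
    )
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")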

Still, to examine which of the five required areas were deficient, we estimated t-tests on the scores for each portion of the test (see below). All scores, both overall and for each of the five subportions of the test, showed significant improvement. Thus, these findings met the OAP's goal of significant improvement between the pre-test and post-test scores.

Topic Portions of the Pre-Post Exam

Because the gain in overall total scores on the pre/post test was significant, we conducted subsequent analyses to examine the increase (or decrease) in scores on the five required topic areas of the test, in order to determine which areas were most lacking. These five areas correspond to the five main courses required of majors: Intro to Criminal Justice, Intro to Criminal Law, Methods, Statistics, and Criminological Theory. The t-tests showed the following improvement for each of these five areas:

Area                        t-test      M (pre; N = 47)   M (post; N = 29)   % Increase
Intro to Criminal Justice   t = -8.61   5.30              8.31               57%
Intro to Criminal Law       t = -7.10   6.02              8.24               37%
Methods                     t = -7.89   4.15              7.41               79%
Statistics                  t = -4.33   3.19              4.83               51%
Theory                      t = -3.83   2.51              3.76               50%

All of these t-tests were highly significant (p < .001), so notable improvement occurred in each of these primary topic areas, which suggests that a significant amount of learning took place during students' time in our curriculum.
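The percentage-improvement figures in the table follow directly from the printed pre- and post-test means; a minimal check of that arithmetic looks like this.

    # Recomputes the "% Increase" column of the table above from the printed
    # pre- and post-test means; values are rounded to whole percentages.
    means = {
        "Intro to Criminal Justice": (5.30, 8.31),
        "Intro to Criminal Law":     (6.02, 8.24),
        "Methods":                   (4.15, 7.41),
        "Statistics":                (3.19, 4.83),
        "Theory":                    (2.51, 3.76),
    }

    for area, (pre, post) in means.items():
        pct_increase = (post - pre) / pre * 100
        print(f"{area}: {pct_increase:.0f}% increase")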

Perhaps the lowest areas of improvement were Theory and Statistics, given the low raw scores on the post-test (although some would interpret Criminal Law as the lowest, because that area showed the smallest percentage of improvement). However, it should be noted that entering freshmen answered the majority of the Intro to Criminal Justice and Intro to Criminal Law questions correctly, as well as almost half of the Methods questions. This indicates a problem with the exam we have been using, and we have attempted to address it (see below). Also, if one judges outcomes by percentage of improvement, the Criminal Law curriculum would appear weakest, because scores increased by only 37%, which was still significant but by far the lowest percentage increase. By the same measure, Methods showed by far the highest improvement (at 79%). Still, special attention was given to improving Theory and Statistics, since these had the lowest raw scores on the post-test.

Outcomes Assessment of the Quantitative Exam Scores by Specific Goals/Objectives as Outlined in the OAP

Ultimately, there were significant increases in all areas of the quantitative exam, and therefore all of the goals and objectives outlined in the OAP were met. Because every goal/objective was met, we do not review each one individually here; detailed results are available on request, and all were positive.

Overall Rating of the Perceived Learning Experience in the Program

A separate question asked graduating students (N = 21) what they thought about their learning experience overall in the program. Responding students rated the overall learning experience as very good (n = 5), good (n = 13), or acceptable (n = 3).

The fact that no student chose the lowest category ("poor") for this question, and that most responded in the "good" category or higher, suggests that students had a positive experience in our program. Indeed, a large majority of students rated their learning experience in the program as very good or good.

Overall Rating of the Advising Experience in the Program

Beyond the curriculum measures, a separate question asked graduating students (N = 21) to evaluate the advising they received during their time in the program. Responding students rated the advising as very good (n = 7), good (n = 8), acceptable (n = 4), or poor (n = 2). These results show that the vast majority of students in our program found the advising to be very good or good, and that only two students found it to be poor. This suggests that the department's efforts to advise students at a high level are largely succeeding. Obviously, we can do better, but these findings indicate that we are on the right track in advising students about their academic careers in our department.

How the Outcomes Assessment Information from the Quantitative Test Scores Was Utilized

Although the results of the quantitative exam showed success in all five areas and overall, the faculty noted that some entering students scored very highly on certain items, especially in the areas of Intro to CJ and Methods (with very high pre-test scores). Therefore, we took suggestions from the faculty who teach those courses (such as Intro to Criminal Justice and Methods, among others) and revised the exam to make it substantially more difficult for entering students, in order to examine whether graduating students have learned more difficult concepts in those areas.

We have not yet administered this new version of the test, but we plan to in the coming years. Also, given students' relatively low post-test scores in the area of theory, our faculty have discussed this issue and acknowledge this weakness. Special attention has therefore been given to improving this area, such as incorporating theories into more classes and making the theories that are taught more hands-on, especially the more pragmatic theories, such as Broken Windows. In addition, a new advanced theory course entitled Crime Across the Life-Course (585) was created and first offered in the Fall 2008 term; it offers far more insight into current theoretical development. The instructors of other courses in our curriculum (such as the Gangs, Women and Crime, and Victimology courses) have also acknowledged this deficiency and revised their courses to incorporate more theory.

Update as of Sept. 2015

To update our previous findings from 2004-2006, we administered the pre- and post-test to a group of new ("true freshman") students at the very beginning of Fall 2014 and to graduating seniors at the end of Spring 2015. The results were mostly similar to the earlier testing, but with a couple of notable differences, especially in the Theory component and the overall score, which indicated that the changes we made to our curriculum seem to have paid off in terms of student learning outcomes.

The findings for each group (entering freshmen vs. graduating seniors) were quite consistent with the previous analysis from 2004-2005. Specifically, the Fall 2014 sample (N = 27) of entering students, who were true freshmen and had never taken a criminal justice course, scored a mean of 20.07 (out of 50) on the exam; this score was very close to that of the previous freshman cohort (about 1.1 points lower).

This score was highly consistent even over a ten-year period, so our test seems quite reliable, in the sense that both freshman samples scored almost the same on the quantitative exam. The same test was given to a section of the capstone course (CJUS 598) in Spring 2015, and these students' scores (N = 21) averaged 36.0, a significant increase over the graduating senior cohorts of 2005-2006 (32.6), which supports the conclusion that the curriculum changes we made based on the earlier results have paid off in terms of student learning outcomes.

For this most recent pre/post comparison in 2014-2015, a t-test showed that the difference between the two groups (pre [20.07] vs. post [36.0]) was highly significant, at p < .001, t = -13.05 (not surprising, given that all five sections of the test showed significant improvement [see below]). Thus, students in our program again showed an overall significant increase in learning, as measured by the quantitative exam. Furthermore, this difference was far greater than in the previous analysis (2004-2005), largely due to the higher scores of the graduating senior cohort (36.0 vs. 32.6); the most recent graduating seniors simply scored far higher on the test than did the two senior cohorts of 2005-2006.

As before, because the gain in overall total scores on the pre/post test was significant, we conducted subsequent analyses to examine the increase (or decrease) in scores on the five required topic areas of the test, in order to determine which areas were most lacking. These five areas correspond to the five main courses required of majors: Intro to Criminal Justice, Intro to Criminal Law, Methods, Statistics, and Criminological Theory.

The t-tests showed the following improvement for each of these five areas for the 2014 freshmen and 2015 graduating seniors (the last two columns give the percentage increase for 2014/15 and, for comparison, 2004-06):

Area                        t-test       M (pre; n = 27)   M (post; n = 21)   % Increase 14/15   % Increase 04-06
Overall                     t = -13.05   20.07             36.00              79%                54%
Intro to Criminal Justice   t = -4.71    5.26              7.67               46%                57%
Intro to Criminal Law       t = -6.52    5.37              8.00               49%                37%
Methods                     t = -7.54    4.04              7.43               84%                79%
Statistics                  t = -5.67    3.37              6.38               89%                51%
Theory                      t = -13.26   2.00              6.52               226%               50%

All of these t-tests were highly significant (p < .001), so notable improvement again occurred in each of these primary topic areas, which suggests that a significant amount of learning took place during students' time in our curriculum. Perhaps the most notable area of improvement, relative to the analysis ten years earlier, was Theory. This makes sense given that the previous post-test scores on Theory were very low, which led the department to add an entirely new course and to advise all faculty to incorporate discussion of theoretical frameworks into their lectures.

Ultimately, in terms of the OAP, there were significant increases in all areas of the quantitative exam, and therefore all of the goals and objectives outlined in the OAP were met. Because every goal/objective was met, we do not review each one individually here; detailed results are available on request, and all were positive.

A separate question asked graduating students (n = 21) what they thought about their learning experience overall in the program. Responding students rated the overall learning experience as very good (n = 15) or good (n = 6); no students chose the lower categories of acceptable or poor. This result is impressive in itself, but even more so considering that the last time this question was asked of graduating seniors, the result was positive but not nearly as dominated by "very good" responses [i.e., very good (n = 5), good (n = 13), acceptable (n = 3)]. The fact that no student chose the lowest category ("poor") either time the survey was given, and that the modal response this time was "very good," suggests an overall positive experience for most students in our program.

Capstone Papers for CJUS 598

Another portion of the assessment plan involves examining the term papers from the capstone (598) course, our final course, in Spring 2005; the paper is meant to synthesize the concepts, issues, and policy implications learned in our curriculum. The papers from an entire section (of students who were actually graduating that term) were analyzed and evaluated according to the criteria in the plan, which were meant to capture all five major elements of the required curriculum (see above): criminal justice, legal issues, theory, methodology, and statistics. The Departmental Outcomes Assessment Committee reviewed a random sample (as described in the Outcomes Assessment Plan) of 16 papers from this class of graduating students and scored them on the five areas described above.
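The committee scoring described above amounts to averaging rubric ratings across the sampled papers. The sketch below shows one way such an aggregation could be computed; the papers, ratings, and values are entirely hypothetical and are not the committee's actual scores or procedure.

    # Hypothetical sketch of computing per-area capstone-paper means: each sampled
    # paper gets a 0-5 rating on each of the five curricular areas, and ratings are
    # averaged by area. The numbers below are invented for illustration only.
    from statistics import mean

    AREAS = ["Criminal Justice", "Law/Legal issues", "Theory", "Methodology", "Statistics"]

    # One dict of ratings (0-5) per reviewed paper -- placeholder values.
    paper_ratings = [
        {"Criminal Justice": 5, "Law/Legal issues": 4, "Theory": 3, "Methodology": 2, "Statistics": 2},
        {"Criminal Justice": 4, "Law/Legal issues": 4, "Theory": 3, "Methodology": 3, "Statistics": 3},
        {"Criminal Justice": 5, "Law/Legal issues": 3, "Theory": 4, "Methodology": 2, "Statistics": 2},
    ]

    for area in AREAS:
        print(f"{area}: {mean(p[area] for p in paper_ratings):.2f}")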

Before reviewing the findings, it should be noted that the assignment emphasized the first four elements (criminal justice system, legal issues, theory, and methodology) and placed less emphasis on the last element (statistics). Specifically, the assignment was to identify a problem in the criminal justice area, to propose a legal solution to the problem (using law and theory), and to offer a way to evaluate the success of the proposed program (methods). Thus, the statistics dimension should probably not be weighted heavily here, because students were not expected to foresee what analytical strategies they would use in studying the topic; instead, these papers were scored on the degree to which the student used adequate, documented statistics (virtually all descriptive) to support the argument for a particular strategy or to show why the topic was a problem for criminal justice agencies.

The Committee's scores on these papers showed the following means on the five primary areas (each scored 0-5):

- Criminal Justice: 4.53
- Law/Legal issues: 3.87
- Theory: 3.06
- Methodology: 2.47
- Statistics: 2.40

In the follow-up to this evaluation (in October 2015), a review of 10 papers found broadly similar scores, with some differences. The mean scores on the five primary areas (again scored 0-5) were:

- Criminal Justice: 4.62
- Law/Legal issues: 3.71
- Theory: 3.41
- Methodology: 3.12
- Statistics: 3.23

These scores suggest several strengths and weaknesses. First, one strength is that our students appear to have gained a rather strong understanding of how the criminal justice system works, especially its deficiencies (given that the assignment was to find a problem in the system and articulate how it could be improved). Furthermore, most of the student papers showed fairly good knowledge of the legal issues involved in the topic, or at least where to find the relevant legal codes. Most papers also made a valid attempt to apply their solution to a single theory (a few applied two theories) as a framework for dealing with the problem they identified, even though approximately 80% of the papers chose deterrence theory, which has not been the dominant theory in criminological thought for many years but is still the dominant framework used by practitioners (which likely relates to the strong scores on the criminal justice system element).

In the newest group of papers, many students accurately described how they would actually evaluate the program they proposed, and fewer made unrealistic claims; in particular, fewer claimed that they would perform a classical experiment while actually proposing to compare rates among jurisdictions or individuals that had not been experimentally assigned, a problem common in the earlier batch from 2005. The mean Methodology score in the latest cohort improved from 2.47 to 3.12, which indicates that students are gaining more knowledge of how social science research is actually done. There was also a rise in the Statistics scores, from 2.4 to 3.2. We have been working very hard to improve that element of understanding about methodology and statistics, and it appears we are moving in the right direction.

Perhaps the exam results reflect a more abstract understanding of methodology, whereas the increase in scores on the 598 papers reflects a more applied use of knowledge about the methodology and statistics used in social science.

In light of these findings, our faculty have met and decided to continue incorporating theory, methods, and statistics into all of our undergraduate courses, especially with respect to how the studies we typically cover are actually done. We believe this will go a long way toward addressing any remaining deficits in our students' knowledge of the methodology and theoretical frameworks involved in performing social scientific studies, especially in criminal justice/criminology. The faculty are aware that analytical statistics should be discussed more often throughout our curriculum, especially in upper-division courses. As with methodology, our faculty will make a strong effort to discuss the statistical analyses used in the studies examined in each course; such exposure should familiarize our students with the various types of statistical methods used in criminological research.

Although the 598 paper scores for theory were not alarming, they were not high enough to be satisfying. Consistent with the quantitative exam results, they were low enough to support giving more attention to theory and statistics in our curriculum. Many of these deficiencies, and plans to address theory more fully, have already been acted upon, as discussed above: a new theory course, Crime over the Life-Course, was created and first offered in the Fall 2008 term and provides far more insight into current theoretical development, and other courses in our curriculum (e.g., Gangs, Victimology) have incorporated more theoretical applications.

MA in Criminal Justice

Under the OAP for the MA program, two primary sources of data were collected from students to assess learning outcomes among our graduate students. The first is a portfolio (see Appendix), which we have required from our students for the last two years (see below). The second is a required one-unit course, CJUS 686, in which advanced graduate students (after completing all of their in-class courses) take assessment tests to gauge what they have learned in their coursework (see below; see Appendix). The 686 course unit was added in recent years specifically for the purpose of assessing learning outcomes among our graduate students.

Both the portfolio and the 686 assessments were collected from students after they had completed their coursework and had advanced to candidacy in the MA program. Both measures were scored by the Departmental Committee on Outcomes Assessment, made up of departmental faculty, using the criteria from the plan; the evaluation of the scores for the various categories is discussed below. It should be noted that the data from the portfolios and the 686 exams are biased in the sense that both come only from students at the latter end of their graduate coursework; the scores therefore do not represent students who did not make it that far in our program. However, these are the assessments approved by the University Outcomes Assessment Committee, and we feel that measures taken at the latter portion of our curriculum are likely to best represent the strengths and weaknesses of our graduate program.

Portfolio Assessments

The Departmental Outcomes Assessment Committee evaluated the portfolios in Spring 2005 against the four goals outlined in the MA outcomes assessment plan. Average scores on these four goals (each out of a possible 40) were as follows:

CJ System: 36.90
Methods: 33.20
Statistics: 28.50
Theory: 35.60

From the portfolio results, it is quite clear that the weakest area of our graduate curriculum is the Statistics category. Although this score is acceptable under the OAP, it is still a cause for concern; this weakness in statistics is discussed below. The other areas of the portfolio assessments held up well, with CJ System, Methods, and Theory all far exceeding the goals outlined in the OAP. These were, however, global scores for each area. Future outcomes assessments will involve a more detailed evaluation, particularly of the specific objectives within each goal as outlined in the Outcomes Assessment Plan. There was not enough time during this assessment period to analyze every specific objective of each goal, but in future evaluations more time can be devoted to examining each specific objective and identifying ways that weaknesses can be addressed.

For the current review, it is more important to focus on the larger picture, namely the low scores on the statistics goal.

Update on Portfolio Assessments

In the Spring 2015 term, the Departmental Outcomes Assessment Committee evaluated 10 recent portfolios against the four goals outlined in the MA outcomes assessment plan (see rubric in Appendix). Average scores on these four goals (each out of a possible 40) were as follows:

CJ System: 36.40
Methods: 35.40
Statistics: 33.50
Theory: 36.20

The newest batch of portfolios showed very similar results to the earlier ones from 2005-06. There was a notable increase in the areas of methods and statistics, and a smaller increase in theory. The overall scores on the newest group of portfolios increased by approximately 5% over the earlier batch, which is a good sign that we are improving our course content and student learning.
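The "approximately 5%" figure can be checked directly from the four goal means listed for each batch of portfolios; a short calculation using only the values printed above is shown here.

    # Checks the reported ~5% overall gain between the two portfolio batches
    # using the four goal means (each out of 40) listed above.
    portfolio_2005 = {"CJ System": 36.90, "Methods": 33.20, "Statistics": 28.50, "Theory": 35.60}
    portfolio_2015 = {"CJ System": 36.40, "Methods": 35.40, "Statistics": 33.50, "Theory": 36.20}

    total_2005 = sum(portfolio_2005.values())   # 134.2
    total_2015 = sum(portfolio_2015.values())   # 141.5
    print(f"Overall increase: {(total_2015 - total_2005) / total_2005 * 100:.1f}%")  # ~5.4%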

686 Paper Assessments

The 686 assessment (a one-unit course we created specifically for outcomes assessment) involves having graduate students who have completed their coursework answer questions in long-answer form (similar to qualifying exams), although the results do not produce a graded score that would affect their records or GPA. The initial analysis, in Spring 2005, examined students' answers to a set of questions pertaining to the four main goals (see below), which were meant to capture most of the primary topics/areas considered important in the graduate curriculum. The first group of assessments was given and analyzed in 2005. These assessments were scored on the four primary goals: CJ System, Methods, Statistics, and Theory. Average scores on these four goals (each out of a possible 20) were as follows:

CJ System: 12.57
Methods: 12.12
Statistics: 5.63
Theory: 13.88

It is quite clear from this measure that our graduate students are weakest in the Statistics category, which is consistent with the findings from the initial portfolio assessments from Spring 2005. These findings revealed that our statistics curriculum was relatively deficient; this weakness is discussed and addressed below. The scores in the other categories met the goals outlined in the OAP, albeit marginally: all were above 12, with Theory receiving the highest scores (close to 14). However, even the scores that met the OAP goals are rather low and are cause for concern, because none of these scores was high. These deficiencies will be examined and discussed by our faculty in the near future, but for now the most pressing issue is graduate students' very low scores in the area of statistics on both the portfolio and 686 assessments.

Update on 686 Assessments

A recent analysis of a group of ten 686 papers was conducted. As before, the papers were scored on the four areas. The findings were as follows:

CJ System: 16.20
Methods: 13.90
Statistics: 16.20
Theory: 16.70

It is notable that scores on the 686 exams improved in all areas. This is likely due to a more pragmatic approach to teaching the required core curriculum of the Master's program. Specifically, the statistics course has been altered so that students work with data sets from the first week of the course and apply the concepts to actual data, which had not been done prior to 2005. The methodology courses are also more applied than before 2005: students are essentially required to design an actual study and build it from a solid theoretical framework. These are some of the likely reasons for the increase in 686 exam scores.

How We Have Addressed Deficiencies Regarding the MA Data

To address deficiencies in the theory area, we have added an additional theory course to the graduate curriculum, offered in the Fall 2008 term: an advanced theory elective entitled Crime over the Life-Course (CJUS 590), which offers far more insight into current theoretical development. This course will be proposed as a permanent addition to the curriculum in the coming year. In this course, both modern methodology and statistical analyses (e.g., interactions) are discussed.

Also, the level of statistics being taught has been elevated by completely overhauling the required graduate statistics course, which involves introducing new statistical modeling.

Specifically, instead of focusing on relatively elementary statistics (e.g., descriptive statistics, chi-square), the new version of the graduate statistics course has students go beyond that to the more sophisticated, applied analyses used by researchers in the current literature (e.g., partial correlations, multivariate/logistic regression models, validity/reliability estimates, factor analysis), while still covering the basics (e.g., measures of central tendency, reliability/validity of scales) in a more applied and contemporary manner that reflects the types of statistics most common in current criminological literature. The goal of the new version of this course is to teach students how to actually "do it" in terms of contemporary criminological research. Students begin applying the concepts presented in the course to actual databases in the first week, and each weekly assignment requires students to analyze data and write a paper that explains the statistical findings and discusses what the statistics actually mean in plain language.

Furthermore, our department offers a sequence of courses on crime mapping and environmental analyses using GIS software, which is a fast-growing and, in a sense, cutting-edge area in the field. (It should be noted that we have an expert in environmental crime analysis, Dr. Bichler, who has helped provide this addition to our curriculum.) The goal of this transition has been to train graduate students in how to do modern criminological research. By exposing our students to the statistical and analytical techniques most commonly used in the extant criminological literature, we prepare them to go on to a PhD program or to apply such analytical techniques to crime data if they are hired to do so by criminal justice agencies.
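To give a concrete sense of the kind of applied weekly exercise the revised statistics course emphasizes, the sketch below fits a simple logistic regression on a synthetic data set. The variables, data, and model are invented for illustration only and do not correspond to any actual course assignment or departmental data.

    # Illustrative sketch (not an actual course assignment): a weekly-style exercise
    # in which a student fits a logistic regression predicting a binary outcome
    # (here, hypothetical recidivism) from a small set of predictors, then reports
    # the coefficients in plain language. All data below are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 200
    prior_arrests = rng.poisson(2, n)      # hypothetical predictor
    age = rng.integers(18, 60, n)          # hypothetical predictor
    X = np.column_stack([prior_arrests, age])

    # Synthetic outcome: more likely with more prior arrests, less likely with age.
    latent = 0.6 * prior_arrests - 0.05 * age + rng.normal(0, 1, n)
    y = (latent > np.median(latent)).astype(int)

    model = LogisticRegression().fit(X, y)
    print("Coefficients (prior arrests, age):", model.coef_[0])
    print("Intercept:", model.intercept_[0])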

Crime Analysis Outcomes Assessment

The Outcomes Assessment Plan for the Crime Analysis Option was not settled until recently, owing to the option's recent addition to our curriculum. Therefore, we have not yet engaged in much student learning outcomes assessment activity for it. However, we have already administered the first round of a post-test that compares seniors in that option with seniors in the regular CJ major. Unfortunately, only a very small number of the students tested (??) were graduating seniors, so we will wait until we have a larger number of graduating seniors in the Crime Analysis Option before comparing and contrasting their outcomes. We plan to administer the examination in more courses to obtain a larger sample of entry-level and graduating-level students in the Crime Analysis Option. At this time, we know relatively little about the Crime Analysis program with respect to outcomes assessment, but in the next few years we should be able to make the types of assessments discussed above for the BA and MA programs in Criminal Justice and to revise our curriculum appropriately to improve the Crime Analysis Option.