Collegiate Learning Assessment Data: CLA Regression, 2007-2008 Data
Karen H. Nelson and Judy Wheaton
November 2009

Dr. Nelson is Associate Vice President for Institutional Effectiveness and Professor of Psychology at Austin College. Judy Wheaton is Director of Institutional Research and Assessment at Austin College.

Austin College administered the CLA (Collegiate Learning Assessment) in both 2006-2007 and 2007-2008 to first-year (FY) and senior-year (SY) students. We had committed to do so as part of our work in a Teagle Consortium on Value Added by Liberal Education, in which we were partnered with Furman University, Washington & Lee University, and Juniata College. In each test administration, our goal was to get 100 respondents. We had 64 FY students take the CLA in Fall 2006 and 92 SY students in Spring 2007; 84 FY students took it in Fall 2007 and 80 SY students in Spring 2008.

Brief history: The CLA is used in conjunction with the Teagle Foundation Value Added Grant (2005-2008). The instrument was developed by the Council on Aid to Education and launched in 2004. On their website (http://www.cae.org/content/pro_collegiate.htm), they say:

   "Our measures are designed to simulate complex, ambiguous situations that every successful college graduate may one day face. Life is not like a multiple choice test, with four or five simple choices for every problem. So we ask students to analyze complex material and provide written responses. The CLA measures are uniquely designed to test for reasoning and communications skills that most agree should be one outcome of a college education."

These data were reported to the Committee on Institutional Effectiveness and shared with our Teagle partners.

CLA summary data, 2006-2008:

  Year         Overall assessment   # four-year institutions   AC better than
  2006-2007    At Expected          115                        40%
  2007-2008    Above Expected       176                        71%

The first comparison we can make is between the overall assessments of value added. The good news is that we moved from an overall assessment of At Expected in 2007 to Above Expected in 2008. The 2006-2007 CLA Institutional Report states: "A value of 5 means that you (Austin College) performed better than at least 40 percent of (115) four-year institutions." The 2007-2008 CLA Institutional Report states: "Austin College contributes more to the learning gains made by students than 71 percent of the 176 four-year undergraduate institutions participating in the 2007-2008 CLA." Not only does our overall assessment move up, our relative ranking does as well (from doing better than 40% to doing better than 71% of four-year institutions). Note, however, that the number of comparison institutions also jumps, which may well mean that more weak four-year schools are checking out the CLA, making us look better not because our students did better but because more students at other institutions did worse.

Typical criticisms of the CLA are:
1. If samples are not truly representative of first- and senior-year students at the college, the data are questionable;
2. There is little quality control over incentives, selection of samples, motivation, etc., so the data are not reliable;
3. Many critics question whether the CLA is a valid measure of college learning.

In order to better understand our students' performance, we completed a regression analysis of the 2007-2008 data. This enables us to look at correlations between performance on the CLA and the SAT (Scholastic Aptitude Test) Total, Math, Verbal, and Writing scores, the ACT, and cumulative GPA (for seniors only). In the first table below (Table 1), statistical significance is reported for the Total SAT score and performance on four different aspects of the CLA. When students enter the computer lab to complete the CLA, they are randomly assigned to one of two conditions.
Half complete a 90-minute performance task. The other half complete the analytical writing task, comprised of a 45-minute make-an-argument task and a 30-minute critique-an-argument task. The first three columns in each of the tables pertain to those students who did the performance task: did the SAT Total (or whatever measure is at issue) correlate significantly with their basic score, their scaled score relative to other Austin College students, or their scaled score relative to all students completing the task at all schools? The next three columns ask the same question about the analytic writing score, and the final two columns use the basic scores on the two components of the analytical writing scale. Therefore, if we look at Table 1, notice that for the first-year (FY) students, in all but the last column (the critique-an-argument scale), Total SAT score correlates significantly with all CLA measures. The senior-year participants similarly
have no significant correlation between their Total SAT and this last critique-an-argument task. However, among senior (SY) students, the performance task is utterly unrelated to Total SAT score. In other words, if I know a student's SAT score, I can predict her performance on the CLA if she is a first-year student for all but one score, but I cannot predict the performance of a senior on the performance task.

Table 1. SAT TOTAL CORRELATIONS WITH CLA DATA
[Cell values were not recoverable in this copy. Rows: Freshman, Senior. Columns: performance task basic score and scaled ranks (AC, All); analytic writing basic score and scaled ranks (AC, All); Make-An-Argument; Critique-An-Argument.]

When we look at Table 2, examining only the Math SAT score, we find an even more interesting pattern. Math SAT correlates significantly with the analytical writing scores on the CLA (calculated all three ways), but with neither the performance task scores nor the two analytical writing component scores, for either first-year or senior students. Why would Math predict analytical writing in all three ways it is analyzed?

Table 2. SAT MATH CORRELATIONS WITH CLA DATA
[Cell values were not recoverable in this copy. Rows: Freshman, Senior; columns as in Table 1.]

As if you weren't confused already, you might expect Verbal SAT (Table 3) to be an even better predictor of a reading and writing task than Math SAT (keeping in mind that these sections of the SAT are bubble-sheet, multiple-choice tasks). You
would be right for freshmen and dead wrong for seniors. Verbal SAT is completely unrelated to senior CLA performance, but Verbal SAT in freshmen nicely predicts all eight scores.

Table 3. SAT VERBAL CORRELATIONS WITH CLA DATA
[Cell values were not recoverable in this copy. Rows: Freshman, Senior; columns as in Table 1.]

The next analysis (Table 4) looks at the SAT Writing score, which we had for first-year students but not for seniors. Once again, the SAT score fails to correlate with any measure of the performance task or with the critique-an-argument component of the analytical writing task.

Table 4. SAT WRITING CORRELATIONS WITH CLA DATA
[Freshman cell values were not recoverable in this copy; the Senior row is N/A in all eight columns.]

Some students in the first-year class submitted ACT scores instead of, or as well as, SATs (see Table 5). Interestingly, the pattern here is almost the opposite of the SAT Math data: a student's ACT score correlates only with the performance task scores and nothing else.
Table 5. ACT TOTAL CORRELATIONS WITH CLA DATA
[Freshman cell values were not recoverable in this copy; the Senior row is N/A in all eight columns.]

The last analysis looks at correlations between seniors' cumulative GPAs and the CLA scores. The performance task once again is utterly unpredicted by cumulative GPA, and neither of the specific analytical writing tasks is predicted, though the pooled analytic writing score is significantly correlated.

Table 6. CUM GPA CORRELATIONS WITH CLA DATA
[The Freshman row is N/A in all eight columns; Senior cell values were not recoverable in this copy.]

In the end, these data support our decision not to administer the CLA again until more national data are available. The performance task has consistently been shown in our samples not to relate to general measures of ability or to specific measures of academic success at Austin College. The performance task itself, while theoretically interesting, has little face validity for most college students: it is a 90-minute writing task, in the middle of which half of your classmates will begin to walk out because they have completed their 75-minute tasks; paper for organizing ideas is available only if you ask for it; and there is no context for framing the importance of the task.
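A note on replication: the significance tests summarized in Tables 1-6 are bivariate correlations (presumably Pearson correlations tested against zero). A minimal sketch of how such a correlation and its test statistic can be computed is below; the score arrays are hypothetical illustrations, not our actual student-level data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_stat(r, n):
    """t statistic for testing H0: rho = 0 with n paired observations (df = n - 2)."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# Hypothetical example: SAT Total and CLA performance-task scores for five students.
sat = [980, 1100, 1180, 1260, 1400]
cla = [1010, 1150, 1120, 1300, 1390]
r = pearson_r(sat, cla)
# With samples the size of ours (n around 60-90, so df > 58), |t| greater than
# roughly 2.0 corresponds to two-tailed significance at the .05 level.
print(round(r, 3), round(t_stat(r, len(sat)), 2))
```

In practice one would run this once per cell of each table (one SAT/ACT/GPA measure crossed with one of the eight CLA scores, separately for FY and SY students).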
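The caveat raised earlier about the growing comparison pool (that our percentile ranking can rise simply because weaker institutions join) is easy to illustrate with percentile arithmetic. The scores below are made up for illustration, not actual CLA institutional data.

```python
def percentile_rank(score, pool):
    """Fraction of the comparison pool scoring strictly below `score`."""
    return sum(1 for s in pool if s < score) / len(pool)

ac = 70                               # hypothetical Austin College value-added score
pool_2007 = [60, 65, 72, 75, 80]      # hypothetical comparison institutions
pool_2008 = pool_2007 + [50, 55, 58]  # weaker institutions join; AC's score is unchanged

print(percentile_rank(ac, pool_2007))  # 0.4   -> "better than 40%"
print(percentile_rank(ac, pool_2008))  # 0.625 -> "better than 62.5%"
```

Our own score need not change at all for the ranking to improve, which is why the jump from 115 to 176 comparison institutions matters when interpreting the move from "better than 40%" to "better than 71%."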