Student Risk Screening Scale


Transcription:

Student Risk Screening Scale This next screening tool is the Student Risk Screening Scale. This is a free-access screening tool, also developed out of the University of Oregon, by a man named Tom Drummond, and it has been available since 1994. Free access means it costs nothing to purchase, because it's something you can build within your school's information system. This is a very simple screening tool: it has seven items, and they are the items listed here. Each student in a school is rated on just those seven items at three time points, using a Likert-type scale. I want to be really clear: when you first read these items, you're thinking, "Oh my gosh, those items take my breath away." You are not saying that a student is a liar or a behavior problem. What you are saying is that you're looking to see whether these behaviors occur together and to a high enough degree, because if they do, that predicts really important academic and behavioral outcomes for kids. With this screening tool, every single student gets evaluated on those items, so teachers, based on what they know to be true about these kids, fill it out, and each student gets a total score. If a student scores 0 to 3, they're considered low risk; 4 to 8 is considered moderate risk; and 9 to 21 is considered high risk. In many of the districts that we support, these are already set up. They build them on a secure, trusted drive, so I would click on a particular school in a district, and underneath that I would have a folder for each teacher, permissioned so only that teacher can go in there to complete the ratings, because it's not a discussion, it's one teacher's independent rating. Some districts put the teacher's ID number on the folder, so that as people get married or unmarried and names change, that ID number doesn't change.
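The scoring just described can be sketched in a few lines of code. This is a minimal illustration, not an official SRSS implementation: it assumes seven items each rated 0 to 3 on the Likert-type scale, sums them, and applies the cut scores from the transcript (0 to 3 low, 4 to 8 moderate, 9 to 21 high). The function name is my own.

```python
# Illustrative sketch (not an official SRSS implementation): sum the seven
# Likert-rated items (0-3 each) and apply the cut scores described above:
# 0-3 low risk, 4-8 moderate risk, 9-21 high risk.

def srss_risk_category(item_ratings):
    """Return (total, risk category) for one student's seven item ratings."""
    if len(item_ratings) != 7:
        raise ValueError("The SRSS has exactly seven items.")
    if not all(0 <= r <= 3 for r in item_ratings):
        raise ValueError("Each item is rated on a 0-3 Likert-type scale.")
    total = sum(item_ratings)
    if total <= 3:
        return total, "low risk"
    if total <= 8:
        return total, "moderate risk"
    return total, "high risk"

print(srss_risk_category([0, 1, 0, 1, 0, 0, 0]))  # (2, 'low risk')
print(srss_risk_category([2, 1, 1, 2, 1, 1, 1]))  # (9, 'high risk')
```

In a district's actual setup this logic typically lives as a spreadsheet formula in the secure shared drive rather than as code; the sketch just makes the thresholds explicit.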
I click into that folder and there's a spreadsheet, and that spreadsheet is where my students' names and ID numbers are prepopulated. And you rate across, not down. We'll have another opportunity for you to learn more about the logistics, but for today, just keep this big idea in mind: it's just those seven items, and they predict very important outcomes for kids. This is an example of what it would look like if you were to click into one of the folders; ideally these would be prepopulated with students' names and ID numbers, and then you would rate across. So in this illustration in front of you, you see that one student had a 0, meaning that their teacher had no knowledge that the child had stolen anything in the past. Each of these items may surprise you in the sense that there is no operational definition for each one, which is initially unsettling, but the scale has really strong predictive validity. As I mentioned, each student gets a total score that places them into one of those categories. Here are data from a middle school that has been screening for some time, since 2004. The first time they screened, they had 77 percent of the kids in their school placed in the low-risk category, 17 percent in moderate, and 6 percent in high. As you look at these data in front of you, you see that the green bar tends to get bigger over time, suggesting that as the school implements its Tier 1 effort with fidelity, you start to see an increase in the percentage of kids at low risk. I do feel like I should tell you, when you're looking at 2008, that those results are not typical. You would expect about 3 to 5 percent of kids in the high-risk category and about 10 to 15 percent scoring in the yellow, if you will. Now, when that particular school started to see a slight increase in risk the next year, they're like, "Oh my gosh, what's going on?" The first thing they did is they went back to their treatment integrity data, and they realized that, as they had been hiring new teachers through the years, they had forgotten to train those teachers in the full aspects of their Tier 1 efforts. So they did a booster session, retrained their school, and the next year they were back on point again. I want to mention, when you look at these data, that one of the reasons I think the school was so successful was strong leadership on their team, and also that they took in social validity information twice a year, in fall and spring, and used it to make revisions at the end of every year. So they were constantly using that continuous improvement feedback loop. Now if you look at the 2011 data, you see there are 12 kids in the high-risk category and 20 in the moderate-risk category, so these pieces of information would be used in conjunction with other information, like how those kids are doing in terms of grade-point average and attendance, to see what additional needs they might have. I have so much respect for this team. The person who is currently principal at this school used to be the vice principal when they first started these efforts. He told me at one point that he feels like his job has shifted dramatically: he used to be a disciplinarian, and now he sees himself as an interventionist, meaning that he uses these data to be proactive, actively searching for kids who might need extra support. When they first started screening, this was one of the first middle schools in the country to use this measure, which was initially designed for use just at the elementary level. And now there have been multiple validation studies showing how well it works in middle and high schools. If you look at this particular slide, it shows you information on how well the measure predicts how kids do. Teachers completed that first rating, and it was just first-period teachers.
So they as a team decided, and the team included an administrator with decision-making authority, two general education teachers, a special education teacher, and some outside support people, and they met regularly to install these practices in their school. After they did their fall rating, which was done by first-period teachers as I mentioned (they picked first period because it had an extra 10 minutes of homeroom time attached, so they felt that was the period with the most contact time with students), we waited until the end of that academic year. At the end of the year we collected information on every single student in the school, all using ID numbers, not student names, and then we ran these analyses. And what we can tell you is that, based on those fall scores, the kids who started off at low risk ended the year with, on average for that group, fewer than two office discipline referrals. The kids who started the year in the moderate-risk category earned, on average, five office discipline referrals, and the kids who started the year in the high-risk category earned, on average, eight office discipline referrals. So that screening information actually predicted how those kids did that year in terms of earning office discipline referrals. It also predicted the average number of days they spent in on-campus suspension. Then, for every teacher who feels like, "Hey, that's nice, but I've really got to focus on my academic courses so that I can meet my obligations for state testing," I understand that, but if you look at these data, those same scores actually predicted how kids ended the year academically as well. For example, the kids in the low-risk group ended the year with a 3.35 grade-point average, and the students in the moderate- and high-risk categories had GPAs of 2.63 and 2.32. There was no statistically significant difference between the moderate- and high-risk means, which means that any sign of risk at the beginning of the year is jeopardizing kids academically, so we need to intervene. And in middle school (I'm a former middle school teacher), the way kids fail middle school is by not doing their work. So when you look at these data on the average number of classes failed for those groups: the kids who began the year in the low-risk category failed, on average, less than one class over the course of the year; the kids at moderate risk failed almost three; and the kids at high risk failed almost four. So the question becomes, is it worth your time to do these screenings to detect who's going to have challenges? And I think almost unequivocally people say yes, it's worth the time, particularly when you consider that the first screener I showed you takes about 45 minutes to do and this one takes about 15 minutes. So it's less than one faculty meeting to get that information. Since this time we've also done studies at the high school level, and in several other middle schools, and I want you to know that those predictive validity analyses yield the same information. So the short story is that this information predicts how kids do academically and behaviorally across the grade span. A common question that comes up for me is: are those capturing the same kids? Does that longer one work the same as the shorter one? I again want to say, I think that first one is absolutely the gold standard for systematic screening, but I also want you to know that the analyses we've recently done suggest that if you're looking for kids with acting-out behavior, those two tools are working pretty well to capture kids with externalizing behavior patterns. The one I just showed you, with those seven items, was not intended to capture kids who have internalizing issues.
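The predictive-validity summary just described (group students by their fall risk category, then average each group's year-end outcomes) can be sketched as follows. The student records and field names here are invented for illustration; they are not the study's data.

```python
# A minimal sketch of the predictive-validity summary described above:
# group students by fall SRSS risk category, then average each group's
# year-end outcomes (office discipline referrals, GPA, classes failed).
# The records below are made-up illustrations, not the study's data.

from collections import defaultdict

def outcome_means_by_risk(students, outcome_key):
    """Mean of one year-end outcome for each fall risk category."""
    sums, counts = defaultdict(float), defaultdict(int)
    for s in students:
        sums[s["fall_risk"]] += s[outcome_key]
        counts[s["fall_risk"]] += 1
    return {risk: sums[risk] / counts[risk] for risk in sums}

students = [
    {"fall_risk": "low", "odrs": 1, "gpa": 3.4},
    {"fall_risk": "low", "odrs": 0, "gpa": 3.3},
    {"fall_risk": "moderate", "odrs": 5, "gpa": 2.6},
    {"fall_risk": "high", "odrs": 8, "gpa": 2.3},
]

print(outcome_means_by_risk(students, "odrs"))
# {'low': 0.5, 'moderate': 5.0, 'high': 8.0}
```

In practice this grouping is done on deidentified records keyed by student ID number, exactly as the transcript describes.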
It was more focused on externalizing behavior. When I compare those two screening tools, to see how well they work, I can use ROC curves, that is, receiver operating characteristic curves. Basically, the way it works is this: if I wanted to determine whether, say, Nathan or Katie or Abbey is at risk, I could flip a coin and be right half the time, heads or tails. My goal with these screenings is to improve on chance estimates, so I want to improve above that 50 percent line. If you look at the dots on this graph, they represent scores on the SRSS as compared to the SSBD's acting-out side, and they suggest that the SRSS improves chance estimates by .45. So it means that these screening data are working equally well to detect kids with acting-out behavior. Surprisingly, although it wasn't intended to do so, those original seven items were also doing a fairly decent job of capturing kids who have internalizing issues, improving chance estimates by 30 percent. Perfect prediction would have been falling in this entire box. So this got us thinking: what could we add to the Student Risk Screening Scale to adjust the tool so it captures kids with internalizing issues? Recently, we have been working on determining which items we can add to the scale to look for kids with internalizing issues. Currently we are piloting seven new items at the elementary, middle, and high school levels, and I can tell you right now that our current work at the elementary level suggests that five of those items are sufficient to improve the predictive validity of those data by 5 percent. This means we're close to improving chance estimates by 35 percent. This is good news because it takes very little time and very little money. I'm not suggesting that you pick one or the other. I think the question isn't "Should we screen?" It's "Where do we begin?"
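The "improvement over chance" idea can be made concrete with a small sketch. The area under the ROC curve (AUC) equals the probability that a randomly chosen at-risk student scores higher on the screener than a randomly chosen not-at-risk student; 0.5 is coin-flip chance, so AUC minus 0.5 is the improvement over chance. The scores below are invented for illustration and chosen so the result matches the .45 figure mentioned; the pairwise (Mann-Whitney) method is a standard way to estimate AUC, not something specific to this study.

```python
# Sketch of the ROC "improvement over chance" idea. AUC is estimated here
# by the pairwise (Mann-Whitney) method: the fraction of (at-risk,
# not-at-risk) pairs where the at-risk student has the higher screener
# score, counting ties as half. The scores below are invented.

def auc(at_risk_scores, not_at_risk_scores):
    """Pairwise (Mann-Whitney) estimate of the area under the ROC curve."""
    wins = 0.0
    for pos in at_risk_scores:
        for neg in not_at_risk_scores:
            if pos > neg:
                wins += 1.0
            elif pos == neg:
                wins += 0.5
    return wins / (len(at_risk_scores) * len(not_at_risk_scores))

at_risk = [9, 12, 4, 15]       # SRSS totals for students flagged by the SSBD
not_at_risk = [0, 2, 1, 3, 5]  # SRSS totals for students not flagged

area = auc(at_risk, not_at_risk)
print(area)                    # 0.95: chance estimates improved by .45
```

A perfect screener would score every at-risk student above every not-at-risk student (AUC of 1.0, the "entire box" on the slide); a coin flip gives 0.5.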
Knowing now that we can use these screening tools and look for kids who are on track to have some struggles, we want to know which screening tool to put in place.

In terms of the work at the elementary level that's been done to date: whether you're looking at elementary, middle, or high school, those cut scores that I already showed you can be used to put kids into groups. So if you look, 0 to 3 is low risk, 4 to 8 is moderate risk, and 9 to 21 is high risk. We've also learned that for the five items that work for internalizing issues at the elementary level, these are the cut scores. They are preliminary in nature; we have one study published on this, and we encourage additional inquiry in this area. But those cut scores are: if you score 0 to 1, you're at low risk; 2 to 3, you're at moderate risk; and 4 to 15, you're at high risk. The reason this is important is that now we can look to see how kids are performing. I really do believe that we can support our teachers in looking for kids with these challenging behaviors. I would absolutely agree that we have other ways of doing this when we're looking to find kids with externalizing behavior patterns. Many PBIS schools, Positive Behavior Intervention and Support schools, use office discipline referral data to look for kids who have challenges. In those schools, if a kid has 0 to 1 office discipline referrals, everything's considered to be in the green. If they have 2 to 5, they're considered to potentially need Tier 2 support, and if they have 6 or more, they're considered high risk. But the problem with that is you're waiting for kids to earn those office discipline referrals, meaning they're still struggling until they have acquired enough referrals to be connected with support. My biggest concern is that kids with internalizing issues are not doing things that yield office discipline referrals. So if we're not screening, we're missing an opportunity to support those kids whose behaviors are far less easy to detect.
If you were using the SRSS, which we now call the SRSS-IE, for internalizing and externalizing behaviors, it's done the same way as the original Student Risk Screening Scale. Again, this is an illustration at the elementary level with fictitious data, so you would have names and ID numbers, and the teachers rate across. This slide illustrates conditional formatting; we've worked with some very talented tech people to build these systems in a number of different districts. The first total score over here represents the sum of the first seven items, and you can use that from kindergarten all the way through high school; the cut scores there are, again, 0 to 3 low risk, 4 to 8 moderate, and 9 to 21 high risk. The second column of totals, which right now we're just looking at at the elementary level, uses 0 to 1 as low, 2 to 3 as moderate, and 4 to 15 as high. We're not yet doing decision-making with combined scores. So here are the original seven items, and here are the five. Keeping this in mind, here are actual data from a school doing this for the first time; this is their winter data, and this school is in its first year of implementation. For the original seven items, you see 77 percent of kids in the low category, 18 percent in moderate, and 4 percent in high. You can also look at this by grade level, to see where the variability is in a school as to where kids might need extra support. And here we see a little less risk: about 78 percent of the kids in the low category, 13 percent at moderate, and just under 8 percent at high risk. And again, I can look at that by grade level to see how to focus my intervention efforts. As we go into the upper years in elementary school, we see these breakdowns again by 3rd, 4th, and 5th grade.
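The spreadsheet logic just described, with two separate totals per row and conditional formatting keyed to two sets of cut scores, can be sketched as follows. This is an illustrative sketch only, with hypothetical function names; the cut scores are the ones stated above, and scores are kept separate because, as noted, no decisions are made with a combined score.

```python
# Illustrative sketch of the SRSS-IE spreadsheet logic described above:
# two separate totals per student, the original seven externalizing items
# (cut scores 0-3 / 4-8 / 9-21) and the five internalizing items used at
# the elementary level (cut scores 0-1 / 2-3 / 4-15). No combined score.

def categorize(total, cuts):
    """Map a subscale total onto low/moderate/high using (low_max, mod_max)."""
    low_max, moderate_max = cuts
    if total <= low_max:
        return "low"
    if total <= moderate_max:
        return "moderate"
    return "high"

def srss_ie_row(ext_items, int_items):
    """Score one spreadsheet row: both subscale totals and both categories."""
    ext_total, int_total = sum(ext_items), sum(int_items)
    return {
        "externalizing": (ext_total, categorize(ext_total, (3, 8))),
        "internalizing": (int_total, categorize(int_total, (1, 3))),
    }

row = srss_ie_row([0, 1, 0, 0, 1, 0, 0], [1, 1, 1, 0, 1])
print(row)  # {'externalizing': (2, 'low'), 'internalizing': (4, 'high')}
```

Note how a student can come out low risk on the externalizing items yet high risk on the internalizing ones, which is exactly the kind of student the original seven items were not designed to find.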
I also want to let you know that we are working on another screening tool, the Student Risk Screening Scale for Early Childhood. Those are different items, because of course developmentally things look a little different. You're more than welcome to read about that in this article published in Topics in Early Childhood Special Education.

If you are with somebody right now, we encourage you to take the next five minutes to pause and think through how to integrate screening into your tiered system of support. If you're by yourself, we encourage you to jot down a few notes on the remaining questions you might have regarding systematic screening. We'll catch you back in just a few minutes. I hope you had a good conversation, or at least had a chance to jot down some notes. I want to reiterate that we just went over two screening tools. One was the Systematic Screening for Behavior Disorders, produced by Hill Walker, Herb Severson, and Ed Feil from the University of Oregon. The second was the Student Risk Screening Scale by Drummond, the free-access one I was talking about. Both of those, I think, are strong screening tools, but they're definitely not the only ones. For information on the SSBD, the first one I mentioned, I encourage you to look online. I highly recommend that you explore these different options for systematic screening, and at the end we'll walk through a little decision-making guide to help focus your search as you're picking a screening tool.