Science Instructional Practices Survey (SIPS): A New Tool for Identifying Progress in Teaching the NGSS [1]


Introducing the Science Instructional Practices Survey (SIPS)

The Science Instructional Practices Survey (SIPS) is a new, short survey tool designed to document shifts in 3rd-10th grade teachers' instructional practice toward practices aligned with the Next Generation Science Standards (NGSS). Such assessment is important for documenting progress and for guiding future decisions about the effort involved in shifting to new standards. The SIPS improves on existing survey instruments by identifying a broad range of instructional practices that support student inquiry in science, as well as communication and critical thinking skills. The instrument highlights key shifts in teaching and learning that are central to the design of the NGSS and of similar new science standards being adopted by states throughout the country.

The Need to Measure Shifts in Instructional Practice

Ambitious efforts are under way to implement a new vision for science education in the United States, both in states that have adopted the NGSS and in states creating other standards. Teacher educators across the country are supporting teachers' shifts in practice toward the new standards. With these efforts, it will be important to document shifts in science instruction toward the goals of the NGSS and of broader science education reform. Survey instruments are often used to capture instructional practices, but no survey capturing NGSS science instructional practices previously existed. To address this need, we developed and validated the SIPS survey instrument. Survey data will help policymakers and teacher educators understand teachers' progress in implementing the NGSS Science and Engineering Practices (NGSS SE Practices). Exemplar questions include:

- How are teachers providing opportunities for students to engage in science and engineering practices in their classrooms?
- What shifts are teachers making to provide science instruction that makes science accessible for all students?
- To what extent does participation in professional development activities support teachers in shifting their practices?

[1] This report is based on Hayes, K. N., Lee, C. S., DiStefano, R., O'Connor, D., & Seitz, J. (2016). Measuring science instructional practice: A survey tool for the age of NGSS. Journal of Science Teacher Education, 27, 137-164.

A Tool for Tracking Trends, Not Evaluating Teachers

The survey questions ask teachers to rate their efforts to engage students in using the NGSS SE Practices as part of their comprehensive classroom instruction. The SIPS is NOT intended to evaluate individual teacher practice, but rather to report average results across a group of teachers. When the SIPS is administered before and after professional development activities, district administrators, professional development providers, and policymakers can use it to shed light on the impact of professional learning experiences on teacher practice. The SIPS can also be used more generally to highlight how instruction is changing over time.

NGSS Science and Engineering Practices (SE Practices)

The NGSS are designed to shift science learning away from memorizing facts and following procedures toward more student-centered instruction, in which teachers serve as facilitators, encouraging students to ask their own questions, conduct investigations, analyze information, and solve problems. Student knowledge and prior conceptions are treated as assets in the classroom. The standards identify core scientific concepts to be learned at each grade level, as well as the eight NGSS SE Practices that students should apply throughout their learning:

1. Asking questions and defining problems
2. Developing and using models
3. Planning and carrying out investigations
4. Analyzing and interpreting data
5. Using mathematics and computational thinking
6. Constructing explanations and designing solutions
7. Engaging in argument from evidence
8. Obtaining, evaluating, and communicating information [2]

The Science Partnership

This educator brief is one of a series published by the Science Partnership, an 8-year project to develop, implement, and study a comprehensive K-12 professional development model for science education. The Science Partnership is a collaborative led by California State University, East Bay and the Alameda County Office of Education, with partners including the California Science Project, school districts, and teacher leaders. Its work supports science teachers in shifting their instructional practices by developing teacher knowledge, teacher leadership, and organizational capacity. The Science Partnership focuses on schools that serve predominantly low-income, underrepresented students in the East Bay region of the San Francisco Bay Area.

The SIPS survey was developed by Kathryn Hayes, Christine Lee Bae, Rachelle DiStefano, Dawn O'Connor, and Jeff Seitz. The work was supported by National Science Foundation Grant No. 0962804. For more information, visit www.sciencepartnership.org.

[2] SIPS items relating to this NGSS SE Practice did not factor out properly and were not included in the final survey.

Documenting NGSS SE Practices in Support of Implementation

The SIPS fills a gap in survey research tools for investigating this type of student-centered instruction. In addition to asking direct questions about engaging students in the NGSS SE Practices, the SIPS includes items measuring traditional instruction (such as lecture) and engagement of students' prior knowledge, areas not typically covered by other science education survey instruments. The survey tool consists of 24 questions covering six areas of instructional practice, four of which link to the NGSS SE Practices as indicated below:

1. Instigating an Investigation (NGSS Practices 1 and 3)
2. Data Collection and Analysis (NGSS Practices 3-5)
3. Critique, Explanation and Argumentation (NGSS Practices 6-7)
4. Modeling (NGSS Practice 2)
5. Traditional Instruction
6. Prior Knowledge

It will take several years for teachers and school leaders to become fully adept at teaching based on the Next Generation Science Standards. As states move to adopt and implement these new and challenging standards, patience and persistence will be required to help teachers, school leaders, students, and parents through the transition. Data from the SIPS can help education leaders understand the long-term movement of teaching practices toward the NGSS vision, and assist them in targeting and allocating resources to support teachers in their professional learning. [3]

How to Administer the SIPS

The SIPS can be completed in approximately 10-20 minutes. Teachers who have had little exposure to the NGSS, as well as those experienced with the standards, may take the survey. We recommend collecting data from at least 10 teachers to obtain a valid average, and administering the survey both before and after professional development. That said, we recommend administering it no more than twice per year.

Interpreting Results

The SIPS is scored by calculating the average Likert rating (1 = Never, 2 = Rarely/a few times a year, 3 = Sometimes/once or twice a month, 4 = Often/once or twice a week, 5 = Daily or almost daily) for each respondent on each of the six aspects of instructional practice (see the SIPS Scoring Guide), then averaging across all respondents. Reminder: the SIPS is NOT intended for evaluation or comparison of individual teacher practice. All analysis should use average scores drawn from at least 10 surveys. With that data in hand, school and district leaders can examine trends and compare progress in each area.

[3] The SIPS should be used cautiously with elementary teachers, as the survey questions were designed primarily for teachers who specialize in science instruction. For example, elementary teachers' responses on the frequency of their science instruction may not be comparable for data aggregation.

SIPS Survey

Rating scale: Never | Rarely (a few times a year) | Sometimes (once or twice a month) | Often (once or twice a week) | Daily or almost daily

How often do your students do each of the following in your science classes?

1. Generate questions or predictions to explore
2. Identify questions from observations of phenomena
3. Choose variables to investigate (such as in a lab setting)
4. Design or implement their OWN investigations
5. Make and record observations
6. Gather quantitative or qualitative data
7. Organize data into charts or graphs
8. Analyze relationships using charts or graphs
9. Analyze results using basic calculations
10. Explain the reasoning behind an idea
11. Respectfully critique each other's reasoning
12. Supply evidence to support a claim or explanation
13. Consider alternative explanations
14. Make an argument that supports or refutes a claim
15. Create a physical model of a scientific phenomenon (like creating a representation of the solar system)
16. Develop a conceptual model based on data or observations (model is not provided by textbook or teacher)
17. Use models to predict outcomes

How often do you do each of the following in your science instruction?

18. Provide direct instruction to explain science concepts
19. Demonstrate an experiment and have students watch
20. Use activity sheets to reinforce skills or content
21. Go over science vocabulary
22. Apply science concepts to explain natural events or real-world situations
23. Talk with your students about things they do at home that are similar to what is done in science class (e.g., measuring, boiling water)
24. Discuss students' prior knowledge or experience related to the science topic or concept

SIPS Survey Scoring Guide

To score the SIPS survey, a score for each factor is calculated by averaging the ratings of the items within that factor. For example, the score for the factor Instigating an Investigation is the average of the ratings on items 1 to 4.

Factor 1. Instigating an Investigation
NGSS SE Practices: 1) Asking questions; 3) Planning and carrying out an investigation
Score: average of items 1 to 4
  1. Generate questions or predictions to explore
  2. Identify questions from observations of phenomena
  3. Choose variables to investigate (such as in a lab setting)
  4. Design or implement their OWN investigations

Factor 2. Data Collection and Analysis
NGSS SE Practices: 3) Planning and carrying out an investigation; 4) Analyzing and interpreting data; 5) Using mathematical and computational thinking
Score: average of items 5 to 9
  5. Make and record observations
  6. Gather quantitative or qualitative data
  7. Organize data into charts or graphs
  8. Analyze relationships using charts or graphs
  9. Analyze results using basic calculations

Factor 3. Critique, Argumentation, and Explanation
NGSS SE Practices: 6) Constructing explanations; 7) Engaging in argument from evidence
Score: average of items 10 to 14
  10. Explain the reasoning behind an idea
  11. Respectfully critique each other's reasoning
  12. Supply evidence to support a claim or explanation
  13. Consider alternative explanations
  14. Make an argument that supports or refutes a claim

Factor 4. Modeling
NGSS SE Practice: 2) Developing and using models
Score: average of items 15 to 17
  15. Create a physical model of a scientific phenomenon (like creating a representation of the solar system)
  16. Develop a conceptual model based on data or observations
  17. Use models to predict outcomes

Factor 5. Traditional Instruction
Score: average of items 18 to 21
  18. Provide direct instruction to explain science concepts
  19. Demonstrate an experiment and have students watch
  20. Use activity sheets to reinforce skills or content
  21. Go over science vocabulary

Factor 6. Prior Knowledge
Score: average of items 22 to 24
  22. Apply science concepts to explain natural events or real-world situations
  23. Talk with your students about things they do at home that are similar to what is done in science class (e.g., measuring, boiling water)
  24. Discuss students' prior knowledge or experience related to the science topic or concept
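The scoring rule above is easy to automate. Below is a minimal sketch, assuming responses live in a pandas DataFrame with one row per teacher and columns q1-q24 holding the Likert codes (the column names are illustrative, not part of the published instrument). It averages items within each factor per teacher, then averages across teachers, and enforces the 10-survey minimum recommended above.

```python
import pandas as pd

# Item ranges per factor, taken from the SIPS Scoring Guide above.
FACTORS = {
    "Instigating an Investigation": range(1, 5),                # items 1-4
    "Data Collection and Analysis": range(5, 10),               # items 5-9
    "Critique, Argumentation, and Explanation": range(10, 15),  # items 10-14
    "Modeling": range(15, 18),                                  # items 15-17
    "Traditional Instruction": range(18, 22),                   # items 18-21
    "Prior Knowledge": range(22, 25),                           # items 22-24
}

def score_sips(responses: pd.DataFrame) -> pd.Series:
    """Average items within each factor per teacher, then across teachers."""
    if len(responses) < 10:
        raise ValueError("Use group averages drawn from at least 10 surveys.")
    per_teacher = pd.DataFrame({
        factor: responses[[f"q{i}" for i in items]].mean(axis=1)
        for factor, items in FACTORS.items()
    })
    return per_teacher.mean()  # one group-level average score per factor
```

Calling score_sips on a pre-PD and a post-PD response set yields the two sets of six group averages that the examples later in this brief compare.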

SIPS Instrument Design and Validation

Instrument development and testing proceeded in seven phases. In Phase 1, project researchers conducted an extensive review of the literature, from which they documented areas of instructional practice. Phase 2 focused on creating survey items, including review and modification of items from existing instruments (see the sources below) as well as development of new items. Particular attention was given to writing items that differentiate levels of student involvement and cognitive demand. For example, "Identify questions from observations of phenomena" calls for a higher level of student cognition than simply "Generate questions or predictions to explore."

Sources of science instructional practices response items:
- Horizon Survey of Science and Mathematics Education (Banilower et al., 2013)
- Science teaching practices (Lee et al., 2009)
- Scientific inquiry scale (Llewellyn, 2013)
- PSOP, an observation tool (Forbes et al., 2013)
- EQUIP, an observation tool (Marshall et al., 2009)
- NGSS practices (NRC, 2012)

Phases 3 and 4 involved validation through expert review and teacher cognitive interviews (i.e., listening to teachers verbalize their thought processes as they answered the questionnaire). The 31-item survey was then tested with 397 science teachers in grades three through ten. In Phase 5, construct validity was assessed through exploratory factor analysis (EFA); the emergent factors were then tested through confirmatory factor analysis (CFA) with an independent sample, and the six-factor model met the criteria on all goodness-of-fit indices. Phase 6 established reliability through internal consistency (Cronbach's alpha = .80-.88; a computation sketch appears at the end of this section). In Phase 7, external validity was evaluated by examining correlations between survey scores and hours of professional development, as well as school demographics.

Cautionary Recommendations

As a self-report survey, the SIPS is inherently vulnerable to participant bias. In particular, teachers who are newer to the NGSS and lack a deep understanding of the instructional shifts may give less accurate ratings. That said, self-report surveys are commonly used in educational research to understand the types of practices in active use in the classroom, and they provide a relatively simple and inexpensive way to determine general trends across a large sample of teachers. Potential bias can be ameliorated by conducting retrospective surveys (i.e., asking teachers to rate themselves looking back to a previous moment in time, perhaps prior to participating in selected professional development activities) or by triangulating survey results with classroom observations.
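The Phase 6 reliability check is a standard computation. The sketch below shows how Cronbach's alpha could be computed for one factor's items, assuming responses are arranged as a teachers-by-items NumPy array; this is an illustration of the formula, not the project's own analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score),
    where `items` has shape (n_teachers, n_items) for a single SIPS factor."""
    k = items.shape[1]                                # number of items in the factor
    item_variances = items.var(axis=0, ddof=1).sum()  # each item's variance, summed
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances / total_variance)
```

Values in the .80-.88 range reported for the SIPS factors indicate that the items within each factor hang together well.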

Examples of the SIPS Instrument in Use

Example 1: Comparing Average Teacher Practices

In this example, 155 teachers indicated the amount of time they engaged in each area of instructional practice. Teachers reported engaging in Modeling (Factor 4) the least, closely followed by Instigating an Investigation (Factor 1) and Critique, Explanation and Argumentation (Factor 3). The relatively low average ratings for Factors 3 and 4 correspond to scholarship suggesting that modeling, explanation, and argumentation are the practices least familiar to teachers and least often implemented (Capps & Crawford, 2013; Forbes et al., 2014). As expected, Traditional Instruction averaged relatively high, although the highest average belonged to Prior Knowledge.

[Figure 1. Average teacher rating by factor, on the 1-5 scale: Modeling 2.5; Instigating an Investigation 2.7; Critique, Explanation and Argumentation 3.1; Data Collection and Analysis 3.2; Traditional Instruction 3.5; Prior Knowledge 3.7.]

Example 2: Changes in Instructional Practice Following PD

Figure 2 shows teacher practices before and after a four-month professional development program. These 28 teachers initially rated low on several constructs having to do with modeling, investigations, and critique. However, they demonstrated substantive and statistically significant pre-to-post increases on every factor except Traditional Instruction. Whether these increases remain in place could be tested by asking the teachers to take the survey again the following year.

[Figure 2. Average teacher ratings by factor, before and after professional development.]

Example 3: Statistical Analysis of the Relationship with PD

We used the SIPS instrument to analyze whether teachers' number of science professional development hours had a significant relationship with their rating on each subscale. Regression analysis demonstrated that the number of PD hours significantly but weakly predicted the amount of time teachers spent on 1) Instigating an Investigation, 3) Critique, Explanation and Argumentation, and 4) Modeling, in each case explaining 3-4% of the variance (p < .05). In addition, PD hours had a negative relationship with 5) Traditional Instruction, explaining 4% of the variance (p < .05).
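For readers who want to run this kind of analysis on their own SIPS data, here is a minimal sketch of a per-subscale simple regression using SciPy. The DataFrame layout and column names (a 'pd_hours' column plus one column per factor score) are assumptions for illustration, not part of the published study.

```python
from scipy import stats

def pd_hour_effects(df, factor_columns):
    """Regress each SIPS factor score on reported PD hours, one factor at a time."""
    for column in factor_columns:
        result = stats.linregress(df["pd_hours"], df[column])
        r_squared = result.rvalue ** 2  # share of variance explained by PD hours
        print(f"{column}: slope = {result.slope:+.3f}, "
              f"R^2 = {r_squared:.3f}, p = {result.pvalue:.3f}")
```

A positive slope with p < .05 corresponds to the weak positive relationships reported above, and a negative slope to the Traditional Instruction result; R^2 values near .03-.04 match the 3-4% of variance explained.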