
Thank you for joining us virtually to watch this pre-recorded webinar overview of the 2016 National Survey of Student Engagement, also known as the NSSE. The Office of Assessment fielded the NSSE in the Spring of 2016 to first-year and senior students on Youngstown State University's campus.

Just to give you a brief overview of the agenda: we'll spend the first couple of minutes discussing the survey tool and how it links to Youngstown State's mission statement and strategic plan. We'll also briefly explore the connection NSSE data has to our accreditation body, the Higher Learning Commission. The bulk of our time, however, will be spent presenting key NSSE results. This is meant to be only a snapshot of the data. Detailed data reports can be requested from the Office of Assessment. Also, any campus stakeholders are welcome to request an individualized presentation.

The primary intent of NSSE is to measure the concept of student engagement.

The NSSE survey tool looks at student engagement through the lens of two critical features of collegiate quality: what students do and what institutions do. Self-reported questions examine what students do in terms of the time and energy they put toward educationally enriching activities. Questions also look at students' perception of how the institution creates opportunities for them to participate in activities linked to learning.

NSSE has been around since 1999. The survey is housed in the Center for Postsecondary Research at the Indiana University School of Education. The survey instrument was updated in 2013. This past year, 557 institutions in the United States and Canada participated.

YSU fields the NSSE as a check-up on student engagement on our campus. The survey is not meant to be a stand-alone measure of student engagement. Rather, NSSE serves as one of the first steps in facilitating appropriate reflection on, and assessment of, research-backed national engagement indicators for the purpose of continuous improvement. NSSE creates conversations around the opportunities that best engage YSU students.

The content of the survey is broken into four parts. First, the survey tool asks students to report the frequency with which they engage in activities that represent effective educational practice. Second, students record their perceptions of the college environment associated with achievement, satisfaction, and persistence. Third, students estimate their educational and personal growth since starting college. And finally, students provide information about their background, including age, gender, race or ethnicity, living situation, educational status, and major field.

To represent the multiple dimensions of student engagement, 10 Engagement Indicators are calculated from 47 NSSE survey questions, or items. The 10 Engagement Indicators are highlighted in the red boxes on your screen, and they are grouped within four themes, highlighted in the pink area of your screen. The Engagement Indicators and their component items were rigorously tested both qualitatively and quantitatively in a multiyear effort that included student focus groups, cognitive interviews, and two years of pilot testing and analysis. As a result, each indicator provides valuable, concise, actionable information about a distinct aspect of student engagement. In the Engagement Indicators report, each indicator is expressed on a 60-point scale. Component items are converted to the 60-point scale, with "never" equaling 0, "sometimes" equaling 20, "often" equaling 40, and "very often" equaling 60, then averaged together to compute student-level scores. Institutional Engagement Indicator scores are the weighted averages of student-level scores for each class level.
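The rescaling and averaging just described is simple arithmetic; the short Python sketch below illustrates it. The function names and sample responses are hypothetical, shown only to make the calculation concrete, not NSSE's actual code.

```python
# Hypothetical sketch of the 60-point rescaling described in the webinar.
RESCALE = {"never": 0, "sometimes": 20, "often": 40, "very often": 60}

def student_indicator_score(responses):
    """Average a student's rescaled component items onto the 0-60 scale."""
    values = [RESCALE[r] for r in responses]
    return sum(values) / len(values)

def institutional_score(student_scores, weights):
    """Weighted average of student-level scores for one class level."""
    return sum(s * w for s, w in zip(student_scores, weights)) / sum(weights)

# One student answering four component items of an indicator:
print(student_indicator_score(["often", "sometimes", "very often", "often"]))  # 40.0
```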

For a detailed look at the survey, you can download the survey instrument on the YSU Office of Assessment website. A link is also included on the current slide that provides information on the rigor of the survey instrument and methodology. The survey was administered directly through a credible third-party survey organization to YSU first-year and senior students. The YSU Office of Assessment provided the email addresses of all first-year and senior YSU students in the Spring of 2016 to the survey administrator. Gift cards were used as incentives, and reminder emails were sent to increase the response rate. Additionally, in 2016 YSU chose to add two modules, Global Learning and First-Year Experiences/Senior Transitions, which were administered as add-ons to the survey instrument. These tools are also available for download on the YSU Office of Assessment website.

As a student-centered institution, we should be using a sound instrument to listen to students' experiences. Surveys like the NSSE (and the Noel-Levitz, which we will administer in the Spring of 2017) help our office identify areas of concern and opportunities to improve our practice in providing a well-rounded educational experience for our students. As we analyze the NSSE data, we are intentional about centering data and action steps around the mission statement of the university. Reading through the mission statement on the screen, you can see some of the specific ways in which the YSU Mission Statement aligns with NSSE results.

It is common for institutions to use NSSE data as evidence for accreditation. This slide provides a summary of how data points map to Higher Learning Commission criteria. Related data will be shared with the corresponding criterion committees.

One of the strengths of using the NSSE is that we are able to benchmark our performance against other universities that also participate in the survey administration. A great deal of planning went into selecting the best possible comparison institutions.

We chose three comparison groups. The first group is all institutions that participated in the survey. The second is a historical peer group; this group was originally chosen by a former YSU president, has been used in past years when NSSE was administered, and is strictly for benchmarking across multiple years. The third, the Peer Institution group, is the best match for a true comparison; the next slide gives a detailed description of how we chose that group. We also have the ability to break down NSSE data by college for comparison. When comparing within or across groups, it is important to note that NSSE measures process, not achievement.

A distance analysis was used to identify the 23 institutions most similar to Youngstown State University on several variables. Schools were first filtered by Carnegie classification. Then the variables listed in the gray table were used to establish our peer group through a distance analysis. Listed in the red table are the universities that make up our peer comparison group. These institutions have similar university and student characteristics.
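A distance analysis of this kind can be sketched as follows: standardize each variable across candidate schools, then rank candidates by their Euclidean distance from the target institution's profile. The data, variable count, and function names below are hypothetical illustrations, assuming a standard Euclidean approach; the actual analysis may differ in its details.

```python
import math

def standardize(values):
    """Convert one variable's values to z-scores (mean 0, sd 1)."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / sd for v in values]

def nearest_peers(target_profile, candidate_profiles, k=23):
    """Rank candidate schools by Euclidean distance to the target's profile."""
    distances = [(name, math.dist(target_profile, profile))
                 for name, profile in candidate_profiles.items()]
    return sorted(distances, key=lambda pair: pair[1])[:k]

# Hypothetical standardized profiles (e.g., enrollment, admit rate):
candidates = {"School A": [0.1, 0.2], "School B": [2.0, 2.0], "School C": [0.0, 0.0]}
print(nearest_peers([0.0, 0.1], candidates, k=2))  # School C first, then School A
```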

As a quick review, the NSSE survey was administered in the Spring of 2016 to first-year and senior YSU students through email, with reminder emails. YSU's overall response rate is consistent with, to slightly higher than, past years and national averages.

Our first-year response rate was 25%, or 475 first-year students. Our senior response rate was 28%, or 642 senior students. You'll note the table on the screen also includes an overall sampling error, also referred to as the margin of error. Sampling error is an estimate of how much the true score on a given item could differ from the estimate based on a sample. For example, if the sampling error is plus or minus 5% and 40% of your students reply "Very often" to a particular item, then the true population value is most likely between 35% and 45%.
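A margin of error like the one described can be approximated with the usual normal approximation for a sample proportion, with a finite-population correction since respondents are a sizable share of all students. The formula choice and the population count used below are assumptions for illustration, not NSSE's documented method.

```python
import math

def margin_of_error(p, n, N, z=1.96):
    """Approximate 95% margin of error for a sample proportion p,
    from n respondents out of a population of N students."""
    standard_error = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((N - n) / (N - 1))  # finite-population correction
    return z * standard_error * fpc

# Hypothetical: 40% of the 642 senior respondents, assuming ~2,300 seniors total
print(round(margin_of_error(0.40, 642, 2300), 3))  # 0.032, i.e. about +/- 3.2%
```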

This table breaks down the characteristics of the students who responded to the survey versus the overall YSU population. As a note, this table uses institutional IPEDS categories submitted in the population file, not self-reported demographic data. Looking at the table, you'll note some places where the difference between respondents and population is noticeable, specifically the female and male breakdown, with a 6 to 7% difference between survey respondents and the student population.

Up until this point we've been looking at the purpose and quality of the NSSE survey methods. From here, we move on to highlighted pieces of data.

NSSE identifies the top and bottom performing survey items for our first-year and senior students based on statistical significance and effect size, providing a snapshot of the greatest differences between YSU respondents and our peer institutions.

Take a moment to read through YSU's five highest-performing items for first-year students and five highest-performing items for senior students. {pause} Reading through the table, you'll note that courses including a community-based project or service learning appear for both first-year and senior students. Also notable is the academic emphasis of these items: YSU's highest-performing items are all related to academic engagement with faculty, coursework, peers, or support services.

Take a moment to read through the lowest-performing items on the screen. {pause} Our lowest-performing items for first-year and senior students share little similarity. Four of the five lowest-performing items for first-year students are related to diverse perspectives and experiences. For senior students, four of the five lowest-performing items are related to reading and/or summarizing and comprehending information.

As a reminder, NSSE has established 10 Engagement Indicators that survey items and data are organized around.

These 10 indicators are organized within four themes and represent the multiple dimensions of student engagement.

YSU participated in the 2013 NSSE survey, so we have the ability to look at trends between the 2013 and 2016 scores. Differences in scores do not indicate statistical significance, and it is critical to note that these are not matched samples. Rather, we use mean scores to look at trends in the data and to identify areas for further investigation. As a reminder on the scale: each indicator is expressed on a 60-point scale, component items are converted with "never" equaling 0, "sometimes" equaling 20, "often" equaling 40, and "very often" equaling 60, then averaged together to compute student-level scores, and institutional Engagement Indicator scores are the weighted averages of student-level scores for each class level. Red arrows indicate a decrease in mean, green arrows indicate an increase in mean, and a black line indicates no change. We'll take a slight pause to allow you to read the Engagement Indicator scores on this slide and the next. {pause}

{pause}

This table presents how our Engagement Indicator scores fare in comparison to our peer group. Five areas of statistically significant difference in scores are found. For first-year students, the reflective & integrative learning and discussions with diverse others averages are significantly lower than our peer institutions'. For senior students, learning strategies scores are significantly lower than our peer institutions', while collaborative learning and student-faculty interaction scores are statistically significantly higher than our peer schools'. You may have observed that all of the areas considered significantly lower than our peers were also areas where our mean score decreased from 2013 to 2016.

Institutions are able to choose from a variety of Topical Modules to add to the standard NSSE instrument. These modules are short sets of questions on a designated topic. Rooted in strategic thought from the Provost, YSU added two modules to this year's NSSE: the first was First-Year Experiences & Senior Transitions, and the second was Global Learning.

The First-Year Experiences and Senior Transitions module included a set of items only for first-year students and a set only for seniors, with questions adapted from the Beginning College Survey of Student Engagement and the Strategic National Arts Alumni Project, respectively. The first-year items focused on academic perseverance, help-seeking behaviors, and institutional commitment, while the senior items explored post-graduation plans, links between the academic major and future plans, and confidence in skills developed during college. Because of the nature of modules, it is important to note that benchmarking is against the 237 other schools that participated in this module, not our peer group.

There were six survey items in the module where first-year students scored significantly differently from the other schools participating in the module.

Three significant items can be identified here as areas of challenge in comparison to the overall module mean: students studying when there were other interesting things to do, students finding additional information for course assignments when they don't understand the material, and students participating in course discussions even when they don't feel like it.

Additionally, three significant items can be identified here as positive differences: students have less difficulty getting help with schoolwork, students seek help from family members on schoolwork more often, and, most notably, only 23% of first-year respondents seriously considered leaving YSU in the past year, while the module mean was 33%.

For the questions directed at senior students, the slide shows the five survey items that revealed statistically significant differences from the other schools participating in the module.

One difference to mention is the first item: "Do you already have a job after graduation?" As a reminder, this survey was fielded to YSU students in the spring semester of their senior year. 37% of respondents answered yes, which is lower than the module mean by 7% and may be a place for further investigation.

The second added module, Global Learning, assesses student experiences and coursework that emphasize global affairs, world cultures, nationalities, religions, and other international topics. The module complements items on the core NSSE questionnaire about student experiences with people from different backgrounds, course emphasis on integrative and reflective learning, and participation in study abroad. Because of the nature of modules, it is important to note that benchmarking is against the 64 other schools that participated in this module, not our peer group.

YSU's mean scores are statistically significantly below the group mean on 19 of 20 items for first-year students and all 20 items for senior students. This includes students' perception of how well YSU prepares them for life and work in an increasingly globalized era. The full data report on the Global Learning module can be requested from the YSU Office of Assessment.

Paired with some of the lower-performing items on the main NSSE questionnaire, this is a topic area that warrants further investigation. To align with the mission of YSU, it becomes crucial to identify areas where YSU can provide opportunities for diverse educational experiences and foster global perspectives.

Because of the volume of NSSE data available, it would be impossible to cover all of it in this webinar. The Office of Assessment has combed through the data to pick a few additional areas that we find relevant as an overview. As a reminder, for access to full reports and the report builder, visit the YSU Office of Assessment website.

Before looking at other data, we can get a snapshot of a typical week for a YSU first-year and senior student in comparison to our peers. Take a moment to read through the table and reflect on what this tells us about our students.

A set of questions on the survey tool asks seniors how, and to what level, the university has prepared them in 10 different areas. If you reference back to the mission of YSU, these 10 items align very closely with the university's mission statement. In the table you can see that only a little over half of students feel the university has prepared them to be informed and active citizens. In fact, the bottom three items are consistent with some of our lowest-performing items and the results from the Global Learning module: citizenship, understanding people from other backgrounds, and solving complex real-world problems. At the top of the list, however, 86% of seniors feel that the university has prepared them (very much or quite a bit) to think critically and analytically.

It is notable that perceived gains at a very much or quite a bit level among seniors increased from 2013 to 2016 on all ten items. Increases ranged between 2 and 10 percent, with the biggest increase evident in the item "Developing or clarifying a personal code of values/ethics."

Another selection of data is the self-reported quality of interactions respondents feel they have with other students, faculty, and advisors. The highest percentage of negative interactions, and the lowest percentage of positive interactions, is with advisors, with 20% of students noting negative interactions.

In comparison to our peers, one of the lowest-performing areas of the survey concerns first-year students and experiences that promote diverse perspectives and learning. As you can see on the slide, YSU scores lower than our peer institutions on 8 survey items, ranging from diverse perspectives being included in courses to understanding people of other backgrounds. This area of the survey data is one that warrants further investigation and intentional action.

The NSSE survey designates certain activities as High-Impact Practices. These are enriching educational experiences that go beyond typical classroom learning and provide high levels of engagement opportunity for students. Six High-Impact Practices are designated: being part of a formal learning community, participating in service learning, conducting research with a faculty member, participating in an internship or field experience, studying abroad, and completing a culminating senior experience or capstone.

Student engagement research recommends that all students participate in at least two high-impact practices over the course of their undergraduate experience: one during the first year and one in the context of their major. YSU students indicate higher levels of participation in high-impact practices than our peer institutions: 91% of senior students have participated in at least one high-impact practice, with 68% participating in at least two.

Overall, the majority of both first-year and senior students (over 80%) rate their experience at YSU as excellent or good. One of the primary reasons the Office of Assessment fields the NSSE survey is to identify areas for continuous improvement. The data we reviewed today covers some areas of strength and other areas of opportunity for student engagement at YSU. The NSSE survey data provides us with a starting point to understand YSU students and create more opportunities for engagement.

These questions for reflection mark the end of the webinar. However, the opportunity to get involved in investigating NSSE or other continuous improvement data through the Office of Assessment is ongoing. Visit the Office of Assessment website or contact our office for more information.