CHAPTER 4

RESULTS

Introduction

This study investigated the computer literacy and computer application skills of undergraduate college students. The study consisted of two phases: a faculty survey and a student assessment. The first phase was a Web-based faculty survey administered to a stratified random sample of faculty members from four-year public institutions in the state of Missouri. The purpose of this survey was to identify the basic computer skills undergraduate students need to be academically successful in post-secondary education. The survey data were also examined for trends and differences across the independent variables of subject/content area, institution, gender, and years of faculty experience. The second phase consisted of a series of assessments administered to a group of undergraduate students enrolled in a computer literacy course. Each assessment was administered before any instruction on the specific topic being tested. The purpose of these assessments was to evaluate the computer competencies of students entering post-secondary education. The assessment data were likewise examined for trends and differences across independent variables such as home state, number of high school computer courses taken, gender, and major field of study.
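Both phases relied on standard sampling arithmetic described in the sections that follow: the Krejcie and Morgan finite-population sample-size formula (as adapted by Patten, 2004) and proportional allocation across strata. The sketch below is an illustrative reimplementation of the published formula, not the study's actual procedure; function names and the code itself are the editor's, and the adapted table's entries can differ by a few units from the raw formula because of rounding.

```python
import math

def krejcie_morgan(N, chi_sq=3.841, P=0.5, d=0.05):
    """Krejcie & Morgan (1970) sample size for a finite population N.
    chi_sq: chi-square value for 1 df at 95% confidence; P: assumed
    population proportion (0.5 is most conservative); d: margin of error."""
    n = (chi_sq * N * P * (1 - P)) / (d ** 2 * (N - 1) + chi_sq * P * (1 - P))
    return math.ceil(n)

def allocate(strata, total_n):
    """Proportional stratified allocation: each stratum contributes the
    same percentage of its members, not the same count (Patten, 2004)."""
    N = sum(strata.values())
    return {s: round(pop / N * total_n) for s, pop in strata.items()}

# Stratum populations as reported in Table 7 (institutions coded 1-13).
strata = {1: 380, 2: 42, 3: 122, 4: 198, 5: 153, 6: 200, 7: 335,
          8: 564, 9: 314, 10: 1055, 11: 378, 12: 237, 13: 245}

print(krejcie_morgan(259))         # → 155, the student-assessment goal
print(allocate(strata, 354)[10])   # → 88, stratum 10's sample size in Table 7
```

For N = 4,223 the raw formula yields n = 353, a few units below the n = 357 goal taken from the adapted table. The proportional allocation reproduces every stratum sample size reported in Table 7; note that those thirteen allocations sum to 355, one more than the printed total of 354, which is the usual effect of per-stratum rounding.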

Population and Sample

Faculty Survey

The population eligible for inclusion in this portion of the study consisted of 4,223 faculty members from 13 four-year public institutions in the state of Missouri, as listed in the Official Manual, State of Missouri 2003-2004 (Missouri Secretary of State, 2004). A table of recommended sample sizes (n) for finite populations (N), developed by Krejcie and Morgan and adapted by Patten (2004), was used to determine the sample size. According to the table, a finite population of N = 4,223 required a sample size of n = 357 as the goal for this study. The population was divided into strata (subgroups) by institution. This stratified random sample ensured that subgroups were represented in the correct proportions: following Patten (2004), the same percentage of participants, not the same number of participants, was drawn from each stratum. Faculty members within each stratum (institution) were randomly selected using the randomize function in Microsoft Excel. A total of 1,416 emails containing the survey link and password were sent to the randomly selected faculty members, and 426 survey responses were received, a 30% response rate. One response was rejected because the participant answered no to the question agreeing to participate in the study, and 20 were rejected because of incomplete answers to survey items. This left 405 usable surveys, a 29% usable return rate. Table 7 illustrates the

proportionate stratified random sampling results. Strata have been numerically coded to protect the identity of the institutions.

Table 7

Proportionate Stratified Random Sample Results from a Population Divided into 13 Institution Strata

Stratum   Faculty (population)   Proportion   Stratum sample size   Faculty contacted   Survey responses   Usable responses
1              380                  .09               32                  127                  47                 44
2               42                  .01                4                   14                   0                  0
3              122                  .03               10                   41                  14                 14
4              198                  .05               17                   66                  20                 19
5              153                  .04               13                   51                  17                 15
6              200                  .05               17                   67                  36                 36
7              335                  .08               28                  112                  41                 40
8              564                  .13               47                  189                  50                 47
9              314                  .07               26                  105                  29                 27
10            1055                  .25               88                  354                  84                 80
11             378                  .09               32                  127                  40                 38
12             237                  .06               20                   79                  23                 21
13             245                  .06               21                   82                  25                 24
Total         4223                 1.00              354                 1416                 426                405

Student Assessment

The population eligible for inclusion in the student assessment portion of the study consisted of college freshmen from a small mid-western university enrolled in a computer literacy course. Permission was requested, through student consent forms, to use their assessment scores as part of the study. Signed consent forms were collected from 259 students. Ten participants were excluded because of missing demographic data, and 85 were excluded because of missing assessment scores, leaving a total of 164 students with complete demographic data and all assessment components. According to a table of recommended sample sizes (n) for population (N), developed by Krejcie and

Morgan and adapted by Patten (2004), a population of N = 259 requires a sample size of n = 155; thus, with 164 participants, the sample size goal was met.

Statistical Analysis

Faculty Survey Demographic Descriptive Statistics

The results of the faculty survey's initial question on the importance of computer literacy/skills in relation to student academic success at the post-secondary level indicated that faculty view such skills as highly important. As shown in Table 8, 259 (64.0%) of the respondents indicated computer literacy/skills were very important and 128 (31.6%) indicated they were important. Seventeen respondents (4.2%) indicated somewhat important and one respondent (0.2%) indicated not important.

Table 8

Importance of Computer Literacy/Skills in Relation to Student Academic Success

Scale                  f       %
Not important          1     0.2
Somewhat important    17     4.2
Important            128    31.6
Very important       259    64.0
Total                405   100.0

Of the 405 faculty survey respondents, 26 (6.4%) were Instructors, 111 (27.4%) were Assistant Professors, 124 (30.6%) were Associate Professors, 138 (34.1%) were Full Professors, and six (1.5%) identified as other, including Department Chairs, Emeritus faculty, and Co-directors of departments. Table 9 shows the frequency and percentage of faculty survey respondents in each department or content area. The Other category included areas such as

Library Science, Law, various medical areas, Engineering, Plant Microbiology and Pathology, Industrial Technology, Criminology, Kinesiology, Ecology, Safety Sciences, Social Work, Veterinary Medicine, Communication Science and Disorders, Forestry/Atmospheric Science, Biomedical Sciences, Nursing, Electrical and Computer Engineering, Anthropology, Optometry, and Environmental Engineering.

Table 9

Demographic Breakdown of Department or Content Area

Department or content area                                    f       %
Accounting/Econ/Finance                                      25     6.2
Agriculture                                                  15     3.7
Art                                                          11     2.7
Biological Sciences                                          14     3.5
Business Administration                                      14     3.5
Chemistry/Physics/Science Education                          26     6.4
Communication/Theatre/Languages                              11     2.7
Computer Science/Information Systems                         11     2.7
Education                                                    37     9.1
English                                                      18     4.4
Family & Consumer Sciences                                    7     1.7
Geology/Geography                                             7     1.7
Health/PE/Recreation/Dance                                   14     3.5
History/Humanities/Philosophy/Political Science              26     6.4
Marketing/Management                                         10     2.5
Mass Communications/Broadcasting/Digital Media/Journalism     4     1.0
Math/Statistics                                              17     4.2
Modern Languages                                              2     0.5
Music                                                        15     3.7
Psychology/Sociology/Counseling                              25     6.2
Other                                                        96    23.7
Total                                                       405   100.0

Of the 405 faculty respondents, 158 (39%) were female and 247 (61%) were male. Seventy-nine (19.5%) have been employed for five years or less at their current institution, 129 (31.9%) have been employed from 5-10 years, 109 (26.9%) have been

employed from 11-20 years, 67 (16.5%) have been employed from 21-30 years, and 21 (5.2%) have been employed for over 30 years at their current institution. In regard to total years in education, 25 (6.2%) of the 405 respondents have been in education for less than five years, 90 (22.2%) from 5-10 years, 115 (28.4%) from 11-20 years, 96 (23.7%) from 21-30 years, and 79 (19.5%) for over 30 years.

Reliability

The survey consisted of five major sections (subscales): computer concepts, word processing, spreadsheet, presentation, and database items. Each section consisted of eight individual survey items. Cronk (2004) indicates that one way to test reliability is to assess the internal consistency of the data by conducting an item-total analysis. The Spearman rho correlation was used for this analysis because the data were ordinal. According to Cronk (2004), item-total correlations should be positive and greater than 0.3 to indicate internal consistency. Tables 10-14 illustrate that the computer concepts, word processing, spreadsheet, presentation, and database items were internally consistent within each subscale, as none of the Spearman rho correlation values fell below 0.5. Table 15 indicates that all subscales within the survey were also internally consistent, with rho values in the 0.7 to 0.8 range.

Table 10

Spearman rho Correlations for Computer Concepts Items

Computer Concept Variables                                                      n      rho       p
Computer and information literacy, introduction to application
  software, word processing concepts, and inside the computer                  405    0.555   0.000
Internet, email, system software, and exploring the Web                        405    0.632   0.000
Current issues, emerging technologies, spreadsheet concepts,
  and data storage                                                             405    0.754   0.000
Presentation packages, special purpose programs, multimedia/
  virtual reality, and input/output                                            405    0.755   0.000
Database concepts, telecommunications, and networks                            405    0.726   0.000
Ethics and security                                                            405    0.626   0.000
Web page creation                                                              405    0.727   0.000
Locating and evaluating information on the Internet, effectively
  using search engines, determining the credibility of information             405    0.561   0.000

Table 11

Spearman rho Correlations for Word Processing Items

Word Processing Variables                       n      rho       p
Getting Started                                405    0.586   0.000
Insert and Modify Text                         405    0.716   0.000
Create and Modify Paragraphs                   405    0.761   0.000
Format Documents                               405    0.747   0.000
Manage Documents                               405    0.723   0.000
Working with Graphics                          405    0.804   0.000
Workgroup Collaboration                        405    0.686   0.000
Creating and Modifying Graphics                405    0.764   0.000

Table 12

Spearman rho Correlations for Spreadsheet Items

Spreadsheet Variables                           n      rho       p
Working with Cells and Cell Data               405    0.885   0.000
Managing Workbooks                             405    0.889   0.000
Formatting and Printing Workbooks              405    0.886   0.000
Modifying Workgroups                           405    0.873   0.000
Creating and Revising Formulas                 405    0.859   0.000
Creating and Modifying Graphics                405    0.821   0.000
Workgroup Collaboration                        405    0.781   0.000
Integrating a Spreadsheet with Other Software  405    0.819   0.000

Table 13

Spearman rho Correlations for Presentation Items

Presentation Variables                          n      rho       p
Creating Presentations                         405    0.860   0.000
Inserting and Modifying Text                   405    0.852   0.000
Inserting and Modifying Visual Elements        405    0.889   0.000
Modifying Presentation Formats                 405    0.870   0.000
Printing Presentations                         405    0.785   0.000
Working with Data from Other Sources           405    0.836   0.000
Managing and Delivering Presentations          405    0.828   0.000
Workgroup Collaboration                        405    0.794   0.000
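The reliability checks in this section rest on two textbook statistics: Spearman's rho for the item-total analysis, and Cronbach's alpha, reported below. As a minimal pure-Python sketch of both computations, run on short synthetic score vectors rather than the survey data (the function names and example numbers are the editor's illustrations, not the study's):

```python
def ranks(xs):
    """1-based ranks; tied values share the average of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1          # mean of the tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(items):
    """items: one list of respondent scores per survey item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(resp) for resp in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Synthetic example: one item's 1-4 ratings against a subscale total.
item = [4, 3, 4, 2, 1, 3, 4, 2]
total = [30, 25, 31, 18, 12, 24, 29, 20]
print(round(spearman_rho(item, total), 3))                     # → 0.964

# Two near-duplicate items yield an alpha close to 1.
print(round(cronbach_alpha([[1, 2, 3, 4], [2, 2, 3, 5]]), 3))  # → 0.952
```

In practice a statistics package would be used; the sketch only makes the arithmetic explicit. Note that the p = 0.000 entries in the tables are rounded package output and should be read as p < .001, not literally zero.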

Table 14

Spearman rho Correlations for Database Items

Database Variables                        n      rho       p
Getting Started                          405    0.895   0.000
Creating and Using Databases             405    0.931   0.000
Creating and Modifying Tables            405    0.918   0.000
Creating and Modifying Queries           405    0.877   0.000
Creating and Modifying Forms             405    0.869   0.000
Viewing and Organizing Information       405    0.926   0.000
Producing Reports                        405    0.871   0.000
Integrating with Other Applications      405    0.909   0.000

Table 15

Spearman rho Correlations between Subscales

Subscale Variables       n      rho       p
Computer Concepts       405    0.808   0.000
Word Processing         405    0.732   0.000
Spreadsheet             405    0.819   0.000
Presentation            405    0.835   0.000
Database                405    0.803   0.000

Cronbach's alpha was also computed as a second measure of internal consistency. Cronk (2004) notes that Cronbach's alpha treats a scale as measuring a single construct and determines the extent to which all items measure that same construct; a reliability coefficient close to 1.00 indicates good internal consistency, while a coefficient close to 0.00 indicates poor internal consistency. The instrument was tested in its entirety, and the five individual sections (subscales) of the survey were also tested independently. The Cronbach's alpha reliability coefficients for the individual subscales ranged from a low of 0.8338 to a high of 0.9675, all with p = 0.000. These results demonstrated a high level of internal reliability. The survey as a whole had a reliability coefficient of 0.8620, which also

demonstrated a high degree of internal consistency. Table 16 summarizes the reliability coefficients of the survey instrument.

Table 16

Cronbach's Alpha Reliability Coefficients

Subscale Variables         n     alpha      p
Computer Concepts         405    0.8338   .000
Word Processing           405    0.8864   .000
Spreadsheet               405    0.9484   .000
Presentation              405    0.9453   .000
Database                  405    0.9675   .000
All subscales combined    405    0.8620   .000

Descriptive Statistics

Each section (subscale) of the survey consisted of eight individual survey items. In the computer concepts section, topics were grouped according to the computer literacy course at the researcher's institution. Item 1 included the topics of computer and information literacy, introduction to application software, word processing, and inside the computer. Item 2 included understanding the Internet, email, system software, and exploring the Web. Item 3 included spreadsheets, current issues, emerging technologies, and data storage. Item 4 included presentation packages, special purpose programs, multimedia/virtual reality, and input/output. Item 5 included databases, telecommunications, and networks. Item 6 included ethics and security; Item 7, Web page creation; and Item 8, locating and evaluating information on the Internet. Table 17 displays the frequency and percentage results for the 405 faculty respondents in regard to the importance of various computer topics. Of the eight survey items in the computer concepts section, four were considered very important: Item 1 topics (60.2%) including computer and information literacy, introduction to application

software, word processing, and inside the computer; Item 2 topics (67.4%) including understanding the Internet, email, system software, and exploring the Web; Item 6 topics (42.2%) covering ethics and security; and Item 8 topics (68.4%) covering locating and evaluating information on the Internet. Computer concept topics considered important were Item 3 topics (36.8%) including spreadsheets, current issues, emerging technologies, and data storage, and Item 4 topics (39.0%) including presentation packages, special purpose programs, multimedia/virtual reality, and input/output. Item 5 topics (38.5%) including databases, telecommunications, and networks, and Item 7 (39.8%), Web page creation, were considered somewhat important.

Table 17

Frequencies and Percentages for Computer Concepts (n = 405)
(NI = Not important, SI = Somewhat important, I = Important, VI = Very important)

Item                                                      NI            SI            I             VI
Computer/information literacy, introduction
  to application software, word processing,
  and inside the computer                                5 (1.2%)     24 (5.9%)    132 (32.6%)   244 (60.2%)
Understanding the Internet, email, system
  software, and exploring the Web                        3 (0.7%)     35 (8.6%)     94 (23.2%)   273 (67.4%)
Spreadsheets, current issues, emerging
  technologies, and data storage                        41 (10.1%)   108 (26.7%)   149 (36.8%)   107 (26.4%)
Presentations, special purpose programs,
  multimedia/virtual reality, input/output              39 (9.6%)     97 (24.0%)   158 (39.0%)   111 (27.4%)
Databases, telecommunications, and networks             55 (13.6%)   156 (38.5%)   120 (29.6%)    74 (18.3%)
Ethics and security                                     15 (3.7%)     67 (16.5%)   152 (37.5%)   171 (42.2%)
Web page creation                                      114 (28.1%)   161 (39.8%)    84 (20.7%)    46 (11.4%)
Locating and evaluating information on the Internet      5 (1.2%)     26 (6.4%)     97 (24.0%)   277 (68.4%)

In addition to the computer concepts component, the faculty survey included items related to four computer application areas: word processing, spreadsheet,

presentation, and database skills. Each application area consisted of skills grouped into eight survey items. The faculty survey items, along with the skills associated with each item, can be found in Appendix I. Results of the computer application skills component are reported in Tables 18-21. In the word processing section, five of the eight items were rated very important: opening/closing a document and using help (80.5%), inserting and modifying text (72.6%), creating and modifying paragraphs (64.2%), formatting documents (58.0%), and managing documents (64.2%). Two items were considered important: working with graphics (36.3%) and creating and modifying graphics (35.1%). Workgroup collaboration was rated somewhat important (37.8%).

Table 18

Frequencies and Percentages for Word Processing Skills (n = 405)
(NI = Not important, SI = Somewhat important, I = Important, VI = Very important)

Item                                 NI            SI            I             VI
Getting started                     3 (0.7%)     15 (3.7%)     61 (15.1%)   326 (80.5%)
Insert and modify text              6 (1.5%)     22 (5.4%)     83 (20.5%)   294 (72.6%)
Create and modify paragraphs        8 (2.0%)     37 (9.1%)    100 (24.7%)   260 (64.2%)
Format documents                    9 (2.2%)     43 (10.6%)   118 (29.1%)   235 (58.0%)
Manage documents                    5 (1.2%)     33 (8.1%)    107 (26.4%)   260 (64.2%)
Working with graphics              31 (7.7%)    112 (27.7%)   147 (36.3%)   115 (28.4%)
Workgroup collaboration            40 (9.9%)    153 (37.8%)   146 (36.0%)    66 (16.3%)
Creating and modifying graphics    39 (9.6%)    138 (34.1%)   142 (35.1%)    86 (21.2%)

Although many of the percentages in the spreadsheet section were very close, the highest percentages indicated three of the eight items were important, while the other five

items were considered somewhat important. The three areas rated important were working with cells and cell data (29.1%), managing workbooks (29.4%), and creating and modifying graphics (33.3%). The other five areas, formatting and printing workbooks (37.0%), modifying workgroups (42.0%), creating and revising formulas (30.1%), workgroup collaboration (43.2%), and integrating with other software (40.7%), were considered somewhat important.

Table 19

Frequencies and Percentages for Spreadsheet Skills (n = 405)
(NI = Not important, SI = Somewhat important, I = Important, VI = Very important)

Item                                  NI             SI            I             VI
Working with cells and cell data     60 (14.8%)   116 (28.6%)   118 (29.1%)   111 (27.4%)
Managing workbooks                   69 (17.0%)   108 (26.7%)   119 (29.4%)   109 (26.9%)
Formatting and printing workbooks    79 (19.5%)   150 (37.0%)    98 (24.2%)    78 (19.3%)
Modifying workgroups                 88 (21.7%)   170 (42.0%)   102 (25.2%)    45 (11.1%)
Creating and revising formulas       96 (23.7%)   122 (30.1%)    95 (23.5%)    92 (22.7%)
Creating and modifying graphics      65 (16.0%)   125 (30.9%)   135 (33.3%)    80 (19.8%)
Workgroup collaboration             106 (26.2%)   175 (43.2%)    95 (23.5%)    29 (7.2%)
Integrating with other software      98 (24.2%)   165 (40.7%)   104 (25.7%)    38 (9.4%)

Presentation skills results show two areas rated very important: creating presentations (44.0%) and inserting and modifying text (41.5%). Four areas were considered important: inserting and modifying visual elements (37.0%), modifying presentation formats (32.1%), printing presentations (33.8%), and managing and delivering presentations (34.8%). Presentation skills involving working with data from

other sources resulted in a tie between somewhat important and important (35.1% each). Workgroup collaboration was rated somewhat important (38.0%).

Table 20

Frequencies and Percentages for Presentation Skills (n = 405)
(NI = Not important, SI = Somewhat important, I = Important, VI = Very important)

Item                                       NI            SI            I             VI
Creating presentations                    26 (6.4%)     67 (16.5%)   134 (33.1%)   178 (44.0%)
Inserting and modifying text              22 (5.4%)     68 (16.8%)   147 (36.3%)   168 (41.5%)
Inserting and modifying visual elements   30 (7.4%)     92 (22.7%)   150 (37.0%)   133 (32.8%)
Modifying formats                         49 (12.1%)   122 (30.1%)   130 (32.1%)   104 (25.7%)
Printing presentations                    33 (8.1%)    104 (25.7%)   137 (33.8%)   131 (32.3%)
Working with data from other sources      48 (11.9%)   142 (35.1%)   142 (35.1%)    73 (18.0%)
Managing and delivering presentations     32 (7.9%)     94 (23.2%)   141 (34.8%)   138 (34.1%)
Workgroup collaboration                   77 (19.0%)   154 (38.0%)   111 (27.4%)    63 (15.6%)

In the database section, all items were rated somewhat important.

Table 21

Frequencies and Percentages for Database Skills (n = 405)
(NI = Not important, SI = Somewhat important, I = Important, VI = Very important)

Item                                   NI             SI            I             VI
Getting started                       76 (18.8%)   122 (30.1%)   121 (29.9%)    86 (21.2%)
Creating and using databases          84 (20.7%)   132 (32.6%)   110 (27.2%)    79 (19.5%)
Creating and modifying tables         85 (21.0%)   150 (37.0%)   112 (27.7%)    58 (14.3%)
Creating and modifying queries       113 (27.9%)   154 (38.0%)    95 (23.5%)    43 (10.6%)
Creating and modifying forms         122 (30.1%)   154 (38.0%)    95 (23.5%)    34 (8.4%)
Viewing and organizing information    91 (22.5%)   153 (37.8%)    96 (23.7%)    65 (16.0%)
Producing reports                     80 (19.8%)   123 (30.4%)   114 (28.1%)    88 (21.7%)
Integrating with other applications  103 (25.4%)   157 (38.8%)    96 (23.7%)    49 (12.1%)

Of the faculty members surveyed, 185 (45.7%) indicated that a computer literacy/skills course was required of all majors within their department, while 220 (54.3%) indicated it was not. As summarized in Table 22, 345 (85.1%) respondents strongly agreed (48.1%) or agreed (37.0%) that a computer literacy/skills course, or an equivalent test-out, should be required of all undergraduate students.

Table 22

Frequencies and Percentages for Computer Course or Equivalent Test-out Required for All Undergraduate Students

                                        Strongly Disagree    Disagree      Agree         Strongly Agree
Require course or equivalent test-out       8 (2.0%)        52 (12.8%)   150 (37.0%)     195 (48.1%)

It is customary to report measures of central tendency and measures of dispersion when describing data; the mean is the most powerful measure of central tendency, while the standard deviation is the most powerful measure of dispersion (Cronk, 2004). Because the individual items within each section of the survey instrument were shown to be internally consistent, a composite score for each section (subscale) is used throughout the rest of this report rather than individual items. Each individual item within a section had a minimum score of one (not important) and a maximum score of four (very important). The composite score was calculated by adding the scores for all eight

individual items within each subscale; composite scores could therefore range from a minimum of eight to a maximum of 32. Table 23 summarizes the descriptive data for each subscale (dependent variable).

Table 23

Descriptive Statistics for Dependent Variables (n = 405)

Dependent Variable          Mean        sd
Computer concepts         24.1852   4.56610
Word processing skills    25.9926   4.63387
Spreadsheet skills        19.3802   6.76542
Presentation skills       22.7926   6.36446
Database skills           18.6889   7.14018

Trends and Relationship Comparisons

Analysis of variance (ANOVA) is a procedure for evaluating mean differences between two or more groups of subjects that vary on a single independent variable (Gravetter & Wallnau, 2004; Cronk, 2004). A one-way ANOVA was used to compare the mean computer concepts score (dependent variable) across the department or content area of the faculty member (independent variable). As illustrated in Table 24, a significant difference was found among departments (F(20, 384) = 3.39, p = .000). Tukey's HSD was used to determine the nature of the differences between the departments, as shown in Table 25; for brevity, only significant differences are shown. This analysis revealed that the History/Humanities/Philosophy/Political Science department had a significantly lower rating (m = 21.12, sd = 3.76) on computer concepts than the Computer Science/Information Systems department (m = 27.45, sd = 3.59). The Education department (m = 27.59, sd = 3.24) had significantly higher ratings on computer concepts than several other departments, including

Chemistry/Physics/Science Education (m = 21.96, sd = 5.11), English (m = 22.72, sd = 4.73), History/Humanities/Philosophy/Political Science (m = 21.11, sd = 3.76), Mathematics/Statistics (m = 22.41, sd = 4.84), Psychology/Sociology/Counseling (m = 22.76, sd = 4.18), and the Other category (m = 24.34, sd = 4.50). All other departments showed no significant difference in the importance of computer concepts.

Table 24 One-way ANOVA comparing Computer Concepts by Department
SS df MS F p
Between Groups 1265.170 20 63.259 3.394 .000
Within Groups 7157.941 384 18.640
Total 8423.111 404

Table 25 Tukey's HSD comparing Computer Concepts by Department
Department Department Mean Difference Std. Error Sig.
CS/IS History/Humanities/Philosophy/Political Science 6.3392(*) 1.55291 .009
Education Chemistry/Physics/Science Education 5.6331(*) 1.10487 .000
English 4.8724(*) 1.24072 .016
History/Humanities/Philosophy/Political Science 6.4792(*) 1.10487 .000
Math/Statistics 5.1828(*) 1.26503 .009
Psychology/Sociology/Counseling 4.8346(*) 1.11777 .003
Other 3.2508(*) 0.83544 .018
* The mean difference is significant at the .05 level.

A one-way ANOVA was used to compare the mean of word processing skills (dependent variable) across the department or content area (independent variable). As illustrated in Table 26, a significant difference was found among departments (F(20, 384) = 2.87, p = .000). Tukey's HSD was used to determine the nature of the differences between the departments, as shown in Table 27. This analysis revealed that the Education

department (m = 28.97, sd = 3.01) had higher ratings for word processing skills than several other departments, including History/Humanities/Philosophy/Political Science (m = 23.38, sd = 3.67), Mathematics/Statistics (m = 22.47, sd = 6.28), Psychology/Sociology/Counseling (m = 24.28, sd = 4.40), and the Other category (m = 25.64, sd = 5.11). The other departments showed no significant difference in the importance of word processing skills.

Table 26 One-way ANOVA comparing Word Processing Skills by Department
SS df MS F p
Between Groups 1128.478 20 56.424 2.871 .000
Within Groups 7546.500 384 19.652
Total 8674.978 404

Table 27 Tukey's HSD comparing Word Processing Skills by Department
Department Department Mean Difference Std. Error Sig.
Education History/Humanities/Philosophy/Political Sci 5.5884(*) 1.13446 .000
Math/Statistics 6.5024(*) 1.29891 .000
Psychology/Sociology/Counseling 4.6930(*) 1.14771 .009
Other 3.3376(*) 0.85782 .018
* The mean difference is significant at the .05 level.

A one-way ANOVA was used to compare the mean of spreadsheet skills (dependent variable) across the department or content area (independent variable). As illustrated in Table 28, a significant difference was found among departments (F(20, 384) = 6.24, p = .000). Tukey's HSD was used to determine the nature of the differences between the departments, as shown in Table 29. This analysis revealed that Accounting/Economics/Finance (m = 25.04, sd = 6.10) had higher ratings for spreadsheet skills than

other departments, including Art (m = 13.64, sd = 5.68), Communications/Theatre/Languages (m = 15.73, sd = 3.20), English (m = 14.89, sd = 4.63), History/Humanities/Philosophy/Political Science (m = 13.62, sd = 5.19), Music (m = 15.33, sd = 5.16), Psychology/Sociology/Counseling (m = 17.00, sd = 4.88), and the Other category (m = 19.06, sd = 7.30). Agriculture (m = 24.13, sd = 3.85) had higher spreadsheet ratings than Art, English, History/Humanities/Philosophy/Political Science, Music, and Psychology/Sociology/Counseling. Business Administration (m = 24.07, sd = 4.50) had higher spreadsheet ratings than Art, English, History/Humanities/Philosophy/Political Science, and Music. Chemistry/Physics/Science Education (m = 22.54, sd = 6.22) had higher spreadsheet ratings than Art, English, History/Humanities/Philosophy/Political Science, and Music. The Computer Science/Information Systems department (m = 25.64, sd = 5.85) had higher spreadsheet ratings than Art, Communications/Theatre/Languages, English, History/Humanities/Philosophy/Political Science, Music, and Psychology/Sociology/Counseling. The Education department (m = 20.95, sd = 5.86) and the Other category had higher spreadsheet ratings than the History/Humanities/Philosophy/Political Science department.

Table 28 One-way ANOVA comparing Spreadsheet Skills by Department
SS df MS F p
Between Groups 4536.719 20 226.836 6.242 .000
Within Groups 13954.723 384 36.340
Total 18491.442 404
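Each ANOVA table's F statistic and p value follow directly from its sums of squares and degrees of freedom. As a sketch, the following reproduces Table 28 (spreadsheet skills by department), using SciPy only for the F-distribution tail area; SPSS's "p = .000" simply means p < .001.

```python
from scipy import stats

# Sums of squares and df from Table 28 (spreadsheet skills by department).
ss_between, df_between = 4536.719, 20
ss_within,  df_within  = 13954.723, 384

ms_between = ss_between / df_between   # mean square between, ~226.836
ms_within  = ss_within / df_within     # mean square within,  ~36.340
f_stat = ms_between / ms_within        # ~6.24, as reported

# Upper-tail probability of F(20, 384); far below .001, printed as ".000"
p_value = stats.f.sf(f_stat, df_between, df_within)
```

The same two-line computation reproduces every one-way ANOVA table in this chapter, since each reports SS, df, MS, F, and p in the same layout.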

Table 29 Tukey's HSD comparing Spreadsheet Skills by Department
Department Department Mean Difference Std. Error Sig.
Accounting Art 11.4036(*) 2.18112 .000
Comm/Theatre/Languages 9.3127(*) 2.18112 .004
English 10.1511(*) 1.86347 .000
History/Humanities/Philosophy/Political Sci 11.4246(*) 1.68859 .000
Music 9.7067(*) 1.96884 .000
Psyc/Soc/Counseling 8.0400(*) 1.70506 .001
Other 5.9775(*) 1.35357 .002
Agriculture Art 10.4970(*) 2.39298 .003
English 9.2444(*) 2.10751 .003
History/Humanities/Philosophy/Political Sci 10.5190(*) 1.95459 .000
Music 8.8000(*) 2.20122 .012
Psyc/Soc/Counseling 7.1333(*) 1.96884 .046
Business Admin Art 10.4351(*) 2.42887 .004
English 9.1825(*) 2.14818 .004
History/Humanities/Philosophy/Political Sci 10.4560(*) 1.99836 .000
Music 8.7381(*) 2.24019 .018
Chem/Physics/Sci Ed Art 8.9021(*) 2.16827 .008
English 7.6496(*) 1.84841 .007
History/Humanities/Philosophy/Political Sci 8.9231(*) 1.67195 .000
Music 7.2051(*) 1.95459 .037
CS/IS Art 12.0000(*) 2.57048 .001
Comm/Theatre/Lang 9.9091(*) 2.57048 .021
English 10.7475(*) 2.30707 .001
History/Humanities/Philosophy/Political Sci 12.0210(*) 2.16827 .000
Music 10.3030(*) 2.39298 .004
Psyc/Soc/Counseling 8.6364(*) 2.18112 .014
Education History/Humanities/Philosophy/Political Sci 7.3306(*) 1.54269 .001
Other History/Humanities/Philosophy/Political Sci 5.4471(*) 1.33276 .009
* The mean difference is significant at the .05 level.
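Pairwise entries in Tukey tables such as Table 29 can be checked against the studentized range distribution. The sketch below is illustrative only: the group means and group sizes are hypothetical (per-department ns are not reported in this chapter), while the error term MS within = 36.340 with 384 df and 21 groups comes from the spreadsheet ANOVA in Table 28.

```python
import numpy as np
from scipy import stats

def tukey_kramer_pvalue(mean_i, mean_j, n_i, n_j, ms_within, df_within, k_groups):
    """Tukey-Kramer p value for one pairwise comparison after a one-way ANOVA."""
    # Standard error of the mean difference for unequal group sizes.
    se = np.sqrt(ms_within / 2.0 * (1.0 / n_i + 1.0 / n_j))
    q = abs(mean_i - mean_j) / se                      # studentized range statistic
    return stats.studentized_range.sf(q, k_groups, df_within)

# Hypothetical comparison: two departments with means 25.0 and 13.6 and
# (assumed) group sizes 10 and 13, using Table 28's error term.
p = tukey_kramer_pvalue(25.0, 13.6, n_i=10, n_j=13,
                        ms_within=36.340, df_within=384, k_groups=21)
```

The Tukey-Kramer form is the standard adaptation of Tukey's HSD to unequal group sizes, which applies here because departments differ in faculty counts.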

A one-way ANOVA was used to compare the mean of presentation software skills (dependent variable) across the department or content area (independent variable). As illustrated in Table 30, a significant difference was found among departments (F(20, 384) = 5.16, p = .000). Tukey's HSD was used to determine the nature of the differences between the departments, as shown in Table 31. This analysis revealed that the Agriculture department (m = 25.67, sd = 3.50) had higher presentation skill ratings than History/Humanities/Philosophy/Political Science (m = 18.27, sd = 5.77) and Music (m = 16.93, sd = 6.11). Business Administration (m = 28.00, sd = 2.80) had higher ratings than English (m = 18.67, sd = 5.36), History/Humanities/Philosophy/Political Science, Math/Statistics (m = 19.24, sd = 7.81), Music, and Psychology/Sociology/Counseling (m = 20.80, sd = 6.07). The Computer Science/Information Systems department (m = 25.55, sd = 5.65) had higher presentation ratings than Music. The Education department (m = 27.54, sd = 3.59) had higher presentation ratings than Chemistry/Physics/Science Education (m = 21.69, sd = 6.10), History/Humanities/Philosophy/Political Science, English, Math/Statistics, Music, Psychology/Sociology/Counseling, and the Other category (m = 23.35, sd = 6.17). The Other category had higher presentation skills ratings than History/Humanities/Philosophy/Political Science and Music.

Table 30 One-way ANOVA comparing Presentation Skills by Department
SS df MS F p
Between Groups 3463.948 20 173.197 5.155 .000
Within Groups 12900.630 384 33.595
Total 16364.578 404

Table 31 Tukey's HSD comparing Presentation Skills by Department
Department Department Mean Difference Std. Error Sig.
Agriculture History/Humanities/Philosophy/Political Sci 7.3974(*) 1.87931 .016
Music 8.7333(*) 2.11646 .008
Business Admin English 9.3333(*) 2.06545 .002
History/Humanities/Philosophy/Political Sci 9.7308(*) 1.92141 .000
Math/Statistics 8.7647(*) 2.09186 .006
Music 11.0667(*) 2.15392 .000
Psyc/Soc/Counseling 7.2000(*) 1.93481 .033
CS/IS Music 8.6121(*) 2.30083 .031
Education Chemistry/Physics/Sci Ed 5.8482(*) 1.48328 .015
English 8.8739(*) 1.66565 .000
History/Humanities/Philosophy/Political Sci 9.2713(*) 1.48328 .000
Math/Statistics 8.3052(*) 1.69829 .000
Music 10.6072(*) 1.77417 .000
Psyc/Soc/Counseling 6.7405(*) 1.50060 .002
Other 4.1864(*) 1.12158 .032
Other History/Humanities/Philosophy/Political Sci 5.30849(*) 1.28144 .014
Music 6.4208(*) 1.60924 .013
* The mean difference is significant at the .05 level.

A one-way ANOVA was used to compare the mean of database software skills (dependent variable) across the department or content area (independent variable). As illustrated in Table 32, a significant difference was found among departments (F(20, 384) = 3.68, p = .000). Tukey's HSD was used to determine the nature of the differences between the departments, as shown in Table 33. This analysis revealed that Accounting/Economics/Finance (m = 23.24, sd = 7.15) had higher database skills ratings than Art (m = 13.10, sd = 5.15), Chemistry/Physics/Science Education (m = 15.31, sd = 6.43), English (m = 14.17, sd = 6.32), and History/Humanities/Philosophy/Political

Science (m = 15.65, sd = 6.29). The Computer Science/Information Systems department had higher database ratings than Art, Chemistry/Physics/Science Education, English, History/Humanities/Philosophy/Political Science, and Music (m = 15.40, sd = 6.77). The Education department (m = 22.19, sd = 6.28) had higher database ratings than Art, Chemistry/Physics/Science Education, English, and History/Humanities/Philosophy/Political Science.

Table 32 One-way ANOVA comparing Database Skills by Department
SS df MS F p
Between Groups 3309.254 20 165.463 3.675 .000
Within Groups 17287.546 384 45.020
Total 20596.800 404

Table 33 Tukey's HSD comparing Database Skills by Department
Department Department Mean Difference Std. Error Sig.
Accounting/Econ/Finance Art 10.1491(*) 2.42765 .006
Chemistry/Physics/Science Education 7.9323(*) 1.87944 .005
English 9.0733(*) 2.07410 .003
History/Humanities/Philosophy/Political Sci 7.5862(*) 1.87944 .011
CS/IS Art 12.1818(*) 2.86101 .005
Chemistry/Physics/Sci Ed 9.9650(*) 2.41334 .008
English 11.1061(*) 2.56784 .003
History/Humanities/Philosophy/Political Sci 9.6189(*) 2.41334 .013
Music 9.8727(*) 2.66346 .035
Education Art 9.0983(*) 2.30422 .015
Chemistry/Physics/Sci Ed 6.8815(*) 1.71705 .012
English 8.0225(*) 1.92817 .007
History/Humanities/Philosophy/Political Sci 6.5353(*) 1.71705 .025
* The mean difference is significant at the .05 level.

A one-way ANOVA was used to compare the overall composite scores across the department or content area. As illustrated in Table 34, a significant difference was found among departments (F(20, 384) = 4.76, p = .000). Tukey's HSD was used to determine the nature of the differences between the departments, as shown in Table 35. This analysis revealed that Accounting/Economics/Finance (m = 122.80, sd = 23.68) rated overall computer concepts and application skills higher than English (m = 96.67, sd = 20.04) and History/Humanities/Philosophy/Political Science (m = 92.04, sd = 20.28). Agriculture (m = 122.87, sd = 18.03) had higher overall ratings than History/Humanities/Philosophy/Political Science. Business Administration (m = 125.14, sd = 16.08) had higher overall ratings than English and History/Humanities/Philosophy/Political Science. Computer Science/Information Systems had higher overall ratings than English, History/Humanities/Philosophy/Political Science, Math/Statistics (m = 99.06, sd = 28.87), and Music (m = 97.53, sd = 20.45). Education (m = 127.24, sd = 17.46) had higher overall ratings than English, History/Humanities/Philosophy/Political Science, Math/Statistics, Music, Psychology/Sociology/Counseling (m = 103.76, sd = 20.50), and the Other category (m = 111.17, sd = 24.27). The Other category had higher overall ratings than History/Humanities/Philosophy/Political Science.

Table 34 One-way ANOVA comparing Total Composite Score by Department
SS df MS F p
Between Groups 46393.933 20 2319.697 4.755 .000
Within Groups 187345.435 384 487.879
Total 233739.368 404

Table 35 Tukey's HSD comparing Total Composite Score by Department
Department Department Mean Difference Std. Error Sig.
Accounting/Econ/Finance English 26.1333(*) 6.82785 .023
History/Humanities/Philosophy/Political Sci 30.7615(*) 6.18706 .000
Agriculture History/Humanities/Philosophy/Political Sci 30.8282(*) 7.16169 .004
Business Admin English 28.4762(*) 7.87101 .046
History/Humanities/Philosophy/Political Sci 33.1044(*) 7.32209 .002
CS/IS English 33.6061(*) 8.45322 .013
History/Humanities/Philosophy/Political Sci 38.2343(*) 7.94463 .000
Math/Statistics 31.2139(*) 8.54701 .042
Music 32.7394(*) 8.76800 .032
Education English 30.5766(*) 6.34746 .000
History/Humanities/Philosophy/Political Sci 35.2048(*) 5.65248 .000
Math/Statistics 28.1844(*) 6.47184 .003
Music 29.7099(*) 6.76100 .003
Psyc/Soc/Counseling 23.4832(*) 5.71848 .008
Other 16.0766(*) 4.27411 .029
Other History/Humanities/Philosophy/Political Sci 19.1282(*) 4.88330 .017
* The mean difference is significant at the .05 level.

A one-way ANOVA was used to compare the overall composite scores across institutions. As illustrated in Table 36, no significant difference was found (F(11, 393) = 0.89, p = .546). The importance of computer concepts and computer application skills did not differ significantly between institutions.

Table 36 One-way ANOVA comparing Total Composite Score by Institution
SS df MS F p
Between Groups 5706.446 11 518.768 .894 .546
Within Groups 228032.922 393 580.236
Total 233739.368 404

Tables 37 and 38 illustrate the results of an independent-samples t test comparing the overall mean score of male subjects to the overall mean score of female subjects. No significant difference was found (t(403) = 1.33, p = .183). The mean of the female subjects (m = 113.03, sd = 24.70) was not significantly different from the mean of the male subjects (m = 109.77, sd = 23.59).

Table 37 Independent Samples t-test comparing Total Composite Score by Gender
t df p
Equal variances assumed 1.334 403 .183
Equal variances not assumed 1.321 323.378 .187

Table 38 Total Composite Score Group Statistics by Gender
Gender n Mean sd Std. Error Mean
female 158 113.0316 24.70267 1.96524
male 247 109.7652 23.59089 1.50105

However, when a one-way ANOVA was used to compare each dependent variable (computer concepts, word processing, spreadsheet, presentation, and database skills) by gender, there were significant differences, as illustrated in Table 39. Post-hoc tests were not performed because there were fewer than three groups. A significant difference was found between gender and computer concepts (F(1, 403) = 14.47, p = .000). Analysis revealed that female faculty (m = 25.25, sd = 4.37) rated the importance of computer concepts higher than male faculty (m = 23.51, sd = 4.57). A significant difference was found between gender and word processing (F(1, 403) = 8.08, p = .005). Analysis revealed that female faculty (m = 26.80, sd = 4.25) rated the importance of word processing higher than male faculty (m = 25.47, sd = 4.80). A significant difference was found between gender and spreadsheet skills (F(1, 403) = 4.68, p = .031). Analysis revealed that male faculty (m =

19.96, sd = 6.48) rated the importance of spreadsheet skills higher than female faculty (m = 18.47, sd = 7.11). A significant difference was found between gender and presentation skills (F(1, 403) = 4.56, p = .033). Analysis revealed that female faculty (m = 23.63, sd = 6.55) rated the importance of presentation skills higher than male faculty (m = 22.26, sd = 6.19). No significant difference was found between gender and database skills (F(1, 403) = .173, p = .678).

Table 39 One-way ANOVA comparing Dependent Variables by Gender
SS df MS F p
Computer Concepts Between Groups 291.997 1 291.997 14.472 .000
Within Groups 8131.114 403 20.176
Total 8423.111 404
Word Processing Between Groups 170.481 1 170.481 8.079 .005
Within Groups 8504.497 403 21.103
Total 8674.978 404
Spreadsheet Between Groups 212.448 1 212.448 4.684 .031
Within Groups 18278.994 403 45.357
Total 18491.442 404
Presentation Between Groups 182.938 1 182.938 4.556 .033
Within Groups 16181.640 403 40.153
Total 16364.578 404
Database Between Groups 8.822 1 8.822 .173 .678
Within Groups 20587.978 403 51.087
Total 20596.800 404

A one-way ANOVA was used to compare the overall composite scores across years of faculty experience in education. As illustrated in Table 40, no significant difference was found (F(4, 400) = 1.98, p = .097). The importance of computer concepts and computer application skills did not differ significantly when compared to the respondent's years of educational experience.
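The independent-samples t test reported in Tables 37 and 38 can be reproduced from the group summary statistics alone, without the raw data; this sketch uses SciPy's summary-statistics form of the test.

```python
from scipy import stats

# Group statistics from Table 38 (total composite score by gender).
res = stats.ttest_ind_from_stats(
    mean1=113.0316, std1=24.70267, nobs1=158,   # female
    mean2=109.7652, std2=23.59089, nobs2=247,   # male
    equal_var=True,   # the "equal variances assumed" row of Table 37
)
# res.statistic ~ 1.334 and res.pvalue ~ .183, matching Table 37.
```

Passing `equal_var=False` instead gives Welch's test, corresponding to the "equal variances not assumed" row of Table 37.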

Table 40 One-way ANOVA comparing Overall Rating by Years of Experience
SS df MS F p
Between Groups 4528.703 4 1132.176 1.976 .097
Within Groups 229210.665 400 573.027
Total 233739.368 404

Cronk (2004) describes multivariate analysis of variance (MANOVA) as a test that involves more than one dependent variable and is used to reduce Type I error inflation. A one-way MANOVA was calculated examining the effect of content area on all five dependent variables: computer concepts, word processing, spreadsheet, presentation, and database scores. A significant effect was found (Lambda(100, 1858) = .425, p = .000).

Table 41 One-way MANOVA comparing Five Dependent Variables by Department/Content Area
Effect Value F Hypothesis df Error df p
Department Wilks' Lambda .425 3.562 100 1858.503 .000

Follow-up univariate ANOVAs (Table 42) indicated that the importance of computer concepts (F(20, 384) = 3.39, p = .000), word processing (F(20, 384) = 2.87, p = .000), spreadsheet (F(20, 384) = 6.24, p = .000), presentation (F(20, 384) = 5.16, p = .000), and database skills (F(20, 384) = 3.68, p = .000) were all significantly different by department/content area.
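The F statistic and the fractional error df in Table 41 come from Rao's F approximation to Wilks' Lambda. The sketch below reproduces them from the reported Λ = .425 with p = 5 dependent variables and 21 department groups; small differences from the printed F are expected because Λ is reported to only three decimals.

```python
import math

# Rao's F approximation for Wilks' Lambda (one-way MANOVA, Table 41).
p, k, n = 5, 21, 405      # dependent variables, groups, subjects
q = k - 1                 # hypothesis df = 20
v = n - k                 # univariate error df = 384
lam = 0.425               # Wilks' Lambda as reported

t = math.sqrt((p**2 * q**2 - 4) / (p**2 + q**2 - 5))
w = v + q - (p + q + 1) / 2
df1 = p * q                          # 100, the hypothesis df in Table 41
df2 = t * w - (p * q - 2) / 2        # ~1858.5, the fractional error df
root = lam ** (1 / t)
f_stat = (1 - root) / root * (df2 / df1)   # ~3.56 vs. the reported 3.562
```

Rao's approximation is exact when p or the hypothesis df is at most 2, and is the conversion SPSS prints for Wilks' Lambda in its multivariate tests output.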

Table 42 Follow-up Univariate ANOVAs comparing Five Dependent Variables by Department/Content Area
Source Dependent Variable Type III Sum of Squares df MS F p
Depart Computer Concepts 1265.170 20 63.259 3.394 .000
Word Processing 1128.478 20 56.424 2.871 .000
Spreadsheet 4536.719 20 226.836 6.242 .000
Presentation 3463.948 20 173.197 5.155 .000
Database 3309.254 20 165.463 3.675 .000
Error Computer Concepts 7157.941 384 18.640
Word Processing 7546.500 384 19.652
Spreadsheet 13954.723 384 36.340
Presentation 12900.630 384 33.595
Database 17287.546 384 45.020
Total Computer Concepts 245317.000 405
Word Processing 282299.000 405
Spreadsheet 170607.000 405
Presentation 226763.000 405
Database 162053.000 405

A one-way MANOVA was calculated examining the effect of institution on all five dependent variables: computer concepts, word processing, spreadsheet, presentation, and database scores. Table 43 indicates no significant effect was found (Lambda(55, 1804) = .880, p = .637). None of the five dependent variables were significantly influenced by institution.

Table 43 One-way MANOVA comparing Five Dependent Variables by Institution
Effect Value F Hypothesis df Error df p
Institution Wilks' Lambda .880 .922 55.000 1804.179 .637

A one-way MANOVA was calculated examining the effect of years of experience in education on all five dependent variables: computer concepts, word processing, spreadsheet, presentation, and database scores. Table 44 indicates no significant effect

was found (Lambda(20, 1314) = .926, p = .061). None of the five dependent variables were significantly influenced by years of experience in education.

Table 44 One-way MANOVA comparing Five Dependent Variables by Yrs of Experience
Effect Value F Hypothesis df Error df p
Yrs of Experience Wilks' Lambda .926 1.535 20.000 1314.333 .061

A one-way MANOVA was calculated examining the effect of gender on all five dependent variables: computer concepts, word processing, spreadsheet, presentation, and database scores. Table 45 indicates a significant effect was found (Lambda(5, 399) = .884, p = .000).

Table 45 One-way MANOVA comparing Five Dependent Variables by Gender
Effect Value F Hypothesis df Error df p
Gender Wilks' Lambda .884 10.448 5.000 399.000 .000

Follow-up univariate ANOVAs, as illustrated in Table 46, indicated that the importance of database skills was not significantly influenced by gender (F(1, 403) = .173, p = .678). Computer concepts (F(1, 403) = 14.47, p = .000), word processing (F(1, 403) = 8.08, p = .005), spreadsheet (F(1, 403) = 4.68, p = .031), and presentation (F(1, 403) = 4.56, p = .033) were all significantly different by gender. This confirmed the earlier findings.
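When the independent variable has only two groups, as gender does here, Wilks' Lambda converts to an exact F statistic. This sketch reproduces the F in Table 45 from the reported Λ = .884; the small gap from the printed 10.448 reflects Λ being rounded to three decimals.

```python
# Exact F for Wilks' Lambda with a 1-df hypothesis (two groups):
#   F(p, N - k - p + 1) = ((1 - L) / L) * ((N - k - p + 1) / p)
p_vars, n, k = 5, 405, 2     # dependent variables, subjects, groups
lam = 0.884                  # Wilks' Lambda from Table 45

df1 = p_vars                 # 5, the hypothesis df
df2 = n - k - p_vars + 1     # 399, the error df in Table 45
f_stat = (1 - lam) / lam * (df2 / df1)   # ~10.47 vs. the reported 10.448
```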

Table 46 Follow-up Univariate ANOVAs comparing Five Dependent Variables by Gender
Source Dependent Variable Type III Sum of Squares df MS F p
Gender Computer Concepts 291.997 1 291.997 14.472 .000
Word Processing 170.481 1 170.481 8.079 .005
Spreadsheet 212.448 1 212.448 4.684 .031
Presentation 182.938 1 182.938 4.556 .033
Database 8.822 1 8.822 .173 .678
Error Computer Concepts 8131.114 403 20.176
Word Processing 8504.497 403 21.103
Spreadsheet 18278.994 403 45.357
Presentation 16181.640 403 40.153
Database 20587.978 403 51.087
Total Computer Concepts 245317.000 405
Word Processing 282299.000 405
Spreadsheet 170607.000 405
Presentation 226763.000 405
Database 162053.000 405

A 2 (gender) x 22 (department) between-subjects factorial ANOVA was calculated comparing the overall score for subjects who were male or female and who belonged to one of 22 department categories. Table 47 shows a significant main effect for department (F(20, 363) = 3.68, p = .000). These findings were reported earlier. The main effect for gender was not significant (F(1, 363) = .52, p = .471). The interaction was also not significant (F(20, 363) = 1.18, p = .265). The effect of department was not influenced by the gender of the faculty member.
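The three F ratios of the factorial ANOVA follow from dividing each effect's mean square by the error mean square; the sums of squares and df below are those reported in Table 47.

```python
# Sums of squares and df from Table 47 (2 x 22 factorial ANOVA, overall score).
effects = {
    "department":          (35685.634, 20),
    "gender":              (252.798, 1),
    "department x gender": (11472.736, 20),
}
ms_error = 175835.090 / 363    # error mean square, ~484.394

# F = MS_effect / MS_error for each row of the table.
f_ratios = {name: (ss / df) / ms_error for name, (ss, df) in effects.items()}
# department ~3.68, gender ~0.52, interaction ~1.18, matching Table 47
```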

Table 47 Summary of 2 (Gender) by 22 (Department) Analysis of Variance for Overall Score
Source Type III Sum of Squares df MS F p
Department 35685.634 20 1784.282 3.684 .000
Gender 252.798 1 252.798 .522 .471
Department * Gender 11472.736 20 573.637 1.184 .265
Error 175835.090 363 484.394
Total 5227297.000 405

Statistical Analysis Student Assessment

Demographic Descriptive Statistics

Demographic data were collected on student participants, including gender, age, home state, size of high school graduating class, number of computer courses taken in high school, and college major. Of the 164 student participants, 106 (64.6%) were female and 58 (35.4%) were male. The majority of student participants (75.6%) were ages 18 and 19, with other ages ranging from 20 to 49. For purposes of further analysis, ages were divided into three categories: those less than 20 years of age, those between 20 and 25 years of age, and those over 25 years of age, as illustrated in Table 48.

Table 48 Frequencies and Percentages for Age (n = 164)
Age f % Valid Percent Cumulative Percent
18-19 yrs 124 75.6 75.6 75.6
20-25 yrs 38 23.2 23.2 98.8
over 25 yrs 2 1.2 1.2 100.0
Total 164 100.0 100.0

One hundred eight (65.9%) of the student participants indicated Missouri as their home state, 29 (17.7%) students were from Iowa, 19 (11.6%) students were from

Nebraska, one student (0.6%) was from Kansas, and seven students (4.3%) indicated other as their home state. Students were asked the size of their high school graduating class. Five categories were developed according to the results: 53 (32.3%) students had a high school graduating class of less than 100 students, 48 (29.3%) had a graduating class of 100-199 students, 24 (14.6%) had a graduating class of 200-399 students, 33 (20.1%) had a graduating class of 400-599 students, and six students (3.7%) had 600 or more in their high school graduating class. Results are summarized in Table 49.

Table 49 Frequencies and Percentages for Size of High School Graduating Class (n = 164)
Size f % Valid Percent Cumulative Percent
less than 100 53 32.3 32.3 32.3
100-199 students 48 29.3 29.3 61.6
200-399 students 24 14.6 14.6 76.2
400-599 students 33 20.1 20.1 96.3
600 or more students 6 3.7 3.7 100.0
Total 164 100.0 100.0

Student participants were also asked how many computer courses they had taken in high school. As Table 50 summarizes, 44 (26.8%) students had taken no computer courses in high school, 57 (34.8%) students had taken one computer course, 48 (29.3%) students had taken two courses, 11 (6.7%) students had taken three courses, and four students (2.4%) had taken four or more courses.

Table 50 Frequencies and Percentages for Number of Computer Courses Taken in High School (n = 164)
Computer Courses f % Valid Percent Cumulative Percent
0 44 26.8 26.8 26.8
1 57 34.8 34.8 61.6

2 48 29.3 29.3 90.9
3 11 6.7 6.7 97.6
4 2 1.2 1.2 98.8
6 2 1.2 1.2 100.0
Total 164 100.0 100.0

With regard to major field of study, the top three areas were Education, with 42 students (25.6%); Marketing/Management, with 27 students (16.5%); and undecided, with 20 students (12.2%). Table 51 summarizes the other major fields of study.

Table 51 Frequencies and Percentages for Major Field of Study (n = 164)
f %
Accounting / Economics / Finance 10 6.1
Agriculture 6 3.7
Art 12 7.3
Biological Sciences 1 .6
Business Administration 4 2.4
Computer Science / Information Systems 3 1.8
Education 42 25.6
English 2 1.2
Family & Consumer Science 9 5.5
Geology / Geography 1 .6
Health / Physical Education / Recreation / Dance 3 1.8
History / Humanities / Philosophy / Political Science 5 3.0
Marketing / Management 27 16.5
Mass Communications / Broadcasting / Digital Media / Journalism 3 1.8
Mathematics / Statistics 1 .6
Music 3 1.8
Psychology / Sociology / Counseling 3 1.8
Other 9 5.5
Undecided 20 12.2
Total 164 100.0

Reliability

The student assessment consisted of five major sections: computer concepts, word processing, spreadsheet, presentation, and database skills. The computer concepts section consisted of 150 multiple-choice questions (25 questions from each of six module areas) covering various computer topics. Module one questions covered computer and information literacy, introduction to application software, word processing concepts, and inside the system. Module two questions covered understanding the Internet, email, system software, and exploring the Web. Module three questions covered spreadsheet concepts, current issues, emerging technologies, and data storage. Module four questions covered presentation packages, special purpose programs, multimedia/virtual reality, and input/output. Module five questions covered database concepts, telecommunications, and networks. Module six questions covered creating a Web page, ethics, and security. As an incentive to encourage student participation, any student who passed the assessments at 80% mastery could test out of the course. This 80% mastery level had been used in the past for course test-out purposes; thus, the researcher agreed to use the same procedures and tests as would be given for a course test out. Item-total analysis was used to assess the internal consistency of the data. The Pearson correlation coefficient was used to conduct this analysis because the data were interval in nature. Correlations for the data revealed that Module one items (r = +.64, n = 164, p = .000, two tails), Module two items (r = +.58, n = 164, p = .000, two tails), Module three items (r = +.64, n = 164, p = .000, two tails), Module four items (r = +.73, n = 164, p = .000, two tails), Module five items (r = +.67, n = 164, p = .000, two tails),