Commission on Accreditation, Office of Program Consultation & Accreditation
The APA 2012 Annual Report Online Commission on Accreditation Portal Survey Report
American Psychological Association, Washington, DC
Introduction

The Annual Report Online (ARO) was first implemented electronically in 2002 to collect data used exclusively by the American Psychological Association (APA) for accreditation of doctoral, internship, and postdoctoral programs that educate and train future psychologists. Each year, all APA-accredited programs are required to submit data on their program, students, and faculty to an online data collection system. A little over a year ago, the Office of Program Consultation and Accreditation (OPCA), on behalf of the Commission on Accreditation (CoA), contracted a Boston-based IT company, Liaison International, to build a new and improved online e-accreditation platform. This platform, now known as the CoA Portal, will move many accreditation processes currently completed on paper to an electronic process. The CoA Portal debuted in 2012 with the launch of the first ARO in the new system. The 2012 ARO opened on June 4, 2012 and closed on September 15, 2012. At closing, about 99% of doctoral and internship programs and 100% of postdoctoral programs had submitted their data by the September 15th deadline, making this one of the most successful years for on-time ARO submission.

Following the close of the ARO, the CoA planned to evaluate the new e-accreditation platform by surveying registered Portal users. The survey serves as a reference for the CoA as it implements changes to the Portal's content, functionality, and utility. About 25% of registered Portal users submitted completed surveys. The aim of this report is to inform stakeholders and the general public about the results of the survey and the actions taken by the Commission in response to the feedback provided. This report presents raw data, some in aggregate form, and shows the distribution of survey responses, as well as some follow-up analyses that summarize the results.
However, because the survey items were not validated through factor analysis or any other means, the summarized results carry no interpretive strength.

Methodology

The Research department implemented the survey campaign using SurveyGizmo. Liaison supplied a spreadsheet containing email contact information for all 1,196 registered CoA Portal users. The CoA-approved survey opened to respondents on December 5, 2012 and closed on December 31, 2012. Of the initial emails, 1,189 successfully reached valid inboxes. One reminder email was sent on December 18, 2012; of those, 2 did not reach a valid inbox and were returned. Registered CoA Portal users received an email containing a direct link to the survey on the SurveyGizmo website and were assured that their responses would remain anonymous. The survey was not password protected, and the site settings permitted multiple uses of the distributed survey link.

The survey consisted of 30 multiple-choice questions, of which 22 were Likert-scale items ranging from 1 = Strongly disagree to 5 = Strongly agree (Appendix A). Testing by research staff found that the survey took approximately 5 minutes to complete. Based on content, the Likert-scale items were grouped into three domains: Overall Portal Use, Usability, and Performance/Navigation (Appendix A). For example, the Overall Portal Use domain included items on overall confidence in the quality of data entered and on website design. These domains were not derived scientifically.
Statistical Analysis

Descriptive statistics and some exploratory analyses were performed. Frequency distributions are reported (N and %). Crosstabulation tables were used to explore relationships between variables. Following exploration of the data, Pearson's chi-square statistic is reported for statistically significant associations. Statistical significance was set at p < 0.150 due to the exploratory nature of the analyses.

Results

A total of 335 surveys were returned; of those, 292 were completed and 43 were partially completed. Four surveys marked as partially complete were removed from the analysis because no answers were logged for any of the question items.

I. Respondent characteristics

Table 1. Program information by respondent role in program

                                   Program            Administration   Total
                                   director/faculty
  Program type
    Doctoral only                  84 (40.6)          97 (78.9)        181 (54.7)
    Internship only                95 (45.9)          12 (9.8)         108 (32.6)
    Postdoc only                   12 (5.8)           3 (2.4)          15 (4.5)
    Doctoral + Internship          4 (1.9)            7 (5.7)          11 (3.3)
    Doctoral + Postdoc             1 (0.5)            0 (0)            1 (0.3)
    Internship + Postdoc           11 (5.3)           2 (1.6)          13 (3.9)
    All                            0 (0)              2 (1.6)          2 (0.6)
  Number of accredited programs for which data were entered
    1                              194 (93.7)         104 (83.9)       298 (90)
    2 or 3                         13 (6.3)           20 (16.1)        33 (10)
Table 2. System use by respondent role in program

                                   Program            Administration   Total
                                   director/faculty
  Participation in webinar training
    Yes                            86 (41.5)          72 (58.1)        158 (47.7)
    No                             121 (58.5)         52 (41.9)        173 (52.3)
  Role in data entry
    All                            144 (70)           79 (63.7)        223 (67.6)
    Most                           18 (8.7)           25 (20.2)        43 (13)
    Half                           4 (1.9)            13 (10.5)        17 (5.2)
    Some                           16 (7.8)           3 (2.4)          19 (5.8)
    A little                       12 (5.8)           2 (1.6)          14 (4.2)
    None                           12 (5.8)           2 (1.6)          14 (4.2)

Table 3. Type of web browser used to enter data

  Browser type          N (%)
  Internet Explorer     167 (50.7)
  Firefox               73 (22.2)
  Chrome                41 (12.5)
  Safari                28 (8.5)
  Unknown               56 (17)
II. Survey Results by Domain

Table 4. Overall use of Portal site

  Item                                    N     Strongly   Disagree   Neutral    Agree       Strongly
                                                disagree                                     agree
  Overall satisfaction with CoA Portal    306   10 (3.3)   17 (5.6)   40 (13.1)  177 (57.8)  62 (20.3)
  Website for all user levels             298   5 (1.7)    14 (4.7)   42 (14.1)  182 (61.1)  55 (18.5)
  Accurate pre-populated data             294   8 (2.7)    30 (10.2)  33 (11.2)  136 (46.3)  87 (29.6)
  Confident all records updated           304   4 (1.3)    3 (1)      13 (4.3)   110 (36.2)  174 (57.2)
  ARO data consistent with self-study     304   4 (1.3)    4 (1.3)    28 (9.2)   135 (44.4)  133 (43.8)
  Archives section is accurate            296   6 (2)      16 (5.4)   65 (22)    147 (49.7)  62 (20.9)

Strongest agreement was found for respondents' confidence that all ARO records were added or updated before submission and for the belief that their ARO data are consistent with their self-study data.

Table 5. Usability of Portal site

  Item                                    N     Strongly   Disagree   Neutral    Agree       Strongly
                                                disagree                                     agree
  Website is easy to use                  298   0 (0)      34 (11.4)  42 (14.1)  222 (74.5)  0 (0)
  Webinars/online tutorials useful        180   2 (1.1)    10 (5.6)   52 (28.9)  89 (49.4)   26 (14.4)
  ARO instructions were adequate          296   4 (1.4)    34 (11.5)  48 (16.2)  165 (55.7)  45 (15.2)
  Progress bars on ARO dashboard helped   296   2 (0.7)    16 (5.4)   42 (14.2)  146 (49.3)  90 (30.4)
  Understood how and when to use
  "mark as complete" buttons              294   3 (1.0)    14 (4.8)   37 (12.6)  163 (55.4)  77 (26.2)
  "ARO tip" email was helpful             279   3 (1.1)    18 (6.5)   77 (27.6)  125 (44.8)  56 (20.1)
  Site has expected functionality         294   13 (4.4)   56 (19.0)  70 (23.8)  125 (42.5)  30 (10.2)
  Time to complete ARO was reasonable     298   17 (5.7)   31 (10.4)  43 (14.4)  164 (55)    43 (14.4)

About eighty percent of respondents agreed or strongly agreed that the progress bars were useful and that they understood how and when to use the "mark as complete" buttons. However, there was notable disagreement that the site had the expected functionality, that the time to complete the ARO was reasonable, and that the instructions were adequate.
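The agreement figures quoted above can be recovered directly from the Table 5 counts. A minimal sketch (counts copied from Table 5; the helper name `pct_agree` is illustrative, not part of the report's analysis):

```python
# Recompute "agreed or strongly agreed" percentages from Table 5 counts.
# Counts are ordered: Strongly disagree, Disagree, Neutral, Agree, Strongly agree.
table5 = {
    "Progress bars on ARO dashboard helped": [2, 16, 42, 146, 90],      # N = 296
    "Time to complete ARO was reasonable":   [17, 31, 43, 164, 43],     # N = 298
}

def pct_agree(counts):
    """Percent of respondents answering Agree or Strongly agree."""
    return round(100 * (counts[3] + counts[4]) / sum(counts), 1)

for item, counts in table5.items():
    print(item, pct_agree(counts))
# Progress bars: 79.7 (the "about eighty percent" above); time reasonable: 69.5
```

The same helper applied to any Table 4-6 row reproduces the percentages reported in the narrative.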
Table 6. Performance/Navigation of Portal site

  Item                                    N     Strongly   Disagree   Neutral    Agree       Strongly
                                                disagree                                     agree
  Page load rate was good                 296   27 (9.1)   37 (12.5)  35 (11.8)  152 (51.4)  45 (15.2)
  Organized/clear info on website         293   7 (2.4)    23 (7.8)   42 (14.3)  172 (58.7)  49 (16.7)
  Should auto-sort by last name           287   9 (3.1)    16 (5.6)   60 (20.9)  105 (36.6)  97 (33.8)
  *Unsorted columns frustrating           270   0 (0)      9 (3.3)    92 (34.1)  89 (33)     80 (29.6)
  *Redirected to first page after
  update is frustrating                   275   0 (0)      7 (2.5)    72 (26.2)  98 (35.6)   98 (35.6)
  *Reliability error check did not
  give solutions                          270   5 (1.9)    23 (8.5)   76 (28.1)  84 (31.1)   82 (30.4)
  Aware of ARO data Excel download link   289   11 (3.8)   55 (19)    43 (14.9)  141 (48.8)  39 (13.5)
  Understand how to submit ARO data       292   3 (1.0)    6 (2.1)    19 (6.5)   181 (62)    83 (28.4)

  *Item framed negatively toward expected frustration with system features/functionality

Ninety percent of respondents agreed or strongly agreed that they understood the steps required to formally submit ARO data, but just over sixty percent agreed that they were aware they could download their ARO data. Twenty percent disagreed that the page load rate was good, and ten percent disagreed that information on the website was organized and clear.

III. Crosstabulation (Crosstab) Results

Due to the low rate of response from faculty respondents, crosstab results by program role were collapsed across the faculty (N = 16) and program director (N = 191) categories, and the resulting frequencies were compared to those of administrative respondents (N = 124). Differences in responses trended toward statistical significance based on the role or position respondents held in the program. Administrative respondents tended to better understand how to download a spreadsheet of ARO data and to agree that the CoA Portal instructions were adequate.
Interestingly, administrative respondents were more likely than program directors to have participated in data entry training and to prefer that records be automatically sorted by last name. Program directors were more likely to report that they were unable to complete data entry in a reasonable amount of time and were less confident that their historical ARO data are accurate.
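The crosstab tables below report three response categories (Disagree, Neutral, Agree) rather than the five-point scale of Tables 4-6. Although the report does not state the recoding explicitly, the counts are consistent with folding the two "disagree" and two "agree" levels together, as this sketch illustrates using the "Excel download link" item from Table 6 (the function name `collapse` is illustrative):

```python
# Fold 5-point Likert counts (SD, D, N, A, SA) into the three categories
# used in the crosstab tables. Counts below are Table 6's "Aware of ARO
# data Excel download link" item (N = 289).
five_point = [11, 55, 43, 141, 39]

def collapse(counts):
    sd, d, n, a, sa = counts
    return {"Disagree": sd + d, "Neutral": n, "Agree": a + sa}

print(collapse(five_point))
# {'Disagree': 66, 'Neutral': 43, 'Agree': 180} -- matching the totals in Table 7
```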
Table 7. Program role and overall satisfaction with CoA Portal

                                   Program            Administration   Total
                                   director/faculty
  I understand I can download my program's 2012 ARO data [X² = 11.276, p = .004 ***]
    Disagree                       50 (28.2)          16 (14.3)        66 (22.8)
    Neutral                        30 (16.9)          13 (11.6)        43 (14.9)
    Agree                          97 (54.8)          83 (74.1)        180 (62.3)
    Total                          177 (61.2)         112 (38.8)       289 (100)
  I was able to complete data entry in a reasonable amount of time [X² = 10.412, p = .005 ***]
    Disagree                       39 (21.3)          9 (7.8)          48 (16.1)
    Neutral                        22 (12)            21 (18.3)        43 (14.4)
    Agree                          122 (66.7)         85 (73.9)        207 (69.5)
    Total                          183 (61.4)         115 (38.6)       298 (100)
  Did you participate in any data entry training? [X² = 7.829, p = .005 ***]
    Yes                            87 (41.8)          71 (57.7)        158 (47.7)
    No                             121 (58.2)         52 (42.3)        173 (52.3)
    Total                          208 (62.8)         123 (37.2)       331 (100)
  I believe my program's historical ARO data in the Archives section is accurate [X² = 9.668, p = .008 ***]
    Disagree                       20 (10.9)          2 (1.8)          22 (7.4)
    Neutral                        35 (19.1)          30 (26.5)        65 (22)
    Agree                          128 (69.9)         81 (71.7)        209 (70.6)
    Total                          183 (61.8)         113 (38.2)       296 (100)
  Student/Graduate/Faculty profile tables should automatically sort records by last name [X² = 6.864, p = .032 **]
    Disagree                       17 (9.8)           8 (7)            25 (8.7)
    Neutral                        44 (25.4)          16 (14)          60 (20.9)
    Agree                          112 (64.7)         90 (78.9)        202 (70.4)
    Total                          173 (60.3)         114 (39.7)       287 (100)
  When working through the ARO, the instructions were adequate [X² = 6.113, p = .047 **]
    Disagree                       30 (16.4)          8 (7.1)          38 (12.8)
    Neutral                        31 (16.9)          17 (15)          48 (16.2)
    Agree                          122 (66.7)         88 (77.9)        210 (70.9)
    Total                          183 (61.8)         113 (38.2)       296 (100)
  I found the training webinars and online tutorials useful [X² = 3.615, p = .164]
    Disagree                       10 (10.2)          3 (3.7)          13 (7.2)
    Neutral                        30 (30.6)          22 (26.8)        52 (28.9)
    Agree                          58 (59.2)          57 (69.5)        115 (63.9)
    Total                          98 (54.4)          82 (45.6)        180 (100)

  *** p < 0.01, ** p < 0.05, * p < 0.150
Table 8. Program role and overall satisfaction with CoA Portal (continued)

                                   Program            Administration   Total
                                   director/faculty
  Overall, I am satisfied with the CoA Portal ARO website [X² = 3.102, p = .212]
    Disagree                       21 (11.1)          6 (5.2)          27 (8.8)
    Neutral                        24 (12.6)          16 (13.8)        40 (13.1)
    Agree                          145 (76.3)         94 (81)          239 (78.1)
    Total                          190 (62.1)         116 (37.9)       306 (100)
  ...it was frustrating that the system returned to first page of student/graduate/faculty profile tables... [X² = 2.780, p = .249]
    Disagree                       3 (1.8)            4 (3.6)          7 (2.5)
    Neutral                        48 (29.4)          24 (21.4)        72 (26.2)
    Agree                          112 (68.7)         84 (75)          196 (71.3)
    Total                          163 (59.3)         112 (40.7)       275 (100)
  ...it was frustrating that column sort preferences were not saved when returning to profile tables [X² = 1.758, p = .415]
    Disagree                       6 (3.8)            3 (2.7)          9 (3.3)
    Neutral                        59 (36.9)          33 (30)          92 (34.1)
    Agree                          95 (59.4)          74 (67.3)        169 (62.6)
    Total                          160 (59.3)         110 (40.7)       270 (100)
  The data that were pre-populated from previous AROs were accurate [X² = .903, p = .637]
    Disagree                       26 (14.3)          12 (10.7)        38 (12.9)
    Neutral                        21 (11.5)          12 (10.7)        33 (11.2)
    Agree                          135 (74.2)         88 (78.6)        223 (75.9)
    Total                          182 (61.9)         112 (38.1)       294 (100)

  *** p < 0.01, ** p < 0.05, * p < 0.150
Table 9. Data entry training and overall satisfaction with CoA Portal

                                   Participated in data entry training
                                   Yes                No               Total
  Overall, the website is easy to use [X² = 8.837, p = .012 **]
    Disagree                       9 (6)              25 (16.8)        34 (11.4)
    Neutral                        24 (16.1)          18 (12.1)        42 (14.1)
    Agree                          116 (77.9)         106 (71.1)       222 (74.5)
    Total                          149 (50)           149 (50)         298 (100)
  Overall, I am satisfied with the CoA Portal ARO website [X² = 6.513, p = .039 **]
    Disagree                       7 (4.6)            20 (12.9)        27 (8.8)
    Neutral                        21 (13.9)          19 (12.3)        40 (13.1)
    Agree                          123 (81.5)         116 (74.8)       239 (78.1)
    Total                          151 (49.3)         155 (50.7)       306 (100)
  The rate at which pages loaded was fast enough [X² = 6.033, p = .04 **]
    Disagree                       31 (20.9)          33 (22.3)        64 (21.6)
    Neutral                        11 (7.4)           24 (16.2)        35 (11.8)
    Agree                          106 (71.6)         91 (61.5)        197 (66.6)
    Total                          148 (50)           148 (50)         296 (100)
  The website was designed for all levels of users [X² = 4.065, p = .131 *]
    Disagree                       5 (3.5)            14 (9.1)         19 (6.4)
    Neutral                        22 (15.3)          20 (13)          42 (14.1)
    Agree                          117 (81.3)         120 (77.9)       237 (79.5)
    Total                          144 (48.3)         154 (51.7)       298 (100)

  *** p < 0.01, ** p < 0.05, * p < 0.150

Table 10. Data entry role and overall satisfaction with CoA Portal

                                   Role in data entry
                                   All/Most           Half/Some/       Total
                                                      A little/None
  I believe my program's historical ARO data in the Archives section is accurate [X² = 5.938, p = .051 *]
    Disagree                       15 (6.1)           7 (13.5)         22 (7.4)
    Neutral                        50 (20.5)          15 (28.8)        65 (22)
    Agree                          179 (73.4)         30 (57.7)        209 (70.6)
    Total                          244 (82.4)         52 (17.6)        296 (100)
  The organization of information on the website is clear [X² = 4.901, p = .086 *]
    Disagree                       25 (10.2)          5 (10.4)         30 (10.2)
    Neutral                        40 (16.3)          2 (4.2)          42 (14.3)
    Agree                          180 (73.5)         41 (85.4)        221 (75.4)
    Total                          245 (83.6)         48 (16.4)        293 (100)
  The rate at which pages loaded was fast enough [X² = 3.90, p = .136 *]
    Disagree                       52 (20.9)          12 (25.5)        64 (21.6)
    Neutral                        26 (10.4)          9 (19.1)         35 (11.8)
    Agree                          171 (68.7)         26 (55.3)        197 (66.6)
    Total                          249 (84.1)         47 (15.9)        296 (100)
  *** p < 0.01, ** p < 0.05, * p < 0.150

Table 11. Overall satisfaction with CoA Portal and website speed

                                   Overall, I am satisfied with the CoA Portal ARO website
                                   Disagree           Neutral          Agree            Total
  The rate at which pages loaded was fast enough [X² = 33.768, p < .001 ***]
    Disagree                       16 (61.5)          12 (32.4)        36 (15.5)        64 (21.7)
    Neutral                        3 (11.5)           5 (13.5)         26 (11.2)        34 (11.5)
    Agree                          7 (26.9)           20 (54.1)        170 (73.3)       197 (66.8)
    Total                          26 (8.8)           37 (12.5)        232 (78.6)       295 (100)

  *** p < 0.01, ** p < 0.05, * p < 0.150

Table 12. Website designed for all user levels and website is easy to use

                                   The website was designed for all levels of users
                                   Disagree           Neutral          Agree            Total
  Overall, the website is easy to use [X² = 105.840, p < .001 ***]
    Disagree                       14 (73.7)          8 (20)           12 (5.2)         34 (11.7)
    Neutral                        3 (15.8)           14 (35)          24 (10.4)        41 (14.1)
    Agree                          2 (10.5)           18 (45)          195 (84.4)       231 (74.1)
    Total                          19 (6.6)           40 (13.8)        231 (79.7)       290 (100)

  *** p < 0.01, ** p < 0.05, * p < 0.150
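The chi-square statistics in the crosstab tables follow the standard Pearson formula. As a check, the statistic for Table 11 can be recomputed from its observed counts; a dependency-free sketch (a library routine such as SciPy's `chi2_contingency` would give the same value):

```python
# Pearson chi-square for Table 11's 3x3 contingency table.
# Rows: page-load rating (Disagree/Neutral/Agree);
# columns: overall satisfaction (Disagree/Neutral/Agree).
observed = [
    [16, 12, 36],   # page load: Disagree
    [3, 5, 26],     # page load: Neutral
    [7, 20, 170],   # page load: Agree
]

def pearson_chi2(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n  # under independence
            chi2 += (obs - expected) ** 2 / expected
    return chi2

print(round(pearson_chi2(observed), 3))  # ~33.768, matching Table 11
```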
Discussion

The 2012 ARO CoA Portal survey identified both problem areas and satisfactory areas in the CoA Portal. In general, survey respondents were satisfied with the new CoA Portal platform and agreed that the website was designed for all levels of users. Respondents expressed dissatisfaction with the adequacy of the ARO instructions, the expected functionality of the site, and the page load rate. There was also overall agreement that it was frustrating that the site redirected users to the first page of the student, graduate, or faculty list after updates to an individual profile, that column sort preferences weren't saved for the profile tables, and that solutions to reliability check errors weren't provided.

The CoA Portal is a data collection system used for accreditation purposes, and it is important that the data collected are accurate. Ultimately, ARO data in the Portal will be used to populate program self-study tables. It is important to note that most respondents agreed that ARO data for previous years were imported into the new database accurately [76% agreed; 13% disagreed]. Also, ninety percent of respondents agreed that their program updated and/or added records during the 2012 ARO and that their data were consistent with self-study data. However, just seventy percent agreed that their archived ARO data were accurate. Considering that some ARO data points will be used to populate self-study tables in the future, it would be very useful if programs could make corrections and updates to historical ARO data.

For follow-up analysis, practical logic guided the exploratory analyses, and statistically significant associations were identified for some items. Among the interesting trends: participation in data entry training seemed to go along with finding the website easier to use, feeling that the website was designed for all levels of users, perceiving the page loading speed positively, and overall satisfaction with the CoA Portal.
Also, respondent role in the program moderated outcomes on several items. Administrative respondents tended to understand how to download a spreadsheet of ARO data better than faculty/program director respondents, and they tended to agree more often that the CoA Portal instructions were adequate. Interestingly, in connection with these outcomes, administrative respondents participated in data entry training at a higher rate than faculty/program director respondents.

There were limiting factors to consider in the implementation of this survey. The poor response rate and the lack of a validation process for survey items impaired the ability to interpret the summarized results with any depth. In addition, the survey methodology did not take measures to ensure that a representative sample of registered Portal users participated. Consequently, broad statements about the utility and functionality of the CoA Portal as it relates to APA-accredited programs as a whole can't be made using these survey results alone.

The implication of these results is that the CoA Portal is a tool that has improved, and will further improve, the method of data collection for APA accreditation purposes, particularly for the ARO. The fact that administrative staff for programs are typically in contact with OPCA staff and more readily available than program directors may mean that, in general, program directors' knowledge about the ARO and the CoA Portal site differed significantly from that of administrative staff. Hence, program role in some cases moderated how respondents agreed or disagreed with survey items (Table 7). An important follow-up question is how OPCA staff might tailor ARO outreach and resources to program directors versus administrative staff. Finally, these results indicate that users may benefit from participating in data entry training prior to entering data for the ARO.
There are limitations in the current system that were not covered by this survey format but have caused problems for some programs, such as the inability to enter student/trainee (or faculty/supervisor) start and left dates that fall outside the reporting year, and confusion regarding faculty/supervisor classification definitions. Relatedly, programs are also seeking guidance regarding ARO data requirements for faculty/supervisors classified as Other Contributors. Also, because ARO data are collected and organized by calendar academic/training year, the ARO system must apply rigid business rules to the data, so flexibility is compromised when handling start and left dates. Consequently, situations arise where certain ARO data may appear incompatible with self-study data. These challenges will be prioritized and solved as viable, long-term solutions become available.

Finally, this survey was an exploratory effort and served as a medium for programs to express their opinions about the CoA Portal and to support future decision-making regarding changes to it. The goal of the CoA moving forward is to provide programs with the support needed to successfully submit required annual ARO data, to work with the current vendor to reduce the number of obstacles to entering and submitting data efficiently and on time, and to reduce confusion about which data are required. This year the CoA and OPCA are offering 4 webinars for data entry training, and the ARO will not close on a weekend, as it did last year, when office staff are not available. Several new features have been added to the CoA Portal to make it more convenient and easier to navigate, such as preserving column sort preferences after navigating back to a profile table and saving the page location when returning from updating an individual profile in a multi-page profile table.
Throughout the Portal, more detailed instructions are available to help users understand the data entry process, including more detailed instruction for the reliability check feature used in the ARO to check for consistency in data entry. Also, to facilitate accurate population of self-study tables from ARO data, the Commission has approved opening the Archives section of the ARO so that programs can update student and faculty records.
Appendix A. Survey Items

Respondent information
1. Respondent role in program: Program Director; Faculty; Administrative
2. Number of accredited programs for which you complete the ARO: 1; 2; 3
3. Type of program(s): Doctoral; Internship; Postdoctoral (select all that apply)
4. Participation in pre- or post-launch training: Yes/No
5. Respondent role in data entry: Entered all data; most; half; some; a little; none
6. Typical browser used: IE6; IE7; IE8; IE9; IE10; Firefox; Chrome; Safari; Other; Unknown

Likert-scale items: 1 = Strongly disagree; 2 = Disagree; 3 = Neutral; 4 = Agree; 5 = Strongly agree

Overall Domain:
7. Overall, I am satisfied with the CoA Portal ARO website
8. The website was designed for all levels of users
9. The data that were pre-populated from previous AROs were accurate
10. I am confident that my program added or updated all records before submitting the 2012 ARO
11. I believe my program's ARO data is consistent with data in our self-study tables
12. I believe my program's historical ARO data in the Archives section is accurate

Usability Domain:
13. Overall, the website is easy to use
14. I found the training webinars and online tutorials to be useful
15. When working through the ARO, the instructions were adequate
16. The progress bars on the ARO dashboard were helpful for completing the ARO
17. I understood how and when to use the "Mark as complete" buttons
18. The ARO email reminders were helpful
19. The website has all the functions and capabilities I expect it to have
20. I was able to complete data entry in a reasonable amount of time

Performance/Navigation Domain:
21. The rate at which pages loaded was fast enough
22. The organization of information on the website is clear
23. Student/Graduate/Faculty profile tables should automatically sort records by last name
24. After updating a record, it was frustrating that column sort preferences were not saved when returning to Student/Graduate/Faculty profile tables
25. After updating a record not located on the first page, it was frustrating that the system returned to the first page of Student/Graduate/Faculty profile tables
26. The reliability check table was frustrating because error messages did not tell me how to fix the problems
27. I understand I can download my program's 2012 ARO data by clicking on the Excel link on the Submit page (both during the 2012 open window and now)
28. I understood the steps required to formally submit my program's ARO

Multiple option:
29. Which month should the ARO window open? (ARO Window) April; May; June
30. Which month should the ARO window close? (ARO Window) August; September