SCOTT'S THOUGHTS
I’m glad you are joining me again as I take us through the process of creating a Self-Study Report that will pass muster with the Commission. In the last blog, I began our examination of Standard C1.03 – the most frequently cited standard. Small wonder, too, considering the vast amount of information it demands. Today let’s look in more detail at that standard’s many appendices and what they mean for your SSR.
As we examine the appendices in depth, remember that each should be answered with 1) a tabular or graphic display and 2) a narrative, as the ARC-PA’s instructions request. Therefore, as we move through the requirements, plan with two goals in mind: displaying your data in an understandable way, and describing its status and implications in a narrative that coheres with your graphics.
Appendix 14B: Administrative Aspects of the Program and Institutional Resources. This appendix requires the program to submit data and analysis to assess the sufficiency and effectiveness of administrative aspects of the program and institutional resources.
Appendix 14B has an extensive list of elements that must all be addressed in relation to sufficiency and effectiveness; don’t leave any of them out of your report. If you need to recalibrate your surveys or data collection to do so, think about it sooner rather than later! Later in this blog series, we will address how to present “new” data (that is, new variables with only one or two years of data rather than the Commission’s preferred three years).
Appendix 14C: Effectiveness of the Didactic Curriculum. This appendix requires the program to submit data and analysis related to 1) student evaluation of didactic courses and instructors (across multiple years), 2) the number of final grades of C or below for didactic courses, and 3) student attrition and remediation in didactic courses.
Appendix 14D: Effectiveness of the Clinical Curriculum. This appendix requires the program to submit data and analysis related to 1) student evaluation of rotations, clinical sites and preceptors, 2) the number of final grades of C or below for rotations, and 3) student attrition and remediation in rotations.
One problem I often see in responses to Appendix 14D is that the ARC-PA’s templates do not spell out what the expectations are. So let me remind you: it is insufficient to present only a high-level aggregate score (e.g., our preceptor average score was ___). Programs must present aggregates for every site and every preceptor, along with a trend analysis; one way to build those tables is sketched below.
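The ARC-PA does not prescribe a tool for producing these tables, so the following is only a minimal sketch of the aggregation, written in Python with pandas. It assumes a hypothetical evaluation export and column names (cohort_year, site, preceptor, score); your survey platform’s fields will differ.

```python
# Sketch: per-site and per-preceptor aggregates with a year-over-year trend.
# Assumes a hypothetical CSV export with columns: cohort_year, site, preceptor, score.
import pandas as pd

evals = pd.read_csv("preceptor_evaluations.csv")  # hypothetical file name

# One row per site, one column per cohort year (mean score and response count).
site_trend = (evals.groupby(["site", "cohort_year"])["score"]
                   .agg(["mean", "count"])
                   .unstack("cohort_year"))

# The same view, one row per preceptor.
preceptor_trend = (evals.groupby(["preceptor", "cohort_year"])["score"]
                        .agg(["mean", "count"])
                        .unstack("cohort_year"))

# Export tables that can be pasted into the appendix template.
site_trend.round(2).to_csv("appendix_14d_site_trend.csv")
preceptor_trend.round(2).to_csv("appendix_14d_preceptor_trend.csv")
```

Tables like these also make the narrative easier to write: outlier sites or preceptors and year-over-year shifts become visible at a glance.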
Appendix 14E: Preparation of Graduates to Achieve Program Defined Competencies. This appendix requires the program to submit data and analysis related to student attrition, summative evaluation performance, student exit/graduate feedback, and faculty evaluation of the curriculum to assess its ability to prepare students to achieve program defined competencies.
This appendix forces you to focus on a key question: how do you measure your competencies? You have student exit feedback and the summative evaluations, but you may need to think of other creative ways to evaluate your curriculum.
Appendix 14F: Program Evaluation of PANCE Outcomes. This appendix requires the program to submit data and analysis related to program evaluation of PANCE outcomes against many variables:
Admissions criteria as a predictor of success
Course outcomes
Course and instructor evaluations by students
Program instructional objectives, learning outcomes, and breadth and depth of the curriculum
Student summative evaluation results
Remediation practices and results
Student progress criteria and attrition data
Feedback from students who were unsuccessful on PANCE, if available
Preceptor and graduate feedback (employer feedback is not required but may be helpful if available to the program)
A cautionary note for this appendix: be certain to look at PANCE outcomes against all of these variables; one way to begin those comparisons is sketched below.
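The templates do not dictate how these comparisons should be run, but even simple cross-tabulations of PANCE results against each variable go a long way. Here is a minimal sketch, again in Python with pandas, assuming a hypothetical per-graduate table (grad_year, undergrad_gpa, summative_score, remediated, pance_pass coded 0/1); swap in whichever variables your program actually tracks.

```python
# Sketch: comparing PANCE outcomes against program variables.
# Assumes a hypothetical per-graduate table with columns:
#   grad_year, undergrad_gpa, summative_score, remediated (0/1), pance_pass (1 = pass, 0 = fail)
import pandas as pd

grads = pd.read_csv("graduate_outcomes.csv")  # hypothetical file name

# First-time pass rate by cohort year (the headline trend).
pass_rate_by_year = grads.groupby("grad_year")["pance_pass"].mean().round(3)

# Admissions criteria and summative performance, split by PANCE result.
predictor_summary = (grads.groupby("pance_pass")[["undergrad_gpa", "summative_score"]]
                          .mean()
                          .round(2))

# Remediation as a categorical variable: pass rate with and without remediation.
remediation_table = (pd.crosstab(grads["remediated"], grads["pance_pass"], normalize="index")
                       .round(3))

print(pass_rate_by_year)
print(predictor_summary)
print(remediation_table)
```

The point is not statistical sophistication; it is showing the Commission that each of the variables listed above was actually examined against PANCE performance.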
Appendix 14G: Sufficiency and Effectiveness of Principal and Instructional Faculty and Staff. This appendix requires the program to submit data and analysis related to the sufficiency and effectiveness of program faculty, the sufficiency and effectiveness of administrative support staff, and program faculty and staff changes.
There is a matrix of answers to provide here, so it is important to ensure that your response checks all the boxes. Remember 1) to address sufficiency and effectiveness separately, and 2) to address all three groups: principal faculty, instructional faculty, and staff.
Appendix 14H: Success in Meeting the Program’s Goals. This appendix requires the program to submit data and analysis related to its effectiveness in meeting its goals. Trend the data across multiple years and demonstrate that your program is meeting each goal.
In conjunction with this, ensure that the data validating your program’s success in meeting its goals is posted on the program’s website. Program websites are routinely audited, so if you have not reviewed your goals or your data display in some time, you are likely not displaying sufficient data.
Now that we have looked at the appendices and noted what each requires, we will move forward with how to respond in our next blog post. The Commission wants specific information presented in a specific way, and unlocking that mystery is key to reducing the number of citations and observations our SSRs receive. Join me then.