
Volume 5: Standard C1.02 (a-c)

November 14, 2023 · 5 min read

Welcome back to my blog series on common citations that PA Programs receive from the ARC-PA. We’ve been focusing on Standard C1.03 - the most problematic of all standards. Before we move on to solutions and recommendations, let’s do a quick review of Standard C1.02(a-c), because problem-solving for these various standards and sub-standards follows extremely similar pathways. 

Since March 2022, 46 PA programs have received at least one citation for Standard C1.02 (a-c).

The components of Standard C1.02 are data collection and critical analysis, concepts that seem somewhat elusively defined. What does it mean when the Commission says you “didn’t conduct critical analysis”? What does critical analysis of data actually mean? What qualifies as “collection”?

Let’s first break down the standard into its parts. The definitions are provided by the Accreditation Manual for Entry-Level Physician Assistant Programs (July 2023).

C1.02. The program must implement its ongoing self-assessment process by:

  a. Conducting data collection,

  b. Performing critical analysis of data, and

  c. Applying the results leading to conclusions that identify:

    i. Program strengths,

    ii. Program areas in need of improvement, and

    iii. Action plans.

Standard C1.02a

Revisions to the curriculum and other dimensions of the program are clearly supported by data collected in a process of ongoing program self-assessment.

This support must be confirmed in the SSR and on-site through interviews with faculty and staff, data collection tools, data summaries, and committee meeting minutes. A program should collect both quantitative and qualitative data within its self-assessment process that address the data requirements for each self-study appendix.

Standard C1.02b

Here is this standard’s definition from the Accreditation Manual for Entry-Level Physician Assistant Programs (July 2023):

Evaluating the validity of data (e.g., low response rates), identification of areas above or below benchmark, evaluating trends over time, triangulation of data to identify relationships/contributing factors, and correlation of data to the expectations of the program. 

Revisions in curricular and administrative aspects of the program (requirements, content, instructional methods, evaluation, policies, etc.) are shown to be data-driven or data-informed. 

Standard C1.02c

Standard C1.02c.i. The program must implement its ongoing self-assessment process by applying the results leading to conclusions that identify program strengths. The program must clearly articulate and document the process by which it determined what it considered a strength.

For example, if you state that a strength is test scores falling above benchmark but fail to state the benchmark, this is problematic for the reviewer. Meeting minutes provided at the site visit must document the analysis and identification of program strengths as part of the program’s data analysis.

Standard C1.02c.ii. The program must implement its ongoing self-assessment process by applying the results leading to conclusions that identify program areas in need of improvement.

The program must consistently correlate the data with other quantitative or qualitative data to determine the reason(s) for falling below a benchmark and identify that area as one in need of improvement.

Standard C1.02c.iii. The program must implement its ongoing self-assessment process by applying the results leading to conclusions that identify action plans. Action plans must be shown to result from the implementation of an ongoing self-assessment process and must be data-driven.

Modifications are completed at the time you submit your report. Areas needing improvement are not; those are ongoing action items that you are monitoring. If they are not driven by your benchmarks, they are not deemed to be data-driven.

Common Language of Citations 

When the review committee cites a failure in compliance, we can often look to its comments to see what was missing from the SSR. Let’s look at this finding for Standard C1.02b, along with the committee’s comments.

Findings: The program did not provide evidence of the implementation of its ongoing self-assessment process by performing a critical analysis of data.

Comments from the review committee: The program did not provide evidence that its conclusions, strengths, and areas in need of improvement provided as examples of ongoing program self-assessment were the result of performing critical analysis of the data. Missing data did not allow the program to fully assess any aspect of the program. 

At the time of the site visit, the program faculty stated that at their annual retreats they “review” the data. Program faculty did not articulate a well-defined process, to include benchmarks with rationale for the analysis of quantitative and qualitative data. The program faculty reiterated what was documented within the submitted modified self-study report (mSSR) and did not articulate critical analysis of the data to support any conclusions. 

Onsite review of retreat minutes provided a list of the topics and some individual data provided within the minutes (e.g., PANCE failures with their admissions data and PACKRAT performance). However, there was no onsite documentation to verify the implementation of data analysis as part of the program’s self-assessment process. In the response, the program acknowledged the observation.

Next time…

Let’s move on from defining the problems to solving them! I invite you to join me in my next blog, in which I will begin sharing many of the common mistakes and “trouble areas” that I see arising in PA programs’ SSRs - and how to prepare for them. Depending upon the site visitors and the commissioner, you may be cited for C1.03 as well as C1.02; it really depends on what the issues are, and the two standards overlap in many regards. Knowing this ahead of time will help you avoid these errors and align your SSR with what the Commission’s reviewers expect from you.


Scott Massey

With over three decades of experience in PA education, Dr. Scott Massey is a recognized authority in the field. He has demonstrated his expertise as a program director at esteemed institutions such as Central Michigan University and as the research chair in the Department of PA Studies at the University of Pittsburgh. Dr. Massey's influence extends beyond practical experience, as he has contributed significantly to accreditation, assessment, and student success. His innovative methodologies have guided numerous PA programs to ARC-PA accreditation and improved program outcomes. His predictive statistical risk modeling has enabled schools to anticipate student results. Dr. Massey has published articles related to predictive modeling and educational outcomes and has conducted longitudinal research on stress among graduate health science students. His commitment to advancing the PA field is evident through participation in PAEA committees, councils, and educational initiatives.

