Beyond Pass Rates: Making PANCE Data Work for Program Improvement

May 06, 2025 · 3 min read

The ARC-PA continues to emphasize an integrated approach to validating program compliance with the C standards. In this blog, I aim to clarify the expectations of Standard C1.01 in the fifth edition of the Standards, which states:

“The program must define its ongoing self-assessment process that is designed to document program effectiveness and foster program improvement. At a minimum, the process must address: e) PANCE performance.”

For many years, a high first-time PANCE pass rate seemed like enough. I assumed strong results confirmed that the program’s methods were effective and required little additional validation. But that assumption misses the depth of what is now required.

Appendix 13F of the Self-Study Report (SSR), part of the fifth edition standards, outlines a number of areas that must be analyzed in connection with PANCE outcomes, including:

  • Admissions criteria as predictors of success

  • Course outcomes

  • Course and instructor evaluations by students

  • Program instructional objectives, learning outcomes, and curriculum depth/breadth

  • Student summative evaluation results

  • Student progress criteria and attrition data

  • Feedback from students who did not pass the PANCE

  • Preceptor and graduate feedback

In this and future blog entries, I will break down each of these elements and offer practical suggestions on how to align with both the spirit and the specifics of the requirement. These perspectives are my own and not official guidance from the ARC-PA.


Admissions Criteria as Predictors of Success

How do admissions factors relate to PANCE success? Start with your program’s minimum prerequisite standards. These typically include metrics like cumulative GPA, science GPA, prerequisite GPA, GRE scores, and healthcare experience hours.

A useful strategy is to analyze your most recent graduating class. Segment graduates by PANCE score and look for trends across these admissions variables. If any students failed on their first attempt, examine their academic profiles in detail. How did they differ from students who passed?

If no one failed, focus on those with scores below 400—a range that often represents borderline performance. You may uncover early indicators of risk.
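To make that concrete, here is a minimal pandas sketch of the segmentation. The file and column names (pance_score, cum_gpa, and so on) are placeholders for whatever your program actually tracks, and the 350 passing threshold is an assumption to confirm against current NCCPA policy.

```python
import pandas as pd

# Hypothetical file and column names; substitute your program's own data.
# PANCE scaled scores run 200-800; 350 is assumed as the passing standard.
cohort = pd.read_csv("cohort_2024.csv")  # one row per graduate

# Band graduates: failed (<=349), borderline (350-399), comfortable (400+).
cohort["band"] = pd.cut(
    cohort["pance_score"],
    bins=[200, 349, 399, 800],
    labels=["failed", "borderline", "comfortable"],
    include_lowest=True,
)

# Compare mean admissions profiles across the three bands.
print(
    cohort.groupby("band", observed=True)[
        ["cum_gpa", "science_gpa", "prereq_gpa", "hce_hours"]
    ].mean().round(2)
)
```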

Statistical analysis can help here too. A Pearson correlation, for instance, shows the strength of the relationship between each admissions factor and PANCE scores. Extending this across several cohorts can help refine your admissions scoring, perhaps revealing that certain factors are given too much or too little weight.
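Here is a sketch of that correlation step, pooling several hypothetical cohort files. SciPy's pearsonr returns the coefficient along with a p-value, which is worth watching given typical PA class sizes.

```python
import pandas as pd
from scipy import stats

# Pool several cohorts (hypothetical file names) for more stable estimates.
pooled = pd.concat(pd.read_csv(f"cohort_{y}.csv") for y in range(2020, 2025))

predictors = ["cum_gpa", "science_gpa", "prereq_gpa", "gre_total", "hce_hours"]
for col in predictors:
    pair = pooled[[col, "pance_score"]].dropna()   # drop incomplete records
    r, p = stats.pearsonr(pair[col], pair["pance_score"])
    print(f"{col:>12}: r = {r:+.2f}  p = {p:.3f}  n = {len(pair)}")
```

With cohorts of 30 to 50 students, a single year's correlations are noisy; pooling several years, as above, is usually what makes a pattern reliable enough to act on.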


Course Outcomes

Course performance is another critical area. Begin by reviewing the academic records of any students who failed the PANCE. Were there patterns of lower grades? Did they struggle in certain systems-based or clinically oriented courses?

Zooming out, look at class-wide performance in relation to national PANCE benchmarks. Are there specific organ systems or task areas where your students consistently underperform?
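The NCCPA program report already breaks performance out by content area, so a small script can make the gaps explicit. The figures below are invented purely for illustration, and the two-point flag threshold is an arbitrary choice.

```python
import pandas as pd

# Illustrative values only; transcribe the real figures from your
# NCCPA program report (mean percent correct, program vs. national).
report = pd.DataFrame({
    "area":     ["Cardiovascular", "Pulmonary", "Endocrine", "Musculoskeletal"],
    "program":  [74.1, 69.3, 71.8, 66.2],
    "national": [72.5, 71.0, 70.9, 70.4],
})

report["gap"] = (report["program"] - report["national"]).round(1)
print(report.sort_values("gap"))        # weakest areas first
print(report[report["gap"] <= -2.0])    # flag notable deficits
```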

It’s also essential to revisit curriculum mapping. Do your courses align with the NCCPA blueprint? If specific topics have fallen off the radar—or if new instructors have introduced content changes—this could affect outcomes. A sudden drop in student scores may be traced back to such changes.
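Even a rough coverage check helps here. The sketch below uses a hypothetical, abbreviated slice of the blueprint and a toy curriculum map; the real inputs would come from your own course documentation.

```python
# Abbreviated, hypothetical subset of NCCPA blueprint content areas.
blueprint = {
    "Cardiovascular", "Pulmonary", "Gastrointestinal/Nutritional",
    "Endocrine", "Musculoskeletal", "Neurologic",
}

# Content areas each course claims to cover, per the curriculum map.
curriculum_map = {
    "Clinical Medicine I":  {"Cardiovascular", "Pulmonary"},
    "Clinical Medicine II": {"Gastrointestinal/Nutritional", "Endocrine"},
}

# Any blueprint area no course claims is a potential gap to investigate.
covered = set().union(*curriculum_map.values())
print("Blueprint areas with no mapped course:", blueprint - covered)
```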

Again, basic statistics can help. Correlating aggregate course grades with PANCE scores can highlight which classes are strong predictors of exam performance and which may need adjustment.
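pandas' corrwith handles this in a line or two, again with hypothetical file and column names.

```python
import pandas as pd

# One row per graduate: final course grades plus the PANCE scaled score.
df = pd.read_csv("grades_with_pance.csv")
courses = [c for c in df.columns if c != "pance_score"]

# Pearson r of each course grade against PANCE, strongest predictors first.
print(df[courses].corrwith(df["pance_score"]).sort_values(ascending=False).round(2))
```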


These kinds of data investigations require faculty to be more than just educators—they must also be analysts and collaborators. Effective self-assessment digs beneath surface-level metrics and identifies opportunities for continuous improvement.

In the next blog, I’ll explore how student evaluations and instructional objectives contribute to the self-assessment process. Stay tuned.

Self-assessment · PANCE performance · Admissions criteria · Course outcomes · Program evaluation

Scott Massey

With over three decades of experience in PA education, Dr. Scott Massey is a recognized authority in the field. He has demonstrated his expertise as a program director at institutions such as Central Michigan University and as research chair in the Department of PA Studies at the University of Pittsburgh. His influence extends beyond practical experience: he has contributed significantly to accreditation, assessment, and student success. His methodologies have guided numerous PA programs to ARC-PA accreditation and improved program outcomes, and his predictive statistical risk modeling has enabled schools to anticipate student results. Dr. Massey has published articles on predictive modeling and educational outcomes and has conducted longitudinal research on stress among graduate health science students. His commitment to advancing the PA field is evident in his participation in PAEA committees, councils, and educational initiatives.

