SCOTT'S THOUGHTS

Volume 10: Part One Conclusion

December 26, 2023 · 5 min read

Thank you for joining me over the past weeks as we have examined the most common citations (and how to avoid them). We have looked at ARC-PA Standards C1.03 and C1.02, which account for a large majority of program citations concerning the submission of the SSR. I hope I have provided some valuable insight into tackling issues with these standards from the ground up, beginning with how your program curates its data and leading to performance improvement across the board, rather than band-aid measures that merely cover up underlying issues.

Today let’s do a quick review of what we have discussed as well as the main goals of this blog series.

Summary of Common Mistakes for Standards C1.03 and C1.02

Avoid committing these errors!

  • Presenting benchmarks that include only one data source.

  • Presenting benchmarks that require only one year of data.

  • Defining areas in need of improvement and strengths without explicit language or a definition of what drives the conclusion.

  • Not providing conclusions/summary statements in the analysis narratives that mirror the language used in the strengths, modifications, and areas in need of improvement.

  • Documenting data-driven modifications in the analysis narrative that do not directly connect to the modification section at the bottom of the template.

  • Benchmarks that inadequately define program expectations for each specific data element. Good benchmarks are the first step to getting this whole process done correctly! (A rough sketch of what a good benchmark might look like follows this list.)

  • Benchmarks that are inadequate for defining measurable strengths and areas in need of improvement.

  • Not triangulating data from other Appendix 14 components whenever possible.
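
To make "good benchmarks" concrete, here is a minimal sketch, in Python, of one way a program might record a benchmark that draws on more than one data source and more than one year of data. The element name, threshold, rationale, and sources are hypothetical illustrations, not ARC-PA-prescribed values:

```python
from dataclasses import dataclass

@dataclass
class Benchmark:
    """A program-defined expectation for a single data element."""
    data_element: str        # what is being measured
    threshold: float         # the program's expectation for this element
    rationale: str           # why this threshold, in program/institutional terms
    data_sources: list[str]  # more than one source, so the data can be triangulated
    years_required: int = 3  # judged against a multi-year trend, not a single year

# Hypothetical example: the names, sources, and values are illustrative only.
pance_benchmark = Benchmark(
    data_element="First-time PANCE pass rate",
    threshold=0.95,
    rationale="Graduates are expected to meet or exceed the five-year national average",
    data_sources=["NCCPA score report", "Summative exam results", "Graduate survey"],
    years_required=3,
)
```

Writing the benchmark down in this structured way forces the program to answer, up front, exactly the questions the ARC-PA asks: what is the threshold, why that threshold, which sources, and over how many years.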

Remember ARC-PA’s Expectations

By keeping the ARC-PA’s expectations in mind from the beginning, you’ll build a better foundation to respond. 

  • Establish program benchmarks for quantitative and qualitative data and explain the rationale for the choice of a threshold/benchmark in terms of program or institutional expectations.  

  • Describe how the quantitative and qualitative data will be interpreted and critically analyzed by the program to identify potential for improvement or change.  

  • Appropriately reference any data discussed in the analysis. 

  • Document data analysis that explores potential causes/reasons for scores below the program's benchmark and trends over time, and that relates the data to the expectations of the program. (A rough sketch of this kind of analysis follows this list.)

  • Clearly state conclusions within the narratives.  

  • Link conclusions and action plans to data analysis. 
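
As a rough illustration of what those expectations look like in practice, here is a minimal sketch (in Python, with made-up numbers and a hypothetical benchmark) of tabulating one data element across several years, flagging results below the benchmark, and computing a simple trend so the narrative can explore causes:

```python
import pandas as pd

# Hypothetical multi-year results for one data element; the numbers are made up.
results = pd.DataFrame({
    "year": [2021, 2022, 2023],
    "first_time_pass_rate": [0.97, 0.92, 0.94],
})
BENCHMARK = 0.95  # the program-defined threshold for this element

# Flag any year below the benchmark and compute the year-over-year change.
results["below_benchmark"] = results["first_time_pass_rate"] < BENCHMARK
results["yoy_change"] = results["first_time_pass_rate"].diff()

print(results)
# Each flagged year is a prompt for the narrative: explore potential causes
# and relate the result back to the program's expectations.
```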

Presentation Strategies

When writing the SSR, make sure your summary statements provide analysis and conclusions that are directly tied to each strength, area in need of improvement, or modification.

Write clearly worded summary statements within the SSR that provide the rationale for strengths, areas in need of improvement, and modifications, and that align directly with the language at the bottom of the template.

I know that compliance with the data-heavy requirements of ARC-PA seems like a Herculean task. But you can do this! It’s just the new normal, and requires different preparation than we have employed in years past. Remember the following:

  1.   Using quality assurance to ensure that the program follows the ARC-PA's directions can make the difference between a successful visit and multiple citations within the C standards.

  2.   Expectations regarding benchmarks, triangulation of data, and generation of summary statements have become exceedingly prescriptive.

  3.   Program resources dedicated to assessment must be evaluated for sufficiency to ensure that all requirements are met.

  4.   Administration at your institution needs to be aware of the heightened expectations, which demand a significant commitment of time and resources to achieve compliance.

Quality Assurance

One of the best ways to improve compliance is also one of the oldest – something we learn in elementary school! That is, of course, to have someone else check your work. I know, as a professional, it’s tempting to do everything yourself and trust your own judgment. However, having a second or third pair of eyes review your reports is an excellent way to sidestep errors.

Outside review provides quality assurance against common mistakes and pitfalls. Those of us who do a lot of writing in our jobs know that we frequently stop "seeing" our documents and overlook mistakes and other oversights.

External review is essential to make sure that others can understand your charts and graphs, follow the narratives, and draw the same conclusions you reached. Do not assume that what is crystal clear to you must be so for everyone. If your readers cannot see how you came to a conclusion, this is a red flag that you’re missing steps.

Does my PA program need a designated data analyst?

I am often asked this question in webinars and consultations. I believe the answer is yes. The world is changing, and curating data is becoming an art. Raw data can't be meaningfully reviewed or analyzed until it is organized. PA programs need someone who can take large amounts of incoming data, tabulate it, curate it, and put it in place, so that when the time comes to use it, it is actually ready to be useful.
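
What does curating look like in practice? Here is a minimal sketch (in Python with pandas, assuming the raw data arrives as separate spreadsheet exports; the file names and structure are hypothetical) of pulling multiple sources into one tidy table that is ready for the kind of benchmark comparison described above:

```python
import pandas as pd

# Hypothetical exports; real programs will have their own files and columns.
sources = {
    "exit_survey": "exit_survey_2023.csv",
    "preceptor_eval": "preceptor_evals_2023.csv",
}

frames = []
for name, path in sources.items():
    df = pd.read_csv(path)
    df["source"] = name  # keep provenance so results can be triangulated later
    frames.append(df)

# One tidy table: every observation in one place, ready for analysis.
curated = pd.concat(frames, ignore_index=True)
curated.to_csv("curated_assessment_data_2023.csv", index=False)
```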

Some programs already have a full-time director of assessment. I’ve done this myself as a full-time job. I recommend going to administration and saying, “In order to function efficiently, we need an advanced staff member to take care of this kind of analysis for us.”

Coming Soon

I am currently working on a follow-up webinar and blog series that continues my review of the top ten most common SSR citations. Remember, my webinars are always free! 

Please join me in the coming weeks for more insight into avoiding the pitfalls of meeting ARC-PA Standards, managing your program’s Self-Study Reports, and remaining in compliance. 


Scott Massey

With over three decades of experience in PA education, Dr. Scott Massey is a recognized authority in the field. He has demonstrated his expertise as a program director at esteemed institutions such as Central Michigan University and as the research chair in the Department of PA Studies at the University of Pittsburgh. Dr. Massey's influence extends beyond practical experience: he has contributed significantly to accreditation, assessment, and student success. His innovative methodologies have guided numerous PA programs to ARC-PA accreditation and improved program outcomes, and his predictive statistical risk modeling has enabled schools to anticipate student results. Dr. Massey has published articles on predictive modeling and educational outcomes and has conducted longitudinal research on stress among graduate health science students. His commitment to advancing the PA field is evident through participation in PAEA committees, councils, and educational initiatives.
