BLOG

SCOTT'S THOUGHTS

Volume 9: Faculty Involvement - The Importance of Faculty Knowledge

December 19, 2023 · 4 min read

I’ve seen situations where the SSR was almost flawlessly written, but unfortunately, the faculty could not articulate the same ideas during site visit Q&A sessions, and there were no records in the minutes to support the report, either. Obviously, a well-written report is desirable. You must, however, ensure that your faculty is on board with the language, the action plans, the benchmarks, and so on. If your faculty doesn’t know what’s in the SSR or what any of it means, the site visit will not go well.

Ensure that your benchmarks are precise enough to let your faculty analyze the data rigorously. The benchmarks must guide you, defining both strengths and areas needing improvement. Data that falls below a benchmark should trigger an action plan when it meets your criteria.

Keep the following goals in mind:

  • Ensure that all team members can articulate the benchmarks during the site visit. If your benchmarks are “all over the place” and not clearly defined, that’s the first place things will start to fall apart.

  • Create minutes templates that allow for better recording of action plans, archiving and memorializing the plans that align with your SSR.

  • Prepare your faculty to “speak the language” of critical analysis.

  • Prepare all team members to own at least one SSR element: didactic faculty for Appendix 14C, clinical faculty for Appendix 14D, and the assessment team for Appendices 14E and 14F.

  • Hold multiple review sessions with the assessment team.

The key assessment team members need to be the “experts” in describing and outlining the program’s assessment process. This includes articulating how critical analysis is integral to the program's decision-making in assessment.  

Rehearse several times how critical analysis led to the SSR action plans, including how triangulation was used. I strongly recommend conducting a mock visit with an external expert. Trace the committee process from review to ratification.

A tag-team approach is very helpful. Within your faculty, you have very experienced people and less experienced people. Everybody needs to own this process and speak to it. If a question is posed to a less experienced team member and their answer is not as deep as it could have been, one of the more experienced team members can supplement it by simply speaking up: “Can I add something to that? Can I include some additional information?” The result is a synergistic effort by the entire team.

The site visit

When a site visit team comes to your program, there will be a deep, granular discussion of the submitted Appendix 14 (Self-Study Report), including process, outcomes, analysis, modifications, and plans. This is a significant part of every visit and can go on for more than two hours. The discussion will take place with the assessment team (at larger programs) or with everybody (at smaller programs).

Members of the site visit team usually meet with the Program Director, Medical Director, Principal Faculty, and members of the assessment team/committee.

This session gives the site visit team an opportunity to discuss and clarify the program's SSR and supporting materials with the program faculty, to obtain a more complete understanding of the program’s self-study process, outcomes, analysis, modifications, and plans. The focus of this session is the program’s progress and the changes that have been made.

During the discussion of the submitted Appendix 14 (Self-Study Report), not all faculty may be included. However, everyone MUST be prepared for what to expect and able to answer questions involving process, outcomes, analysis, modifications, and plans. Good preparation can mean the difference between a clean visit on the C1.02 standards and multiple C1.02 citations.

Prepare your faculty to speak the language of critical analysis. They should be able to articulate how the conclusions that resulted in action plans were reached. Use precise language: “We analyzed the trends, and this met our threshold for an Area in Need of Improvement. Therefore, we went forward with an action plan.” “In the assessment committee, we analyzed the data and then drew conclusions that resulted in the plan of action.” Reference relationships such as cause and effect or correlation between variables.

Finally, you want to portray an overall culture of data-driven decision-making. “We don’t make decisions based on opinions but on evidence.”

Next week…

We’ll conclude our blog series on dealing with citations in Standards C1.03 and C1.02 with a review and some recommendations for moving forward. I hope to see you then.

Tags: SSR site visit preparation · Faculty engagement in SSR process · Critical analysis language training · Benchmark articulation in program evaluation · Data-driven decision-making culture
Scott Massey

With over three decades of experience in PA education, Dr. Scott Massey is a recognized authority in the field. He has demonstrated his expertise as a program director at esteemed institutions such as Central Michigan University and as the research chair in the Department of PA Studies at the University of Pittsburgh. Dr. Massey's influence extends beyond practical experience, as he has contributed significantly to accreditation, assessment, and student success. His innovative methodologies have guided numerous PA programs to ARC-PA accreditation and improved program outcomes. His predictive statistical risk modeling has enabled schools to anticipate student results. Dr. Massey has published articles on predictive modeling and educational outcomes, and has conducted longitudinal research on stress among graduate health science students. His commitment to advancing the PA field is evident through participation in PAEA committees, councils, and educational initiatives.
