SCOTT'S THOUGHTS
Welcome back! Our last blog introduced another frequently cited Standard, A2.09d. This time, the issue is the program director’s knowledge and their ability to articulate that knowledge. Specifically, can the program director articulate the program’s assessment process?
Looking at the language used by the committee will help us understand where deficits arise.
I’d like to share a few examples of citation language to demonstrate how the committee perceived the program director’s language and responses as lacking. I have seen this language play out in a multitude of reports.
“The program director did not demonstrate knowledge of program continuous review and analysis . . . At the site visit, the program director did not consistently articulate critical analysis of the data leading to its conclusions and actions. The program director also did not articulate or document thresholds for identifying strengths, areas in need of improvement, or need for action plans.”
“Data and analysis discussed in meetings but did not address the observation to provide evidence that the program director was knowledgeable about and responsible for continuous programmatic review and analysis.”
“The program director did not demonstrate knowledge of program continuous review and analysis . . . At the time of the site visit, the program director and faculty did not consistently articulate critical analysis of the data leading to its conclusions and actions.”
“The program director was not knowledgeable about and responsible for continuous programmatic review and analysis.”
“The implementation of the program’s self-assessment process reviewed in the self-study report (SSR) and discussed onsite included substantive deficits in the program’s analysis of quantitative and qualitative data, identification of conclusions and linkage.”
These observations and conclusions get straight to the heart of the matter: Standard A2.09d requires a program director to be “knowledgeable about” and “responsible for” continuous program review and analysis and, ultimately, to speak to the big picture in the language the committee is looking for. This really does come down to the program director’s ability to articulate and describe the process.
So, how can a program director ensure that they are prepared to do so?
To make compliance easier from the start, look at your program’s assessment with these points in mind:
The program director needs to be the architect and driver of the programmatic self-assessment process. This includes direct oversight of the drafting of the SSR. This doesn’t mean the PD must be in every meeting or act as chair of the assessment committee, but they must know and understand the process. Perhaps they are even the ones who developed the process originally.
Senior institutional officials should mentor the program director and provide institutional assessment/research support, as well as support for PAEA/ARC-PA workshops. The entire responsibility cannot always fall on the PA program. This is a complex and difficult process with many moving parts, and assessment requirements are now both a deep dive and a big lift. Without institutional support, it’s very difficult for anyone to become truly skilled at articulating these things.
Ensure that benchmarks have been carefully defined and can be articulated clearly. Benchmarks, as we have discussed, are the basis for everything and remain a clear way to determine whether things are going according to plan or changes need to be made.
The program director needs to be the “go-to resource” regarding the construction and implementation of the assessment process. Oftentimes, the committee will look to the PD to see if they can provide a good description of the process at the very beginning of the interview. Their ability to respond (or lack thereof) can color the remainder of the assessment.
Assessment leadership must be experts in assessment, not just the people who organize the meetings. This job should not fall to just one person, which is why I strongly recommend creating a team where each member is aware of the big picture but specializes in a particular area. Each member is involved with the critical analysis component and can speak to it.
If you’re at a smaller program with only six or seven faculty members, the assessment process can be designed so that two or three individuals make up the assessment leadership.
The PD needs to be a member of the assessment leadership and expert enough to describe the process. Tracing your assessment process as a conceptual framework will be helpful because it allows you to visualize what’s happening at the program level.
Everyone on the assessment leadership team must be able to interject, as in a tag-team approach. Having individuals assigned to each component so they can discuss it in depth works very well. This includes articulating how critical analysis is integral to the program’s decision-making in assessment.
Before a site visit, I strongly recommend practice and rehearsal to be certain that assessment leadership can articulate how critical analysis led to the SSR action plans, including how triangulation was used. Doing a mock visit with an external expert is an excellent way to prepare.
We’ll conclude our examination of A2.09d with my recommendations for ensuring your assessment leadership team can articulate data-driven decision-making. Then, we’ll take a closer look at how the dynamics of an SSR review and site visit compare with our real-life experience.