SCOTT'S THOUGHTS

Finding Solutions to Common ARC-PA Citations – Volume 12

March 27, 2024 · 4 min read

Standard B4.01a-b

Welcome back to my weekly blog. Today, I continue my series on the most frequently cited ARC-PA 5th Edition Standards in Self-Study Reports by looking at Standard B4.01a-b. Let’s get started!

According to this Standard, the program must conduct frequent, objective, and documented evaluations of student performance in meeting the program’s learning outcomes and instructional objectives for both didactic and supervised clinical practice experience components.

The evaluations must:

a) show alignment between what is expected and what is taught. Oftentimes the observation or citation says the program “was not able to describe how the methods of assessment aligned with learning outcomes.” In other words, the alignment between these two factors has to be so clear that a reviewer cannot miss it.

b) allow the program to identify and address any student deficiencies in a timely manner.

What are the nuances of compliance? B4.01a requires us to align what is expected with what is taught. In practice, this means learning outcomes and assessment modalities must be perfectly crosswalked. I will use this terminology frequently because I believe consistency in language and terminology is one of the ways you can mitigate this issue. 

Then, B4.01b asks that the program be able to identify and address any student deficiencies in a timely manner. As I have stated, compliance with this standard is inextricably connected to B4.01a. Learning outcomes and assessments must be crosswalked to measure any student deficiencies, which requires each aspect of the learning outcomes to be measurable. This is why learning outcomes that contain complex content and therefore cannot be fully measured are deemed noncompliant.
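To make the crosswalk idea concrete, here is a minimal sketch in Python. The learning outcomes and assessment items are invented for illustration; they are not drawn from any program’s materials or from the Standards. The point is simply that every outcome should map to at least one assessment, and anything left unmapped is exactly the kind of gap a B4.01a citation describes.

```python
# A minimal, hypothetical crosswalk sketch. The outcome and assessment names
# below are invented for illustration; substitute your program's own outcomes
# and assessment modalities.

# Each learning outcome maps to the assessment items that measure it.
crosswalk = {
    "LO-1: Perform a focused history and physical exam": [
        "Preceptor evaluation item 3",
        "End-of-rotation OSCE station 2",
    ],
    "LO-2: Perform lumbar puncture under supervision": [],  # not yet assessed anywhere
    "LO-3: Interpret common ECG abnormalities": [
        "End-of-rotation exam items 14-20",
    ],
}

def unassessed_outcomes(crosswalk):
    """Return learning outcomes that no assessment item measures (a B4.01a-style gap)."""
    return [outcome for outcome, assessments in crosswalk.items() if not assessments]

if __name__ == "__main__":
    for outcome in unassessed_outcomes(crosswalk):
        print(f"Alignment gap: '{outcome}' is not crosswalked to any assessment")
```

However you keep the crosswalk (spreadsheet, table in the SSR, or a small script like this), the discipline is the same: the outcome language and the assessment language must match item for item.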

Why are citations so common here?

1. There must be perfect alignment between the learning outcomes for SCPE and the assessment modalities. And this does indeed mean “perfect”: if your learning outcomes are skills a, b, c, d, and e, your assessment must specifically assess skills a, b, c, d, and e. 

2. Preceptor evaluations and the learning outcomes must be crosswalked to demonstrate assessment of all learning outcomes. Think of it this way: every learning outcome must be demonstrated and measurable. 

3. The learning outcomes must be measurable without complex content. This aspect ties in with the previous one. (Note: in next week’s blog, I’ll show you an example of altering your assessments to eliminate complex content as well as create alignment.)

4. Programs must triangulate assessment between learning outcomes, instructional objectives, and assessment modalities. 

5. All learning outcomes must be met. If a student does not achieve a learning outcome (for whatever reason, such as not having the opportunity or scoring below the program’s benchmark for success), a gap analysis must be included to ensure the student meets competency prior to graduation. I will explain how to conduct a gap analysis in upcoming blogs; a minimal sketch of the idea follows this list.
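Here is the promised sketch of a gap analysis, again in Python with invented students, outcomes, ratings, and benchmark; your program’s own evaluation data and benchmark would replace these. It flags every student/outcome pair that either was never assessed or fell below the benchmark, so each gap can be documented and remediated before graduation.

```python
# A minimal, hypothetical gap-analysis sketch. Student names, outcome IDs, and the
# 3.0 benchmark are invented for illustration; substitute your program's own
# outcomes, evaluation data, and benchmark for success.

BENCHMARK = 3.0  # assumed minimum rating for "outcome met"

# Ratings per student per learning outcome; None means the student never had
# the opportunity to be evaluated on that outcome.
evaluations = {
    "Student A": {"LO-1": 3.5, "LO-2": 2.5, "LO-3": 3.2},
    "Student B": {"LO-1": 3.8, "LO-2": None, "LO-3": 3.0},
}

def gap_analysis(evaluations, benchmark=BENCHMARK):
    """List (student, outcome, reason) tuples that need remediation before graduation."""
    gaps = []
    for student, ratings in evaluations.items():
        for outcome, rating in ratings.items():
            if rating is None:
                gaps.append((student, outcome, "no opportunity to be assessed"))
            elif rating < benchmark:
                gaps.append((student, outcome, f"rated {rating}, below benchmark {benchmark}"))
    return gaps

if __name__ == "__main__":
    for student, outcome, reason in gap_analysis(evaluations):
        print(f"{student}: {outcome} -> {reason}; document a remediation plan")
```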

An example citation will help us see where problems arise. The commission’s language appears in the Findings and Comments below; my commentary is in parentheses.

Example.

Standard B4.01a: The program must conduct frequent, objective and documented evaluations of student performance in meeting the program’s learning outcomes and instructional objectives for both didactic and supervised clinical practice experience components. The evaluations must align with what is expected and taught. 

Findings: The program did not provide evidence it conducted evaluations of students in the supervised clinical practice experience (SCPE) curriculum with clear alignment between what is expected, taught, and assessed for program-required clinical/technical skills and procedures. (This is probably more common than anything else because, intuitively, we do not like to be very specific about the procedures we expect of our students. But at the end of the day, you must specify them. One way to approach this is to work from a shorter list of required procedures.)

Comments: The program described its process of frequent, objective, and documented evaluations of students in the curriculum. Within the supervised clinical practice experience (SCPE) syllabi and preceptor evaluation forms provided in the application (appendices 13 and 17) and reviewed at the time of the site visit, the program did not describe how the clinical and technical skills component of the program’s learning outcomes are evaluated.

• While the preceptor evaluation forms were aligned with the learning outcomes, there were no defined learning outcomes related to the performance of technical skills or assessment of expected technical skills for each rotation. Therefore, without program-defined learning outcomes, the program did not provide evidence that it assessed student performance of technical skills.

• For example, within the Emergency Medicine SCPE syllabus, the program identified learning outcomes that required the students to identify, clearly communicate, and discuss procedures common to the management of Emergency Medicine patients: describe the principles of, role, and clinical value of common procedures for a list of SCPE-specific procedures (such as throat culture/swab, nasal culture/swab, visual/auditory testing, central venous access and removal, arterial line access and removal, paracentesis, lumbar puncture, joint aspiration and injection, and thoracentesis).

• However, there is no indication of expectations or assessment of performance of technical skills and procedures in the SCPE components of the program. (In this case, the technical skills were listed in the learning outcomes but not in the preceptor evaluation form; a quick cross-check like the sketch below catches exactly this.)
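If you keep your syllabus procedure list and your preceptor-evaluation skill items in any machine-readable form, you can catch this mismatch before the site visit does. Here is a minimal Python sketch; the procedure names come from the example above, but the lists themselves and the checking approach are my own illustration, not an ARC-PA tool or any program’s actual data.

```python
# A minimal, hypothetical cross-check of a syllabus procedure list against the
# skills that actually appear on the preceptor evaluation form. The contents of
# both sets are illustrative only.

syllabus_procedures = {
    "throat culture/swab", "nasal culture/swab", "lumbar puncture",
    "joint aspiration and injection", "thoracentesis",
}

preceptor_form_items = {
    "throat culture/swab", "nasal culture/swab", "joint aspiration and injection",
}

# Procedures expected in the syllabus but never assessed on the preceptor form:
# exactly the mismatch described in the citation above.
for procedure in sorted(syllabus_procedures - preceptor_form_items):
    print(f"Listed in syllabus but absent from preceptor evaluation form: {procedure}")
```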

In my next blog…

We’ll dig into the concept of “complex content” and discuss ways to simplify your SSRs so that there is no question of measurability. See you then!

Tags: ARC-PA 5th Edition Standards, Standard B4.01a-b, Self-Study Reports (SSRs), Student performance evaluations, Learning outcomes, Instructional objectives, Alignment, Assessment modalities, Deficiencies, Preceptor evaluations, Measurability, Gap analysis, Technical skills, Complex content, Supervised clinical practice experience (SCPE), Site visit, Compliance, Citation analysis, Program evaluation, Educational assessment