TY - JOUR
T1 - Development and Pilot Testing of a Programmatic System for Competency Assessment in US Anesthesiology Residency Training
AU - Woodworth, Glenn E.
AU - Goldstein, Zachary T.
AU - Ambardekar, Aditee P.
AU - Arthur, Mary E.
AU - Bailey, Caryl F.
AU - Booth, Gregory J.
AU - Carney, Patricia A.
AU - Chen, Fei
AU - Duncan, Michael J.
AU - Fromer, Ilana R.
AU - Hallman, Matthew R.
AU - Hoang, Thomas
AU - Isaak, Robert
AU - Klesius, Lisa L.
AU - Ladlie, Beth L.
AU - Mitchell, Sally Ann
AU - Miller Juve, Amy K.
AU - Mitchell, John D.
AU - McGrath, Brian J.
AU - Shepler, John A.
AU - Sims, Charles R.
AU - Spofford, Christina M.
AU - Tanaka, Pedro P.
AU - Maniker, Robert B.
N1 - Publisher Copyright:
© 2024 Lippincott Williams and Wilkins. All rights reserved.
PY - 2024/5/1
Y1 - 2024/5/1
N2 - BACKGROUND: In 2018, a set of entrustable professional activities (EPAs) and procedural skills assessments were developed for anesthesiology training, but they did not assess all the Accreditation Council for Graduate Medical Education (ACGME) milestones. The aims of this study were to (1) remap the 2018 EPA and procedural skills assessments to the revised ACGME Anesthesiology Milestones 2.0, (2) develop new assessments that, combined with the original assessments, create a system of assessment that addresses all level 1 to 4 milestones, and (3) provide evidence for the validity of the assessments. METHODS: Using a modified Delphi process, a panel of anesthesiology education experts remapped the original assessments developed in 2018 to the Anesthesiology Milestones 2.0 and developed new assessments to create a system that assessed all level 1 through 4 milestones. Following a 24-month pilot at 7 institutions, the number of EPA and procedural skill assessments and mean scores were computed at the end of the academic year. Milestone achievement and subcompetency data for assessments from a single institution were compared to scores assigned by the institution's clinical competency committee (CCC). RESULTS: New assessment development, 2 months of testing and feedback, and revisions resulted in 5 new EPAs, 11 nontechnical skills assessments (NTSAs), and 6 objective structured clinical examinations (OSCEs). Combined with the original 20 EPAs and procedural skills assessments, the new system of assessment addresses 99% of level 1 to 4 Anesthesiology Milestones 2.0. During the 24-month pilot, aggregate mean EPA and procedural skill scores significantly increased with year in training. System subcompetency scores correlated significantly with 15 of 23 (65.2%) corresponding CCC scores at a single institution, but 8 correlations (36.4%) were <0.30, illustrating poor correlation. CONCLUSIONS: A panel of experts developed a set of EPAs, procedural skill assessments, NTSAs, and OSCEs to form a programmatic system of assessment for anesthesiology residency training in the United States. The method used to develop and pilot test the assessments, the progression of assessment scores with time in training, and the correlation of assessment scores with CCC scoring of milestone achievement provide evidence for the validity of the assessments.
AB - BACKGROUND: In 2018, a set of entrustable professional activities (EPAs) and procedural skills assessments were developed for anesthesiology training, but they did not assess all the Accreditation Council for Graduate Medical Education (ACGME) milestones. The aims of this study were to (1) remap the 2018 EPA and procedural skills assessments to the revised ACGME Anesthesiology Milestones 2.0, (2) develop new assessments that, combined with the original assessments, create a system of assessment that addresses all level 1 to 4 milestones, and (3) provide evidence for the validity of the assessments. METHODS: Using a modified Delphi process, a panel of anesthesiology education experts remapped the original assessments developed in 2018 to the Anesthesiology Milestones 2.0 and developed new assessments to create a system that assessed all level 1 through 4 milestones. Following a 24-month pilot at 7 institutions, the number of EPA and procedural skill assessments and mean scores were computed at the end of the academic year. Milestone achievement and subcompetency data for assessments from a single institution were compared to scores assigned by the institution's clinical competency committee (CCC). RESULTS: New assessment development, 2 months of testing and feedback, and revisions resulted in 5 new EPAs, 11 nontechnical skills assessments (NTSAs), and 6 objective structured clinical examinations (OSCEs). Combined with the original 20 EPAs and procedural skills assessments, the new system of assessment addresses 99% of level 1 to 4 Anesthesiology Milestones 2.0. During the 24-month pilot, aggregate mean EPA and procedural skill scores significantly increased with year in training. System subcompetency scores correlated significantly with 15 of 23 (65.2%) corresponding CCC scores at a single institution, but 8 correlations (36.4%) were <0.30, illustrating poor correlation. CONCLUSIONS: A panel of experts developed a set of EPAs, procedural skill assessments, NTSAs, and OSCEs to form a programmatic system of assessment for anesthesiology residency training in the United States. The method used to develop and pilot test the assessments, the progression of assessment scores with time in training, and the correlation of assessment scores with CCC scoring of milestone achievement provide evidence for the validity of the assessments.
UR - http://www.scopus.com/inward/record.url?scp=85190737727&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85190737727&partnerID=8YFLogxK
U2 - 10.1213/ANE.0000000000006667
DO - 10.1213/ANE.0000000000006667
M3 - Article
C2 - 37801598
AN - SCOPUS:85190737727
SN - 0003-2999
VL - 138
SP - 1081
EP - 1093
JO - Anesthesia and Analgesia
JF - Anesthesia and Analgesia
IS - 5
ER -