TY - JOUR
T1 - Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors standardized direct observation assessment tool
AU - LaMantia, Joseph
AU - Kane, Bryan
AU - Yarris, Lalena
AU - Tadros, Anthony
AU - Ward, Mary Frances
AU - Lesser, Martin
AU - Shayne, Philip
AU - Brunett, Patrick
AU - Kyriakedes, Chris
AU - Rinnert, Stephen
AU - Schmidt, Joseph
AU - Wald, David
AU - Akerman, Meredith
AU - Livote, Elayne
AU - Soohoo, David
AU - Gong, Jonathan
PY - 2009/12
Y1 - 2009/12
N2 - Objectives: Developed by the Council of Emergency Medicine Residency Directors (CORD), the standardized direct observation assessment tool (SDOT) is an evaluation instrument used to assess residents' clinical skills in the emergency department (ED). In a previous study examining the inter-rater agreement of the tool, faculty scored simulated resident-patient encounters. The objective of the present study was to evaluate the inter-rater agreement of the SDOT in real-time evaluations of residents in the ED. Methods: This was a multi-center, prospective, observational study in which faculty raters were paired to simultaneously observe and independently evaluate a resident's clinical performance using the SDOT. Data collected from eight emergency medicine (EM) residency programs produced 99 unique resident-patient encounters and reported on 26 individual behaviors related to specific core competencies, global evaluation scores for each core competency, and an overall clinical competency score. Inter-rater agreement was assessed using percentage agreement analyses with three constructs: exact agreement, liberal agreement, and binary (pass/fail) agreement. Results: Inter-rater agreement between faculty raters varied according to the category of measure used. Exact agreement ranged from poor to good, depending on the measure: the overall competency score (good), the competency score for each of the six core competencies (poor to good), and the individual item scores (fair to very good). Liberal agreement and binary agreement were excellent for the overall competency score and the competency score for each of the six core competencies, and very good to excellent for the individual item scores. Conclusions: The SDOT demonstrated excellent inter-rater agreement when analyzed with liberal agreement and when dichotomized as a pass/fail measure, and fair to good agreement for most measures with exact agreement. 
The SDOT can be useful and reliable when evaluating residents' clinical skills in the ED, particularly as it relates to marginal performance.
KW - Evaluation
KW - Inter-rater variation
KW - Reliability
KW - Training
UR - http://www.scopus.com/inward/record.url?scp=73349137942&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=73349137942&partnerID=8YFLogxK
U2 - 10.1111/j.1553-2712.2009.00593.x
DO - 10.1111/j.1553-2712.2009.00593.x
M3 - Article
C2 - 20053212
AN - SCOPUS:73349137942
SN - 1069-6563
VL - 16
SP - S51
EP - S57
JO - Academic Emergency Medicine
JF - Academic Emergency Medicine
IS - SUPPL. 2
ER -