Towards a Shared Typology of Ethics Cases: Inter-Rater Reliability and Consensus Testing of the Armstrong Clinical Ethics Coding System (ACECS)
Friday, October 13, 2023
5:00 PM – 6:15 PM ET
Location: Essex AB (Fourth Floor)
As clinical ethics consultation becomes more common across hospital systems, ethics programs seeking to standardize data gathering, and thereby allow comparison within and across institutions, increasingly look to objective and semi-objective tools such as the Armstrong Clinical Ethics Coding System (ACECS). While there is growing awareness of the need for descriptive and analytic assessment, the field has not yet endorsed a uniform typology for categorizing ethics consultations. At least 27 distinct typologies currently exist; of these, ACECS is seeing increasing use, including at our own institution. Accordingly, we limited the present analysis exclusively to the ACECS typology.
Despite this proliferation of typologies, there has been no formal effort to establish the reliability of the ACECS codes. Our research uses accepted statistical measures, including Fleiss's kappa, to determine the inter-rater reliability of several frequently used ACECS content codes by analyzing how multiple ethicists classify hypothetical ethics consultations. Moreover, because of the large number of ACECS codes, this work also applies consensus analysis to measure reliability across all 138 content codes by examining participants' agreement on code definitions.
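For readers unfamiliar with the statistic, the following is a minimal sketch of how Fleiss's kappa is computed from a subjects-by-categories count matrix. The ratings shown are hypothetical (three raters coding four consultations against three candidate codes) and are not drawn from the study data.

```python
from typing import List

def fleiss_kappa(counts: List[List[int]]) -> float:
    """Fleiss's kappa for a subjects-by-categories count matrix.

    Each row is one subject (e.g., one consultation); each cell is the
    number of raters who assigned that category. Every row must sum to
    the same number of raters.
    """
    N = len(counts)            # number of subjects
    n = sum(counts[0])         # raters per subject
    k = len(counts[0])         # number of categories
    total = N * n

    # Mean per-subject agreement: P_i = (sum_j n_ij^2 - n) / (n(n-1))
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement from marginal category proportions: P_e = sum_j p_j^2
    p = [sum(row[j] for row in counts) / total for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)

# Hypothetical ratings: 4 consultations, 3 raters, 3 candidate codes.
ratings = [
    [3, 0, 0],   # all three raters chose code 1
    [0, 3, 0],   # all three chose code 2
    [1, 2, 0],   # split between codes 1 and 2
    [0, 1, 2],   # split between codes 2 and 3
]
print(round(fleiss_kappa(ratings), 3))  # → 0.455
```

Kappa corrects the raw proportion of rater agreement for the agreement expected by chance given each code's marginal frequency, so a value near 0 indicates chance-level agreement and 1 indicates perfect agreement.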
As the field of clinical ethics continues to standardize, this research will have significant implications for improving coding systems like ACECS and others yet to be developed.
Dave Reis – Ethics Research Team Lead, Ethics Program, Wellstar Health System; Lexi White – Ethics Program, Wellstar Health System