TY - GEN
T1 - Automated Anatomical Feature Detection for Completeness of Abdominal FAST Exam
AU - Lee, Hyeon Woo
AU - Zahiri, Mohsen
AU - Ghoshal, Goutam
AU - Schmidt, Stephen
AU - Schnittke, Nikolai
AU - Hicks, Bryson
AU - Kaili, Matt
AU - Gregory, Cynthia
AU - Feuerherdt, Magdelyn
AU - Thomas, Caelan
AU - Zhang, Yuan
AU - Hibbs, Katlyn
AU - Sreenivasan, Aishwarya
AU - Shupp, Jeffrey W.
AU - Rizzo, Julie
AU - Gregory, Kenton
AU - Raju, Balasundar
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - The Focused Assessment with Sonography in Trauma (FAST) exam is a crucial tool for swiftly identifying intraperitoneal hemorrhage in trauma patients. Accurate interpretation of FAST results relies on clinicians' capacity to thoroughly visualize regions corresponding to potential fluid accumulation across three abdominal zones: the right upper quadrant, left upper quadrant, and suprapubic zones. To ensure comprehensive coverage of these zones, it is imperative to visualize the essential organs within them. Automating the identification of key organs can guide all users in capturing complete zones and enhance diagnostic precision, particularly for less-experienced practitioners. In this study, we propose a deep learning-based approach for both classifying zones and localizing key organs during abdominal FAST examinations. We introduce two distinct methods for zone classification and organ detection. First, we build a mobile classification network for processing multi-frame inputs. For organ detection, we employ a single-stage detector to identify key anatomical features in 2D frames. Finally, we report that combining the outputs from these two approaches yields a model with improved diagnostic accuracy.
AB - The Focused Assessment with Sonography in Trauma (FAST) exam is a crucial tool for swiftly identifying intraperitoneal hemorrhage in trauma patients. Accurate interpretation of FAST results relies on clinicians' capacity to thoroughly visualize regions corresponding to potential fluid accumulation across three abdominal zones: the right upper quadrant, left upper quadrant, and suprapubic zones. To ensure comprehensive coverage of these zones, it is imperative to visualize the essential organs within them. Automating the identification of key organs can guide all users in capturing complete zones and enhance diagnostic precision, particularly for less-experienced practitioners. In this study, we propose a deep learning-based approach for both classifying zones and localizing key organs during abdominal FAST examinations. We introduce two distinct methods for zone classification and organ detection. First, we build a mobile classification network for processing multi-frame inputs. For organ detection, we employ a single-stage detector to identify key anatomical features in 2D frames. Finally, we report that combining the outputs from these two approaches yields a model with improved diagnostic accuracy.
KW - Computer Vision
KW - Deep Learning
KW - FAST Exam
KW - Trauma
KW - Ultrasound Imaging Analysis
UR - http://www.scopus.com/inward/record.url?scp=85178619384&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85178619384&partnerID=8YFLogxK
U2 - 10.1109/IUS51837.2023.10306598
DO - 10.1109/IUS51837.2023.10306598
M3 - Conference contribution
AN - SCOPUS:85178619384
T3 - IEEE International Ultrasonics Symposium, IUS
BT - IUS 2023 - IEEE International Ultrasonics Symposium, Proceedings
PB - IEEE Computer Society
T2 - 2023 IEEE International Ultrasonics Symposium, IUS 2023
Y2 - 3 September 2023 through 8 September 2023
ER -