TY - JOUR
T1 - State-of-the-art in biomedical literature retrieval for clinical cases
T2 - a survey of the TREC 2014 CDS track
AU - Roberts, Kirk
AU - Simpson, Matthew
AU - Demner-Fushman, Dina
AU - Voorhees, Ellen
AU - Hersh, William
N1 - Funding Information:
Kirk Roberts, Matthew Simpson, and Dina Demner-Fushman were supported by the intramural research program at the U.S. National Library of Medicine, National Institutes of Health. The authors would also like to thank the following participants for providing feedback and clarifications: Raymond Wan, Paul McNamee, Jean Garcia-Gathright, Joao Palotti, Eva D'hondt, Dawit Girmay, Afshin Deroie, Sungbin Choi, Luca Soldaini, Joe McCarthy, and Yi-Shu Wei.
Publisher Copyright:
© 2015, Springer Science+Business Media New York (outside the USA).
PY - 2016/4/1
Y1 - 2016/4/1
AB - Providing access to relevant biomedical literature in a clinical setting has the potential to bridge a critical gap in evidence-based medicine. Here, our goal is specifically to provide relevant articles to clinicians to improve their decision-making in diagnosing, treating, and testing patients. To this end, the TREC 2014 Clinical Decision Support Track evaluated a system’s ability to retrieve relevant articles in one of three categories (Diagnosis, Treatment, Test) using an idealized form of a patient medical record. Over 100 submissions from over 25 participants were evaluated on 30 topics, resulting in over 37k relevance judgments. In this article, we provide an overview of the task, a survey of the information retrieval methods employed by the participants, an analysis of the results, and a discussion on the future directions for this challenging yet important task.
KW - Biomedical information retrieval
KW - Clinical decision support
KW - Information retrieval evaluation
UR - http://www.scopus.com/inward/record.url?scp=84956652690&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84956652690&partnerID=8YFLogxK
U2 - 10.1007/s10791-015-9259-x
DO - 10.1007/s10791-015-9259-x
M3 - Article
AN - SCOPUS:84956652690
SN - 1386-4564
VL - 19
SP - 113
EP - 148
JO - Information Retrieval
JF - Information Retrieval
IS - 1-2
ER -