TY - JOUR
T1 - DREAMTools
T2 - A Python package for scoring collaborative challenges
AU - Cokelaer, Thomas
AU - Bansal, Mukesh
AU - Bare, Christopher
AU - Bilal, Erhan
AU - Bot, Brian M.
AU - Chaibub Neto, Elias
AU - Eduati, Federica
AU - de la Fuente, Alberto
AU - Gönen, Mehmet
AU - Hill, Steven M.
AU - Hoff, Bruce
AU - Karr, Jonathan R.
AU - Küffner, Robert
AU - Menden, Michael P.
AU - Meyer, Pablo
AU - Norel, Raquel
AU - Pratap, Abhishek
AU - Prill, Robert J.
AU - Weirauch, Matthew T.
AU - Costello, James C.
AU - Stolovitzky, Gustavo
AU - Saez-Rodriguez, Julio
N1 - Publisher Copyright:
© 2016 Cokelaer T et al.
PY - 2016
Y1 - 2016
AB - DREAM challenges are community competitions designed to advance computational methods and address fundamental questions in systems biology and translational medicine. Each challenge asks participants to develop and apply computational methods either to predict unobserved outcomes or to identify unknown model parameters given a set of training data. Computational methods are evaluated using an automated scoring metric, scores are posted to a public leaderboard, and methods are published to facilitate community discussions on how to build improved methods. By engaging participants from a wide range of science and engineering backgrounds, DREAM challenges can comparatively evaluate a wide range of statistical, machine learning, and biophysical methods. Here, we describe DREAMTools, a Python package for evaluating DREAM challenge scoring metrics. DREAMTools provides a command line interface that enables researchers to test new methods on past challenges, as well as a framework for scoring new challenges. As of March 2016, DREAMTools includes more than 80% of completed DREAM challenges. DREAMTools complements the data, metadata, and software tools available at the DREAM website http://dreamchallenges.org and on the Synapse platform at https://www.synapse.org. Availability: DREAMTools is a Python package. Releases and documentation are available at http://pypi.python.org/pypi/dreamtools. The source code is available at http://github.com/dreamtools/dreamtools.
UR - http://www.scopus.com/inward/record.url?scp=84969591048&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84969591048&partnerID=8YFLogxK
U2 - 10.12688/f1000research.7118.2
DO - 10.12688/f1000research.7118.2
M3 - Article
AN - SCOPUS:84969591048
SN - 2046-1402
VL - 4
JO - F1000Research
JF - F1000Research
M1 - 1030
ER -