In cooperation with the annual Computing in Cardiology conference, PhysioNet hosts a series of challenges, inviting participants to tackle clinically interesting problems that are either unsolved or not well-solved.
In complementary ways, PhysioNet and Computing in Cardiology (CinC) catalyze and support scientific communication and collaboration between basic and clinical scientists. The annual meetings of CinC are gatherings of researchers from many nations and disciplines, bridging the geographic and specialty chasms that separate understanding from practice, while PhysioNet provides on-line data and software resources that support collaborations of basic and clinical researchers throughout the year. The annual PhysioNet/CinC Challenges seek to provide stimulating yet friendly competitions, while at the same time offering specialists and non-specialists alike opportunities to make progress on significant open problems whose solutions may be of profound clinical value. The use of shared data provided via PhysioNet makes it possible for participants to work independently toward a common objective. At CinC, participants can make meaningful results-based comparisons of their methods; lively and well-informed discussions are the norm at scientific sessions dedicated to these challenges. Discovering the complementary strengths of diverse approaches, coupled with a deep understanding of the problem itself, frequently sparks new collaborations and opportunities for further study.
A new challenge topic is announced each year. For each challenge, we assemble the raw materials needed to begin work, and we post them here on PhysioNet. In a typical challenge, these raw materials consist of a collection of signals or other data to be analyzed, and sometimes a sample entry that can be used as a starting point. The required analyses are provided for a subset of the data (the "learning set") in each case, and the challenge is to analyze the remaining data (the "test set").
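As a hypothetical illustration of this learning-set/test-set structure (all record names, labels, and the placeholder classifier below are invented, not drawn from any actual challenge), a minimal entry might take the following shape: reference labels are available only for the learning set, and the entrant must produce labels for every test-set record.

```python
# Hypothetical sketch of a typical challenge layout. Record names and
# labels are invented for illustration only.

learning_set = {
    # record name -> reference label (provided with the challenge data)
    "rec001": "normal",
    "rec002": "abnormal",
    "rec003": "normal",
}

test_set = ["rec101", "rec102", "rec103"]  # reference labels withheld

def classify(record_name):
    """Placeholder classifier; a real entry would analyze the signals."""
    return "normal"

# An entry consists of one label for each test-set record.
entry = {rec: classify(rec) for rec in test_set}
print(entry)
```

A real entry would, of course, fit its method on the learning set before labeling the test set; the point here is only the shape of the inputs and outputs.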
Each challenge begins when the announcement is posted here, and ends in August or early September, shortly before the Computing in Cardiology conference. An important milestone for participants is the deadline for submitting abstracts for Computing in Cardiology, which is usually 1 May each year. Those wishing to qualify as official entrants, with eligibility for awards, must submit an acceptable abstract describing their work as well as an entry for scoring by this deadline. A limited number of revised entries may be submitted between 1 May and the final challenge deadline, which varies from year to year.
Challenges are open to all. Beginning in 2010, participants enter by joining a PhysioNetWorks project that allows them to submit entries and receive scores via this web site. Instructions and details, which vary from year to year, are on the home page for each challenge.
(You are also encouraged to work on challenges from previous years. In most cases, the solutions have been posted and you can score your own entries. If the solutions have been withheld, as in a few cases in which follow-up studies are planned or ongoing, the challenge home pages have information about how to obtain scores.)
When the structure of the challenge permits, the top scores (including those of both official and unofficial entrants) are posted anonymously during the challenge period. The top-scoring eligible participant in each challenge event receives an award at that year's Computing in Cardiology conference.
After the final challenge deadline, we post the names of the top scorers, their scores, the number of entries they submitted in order to achieve their scores, and (for the official entrants) the papers they submitted to Computing in Cardiology in order to qualify. Many of the Challenges have open-source events, in which participants submit their entries as code to be tested using data they have not seen; the most successful of these entries are also posted after the challenge concludes, as a basis for follow-up studies.
By presenting these challenges, we aim to stimulate work on important clinical problems and to foster rapid progress towards their solution. Collaborations among those who have developed complementary approaches to challenge problems are easily established. We consider it especially significant that many of those who have participated in these challenges would not otherwise have had access to the data needed to study these topics. By bringing with them the insights and methods they have acquired from their own areas of expertise, these researchers enrich our fields of interest. We look forward to future challenges, and invite you to join in!
What will be the topic of the next challenge? It might be image analysis, or simulation, or forecasting.... An ideal challenge problem is interesting, clinically important, and possible to study using available materials that have not been widely circulated previously. Moreover, there must be an objective way to evaluate the quality of a challenge entry (for an analysis problem, this usually means there must be a known set of correct analyses of the data, i.e., a "gold standard" against which entries can be compared).
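To make the "gold standard" idea concrete, here is a hypothetical scoring sketch (the record names, labels, and the simple fraction-correct metric are invented for illustration; each real challenge defines its own scoring rules). The organizers hold the reference labels for the test set and score an entry by comparing it against them:

```python
# Hypothetical gold-standard scoring: compare an entry's labels against
# withheld reference labels. All data and the metric are illustrative.

reference = {"rec101": "normal", "rec102": "abnormal", "rec103": "normal"}
entry     = {"rec101": "normal", "rec102": "normal",   "rec103": "normal"}

# Count records the entry labeled in agreement with the reference.
correct = sum(entry[rec] == reference[rec] for rec in reference)
score = correct / len(reference)
print(f"score: {score:.2f}")
```

Whatever the metric, the essential requirement is the same: a trusted set of correct answers, prepared in advance, so that every entry can be evaluated objectively and identically.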
What should be the topic of next year's challenge? Do you have a data set that can help in creating a challenge? Please send us your ideas!
If you would like help understanding the content of this page, or using and downloading data/software, please see our Frequently Asked Questions.
If you have any comments, feedback, or particular questions regarding this page or our website, please send them to the webmaster.
Updated Tuesday, 30 August 2016 at 22:56 CEST