ISBI 2017: Call for Challenge Proposals

June 22, 2016

Kitware is proud to continue its promotion and support of public medical image analysis challenges that accelerate the pace of research and collaboration. Following our successful work helping to organize grand challenges at previous conferences, we are now announcing our participation in organizing challenges for ISBI 2017.

The International Symposium on Biomedical Imaging 2017 (ISBI 2017) conference is soliciting proposals for scientific challenges. The goal of a challenge is to accelerate the pace of research on a demanding academic and clinical problem by fostering collaboration, communication, and understanding via the quantitative comparison of competing methods using standardized data sets.

When submitting a challenge proposal, the organizing team, the challenge and workshop schedules, and the challenge topic and representative data are critical. Each challenge needs an organizing team responsible for providing training and test data, defining the tasks, specifying the metrics used to measure performance, managing entries, and organizing the on-site workshop presentations. Challenges will have half-day workshops at the conference. We encourage on-site evaluations and scientific presentations during these workshops, but the operation of a challenge and the agenda for each workshop are determined by the challenge organizers.

We encourage new challenge topics as well as topics that were addressed during previous challenges. New challenge topics could introduce new imaging devices, new biomedical applications, or existing applications that would benefit from the focused attention of the medical imaging research community. Challenges that revisit previous topics could, for example, repeat a challenge to track how the field has advanced, address bottlenecks in existing processing pipelines, process larger datasets, analyze specific sub-populations, or explore the utility of particular technologies such as deep learning.

We also welcome proposals to contribute challenge data. Such proposals will be vetted with challenge organizers to increase data diversity, provide new insights, or increase the statistical significance of challenge results. As with challenge topic proposals, challenge data proposals can address new or existing challenge topics.

Important Challenge Dates

  • Challenge Proposal Deadline: Rolling acceptance up to September 15, 2016
  • Notification of Acceptance: Rolling; proposals are reviewed within one week of submission
  • Challenge Announcements on ISBI Website: October 1, 2016
  • Challenge Half-Day Workshops at ISBI: April 18, 2017

How to Submit a Challenge Proposal

To propose a challenge, please submit a PDF document to the ISBI 2017 Challenge Co-Chairs.

That document should address the following topics, which match the review criteria:

  • Relevance: Does the challenge address an appropriate and relevant task? A problem that only a few groups are addressing may not (yet) be a good idea for a competition, as there are simply not enough potential participants. A problem that is more or less solved is also not a good idea. Challenges ideally address an important open problem for which some solutions are available, so that the time is right for a fair and direct comparison of different approaches.
  • Data Quality: It is important to provide enough data, and this data should contain enough variability to be representative of the problem. In general, it is preferable to have data from different scanners or devices, obtained with the different protocols or workflows used in clinical practice worldwide, and from different institutions or populations. A typical limitation of published papers is that the proposed algorithm is evaluated on data from only a single site; a good challenge does not have this limitation. We will organize a Call for Data for accepted challenges and encourage you to set up ways to include data donated by the community in your challenge.
  • Training Data: Prospective challenge participants often appreciate the availability of ample training data, so provide this data if possible. Of course, training data should be representative of the test data. Please indicate in the rules of your challenge how participants should use this training data, e.g., whether they may also use their own training data, or whether there are different tracks depending on which training data was used. Here, too, you may include data provided by the community through our Call for Data.
  • Test Data: Make sure you properly separate training and test data. It is preferred to include some test data from protocols, scanners, or institutions that are not represented in the training set; a minimal sketch of such a split appears after this list.
  • Reference Standard: The methods of defining the reference standard and of evaluating algorithm results must be clearly defined and generally agreeable to the academic community. The challenge is unlikely to attract interest from serious contenders if these methods are poorly considered or open to question. For a paper describing procedures to define reference standards, see http://www.hal.inserm.fr/file/index/docid/185431/filename/Jannin_Manuscript-revised3IJCARS2007.pdf
  • Participants: The details of the challenge should be well publicized and advertised in the relevant circles in order to attract a reasonable number of participants, without which the final results will be of less interest. Collecting all prior work relevant to the task at hand and personally inviting its authors has been shown to be a good procedure. In your challenge proposal, please include an overview of key papers relevant to your challenge. A list of prospective participants is also appreciated, or otherwise an estimate of the number of expected participants. Include your plans for attracting participants in your proposal.
  • Organizing Team: Diversity in the team of challenge organizers is generally recommended. It is good to include people from different backgrounds with experience in the field, such as researchers from several academic institutions as well as from industry who have worked on a variety of projects related to the topic of interest. This will ensure access to a larger pool of data and contacts, as well as a balanced set of opinions on how to define the reference standards and evaluate algorithm performance.
  • Website: A good challenge should not end with a workshop at a conference. Make sure you include a high-quality website with your challenge, set up so that new submissions can be processed quickly and efficiently for many years to come. Using a framework for hosting your challenge and/or your data is recommended. Suggestions are provided here.
  • Open Access: A challenge is more attractive to the field if the data and the algorithms are, or will be made, publicly available, including after the challenge event. Please include a paragraph in your proposal on how you will handle the availability of data and algorithms.
  • Publications: A high-profile overview paper on the results of your challenge is a good way to inform the community about your comparative study. Include your plans on this topic.
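
To make the test-data point above concrete, here is a minimal sketch, in plain Python using only the standard library, of one way to hold out entire acquisition sites for testing so that test cases come from institutions not represented in the training set. The case identifiers and site names are hypothetical placeholders; treat this as an illustration of the idea rather than a prescribed procedure.

    import random
    from collections import defaultdict

    # Hypothetical case list: (case_id, acquisition_site) pairs. In a real
    # challenge these would come from the collected multi-site dataset.
    cases = [
        ("case001", "site_A"), ("case002", "site_A"),
        ("case003", "site_B"), ("case004", "site_B"),
        ("case005", "site_C"), ("case006", "site_C"),
    ]

    def split_with_held_out_sites(cases, n_held_out_sites=1, seed=0):
        # Group case identifiers by acquisition site.
        by_site = defaultdict(list)
        for case_id, site in cases:
            by_site[site].append(case_id)

        # Randomly pick sites whose cases go entirely to the test set.
        sites = sorted(by_site)
        random.Random(seed).shuffle(sites)
        held_out = set(sites[:n_held_out_sites])

        # Every case from a held-out site lands in the test set.
        train = [c for c, s in cases if s not in held_out]
        test = [c for c, s in cases if s in held_out]
        return train, test, held_out

    train_ids, test_ids, held_out = split_with_held_out_sites(cases)
    print("held-out site(s):", sorted(held_out))
    print("training cases:", train_ids)
    print("test cases:", test_ids)

Holding out whole sites, rather than randomly sampling individual cases, gives a better indication of how submitted methods generalize across the scanners, protocols, and institutions emphasized above.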
