MICCAI 2021 FLARE Challenge:

Fast and Low GPU memory Abdominal oRgan sEgmentation


🎉🎉🎉🎉🎉🎉

We welcome you to join MICCAI FLARE 2023!

https://codalab.lisn.upsaclay.fr/competitions/12239


Abdominal organ segmentation plays an important role in clinical practice, and to some extent it appears to be a solved problem: state-of-the-art methods have achieved inter-observer performance on several benchmark datasets. However, most existing abdominal datasets contain only single-center, single-phase, single-vendor, or single-disease cases, and it is unclear whether this excellent performance generalizes to more diverse datasets. Moreover, many SOTA methods use model ensembles to boost performance, but these solutions usually have a large model size and require extensive computational resources, which makes them impractical to deploy in clinical practice.

To address these limitations, we organize the Fast and Low GPU Memory Abdominal Organ Segmentation challenge, which has two main features: (1) the dataset is large and diverse, including 511 cases from 11 medical centers; (2) we focus not only on segmentation accuracy but also on segmentation efficiency, in accordance with real clinical practice and requirements.

Participants are required to develop segmentation methods that can segment the liver, kidney, spleen, and pancreas simultaneously, where both accuracy and efficiency will be evaluated for ranking.
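For orientation, below is a minimal sketch of how one might load a training case and split the multi-organ label into per-organ binary masks. It assumes NIfTI files read with the nibabel package and a label convention of 1 = liver, 2 = kidney, 3 = spleen, 4 = pancreas; the file names and the label mapping are illustrative assumptions, so please check the released data description.

import nibabel as nib
import numpy as np

# Assumed label convention; verify against the released data description.
ORGAN_LABELS = {"liver": 1, "kidney": 2, "spleen": 3, "pancreas": 4}

def load_case(image_path, label_path):
    """Load a CT volume and its multi-organ label as numpy arrays."""
    image_nii = nib.load(image_path)
    label_nii = nib.load(label_path)
    image = image_nii.get_fdata().astype(np.float32)
    label = label_nii.get_fdata().astype(np.uint8)
    spacing = image_nii.header.get_zooms()[:3]  # voxel spacing in mm
    return image, label, spacing

def split_organs(label):
    """Return one binary mask per organ from the multi-label volume."""
    return {name: label == idx for name, idx in ORGAN_LABELS.items()}

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    image, label, spacing = load_case("train_000_0000.nii.gz", "train_000.nii.gz")
    for name, mask in split_organs(label).items():
        print(name, int(mask.sum()), "voxels at spacing", spacing)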

More information can be found in the design document.


 Current: Post Challenge

News: We have released a new testing set with 90 cases, which includes more challenging cases. 30 ground-truth masks are also released and can be used to visually compare segmentation results. The remaining 60 ground-truth masks are hidden, but you can obtain the evaluation results here.

Download the dataset at

Baidu Netdisk: https://pan.baidu.com/s/13Q-bnu-kIUf-boJyzoI6Hw?pwd=2022 (password: 2022)

Google Drive: https://drive.google.com/drive/folders/1Y2xY3Exgxd16Y8sLqy-VvDqxQvJcHFuB?usp=share_link

Docker containers of the top-10 teams have been released on Docker Hub, and it is easy to obtain their segmentation results, for example as sketched below.
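To reproduce a team's segmentations from its published container, the general pattern is to pull the image from Docker Hub and run it with the input and output folders mounted. The sketch below drives the Docker CLI from Python; the image name and the /workspace/inputs and /workspace/outputs mount points are placeholders, so adapt them to the instructions released with each container.

import subprocess
from pathlib import Path

IMAGE = "flare21/teamname:latest"          # placeholder image name
INPUT_DIR = Path("./inputs").resolve()     # folder with the CT scans (*.nii.gz)
OUTPUT_DIR = Path("./outputs").resolve()   # segmentation results are written here
OUTPUT_DIR.mkdir(parents=True, exist_ok=True)

# Pull the published image, then run inference with the folders mounted.
subprocess.run(["docker", "pull", IMAGE], check=True)
subprocess.run(
    [
        "docker", "run", "--rm", "--gpus", "all",
        "-v", f"{INPUT_DIR}:/workspace/inputs/",
        "-v", f"{OUTPUT_DIR}:/workspace/outputs/",
        IMAGE,
    ],
    check=True,
)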

Validation submission: please directly upload your segmentation results and short paper here.

Note: Sometimes a submission may fail with the error "The container was killed as it exceeded the memory limit of 4g." This is because the grand-challenge platform limits the running memory, and unfortunately we are not able to lift this limit. When the segmentation quality is poor, computing the NSD can trigger this error. We have tried the nnUNet baseline and it can be evaluated successfully; in other words, if your segmentation performance is better than the nnUNet baseline, it can be evaluated successfully.
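For teams that want to check their results locally against the released ground-truth masks before submitting, the sketch below computes DSC and NSD per organ. It assumes the DeepMind surface-distance package (pip install surface-distance), masks on the same voxel grid, and a 1 mm NSD tolerance; the tolerance and label mapping used by the official evaluation may differ, so treat this only as a local sanity check.

import nibabel as nib
import numpy as np
import surface_distance  # pip install surface-distance

# Assumed label convention; verify against the released data description.
ORGAN_LABELS = {"liver": 1, "kidney": 2, "spleen": 3, "pancreas": 4}

def evaluate_case(gt_path, pred_path, tolerance_mm=1.0):
    """Compute per-organ DSC and NSD for one case."""
    gt_nii = nib.load(gt_path)
    gt = gt_nii.get_fdata().astype(np.uint8)
    pred = nib.load(pred_path).get_fdata().astype(np.uint8)
    spacing = gt_nii.header.get_zooms()[:3]  # voxel spacing in mm

    results = {}
    for organ, idx in ORGAN_LABELS.items():
        gt_mask, pred_mask = gt == idx, pred == idx
        dsc = surface_distance.compute_dice_coefficient(gt_mask, pred_mask)
        dists = surface_distance.compute_surface_distances(gt_mask, pred_mask, spacing)
        nsd = surface_distance.compute_surface_dice_at_tolerance(dists, tolerance_mm)
        results[organ] = (dsc, nsd)
    return results

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    for organ, (dsc, nsd) in evaluate_case("case_gt.nii.gz", "case_pred.nii.gz").items():
        print(f"{organ}: DSC={dsc:.4f}, NSD={nsd:.4f}")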

The challenge summary paper is now available online:

@article{MedIA-FLARE21,
title = {Fast and Low-GPU-memory abdomen CT organ segmentation: The FLARE challenge},
author = {Jun Ma and Yao Zhang and Song Gu and Xingle An and Zhihe Wang and Cheng Ge and Congcong Wang and Fan Zhang and Yu Wang and Yinan Xu and Shuiping Gou and Franz Thaler and Christian Payer and Darko Štern and Edward G.A. Henderson and Dónal M. McSweeney and Andrew Green and Price Jackson and Lachlan McIntosh and Quoc-Cuong Nguyen and Abdul Qayyum and Pierre-Henri Conze and Ziyan Huang and Ziqi Zhou and Deng-Ping Fan and Huan Xiong and Guoqiang Dong and Qiongjie Zhu and Jian He and Xiaoping Yang},
journal = {Medical Image Analysis},
pages = {102616},
volume = {82},
year =  {2022}
}

Please cite this paper if the FLARE21 dataset is used in your study. 


 Important Dates

  • 20 April, 2021 (12:00 AM GMT): Registration open!
  • 1 May, 2021 (12:00 AM GMT): Release of the training and validation data. Docker and short paper submission for the validation set opens.
  • 31 July, 2021 (12:00 AM GMT): Deadline for validation submissions.
  • 1 August, 2021 (12:00 AM GMT): Docker and short paper submission for the testing set opens.
  • 20 August, 2021 (12:00 AM GMT): Deadline for testing submissions.
  • 1 October, 2021: The top-10 teams and results have been announced. The FLARE challenge remains open for future submissions.
  • The papers and code of the top-10 teams are available on the Awards page. All the testing results are available on the Results page. To view the results, please click 'Join' and we will approve your request after checking that the information on your profile page is complete.

 Awards

  • We will provide cash prizes (5000/3000/2000/500/500/200/200/200/200/200 CNY) for the top-10 teams, respectively (1st place: 5000; 2nd place: 3000; 3rd place: 2000; 4th-5th place: 500; 6th-10th place: 200).
  • A certificate will be awarded to the top-10 teams. The top teams will be required to submit their training and testing code for verification after the challenge submission deadline, in order to ensure that the challenge rules have been respected.
  • The Top-10 teams will be invited to give oral presentations during FLARE (Oct 1 / 9:00-13:00 UTC).

  • The top-5 teams will be invited to contribute to a challenge review paper (a maximum of two authors per team). Note that the code should be publicly available by Oct. 1, 2021 for reproducible research, which is also required by the target journal.


 How to participate

  1. Read the following challenge rules carefully.
  2. Click 'Join' to participate in the FLARE21 Challenge. Please make sure that your grand-challenge profile is complete (e.g., Name, Institution, Department, and Location).
  3. Send the signed Challenge Rule Agreement Consent form to FLARE21@aliyun.com from your affiliation e-mail (requests from Gmail, 163.com, qq.com, outlook.com, etc. will be ignored without notice). Please use your Grand Challenge account name as the subject of the email.
  4. Download the training and validation data when the participation request is approved. 
  5. Develop your solution and make a complete submission (including a Docker tar file and a short paper). 

 Rules

1. All participants should register for this challenge with their real names, affiliations (including department, full name of university/institute/company, and country), and affiliation e-mails. Incomplete or redundant registrations will be removed without notice.

2. All participants must submit a complete solution to this challenge during the validation and testing phases. A complete solution includes a Docker container (tar file) and a qualified short paper (2-8 pages).

To encourage more participation and reduce the barriers to entry,

  • we have created a baseline solution list; participants can either provide a baseline solution or develop their own solutions.

  • we have created a short paper template (LaTeX, Word) for the participants.

3. All participants should agree that the submitted short papers can be publicly available to the community on the challenge website, and organizers can use the information provided by the participants, including scores, predicted labels, and short papers.

4. Participants should have Docker expertise, and the submitted Docker tar file should preferably be smaller than 10 GB; a Docker image of over 15 GB will raise an error. The Docker container should execute within at most 6 hours and occupy no more than 16 GB of GPU memory to generate the segmentation results for the testing set (100 cases). Otherwise, an error will be returned.
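Since running time and GPU memory affect the ranking, teams may want to estimate their own peak GPU memory footprint before submitting. The sketch below polls nvidia-smi in a background thread while a placeholder inference command runs; it is only a rough local check, not the official efficiency-measurement script.

import subprocess
import threading
import time

def gpu_memory_used_mib(gpu_index=0):
    """Read the currently used GPU memory (MiB) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[gpu_index])

def watch_peak(peak, stop_event, interval=0.5):
    """Record the peak GPU memory usage until stop_event is set."""
    while not stop_event.is_set():
        peak[0] = max(peak[0], gpu_memory_used_mib())
        time.sleep(interval)

if __name__ == "__main__":
    peak, stop_event = [0], threading.Event()
    watcher = threading.Thread(target=watch_peak, args=(peak, stop_event))
    watcher.start()
    start = time.time()
    # Placeholder inference command; replace with your own entry point.
    subprocess.run(["python", "predict.py"], check=True)
    elapsed = time.time() - start
    stop_event.set()
    watcher.join()
    print(f"Runtime: {elapsed:.1f} s, peak GPU memory: {peak[0]} MiB")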

5. Participants are not allowed to register multiple teams or accounts (only the names listed in the signed document will be considered). Participants from the same research group are also not allowed to register multiple teams. One participant can only join one team. The organizers reserve the right to disqualify such participants.

6. Redistribution or transfer of the data or data links is not allowed. Participants should use the data only by themselves. The challenge data and results will be free to use after MICCAI (1 Oct 2021).

7. For a fair comparison, participants are not allowed to use any additional data or pre-trained models.


Sponsors

NVIDIA has sponsored a DGX-1 for the final evaluation.