Fast and Low-GPU-memory abdomen CT organ segmentation: The FLARE challenge
- Author(s)
- Ma, J; Zhang, Y; Gu, S; An, X; Wang, Z; Ge, C; Wang, C; Zhang, F; Wang, Y; Xu, Y; Gou, S; Thaler, F; Payer, C; Stern, D; Henderson, EGA; McSweeney, DM; Green, A; Jackson, P; McIntosh, L; Nguyen, QC; Qayyum, A; Conze, PH; Huang, Z; Zhou, Z; Fan, DP; Xiong, H; Dong, G; Zhu, Q; He, J; Yang, X
- Journal Title
- Medical Image Analysis
- Publication Type
- Research article
- Abstract
- Automatic segmentation of abdominal organs in CT scans plays an important role in clinical practice. However, most existing benchmarks and datasets focus only on segmentation accuracy, while model efficiency and accuracy on testing cases from different medical centers are not evaluated. To comprehensively benchmark abdominal organ segmentation methods, we organized the first Fast and Low GPU memory Abdominal oRgan sEgmentation (FLARE) challenge, in which segmentation methods were encouraged to simultaneously achieve high accuracy on testing cases from different medical centers, fast inference speed, and low GPU memory consumption. The winning method surpassed the existing state-of-the-art method, delivering 19x faster inference and 60% lower GPU memory consumption at comparable accuracy. We provide a summary of the top methods, make their code and Docker containers publicly available, and give practical suggestions on building accurate and efficient abdominal organ segmentation models. The FLARE challenge remains open to future submissions through a live platform for benchmarking further methodological developments at https://flare.grand-challenge.org/.
- Keywords
- Humans; *Algorithms; *Tomography, X-Ray Computed/methods; Abdomen/diagnostic imaging; Benchmarking; Image Processing, Computer-Assisted/methods; Abdominal organ; Efficiency; Multi-center; Segmentation
- Department(s)
- Physical Sciences
- PubMed ID
- 36179380
- Publisher's Version
- https://doi.org/10.1016/j.media.2022.102616
- Terms of Use/Rights Notice
- Refer to copyright notice on published article.