MICCAI 2020 RibFrac Challenge: 
Rib Fracture Detection and Classification


If you find this work useful in your research, please acknowledge the RibFrac project teams in the paper and cite both our challenge and clinical paper:

Jiancheng Yang, Rui Shi, Liang Jin, Xiaoyang Huang, Kaiming Kuang, Donglai Wei, Shixuan Gu, Jianying Liu, Pengfei Liu, Zhizhong Chai, Yongjie Xiao, Hao Chen, Liming Xu, Bang Du, Xiangyi Yan, Hao Tang, Adam Alessio, Gregory Holste, Jiapeng Zhang, Xiaoming Wang, Jianye He, Lixuan Che, Hanspeter Pfister, Ming Li, Bingbing Ni. "Deep Rib Fracture Instance Segmentation and Classification from CT on the RibFrac Challenge." arXiv Preprint (2024). (DOI)

Liang Jin, Jiancheng Yang, Kaiming Kuang, Bingbing Ni, Yiyi Gao, Yingli Sun, Pan Gao, Weiling Ma, Mingyu Tan, Hui Kang, Jiajun Chen, Ming Li. "Deep-Learning-Assisted Detection and Segmentation of Rib Fractures from CT Scans: Development and Validation of FracNet." EBioMedicine (2020). (DOI)

or using BibTeX:

@article{yang2024ribfrac,
  title={Deep Rib Fracture Instance Segmentation and Classification from CT on the RibFrac Challenge},
  author={Yang, Jiancheng and Shi, Rui and Jin, Liang and Huang, Xiaoyang and Kuang, Kaiming and Wei, Donglai and Gu, Shixuan and Liu, Jianying and Liu, Pengfei and Chai, Zhizhong and Xiao, Yongjie and Chen, Hao and Xu, Liming and Du, Bang and Yan, Xiangyi and Tang, Hao and Alessio, Adam and Holste, Gregory and Zhang, Jiapeng and Wang, Xiaoming and He, Jianye and Che, Lixuan and Pfister, Hanspeter and Li, Ming and Ni, Bingbing},
  journal={arXiv Preprint},
  year={2024}
}

@article{jin2020fracnet,
  title={Deep-Learning-Assisted Detection and Segmentation of Rib Fractures from CT Scans: Development and Validation of FracNet},
  author={Jin, Liang and Yang, Jiancheng and Kuang, Kaiming and Ni, Bingbing and Gao, Yiyi and Sun, Yingli and Gao, Pan and Ma, Weiling and Tan, Mingyu and Kang, Hui and Chen, Jiajun and Li, Ming},
  journal={EBioMedicine},
  year={2020}
}

Be a part of the RibFrac Challenge workshop 

As challenge organizers, we will invite the top 3 teams in each of the detection and classification tasks (up to 6 teams in total) to present their methods and results at the challenge workshop. Teams with particularly interesting solutions may also be invited. The workshop will be held on October 4, 2020 as an official satellite event of MICCAI 2020. Invited teams will receive partial expense coverage to attend the satellite event, which is a good opportunity to showcase your work to the public.

Participants in this competition are not required to attend the workshop. However, only teams attending the workshop will be considered for presenting their work.

Contribute to a challenge review paper

Top teams will be invited to contribute to a challenge review paper, which could potentially be accepted by a top medical imaging journal. Up to 10 teams will be invited to submit a 4-page brief solution (following the Springer LNCS format, i.e., the same format as MICCAI submissions), and up to 2 members per team may qualify as authors of the challenge review paper. If a public dataset is used in any form (including pretraining or domain adaptation), it must be clearly stated in the solution. Likewise, if participants use fully automatic algorithms to generate auxiliary labels, this must also be clearly stated in the submitted solution. Please note that any manual labeling on the training, validation, or test set is strictly forbidden.

The top 3 teams for each task are required to submit the 4-page solution. If a team ranks in the top 3 in both the detection and classification tasks, up to 3 authors may be included in the challenge review paper, since performance on the two tasks is correlated.

The organizers reserve the right to identify the "top" teams to be invited, which depends mainly on leaderboard ranks, as well as the technical contributions of the solutions and the number of participants in each task. We will send invitations to the top teams requesting team information, the solution paper, and a potential solution presentation at the challenge workshop.

The RibFrac Challenge represents thousands of hours of work by experienced radiologists, computer scientists, and engineers. We kindly ask you to respect this effort through appropriate citation and compliance with the data license.

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.