Subscribe to our mailing list: https://groups.google.com/u/2/g/imseg
We developed a suite of pre-trained 3D models, named SuPreM, that combines the best of large-scale datasets and per-voxel annotations, demonstrating strong transferability across a range of 3D medical imaging tasks.
Transitioning to Fully-Supervised Pre-Training with Large-Scale Radiology ImageNet for Improved AI Transferability in Three-Dimensional Medical Segmentation
Wenxuan Li1, Junfei Xiao1, Jie Liu2, Yucheng Tang3, Alan Yuille1, and Zongwei Zhou1,*
1Johns Hopkins University
2City University of Hong Kong
3NVIDIA
Radiological Society of North America (RSNA) 2023
abstract | code | slides | talk
The release of AbdomenAtlas 1.0 can be found at https://github.com/MrGiovanni/AbdomenAtlas
AbdomenAtlas 1.1 is an extensive dataset of 9,262 CT volumes with per-voxel annotations of 25 organs and pseudo-annotations of seven tumor types, enabling us to finally perform supervised pre-training of AI models at scale. Based on AbdomenAtlas 1.1, we also provide SuPreM, a suite of pre-trained models spanning several widely recognized architectures.
Preliminary benchmarks show that supervised pre-training is the preferred choice over self-supervised pre-training in terms of both performance and efficiency.
We anticipate that the release of the large, annotated dataset (AbdomenAtlas 1.1) and the suite of pre-trained models (SuPreM) will bolster collaborative endeavors in establishing Foundation Datasets and Foundation Models for the broader applications of 3D volumetric medical image analysis.
The following is a list of supported model backbones in our collection. Select the appropriate family of backbones, click to expand the table, download a specific backbone and its pre-trained weights (see the name and download columns), and save the weights into ./pretrained_weights/. More backbones will be added over time. Please suggest a backbone in this channel if you want us to pre-train it on AbdomenAtlas 1.1, which contains 9,262 annotated CT volumes. A short loading sketch is given below.
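To sanity-check a downloaded checkpoint, the sketch below restores Swin UNETR weights with MONAI. This is a minimal example under stated assumptions, not the repository's official loading script: the checkpoint filename, the `net` key, the stripped prefixes, and `feature_size=48` are assumptions you may need to adapt to the file you actually downloaded.

```python
# Minimal loading sketch (assumptions: filename, "net" key, wrapper prefixes).
import torch
from monai.networks.nets import SwinUNETR

# feature_size=48 is assumed to match the ~62M-parameter configuration
# listed in the table below.
model = SwinUNETR(
    img_size=(96, 96, 96),
    in_channels=1,
    out_channels=25,  # 25 organ classes annotated in AbdomenAtlas 1.1
    feature_size=48,
)

ckpt = torch.load(
    "./pretrained_weights/suprem_swinunetr.pth",  # hypothetical filename
    map_location="cpu",
)
state = ckpt.get("net", ckpt)  # some checkpoints nest weights under "net"
# Strip wrapper prefixes left by DataParallel-style training (assumption).
state = {k.replace("module.", "").replace("backbone.", ""): v
         for k, v in state.items()}
missing, unexpected = model.load_state_dict(state, strict=False)
print(f"loaded with {len(missing)} missing and {len(unexpected)} unexpected keys")
```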
Swin UNETR
| name | params | pre-trained data | resources | download |
|---|---|---|---|---|
| Tang et al. | 62.19M | 5,050 CT | | weights |
| Valanarasu et al. | 62.19M | 50,000 CT/MRI | | weights |
| Universal Model | 62.19M | 2,100 CT | | weights |
| SuPreM | 62.19M | 2,100 CT | ours 🌟 | weights |
U-Net
SegResNet
| name | params | pre-trained data | resources | download |
|---|---|---|---|---|
| SuPreM | 62.19M | 2,100 CT | ours 🌟 | weights |
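For transfer to a new task, a common pattern is to instantiate the backbone with the downstream number of classes and load the pre-trained weights with `strict=False`, so the mismatched segmentation head is simply re-initialized. Below is a hedged sketch using the SegResNet backbone; the filename, the 3-class target task, and the state-dict layout are assumptions, not the repository's official fine-tuning code.

```python
# Transfer-learning sketch (assumptions: filename, 3-class downstream task,
# "net" key, "module." prefix).
import torch
from monai.networks.nets import SegResNet

# New head size for a hypothetical 3-class downstream task; strict=False below
# skips the mismatched head weights so they are trained from scratch.
model = SegResNet(spatial_dims=3, in_channels=1, out_channels=3)

ckpt = torch.load(
    "./pretrained_weights/suprem_segresnet.pth",  # hypothetical filename
    map_location="cpu",
)
state = ckpt.get("net", ckpt)
state = {k.replace("module.", ""): v for k, v in state.items()}
model.load_state_dict(state, strict=False)

# Fine-tune everything with a small learning rate; freezing the pre-trained
# encoder first is another common choice.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
```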
This work was supported by the Lustgarten Foundation for Pancreatic Cancer Research and the McGovern Foundation. The segmentation backbone is based on Swin UNETR; we appreciate the MONAI Team's effort in providing and maintaining open-source code for the community. Paper content is covered by patents pending.