# SuPreM: Supervised Pre-Trained 3D Models for Medical Image Analysis
We developed SuPreM, a suite of pre-trained 3D models that combines the strengths of large-scale datasets and per-voxel annotations, and transfers well across a range of 3D medical imaging tasks.

## Paper

**Transitioning to Fully-Supervised Pre-Training with Large-Scale Radiology ImageNet for Improved AI Transferability in Three-Dimensional Medical Segmentation**

Wenxuan Li¹, Junfei Xiao¹, Jie Liu², Yucheng Tang³, Alan Yuille¹, and Zongwei Zhou¹,*

¹Johns Hopkins University, ²City University of Hong Kong, ³NVIDIA

Radiological Society of North America (RSNA) 2023

abstract | code | slides | talk

## An Extensive Dataset: AbdomenAtlas 1.1

The release of AbdomenAtlas 1.0 can be found at https://github.com/MrGiovanni/AbdomenAtlas

AbdomenAtlas 1.1 is an extensive dataset of 9,262 CT volumes with per-voxel annotation of 25 organs and pseudo annotations for seven types of tumors, enabling us to finally perform supervised pre-training of AI models at scale. Based on AbdomenAtlas 1.1, we also provide a suite of pre-trained models comprising several widely recognized AI models.

Preliminary benchmarks show that supervised pre-training is the preferred choice over self-supervised pre-training in terms of both performance and efficiency.

We anticipate that releasing a large annotated dataset (AbdomenAtlas 1.1) and a suite of pre-trained models (SuPreM) will bolster collaborative efforts to establish Foundation Datasets and Foundation Models for broader applications in 3D volumetric medical image analysis.

## A Suite of Pre-trained Models: SuPreM

The tables below list the supported model backbones in our collection. Choose the backbone family you need, download a specific backbone and its pre-trained weights (see the name and download columns), and save the weights in `./pretrained_weights/`. More backbones will be added over time; please suggest a backbone in this channel if you would like us to pre-train it on AbdomenAtlas 1.1 and its 9,262 annotated CT volumes. A sketch of loading the downloaded weights follows the tables.

### Swin UNETR

| name | params | pre-training data | resources | download |
|---|---|---|---|---|
| Tang et al. | 62.19M | 5,050 CT | GitHub | weights |
| Jose Valanaras et al. | 62.19M | 50,000 CT/MRI | GitHub | weights |
| Universal Model | 62.19M | 2,100 CT | GitHub | weights |
| SuPreM | 62.19M | 2,100 CT | ours 🌟 | weights |

### U-Net

| name | params | pre-training data | resources | download |
|---|---|---|---|---|
| Models Genesis | 19.08M | 623 CT | GitHub | weights |
| UniMiSS (tiny) | | 5,022 CT & MRI | GitHub | weights |
| UniMiSS (small) | | 5,022 CT & MRI | GitHub | weights |
| Med3D | 85.75M | 1,638 CT | GitHub | weights |
| DoDNet | 17.29M | 920 CT | GitHub | weights |
| Universal Model | 19.08M | 2,100 CT | GitHub | weights |

### SegResNet

| name | params | pre-training data | resources | download |
|---|---|---|---|---|
| SuPreM | 62.19M | 2,100 CT | ours 🌟 | weights |
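
After downloading a checkpoint into `./pretrained_weights/`, it can be loaded into the corresponding backbone for fine-tuning. The following is a minimal sketch using MONAI's `SwinUNETR` class; the file name `suprem_swinunetr.pth`, the input size, the number of output channels, and the checkpoint key layout are illustrative assumptions and may need to be adapted to the weights you actually download.

```python
# Minimal sketch: load a downloaded SuPreM checkpoint into MONAI's Swin UNETR.
# File name and checkpoint keys are assumptions -- adjust to your download.
import torch
from monai.networks.nets import SwinUNETR

model = SwinUNETR(
    img_size=(96, 96, 96),  # required by older MONAI releases; deprecated in newer ones
    in_channels=1,          # single-channel CT input
    out_channels=25,        # e.g., the 25 organ classes of AbdomenAtlas 1.1
    feature_size=48,        # matches the ~62M-parameter backbones listed above
)

checkpoint = torch.load(
    "./pretrained_weights/suprem_swinunetr.pth",  # hypothetical file name
    map_location="cpu",
)
state_dict = checkpoint.get("net", checkpoint)  # some checkpoints nest weights under a key

# Strip common wrapper prefixes (e.g., from DataParallel training) so the keys
# line up with a plain SwinUNETR module; strict=False tolerates a mismatched
# segmentation head when transferring to a new task.
state_dict = {k.replace("module.", "").replace("backbone.", ""): v
              for k, v in state_dict.items()}
model.load_state_dict(state_dict, strict=False)
```

The same pattern applies to the other backbone families, for example MONAI's `SegResNet` class for the SegResNet weights.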

## Acknowledgement

This work was supported by the Lustgarten Foundation for Pancreatic Cancer Research and the McGovern Foundation. The segmentation backbone is based on Swin UNETR; we appreciate the MONAI Team's effort in providing and maintaining open-source code for the community. Paper content is covered by patents pending.
