Some useful PyTorch snippets
This version has been tested with PyTorch 1.10.0, but using it with any PyTorch < 2.0 release should be safe.
- Install PyTorch 1.10.0 following the instructions provided on the page pytorch.org/get-started/previous-versions/#v1100.
- Install OpenSlide.
- Install this package, either by adding it to your requirements file:

  ```
  # use the latest version
  gtorch_utils @ https://github.com/giussepi/gtorch_utils/tarball/main
  # or use a specific release (format 1)
  gutils @ https://github.com/giussepi/gtorch_utils/archive/refs/tags/v0.1.0.tar.gz
  # or use a specific release (format 2)
  gutils @ git+https://github.com/giussepi/gtorch_utils.git@v0.1.0
  ```

  Or by installing it directly:

  ```bash
  pip install git+git://github.com/giussepi/gtorch_utils.git --use-feature=2020-resolver --no-cache-dir
  # or
  pip install https://github.com/giussepi/gtorch_utils/tarball/main --use-feature=2020-resolver --no-cache-dir
  ```
- If you want to modify some default configuration values (e.g. for CT-82 and LiTS17 processing), copy the content from `settings.py.template` into your project's `settings.py` and update it appropriately (especially `PROJECT_PATH`, `CT82_SAVING_PATH`, `LITS17_SAVING_PATH` and `LITS17_CONFIG`).
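A minimal `settings.py` sketch is shown below. The values are purely illustrative assumptions; `settings.py.template` remains the authoritative reference, in particular for the expected structure of `LITS17_CONFIG`:

```python
# settings.py -- hypothetical values; always start from settings.py.template
import os

# Root of your project; the paths below are illustrative choices, not defaults
PROJECT_PATH = os.path.abspath(os.path.dirname(__file__))
CT82_SAVING_PATH = os.path.join(PROJECT_PATH, 'CT-82-processed')
LITS17_SAVING_PATH = os.path.join(PROJECT_PATH, 'LiTS17-processed')
# LITS17_CONFIG: see settings.py.template for its expected structure
```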
- Clone this repository
- If you want to modify some default configuration values (e.g. for CT-82 and LiTS17 processing), make a copy of the configuration file, review it thoroughly and update it properly (especially `PROJECT_PATH`, `CT82_SAVING_PATH`, `LITS17_SAVING_PATH` and `LITS17_CONFIG`):

  ```bash
  cp settings.py.template settings.py
  ```
- Modify or add new modules/features with their respective tests
- Get the test datasets by running:

  ```bash
  chmod +x get_test_datasets.sh
  ./get_test_datasets.sh
  ```
- Execute all the tests:

  ```bash
  chmod +x run_tests.sh
  ./run_tests.sh
  ```

- If all the tests pass, commit your changes
A few of our tests employ two cases from the NIH-TCIA CT Pancreas benchmark (CT-82).[^1][^2][^3]
- DB
- EPSILON
- BaseDataset
- DatasetLabelsMixin
- Detail
- BasicDataset
- DatasetTemplate
- HDF5Dataset
- BrainTumorDataset
- CarvanaDataset
- CT82Dataset [review whole module for more functionalities]
- LiTS17Dataset, LiTS17CropDataset [review whole module for more functionalities]
- OnlineCoNSePDataset, OfflineCoNSePDataset, SeedWorker [review whole module for more functionalities]
- GaussianNoise
- ADSVModelMGR
- ModelMGR
- Checkpoint
- EarlyStopping
- MaskPlotter
- MemoryPrint
- MetricEvaluator
- PlotTensorBoard
- TrainingPlotter
- BasicModelMGR
- ModelMGRImageChannelsError
- CheckPointMixin
- LrShedulerTrack
- SubDatasetsMixin
- DataLoggerMixin
- DADataLoggerMixin
- IniCheckpintError
- CT3DNIfTIMixin
- adsv
- ADSVModelMGRMixin
- standard
- ModelMGRMixin
- SanityChecksMixin
- WeightsChangingSanityChecksMixin
- TorchMetricsMixin
- DATorchMetricsMixin
- ModularTorchMetricsMixin
- Perceptron
- MLP
- ResNet, resnet18, resnet34, resnet50, resnet101, resnet152
- Xception, xception
- InitMixin
- Deeplabv3plus
- UNet
- UNet_3Plus
- UNet_3Plus_DeepSup
- UNet_3Plus_DeepSup_CGM
- DoubleConv
- XConv
- Down
- MicroAE
- TinyAE
- TinyUpAE
- MicroUpAE
- AEDown
- AEDown2
- Up
- OutConv
- UpConcat
- AEUpConcat
- AEUpConcat2
- UnetDsv
- UnetGridGatingSignal
- MetricItem
- Normalizer
- Reproducibility
- sync_batchnorm
- SynchronizedBatchNorm1d, SynchronizedBatchNorm2d, SynchronizedBatchNorm3d
- DataParallelWithCallback, patch_replication_callback
- get_batchnormxd_class
- ConfusionMatrixMGR
- bce_dice_loss_, bce_dice_loss (fastest), BceDiceLoss (support for logits)
- dice_coef_loss
- FocalLoss
- FPPV_Loss
- FPR_Loss
- IOU_Loss<torch.nn.Module>, IOU_loss
- lovasz_hinge, lovasz_softmax
- MCC_Loss, MCCLoss
- MSSSIM_Loss
- NPV_Loss
- Recall_Loss
- SpecificityLoss
- TverskyLoss
- DiceCoeff (individual samples), dice_coeff (batches), dice_coeff_metric (batches, fastest implementation)
- fppv
- fpr
- IOU
- MSSSIM
- npv
- recall
- RNPV
- Specificity
- Accuracy
- BalancedAccuracy
- Recall
- Specificity
- DiceCoefficient, DiceCoefficientPerImage
- plot_img_and_mask
- apply_padding
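As a quick illustration of what the Dice metrics listed above compute, here is a minimal stand-alone NumPy sketch of the Dice coefficient. This is a reimplementation for orientation only, not the library's own code (which also offers batched and per-image variants such as `dice_coeff` and `DiceCoefficientPerImage`):

```python
import numpy as np

def dice_coefficient(pred, target, smooth=1e-6):
    """Dice coefficient for binary masks: 2*|A intersect B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + smooth) / (pred.sum() + target.sum() + smooth)

# Identical masks give a Dice of ~1, disjoint masks ~0
mask = np.array([[0, 1], [1, 1]])
print(dice_coefficient(mask, mask))      # ~1.0
print(dice_coefficient(mask, 1 - mask))  # ~0.0
```

The `smooth` term keeps the ratio defined when both masks are empty; loss variants such as `dice_coef_loss` typically minimize `1 - dice`.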
All the classes and functions are fully documented, so explore the modules, load snippets and have fun! E.g.:
```python
from collections import OrderedDict

import torch.optim as optim

from gtorch_utils.constants import DB
from gtorch_utils.nns.managers.classification import BasicModelMGR
from gtorch_utils.nns.models.classification import Perceptron

# Minimum example; see all the BasicModelMGR options in its class definition
# at gtorch_utils/models/managers.py.
# GenericDataset is a subclass of gtorch_utils.datasets.generic.BaseDataset that
# you must implement to handle your dataset. You can pass arguments to your
# class using dataset_kwargs.
BasicModelMGR(
    model=Perceptron(3000, 102),
    dataset=GenericDataset,
    dataset_kwargs=dict(dbhandler=DBhandler, normalizer=Normalizer.MAX_NORM, val_size=.1),
    epochs=200
)()
```
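The `GenericDataset` above is something you implement yourself by subclassing `gtorch_utils.datasets.generic.BaseDataset`. The sketch below is a hypothetical stand-in (plain Python, no subclassing) that only illustrates the typical shape of such a class; the class name, constructor arguments and sample data are made up, and `BaseDataset`'s real interface may require more:

```python
# Hypothetical stand-in illustrating the general shape of a dataset class;
# a real implementation must subclass gtorch_utils.datasets.generic.BaseDataset.
class MyDataset:
    def __init__(self, samples, targets, val_size=.1):
        # samples/targets: any indexable collections of equal length
        assert len(samples) == len(targets)
        self.samples = samples
        self.targets = targets
        self.val_size = val_size

    def __len__(self):
        # number of available samples
        return len(self.samples)

    def __getitem__(self, idx):
        # return whatever your model/manager expects per sample
        return self.samples[idx], self.targets[idx]

ds = MyDataset(samples=[[0., 1.], [1., 0.]], targets=[1, 0])
print(len(ds), ds[0])
```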
Just pass the keyword argument `tensorboard=True` to `BasicModelMGR` and execute:

```bash
./run_tensorboard.sh
```

Note: if you installed this app as a package, you may want to copy the run_tensorboard.sh script to your project root, or just run `tensorboard --logdir=runs` every time you want to see your training results on the TensorBoard interface. To do so, just open localhost:6006 in your browser.
- Make it compatible with torch 2.0
- Write tests for MemoryPrint
- Implement double cross validation
- Implement cross validation
- Write more tests
- Save & load checkpoint
- Early stopping callback
- Plot train & val loss in tensorboard
Footnotes

[^1]: Holger R. Roth, Amal Farag, Evrim B. Turkbey, Le Lu, Jiamin Liu, and Ronald M. Summers. (2016). Data From Pancreas-CT. The Cancer Imaging Archive. https://doi.org/10.7937/K9/TCIA.2016.tNB1kqBU

[^2]: Roth HR, Lu L, Farag A, Shin H-C, Liu J, Turkbey EB, Summers RM. DeepOrgan: Multi-level Deep Convolutional Networks for Automated Pancreas Segmentation. N. Navab et al. (Eds.): MICCAI 2015, Part I, LNCS 9349, pp. 556-564, 2015. (paper)

[^3]: Clark K, Vendt B, Smith K, Freymann J, Kirby J, Koppel P, Moore S, Phillips S, Maffitt D, Pringle M, Tarbox L, Prior F. The Cancer Imaging Archive (TCIA): Maintaining and Operating a Public Information Repository. Journal of Digital Imaging, Volume 26, Number 6, December 2013, pp. 1045-1057. DOI: https://doi.org/10.1007/s10278-013-9622-7