- Install mmsegmentation.
- Download the ADE20K dataset from the official website. The directory structure should look like

```
ade
└── ADEChallengeData2016
    ├── annotations
    │   ├── training
    │   └── validation
    └── images
        ├── training
        └── validation
```

Next, create a symbolic link to the dataset:

```shell
cd segmentation/
mkdir data
ln -s [path/to/ade20k] data/
```
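If training later fails with a dataset-not-found error, the symlink usually points at the wrong level of the tree. A minimal sketch of a layout check (the helper name and `EXPECTED` list are illustrative, not part of mmsegmentation; the subfolders follow the tree shown above):

```python
from pathlib import Path

# Subdirectories expected under data/ade, per the tree above.
EXPECTED = [
    "ADEChallengeData2016/annotations/training",
    "ADEChallengeData2016/annotations/validation",
    "ADEChallengeData2016/images/training",
    "ADEChallengeData2016/images/validation",
]

def check_ade20k_layout(root):
    """Return the expected subdirectories that are missing under `root`."""
    root = Path(root)
    return [sub for sub in EXPECTED if not (root / sub).is_dir()]
```

An empty return value means the link resolves to the right place.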
- Download the LITv2 pretrained weights on ImageNet.
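Passing the checkpoint via `model.pretrained` lets mmsegmentation adapt the ImageNet weights to the backbone; conceptually, the classification head is discarded and the rest is kept. A minimal sketch of that key filtering with plain dicts (the function name and key prefixes are illustrative, not taken from the LITv2 checkpoint):

```python
def strip_classifier_head(state_dict, head_prefixes=("head.", "fc.")):
    """Return a copy of `state_dict` without classification-head entries,
    leaving only the backbone weights a dense-prediction model can reuse."""
    return {k: v for k, v in state_dict.items()
            if not k.startswith(head_prefixes)}
```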
To train a model with pre-trained weights, run:

```shell
# single-gpu training
python tools/train.py <CONFIG_FILE> --options model.pretrained=<PRETRAIN_MODEL> [model.backbone.use_checkpoint=True] [other optional arguments]

# multi-gpu training
tools/dist_train.sh <CONFIG_FILE> <GPU_NUM> --options model.pretrained=<PRETRAIN_MODEL> [model.backbone.use_checkpoint=True] [other optional arguments]
```
For example, to train a Semantic FPN model with a LITv2-S backbone on 8 GPUs, run:
```shell
tools/dist_train.sh configs/litv2/litv2_s_fpn_r50_512x512_80k_ade20k.py 8 --options model.pretrained=litv2_s.pth
```
To test a trained model, run:

```shell
# single-gpu testing
python tools/test.py <CONFIG_FILE> <SEG_CHECKPOINT_FILE> --eval mIoU

# multi-gpu testing
tools/dist_test.sh <CONFIG_FILE> <SEG_CHECKPOINT_FILE> <GPU_NUM> --eval mIoU
```
For example, to evaluate a Semantic FPN model with a LITv2-S backbone, run:
```shell
tools/dist_test.sh configs/litv2/litv2_s_fpn_r50_512x512_80k_ade20k.py litv2_s_fpn_r50_512x512_80k_ade20k.pth 8 --eval mIoU
```
To get the FLOPs, run:

```shell
python tools/get_flops.py configs/litv2/litv2_s_fpn_r50_512x512_80k_ade20k.py
```
This should give:

```
Input shape: (3, 512, 512)
Flops: 41.29 GFLOPs
Params: 31.45 M
```
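Such counters work by summing per-layer costs. As a rough illustration, the cost of one 2-D convolution under the common convention of counting one multiply-accumulate as one FLOP (the function below is a sketch, not mmcv's actual counter):

```python
def conv2d_flops(c_in, c_out, kernel, h_out, w_out, groups=1):
    """FLOPs of a 2-D convolution, counting one multiply-accumulate
    as one FLOP: each output element costs (c_in / groups) * k * k MACs."""
    return h_out * w_out * c_out * (c_in // groups) * kernel * kernel

# e.g. a ResNet-style 7x7 stem conv, 3 -> 64 channels, 112x112 output:
stem = conv2d_flops(3, 64, 7, 112, 112)   # ~0.118 GFLOPs
```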
To test the FPS, run:

```shell
python -m torch.distributed.launch --nproc_per_node=1 --master_port=29500 tools/benchmark.py \
    configs/lit/retinanet_litv2_s_fpn_1x_coco.py \
    --checkpoint retinanet_litv2_s_fpn_1x_coco.pth \
    --launcher pytorch
```
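The core of such a benchmark is a warmed-up timing loop: run the forward pass a few times to stabilize caches and kernels, then time a fixed number of iterations. A minimal sketch (the helper name and defaults are illustrative, not the script's actual interface):

```python
import time

def measure_fps(run_once, n_warmup=10, n_iters=100):
    """Time `run_once` (one forward pass) and return iterations per second,
    excluding warm-up iterations from the measurement."""
    for _ in range(n_warmup):
        run_once()
    start = time.perf_counter()
    for _ in range(n_iters):
        run_once()
    return n_iters / (time.perf_counter() - start)
```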
| Backbone | Params (M) | FLOPs (G) | FPS | mIoU | Config | Download |
|---|---|---|---|---|---|---|
| LITv2-S | 31 | 41 | 42.6 | 44.3 | config | model & log |
| LITv2-M | 52 | 63 | 28.5 | 45.7 | config | model & log |
| LITv2-B | 90 | 93 | 27.5 | 47.2 | config | model & log |
This repository is released under the Apache 2.0 license as found in the LICENSE file.