
Commit

update CIE model and model links in doc
rogerwwww committed Aug 3, 2021
1 parent 450bc48 commit 41e7f10
Showing 7 changed files with 122 additions and 41 deletions.
49 changes: 25 additions & 24 deletions README.md
@@ -61,33 +61,34 @@ _ThinkMatch_ currently contains pytorch source code of the following deep graph

### PascalVOC - 2GM

| model | year | aero | bike | bird | boat | bottle | bus | car | cat | chair | cow | table | dog | horse | mbike | person | plant | sheep | sofa | train | tv | mean |
| ---------------------- | ---- | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| [GMN](/models/GMN) | 2018 | 0.4163 | 0.5964 | 0.6027 | 0.4795 | 0.7918 | 0.7020 | 0.6735 | 0.6488 | 0.3924 | 0.6128 | 0.6693 | 0.5976 | 0.6106 | 0.5975 | 0.3721 | 0.7818 | 0.6800 | 0.4993 | 0.8421 | 0.9141 | 0.6240 |
| [PCA-GM](/models/PCA) | 2019 | 0.4979 | 0.6193 | 0.6531 | 0.5715 | 0.7882 | 0.7556 | 0.6466 | 0.6969 | 0.4164 | 0.6339 | 0.5073 | 0.6705 | 0.6671 | 0.6164 | 0.4447 | 0.8116 | 0.6782 | 0.5922 | 0.7845 | 0.9042 | 0.6478 |
| [NGM](/models/NGM) | 2019 | 0.5010 | 0.6350 | 0.5790 | 0.5340 | 0.7980 | 0.7710 | 0.7360 | 0.6820 | 0.4110 | 0.6640 | 0.4080 | 0.6030 | 0.6190 | 0.6350 | 0.4560 | 0.7710 | 0.6930 | 0.6550 | 0.7920 | 0.8820 | 0.6413 |
| [NHGM](/models/NGM) | 2019 | 0.5240 | 0.6220 | 0.5830 | 0.5570 | 0.7870 | 0.7770 | 0.7440 | 0.7070 | 0.4200 | 0.6460 | 0.5380 | 0.6100 | 0.6190 | 0.6080 | 0.4680 | 0.7910 | 0.6680 | 0.5510 | 0.8090 | 0.8870 | 0.6458 |
| [IPCA-GM](/models/PCA) | 2020 | 0.5378 | 0.6622 | 0.6714 | 0.6120 | 0.8039 | 0.7527 | 0.7255 | 0.7252 | 0.4455 | 0.6524 | 0.5430 | 0.6724 | 0.6790 | 0.6421 | 0.4793 | 0.8435 | 0.7079 | 0.6398 | 0.8380 | 0.9083 | 0.6770 |
| [CIE-H](/models/CIE) | 2020 | 0.5250 | 0.6858 | 0.7015 | 0.5706 | 0.8207 | 0.7700 | 0.7073 | 0.7313 | 0.4383 | 0.6994 | 0.6237 | 0.7018 | 0.7031 | 0.6641 | 0.4763 | 0.8525 | 0.7172 | 0.6400 | 0.8385 | 0.9168 | 0.6892 |
| [BBGM](/models/BBGM) | 2020 | 0.6187 | 0.7106 | 0.7969 | 0.7896 | 0.8740 | 0.9401 | 0.8947 | 0.8022 | 0.5676 | 0.7914 | 0.6458 | 0.7892 | 0.7615 | 0.7512 | 0.6519 | 0.9818 | 0.7729 | 0.7701 | 0.9494 | 0.9393 | 0.7899 |
| [NGM-v2](/models/NGM) | 2021 | 0.6184 | 0.7118 | 0.7762 | 0.7875 | 0.8733 | 0.9363 | 0.8770 | 0.7977 | 0.5535 | 0.7781 | 0.8952 | 0.7880 | 0.8011 | 0.7923 | 0.6258 | 0.9771 | 0.7769 | 0.7574 | 0.9665 | 0.9323 | 0.8011 |
| [NHGM-v2](/models/NGM) | 2021 | 0.5995 | 0.7154 | 0.7724 | 0.7902 | 0.8773 | 0.9457 | 0.8903 | 0.8181 | 0.5995 | 0.8129 | 0.8695 | 0.7811 | 0.7645 | 0.7750 | 0.6440 | 0.9872 | 0.7778 | 0.7538 | 0.9787 | 0.9280 | 0.8040 |
| model | year | aero | bike | bird | boat | bottle | bus | car | cat | chair | cow | table | dog | horse | mbike | person | plant | sheep | sofa | train | tv | mean |
| ------------------------------------------------------------ | ---- | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| [GMN](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#gmn) | 2018 | 0.4163 | 0.5964 | 0.6027 | 0.4795 | 0.7918 | 0.7020 | 0.6735 | 0.6488 | 0.3924 | 0.6128 | 0.6693 | 0.5976 | 0.6106 | 0.5975 | 0.3721 | 0.7818 | 0.6800 | 0.4993 | 0.8421 | 0.9141 | 0.6240 |
| [PCA-GM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#pca-gm) | 2019 | 0.4979 | 0.6193 | 0.6531 | 0.5715 | 0.7882 | 0.7556 | 0.6466 | 0.6969 | 0.4164 | 0.6339 | 0.5073 | 0.6705 | 0.6671 | 0.6164 | 0.4447 | 0.8116 | 0.6782 | 0.5922 | 0.7845 | 0.9042 | 0.6478 |
| [NGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2019 | 0.5010 | 0.6350 | 0.5790 | 0.5340 | 0.7980 | 0.7710 | 0.7360 | 0.6820 | 0.4110 | 0.6640 | 0.4080 | 0.6030 | 0.6190 | 0.6350 | 0.4560 | 0.7710 | 0.6930 | 0.6550 | 0.7920 | 0.8820 | 0.6413 |
| [NHGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2019 | 0.5240 | 0.6220 | 0.5830 | 0.5570 | 0.7870 | 0.7770 | 0.7440 | 0.7070 | 0.4200 | 0.6460 | 0.5380 | 0.6100 | 0.6190 | 0.6080 | 0.4680 | 0.7910 | 0.6680 | 0.5510 | 0.8090 | 0.8870 | 0.6458 |
| [IPCA-GM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#pca-gm) | 2020 | 0.5378 | 0.6622 | 0.6714 | 0.6120 | 0.8039 | 0.7527 | 0.7255 | 0.7252 | 0.4455 | 0.6524 | 0.5430 | 0.6724 | 0.6790 | 0.6421 | 0.4793 | 0.8435 | 0.7079 | 0.6398 | 0.8380 | 0.9083 | 0.6770 |
| [CIE-H](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#cie-h) | 2020 | 0.5250 | 0.6858 | 0.7015 | 0.5706 | 0.8207 | 0.7700 | 0.7073 | 0.7313 | 0.4383 | 0.6994 | 0.6237 | 0.7018 | 0.7031 | 0.6641 | 0.4763 | 0.8525 | 0.7172 | 0.6400 | 0.8385 | 0.9168 | 0.6892 |
| [BBGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#bbgm) | 2020 | 0.6187 | 0.7106 | 0.7969 | 0.7896 | 0.8740 | 0.9401 | 0.8947 | 0.8022 | 0.5676 | 0.7914 | 0.6458 | 0.7892 | 0.7615 | 0.7512 | 0.6519 | 0.9818 | 0.7729 | 0.7701 | 0.9494 | 0.9393 | 0.7899 |
| [NGM-v2](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2021 | 0.6184 | 0.7118 | 0.7762 | 0.7875 | 0.8733 | 0.9363 | 0.8770 | 0.7977 | 0.5535 | 0.7781 | 0.8952 | 0.7880 | 0.8011 | 0.7923 | 0.6258 | 0.9771 | 0.7769 | 0.7574 | 0.9665 | 0.9323 | 0.8011 |
| [NHGM-v2](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2021 | 0.5995 | 0.7154 | 0.7724 | 0.7902 | 0.8773 | 0.9457 | 0.8903 | 0.8181 | 0.5995 | 0.8129 | 0.8695 | 0.7811 | 0.7645 | 0.7750 | 0.6440 | 0.9872 | 0.7778 | 0.7538 | 0.9787 | 0.9280 | 0.8040 |
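
For reference, the `mean` column appears to be the unweighted average of the 20 per-class accuracies. A quick sanity check against the CIE-H row (a sketch; equal class weighting is an assumption, the table itself does not state it):

```python
# Assumption: "mean" is the plain average of the 20 PascalVOC class accuracies.
cie_h = [0.5250, 0.6858, 0.7015, 0.5706, 0.8207, 0.7700, 0.7073, 0.7313,
         0.4383, 0.6994, 0.6237, 0.7018, 0.7031, 0.6641, 0.4763, 0.8525,
         0.7172, 0.6400, 0.8385, 0.9168]
print(round(sum(cie_h) / len(cie_h), 4))  # 0.6892, matching the table
```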

### Willow Object Class - 2GM & MGM

| model | year | remark | Car | Duck | Face | Motorbike | Winebottle | mean |
| ------------------------ | ---- | --------------- | ------ | ------ | ------ | --------- | ---------- | ------ |
| [GMN](/models/GMN) | 2018 | - | 0.6790 | 0.7670 | 0.9980 | 0.6920 | 0.8310 | 0.7934 |
| [PCA-GM](/models/PCA) | 2019 | - | 0.8760 | 0.8360 | 1.0000 | 0.7760 | 0.8840 | 0.8744 |
| [NGM](/models/NGM) | 2019 | - | 0.8420 | 0.7760 | 0.9940 | 0.7680 | 0.8830 | 0.8530 |
| [NHGM](/models/NGM) | 2019 | - | 0.8650 | 0.7220 | 0.9990 | 0.7930 | 0.8940 | 0.8550 |
| [NMGM](/models/NGM) | 2019 | - | 0.7850 | 0.9210 | 1.0000 | 0.7870 | 0.9480 | 0.8880 |
| [IPCA-GM](/models/PCA) | 2020 | - | 0.9040 | 0.8860 | 1.0000 | 0.8300 | 0.8830 | 0.9006 |
| [BBGM](/models/BBGM) | 2020 | - | 0.9680 | 0.8990 | 1.0000 | 0.9980 | 0.9940 | 0.9718 |
| [GANN-MGM](/models/GANN) | 2020 | self-supervised | 0.9600 | 0.9642 | 1.0000 | 1.0000 | 0.9879 | 0.9906 |
| [NGM-v2](/models/NGM) | 2021 | - | 0.9740 | 0.9340 | 1.0000 | 0.9860 | 0.9830 | 0.9754 |
| [NHGM-v2](/models/NGM) | 2021 | - | 0.9740 | 0.9390 | 1.0000 | 0.9860 | 0.9890 | 0.9780 |
| [NMGM-v2](/models/NGM) | 2021 | - | 0.9760 | 0.9447 | 1.0000 | 1.0000 | 0.9902 | 0.9822 |
| model | year | remark | Car | Duck | Face | Motorbike | Winebottle | mean |
| ------------------------------------------------------------ | ---- | --------------- | ------ | ------ | ------ | --------- | ---------- | ------ |
| [GMN](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#gmn) | 2018 | - | 0.6790 | 0.7670 | 0.9980 | 0.6920 | 0.8310 | 0.7934 |
| [PCA-GM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#pca-gm) | 2019 | - | 0.8760 | 0.8360 | 1.0000 | 0.7760 | 0.8840 | 0.8744 |
| [NGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2019 | - | 0.8420 | 0.7760 | 0.9940 | 0.7680 | 0.8830 | 0.8530 |
| [NHGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2019 | - | 0.8650 | 0.7220 | 0.9990 | 0.7930 | 0.8940 | 0.8550 |
| [NMGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2019 | - | 0.7850 | 0.9210 | 1.0000 | 0.7870 | 0.9480 | 0.8880 |
| [IPCA-GM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#pca-gm) | 2020 | - | 0.9040 | 0.8860 | 1.0000 | 0.8300 | 0.8830 | 0.9006 |
| [CIE-H](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#cie-h) | 2020 | - | 0.8581 | 0.8206 | 0.9994 | 0.8836 | 0.8871 | 0.8898 |
| [BBGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#bbgm) | 2020 | - | 0.9680 | 0.8990 | 1.0000 | 0.9980 | 0.9940 | 0.9718 |
| [GANN-MGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#gann) | 2020 | self-supervised | 0.9600 | 0.9642 | 1.0000 | 1.0000 | 0.9879 | 0.9906 |
| [NGM-v2](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2021 | - | 0.9740 | 0.9340 | 1.0000 | 0.9860 | 0.9830 | 0.9754 |
| [NHGM-v2](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2021 | - | 0.9740 | 0.9390 | 1.0000 | 0.9860 | 0.9890 | 0.9780 |
| [NMGM-v2](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#ngm) | 2021 | - | 0.9760 | 0.9447 | 1.0000 | 1.0000 | 0.9902 | 0.9822 |

_ThinkMatch_ includes the following datasets with the provided benchmarks:

69 changes: 69 additions & 0 deletions experiments/vgg16_cie_willow.yaml
@@ -0,0 +1,69 @@
MODEL_NAME: vgg16_cie
DATASET_NAME: willow

DATASET_FULL_NAME: WillowObject

MODULE: models.CIE.model

BACKBONE: VGG16_bn

BATCH_SIZE: 8
DATALOADER_NUM: 2

RANDOM_SEED: 123

# available GPU ids
GPUS:
  - 0
#  - 1

# Problem configuration
PROBLEM:
  TYPE: 2GM
  RESCALE: # rescaled image size
    - 256
    - 256

# Graph construction settings
GRAPH:
  SRC_GRAPH_CONSTRUCT: tri
  TGT_GRAPH_CONSTRUCT: tri
  SYM_ADJACENCY: True

# Willow object class dataset configuration
WillowObject:
  TRAIN_NUM: 20 # number of images for training set
  SPLIT_OFFSET: 0 # the starting index of training set

# Training settings
TRAIN:
  # start, end epochs
  START_EPOCH: 0
  NUM_EPOCHS: 30

  LOSS_FUNC: hung

  # learning rate
  LR: 1.0e-4
  MOMENTUM: 0.9
  LR_DECAY: 0.2
  LR_STEP: # (in epochs)
    - 20

  EPOCH_ITERS: 200 # iterations per epoch

  CLASS: none

# Evaluation settings
EVAL:
  EPOCH: 23 # epoch to be tested
  SAMPLES: 1000 # number of tested pairs for each class

# CIE model parameters
CIE:
  FEATURE_CHANNEL: 512
  SK_ITER_NUM: 20
  SK_EPSILON: 1.0e-10
  SK_TAU: .005
  GNN_FEAT: 2048
  GNN_LAYER: 2
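
For context, the `SK_*` entries above configure the Sinkhorn (iterative row/column) normalization applied to the predicted matching scores: `SK_TAU` is the softmax temperature, `SK_ITER_NUM` the number of normalization rounds, and `SK_EPSILON` a guard against division by zero. Below is a minimal sketch of that operation; it is an illustration only, not the code in `models.CIE.model`, and the dense formulation and function name are assumptions.

```python
import torch

def sinkhorn(scores: torch.Tensor, n_iters: int = 20,
             tau: float = 0.005, epsilon: float = 1e-10) -> torch.Tensor:
    """Normalize an n1 x n2 score matrix towards a doubly-stochastic matrix."""
    # Subtracting the max before exp() only rescales by a constant factor,
    # but keeps exp(score / tau) from overflowing for small tau.
    s = torch.exp((scores - scores.max()) / tau)
    for _ in range(n_iters):
        s = s / (s.sum(dim=1, keepdim=True) + epsilon)  # normalize rows
        s = s / (s.sum(dim=0, keepdim=True) + epsilon)  # normalize columns
    return s

# With the SK_* values from this config, a random score matrix is pushed
# towards a near-permutation, approximately doubly-stochastic matrix.
m = sinkhorn(torch.rand(5, 5, dtype=torch.float64), n_iters=20, tau=0.005, epsilon=1e-10)
print(m.sum(dim=0), m.sum(dim=1))  # both close to all-ones
```
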
4 changes: 2 additions & 2 deletions models/BBGM/README.md
@@ -18,7 +18,7 @@ pretrained model: [google drive](https://drive.google.com/file/d/1RxC7daviZf3kz2

| model | year | aero | bike | bird | boat | bottle | bus | car | cat | chair | cow | table | dog | horse | mbike | person | plant | sheep | sofa | train | tv | mean |
| ---------------------- | ---- | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| [BBGM](/models/BBGM) | 2020 | 0.6187 | 0.7106 | 0.7969 | 0.7896 | 0.8740 | 0.9401 | 0.8947 | 0.8022 | 0.5676 | 0.7914 | 0.6458 | 0.7892 | 0.7615 | 0.7512 | 0.6519 | 0.9818 | 0.7729 | 0.7701 | 0.9494 | 0.9393 | 0.7899 |
| [BBGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#bbgm) | 2020 | 0.6187 | 0.7106 | 0.7969 | 0.7896 | 0.8740 | 0.9401 | 0.8947 | 0.8022 | 0.5676 | 0.7914 | 0.6458 | 0.7892 | 0.7615 | 0.7512 | 0.6519 | 0.9818 | 0.7729 | 0.7701 | 0.9494 | 0.9393 | 0.7899 |

### Willow Object Class - 2GM & MGM

@@ -28,7 +28,7 @@ pretrained model: [google drive](https://drive.google.com/file/d/1bt8wBeimM0ofm3

| model | year | remark | Car | Duck | Face | Motorbike | Winebottle | mean |
| ------------------------ | ---- | --------------- | ------ | ------ | ------ | --------- | ---------- | ------ |
| [BBGM](/models/BBGM) | 2020 | - | 0.9680 | 0.8990 | 1.0000 | 0.9980 | 0.9940 | 0.9718 |
| [BBGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#bbgm) | 2020 | - | 0.9680 | 0.8990 | 1.0000 | 0.9980 | 0.9940 | 0.9718 |

## File Organization
```
13 changes: 12 additions & 1 deletion models/CIE/README.md
@@ -18,7 +18,18 @@ pretrained model: [google drive](https://drive.google.com/file/d/1oRwcnw06t1rCbr

| model | year | aero | bike | bird | boat | bottle | bus | car | cat | chair | cow | table | dog | horse | mbike | person | plant | sheep | sofa | train | tv | mean |
| --------------------- | ---- | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| [CIE-H](/models/CIE) | 2020 | 0.5250 | 0.6858 | 0.7015 | 0.5706 | 0.8207 | 0.7700 | 0.7073 | 0.7313 | 0.4383 | 0.6994 | 0.6237 | 0.7018 | 0.7031 | 0.6641 | 0.4763 | 0.8525 | 0.7172 | 0.6400 | 0.8385 | 0.9168 | 0.6892 |
| [CIE-H](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#cie-h) | 2020 | 0.5250 | 0.6858 | 0.7015 | 0.5706 | 0.8207 | 0.7700 | 0.7073 | 0.7313 | 0.4383 | 0.6994 | 0.6237 | 0.7018 | 0.7031 | 0.6641 | 0.4763 | 0.8525 | 0.7172 | 0.6400 | 0.8385 | 0.9168 | 0.6892 |

### Willow Object Class - 2GM

experiment config: ``experiments/vgg16_cie_willow.yaml``

pretrained model: [google drive](https://drive.google.com/file/d/1aUdNTWlFxk-sf-bj08ADUoo9CSIQjzDb/view?usp=sharing)

| model | year | remark | Car | Duck | Face | Motorbike | Winebottle | mean |
| ------------------------ | ---- | --------------- | ------ | ------ | ------ | --------- | ---------- | ------ |
| [CIE-H](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#cie-h) | 2020 | - | 0.8581 | 0.8206 | 0.9994 | 0.8836 | 0.8871 | 0.8898 |
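
The experiment config referenced above can be inspected directly before training or evaluation. A minimal sketch using PyYAML (an assumption for illustration; ThinkMatch may load configs through its own utilities, so this is only a convenient way to read the file):

```python
# Requires PyYAML (`pip install pyyaml`). Reads the CIE Willow config and
# prints the settings most relevant to reproducing the table above.
import yaml

with open("experiments/vgg16_cie_willow.yaml") as f:
    cfg = yaml.safe_load(f)

print(cfg["MODEL_NAME"], cfg["DATASET_NAME"])        # vgg16_cie willow
print("eval epoch:", cfg["EVAL"]["EPOCH"])           # checkpoint epoch to test
print("Sinkhorn iters:", cfg["CIE"]["SK_ITER_NUM"])  # 20
```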


## File Organization
```
2 changes: 1 addition & 1 deletion models/GANN/README.md
@@ -24,7 +24,7 @@ pretrained model: [google drive](https://drive.google.com/file/d/15Sg6mi9nrpsD4M

| model | year | remark | Car | Duck | Face | Motorbike | Winebottle | mean |
| ------------------------ | ---- | --------------- | ------ | ------ | ------ | --------- | ---------- | ------ |
| [GANN-MGM](/models/GANN) | 2020 | self-supervised | 0.9600 | 0.9642 | 1.0000 | 1.0000 | 0.9879 | 0.9906 |
| [GANN-MGM](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#gann) | 2020 | self-supervised | 0.9600 | 0.9642 | 1.0000 | 1.0000 | 0.9879 | 0.9906 |

## File Organization
```
6 changes: 3 additions & 3 deletions models/GMN/README.md
@@ -19,16 +19,16 @@ pretrained model: [google drive](https://drive.google.com/file/d/1X8p4XjzqGDniYi

| model | year | aero | bike | bird | boat | bottle | bus | car | cat | chair | cow | table | dog | horse | mbike | person | plant | sheep | sofa | train | tv | mean |
| ---------------------- | ---- | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| [GMN](/models/GMN) | 2018 | 0.4163 | 0.5964 | 0.6027 | 0.4795 | 0.7918 | 0.7020 | 0.6735 | 0.6488 | 0.3924 | 0.6128 | 0.6693 | 0.5976 | 0.6106 | 0.5975 | 0.3721 | 0.7818 | 0.6800 | 0.4993 | 0.8421 | 0.9141 | 0.6240 |
| [GMN](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#gmn) | 2018 | 0.4163 | 0.5964 | 0.6027 | 0.4795 | 0.7918 | 0.7020 | 0.6735 | 0.6488 | 0.3924 | 0.6128 | 0.6693 | 0.5976 | 0.6106 | 0.5975 | 0.3721 | 0.7818 | 0.6800 | 0.4993 | 0.8421 | 0.9141 | 0.6240 |

### Willow Object Class - 2GM & MGM
### Willow Object Class - 2GM
experiment config: ``experiments/vgg16_gmn_willow.yaml``

pretrained model: [google drive](https://drive.google.com/file/d/1PWM1i0oywH3hrwPdYerPazRmhApC0B4U/view?usp=sharing)

| model | year | remark | Car | Duck | Face | Motorbike | Winebottle | mean |
| ------------------------ | ---- | --------------- | ------ | ------ | ------ | --------- | ---------- | ------ |
| [GMN](/models/GMN) | 2018 | - | 0.6790 | 0.7670 | 0.9980 | 0.6920 | 0.8310 | 0.7934 |
| [GMN](https://thinkmatch.readthedocs.io/en/latest/guide/models.html#gmn) | 2018 | - | 0.6790 | 0.7670 | 0.9980 | 0.6920 | 0.8310 | 0.7934 |

## File Organization
```
