major cleanup
Abhay Gupta committed Oct 1, 2021
1 parent a772212 commit 3fb85bf
Showing 20 changed files with 176 additions and 2,169 deletions.
60 changes: 5 additions & 55 deletions README.md
@@ -6,61 +6,11 @@ Implementation of [Vision Transformer](https://openreview.net/forum?id=YicbFdNTT

## Features

- [x] Vanilla ViT
- [x] Hybrid ViT (with support for BiTResNets as backbone)
- [x] Hybrid ViT (with support for AxialResNets as backbone)
- [x] Training Scripts

To Do:

- [ ] Training Script
  - [ ] Support for linear decay
  - [ ] Correct hyperparameters
- [ ] Full Axial-ViT
- [ ] Results for Imagenet-1K and Imagenet-21K
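One of the to-do items above is support for linear learning-rate decay. A minimal sketch of such a schedule (the function name and the optional warmup parameter are our assumptions, not this repo's code):

```python
def linear_decay_factor(step, total_steps, warmup_steps=0):
    """Return an LR multiplier: optional linear warmup from 0 to 1,
    then linear decay from 1.0 down to 0.0 at total_steps."""
    if warmup_steps and step < warmup_steps:
        return step / warmup_steps  # linear warmup ramp
    remaining = total_steps - step
    span = total_steps - warmup_steps
    return max(0.0, remaining / span)  # clamp at 0 past the end
```

In PyTorch, a multiplier like this can be plugged into `torch.optim.lr_scheduler.LambdaLR` via its `lr_lambda` argument; whether this repo would wire it up that way is an assumption.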

## Installation

Create the environment:

```bash
conda env create -f environment.yml
```

Preparing the dataset:

```bash
mkdir data
cd data
ln -s path/to/dataset imagenet
```
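Before launching training, it can save a wasted run to confirm that the `data/imagenet` symlink actually resolves to a dataset laid out with one subdirectory per class (the layout `torchvision.datasets.ImageFolder` expects). This helper is an illustrative sketch — the function name and the layout assumption are ours, not part of the repo:

```python
from pathlib import Path

def check_imagenet_root(root="data/imagenet"):
    """Return the sorted class subdirectory names under root,
    failing loudly if the symlink is missing or dangling."""
    p = Path(root)
    if not p.exists():
        raise FileNotFoundError(f"{root} is missing or a dangling symlink")
    classes = sorted(d.name for d in p.iterdir() if d.is_dir())
    if not classes:
        raise RuntimeError(f"no class directories found under {root}")
    return classes
```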

## Running the Scripts

For *non-distributed training*:

```bash
python train.py --model ViT --name vit_logs
```

For *distributed training*:

```bash
CUDA_VISIBLE_DEVICES=0,1,2,3 python dist_train.py --model ViT --name vit_dist_logs
```

For *testing* add the `--test` parameter:

```bash
python train.py --model ViT --name vit_logs --test
CUDA_VISIBLE_DEVICES=0,1,2,3 python dist_train.py --model ViT --name vit_dist_logs --test
```

## References

1. BiTResNet: https://github.com/google-research/big_transfer/tree/master/bit_pytorch
2. AxialResNet: https://github.com/csrhddlam/axial-deeplab
3. Training Scripts: https://github.com/csrhddlam/axial-deeplab

New feature list introduced by this commit:

- [x] ViT
- [x] ViT with convolutional patches
- [x] ViT with convolutional stems
- [x] Early Convolutional Stem
- [x] Scaled ReLU Stem
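The list distinguishes plain ViT from variants with convolutional patches/stems. In vanilla ViT, the input image is split into non-overlapping patches that are flattened into token vectors; convolutional-stem variants instead feed the image through a small stack of strided convolutions before the transformer. A minimal NumPy sketch of the vanilla patch split (illustrative only, not this repo's code):

```python
import numpy as np

def patchify(img, patch):
    """Split an (H, W, C) image into non-overlapping flattened patches
    of shape (num_patches, patch*patch*C), as in vanilla ViT."""
    h, w, c = img.shape
    assert h % patch == 0 and w % patch == 0, "image must tile evenly"
    x = img.reshape(h // patch, patch, w // patch, patch, c)
    x = x.transpose(0, 2, 1, 3, 4)           # (H/p, W/p, p, p, C)
    return x.reshape(-1, patch * patch * c)  # one row per patch
```

Each row would then be linearly projected to the transformer's embedding dimension; the convolutional variants replace this reshape-and-project step entirely.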

## Citations
