
Train CIFAR-10 and CIFAR-100 with PyTorch

Train deep networks with PyTorch on the CIFAR-10 and CIFAR-100 datasets.

Usage

Train ResNet56 on CIFAR-10

  • with batch size 128
  • for 182 epochs
  • with initial learning rate 0.1
  • and a piecewise constant learning rate decay function
  • with a decay factor of 0.1 (default) at epochs 91 and 136
  • using the first two GPUs
  • storing 10 state checkpoints
  • and printing a progress bar
# setup options
MODEL=resnet56
BATCH_SIZE=128
NUM_EPOCHS=182
NUM_CKPTS=10
LR=0.1
DECAY_POLICY=pconst
LR_MILESTONES="91 136"
export CUDA_VISIBLE_DEVICES=0,1

# run 
SCRIPT=main.py

python $SCRIPT \
  --model ${MODEL} \
  --batch_size ${BATCH_SIZE} \
  --num_epochs ${NUM_EPOCHS} \
  --num_ckpts ${NUM_CKPTS} \
  --progress_bar \
  --lr ${LR} \
  --lr_decay_policy ${DECAY_POLICY} \
  --lr_milestones ${LR_MILESTONES}
==> Preparing data..
Files already downloaded and verified
Files already downloaded and verified
==> Building resnet56 model..

Epoch: 0
 [==>........................... 19/391 ..............................]  Step: 1s392ms | Tot: 24s295ms | lr: 1.000e-01 | Loss: 2.173 | Acc: 17.393% (423/2432)
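The piecewise constant decay policy above keeps the learning rate fixed and multiplies it by the decay factor each time a milestone epoch is reached. A minimal sketch of that schedule in plain Python (the function name is illustrative, not part of main.py; assumes the default decay factor of 0.1):

```python
def piecewise_constant_lr(epoch, base_lr=0.1, milestones=(91, 136), decay=0.1):
    """Return the learning rate for a given epoch under piecewise
    constant decay: base_lr scaled by `decay` once per milestone
    that has already been passed."""
    passed = sum(1 for m in milestones if epoch >= m)
    return base_lr * decay ** passed

# Over the 182-epoch run this gives:
#   epochs   0-90  -> 0.1
#   epochs  91-135 -> 0.01
#   epochs 136-181 -> 0.001
```

In PyTorch this schedule corresponds to `torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[91, 136], gamma=0.1)`.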

Accuracy (as reported by @kuangliu)

Model              Acc.
VGG16              92.64%
ResNet18           93.02%
ResNet50           93.62%
ResNet101          93.75%
MobileNetV2        94.43%
ResNeXt29(32x4d)   94.73%
ResNeXt29(2x64d)   94.82%
DenseNet121        95.04%
PreActResNet18     95.11%
DPN92              95.16%
