Commit: add more description on original ResNet
Showing 1 changed file with 8 additions and 4 deletions.
@@ -1,20 +1,24 @@
# CNTK example: Original ImageNet ResNet
## Overview
This work is an implementation of ResNet in CNTK, strictly based on the original [ResNet paper](http://arxiv.org/abs/1512.03385). If you are interested in the original implementation of ResNet, follow [this link](https://github.com/KaimingHe/deep-residual-networks).
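For readers who want a quick picture of the architecture, the following is a minimal, framework-agnostic sketch of the bottleneck residual unit from that paper. It is not the CNTK NDL used in this example; `conv`, `batch_norm`, and `relu` are hypothetical placeholders for the framework's corresponding layers.

```python
def bottleneck_block(x, conv, batch_norm, relu, channels, stride=1):
    """Return ReLU(F(x) + shortcut(x)) with a 1x1 -> 3x3 -> 1x1 bottleneck."""
    # x is assumed to be in NCHW layout, so x.shape[1] is the channel count.
    shortcut = x
    # Projection shortcut when the spatial size or channel count changes.
    if stride != 1 or x.shape[1] != 4 * channels:
        shortcut = batch_norm(conv(x, 4 * channels, kernel=1, stride=stride))

    out = relu(batch_norm(conv(x, channels, kernel=1, stride=stride)))
    out = relu(batch_norm(conv(out, channels, kernel=3, stride=1)))
    out = batch_norm(conv(out, 4 * channels, kernel=1, stride=1))

    # The shortcut is added to the residual branch before the final ReLU.
    return relu(out + shortcut)
```

ResNet-50 stacks such blocks in groups of 3, 4, 6, and 3 (with 64, 128, 256, and 512 bottleneck channels), preceded by a 7x7 convolution and max pooling and followed by global average pooling and a 1000-way softmax.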
## Dataset
|Data:     |The ILSVRC2012 dataset (http://www.image-net.org/challenges/LSVRC/2012/) of images.
|:---------|:---
|Purpose   |This example demonstrates usage of the NDL (Network Description Language) to define networks similar to ResNet.
|Network   |NDLNetworkBuilder, deep convolutional residual networks (ResNet).
|Training  |Stochastic gradient descent with momentum.
## Details
The network configurations and experiment settings in this folder strictly follow those in the original [ResNet paper](http://arxiv.org/abs/1512.03385), without any extra optimization.
* `Weight Decay in Batch Normalization`: Weight decay is disabled for the batch normalization nodes. In our experiments, applying weight decay to all nodes slowed down convergence of the training curve.
* `Post Batch Normalization`: After training and before evaluation, the post batch normalization command is used to estimate the mean and variance of the batch normalization nodes, instead of using the running statistics accumulated during training. In our experiments, the statistics obtained this way are more robust. A conceptual sketch of both points follows this list.
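The sketch below illustrates the two points above in plain Python/NumPy rather than the CNTK configuration used by this example; the parameter-naming convention and the `is_batch_norm_param` helper are hypothetical.

```python
import numpy as np

def is_batch_norm_param(name):
    # Hypothetical naming convention: batch normalization scale/bias
    # parameters carry a ".bn." tag in their names.
    return ".bn." in name

def sgd_momentum_step(params, grads, velocity,
                      lr=0.1, momentum=0.9, weight_decay=1e-4):
    """One SGD-with-momentum update; L2 weight decay skips BN parameters."""
    for name, p in params.items():
        g = grads[name]
        if not is_batch_norm_param(name):
            g = g + weight_decay * p          # decay conv/linear weights only
        velocity[name] = momentum * velocity[name] - lr * g
        params[name] = p + velocity[name]

def post_batch_norm_statistics(bn_inputs_per_batch):
    """Re-estimate a BN node's mean and variance from the trained network's
    activations over training batches, instead of the running averages
    accumulated during training."""
    acts = np.concatenate(bn_inputs_per_batch, axis=0)
    return acts.mean(axis=0), acts.var(axis=0)
```

In this example both behaviors are driven by the CNTK configuration; the code above only illustrates the underlying idea.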
## Results
The following table contains the results obtained with these settings.

| Network   | Top-1 error (%) | Top-5 error (%) | Model |
| --------- | --------------- | --------------- | ----- |
| ResNet-50 | 24.58           | 7.43            |       |
## Notes
This work is an implementation of ResNets in CNTK. If you are interested in the original implementation of ResNet, follow [this link](https://github.com/KaimingHe/deep-residual-networks).