Fixes for X.cntk -> X_ndl_deprecated.cntk config name changes
mahilleb-msft committed Aug 30, 2016
1 parent 9c8a882 commit 533c42b
Showing 9 changed files with 59 additions and 61 deletions.
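Since a rename like this is easy to miss in stray references, a quick grep can confirm that nothing still points at the old names. A minimal sketch, assuming a POSIX shell at the repository root; the patterns are a sample of the names touched by this commit, not an exhaustive list:

```
# Old names require a literal ".cntk" immediately after the base name, so
# files already renamed to *_ndl_deprecated.cntk do not match.
grep -rn \
  -e 'AddOperatorConstant\.cntk' \
  -e '01_OneHidden\.cntk' \
  -e '02_Convolution\.cntk' \
  -e 'LSTM-NDL\.cntk' \
  -e 'TIMIT_TrainAutoEncoder\.cntk' \
  .
```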
64 changes: 32 additions & 32 deletions CNTK.sln

Large diffs are not rendered by default.

10 changes: 4 additions & 6 deletions Examples/Evaluation/CSEvalClient/Program.cs
@@ -220,9 +220,9 @@ private static void EvaluateNetworkSingleLayer()
using (var model = new IEvaluateModelManagedF())
{
// Create the network
-// This network (AddOperatorConstant.cntk) is a simple network consisting of a single binary operator (Plus)
+// This network (AddOperatorConstant_ndl_deprecated.cntk) is a simple network consisting of a single binary operator (Plus)
// operating over a single input and a constant
-string networkFilePath = Path.Combine(workingDirectory, @"AddOperatorConstant.cntk");
+string networkFilePath = Path.Combine(workingDirectory, @"AddOperatorConstant_ndl_deprecated.cntk");
if (!File.Exists(networkFilePath))
{
Console.WriteLine("Error: The network configuration file {0} does not exist.", networkFilePath);
@@ -270,9 +270,9 @@ private static void EvaluateNetworkSingleLayerNoInput()
using (var model = new IEvaluateModelManagedF())
{
// Create the network
-// This network (AddOperatorConstantNoInput.cntk) is a simple network consisting of a single binary operator (Plus)
+// This network (AddOperatorConstantNoInput_ndl_deprecated.cntk) is a simple network consisting of a single binary operator (Plus)
// operating over two constants, therefore no input is necessary.
-string networkFilePath = Path.Combine(workingDirectory, @"AddOperatorConstantNoInput.cntk");
+string networkFilePath = Path.Combine(workingDirectory, @"AddOperatorConstantNoInput_ndl_deprecated.cntk");
if (!File.Exists(networkFilePath))
{
Console.WriteLine("Error: The network configuration file {0} does not exist.", networkFilePath);
@@ -318,8 +318,6 @@ private static void EvaluateExtendedNetworkSingleLayerNoInput()
using (var model = new ModelEvaluationExtendedF())
{
// Create the network
-// This network (AddOperatorConstantNoInput.cntk) is a simple network consisting of a single binary operator (Plus)
-// operating over a two constants, therefore no input is necessary.
model.CreateNetwork(modelDefinition);

VariableSchema outputSchema = model.GetOutputSchema();
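
Each sample above only reports a missing config at run time, so a quick pre-flight check from the shell can catch a stale name earlier. A minimal sketch, assuming the renamed configs sit in the sample's working directory (the file names are taken from the diff above):

```
# Verify the renamed network configs are present before launching CSEvalClient;
# print the name of any file that is missing.
for f in AddOperatorConstant_ndl_deprecated.cntk \
         AddOperatorConstantNoInput_ndl_deprecated.cntk; do
  [ -f "$f" ] || echo "missing: $f"
done
```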
12 changes: 6 additions & 6 deletions Examples/Image/MNIST/README.md
@@ -44,12 +44,12 @@ or prefix the call to the cntk executable with the corresponding folder.

Run the example from the Image/MNIST/Data folder using:

-`cntk configFile=../Config/01_OneHidden.cntk`
+`cntk configFile=../Config/01_OneHidden_ndl_deprecated.cntk`

or run from any folder and specify the Data folder as the `currentDirectory`,
e.g. running from the Image/MNIST folder using:

-`cntk configFile=Config/01_OneHidden.cntk currentDirectory=Data`
+`cntk configFile=Config/01_OneHidden_ndl_deprecated.cntk currentDirectory=Data`

The output folder will be created inside Image/MNIST/.

@@ -61,22 +61,22 @@ There are four config files and the corresponding network description files in t

1. 01_OneHidden.ndl is a simple network with one hidden layer that produces a 2.3% error rate.
To run the sample, navigate to the Data folder and run the following command:
-`cntk configFile=../Config/01_OneHidden.cntk`
+`cntk configFile=../Config/01_OneHidden_ndl_deprecated.cntk`

2. 02_Convolution.ndl is a more interesting convolutional network, which has 2 convolutional and 2 max-pooling layers.
The network produces a 0.87% error rate after training for about 2 minutes on a GPU.
To run the sample, navigate to the Data folder and run the following command:
-`cntk configFile=../Config/02_Convolution.cntk`
+`cntk configFile=../Config/02_Convolution_ndl_deprecated.cntk`

3. 03_ConvBatchNorm.ndl is almost identical to 02_Convolution.ndl
except that it uses batch normalization for the convolutional and fully connected layers.
As a result, it achieves around a 0.8% error rate after training for just 2 epochs (and less than 30 seconds).
To run the sample, navigate to the Data folder and run the following command:
-`cntk configFile=../Config/03_ConvBatchNorm.cntk`
+`cntk configFile=../Config/03_ConvBatchNorm_ndl_deprecated.cntk`

4. 04_DeConv.ndl illustrates the usage of Deconvolution and Unpooling. It is a network with one Convolution, one Pooling, one Unpooling, and one Deconvolution layer. In fact, it is an auto-encoder network in which the Rectified Linear Unit (ReLU) or Sigmoid layer is replaced with a Convolutional ReLU (for encoding) and a Deconvolutional ReLU (for decoding) layer. The goal of the network is to reconstruct the original signal, with Mean Squared Error (MSE) used to minimize the reconstruction error. Such networks are generally used in semantic segmentation.
To run the sample, navigate to the Data folder and run the following command:
-`cntk configFile=../Config/04_DeConv.cntk`
+`cntk configFile=../Config/04_DeConv_ndl_deprecated.cntk`

For more details, refer to .ndl and the corresponding .cntk files.

8 changes: 4 additions & 4 deletions Examples/Image/Miscellaneous/CIFAR-10/README.md
@@ -38,18 +38,18 @@ https://code.google.com/p/cuda-convnet/source/browse/trunk/example-layers/layers
The network produces a 20.5% error rate after training for about 3 minutes on a GPU.
To run the sample, navigate to the sample folder and run the following command:
```
-cntk configFile=01_Conv.cntk
+cntk configFile=01_Conv_ndl_deprecated.cntk
```
2. 02_BatchNormConv.ndl is a convolutional network that uses the batch normalization technique (http://arxiv.org/abs/1502.03167).
To run the sample, navigate to the sample folder and run the following command:
```
-cntk configFile=02_BatchNormConv.cntk
+cntk configFile=02_BatchNormConv_ndl_deprecated.cntk
```

3. 03_ResNet.ndl and 04_ResNet_56.ndl are very deep convolutional networks that use the ResNet architecture and have 20 and 56 layers, respectively (http://arxiv.org/abs/1512.03385).
-With 03_ResNet.cntk you should get around 8.2% of error after training for about 50 minutes. 04_ResNet_56.cntk should produce around 6.4% of error after training for about 3 hours (see log files in the Output directory).
+With 03_ResNet_ndl_deprecated.cntk you should get around 8.2% of error after training for about 50 minutes. 04_ResNet_56_ndl_deprecated.cntk should produce around 6.4% of error after training for about 3 hours (see log files in the Output directory).

-4. 05_ConvLocal.cntk uses locally-connected convolution layers (see `conv_local3` and `conv_local4` in `05_ConvLocal.cntk`) and resembles a network described here: https://code.google.com/p/cuda-convnet/source/browse/trunk/example-layers/layers-conv-local-11pct.cfg
+4. 05_ConvLocal_ndl_deprecated.cntk uses locally-connected convolution layers (see `conv_local3` and `conv_local4` in `05_ConvLocal_ndl_deprecated.cntk`) and resembles a network described here: https://code.google.com/p/cuda-convnet/source/browse/trunk/example-layers/layers-conv-local-11pct.cfg

5. 06_RegressionSimple.cntk shows how to train a regression model on image data. It uses a very simple network and a composite reader using both the ImageReader and CNTKTextFormatReader, and defines the RMSE (root mean square error) as the loss function. The values that the network learns to predict are simply the average RGB values of an image, normalized to [0, 1]. To generate the ground-truth labels for regression, you need to run the CifarConverter.py script (since this example was added later, you might need to rerun it to generate the regression files). See also here: https://github.com/Microsoft/CNTK/wiki/Train-a-regression-model-on-images

4 changes: 2 additions & 2 deletions Examples/Speech/AN4/README.md
@@ -16,7 +16,7 @@ See License.md in the root level folder of the CNTK repository for full license
|Purpose: |Showcase how to train feed forward and LSTM networks for speech data
|Network: |SimpleNetworkBuilder for 2-layer FF, NdlNetworkBuilder for 3-layer LSTM network
|Training: |Data-parallel 1-Bit SGD with automatic mini batch rescaling (FF)
-|Comments: |There are two config files: FeedForward.cntk and LSTM-NDL.cntk for FF and LSTM training respectively
+|Comments: |There are two config files: FeedForward.cntk and LSTM-NDL_ndl_deprecated.cntk for FF and LSTM training respectively

## Running the example

@@ -61,7 +61,7 @@ To run on CPU set `deviceId = -1`, to run on GPU set deviceId to "auto" or a spe

The FeedForward.cntk file uses the SimpleNetworkBuilder to create a 2-layer
feed forward network with sigmoid nodes and a softmax layer.
-The LSTM-NDL.cntk file uses the NdlNetworkBuilder and refers to the lstmp-3layer-opt.ndl file.
+The LSTM-NDL_ndl_deprecated.cntk file uses the NdlNetworkBuilder and refers to the lstmp-3layer-opt.ndl file.
In the ndl file an LSTM component is defined and used to create a 3-layer LSTM network with a softmax layer.
Both configurations define and execute only a single training task.
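
As a minimal sketch (the actual command block is truncated in this diff view), either configuration can be launched from the AN4 Data folder, with `deviceId` set as described above; paths follow the Examples/Speech/AN4 layout:

```
# Run from Examples/Speech/AN4/Data: deviceId=-1 forces CPU, "auto" picks a GPU.
cntk configFile=../Config/FeedForward.cntk deviceId=-1
cntk configFile=../Config/LSTM-NDL_ndl_deprecated.cntk deviceId=auto
```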

12 changes: 6 additions & 6 deletions Examples/Speech/Miscellaneous/TIMIT/config/README.txt
@@ -22,13 +22,13 @@ Path Definitions:
Network Training Examples:
==========================
* TIMIT_TrainSimpleNetwork.cntk [train basic feedforward fully connected neural network]
-* TIMIT_TrainNDLNetwork.cntk [train a neural network defined using NDL]
+* TIMIT_TrainNDLNetwork_ndl_deprecated.cntk [train a neural network defined using NDL]
* TIMIT_AdaptLearnRate.cntk [similar to simple network example, but learning rate adapted based on dev set]
-* TIMIT_TrainAutoEncoder.cntk [train autoencoder with bottleneck layer]
-* TIMIT_TrainWithPreTrain.cntk [pre-train using layerwise discriminative pre-training, then do full network training]
-* TIMIT_TrainMultiTask.cntk [train with multi-task learning with joint prediction of senone labels and dialect region]
-* TIMIT_TrainMultiInput.cntk [train with 2 different inputs: fbank and mfcc]
-* TIMIT_TrainLSTM.cntk [train single layer LSTM network]
+* TIMIT_TrainAutoEncoder_ndl_deprecated.cntk [train autoencoder with bottleneck layer]
+* TIMIT_TrainWithPreTrain_ndl_deprecated.cntk [pre-train using layerwise discriminative pre-training, then do full network training]
+* TIMIT_TrainMultiTask_ndl_deprecated.cntk [train with multi-task learning with joint prediction of senone labels and dialect region]
+* TIMIT_TrainMultiInput_ndl_deprecated.cntk [train with 2 different inputs: fbank and mfcc]
+* TIMIT_TrainLSTM_ndl_deprecated.cntk [train single layer LSTM network]

Network Evaluation Examples:
============================
@@ -6,7 +6,7 @@
. $TEST_DIR/../run-timit-test-common

# Train:
cntkrun TIMIT_TrainAutoEncoder.cntk "$CntkArguments TIMIT_TrainAutoEncoder=[reader=[useMersenneTwisterRand=true]]" || exit $?
cntkrun TIMIT_TrainAutoEncoder_ndl_deprecated.cntk "$CntkArguments TIMIT_TrainAutoEncoder=[reader=[useMersenneTwisterRand=true]]" || exit $?

# Copy the test data to the test run directory, so that we do not damage anything
DataDir=$TEST_RUN_DIR/TestData
@@ -15,7 +15,7 @@ if [ $? != 0 ]; then
fi

# Train:
cntkrun TIMIT_TrainAutoEncoder.cntk "TIMIT_TrainAutoEncoder=[reader=[readerType=HTKDeserializers]] $CntkArguments" || exit $?
cntkrun TIMIT_TrainAutoEncoder_ndl_deprecated.cntk "TIMIT_TrainAutoEncoder=[reader=[readerType=HTKDeserializers]] $CntkArguments" || exit $?

# Copy the test data to the test run directory, so that we do not damage anything
DataDir=$TEST_RUN_DIR/TestData
6 changes: 3 additions & 3 deletions Tests/EndToEndTests/Speech/README_Windows_Debug_commands.txt
@@ -39,7 +39,7 @@ COMMAND: --cd $(SolutionDir)Tests\EndToEndTests\Speech\Data -f $(SolutionDi

--- Speech\AN4:

-COMMAND: configFile=$(SolutionDir)Examples\Speech\AN4\Config\LSTM-NDL.cntk currentDirectory=$(SolutionDir)Examples\Speech\AN4\Data RunDir=$(SolutionDir)Examples\RunDir\Speech\AN4 DataDir=$(SolutionDir)Examples\Speech\AN4\Data ConfigDir=$(SolutionDir)Examples\Speech\AN4\Config OutputDir=$(SolutionDir)Examples\RunDir\Speech\AN4 stderr=$(SolutionDir)Examples\RunDir\Speech\AN4\cntkSpeech.dnn.log DeviceId=auto speechTrain=[SGD=[maxEpochs=1]] speechTrain=[SGD=[epochSize=64]] parallelTrain=false makeMode=false
+COMMAND: configFile=$(SolutionDir)Examples\Speech\AN4\Config\LSTM-NDL_ndl_deprecated.cntk currentDirectory=$(SolutionDir)Examples\Speech\AN4\Data RunDir=$(SolutionDir)Examples\RunDir\Speech\AN4 DataDir=$(SolutionDir)Examples\Speech\AN4\Data ConfigDir=$(SolutionDir)Examples\Speech\AN4\Config OutputDir=$(SolutionDir)Examples\RunDir\Speech\AN4 stderr=$(SolutionDir)Examples\RunDir\Speech\AN4\cntkSpeech.dnn.log DeviceId=auto speechTrain=[SGD=[maxEpochs=1]] speechTrain=[SGD=[epochSize=64]] parallelTrain=false makeMode=false

--- Speech\DiscriminativePreTraining: --currently fails with MEL error 'Parameter name could not be resolved 'HL2.y'

@@ -52,9 +52,9 @@ COMMAND: currentDirectory=\\storage.ccp.philly.selfhost.corp.microsoft.com\pu

--- MNIST:

-COMMAND: configFile=$(SolutionDir)Examples/Image/MNIST/Config/01_OneHidden.cntk currentDirectory=$(SolutionDir)Tests/EndToEndTests/Image/Data RunDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_01_OneHidden DataDir=$(SolutionDir)Tests/EndToEndTests/Image/Data ConfigDir=$(SolutionDir)Examples/Image/MNIST/Config OutputDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_01_OneHidden DeviceId=0 train=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Train.txt]] test=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Test.txt]] train=[SGD=[maxEpochs=1]] train=[SGD=[epochSize=100]] train=[reader=[randomize=none]] imageLayout="cudnn" makeMode=false
+COMMAND: configFile=$(SolutionDir)Examples/Image/MNIST/Config/01_OneHidden_ndl_deprecated.cntk currentDirectory=$(SolutionDir)Tests/EndToEndTests/Image/Data RunDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_01_OneHidden DataDir=$(SolutionDir)Tests/EndToEndTests/Image/Data ConfigDir=$(SolutionDir)Examples/Image/MNIST/Config OutputDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_01_OneHidden DeviceId=0 train=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Train.txt]] test=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Test.txt]] train=[SGD=[maxEpochs=1]] train=[SGD=[epochSize=100]] train=[reader=[randomize=none]] imageLayout="cudnn" makeMode=false

-COMMAND: configFile=$(SolutionDir)Examples/Image/MNIST/Config/02_Convolution.cntk currentDirectory=$(SolutionDir)Tests/EndToEndTests/Image/Data RunDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_02_Convolution DataDir=$(SolutionDir)Tests/EndToEndTests/Image/Data ConfigDir=$(SolutionDir)Examples/Image/MNIST/Config OutputDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_02_Convolution DeviceId=0 train=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Train.txt]] test=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Test.txt]] train=[SGD=[maxEpochs=1]] train=[SGD=[epochSize=100]] train=[reader=[randomize=none]] imageLayout="cudnn" makeMode=false
+COMMAND: configFile=$(SolutionDir)Examples/Image/MNIST/Config/02_Convolution_ndl_deprecated.cntk currentDirectory=$(SolutionDir)Tests/EndToEndTests/Image/Data RunDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_02_Convolution DataDir=$(SolutionDir)Tests/EndToEndTests/Image/Data ConfigDir=$(SolutionDir)Examples/Image/MNIST/Config OutputDir=$(SolutionDir)Tests/EndToEndTests/RunDir/Image/MNIST_02_Convolution DeviceId=0 train=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Train.txt]] test=[reader=[file=$(SolutionDir)Tests/EndToEndTests/Image/Data/Test.txt]] train=[SGD=[maxEpochs=1]] train=[SGD=[epochSize=100]] train=[reader=[randomize=none]] imageLayout="cudnn" makeMode=false

TODO out-of-date:
COMMAND: currentDirectory=$(SolutionDir)ExampleSetups\Image\MNIST configFile=02_Conv.cntk configName=02_Conv
