
Commit

Merge branch 'master' of https://git01.codeplex.com/cntk into dongyu/dev
Conflicts:
	.gitignore
	Documentation/CNTK-TechReport/lyx/CNTKBook_CNTK_Adv_Chapter.lyx
	Documentation/CNTK-TechReport/lyx/CNTKBook_CNTK_Programmer_Chapter.lyx
Dong Yu committed Jun 19, 2015
2 parents 60cab14 + d1720ef commit 8003d24
Showing 5 changed files with 144 additions and 13 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -176,3 +176,4 @@ MachineLearning/CNTK/buildinfo.h
MachineLearning/CNTK/buildinfo.h$$


+*.lyx#
22 changes: 11 additions & 11 deletions CheckInSuites/MNIST/mnistCheckIn.config
@@ -61,7 +61,7 @@ mnistTrainSimpleNet=[
dim=1
start=0
file=$DataFolder$\mnist_train.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -81,7 +81,7 @@ mnistTrainSimpleNet=[
dim=1
start=0
file=$DataFolder$\mnist_train.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -101,7 +101,7 @@ mnistTrainSimpleNet=[
dim=1
start=0
file=$DataFolder$\mnist_test.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -147,7 +147,7 @@ mnistTrainNDLNet=[
dim=1
start=0
file=$DataFolder$\mnist_train.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -167,7 +167,7 @@ mnistTrainNDLNet=[
dim=1
start=0
file=$DataFolder$\mnist_train.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -187,7 +187,7 @@ mnistTrainNDLNet=[
dim=1
start=0
file=$DataFolder$\mnist_test.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -231,7 +231,7 @@ mnistAdaptSimpleNet=[
dim=1
start=0
file=$DataFolder$\mnist_train.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -251,7 +251,7 @@ mnistAdaptSimpleNet=[
dim=1
start=0
file=$DataFolder$\mnist_train.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -271,7 +271,7 @@ mnistAdaptSimpleNet=[
dim=1
start=0
file=$DataFolder$\mnist_test.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -302,7 +302,7 @@ mnistSimpleNetTest=[
dim=1
start=0
file=$DataFolder$\mnist_test.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
@@ -337,7 +337,7 @@ mnistSimpleNetCV=[
dim=1
start=0
file=$DataFolder$\mnist_test.txt
-labelMappingFile=$ConfigFolder$\mnistlabels.txt
+labelMappingFile=$DataFolder$\mnistlabels.txt
labelDim=10
labelType=Category
]
Binary file not shown.
82 changes: 80 additions & 2 deletions Documentation/CNTK-TechReport/lyx/CNTKBook_CNTK_Adv_Chapter.lyx
@@ -692,14 +692,26 @@ EvalNodes=(ErrPredict)
OutputNodes=(Plus2)
\end_layout

\begin_layout Plain Layout

NodesReqMultiSeqHandling=(CE)
\end_layout

\end_inset

After defining the network, it’s important to let CNTK know what the special
nodes are in the network.
-For example, CNTK needs to know which input nodes are features and which
-are labels.
+For example, CNTK needs to know which input nodes are features and labels.
It also needs to know the default output nodes, evaluation nodes and training
criteria nodes.
Note here the specification of the nodes that require special handling
(NodesReqMultiSeqHandling) when the network is evaluated or trained with
multiple sequences, e.g., when the network itself is an RNN or the model
is trained with a sequence-level criterion.
In these cases multiple sequences are stitched together to improve speed,
so special handling is needed at, e.g., the criterion node, to reset the
RNN states at the right time and to mask out the samples whose
features/labels are missing.
CNTK supports multiple inputs and outputs, which can be represented by
comma-separated variable names surrounded by parentheses.
\end_layout
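For reference, the complete block of special-node declarations for this
example network might look as follows. This is a minimal sketch assembled
from the node names used in this chapter (CE, ErrPredict, Plus2); the
FeatureNodes/LabelNodes lines and the input names features and labels are
assumptions, since the hunk above shows only part of the block:

FeatureNodes=(features)
LabelNodes=(labels)
CriteriaNodes=(CE)
EvalNodes=(ErrPredict)
OutputNodes=(Plus2)
NodesReqMultiSeqHandling=(CE)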
@@ -1254,6 +1266,11 @@ CriteriaNodes=(CE)
EvalNodes=(ErrPredict)
\end_layout

\begin_layout Plain Layout

NodesReqMultiSeqHandling=(CE)
\end_layout

\end_inset

we can tag these nodes as they are defined.
@@ -1284,6 +1301,11 @@ CE = SMBFF(L3, LDim, HDim, labels, tag=Criteria)
Err=ErrorPrediction(labels, CE.F, tag=Eval)
\end_layout

\begin_layout Plain Layout

CE = SMBFF(L3, LDim, HDim, labels, tag=MultiSeq)
\end_layout

\end_inset


@@ -2144,6 +2166,47 @@ startRow - the start row to get a slice
numRows - the number of rows to get
\end_layout
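The hunk above shows only the tail of the parameter list for the row-slicing
node (its heading is cut off by the diff; presumably RowSlice). A hypothetical
invocation, assuming the argument order (startRow, numRows, input); only the
two parameter names are confirmed by the text:

# take rows 0..127 of the node named features (names and sizes illustrative)
slice = RowSlice(0, 128, features)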

\begin_layout Subsubsection
RowStack
\begin_inset Index idx
status open

\begin_layout Plain Layout
RowStack
\end_layout

\end_inset


\end_layout

\begin_layout Standard
Concatenate the rows of the input matrices to form a bigger matrix.
The resulting matrix has sum(m1.rows, m2.rows, ...) rows and m1.cols columns.
It supports variable-length input.
The syntax is
\end_layout

\begin_layout Standard
\begin_inset listings
inline false
status open

\begin_layout Plain Layout

RowStack(m1, m2, ...)
\end_layout

\end_inset


\end_layout

\begin_layout Itemize
m1, m2, ... - the input matrices; any number of them may be given (see the
usage sketch below).
\end_layout
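A usage sketch with hypothetical names and dimensions (r1, r2, stacked, and
the sizes are illustrative, not from the report):

# r1 is 3 x T and r2 is 5 x T; stacked is then (3+5) x T = 8 x T
stacked = RowStack(r1, r2)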

\begin_layout Subsubsection
Scale
\begin_inset Index idx
@@ -5521,6 +5584,21 @@ Output
=true|false: Set the node as one of the output nodes.
\end_layout

\begin_layout Itemize
MultiSeq
\begin_inset Index idx
status open

\begin_layout Plain Layout
MultiSeq
\end_layout

\end_inset

=true|false: Set the node as one of the nodes that require special handling
when multiple sequences are used in training or evaluation (see the example
below).
\end_layout
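For instance, the criterion node from the earlier example can be tagged this
way as it is defined; this repeats the line added in the hunk above:

CE = SMBFF(L3, LDim, HDim, labels, tag=MultiSeq)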

\begin_layout Subsubsection

\series bold
52 changes: 52 additions & 0 deletions Documentation/CNTK-TechReport/lyx/CNTKBook_CNTK_Programmer_Chapter.lyx
@@ -3675,6 +3675,58 @@ mType>& inputGradientValues, const Matrix<ElemType>& gradientValues)
which contains the actual gradient computation code.
\end_layout

\begin_layout Subsubsection
Customization of the Multi-Sequence Handling Code
\begin_inset Index idx
status open

\begin_layout Plain Layout
Customization of the Multi-Sequence Handling Code
\end_layout

\end_inset


\end_layout

\begin_layout Standard
If your node generates an output (function values) that has a different
number of columns (recall that each column is a sample) than the input
(e.g., all the criterion nodes generate a scalar value as the output),
you need to add the protected virtual function
\end_layout

\begin_layout Standard
\begin_inset listings
inline false
status open

\begin_layout Plain Layout

protected:
\end_layout

\begin_layout Plain Layout

virtual bool UseCustomizedMultiSeqHandling() { return true; }
\end_layout

\end_inset


\end_layout

\begin_layout Standard
This function indicates that you will use your own customized code to handle
the condition in which multiple sequences are used in each minibatch (e.g.,
when training an RNN).
You need to add MaskToZeroWhenLabelAndFeatureMissing calls at the appropriate
places in your code to mask out both the function values and the gradient
values when a segment of the minibatch does not have features/labels.
An example of the customized handling code can be found inside the
CrossEntropyWithSoftmaxNode (see the sketch below).
\end_layout
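To make the flow concrete, here is a minimal C++ sketch of how a custom
criterion node might combine this override with the masking calls. Only the
names UseCustomizedMultiSeqHandling, MaskToZeroWhenLabelAndFeatureMissing,
ComputeInputPartial, and CrossEntropyWithSoftmaxNode come from the text; the
class name MyCriterionNode, the base-class interface, and the exact signature
of the masking call are assumptions for illustration:

// Sketch only: everything except UseCustomizedMultiSeqHandling and the
// masking-call name is a hypothetical stand-in for the real CNTK interface.
template <class ElemType>
class MyCriterionNode : public ComputationNode<ElemType>
{
protected:
    // Opt out of the framework's default multi-sequence handling;
    // this node will mask missing segments itself.
    virtual bool UseCustomizedMultiSeqHandling() { return true; }

public:
    virtual void EvaluateThisNode()
    {
        // Zero the columns whose features/labels are missing before the
        // criterion is accumulated (assumed signature).
        MaskToZeroWhenLabelAndFeatureMissing(Inputs(1)->FunctionValues());
        // ... compute the criterion over the masked matrix ...
    }

    virtual void ComputeInputPartial(const size_t inputIndex)
    {
        // Apply the same masking to the gradients so that missing
        // segments contribute nothing to the parameter update.
        MaskToZeroWhenLabelAndFeatureMissing(Inputs(inputIndex)->GradientValues());
        // ... compute the gradient w.r.t. the selected input ...
    }
};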

\begin_layout Subsubsection
The CNTKMath Library
\begin_inset Index idx
