Remove redundancy codes

gaoyuan committed Mar 22, 2017
1 parent 57c355a commit 784e242
Showing 4 changed files with 11 additions and 7 deletions.
6 changes: 6 additions & 0 deletions doc/api/v2/config/layer.rst
@@ -109,6 +109,12 @@ sum_to_one_norm
     :members: sum_to_one_norm
     :noindex:
 
+cross_channel_norm
+------------------
+.. automodule:: paddle.v2.layer
+    :members: cross_channel_norm
+    :noindex:
+
 Recurrent Layers
 ================
 
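For orientation, here is a usage sketch of the layer this new doc section exposes. It assumes paddle.v2.layer.cross_channel_norm accepts the same (input, name, param_attr) arguments as the cross_channel_norm_layer helper it wraps (see layers.py below); the surrounding network is illustrative.

import paddle.v2 as paddle

# A small image pipeline feeding the newly documented layer.
image = paddle.layer.data(
    name="image", type=paddle.data_type.dense_vector(3 * 32 * 32))
conv = paddle.layer.img_conv(
    input=image, filter_size=3, num_channels=3, num_filters=64,
    act=paddle.activation.Relu())

# Normalize conv activations across channels at each spatial position;
# the layer learns one scale factor per channel.
normed = paddle.layer.cross_channel_norm(input=conv)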
2 changes: 0 additions & 2 deletions paddle/gserver/layers/CrossChannelNormLayer.cpp
@@ -40,7 +40,6 @@ void CrossChannelNormLayer::forward(PassType passType) {
   normBuffer_->addScalar(*normBuffer_, 1e-6);
   inV->square2(*dataBuffer_);
   for (size_t i = 0; i < batchSize; i++) {
-    spatialBuffer_->zeroMem();
     MatrixPtr inTmp = Matrix::create(
         inV->getData() + i * dataDim, channels_, spatialDim, false, useGpu_);
     MatrixPtr dataTmp = Matrix::create(dataBuffer_->getData() + i * dataDim,
@@ -80,7 +79,6 @@ void CrossChannelNormLayer::backward(const UpdateCallback& callback) {
   scaleDiff_->zeroMem();
   for (size_t i = 0; i < batchSize; i++) {
     spatialBuffer_->zeroMem();
-    channelBuffer_->zeroMem();
     // propagate to param.
     MatrixPtr dataBufferTmp =
         Matrix::create(dataBuffer_->getData() + i * dataDim,
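The deleted zeroMem() calls were redundant because the first use of each buffer inside the loop overwrites its contents rather than accumulating into them: Paddle's sumCols/sumRows reductions take a scaleDest factor, and with scaleDest = 0 the destination is replaced outright. A NumPy sketch of the distinction, with illustrative names and shapes:

import numpy as np

channels, spatial_dim = 4, 6
data_tmp = np.random.rand(channels, spatial_dim)  # stands in for dataBufferTmp

# Overwriting reduction (sumCols-style with scaleDest = 0): the destination's
# prior contents never matter, so zeroing it first is wasted work.
spatial_buffer = np.random.rand(spatial_dim)      # deliberately left "dirty"
spatial_buffer = data_tmp.sum(axis=0)             # full overwrite

# Accumulating reduction (scaleDest = 1): here prior zeroing IS required.
acc = np.zeros(spatial_dim)                       # the zeroMem() equivalent
acc += data_tmp.sum(axis=0)
acc += data_tmp.sum(axis=0)                       # running total across steps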
9 changes: 4 additions & 5 deletions paddle/gserver/layers/NormLayer.h
@@ -66,11 +66,10 @@ class ResponseNormLayer : public NormLayer {
 };
 
 /**
- * This layer applys normalize across the channels of each sample to a
- * conv layer's output and scale the output by a group of trainable factors
- * which dimensions equal to the channel's number.
- * - Input: One and only one input layer are accepted. The input layer must be
- * be a data output layer.
+ * This layer applies normalization across the channels of each sample to a
+ * conv layer's output, and scales the output by a group of trainable factors
+ * whose number equals the number of channels.
+ * - Input: One and only one input layer is accepted.
  * - Output: The normalized data of the input data.
  * Reference:
  * Wei Liu, Dragomir Anguelov, Dumitru Erhan, Christian Szegedy, Scott Reed,
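A minimal NumPy sketch of the normalization described above, for one sample laid out as (channels, spatial); the 1e-6 epsilon mirrors the addScalar(*normBuffer_, 1e-6) call in CrossChannelNormLayer::forward, and the helper name is ours:

import numpy as np

def cross_channel_norm(x, scale, eps=1e-6):
    # x: (channels, spatial) activations; scale: one trainable factor per channel.
    norm = np.sqrt((x * x).sum(axis=0) + eps)  # L2 norm across channels, per position
    return scale[:, None] * (x / norm)         # normalize, then rescale each channel

x = np.random.randn(64, 19 * 19)  # e.g. one flattened conv feature map
scale = np.ones(64)
y = cross_channel_norm(x, scale)
# With unit scales, every spatial position ends up with (almost) unit L2 norm:
assert np.allclose(np.sqrt((y * y).sum(axis=0)), 1.0, atol=1e-3)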
1 change: 1 addition & 0 deletions python/paddle/trainer_config_helpers/layers.py
@@ -1015,6 +1015,7 @@ def cross_channel_norm_layer(input, name=None, param_attr=None):
    This layer applies normalization across the channels of each sample of
    a conv layer's output, and scales the result by a group of trainable
    factors whose number equals the number of channels.

    :param name: The Layer Name.
    :type name: basestring
    :param input: The input layer.
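A usage sketch against the signature shown in this hunk, cross_channel_norm_layer(input, name=None, param_attr=None). The network around it and the ParamAttr initialization are illustrative assumptions; the constant 20 follows the per-channel scale initialization used in the SSD paper cited in NormLayer.h above.

from paddle.trainer_config_helpers import *

image = data_layer(name="image", size=3 * 32 * 32)
conv = img_conv_layer(
    input=image, filter_size=3, num_channels=3, num_filters=64,
    act=ReluActivation())

# One trainable scale per channel, initialized to a constant (SSD-style).
norm = cross_channel_norm_layer(
    input=conv,
    name="conv_cross_chan_norm",
    param_attr=ParamAttr(initial_mean=20.0, initial_std=0.0))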
