Added batch norm and activation before avg pooling (raghakot#31)
shelpuk authored and raghakot committed Feb 2, 2017
1 parent 5017ca9 commit 6abded5
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion resnet.py
@@ -200,10 +200,13 @@ def build(input_shape, num_outputs, block_fn, repetitions):
         # Last activation
         block = _bn_relu(block)
 
+        block_norm = BatchNormalization(mode=0, axis=CHANNEL_AXIS)(block)
+        block_output = Activation("relu")(block_norm)
+
         # Classifier block
         pool2 = AveragePooling2D(pool_size=(block._keras_shape[ROW_AXIS],
                                             block._keras_shape[COL_AXIS]),
-                                 strides=(1, 1))(block)
+                                 strides=(1, 1))(block_output)
         flatten1 = Flatten()(pool2)
         dense = Dense(output_dim=num_outputs, init="he_normal", activation="softmax")(flatten1)
 
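
For context, here is a minimal standalone sketch of the classifier head as it reads after this commit, written against the Keras 1.x functional API that the diff uses (BatchNormalization(mode=...), Dense(output_dim=..., init=...)). The input shape (8, 8, 64), num_outputs = 10, and the channels-last axis constants are illustrative assumptions, not taken from the repository; in resnet.py the feature map instead comes from the residual tower via _bn_relu, and ROW_AXIS, COL_AXIS, CHANNEL_AXIS are derived from the backend's dim ordering.

from keras.layers import (Input, BatchNormalization, Activation,
                          AveragePooling2D, Flatten, Dense)
from keras.models import Model

# Assumed channels-last ('tf') dim ordering; resnet.py derives these axis
# constants from the backend configuration instead of hard-coding them.
ROW_AXIS, COL_AXIS, CHANNEL_AXIS = 1, 2, 3
num_outputs = 10                      # hypothetical number of classes

# Stand-in for the residual tower's final feature map ("block" in the diff).
block = Input(shape=(8, 8, 64))

# Extra BN + ReLU introduced by this commit, applied before average pooling.
block_norm = BatchNormalization(mode=0, axis=CHANNEL_AXIS)(block)
block_output = Activation("relu")(block_norm)

# Classifier block: pool over the full spatial extent, flatten, then softmax.
pool2 = AveragePooling2D(pool_size=(block._keras_shape[ROW_AXIS],
                                    block._keras_shape[COL_AXIS]),
                         strides=(1, 1))(block_output)
flatten1 = Flatten()(pool2)
dense = Dense(output_dim=num_outputs, init="he_normal", activation="softmax")(flatten1)

model = Model(input=block, output=dense)
model.summary()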
