This repository was archived by the owner on Nov 25, 2021. It is now read-only.

Fixed padding in the first convolutional layer #21

Open

alex4men wants to merge 1 commit into udacity:master from alex4men:patch-1

Conversation


@alex4men alex4men commented Jun 3, 2019

As stated in the comment, the conv1 layer should have 'VALID' padding, but in the code (line 48) it had 'SAME' padding. That was confusing and led to a (?, 28, 28, 256) output shape for the conv2 layer, instead of (?, 27, 27, 256) as described in the original paper.

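For reference, here is a minimal sketch of the shape arithmetic (assuming TensorFlow 2.x eager execution and the standard AlexNet filter sizes and strides; the variable names are illustrative and this is not the repository's actual line 48 code). It shows how 'VALID' padding on conv1 leads to the (?, 27, 27, 256) conv2 shape from the paper:

```python
# Illustrative only: standard AlexNet hyperparameters assumed, not this repo's code.
import tensorflow as tf

x = tf.zeros([1, 227, 227, 3])           # dummy AlexNet-sized input batch

# conv1: 11x11, 96 filters, stride 4, 'VALID' padding (as the comment expects)
conv1_w = tf.zeros([11, 11, 3, 96])
conv1 = tf.nn.conv2d(x, conv1_w, strides=[1, 4, 4, 1], padding='VALID')
print(conv1.shape)                        # (1, 55, 55, 96)

# overlapping max pooling: 3x3 window, stride 2
pool1 = tf.nn.max_pool2d(conv1, ksize=3, strides=2, padding='VALID')
print(pool1.shape)                        # (1, 27, 27, 96)

# conv2: 5x5, 256 filters, stride 1, 'SAME' padding
conv2_w = tf.zeros([5, 5, 96, 256])
conv2 = tf.nn.conv2d(pool1, conv2_w, strides=[1, 1, 1, 1], padding='SAME')
print(conv2.shape)                        # (1, 27, 27, 256)
```

With 'SAME' padding on conv1 the spatial size would instead be ceil(227/4) = 57, which pools down to 28 and carries through conv2 as the (?, 28, 28, 256) shape the comment reports.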
@alex4men alex4men changed the title from "Fixed padding in the first convolution layer" to "Fixed padding in the first convolutional layer" Jun 3, 2019