Remove extra convolutions in TCN #471

Merged
merged 4 commits into unit8co:master from fix/Extra-convolutions-in-TCN on Sep 12, 2021

Conversation

vpozdnyakov
Contributor

Fixes #470. Adds the 1x1 convolution only when the number of channels differs between the input and the output of a residual block.
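
For illustration, here is a minimal, self-contained sketch of the idea (hypothetical code, not the darts implementation; class and attribute names are made up): the 1x1 convolution is only built and applied when the block's input and output channel counts differ, so that the skip connection can be added element-wise.

```python
import torch
import torch.nn as nn


class ResidualBlockSketch(nn.Module):
    """Hypothetical, simplified TCN residual block (not the darts source)."""

    def __init__(self, input_dim, output_dim, kernel_size=3, dilation=1):
        super().__init__()
        padding = (kernel_size - 1) * dilation
        self.conv1 = nn.Conv1d(input_dim, output_dim, kernel_size,
                               dilation=dilation, padding=padding)
        self.conv2 = nn.Conv1d(output_dim, output_dim, kernel_size,
                               dilation=dilation, padding=padding)
        # The gist of this PR: no extra 1x1 convolution when the channel
        # counts already match.
        if input_dim != output_dim:
            self.conv3 = nn.Conv1d(input_dim, output_dim, 1)

    def forward(self, x):
        residual = x
        length = x.size(-1)
        # Trim the right side so the dilated convolutions stay causal.
        x = torch.relu(self.conv1(x))[..., :length]
        x = self.conv2(x)[..., :length]
        # Match channel counts before the skip connection, only if needed.
        if hasattr(self, "conv3"):
            residual = self.conv3(residual)
        return x + residual


# Quick shape check with (batch, channels, time) input
block = ResidualBlockSketch(input_dim=1, output_dim=8)
print(block(torch.randn(2, 1, 20)).shape)  # torch.Size([2, 8, 20])
```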

@vpozdnyakov vpozdnyakov changed the title Fix extra convolutions Remove extra convolutions in TCN Sep 8, 2021
@@ -84,7 +84,7 @@ def __init__(self,
         if weight_norm:
             self.conv1, self.conv2 = nn.utils.weight_norm(self.conv1), nn.utils.weight_norm(self.conv2)

-        if nr_blocks_below == 0 or nr_blocks_below == num_layers - 1:
+        if nr_blocks_below in {0, num_layers - 1} and input_dim != output_dim:
Contributor


Actually do you think we could simplify this to

Suggested change
if nr_blocks_below in {0, num_layers - 1} and input_dim != output_dim:
if input_dim != output_dim:

Contributor Author


Agree! Looks super simple and clear

@@ -103,7 +103,8 @@ def forward(self, x):
         x = self.dropout_fn(x)

         # add residual
-        if self.nr_blocks_below in {0, self.num_layers - 1}:
+        if (self.nr_blocks_below in {0, self.num_layers - 1} and
Contributor


And also here, could we just check self.conv1.in_channels != self.conv2.out_channels?
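
For context, a small hypothetical snippet showing why that check works: conv1's in_channels and conv2's out_channels carry the block's input and output dims, so comparing them is equivalent to comparing input_dim and output_dim directly (the layer shapes below are made up for illustration).

```python
import torch.nn as nn

# Hypothetical layer shapes, mirroring a residual block whose input and
# output channel counts differ.
conv1 = nn.Conv1d(in_channels=1, out_channels=8, kernel_size=3)
conv2 = nn.Conv1d(in_channels=8, out_channels=8, kernel_size=3)

# True here (1 != 8), so the extra 1x1 convolution would be needed;
# if the counts matched, the skip connection could be added directly.
needs_1x1_conv = conv1.in_channels != conv2.out_channels
print(needs_1x1_conv)
```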

Contributor

@hrzn hrzn left a comment


Thanks!

@entropicbloom entropicbloom self-requested a review September 9, 2021 08:56
Contributor

@entropicbloom entropicbloom left a comment


Thanks for polishing our TCN!

@hrzn hrzn merged commit dca5f78 into unit8co:master Sep 12, 2021
@vpozdnyakov vpozdnyakov deleted the fix/Extra-convolutions-in-TCN branch September 12, 2021 15:23
Development

Successfully merging this pull request may close these issues.

[BUG] Extra convolutions in TCN
3 participants