Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Conv2DTranspose ambiguous output shape

See original GitHub issue

Currently, Conv2DTranspose infers the shape of the output using deconv_length, but because the output shape of a transposed convolution is ambiguous, it can infer an undesired shape.

Example:

from keras.layers import Input, Conv2D, Conv2DTranspose
from keras.backend import int_shape

conv = Conv2D(16, 3, strides=2, padding='same')
transpose_conv = Conv2DTranspose(1, 3, strides=2, padding='same')

input_a = Input(shape=(23, 23, 1))
x_conv_a = conv(input_a)
x_transpose_a = transpose_conv(x_conv_a)

print("(a) Input shape: {}".format(int_shape(input_a)))
print("(a) Shape after convolution: {}".format(int_shape(x_conv_a)))
print("(a) Shape after transposed convolution: {}".format(int_shape(x_transpose_a)))
print()

input_b = Input(shape=(24, 24, 1))
x_conv_b = conv(input_b)
x_transpose_b = transpose_conv(x_conv_b)

print("(b) Input shape: {}".format(int_shape(input_b)))
print("(b) Shape after convolution: {}".format(int_shape(x_conv_b)))
print("(b) Shape after transposed convolution: {}".format(int_shape(x_transpose_b)))

The output:

(a) Input shape: (None, 23, 23, 1)
(a) Shape after convolution: (None, 12, 12, 16)
(a) Shape after transposed convolution: (None, 24, 24, 1)

(b) Input shape: (None, 24, 24, 1)
(b) Shape after convolution: (None, 12, 12, 16)
(b) Shape after transposed convolution: (None, 24, 24, 1)

From an input of shape (None, 12, 12, 16), a transposed convolution with these settings can produce either (None, 24, 24, 1) or (None, 23, 23, 1). Conv2DTranspose always infers (None, 24, 24, 1).
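The ambiguity can be reproduced with plain arithmetic, no Keras required. This is a minimal sketch of the size formulas for 'same' padding; the helper names conv_same_out and conv_transpose_same_out are illustrative, not Keras API:

```python
import math

def conv_same_out(size, stride):
    # With padding='same', a strided convolution outputs ceil(input / stride),
    # so several input sizes collapse onto the same feature-map size.
    return math.ceil(size / stride)

def conv_transpose_same_out(size, stride):
    # With padding='same', Conv2DTranspose infers output = input * stride,
    # which always picks the larger of the candidate inverse sizes.
    return size * stride

print(conv_same_out(23, 2))             # -> 12
print(conv_same_out(24, 2))             # -> 12
print(conv_transpose_same_out(12, 2))   # -> 24, never 23
```

Since both 23 and 24 map forward to 12, the inverse mapping from 12 is genuinely ambiguous, and the inference rule alone cannot recover 23.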

Shouldn’t the user have to supply the output_shape (like in Tensorflow) or an output padding (like in PyTorch) to resolve the ambiguity?

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Reactions: 7
  • Comments: 13 (4 by maintainers)

Top GitHub Comments

3 reactions
davidtvs commented, Jun 21, 2018

PR #10246 has been merged. The shape of the output can now be controlled through a new optional argument (output_padding).
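The effect of the new argument can be illustrated with the standard transposed-convolution size formula. This is a sketch only; transpose_out is a hypothetical helper mirroring the arithmetic, not the Keras implementation:

```python
def transpose_out(size, stride, kernel, output_padding):
    # Standard transposed-convolution output size with 'same' padding
    # (pad = kernel // 2 for an odd kernel):
    #   out = (in - 1) * stride + kernel - 2 * pad + output_padding
    pad = kernel // 2
    return (size - 1) * stride + kernel - 2 * pad + output_padding

print(transpose_out(12, 2, 3, output_padding=0))  # -> 23
print(transpose_out(12, 2, 3, output_padding=1))  # -> 24
```

Choosing output_padding=0 or 1 selects between the two valid output sizes, which is exactly the ambiguity from the example above.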

2 reactions
emiljoha commented, May 16, 2018

@davidtvs After further testing, not specifying output_shape and letting it default to None causes a TypeError when output_shape is cast to a tuple:

self._output_shape = tuple(output_shape)

This is easily fixed by checking for that case and casting to a tuple only when output_shape is not None.
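The guard described above can be sketched as a standalone function (safe_tuple is a hypothetical name for illustration; in the actual code the expression would replace the failing assignment):

```python
def safe_tuple(output_shape):
    # Cast to tuple only when a shape was supplied; None stays None,
    # avoiding the TypeError raised by tuple(None).
    return tuple(output_shape) if output_shape is not None else None

print(safe_tuple(None))      # -> None
print(safe_tuple([24, 24]))  # -> (24, 24)
```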

Read more comments on GitHub >

Top Results From Across the Web

understanding output shape of keras Conv2DTranspose
The parameter output_padding is a way to resolve the ambiguity by choosing explicitly the output dimension. In my example, the minimum output ......
Read more >
How to calculate the output shape of conv2d_transpose?
Here is the correct formula for computing the size of the output with tf.layers.conv2d_transpose() : # Padding==Same: H = H1 * stride ...
Read more >
Understanding output shape of keras Conv2DTranspose
As a consequence, when applying a Conv2DTranspose with kernel size of 3x3, stride of 7x7 and padding "same", there is an ambiguity about...
Read more >
Conv2DTranspose layer - Keras
If set to None (default), the output shape is inferred. data_format: A string, one of channels_last (default) or channels_first . The ordering of...
Read more >
ConvTranspose2d — PyTorch 1.13 documentation
However, when stride > 1 , Conv2d maps multiple input shapes to the same output shape. output_padding is provided to resolve this ambiguity...
Read more >