
# AvgPool2d (kernel_size=1)


In `vgg.py` I found this line: `layers += [nn.AvgPool2d(kernel_size=1, stride=1)]`.

Do I understand correctly that `AvgPool2d` layers with `kernel_size=1` just return the input unchanged? Why do we need them?
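The questioner's reading can be checked directly. With a 1×1 window and stride 1, each output element is the average of exactly one input element, so the layer is an identity op. A minimal sketch:

```python
import torch
import torch.nn as nn

# A 1x1 average pool with stride 1: each "window" contains a single element,
# so the average equals the input value.
pool = nn.AvgPool2d(kernel_size=1, stride=1)

x = torch.randn(2, 3, 8, 8)
out = pool(x)

assert out.shape == x.shape
assert torch.equal(out, x)  # identity: output is exactly the input
```

The layer therefore adds no modelling capacity; it only costs a (cheap) extra pass over the tensor.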

### Issue Analytics

- Created: 3 years ago
- Reactions: 2

#### Top GitHub Comments

**minhlab** commented, Dec 28, 2020 (2 reactions)

No, that’s the flatten call in the forward() function.
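The "flatten call" the comment refers to is presumably the usual reshape between the convolutional features and the classifier. A hedged sketch of that pattern (the class and layer sizes here are illustrative, not the repo's actual `vgg.py`):

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Minimal VGG-style network illustrating the flatten step in forward()."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AvgPool2d(kernel_size=1, stride=1),  # the no-op layer under discussion
        )
        self.classifier = nn.Linear(8 * 32 * 32, 10)

    def forward(self, x):
        out = self.features(x)
        out = out.view(out.size(0), -1)  # the flatten: (N, C, H, W) -> (N, C*H*W)
        return self.classifier(out)

net = TinyNet()
y = net(torch.randn(2, 3, 32, 32))
assert y.shape == (2, 10)
```

The flatten is a plain `view` (or `torch.flatten`), not a pooling layer, which is why the `AvgPool2d(kernel_size=1)` line is separate from it.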

**Hrushikesh-github** commented, Feb 15, 2021 (0 reactions)

As per the original PyTorch implementation, there are no average-pooling layers present at this point. I think this is a mistake.

