inconsistent behavior of keepdims for CUB-based reductions
It appears that keepdims does not keep all of the reduction dimensions when CUB is enabled and axis is a tuple containing more than one axis.
- Conditions (output of python -c 'import cupy; cupy.show_config()'):
CuPy Version : 7.0.0rc1 (built from the master branch)
CUDA Root : /usr/local/cuda
CUDA Build Version : 10010
CUDA Driver Version : 10010
CUDA Runtime Version : 10010
- Code to reproduce
import cupy

x = cupy.ones((64, 64))

# Reduce over both axes with the CUB path disabled.
cupy.cuda.cub_enabled = False
out = x.sum(axis=(0, 1), keepdims=True)
print("CUB DISABLED shape={}".format(out.shape))

# Same reduction with the CUB path enabled.
cupy.cuda.cub_enabled = True
out = x.sum(axis=(0, 1), keepdims=True)
print("CUB ENABLED shape={}".format(out.shape))
gives
CUB DISABLED shape=(1, 1)
CUB ENABLED shape=(1,)
cc @leofang
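Until a fix lands, a minimal user-side workaround sketch for the all-axes case shown above (it assumes only that a full reduction with keepdims=True should have one length-1 entry per input dimension, and uses the same public calls as the reproduction script):

import cupy

x = cupy.ones((64, 64))
out = x.sum(axis=(0, 1), keepdims=True)
# Force the shape that keepdims=True should produce for a reduction over
# all axes: one length-1 entry per input dimension. This is a no-op when
# the non-CUB path already returned (1, 1).
out = out.reshape((1,) * x.ndim)
print(out.shape)  # (1, 1) with or without CUB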
Top GitHub Comments
I think the origin of the problem was an oversight on my part in #2517. I enabled reduction over all axes there, but did not ensure that keepdims preserved all of the axes. I am working on a fix for this currently.
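For reference, the shape contract the fix needs to restore is the standard NumPy/CuPy keepdims rule: every reduced axis is kept with length 1 and every other axis keeps its extent. A small illustrative sketch of that rule (plain Python, not CuPy's internal reduction code; keepdims_shape is a hypothetical helper name):

def keepdims_shape(in_shape, axis):
    # Normalize a single axis or a tuple of (possibly negative) axes.
    axes = axis if isinstance(axis, tuple) else (axis,)
    axes = {a % len(in_shape) for a in axes}
    # Reduced axes collapse to length 1; the rest keep their extent.
    return tuple(1 if i in axes else n for i, n in enumerate(in_shape))

assert keepdims_shape((64, 64), (0, 1)) == (1, 1)    # the case in this issue
assert keepdims_shape((3, 4, 5), (0, 2)) == (1, 4, 1)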