Question: maximum ndim discrepancy between CuPy and NumPy?
I noticed that in CuPy we set MAX_NDIM = 25:
https://github.com/cupy/cupy/blob/46e5ff0bdf401ad88f404cd7cb7d26dd62b061c0/cupy/core/_carray.pxd#L10
whereas NumPy has had it set to 32 for more than a decade:
https://github.com/numpy/numpy/blob/9563a3a63fd3dfb3b687fbabad134b5ded72bd46/numpy/core/include/numpy/ndarraytypes.h#L40
Is there a particular reason for picking 25? I was experimenting with some iterator stuff and realized I need to know the largest possible ndim at build time. I can work around it, so this is not critical at all; I'm just curious.
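For reference, the effective limit can be probed empirically from Python. This is a minimal sketch; the max_ndim helper below is hypothetical and not part of either library, and if CuPy's MAX_NDIM validation is missing (see the comments further down), the probe against CuPy may not raise at construction time and could over-report the limit.

```python
import numpy as np

def max_ndim(xp, limit=64):
    """Largest ndim for which xp.zeros((1,) * ndim) succeeds (hypothetical helper)."""
    last_ok = 0
    for ndim in range(1, limit + 1):
        try:
            xp.zeros((1,) * ndim)  # construct a trivial array with `ndim` axes
            last_ok = ndim
        except Exception:
            break
    return last_ok

print(max_ndim(np))  # 32 for the NumPy version linked above

# With CuPy installed, the same probe can be run against it:
# import cupy as cp
# print(max_ndim(cp))
```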
Issue Analytics
- Created: 3 years ago
- Comments: 7 (6 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Yeah, if I remember correctly. It just meant there was a well-used tensor library that had chosen that number and run with it for years without problems.
I talked with @okuta regarding this. It seems there's no strong reason why 25 was picked. I think changing it to 32 for NumPy compatibility is better (but we must verify that there's no significant performance degradation).
If I recall correctly, there was a MAX_NDIM validation in past versions of CuPy, but it seems to have been lost at some point. It should be added back too.
Update: @beam2d told me that it was picked after cutorch: https://github.com/torch/cutorch/blob/5e9d86cb982a6048d3077aeb0e0cee19847b4c08/lib/THC/THCTensorInfo.cuh#L10
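For illustration, the kind of ndim guard mentioned above could look roughly like the following Python-level sketch. The constant value, function name, and error message are assumptions for illustration only, not CuPy's actual implementation (the real check would live in the Cython/C++ layer, e.g. around _carray.pxd).

```python
MAX_NDIM = 32  # the NumPy-compatible limit proposed in the comment above

def check_ndim(shape):
    """Reject shapes whose dimensionality exceeds the supported maximum (illustrative)."""
    if len(shape) > MAX_NDIM:
        raise ValueError(
            f"maximum supported dimension for an ndarray is {MAX_NDIM}, "
            f"found {len(shape)}")

check_ndim((1,) * 32)    # fine
# check_ndim((1,) * 33)  # would raise ValueError
```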