einsum in numpy 1.15 is almost 30 times slower than numpy 1.14!
einsum is significantly slower in numpy 1.15 (compared to numpy 1.14). It could be related to #11686, but I am not sure. I am using Python 3.7.0 on Arch Linux.
I realize that 1.15 does bring some improvements to einsum in general, but for this specific code it is very slow!
Reproducing code example:
import numpy as np
import time

print(np.version.version)

# Small operands: the contraction itself is tiny, so per-call overhead dominates.
b = np.random.random((5, 2))
t = np.random.random((5, 5, 2))
p = np.random.random((2, 5))

t1 = time.time()
for _ in range(int(1e6)):
    out = np.einsum('ij,ixy,ji->xy', b, t, p)
t2 = time.time()
print(t2 - t1)
Output 1 (numpy 1.15.0):
1.15.0
129.36364936828613

Output 2 (numpy 1.14.5):
1.14.5
4.071631908416748
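As a point of comparison (this is not from the original report, and it is not a fix for the regression itself): if the per-call overhead comes from the path-optimization machinery, it can be worth probing with optimize=False or with a contraction path precomputed once via np.einsum_path and reused. A minimal sketch, using the public optimize keyword and np.einsum_path:

import numpy as np
import timeit

b = np.random.random((5, 2))
t = np.random.random((5, 5, 2))
p = np.random.random((2, 5))

# Precompute the contraction path once; np.einsum_path returns
# (path_list, human_readable_report).
path, report = np.einsum_path('ij,ixy,ji->xy', b, t, p, optimize='optimal')

n = int(1e5)
# Skip path optimization entirely on each call.
print(timeit.timeit(lambda: np.einsum('ij,ixy,ji->xy', b, t, p, optimize=False), number=n))
# Reuse the precomputed path instead of recomputing it per call.
print(timeit.timeit(lambda: np.einsum('ij,ixy,ji->xy', b, t, p, optimize=path), number=n))

Whether either variant recovers the 1.14 timings depends on where the 1.15 slowdown actually lives.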
Hi Mark, it’s good to hear from you. I don’t know if you have noticed, but @dgasmith maintains a separate repository at https://github.com/dgasmith/opt_einsum with einsum modded to work with different back ends. Einsum is taking on a life of its own.
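For readers unfamiliar with that package, a minimal usage sketch (assuming the current opt_einsum API, where opt_einsum.contract mirrors np.einsum's subscript interface and contract_expression builds a reusable contraction):

import numpy as np
import opt_einsum as oe

b = np.random.random((5, 2))
t = np.random.random((5, 5, 2))
p = np.random.random((2, 5))

# Drop-in replacement for the np.einsum call from the issue.
out = oe.contract('ij,ixy,ji->xy', b, t, p)

# Build the contraction once and reuse it for repeated calls
# with operands of the same shapes.
expr = oe.contract_expression('ij,ixy,ji->xy', b.shape, t.shape, p.shape)
out2 = expr(b, t, p)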
It’s nice to see C-Einsum is doing OK for small operations; I put some good effort into keeping its overhead low. For example, that’s why the C-level API accepts strings instead of lists of integers. In my benchmarks I found that constructing a single string and interpreting it from C is much faster than constructing the equivalent lists of integers and interpreting those.
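To make the string-versus-integer-list distinction concrete at the Python level (this only illustrates the two public calling conventions of np.einsum, not the C internals the comment refers to), the same contraction can be written either way:

import numpy as np

b = np.random.random((5, 2))
t = np.random.random((5, 5, 2))
p = np.random.random((2, 5))

# Subscript-string form.
out_str = np.einsum('ij,ixy,ji->xy', b, t, p)

# Equivalent interleaved integer-list form: i -> 0, j -> 1, x -> 2, y -> 3.
out_int = np.einsum(b, [0, 1], t, [0, 2, 3], p, [1, 0], [2, 3])

assert np.allclose(out_str, out_int)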