Memory usage of `dss_line()`
Hi,
I came across this repository while looking for an implementation of ZAPline. It's really nice that you took the effort to translate the original MATLAB package. I have played around with it on my data, and for most of it, it works really well.
However, one thing that worries me a bit is the memory usage of `dss_line()`. So far, I have tried to run it on MEG data (300 channels × 1.8×10⁶ samples), but there was no chance that my laptop (24 GB) could offer enough memory for `dss_line()` to finish. It only worked if I cropped the data considerably.
Of course, the scripts will eventually run on an HPC cluster, so memory shouldn't be a big problem there, but it is still quite annoying that I can't run and check those scripts locally before exporting them. So I was wondering whether there are some magic tricks I can use to lower the memory footprint, or settings that I have missed? I tried to trace the memory consumption of the function (where I was sufficiently confident not to affect functionality) and could indeed reduce the usage here and there, but the peak is still higher than what my PC can handle.
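For reference, a minimal way to trace the peak memory of a call like this is the standard library's `tracemalloc`, which also tracks NumPy's allocations. The function below is only a hypothetical stand-in for a memory-hungry call such as `dss_line()`, not the actual implementation:

```python
import tracemalloc

import numpy as np

def allocate_like_dss_line(n_channels=300, n_samples=10_000):
    # Hypothetical stand-in for a memory-hungry call: build a
    # (channels, samples) array and its channel covariance.
    data = np.random.randn(n_channels, n_samples)
    cov = data @ data.T / n_samples
    return cov

tracemalloc.start()
allocate_like_dss_line()
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"peak traced memory: {peak / 1e6:.1f} MB")
```

Wrapping progressively smaller pieces of the function this way is one way to localize where the peak actually occurs.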
If helpful, I can provide more information of any kind.
Issue Analytics
- Created: 2 years ago
- Comments: 13 (13 by maintainers)
Top GitHub Comments
You can base it on #57.
Hi @eort
You are 100% right, and this is something that I've identified already (see #50). One obvious thing to try is to compute the covariance by blocks. I'll try to get around to it soon (but if you want to open a PR sooner, you're welcome 😉)
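The blockwise idea can be sketched as follows: accumulate `X @ X.T` over slices along the time axis, so only one block of samples is in flight at a time. This is an illustrative sketch, not MEEGkit's actual API, and the function name and block size are assumptions:

```python
import numpy as np

def blockwise_covariance(data, block_size=100_000):
    """Accumulate the channel covariance of a (channels, samples) array
    over time blocks, avoiding one giant intermediate product."""
    n_channels, n_samples = data.shape
    cov = np.zeros((n_channels, n_channels))
    for start in range(0, n_samples, block_size):
        block = data[:, start:start + block_size]
        cov += block @ block.T  # partial sum over this block's samples
    return cov / n_samples

# Sanity check against the one-shot computation on small data:
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 1000))
assert np.allclose(blockwise_covariance(x, block_size=300), x @ x.T / 1000)
```

The peak then scales with `n_channels * block_size` instead of the full recording length, which is what makes the 300 × 1.8×10⁶ case tractable on a laptop.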