MemoryError in fft/fftpack.py, line 81, in _raw_fft

See original GitHub issue

My use case involves a lot of sequential template matching (~3000 different templates on as many images, via skimage.feature.match_template). After a while, a MemoryError occurs:

File "./generate_raw_index.py", line 34, in verify_raw_window
    result = feature.match_template(vignette, raw_window)
  File "/home/mschroeder/.local/lib/python2.7/site-packages/skimage/feature/template.py", line 139, in match_template
    mode="valid")[1:-1, 1:-1]
  File "/home/mschroeder/.local/lib/python2.7/site-packages/scipy/signal/signaltools.py", line 365, in fftconvolve
    ret = (np.fft.irfftn(np.fft.rfftn(in1, fshape) *
  File "/home/mschroeder/.local/lib/python2.7/site-packages/numpy/fft/fftpack.py", line 1073, in rfftn
    a = rfft(a, s[-1], axes[-1], norm)
  File "/home/mschroeder/.local/lib/python2.7/site-packages/numpy/fft/fftpack.py", line 361, in rfft
    _real_fft_cache)
  File "/home/mschroeder/.local/lib/python2.7/site-packages/numpy/fft/fftpack.py", line 81, in _raw_fft
    r = work_function(a, wsave)
MemoryError
Segmentation fault
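
For context, a minimal sketch of the kind of loop that triggers this (the data and sizes here are made up; only the pattern of many matches with varying template sizes matters):

    import numpy as np
    from skimage import feature

    rng = np.random.RandomState(0)

    # Hypothetical reproduction sketch: many sequential template matches
    # with images and templates of varying sizes, as described above.
    # Real data loading is application-specific and omitted.
    for i in range(3000):
        vignette = rng.rand(512 + i % 50, 512 + i % 50)   # varying image sizes
        raw_window = rng.rand(32 + i % 17, 32 + i % 17)   # varying template sizes
        result = feature.match_template(vignette, raw_window)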

The error occurs in work_function, which is fftpack.rfftf, but I think the memory runs low because of the fft_cache argument, which here is _real_fft_cache. Its entries are size-dependent, and my use case involves many different sizes. The number of objects in both _real_fft_cache and _fft_cache grows over the runtime, and their total memory quickly reaches megabytes. So there is an ever-growing cache whose entries are rarely reused.
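
A quick way to observe the growth (numpy.fft.fftpack and its cache globals are private internals of the numpy versions of that era, so this sketch is version-dependent):

    import numpy as np
    import numpy.fft.fftpack as fftpack  # private module, older numpy only

    # Each distinct transform length gets its own twiddle-factor entry,
    # so feeding many different sizes grows the cache without bound.
    for n in range(100, 400):
        np.fft.rfft(np.random.rand(n))

    print(len(fftpack._real_fft_cache))  # roughly one entry per distinct size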

The introduction of the cache dates back to https://github.com/numpy/numpy/commit/4bea674fd9a5bfa5f7ee14a5fec164a9d84b657c, but I can't see where all of that comes from.

For my case I'd like to get rid of the cache (whose existence is not justified by any comment in the code), but I understand that it might give a significant speed-up in a friendlier use case.

To prevent the cache from growing beyond all bounds, some more sophisticated logic (maybe LRU eviction would be a good fit) will be needed. I don't know whether such a size-limited cache is already implemented anywhere in numpy.
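
For illustration, a minimal sketch of a size-bounded LRU cache (a generic recipe, not numpy's actual implementation):

    from collections import OrderedDict

    class LRUCache(object):
        """Dict-like cache that evicts the least recently used entry
        once it holds more than `maxsize` items."""

        def __init__(self, maxsize=16):
            self.maxsize = maxsize
            self._data = OrderedDict()

        def __contains__(self, key):
            return key in self._data

        def __getitem__(self, key):
            value = self._data.pop(key)  # pop and re-insert to mark as fresh
            self._data[key] = value
            return value

        def __setitem__(self, key, value):
            if key in self._data:
                self._data.pop(key)
            elif len(self._data) >= self.maxsize:
                self._data.popitem(last=False)  # drop least recently used
            self._data[key] = value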

For now, I will just reset both caches to {} after each call to match_template.
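
Concretely, that workaround looks something like this (again touching private numpy internals, so it only applies to numpy versions that still have these module-level caches):

    import numpy.fft.fftpack as fftpack
    from skimage import feature

    def match_template_nocache(image, template):
        """match_template wrapper that clears numpy's private FFT caches
        afterwards, so entries cannot pile up across many distinct sizes."""
        try:
            return feature.match_template(image, template)
        finally:
            fftpack._fft_cache.clear()
            fftpack._real_fft_cache.clear()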

Update: the cache was not the problem (although it doesn't help either). The problem was an overly big image. Nevertheless, I would expect numpy to tell me "this array is too big" instead of segfaulting.
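
Until numpy raises a proper error there, one defensive option is to estimate the padded intermediate up front and fail with a readable message. A hypothetical guard (the s1 + s2 - 1 padding mirrors what scipy.signal.fftconvolve computes internally, and 16 bytes per element assumes a complex128 intermediate):

    import numpy as np

    def check_fftconvolve_memory(in1, in2, limit_bytes=4 * 1024**3):
        """Hypothetical guard: estimate the complex intermediate that an
        FFT-based convolution of in1 and in2 would allocate, and raise a
        readable error instead of failing deep inside the FFT routines."""
        fshape = [s1 + s2 - 1 for s1, s2 in zip(in1.shape, in2.shape)]
        needed = int(np.prod(fshape)) * 16  # complex128 = 16 bytes/element
        if needed > limit_bytes:
            raise MemoryError("FFT intermediate would need ~%d MiB"
                              % (needed // 2**20))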

Issue Analytics

  • State: closed
  • Created: 8 years ago
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
seberg commented on Feb 1, 2016

Yeah, but there are two bugs here:

  1. The init function (it seems to often be fftpack.cffti) probably does not handle the MemoryError well.
  2. There seems to be no limit on the cache size? One question is whether it is actually worth it, considering that I think we now have a global array cache. It might be that it serves almost the same purpose.

0 reactions
rgommers commented on Mar 21, 2019

Normally we would just close it; but since 1.16 is an LTS, if there is some expectation that we should fix it for 1.16, maybe not.

It's been open for 3 years without any further reports or follow-up. So no, I'd say there is no need to fix it in 1.16.x. Let's just close it; if someone comes along with a 1.16.x patch we may consider it, but there's not much point in following up ourselves now.

Read more comments on GitHub.

Top Results From Across the Web

  • How to Handle the MemoryError in Python (Rollbar): A MemoryError is an error encountered in Python when there is no memory available for allocation. Learn two ways to solve this.
  • memory error in python (Stack Overflow): The issue is that 32-bit python only has access to ~4GB of RAM. This can shrink even further if your operating system is...
  • MemoryError / Too Many Files Open (PsychoPy forum): Hello! I am trying to present a series of 150 images and videos while collecting a participant response (150 trials, each with an...
  • Memory error during 2D classification (CryoSPARC forum): Hi All, I am using cryosparc 3.2.0 and spontaneously see following error during 2D classification. Any help will be appreciated.
