Very high memory usage on Windows 10
Reproducing code example:
from memory_profiler import profile

def process(num):
    result = 0
    for i in range(num):
        result += i**2

@profile
def main():
    from scipy import integrate
    process(5000000)
    print("Done")

if __name__ == '__main__':
    main()
Output:

Done
Filename: multiprocess.py

Line #    Mem usage    Increment   Line Contents
================================================
     8     33.0 MiB     33.0 MiB   @profile
     9                             def main():
    10     57.6 MiB     24.5 MiB       from scipy import integrate
    11     57.6 MiB      0.0 MiB       process(5000000)
    12     57.6 MiB      0.0 MiB       print("Done")
To reproduce:
1. In Windows Task Manager, open the Details tab, right-click a column header, choose Select columns, and enable Commit size.
2. Run the above script.
3. Watch the commit size of the new python process in Task Manager.
4. Note the roughly 1.5 GB commit size, even though the working set is nowhere near that large and @profile reports only 57 MiB.
When the from scipy import integrate line is commented out, the commit size is only about 33 MB.
This high memory commitment is a problem, especially when running many processes in parallel; see for example https://github.com/mwaskom/seaborn/issues/2181.
Scipy/Numpy/Python version information:
import sys, scipy, numpy; print(scipy.__version__, numpy.__version__, sys.version_info)
1.2.0 1.18.4 sys.version_info(major=3, minor=6, micro=5, releaselevel='final', serial=0)

I’m not sure why the first interpreter over-commits so heavily; it does not appear significantly different from the second. I suppose you could try compiling your own version of CPython.
It looks like on Windows, multiprocessing initializes a completely new interpreter for each child process. This means that your imports are actually repeated in each child rather than shared from the memory of the parent, as the sketch below illustrates.
From my perspective, you have a few options:
Part of the issue here is OpenBLAS allocating ~30 MB per thread, so on a 32-core machine that accounts for about 1 GB. This was improved as much as possible in recent OpenBLAS versions. You can also limit the number of threads it uses via environment variables (https://github.com/xianyi/OpenBLAS#setting-the-number-of-threads-using-environment-variables).
If you are still seeing this problem, please reply with what works or doesn't work and the OpenBLAS version you are using.
If you use conda, you can also switch to MKL, which doesn't allocate memory like this.