
Very high memory usage on Windows 10


Reproducing code example:

from memory_profiler import profile

def process(num):
    result = 0
    for i in range(num):
        result += i**2  # CPU-bound busywork; allocates nothing of note

@profile
def main():
    from scipy import integrate  # the import whose commit cost is under investigation
    process(5000000)
    print("Done")

if __name__ == '__main__':
    main()

Output:

Done
Filename: multiprocess.py

Line #    Mem usage    Increment   Line Contents
================================================
     8     33.0 MiB     33.0 MiB   @profile
     9                             def main():
    10     57.6 MiB     24.5 MiB       from scipy import integrate
    11     57.6 MiB      0.0 MiB       process(5000000)
    12     57.6 MiB      0.0 MiB       print("Done")

To reproduce the observation:

  1. In Windows Task Manager, open the Details tab, right-click a column header, choose Select columns, and enable Commit size.
  2. Run the above script.
  3. Watch the commit size of the new python process in Task Manager.
  4. Note the ~1.5 GB commit size, even though the working set is not that large and @profile reports only 57.6 MiB.
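
As a scriptable alternative to Task Manager, here is a sketch assuming the third-party psutil package (which the original report does not use); on Windows, psutil's vms field aliases the pagefile-backed commit charge shown in the Commit size column:

import psutil

proc = psutil.Process()
print('commit before import: %.1f MiB' % (proc.memory_info().vms / 2**20))
from scipy import integrate  # noqa: F401
print('commit after import:  %.1f MiB' % (proc.memory_info().vms / 2**20))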

When from scipy import integrate is commented out, the commit size is only 33 MB.

This high memory commitment is especially a problem when running many processes in parallel; see for example https://github.com/mwaskom/seaborn/issues/2181.

Scipy/Numpy/Python version information:

import sys, scipy, numpy; print(scipy.__version__, numpy.__version__, sys.version_info)
1.2.0 1.18.4 sys.version_info(major=3, minor=6, micro=5, releaselevel='final', serial=0)

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 11 (7 by maintainers)

Top GitHub Comments

1 reaction
ragibson commented, Aug 14, 2020

I’m not sure why the first interpreter over-commits so heavily; it does not appear significantly different from the second. I suppose you could try compiling your own version of CPython.

It looks like, on Windows, multiprocessing initializes a completely new interpreter for each child process. This means that your imports are actually repeated in each child rather than just sharing the imported code in the memory of the parent, as fork-based process creation would allow on other platforms.
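
A minimal sketch of that behavior (the module-level print is illustrative, not from the original issue):

import multiprocessing as mp
import os

# Under the spawn start method (the only one available on Windows),
# every child re-imports this module, so this line prints once in the
# parent and once in each child; heavy imports are repeated the same way.
print('module imported in PID', os.getpid())

def square(i):
    return i * i

if __name__ == '__main__':
    mp.set_start_method('spawn')  # default on Windows; set explicitly for portability
    with mp.Pool(processes=2) as pool:
        print(pool.map(square, range(4)))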

From my perspective, you have a few options:

  1. Increase the size of your page file. I’m not sure why you have it disabled, but the belief that disabling it increases performance is generally a misconception.
  2. Decrease the number of child processes you are using (see the sketch after this list). It seems your system is hyperthreaded, so you might not lose much performance by cutting the number of children in half, depending on the workload of interest.
  3. Find an interpreter that doesn’t commit so much memory. I honestly have no idea why there is so much variation between the two interpreters on my Windows machine; I do most of my development on Linux and don’t have the tools to figure this out at the moment.
  4. Add additional memory to your machine.
  5. Switch to an OS with more efficient multiprocessing/forking semantics.
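
For option 2, a minimal sketch (the halving heuristic is an assumption, not part of the original comment):

import multiprocessing as mp
import os

def work(i):
    return i * i

if __name__ == '__main__':
    # Hyperthreaded machines report two logical CPUs per physical core,
    # so halving os.cpu_count() roughly matches the physical core count
    # and halves the pool's total commit charge.
    n_workers = max(1, (os.cpu_count() or 2) // 2)
    with mp.Pool(processes=n_workers) as pool:
        print(pool.map(work, range(8)))
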
0 reactions
rgommers commented, Feb 22, 2022

Part of the issue here will be OpenBLAS allocating ~30 MB per thread, so on a 32-core machine that’ll take about 1 GB. This was improved as much as possible in recent OpenBLAS versions. You can also try to limit the number of threads it uses (https://github.com/xianyi/OpenBLAS#setting-the-number-of-threads-using-environment-variables).
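
A minimal sketch of the environment-variable approach (variable names are from the OpenBLAS README linked above; they must be set before the first numpy/scipy import, because OpenBLAS allocates its per-thread buffers at load time):

import os
os.environ['OPENBLAS_NUM_THREADS'] = '1'  # OpenBLAS-specific cap
os.environ['OMP_NUM_THREADS'] = '1'       # generic cap read by many BLAS builds

from scipy import integrate  # noqa: F401  (imported after the caps are set)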

Please reply with what works or doesn’t work and the OpenBLAS version you are using, if you are still seeing this problem.

If you use conda, you can also switch to MKL, which doesn’t allocate memory like this.


