
Possible memory leaking when combining session, threading and proxies

See original GitHub issue

If it helps: I got the error OSError: [Errno 24] Too many open files when running the script. I'm not sure whether it is related to the memory leak; I worked around it by raising the open-file limit with ulimit -n 10000.
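The same workaround can be applied from inside the script using the standard-library resource module (a sketch, Unix-only; the 10000 value mirrors the ulimit command above and is capped at the process hard limit):

```python
import resource

# Raise the soft limit on open file descriptors toward 10000,
# never exceeding the hard limit the OS enforces for this process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
target = 10000 if hard == resource.RLIM_INFINITY else min(10000, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))

print(resource.getrlimit(resource.RLIMIT_NOFILE)[0])
```

Note this only raises the ceiling; if descriptors are genuinely leaking, the limit will still be reached eventually.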

Expected Result

RAM usage kept under reasonable limits

Actual Result

RAM usage doesn’t stop growing

Reproduction Steps

I usually wouldn't post the target website or the proxy credentials, but in this case I think they are needed to reproduce the bug.

import requests
from random import randrange
from threading import Thread
from time import sleep

from memory_profiler import profile

session = requests.Session()
finished = False


def get_proxy():
    proxy = "http://lum-customer-hl_f53c879b-zone-static-session-" + str(randrange(999999)) + ":au2d3rzz8tut@zproxy.lum-superproxy.io:22225"
    return {
        "http": proxy,
        "https": proxy
    }


def make_request(url):
    session.get(url, proxies=get_proxy())


def worker():
    while True:
        if finished:
            return
        make_request("http://1000imagens.com/")


@profile
def main():
    global finished
    threads = []
    for i in range(2):
        t = Thread(target=worker)
        t.start()
        threads.append(t)

    count = 0
    while True:
        sleep(1)
        count += 1
        if count == 300:
            finished = True
            return

main()
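One possible mitigation for the script above, assuming the growth comes from the per-proxy ProxyManager cache on the adapter (as discussed in the comments below), is to periodically empty that cache. This is a sketch: clear_proxies is a hypothetical helper, not part of the requests API, and the loopback proxy URLs are made up so the cache can be populated without any network traffic.

```python
import requests

session = requests.Session()


def clear_proxies(session):
    """Hypothetical helper: drop the cached per-proxy ProxyManager
    objects so they and their connection pools can be collected."""
    for adapter in session.adapters.values():
        for manager in adapter.proxy_manager.values():
            manager.clear()            # close pooled connections
        adapter.proxy_manager.clear()  # forget the ProxyManager cache


# Populate the cache without touching the network: proxy_manager_for()
# creates and caches a ProxyManager per distinct proxy URL.
adapter = session.get_adapter("http://example.com/")
for port in range(8000, 8010):
    adapter.proxy_manager_for("http://127.0.0.1:%d" % port)

print(len(adapter.proxy_manager))  # 10 before clearing
clear_proxies(session)
print(len(adapter.proxy_manager))  # 0 after
```

Calling such a helper every N requests in worker() would bound the cache at the cost of re-establishing proxy connections afterwards.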

System Information

$ python3.9 -m requests.help
{
  "chardet": {
    "version": "3.0.4"
  },
  "cryptography": {
    "version": ""
  },
  "idna": {
    "version": "2.6"
  },
  "implementation": {
    "name": "CPython",
    "version": "3.9.1"
  },
  "platform": {
    "release": "4.15.0-134-generic",
    "system": "Linux"
  },
  "pyOpenSSL": {
    "openssl_version": "",
    "version": null
  },
  "requests": {
    "version": "2.25.1"
  },
  "system_ssl": {
    "version": "1010100f"
  },
  "urllib3": {
    "version": "1.22"
  },
  "using_pyopenssl": false
}
# lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.5 LTS
Release:        18.04
Codename:       bionic

I tried with Python 3.6, 3.8, and 3.9 and found no difference.

Output of memory_profiler

Line #    Mem usage    Increment  Occurences   Line Contents
============================================================
    31     23.8 MiB     23.8 MiB           1   @profile
    32                                         def main():
    33                                             global finished
    34     23.8 MiB      0.0 MiB           1       threads = []
    35     23.8 MiB      0.0 MiB           3       for i in range(2):
    36     23.8 MiB      0.0 MiB           2           t = Thread(target=worker)
    37     23.8 MiB      0.0 MiB           2           t.start()
    38     23.8 MiB      0.0 MiB           2           threads.append(t)
    39
    40     23.8 MiB      0.0 MiB           1       count = 0
    41                                             while True:
    42    547.1 MiB    523.2 MiB         300           sleep(1)
    43    547.1 MiB      0.0 MiB         300           count += 1
    44    547.1 MiB      0.0 MiB         300           if count == 300:
    45    547.1 MiB      0.0 MiB           1               finished = True
    46    547.1 MiB      0.0 MiB           1               return

After 5 minutes it eats over 500 MB of RAM. If I leave it running indefinitely, it consumes all available RAM and the process is killed.

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

1 reaction
shukai commented, Jul 12, 2021

When using a random proxy per request, session.get_adapter("http://").proxy_manager never removes ProxyManager objects. Too many ProxyManager objects accumulate, which leaks memory:

session = requests.session()
for x in range(1, 100):
    try:
        session.get("http://test.comaaa", proxies={"http": "http://{}:{}".format(x, x)}, timeout=0.1)
    except Exception:
        continue
print(session.get_adapter("http://").proxy_manager)
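This accumulation can be demonstrated with no network traffic at all, since the cache is keyed by proxy URL and entries are never evicted (a sketch; the loopback proxy addresses are made up):

```python
import requests

session = requests.Session()
adapter = session.get_adapter("http://")

# Each distinct proxy URL gets its own cached ProxyManager entry.
# With a unique proxy per request, the dict grows without bound.
for port in range(20000, 20100):
    adapter.proxy_manager_for("http://127.0.0.1:%d" % port)

print(len(adapter.proxy_manager))  # 100 cached ProxyManager objects
```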

0 reactions
ll125498a commented, Jun 17, 2022

+1 same issue here


