Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

The Locust Python instance can use at most 1 CPU core on macOS

See original GitHub issue

Describe the bug

I have a simple locust file which hits a JSON REST endpoint. I start the load with 50 users at a hatch rate of 25, then try to keep adding users in subsequent runs. No matter how many users I specify on the command line, CPU usage tops out at 100% of a single core on my Mac, which has 16 CPU cores.

Expected behavior

I expect it to use more CPU and RAM as the load increases.

Actual behavior

CPU usage tops out at 1 full core and never goes beyond that.

Steps to reproduce

Run the locust command below with the Python script in the same folder.

Versions of Python and Locust:

dev01:~$ locust --version
locust 1.1.1

dev01:~$ python3 -V
Python 3.8.4

dev01:~$ python -V
Python 2.7.16

Environment

  • OS: macOS 10.15.6

  • Python version: 3.8.4

  • Locust version: 1.1.1

  • Locust command line that you ran: locust -f class-test-random-site-12month.py --host=https://report.dev.int --headless -u 500 -r 100 -t 3m --csv=result-12mon

  • Locust file contents (anonymized if necessary):

import random

from locust import HttpUser, task, between
from locust.contrib.fasthttp import FastHttpUser  # imported but not used below

class QuickstartUser(HttpUser):
    wait_time = between(0, 0)

    @task
    def article_12mon(self):
        # top 20 stores in test data
        storeList = [7041, 7011, 7013, 7047, 7042, 7043, 7063]
        # top departments excluding seasonal/garden
        deptList = [27, 24, 59, 29, 25, 23]
        storeId = str(random.choice(storeList))
        deptId = str(random.choice(deptList))
        self.client.get(
            "/dashboard/store/" + storeId + "/departments/" + deptId
            + "/classes?filterSelection=12MON",
            name="class-test - store: " + storeId + " - dept: " + deptId,
        )
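
The behavior reported above matches how Locust works: each Locust instance is a single Python process, so it will not use much more than one core no matter how many users are simulated. The supported way to use the remaining cores on the same machine is to run the test distributed, with one master and several worker processes all loading the same locustfile. A minimal local sketch reusing the command line from this issue (the worker count of 4 and the master address are illustrative):

# terminal 1 - master: aggregates stats and coordinates the run
locust -f class-test-random-site-12month.py --host=https://report.dev.int --headless -u 500 -r 100 -t 3m --csv=result-12mon --master --expect-workers 4

# terminals 2-5 - workers: one per core, each its own Python process
locust -f class-test-random-site-12month.py --worker --master-host=127.0.0.1

The master spreads the 500 simulated users across the workers, so total CPU usage can climb past a single core.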

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6

Top GitHub Comments

Top Results From Across the Web

Increase performance with a faster HTTP client
Locust's default HTTP client uses python-requests. It provides a nice API that many python developers are familiar with, and is very well-maintained.
Read more >
using multi-CPU platforms with locust - Stack Overflow
I think the best option to run locust on multiple cores locally is to run locust master and workers with docker and ...
Read more >
Locust - Read the Docs
Locust is an easy-to-use, distributed, user load testing tool. It is intended for load-testing web sites (or other systems).
Read more >
Implementing dynamic allocation of user load in a distributed ...
separate python instances have to be run [7]. So to utilize the full potential of a CPU with 4 cores, 4 python instances...
Read more >
Multiprocessing Pool Number of Workers in Python
We can issue one-off tasks to the process pool using functions such as ... For example, if we had 4 physical CPU cores...
Read more >
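
The first result above also applies directly to this locustfile: it already imports FastHttpUser but never uses it. Switching the user class over is a one-line change that lowers the CPU cost per request, although it does not by itself lift the one-core ceiling of a single process. A trimmed-down sketch based on the task from the issue:

from locust import task, between
from locust.contrib.fasthttp import FastHttpUser

class QuickstartUser(FastHttpUser):
    wait_time = between(0, 0)

    @task
    def article_12mon(self):
        # FastHttpUser's client offers the same get() call, backed by geventhttpclient
        self.client.get(
            "/dashboard/store/7041/departments/27/classes?filterSelection=12MON",
            name="class-test - store: 7041 - dept: 27",
        )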
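
The last two results point the same way: to use a multi-core machine fully, one Python instance (here, one Locust worker) has to run per core, and the core count can be read at runtime rather than hard-coded. A small launcher sketch, assuming the master has already been started as in the command shown after the locustfile above (the master address is illustrative; the locustfile name is the one from the issue):

import multiprocessing
import subprocess

# one Locust worker per CPU reported by the OS
worker_count = multiprocessing.cpu_count()

workers = [
    subprocess.Popen(
        ["locust", "-f", "class-test-random-site-12month.py",
         "--worker", "--master-host=127.0.0.1"]
    )
    for _ in range(worker_count)
]

# keep the launcher alive while the workers run; Ctrl-C stops everything
for proc in workers:
    proc.wait()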
