
Locust never gets past 10 req/s, despite the server being much quicker than that

See original GitHub issue

I’ve got a very simple locustfile, running against a very simple local server, and Locust tells me it can do 10 req/s, while ab tells me it can do 600+.

My locustfile:

from locust import HttpLocust, TaskSet, task

class MyTasks(TaskSet):
    @task
    def read_root(self):
        self.client.get("/")

class MyUser(HttpLocust):
    host = "http://localhost:8080"
    task_set = MyTasks

My server is a very simple flask app, running locally under CherryPy, and returning a fixed string value.
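For reference, a minimal sketch of a server matching that description, assuming Flask served by CherryPy’s WSGI server; the actual code isn’t included in the issue, so the route name and response string here are assumptions:

# Hypothetical reconstruction: the issue only says "a very simple flask app,
# running locally under CherryPy, and returning a fixed string value".
from flask import Flask
from cherrypy import wsgiserver  # CherryPy 3.x API; newer releases use cheroot.wsgi.Server

app = Flask(__name__)

@app.route("/")
def index():
    # Return a fixed string, as described in the issue
    return "Hello, world!"

if __name__ == "__main__":
    # Serve the Flask WSGI app with CherryPy on port 8080
    server = wsgiserver.CherryPyWSGIServer(("127.0.0.1", 8080), app)
    try:
        server.start()
    except KeyboardInterrupt:
        server.stop()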

If I run this with locust -c 10 -r 10 -n 1000 --no-web -f locustfile.py (1000 requests with 10 users, all hatched immediately) I end up with:

 Name                                                          # reqs      # fails     Avg     Min     Max  |  Median   req/s
--------------------------------------------------------------------------------------------------------------------------------------------
 GET /                                                           1000     0(0.00%)       5       2      12  |       6    9.90
--------------------------------------------------------------------------------------------------------------------------------------------
 Total                                                           1000     0(0.00%)                                       9.90

Percentage of the requests completed within given times
 Name                                                           # reqs    50%    66%    75%    80%    90%    95%    98%    99%   100%
--------------------------------------------------------------------------------------------------------------------------------------------
 GET /                                                            1000      6      6      6      6      6      7      7      8     12
--------------------------------------------------------------------------------------------------------------------------------------------

It hovers around 10, but never passes 10 requests per second at any point. Each request only takes ~6 ms on average, suggesting each user could do at least 100 req/s, so with 10 users Locust itself should be able to drive at least 1,000 req/s. That should definitely make my server the bottleneck.

Unfortunately though, my server can do way more than 10 requests/s. If I use ab instead, running:

ab -n 1000 -c 10 http://localhost:8080/

(1000 requests, with 10 threads)

I get:

Concurrency Level:      10
Time taken for tests:   1.647 seconds
Complete requests:      1000
Failed requests:        0
Write errors:           0
Total transferred:      138000 bytes
HTML transferred:       6000 bytes
Requests per second:    607.24 [#/sec] (mean)
Time per request:       16.468 [ms] (mean)
Time per request:       1.647 [ms] (mean, across all concurrent requests)
Transfer rate:          81.84 [Kbytes/sec] received

About 600 requests per second, much more in line with what I’d expect.

Why does Locust say I can only do 10 requests per second, when the server can safely do 600? Am I missing something obvious in my configuration, or is ab doing something enormously different from Locust that causes this effect? It seems like it should be easy to reproduce the same basic result with both tools in this case.
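A likely explanation, offered here as an assumption based on pre-1.0 Locust defaults rather than something quoted in the thread: HttpLocust classes default to min_wait = max_wait = 1000 ms, so each simulated user sleeps about one second between tasks. The 6 ms response times don’t include that wait, which is why 10 users top out at roughly 10 req/s while ab, which has no think time between requests, reaches 600+. A sketch of the same locustfile with the wait removed:

from locust import HttpLocust, TaskSet, task

class MyTasks(TaskSet):
    @task
    def read_root(self):
        self.client.get("/")

class MyUser(HttpLocust):
    host = "http://localhost:8080"
    task_set = MyTasks
    min_wait = 0  # defaults are 1000 ms each, i.e. roughly 1 task per user per second
    max_wait = 0

With the scheduled pause removed, throughput should be limited by the server and the Locust process rather than by the per-user wait time.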

Issue Analytics

  • State: closed
  • Created 9 years ago
  • Comments: 11 (4 by maintainers)

Top GitHub Comments

7 reactions
Jahaja commented, Dec 22, 2014

1 reaction
aldenpeterson-wf commented, Jan 24, 2018

@shaikshakeel did you check the documentation?

Read more comments on GitHub >

Top Results From Across the Web

Quick Tutorial on Locust
Start First Locust Performance Test ; RPS, Requests per second. It shows how many HTTP requests are currently being sent to the server....
Read more >
reqs/sec not correct when load testing with Python Locust
I know Locust actually need to wait server respond and then send the next request. Even though, if server respond quick, like 20ms,...
Read more >
Performance and Load Testing using Locust - PFLB
Locust is an open source load testing tool. Instead of configuration formats or UIs, with Locust you get a familiar python framework that...
Read more >
Release 1.0b2 - Locust Documentation
Locust is an easy-to-use, distributed, user load testing tool. It is intended for load-testing web sites (or other systems).
Read more >
How we manipulated Locust to test system performance under ...
Prior to back-to-school (known as “BTS” to us) we load-test our system to handle a 5x to 10x increase on current usage. That...
Read more >
