
"Stop" button doesn't always stop workers

See original GitHub issue

Describe the bug

In my test environment I am running a distributed load test with 3 workers in GUI mode. Intermittently (usually on tests run after the initial test following spin-up of the hosts), pressing the “STOP” button on the master’s web interface changes the status to “Stopping”, and the workers’ console output also reports “Stopping”, but the number of users never decreases and the load test continues. We have to CTRL-C each worker separately to get the test to stop.

Expected behavior

When the “STOP” button is pressed the load test should stop immediately on all workers (or at least within a few seconds).

Actual behavior

The workers and the GUI report the status as “STOPPING” and the test continues. The “STOP” button is gone at this point, so the only recourse is to log on to each worker and stop the test manually.

Output from one of the workers on the command line is: “[2020-10-06 23:12:49,585] locust02.YYYY.XXXX.local/INFO/locust.runners: Stopping 666 users”

Steps to reproduce

On the 3 workers we run this command: locust -f locustfile.py --worker --master-host=172.20.2.254
On the master host we run this command: locust -f locustfile.py --master

On the GUI we set the number of users to 2000 and the spawn rate to 100. (We have also tried 6000 users with a spawn rate of 300, and 1000 users with a spawn rate of 100.) The problem is intermittent.

locustfile.py is shown below

Environment

  • OS: CentOS Linux release 7.6.1810
  • Python version: Python 3.6.8
  • Locust version: locust 1.2.2
  • Locust command line that you ran:
    worker: locust -f locustfile.py --worker --master-host=172.20.2.254
    master: locust -f locustfile.py --master
  • Locust file contents (anonymized if necessary):
# coding=utf-8
import json
import os
import random
import requests
import time
import locust_plugins
from locust import HttpUser, task, between, TaskSet, User

deviceIdStart = 1000000000000000000
deviceIdRange = 10000
client_id = os.getenv("CLIENTID")
client_secret = os.getenv("CLIENTSECRET")

ballots = ["test"]

candidates = {
    "test": {
        "T1": "test1",
        "T2": "test2",
        "T3": "test3",
        "T4": "test4",
        "T5": "test5",
        "T6": "test6",
        "T7": "test7",
        "T8": "test8",
    }
}


def candidate():
    ballot_id = random.choice(ballots)
    candidate_list = list(candidates[ballot_id].keys())
    candidate_id = random.choice(candidate_list)
    candidate_name = candidates[ballot_id][candidate_id]

    return ballot_id, candidate_id, candidate_name


def device_id():
    dIdS = int(os.getenv("DEVIDSTART", deviceIdStart))
    dIdR = int(os.getenv("DEVIDRANGE", deviceIdRange))
    return random.randrange(dIdS, dIdS + dIdR)


class UserBehavior(HttpUser):
    min_wait = 2000
    max_wait = 9000
    host = os.getenv("TARGET_URL")

    def __init__(self, parent):
        super(UserBehavior, self).__init__(parent)

        self.token = ""
        self.headers = {}
        self.tokenExpires = 0

    def on_start(self):
        self.token = self.login()

        self.headers = {
            "Authorization": "%s %s"
            % (self.token["token_type"], self.token["access_token"])
        }

        self.tokenExpires = time.time() + self.token["expires_in"] - 120

    def login(self):
        """
        Gets the token for the user
        :rtype: dict
        """
        global client_id
        global client_secret

        url = os.getenv("AUTH_URL")
        print("Get token with %s" % url)
        response = requests.post(
            url,
            headers={
                "X-Client-Id": client_id,
                "X-Client-Secret": client_secret,
                "cache-control": "no-cache",
            },
        )
        try:
            content = json.loads(response.content)
            print("Access token: %s" % content.get("access_token"))
            return content
        except ValueError:
            # json.loads failed, so `content` was never assigned; report the raw response instead
            print("Error in login(): %s" % response.text)
            return None

    @task
    def vote(self):
        if self.tokenExpires < time.time():
            self.token = self.login()
            if self.token:
                self.tokenExpires = time.time() + self.token["expires_in"] - 120
            else:
                print("Unable to get SAT Token")
                return None
        selection = candidate()
        message = {
            "Id": "TEST-P",
            "dId": device_id(),
            "bId": selection[0],
            "sIds": selection[1],
            "sTexts": selection[2],
        }
        response = self.client.post(
            "/api/v1/test?partner=test", message, headers=self.headers
        )

    # vim: set fileencoding=utf-8 :
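One thing worth noting in the locustfile above: login() sends the auth request with requests.post directly, so it bypasses Locust’s instrumented HTTP session and never shows up in the statistics. Below is a minimal sketch of the same call routed through self.client instead, written as a drop-in replacement for login() inside the same UserBehavior class (it reuses the module-level client_id / client_secret and AUTH_URL from the file above; the name="auth" label is only an illustrative grouping choice, not something from the original report):

    def login(self):
        """
        Gets the token for the user via Locust's own HTTP client.
        :rtype: dict
        """
        url = os.getenv("AUTH_URL")
        # self.client is the gevent-friendly session HttpUser provides;
        # absolute URLs are passed through unchanged, and the request is
        # recorded in the Locust statistics under the given name.
        response = self.client.post(
            url,
            headers={
                "X-Client-Id": client_id,
                "X-Client-Secret": client_secret,
                "cache-control": "no-cache",
            },
            name="auth",
        )
        try:
            return response.json()
        except ValueError:
            print("Error in login(): %s" % response.text)
            return None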

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 2
  • Comments: 11 (3 by maintainers)

Top GitHub Comments

1 reaction
irvintim commented, Oct 7, 2020

I will try it on the latest master and will let you know if I see any change. Thanks for the link to the other report; I don’t think it’s exactly the same, but the mention of the “gevent.sleep(0)” that they added gives me an idea of something to try that might help with my problem. I can also temporarily remove the “randrange” call and see if that has any impact. I’ll report back my findings.
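For anyone trying the same experiment, the gevent.sleep(0) idea amounts to adding an explicit yield point inside the task, so the worker’s other greenlets, including the one that listens for the master’s “stop” message, get a chance to run even when the task loop is busy. A minimal, self-contained sketch of that pattern (the class name, host and endpoint here are placeholders, not taken from the original locustfile):

import gevent
from locust import HttpUser, task, between


class YieldingUser(HttpUser):
    wait_time = between(2, 9)
    host = "http://localhost"  # placeholder target for the sketch

    @task
    def vote(self):
        self.client.get("/api/v1/test")  # placeholder request
        # Explicitly yield to the gevent loop so the worker greenlet that
        # receives control messages from the master can be scheduled.
        gevent.sleep(0)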

0 reactions
zhenhuaplan commented, Dec 1, 2022

It seems that this problem still exists in the current version; it happens often when I use Locust. After clicking the STOP button, the state always shows STOPPING. The following is my master’s log:

[2022-12-01 12:10:43,200] srv969220428/INFO/locust.main: Starting web interface at http://0.0.0.0:8089 (accepting connections from all network interfaces)
[2022-12-01 12:10:43,245] srv969220428/INFO/locust.main: Starting Locust 2.8.4
[2022-12-01 12:10:43,294] srv969220428/INFO/locust.runners: Client 'srv969220428_2bdeee017d6d4fbf8ce76061cc2c6690' reported as ready. Currently 1 clients ready to swarm.
[2022-12-01 12:10:43,296] srv969220428/INFO/locust.runners: Client 'srv969220428_ce2e9d937a144478a9e2760ab58af518' reported as ready. Currently 2 clients ready to swarm.
[2022-12-01 12:10:43,303] srv969220428/INFO/locust.runners: Client 'srv969220428_0150cfd7f59b4f45a0f2f3eb58132a72' reported as ready. Currently 3 clients ready to swarm.
[2022-12-01 12:10:43,335] srv969220428/INFO/locust.runners: Client 'srv969220428_611f6ce743b84365b32bc2c629252be5' reported as ready. Currently 4 clients ready to swarm.
[2022-12-01 12:10:43,347] srv969220428/INFO/locust.runners: Client 'srv969220428_e43b2377b93d4b09b85818fe4fe0558a' reported as ready. Currently 5 clients ready to swarm.
[2022-12-01 12:10:43,480] srv969220428/INFO/locust.runners: Client 'srv969220428_dff0727ceeab400db1892731d709b029' reported as ready. Currently 6 clients ready to swarm.
[2022-12-01 12:10:43,594] srv969220428/INFO/locust.runners: Client 'srv969220428_0c57f876a0be4eb9a5a405ca8660ecd0' reported as ready. Currently 7 clients ready to swarm.
[2022-12-01 12:10:43,601] srv969220428/INFO/locust.runners: Client 'srv969220428_f3659be744c84a73b0dd3c894c660731' reported as ready. Currently 8 clients ready to swarm.
[2022-12-01 12:10:46,364] srv969220428/INFO/locust.runners: Client 'srv91862111_a95135e3c58841c6b195171a6c902d2d' reported as ready. Currently 9 clients ready to swarm.
[2022-12-01 12:10:46,372] srv969220428/INFO/locust.runners: Client 'srv91862111_d17deff1ed0943559e5ed030fc291e97' reported as ready. Currently 10 clients ready to swarm.
[2022-12-01 12:10:46,401] srv969220428/INFO/locust.runners: Client 'srv91862111_44a0695572174f118f1a566d82806e83' reported as ready. Currently 11 clients ready to swarm.
[2022-12-01 12:10:49,721] srv969220428/INFO/locust.runners: Client 'srv8020201013_700b2270942b447abeba767474f04206' reported as ready. Currently 12 clients ready to swarm.
[2022-12-01 12:10:49,722] srv969220428/INFO/locust.runners: Client 'srv8020201013_778ae1bc828f41d3b0573738c4753860' reported as ready. Currently 13 clients ready to swarm.
[2022-12-01 12:10:49,722] srv969220428/INFO/locust.runners: Client 'srv8020201013_6b7943b6b0dd46419dabf9c4dfda13ff' reported as ready. Currently 14 clients ready to swarm.
[2022-12-01 12:10:49,726] srv969220428/INFO/locust.runners: Client 'srv8020201013_16209a44c3f941a4926f6eb2a137f4e8' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:27:36,873] srv969220428/INFO/locust.runners: Sending spawn jobs of 500 users at 20.00 spawn rate to 15 ready clients
[2022-12-01 12:27:55,189] srv969220428/WARNING/locust.runners: Worker srv8020201013_16209a44c3f941a4926f6eb2a137f4e8 exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:27:55,189] srv969220428/WARNING/locust.runners: Worker srv8020201013_778ae1bc828f41d3b0573738c4753860 exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:28:00,193] srv969220428/WARNING/locust.runners: Worker srv8020201013_700b2270942b447abeba767474f04206 exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:28:00,199] srv969220428/WARNING/locust.runners: Worker srv8020201013_6b7943b6b0dd46419dabf9c4dfda13ff exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:28:01,142] srv969220428/INFO/locust.runners: All users spawned: {"ODYSSEY": 500} (500 total users)
[2022-12-01 12:29:07,328] srv969220428/INFO/locust.runners: Removing srv969220428_dff0727ceeab400db1892731d709b029 client from running clients
[2022-12-01 12:29:07,328] srv969220428/INFO/locust.runners: Removing srv969220428_0c57f876a0be4eb9a5a405ca8660ecd0 client from running clients
[2022-12-01 12:29:07,328] srv969220428/INFO/locust.runners: Removing srv969220428_611f6ce743b84365b32bc2c629252be5 client from running clients
[2022-12-01 12:29:07,328] srv969220428/INFO/locust.runners: Client 'srv969220428_dff0727ceeab400db1892731d709b029' reported as ready. Currently 13 clients ready to swarm.
[2022-12-01 12:29:07,329] srv969220428/INFO/locust.runners: Client 'srv969220428_0c57f876a0be4eb9a5a405ca8660ecd0' reported as ready. Currently 14 clients ready to swarm.
[2022-12-01 12:29:07,329] srv969220428/INFO/locust.runners: Client 'srv969220428_611f6ce743b84365b32bc2c629252be5' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,329] srv969220428/INFO/locust.runners: Removing srv969220428_ce2e9d937a144478a9e2760ab58af518 client from running clients
[2022-12-01 12:29:07,329] srv969220428/INFO/locust.runners: Client 'srv969220428_ce2e9d937a144478a9e2760ab58af518' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,330] srv969220428/INFO/locust.runners: Removing srv969220428_0150cfd7f59b4f45a0f2f3eb58132a72 client from running clients
[2022-12-01 12:29:07,330] srv969220428/INFO/locust.runners: Client 'srv969220428_0150cfd7f59b4f45a0f2f3eb58132a72' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,330] srv969220428/INFO/locust.runners: Removing srv969220428_e43b2377b93d4b09b85818fe4fe0558a client from running clients
[2022-12-01 12:29:07,330] srv969220428/INFO/locust.runners: Client 'srv969220428_e43b2377b93d4b09b85818fe4fe0558a' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,332] srv969220428/INFO/locust.runners: Removing srv91862111_a95135e3c58841c6b195171a6c902d2d client from running clients
[2022-12-01 12:29:07,332] srv969220428/INFO/locust.runners: Client 'srv91862111_a95135e3c58841c6b195171a6c902d2d' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,333] srv969220428/INFO/locust.runners: Removing srv91862111_d17deff1ed0943559e5ed030fc291e97 client from running clients
[2022-12-01 12:29:07,333] srv969220428/INFO/locust.runners: Removing srv91862111_44a0695572174f118f1a566d82806e83 client from running clients
[2022-12-01 12:29:07,334] srv969220428/INFO/locust.runners: Client 'srv91862111_d17deff1ed0943559e5ed030fc291e97' reported as ready. Currently 14 clients ready to swarm.
[2022-12-01 12:29:07,334] srv969220428/INFO/locust.runners: Client 'srv91862111_44a0695572174f118f1a566d82806e83' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,334] srv969220428/INFO/locust.runners: Removing srv969220428_2bdeee017d6d4fbf8ce76061cc2c6690 client from running clients
[2022-12-01 12:29:07,334] srv969220428/INFO/locust.runners: Client 'srv969220428_2bdeee017d6d4fbf8ce76061cc2c6690' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,337] srv969220428/INFO/locust.runners: Removing srv8020201013_6b7943b6b0dd46419dabf9c4dfda13ff client from running clients
[2022-12-01 12:29:07,337] srv969220428/INFO/locust.runners: Client 'srv8020201013_6b7943b6b0dd46419dabf9c4dfda13ff' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,337] srv969220428/INFO/locust.runners: Removing srv8020201013_778ae1bc828f41d3b0573738c4753860 client from running clients
[2022-12-01 12:29:07,338] srv969220428/INFO/locust.runners: Removing srv969220428_f3659be744c84a73b0dd3c894c660731 client from running clients
[2022-12-01 12:29:07,338] srv969220428/INFO/locust.runners: Client 'srv969220428_f3659be744c84a73b0dd3c894c660731' reported as ready. Currently 14 clients ready to swarm.
[2022-12-01 12:29:07,338] srv969220428/INFO/locust.runners: Client 'srv8020201013_778ae1bc828f41d3b0573738c4753860' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,339] srv969220428/INFO/locust.runners: Removing srv8020201013_16209a44c3f941a4926f6eb2a137f4e8 client from running clients
[2022-12-01 12:29:07,339] srv969220428/INFO/locust.runners: Client 'srv8020201013_16209a44c3f941a4926f6eb2a137f4e8' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:07,409] srv969220428/INFO/locust.runners: Removing srv8020201013_700b2270942b447abeba767474f04206 client from running clients
[2022-12-01 12:29:07,409] srv969220428/INFO/locust.runners: Client 'srv8020201013_700b2270942b447abeba767474f04206' reported as ready. Currently 15 clients ready to swarm.
[2022-12-01 12:29:08,254] srv969220428/WARNING/locust.runners: Worker srv8020201013_700b2270942b447abeba767474f04206 exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:29:08,260] srv969220428/WARNING/locust.runners: Worker srv8020201013_16209a44c3f941a4926f6eb2a137f4e8 exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:29:08,262] srv969220428/WARNING/locust.runners: Worker srv8020201013_6b7943b6b0dd46419dabf9c4dfda13ff exceeded cpu threshold (will only log this once per worker)
[2022-12-01 12:29:08,267] srv969220428/WARNING/locust.runners: Worker srv8020201013_778ae1bc828f41d3b0573738c4753860 exceeded cpu threshold (will only log this once per worker)
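A cheap way to narrow down where the stop gets lost is to have each worker log its own view of the runner state. The sketch below is an assumption about how one might instrument this (using Locust’s init event and a background greenlet; it is not from the original report): it spawns a loop on every worker that prints the runner state and local user count every few seconds, which shows whether the worker ever transitions out of “running” after the master’s STOP.

import logging

import gevent
from locust import events
from locust.runners import WorkerRunner


@events.init.add_listener
def on_locust_init(environment, **kwargs):
    # Only instrument worker processes; the master already logs its state.
    if isinstance(environment.runner, WorkerRunner):
        def log_state():
            while True:
                logging.info(
                    "worker runner state: %s, local user count: %d",
                    environment.runner.state,
                    environment.runner.user_count,
                )
                gevent.sleep(10)

        gevent.spawn(log_state)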
Read more comments on GitHub >
