
Limiting Parallel Execution

See original GitHub issue

I have a scheduled task that I would like to run once per minute, but ONLY if an instance of the task is not already executing. Looking through the database models, I do not see any straightforward way to handle this, as the task attribute of the schedule model does not seem to get populated until after the task has completed. Suggestions? If somebody can point me in the right direction, I will try to whip up a PR to add this as a feature if it is desired. Thanks!
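One common way to get this "run only if not already running" behaviour, which the thread later circles around, is a lock the task checks on entry. The sketch below is illustrative only: the `TaskLock` class and its dict-based store are stand-ins for a shared backend such as Django's cache (`cache.add` with a timeout) or a database row, since a plain dict only works within a single process.

```python
import time

class TaskLock:
    """Minimal single-process stand-in for a shared lock store.

    In production this would be backed by Django's cache or the database
    so that all django-q workers see the same lock.
    """
    def __init__(self):
        self._locks = {}

    def acquire(self, name, ttl=120):
        # ttl guards against a crashed worker holding the lock forever.
        now = time.monotonic()
        expires = self._locks.get(name)
        if expires is not None and expires > now:
            return False  # another instance currently holds the lock
        self._locks[name] = now + ttl
        return True

    def release(self, name):
        self._locks.pop(name, None)

lock_store = TaskLock()

def my_minutely_task():
    """The function a django-q schedule would invoke once per minute."""
    if not lock_store.acquire("my_minutely_task"):
        return "skipped: already running"
    try:
        # ... the actual work would go here ...
        return "done"
    finally:
        lock_store.release("my_minutely_task")
```

With a cache-backed lock, the scheduled function becomes safe to fire every minute: overlapping invocations simply return early.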

Issue Analytics

  • State: open
  • Created 6 years ago
  • Reactions: 2
  • Comments: 7

Top GitHub Comments

1 reaction
jordanmkoncz commented, Feb 18, 2018

Thanks @Eagllus, I somehow completely missed the Chains section of the docs; that does look like it should do exactly what I want. 👍
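For context, the Chains feature referenced here runs appended tasks strictly one after another rather than in parallel. The class below only mimics that sequential semantics for illustration; it is not django-q's actual `Chain` implementation, whose API is described in the Chains section of the django-q docs.

```python
class SequentialChain:
    """Toy illustration of chain semantics: tasks run one at a time, in order."""
    def __init__(self):
        self._tasks = []
        self.results = []

    def append(self, func, *args, **kwargs):
        self._tasks.append((func, args, kwargs))

    def run(self):
        # Each task starts only after the previous one has finished,
        # which is what prevents parallel execution within a chain.
        for func, args, kwargs in self._tasks:
            self.results.append(func(*args, **kwargs))
        return self.results

chain = SequentialChain()
chain.append(lambda x: x * 2, 3)
chain.append(lambda x: x + 1, 10)
chain.run()  # results: [6, 11]
```

This works well when the serialized tasks are known up front, which is exactly the limitation the next comment runs into.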

0 reactions
jordanmkoncz commented, Apr 10, 2018

@Eagllus apologies if this is explained in the docs, I tried to find it but wasn’t able to. How exactly would you “abort the task”? Do you simply mean have the task return early before doing whatever work it would normally do?

I tried using the chains functionality for the problem I had, but unfortunately it’s not quite what I need. The tasks for which I’m trying to limit parallel execution are requested by end-users which means they can be requested at any time and in any quantity; i.e. sometimes 10 of them could be requested within a 2 minute period, which would take up all of my workers.

I’m trying to find a solution where most tasks can be added to the queue at any time and are processed normally (i.e. they get picked up by any available worker and complete quite quickly), but a special type of task can only ever be running on a single worker at a time. That is, if I have 5 workers, one special task is already being processed, and 10 more special tasks are sitting in the queue, then those 10 should remain in the queue (with the other 4 workers idle or processing normal tasks); only once the in-flight special task completes should the next one be picked up by a worker.

It seems like django-q does not currently provide any way to achieve this, so I’m wondering whether I could use database locks like you suggested. The catch is that if one of my special tasks gets picked up by a worker and finds a lock already present (i.e. another special task is being processed), I need that task to be returned to the queue, rather than aborting with an early return and being marked ‘completed’ even though its work was never actually done.

Do you have any suggestions for how I could achieve the functionality I’m looking for?
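The re-enqueue behaviour asked about above can be sketched roughly as follows. This is an assumption-laden illustration, not django-q API: the `deque` stands in for the broker queue, the `busy` flag for a shared (cache or database) lock, and in real django-q the re-enqueue step could plausibly be done by calling `async_task` again with the same arguments.

```python
from collections import deque

queue = deque()              # stand-in for the broker queue
busy = {"special": False}    # stand-in for a shared lock
log = []                     # records which tasks actually did their work

def special_task(task_id):
    if busy["special"]:
        # Lock held: put the task back on the queue instead of
        # returning early and being counted as 'completed'.
        queue.append(task_id)
        return
    busy["special"] = True
    try:
        log.append(task_id)  # ... the actual work would go here ...
    finally:
        busy["special"] = False

# Simulate: a special task is mid-flight when tasks 2 and 3 arrive,
# so both are deferred; once the lock clears, they drain in order.
busy["special"] = True
special_task(2)
special_task(3)
busy["special"] = False
while queue:
    special_task(queue.popleft())
# log is now [2, 3]: each task ran exactly once, never in parallel
```

The design point is that a deferred task never completes spuriously: it either does its work under the lock or goes back on the queue untouched.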
