queue.push() doesn't check if task id is currently a worker
Just a clarification on my part. I noticed that long-running tasks aren't considered for duplicate checking on `id` when new tasks are pushed into a queue. For example, I could have a queue with concurrency 10 and push in a task that takes 10 minutes to execute, whose `id` property is `long_running_task`. Once work begins on it, I can push another task, ``, into the queue and it will begin working despite having the same `id` as the long-running task in the worker. I would have thought that if there's a working task with the same `id`, any new tasks with that `id` would be rejected/merged?
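The behavior described above can be sketched with a toy queue. This is my own minimal stand-in, not the library's actual implementation; the point it illustrates is that if de-duplication by `id` only covers *pending* tasks, an id stops being tracked the moment its task is handed to a worker, so a second push with the same id slips through:

```javascript
// Toy queue: de-duplication by `id` applies only to pending tasks.
class TinyQueue {
  constructor(process, concurrent = 10) {
    this.process = process;
    this.concurrent = concurrent;
    this.pending = new Map();   // id -> task; the only dedup structure
    this.runningCount = 0;
    this.started = [];          // ids that have been handed to workers
  }
  push(task) {
    if (this.pending.has(task.id)) return 'merged'; // dedup happens only here
    this.pending.set(task.id, task);
    this._drain();
    return 'queued';
  }
  _drain() {
    while (this.runningCount < this.concurrent && this.pending.size > 0) {
      const [id, task] = this.pending.entries().next().value;
      this.pending.delete(id);  // <-- id is no longer tracked once running
      this.runningCount++;
      this.started.push(id);
      this.process(task, () => { this.runningCount--; this._drain(); });
    }
  }
}

// A "long-running" task starts immediately (50 ms stands in for 10 minutes)...
const q = new TinyQueue((task, done) => setTimeout(done, 50));
q.push({ id: 'long_running_task' });
// ...so a second push with the same id is accepted rather than merged:
const second = q.push({ id: 'long_running_task' }); // 'queued', not 'merged'
```

A fix along the lines the report asks for would have to track ids from `push()` until the worker's `done` callback, not just while the task sits in the pending map.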
Issue Analytics
- State:
- Created: 6 years ago
- Reactions: 2
- Comments: 8 (2 by maintainers)

An old issue, but is it still relevant? When doing concurrent processing, I would expect the queue to prevent tasks with the same ID from being processed at the same time, as that could lead to conflicts…
This problem still persists today.
I have a database watcher that checks whether a new collection has been added; if so, it adds the collection to the queue. Then I have a long-running task applied to each collection, which can take 6–8 minutes to complete and accesses shared resources (files on the server: it modifies them, emails them, etc.).
The queue works great on the first task, but once it has 3–4 other pending tasks it takes 2 at the same time despite `{ concurrent: 1, batchSize: 1, afterProcessDelay: 1000, maxRetries: 10, retryDelay: 1000 }`.

Edit: even if you check whether the ID is the same, the queue still ends up executing 2–3 tasks at the same time. I'll get back with an update when I get results.