producer.flush makes celery hang
The following is my test code:
```python
from celery import Celery
from kafka import KafkaProducer

app = Celery('test', broker='redis://127.0.0.1:6379/0')
producer = KafkaProducer(bootstrap_servers=['172.16.24.45:9092', '172.16.24.44:9092'])

@app.task
def send_msg():
    # producer = KafkaProducer(bootstrap_servers=['172.16.24.45:9092', '172.16.24.44:9092'])
    for i in range(10):
        producer.send('test', b'this is the %dth test message' % i)
    producer.flush()

if __name__ == '__main__':
    app.start()
```
I use the following command to start the worker:

```
celery -A app worker -l debug
```
Then I open a Python shell to send the task:

```python
from app import *
send_msg.delay()
```
If I use the global producer variable, the celery worker hangs when I call send_msg.delay(): it waits on producer.flush() forever. But if I use a local producer variable (the commented-out line in the code above), the worker works fine.

I want to use a global producer because it is more efficient than a local one: it avoids repeatedly creating and closing connections to the Kafka brokers. How can I fix this problem?

Please help me, and thanks.
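A likely cause (my assumption; the thread does not state it explicitly): KafkaProducer starts a background sender thread when it is constructed, and Celery's default prefork pool then forks worker processes. OS threads do not survive fork, so in the child process the sender thread is gone, queued messages are never transmitted, and flush() blocks forever. The effect can be sketched on a Unix system without Kafka at all:

```python
import os
import threading

# A daemon thread started in the parent process, standing in for
# KafkaProducer's background sender thread.
stop = threading.Event()
sender = threading.Thread(target=stop.wait, daemon=True)
sender.start()

pid = os.fork()  # roughly what Celery's prefork pool does per worker
if pid == 0:
    # Child: the Thread object was copied across the fork, but the OS
    # thread itself was not, so Python reports it as no longer alive.
    # Any work queued for it in the child will never run.
    os._exit(0 if not sender.is_alive() else 1)

_, status = os.waitpid(pid, 0)
print("sender thread survived fork:", os.WEXITSTATUS(status) != 0)
stop.set()  # let the parent's thread exit cleanly
```

This is why the commented-out local producer works: it is created inside the task, i.e. after the fork, so its sender thread lives in the worker process that actually calls flush().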
Issue Analytics
- State:
- Created 6 years ago
- Reactions:3
- Comments:7 (1 by maintainers)
Hey @gzeronet, we are having the same issue: KafkaProducer does not work with multiprocessing. Could you please share some resources on how you solved this with gevent?
Sorry, that will not work. You cannot share a global producer with celery tasks. You will need to create a local producer for each task instance.
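A middle ground between the global producer and a fresh producer per call is one producer per worker process, created lazily after the fork. The sketch below is my own suggestion, not from the thread; it uses a stand-in factory so it runs without a broker. In a real task you would pass something like `lambda: KafkaProducer(bootstrap_servers=[...])` instead. Celery's `worker_process_init` signal, which fires once in each forked child, is another reasonable place to build the producer.

```python
import os

_producer = None
_producer_pid = None

def get_producer(factory):
    """Return a producer created in the current process.

    If the cached producer was created in a different process (i.e. before
    a fork), a new one is built, so each Celery worker gets its own.
    """
    global _producer, _producer_pid
    if _producer is None or _producer_pid != os.getpid():
        _producer = factory()
        _producer_pid = os.getpid()
    return _producer

# Stand-in factory for illustration only; it records each creation so we
# can see that the producer is built just once per process.
made = []
def fake_factory():
    made.append(object())
    return made[-1]

p1 = get_producer(fake_factory)
p2 = get_producer(fake_factory)
print(len(made))  # → 1: the factory ran only once in this process
```

With this pattern the task body calls `get_producer(...)` instead of touching a module-level producer, keeping the connection reuse you wanted without sharing a pre-fork producer across workers.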