KafkaProducer hangs when publishing to new topic (and using uWSGI master process)
I’m trying to upgrade from SimpleProducer to the newer KafkaProducer, but I’m running into an issue where the producer hangs when it tries to publish to a topic that doesn’t exist. This occurs only when the producer is run under uWSGI and uWSGI is configured to use a master process.
The messages are successfully published if uWSGI is run without a master process or if the topic already exists.
This behavior manifests with both Kafka 0.9.0.1 and 0.10.0.
It seems like this might be related to http://stackoverflow.com/questions/37693563/how-to-make-kafka-python-or-pykafka-work-as-an-async-producer-with-uwsgi-and-gev and https://github.com/dpkp/kafka-python/issues/709, but the fact that it only breaks for nonexistent topics makes me think it might be a separate issue.
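For context, the setup boils down to something like the following. This is a hypothetical minimal reproduction, not the actual application code; the broker address, topic name, and WSGI entry point are assumptions:

```python
# Hypothetical minimal reproduction: run under uWSGI with the master process
# enabled (e.g. master = true in uwsgi.ini) and publish to a topic that does
# not exist yet.
from kafka import KafkaProducer

# Created at import time, i.e. in the uWSGI master before workers are forked.
producer = KafkaProducer(bootstrap_servers='localhost:9092')

def application(environ, start_response):
    # Hangs here when 'brand-new-topic' does not exist and the producer
    # was created in the master process.
    producer.send('brand-new-topic', b'hello')
    producer.flush()
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'ok']
```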
Top GitHub Comments
Ah, thank you! Two issues:
First, you cannot share producer instances across processes, only threads. I expect that is why the master-process pattern is failing.
Second, producer.send() is asynchronous and is not guaranteed to deliver if you close the producer abruptly. In your final example I suspect your producer instances are so short-lived that they are being reaped before flushing all pending messages. To guarantee delivery (or an exception), call producer.send().get(timeout) or producer.flush(). Otherwise you’ll need to figure out how to get a producer instance per uWSGI thread and have it shared across requests (you would still want to flush before thread shutdown to guarantee no messages are dropped).
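A short sketch of the two delivery-guarantee options mentioned above; the broker address and topic name are assumptions:

```python
from kafka import KafkaProducer
from kafka.errors import KafkaError

producer = KafkaProducer(bootstrap_servers='localhost:9092')

# Option 1: block on each send until the broker acknowledges it
# (or a KafkaError is raised).
try:
    metadata = producer.send('my-topic', b'payload').get(timeout=10)
except KafkaError:
    # Delivery failed; log or retry as appropriate.
    raise

# Option 2: keep sends asynchronous, but flush before the process exits
# so buffered messages are not dropped.
producer.send('my-topic', b'payload')
producer.flush()
```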
uWSGI with the ‘master=false’ option works fine in this scenario, but I solved it by creating independent producer contexts in each process. I wrote a ‘mq’ module (see the sketch below):
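A minimal sketch of such a module, assuming the idea is a lazily created, per-process KafkaProducer; the file name mq.py, the broker address, and the function names below are hypothetical, not the original code:

```python
# mq.py -- hypothetical sketch of a per-process producer module.
# Each uWSGI worker creates its own KafkaProducer after the fork,
# so no producer instance is ever shared across processes.
import atexit
import os

from kafka import KafkaProducer

_producer = None
_producer_pid = None

def get_producer():
    """Return a KafkaProducer owned by the current process, creating it lazily."""
    global _producer, _producer_pid
    pid = os.getpid()
    if _producer is None or _producer_pid != pid:
        _producer = KafkaProducer(bootstrap_servers='localhost:9092')
        _producer_pid = pid
        # Flush buffered messages when this worker process shuts down.
        atexit.register(_producer.flush)
    return _producer
```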
And in the app logic:
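Again only a sketch, reusing the hypothetical mq module above; the topic name and handler are assumptions:

```python
# Hypothetical request-handler usage of the mq sketch above.
from mq import get_producer

def handle_request(payload):
    producer = get_producer()  # per-process instance, created on first use
    producer.send('events', payload.encode('utf-8'))
    # To block until the broker acknowledges this message instead:
    # producer.send('events', payload.encode('utf-8')).get(timeout=10)
```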