[QUESTION] Example for running a Celery task which includes a DB transaction in an async FastAPI app
Is there any working example of running a Celery task that performs some (async) DB transactions?
Note: I'm using MongoDB (motor) for the DB utilities.
This URL is a well-documented example of running FastAPI with MongoDB asynchronously.
I've checked all the project template generators as well. None of them has an example of a Celery task interacting with DB transactions. (I'd be happy to contribute one once the current issues are resolved.)
I've found that the current (stable) Celery release has no support for async functions. Until that lands, using FastAPI with Celery would mean a massive rewrite for me (I'd have to drop MongoDB's async driver).
Is there a better way to deal with this situation, so that FastAPI can have some endpoints working with MongoDB (motor) transactions and other endpoints (the ones that trigger Celery tasks) consuming DB functions synchronously?
In a nutshell:
- Q1: How do I handle async DB transactions in a Celery task?
- Q2: Can we have both async and sync DB connections in FastAPI?
The reason I'm asking these questions is that I'm getting the exception below in Celery tasks that depend on DB calls (MongoDB motor):
```
kombu.exceptions.EncodeError: Object of type coroutine is not JSON serializable
```
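That error usually means the task returned a coroutine that was never awaited: a sync Celery task called an async motor method, and Celery then tried to JSON-serialize the coroutine object as the task result. Below is a minimal sketch of the failure mode and the common workaround of driving the coroutine with `asyncio.run()` inside the task; the app configuration, URIs, and names such as `save_doc` are placeholders, not from the original issue.

```python
import asyncio

from celery import Celery
from motor.motor_asyncio import AsyncIOMotorClient

app = Celery("worker", broker="amqp://guest@localhost//", backend="rpc://")


@app.task
def save_doc_broken(doc: dict):
    client = AsyncIOMotorClient("mongodb://localhost:27017")
    # BUG: insert_one() returns a coroutine; nothing awaits it, so the
    # coroutine object becomes the task's return value and Celery raises
    # "Object of type coroutine is not JSON serializable".
    return client.mydb.mycol.insert_one(doc)


@app.task
def save_doc(doc: dict) -> str:
    async def _run() -> str:
        # Create the client inside the coroutine so it binds to the
        # event loop that asyncio.run() creates for this call.
        client = AsyncIOMotorClient("mongodb://localhost:27017")
        result = await client.mydb.mycol.insert_one(doc)
        return str(result.inserted_id)

    # The task itself stays synchronous; it just drives the async code
    # to completion before returning a JSON-serializable value.
    return asyncio.run(_run())
```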
@tiangolo Any thoughts would be appreciated, since I really don't want to lose the power of async and its performance advantages with FastAPI.
Top GitHub Comments
@tiangolo I'd already tried ARQ, but it couldn't handle too much load; it failed my load testing. Celery was the only solution in my case, though ARQ works pretty well if your tasks aren't very complicated.
Nevertheless, Celery coroutine support is in the pipeline and is expected to ship in Celery 5.0, per the GitHub discussion. The code for it is almost ready, but I've reported an issue with it.
celery app file, worker.py:
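A minimal sketch of what worker.py might contain, assuming RabbitMQ as the broker; the URLs and the `rpc://` result backend are placeholders:

```python
# worker.py -- the Celery application that the worker process runs.
from celery import Celery

app = Celery(
    "worker",
    broker="amqp://guest:guest@localhost:5672//",  # RabbitMQ broker (placeholder URL)
    backend="rpc://",                              # placeholder result backend
    include=["tasks.default"],                     # import the task module at startup
)
```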
task definition file, tasks/default.py:
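A minimal sketch of the task module, assuming asyncpg for the Postgres access mentioned below; the DSN, table, and task names are placeholders. The pattern is the same as in the sketch above: the task stays synchronous and drives the async transaction with `asyncio.run()`.

```python
# tasks/default.py -- task definitions.
import asyncio

import asyncpg

from worker import app

DSN = "postgresql://user:password@localhost:5432/mydb"  # placeholder DSN


async def _insert_event(payload: str) -> None:
    # A fresh connection per invocation keeps it bound to the
    # short-lived event loop created by asyncio.run() below.
    conn = await asyncpg.connect(dsn=DSN)
    try:
        async with conn.transaction():
            await conn.execute(
                "INSERT INTO events (payload) VALUES ($1)", payload
            )
    finally:
        await conn.close()


@app.task(name="tasks.default.store_event")
def store_event(payload: str) -> str:
    # Synchronous from Celery's point of view; the async DB transaction
    # is driven to completion before the task returns.
    asyncio.run(_insert_event(payload))
    return "stored"
```

Note that asyncio event loops and the gevent pool do not always mix well; if you hit problems under `--pool=gevent`, a synchronous driver such as psycopg2 inside the task is a safe fallback.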
a file to expose to fastapi, tq.py:
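A minimal sketch; the point of a separate tq.py is that the FastAPI process can enqueue by task name via `send_task()` without importing the task implementations (and their DB dependencies). `enqueue_store_event` is a hypothetical helper name.

```python
# tq.py -- thin facade the web app uses to enqueue tasks.
from worker import app


def enqueue_store_event(payload: str):
    # send_task() only needs the registered task name, so the FastAPI
    # process never has to import tasks/default.py itself.
    return app.send_task("tasks.default.store_event", args=[payload])
```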
in your FastAPI routes:
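For example (a sketch; the route path and payload shape are made up):

```python
# routes.py -- example FastAPI endpoint that enqueues the task.
from fastapi import FastAPI

from tq import enqueue_store_event

api = FastAPI()


@api.post("/events")
async def create_event(payload: str):
    result = enqueue_store_event(payload)
    # Enqueueing is non-blocking; return the task id so the
    # client can poll for the outcome later.
    return {"task_id": result.id}
```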
When the RabbitMQ server and the Postgres database are ready, start the Celery worker:

```bash
celery -A worker.app worker --loglevel=INFO --pool=gevent --concurrency=10
```

Depending on your actual task, you can try out different worker pool settings ('processes', 'threads', etc.) and concurrency settings. @cs-satish-mishra
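For instance, two common alternatives (the flag values are standard Celery pool names):

```bash
# default prefork pool, good for CPU-bound tasks
celery -A worker.app worker --loglevel=INFO --pool=prefork --concurrency=4

# thread pool (Celery >= 4.4), good for I/O-bound tasks without gevent
celery -A worker.app worker --loglevel=INFO --pool=threads --concurrency=20
```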