Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

[QUESTION] Example of running a Celery task that includes a DB transaction in an async FastAPI app

See original GitHub issue

Is there any working example of running a Celery task that performs async DB transactions?

Note: I’m using MongoDB (motor) for the DB utilities.

This URL is a well-documented example of running FastAPI with MongoDB asynchronously.

I’ve checked all the project template generators as well; none of them has an example of a Celery task interacting with DB transactions. (I’d be happy to contribute one once these current issues are resolved.)

I’ve found that the current (stable) Celery release has no support for async functions. Until that lands, using FastAPI with Celery would mean a massive rewrite for me (I’d have to drop the async MongoDB driver).

Is there a better way to deal with this situation, so that some FastAPI endpoints can work with MongoDB (motor) transactions while other endpoints (those that trigger Celery tasks) consume the DB functions synchronously?

In a nutshell:
Q1: How do we handle async DB transactions in a Celery task?
Q2: Can we have both async and sync DB connections in FastAPI?

The reason I’m asking is that I’m getting the exception below in Celery tasks that depend on DB calls (MongoDB motor):

kombu.exceptions.EncodeError: Object of type coroutine is not JSON serializable

@tiangolo Any thoughts would be appreciated, since I really don’t want to lose the power of async and its performance advantages with FastAPI.
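For context, that EncodeError typically means an async def function was registered directly as a Celery task: Celery calls the function, receives an unawaited coroutine object, and then fails while trying to JSON-serialize that coroutine as the task result. A minimal sketch of the common workaround, keeping the registered task synchronous and driving the async body with asyncio.run() (the database name, collection, and connection strings here are illustrative, not from the original issue):

import asyncio

from celery import Celery
from motor.motor_asyncio import AsyncIOMotorClient

app = Celery('app', broker='amqp://localhost')

@app.task
def insert_doc(payload: dict) -> str:
    async def _run() -> str:
        # Create the motor client inside the coroutine so it is tied to
        # the fresh event loop that asyncio.run() creates for this task.
        client = AsyncIOMotorClient('mongodb://localhost:27017')
        result = await client.mydb.mycoll.insert_one(payload)
        return str(result.inserted_id)

    return asyncio.run(_run())

In this arrangement the FastAPI process keeps its own async motor client, while each Celery task spins up a short-lived event loop of its own, which also speaks to Q2.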

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

2 reactions
codesutras commented, Jun 8, 2020

> Thanks for reporting back and closing the issue.
>
> If using async functions for distributed background tasks, I would try ARQ.

@tiangolo I’d already tried ARQ, but it couldn’t handle much load and failed in my load testing; Celery was the only solution in my case. That said, ARQ works pretty well if you don’t have very complicated tasks to perform.

Nevertheless, Celery coroutine support is in the pipeline and, per the GitHub discussion, is slated for Celery 5.0. The code for it is almost ready, but I’ve reported an issue in it.
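For reference, ARQ task functions are themselves coroutines executed inside the worker’s event loop, so async drivers can be awaited directly with no sync wrapper. A minimal sketch, assuming a local Redis instance (the task body and names are illustrative):

import asyncio

from arq import create_pool
from arq.connections import RedisSettings

async def say_hello(ctx, name: str) -> str:
    # ctx is the worker context dict ARQ passes to every task;
    # async DB calls can be awaited here directly.
    await asyncio.sleep(0.1)
    return f'hello {name}'

class WorkerSettings:
    # Start the worker with: arq worker.WorkerSettings
    functions = [say_hello]
    redis_settings = RedisSettings()

async def main():
    # Enqueue a job from the web process.
    redis = await create_pool(RedisSettings())
    await redis.enqueue_job('say_hello', 'world')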

1 reaction
ZhengRui commented, Nov 4, 2022

Celery app file, worker.py:

from celery import Celery

# RPC result backend over a local RabbitMQ broker
app = Celery('app', backend='rpc://', broker='amqp://localhost')

Task definition file, tasks/default.py:

import asyncio

from database import Database  # the project's async DB helper (not shown here)

from ..worker import app

db = Database('postgresql://user:password@localhost:port/db')  # placeholder DSN

async def some_task(param1: str, param2: str, current_task_ref):
    await db.connect()

    # interact with db, reporting progress through the bound task reference
    tot = 10_000  # must be an int for range() below
    for i in range(tot):
        current_task_ref.update_state(state='PROGRESSING', meta={'current': i + 1, 'total': tot})

    await db.close()
    return {'current': tot, 'total': tot}

@app.task(bind=True, acks_late=True)
def some_task_t(self, *args, **kwargs):
    # The registered task stays sync; asyncio.run() drives the async body
    # to completion on a fresh event loop for each invocation.
    return asyncio.run(some_task(*args, **kwargs, current_task_ref=self))

A file to expose to FastAPI, tq.py:

from celery.result import AsyncResult

from .tasks.default import (
    some_task_t,
    another_task_t,
)

def get_task_progress(task_id):
    # .info carries the meta dict passed to update_state while the task
    # is running, and the task's return value once it has finished.
    result = AsyncResult(task_id)
    return result.info

In your FastAPI routes:

import tq

from fastapi import APIRouter, Depends

from database import Database  # same async DB helper as above
# get_db is the project's DB dependency (its definition is not shown here)

r = APIRouter()  # the router instance (defined elsewhere in the original excerpt)

@r.post('/sometask')
async def some_task_r(
    param1: str,
    param2: str,
    db: Database = Depends(get_db),
):
    task = tq.some_task_t.delay(
        param1,
        param2,
    )

    return task.task_id

@r.get('/sometask/progress')
async def get_some_task_progress_r(
    task_id: str,
):
    return tq.get_task_progress(task_id)

Once the RabbitMQ server and the Postgres database are ready, start the Celery worker:

celery -A worker.app worker --loglevel=INFO --pool=gevent --concurrency=10

Depending on your actual task, you can try out different worker pool settings (‘processes’, ‘threads’, etc.) and concurrency settings. @cs-satish-mishra
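To exercise these routes end to end, a hypothetical client could poll the progress endpoint until the meta reported via update_state reaches its total. A sketch assuming the app is served at localhost:8000 and using the requests package (both assumptions, not part of the original answer):

import time

import requests

BASE = 'http://localhost:8000'  # assumed address of the FastAPI app

# Kick off the task; the route returns the Celery task id as JSON.
task_id = requests.post(f'{BASE}/sometask', params={'param1': 'a', 'param2': 'b'}).json()

while True:
    # .info is None while PENDING, then the meta dict from update_state,
    # and finally the task's return value once it succeeds.
    info = requests.get(f'{BASE}/sometask/progress', params={'task_id': task_id}).json()
    print(info)
    if info and info.get('current') == info.get('total'):
        break
    time.sleep(1)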

Read more comments on GitHub.

Top Results From Across the Web

Working with Celery and Django Database Transactions
In this article, we'll look at how to prevent a Celery task dependent on a Django database transaction from executing before the database...

Async Architecture with FastAPI, Celery, and RabbitMQ
In this tutorial, we will see how we can integrate Celery into FastAPI application to perform asynchronous tasks without blocking user's requests.

tiangolo/fastapi - Gitter
I'm using celery to run tasks on-demand, and send the logging back through the broker (RabbitMQ). A WebSocket endpoint forwards messages for a...

SQL (Relational) Databases - FastAPI
So, you can copy this example and run it as is. Later, for your production application, you might want to use a database...

Save an object using Celery and Gino - python
If I delete 'async/await' from Celery worker, this error doesn't show up, Celery task has 'SUCCESS' state but database is empty. Can't really...
