[QUESTION] Merging large JSON fields in response schemas
First check
- I used the GitHub search to find a similar issue and didn’t find it. https://github.com/tiangolo/fastapi/issues/701#issuecomment-552312286
- I searched the FastAPI documentation, with the integrated search.
- I already searched in Google “How to X in FastAPI” and didn’t find any information.
Description
I am dealing with JSON data on the order of 10 MB (up to hundreds of MB) which I fetch directly from a Postgres instance. This data is stored as JSONB in the database. To fetch this large amount of data without parsing it into a dictionary, I do the following:
items = db_session.query(models.Table.id, cast(models.Table.data, String)).filter_by(id=id).all()
Since I know this data has been properly validated when it was inserted I just use the construct factory from pydantic:
class Item(BaseModel):
    id: int
    data: Union[A, B, C, D]

built_items = [schemas.Item.construct(id=x[0], data=x[1]) for x in items]
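For context, pydantic's `construct` classmethod builds a model instance without running any validation, which is why it is fast but trusts its input completely; a minimal illustration (the `A` model and the values are stand-ins):

```python
from pydantic import BaseModel

class A(BaseModel):
    x: int

class Item(BaseModel):
    id: int
    data: A

# construct() skips validation entirely, so the raw JSON string is
# stored in `data` as-is instead of being parsed into an `A` instance.
item = Item.construct(id=1, data='{"x": 1}')
```

This is exactly why the pattern above is safe only when the data was validated on insert.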
Then, on the endpoint I directly return a response using:
starlette.responses.JSONResponse(content=jsonable_encoder(built_items))
But I still declare the response_model as List[Item], as I need the documentation for this endpoint.
Using this strategy I achieve really good response times, but the original JSON data now travels as an encoded string rather than as a JSON object.
So clients of the API have to run JSON decoding multiple times: once for the response itself, and once more for each JSON object it contains.
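To make the double-decoding concrete, here is a minimal sketch of what a client sees when the `data` field arrives as a JSON-encoded string (the payload values are made up):

```python
import json

# Server side: `data` was fetched as a string, so encoding the response
# nests JSON inside JSON.
item = {"id": 1, "data": json.dumps({"a": [1, 2, 3]})}
body = json.dumps([item])

# Client side: one decode for the response body...
items = json.loads(body)
# ...and a second decode for every `data` field.
data = json.loads(items[0]["data"])
```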
Is there any good practice on how to tackle this problem?
Issue Analytics
- Created 4 years ago
- Comments: 14 (9 by maintainers)
Top GitHub Comments
When fetching data that’s been pre-encoded in JSON, I generally create a custom response class to skip JSON encoding and validation completely. FastAPI will acknowledge `response_model` in your route for any subclass of `JSONResponse`, so all you need to do is this:

The problem I’m seeing here is that you don’t get “raw” JSON from the DB; you get individual JSON elements that still need to be assembled together. What you could always do is use Postgres’
`json_array_agg()` function to aggregate a column of JSON elements into a JSON array. Since `id` and `data` are also stored separately, you’ll probably also need something like `json_build_object('id', id, 'data', data)`, and then wrap that in `json_array_agg()`. I don’t know how hard that would be to achieve in ORM mode, though.
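The aggregation idea can be sketched outside Postgres too. Here is the same shape using SQLite's built-in JSON functions, where `json_object`/`json_group_array` play the role of Postgres' `json_build_object`/aggregation into an array (the table name and columns are stand-ins):

```python
import json
import sqlite3

# In-memory table mimicking the described schema: an integer id plus a
# pre-encoded JSON string column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER, data TEXT)")
conn.executemany(
    "INSERT INTO items VALUES (?, ?)",
    [(1, '{"x": 1}'), (2, '{"x": 2}')],
)

# Build the whole response body as one JSON document inside the
# database: json() re-interprets the stored text as a JSON value, so
# `data` ends up nested as an object rather than a string.
(payload,) = conn.execute(
    "SELECT json_group_array(json_object('id', id, 'data', json(data)))"
    " FROM items"
).fetchone()
```

The application can then hand `payload` straight to the client without ever parsing or re-encoding it.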
Sorry for the delay, yes it did!