Uploading multiple files of 256 MB each is very slow
First Check
- I added a very descriptive title to this issue.
- I used the GitHub search to find a similar issue and didn’t find it.
- I searched the FastAPI documentation, with the integrated search.
- I already searched in Google “How to X in FastAPI” and didn’t find any information.
- I already read and followed all the tutorial in the docs and didn’t find an answer.
- I already checked if it is not related to FastAPI but to Pydantic.
- I already checked if it is not related to FastAPI but to Swagger UI.
- I already checked if it is not related to FastAPI but to ReDoc.
Commit to Help
- I commit to help with one of those options 👆
Example Code
from typing import List

from fastapi import FastAPI, File, UploadFile
import aiofiles

app = FastAPI()

# Upload files using FastAPI's UploadFile
@app.post("/upload")
async def upload1(files: List[UploadFile] = File(...)):
    for file in files:
        try:
            contents = await file.read()
            async with aiofiles.open(file.filename, 'wb') as f:
                await f.write(contents)
        except Exception:
            return {"message": "There was an error uploading the file(s)"}
        finally:
            await file.close()
    return {"message": f"Successfully uploaded {[file.filename for file in files]}"}
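Note that upload1 also pulls each 256 MB file fully into memory with a single file.read(). A hedged sketch of draining an UploadFile-like object in bounded chunks instead is below; the FakeUpload and copy_in_chunks names are illustrative, not FastAPI API, and in a real endpoint you would pass file.read together with an aiofiles writer.

```python
import asyncio
import io

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read; tune as needed

class FakeUpload:
    """Stand-in for UploadFile with an async read(), for demonstration only."""
    def __init__(self, data: bytes):
        self._buf = io.BytesIO(data)

    async def read(self, n: int) -> bytes:
        return self._buf.read(n)

async def copy_in_chunks(upload, write, chunk_size=CHUNK_SIZE):
    """Drain an upload via repeated bounded reads, so a large file
    never has to fit in memory at once. Returns total bytes copied."""
    total = 0
    while True:
        data = await upload.read(chunk_size)
        if not data:
            break
        await write(data)
        total += len(data)
    return total

async def demo():
    sink = io.BytesIO()

    async def write(b):
        sink.write(b)

    # 3 MiB + 5 bytes, copied in 1 MiB reads
    return await copy_in_chunks(FakeUpload(b"x" * (3 * CHUNK_SIZE + 5)), write)

print(asyncio.run(demo()))  # 3145733
```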
# STREAM method to upload file
from fastapi import Request
import aiofiles

@app.post('/upload')
async def upload2(request: Request):
    try:
        filename = request.headers['filename']
        async with aiofiles.open(filename, 'wb') as f:
            async for chunk in request.stream():
                await f.write(chunk)
    except Exception:
        return {"message": "There was an error uploading the file"}
    return {"message": "Successfully uploaded (unknown)"}
Description
If I upload 6 files of 256 MB each in parallel using upload1, FastAPI appears stuck: upload1 is not even entered until all 6 files have been transferred. From https://stackoverflow.com/questions/65342833/fastapi-uploadfile-is-slow-compared-to-flask/70667530#70667530 I understand that if a file is bigger than 1 MB it is written to disk (to a tmp directory, but I couldn't find it there for some reason). I then tried the second approach, upload2 (stream), but the upload got stuck on this line: async for chunk in request.stream()

I'm wondering:
- Is there anything wrong with these implementations (upload1 or upload2)?
- Why, in upload1, are all the files transferred before the upload1 function is entered (for each upload)?
- How can I generate a bigger chunk for upload2?
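Regarding the last question: one way to get bigger effective chunks out of request.stream() is to re-buffer the small chunks the server delivers into larger writes before they hit the disk. A minimal sketch, assuming a helper of my own naming (rebuffer is not a FastAPI/Starlette API):

```python
import asyncio

async def rebuffer(chunks, min_size=1024 * 1024):
    """Accumulate chunks from an async byte iterator and yield buffers
    of at least min_size bytes (the final buffer may be smaller)."""
    buf = bytearray()
    async for chunk in chunks:
        buf += chunk
        if len(buf) >= min_size:
            yield bytes(buf)
            buf.clear()
    if buf:
        yield bytes(buf)

# Example: 10 chunks of 64 KiB rebuffered into 256 KiB-minimum writes
async def demo():
    async def small_chunks():
        for _ in range(10):
            yield b"x" * (64 * 1024)
    return [len(b) async for b in rebuffer(small_chunks(), 256 * 1024)]

print(asyncio.run(demo()))  # [262144, 262144, 131072]
```

In upload2 the write loop would then become: async for chunk in rebuffer(request.stream()): await f.write(chunk). This reduces the number of disk writes, though it does not by itself explain why the stream stalls.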
Operating System
Linux
Operating System Details
Ubuntu 18.04 on VMware
FastAPI Version
0.78.0
Python Version
Python 3.8
Additional Context
No response
Issue Analytics
- Created a year ago
- Comments: 15 (8 by maintainers)
Top GitHub Comments
How is it with pure Starlette?
No… There’s nothing hidden in my question… I’m asking about the behavior on Starlette. 🤔
If the issue still happens with pure Starlette, then it’s not a FastAPI issue.