Issue
I have a containerized app using Flask + Nginx + Gunicorn + Celery that processes information from .xml files.
The app runs on an AWS Lightsail instance with 4 GB of RAM and 2 vCPUs.
The files have grown larger than 400 MB and take 40+ minutes to upload.
- Does anyone know how to make uploads faster? Would increasing the vCPUs help? (I don't believe it would.)
This is my sample code:
@app.route("/", methods=("GET", "POST"), strict_slashes=False)
# @login_required
async def index():
    try:
        if request.method == 'POST' and request.content_length < app.config['MAX_CONTENT_LENGTH']:
            file = request.files['inputfile']
            nome_xml = secure_filename(file.filename)
            if nome_xml.lower().endswith('.xml'):
                exists = db.session.query(XML).filter_by(xml=nome_xml).first()
                os.makedirs(app.config['UPLOAD_PATH'], exist_ok=True)
                upload_path = os.path.join(app.config['UPLOAD_PATH'], nome_xml)
                file.save(upload_path)  # <<<< THIS IS TAKING SO LONG!
                if not exists:
                    process_xml_file.delay(nome_xml, upload_path, 'testuser', id_evento_db)
                db.session.close_all()
                return render_template('home/index.html')
            else:
                logger.info(f"File invalid or already uploaded previously: {secure_filename(file.filename)}")
                return render_template('home/index.html')
    except Exception:
        logger.exception("Upload failed")
        return render_template('home/index.html')
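Since the heavy XML parsing already runs in Celery, it is worth keeping the worker's memory bounded for 400 MB files. Below is a minimal sketch of what the body of a task like `process_xml_file` could do with `xml.etree.ElementTree.iterparse`, which streams the file instead of loading it whole; the tag name `"item"` and the function name `process_large_xml` are placeholders, not part of the original code.

```python
import xml.etree.ElementTree as ET

def process_large_xml(path, tag="item"):
    """Stream a large XML file, handling one element at a time."""
    count = 0
    # iterparse yields elements incrementally, so the whole tree is
    # never held in memory at once.
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            count += 1   # replace with the real per-record processing
            elem.clear()  # release the element's children to free memory
    return count
```

Calling `elem.clear()` after each record is what keeps memory flat regardless of file size.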
Solution
You can use the Dropzone.js library and integrate it into your upload form. You will also need to enable chunking, so the browser sends the file in smaller pieces instead of one long request.
Please refer to https://codecalamity.com/uploading-large-files-by-chunking-featuring-python-flask-and-dropzone-js/
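On the server side, a chunked upload means the Flask view receives many small POSTs and reassembles them. A minimal sketch, assuming Dropzone's standard chunking form fields (`dzchunkindex`, `dztotalchunkcount`) and sequential chunk uploads (Dropzone's default); the route name `/upload` and directory `UPLOAD_DIR` are illustrative, not from the original app:

```python
import os
from flask import Flask, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
UPLOAD_DIR = "/tmp/uploads"  # assumption: swap in app.config['UPLOAD_PATH']

@app.route("/upload", methods=["POST"])
def upload_chunk():
    file = request.files["file"]
    name = secure_filename(file.filename)
    chunk_index = int(request.form["dzchunkindex"])
    total_chunks = int(request.form["dztotalchunkcount"])
    os.makedirs(UPLOAD_DIR, exist_ok=True)
    path = os.path.join(UPLOAD_DIR, name)
    # First chunk truncates any stale file; later chunks append in order.
    mode = "wb" if chunk_index == 0 else "ab"
    with open(path, mode) as f:
        f.write(file.read())
    if chunk_index + 1 == total_chunks:
        # All chunks received: hand off to Celery here, e.g.
        # process_xml_file.delay(name, path, ...)
        return "complete", 200
    return "chunk received", 200
```

Each request now stays small, so a slow or dropped connection only costs one chunk, and Dropzone can retry individual chunks rather than the whole 400 MB file.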
Answered By - Mike