Large file upload issue (working locally but not on the server)
I have a Django-based application with file upload functionality, deployed as an Azure Web App. Small files (the largest I have tried is 20 MB) upload fine both locally and on the server. Large files (tested with 495 MB and 990 MB) work in the local environment, but once the application is deployed to Azure the upload fails and, after some time, the request returns a 504 Gateway Timeout.
I have been reading various articles and posts about handling large files, and every one of them seems to suggest implementing a chunked upload mechanism. What I don't understand is why the upload works locally but not once the application is deployed to the server.
Is there any setting that needs to be changed in Django or in the HTML to make this work?
Solution 1:
Thank you snowman729. Posting your suggestions as an answer to help other community members.
By default, Django keeps uploaded file data in memory if it is smaller than 2.5 MB. Anything larger is streamed to a temporary file in the server's /tmp directory and then copied to its final location when the transfer completes. Many of Django's file upload settings can be customised; details are available in the documentation.
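For reference, the settings involved live in settings.py. This is only a sketch of where those knobs are; the values shown are Django's documented defaults or illustrative placeholders, not recommendations for fixing the 504.

```python
# settings.py -- a minimal sketch of Django's upload-related settings.
# Values shown are Django's documented defaults or illustrative placeholders.

# Uploads smaller than this stay in memory; larger ones are streamed to disk.
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440        # 2.5 MB (Django's default)

# Maximum size of the request body, excluding file upload data, before
# Django raises RequestDataTooBig.
DATA_UPLOAD_MAX_MEMORY_SIZE = 2621440        # 2.5 MB (Django's default)

# Directory used for files streamed to disk; by default Django uses the
# operating system's standard temporary directory.
FILE_UPLOAD_TEMP_DIR = "/tmp"

# Upload handlers; the defaults switch from the in-memory handler to the
# temporary-file handler once FILE_UPLOAD_MAX_MEMORY_SIZE is exceeded.
FILE_UPLOAD_HANDLERS = [
    "django.core.files.uploadhandler.MemoryFileUploadHandler",
    "django.core.files.uploadhandler.TemporaryFileUploadHandler",
]
```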
Before we consider any technical constraints, uploading such large files through the browser will give the user a very poor experience. You are also likely to run into problems on the server: apart from the very long time each worker thread spends dealing with the streamed data, there is also the time it takes for the system to copy the resulting file from /tmp to its correct location.
Reference is here
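To show where that copy from /tmp to the final location actually happens, here is a rough sketch of an upload view. It assumes a form that posts a single file under the name "file"; names such as handle_upload and UPLOAD_DIR are hypothetical and used only for illustration.

```python
# views.py -- a minimal sketch of a Django upload view (illustrative names).
import os

from django.http import HttpResponse, HttpResponseBadRequest

UPLOAD_DIR = "/var/data/uploads"  # hypothetical destination directory


def handle_upload(request):
    if request.method != "POST" or "file" not in request.FILES:
        return HttpResponseBadRequest("No file posted")

    uploaded = request.FILES["file"]
    destination_path = os.path.join(UPLOAD_DIR, uploaded.name)

    # chunks() iterates over the data without loading the whole file into
    # memory; for a large upload this reads back from the temporary file
    # that Django streamed to /tmp while the request body was received.
    with open(destination_path, "wb") as destination:
        for chunk in uploaded.chunks():
            destination.write(chunk)

    return HttpResponse("Upload complete")
```

For a 500 MB or 1 GB file, the request thread is occupied for the entire duration of this loop, on top of the time spent receiving the upload itself, which is why the answer above recommends keeping an eye on these costs.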
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | MadhurajVadde-MT |