I need to implement file uploads in an ASP.NET application that runs within our corporate network. At the moment we're using a very generic asynchronous upload, but the files are getting increasingly big: we're already at the cap of about 3.9 GB per file set through maxAllowedContentLength, and since that setting is a uint it won't allow anything more. Soon the files users are supposed to upload will exceed this value and may reach up to 100 GB in size.
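For reference, the relevant bit of our web.config looks roughly like this (4294967295 bytes is the uint maximum):

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- uint maximum: roughly 4 GB is as high as this setting goes -->
      <requestLimits maxAllowedContentLength="4294967295" />
    </requestFiltering>
  </security>
</system.webServer>
```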
I tried looking online for a solution to this problem, but in most articles "large files" means 1 GB at best.
So is there any way to upload really large files (up to 100 GB) through an ASP.NET MVC/WebAPI application, or do I need to look for alternative solutions?
Yes there is: you need to split the file into smaller parts. See the example here: http://forums.asp.net/t/1742612.aspx?How+to+upload+a+big+file+in+Mvc+
You could consider sending it in chunks. This gets around the large-file limit (each request is only the size of a single chunk), but it is slightly more complicated on both the client and the server side.
I've done something similar for streaming uploaded files over a WebSocket, but this could easily be done with multiple AJAX requests. In either case you'll want to use the JavaScript File API to read a segment of the file on the client's computer, encode that segment to something you can send (probably Base64), and send that particular segment to the web server. You could also send additional data, such as the file position, to ensure the server writes the file correctly. The server can choose how to respond (it can be as simple as a "true" to acknowledge receipt), after which the client JavaScript reads and sends the next chunk of the file.
I have a demo of this using WebSockets in a GitHub repo (along with the ASP.NET MVC server-side code), but with a few tweaks you could easily turn it into sequential AJAX requests.
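If you go the AJAX route, the receiving side can be as small as this sketch (the action name, parameters, and upload folder are all illustrative; this is not the repo's code):

```csharp
using System;
using System.IO;
using System.Web.Mvc;

public class ChunkUploadController : Controller
{
    // Receives one Base64-encoded chunk per request and writes it at the
    // stated offset, so the client can send chunks one after another.
    [HttpPost]
    public ActionResult UploadChunk(string fileName, long position, string chunkBase64)
    {
        byte[] chunk = Convert.FromBase64String(chunkBase64);

        // Illustrative path; sanitize fileName before doing this for real.
        string path = Path.Combine(Server.MapPath("~/App_Data/uploads"), fileName);

        using (var stream = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
        {
            stream.Seek(position, SeekOrigin.Begin);
            stream.Write(chunk, 0, chunk.Length);
        }

        // Acknowledge receipt so the client reads and sends the next chunk.
        return Json(true);
    }
}
```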
I've been following the file upload samples MS offers, and various other examples on here, and after many days of trying to get this to work I am stuck.
I need to be able to upload files up to 10 GB, and I am using the physical-file streaming method. I have changed the request size limit, and since I am using IIS I have turned off request filtering so that a file over 4 GB is accepted. But with any file over 4 GB, the controller is hit and then errors with "Unexpected end of stream". I have the DisableFormBinding attribute, I have tried enabling buffering, and I have tried ignoring the AntiForgeryToken. I am out of ideas.
Is a file over 4 GB simply impossible via streaming? Do I need to fall back to an older chunking method?
If you have files that large, never use byte[] or MemoryStream in your code. Operate only on streams when you download or upload files.
ASP.NET Core supports uploading one or more files using buffered model binding for smaller files and unbuffered streaming for larger files.
File upload scenarios
Two general approaches for uploading files are buffering and streaming.
1 - Buffering
The entire file is read into an IFormFile, which is a C# representation of the file used to process or save the file.
The resources (disk, memory) used by file uploads depend on the number and size of concurrent file uploads. If an app attempts to buffer too many uploads, the site crashes when it runs out of memory or disk space. If the size or frequency of file uploads is exhausting app resources, use streaming.
Any single buffered file exceeding 64 KB is moved from memory to a temp file on disk.
Path.GetTempFileName throws an IOException if more than 65,535 files are created without deleting previous temporary files. This 65,535-file limit is a per-server limit on Windows.
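As a minimal illustration of buffered binding (the action and save path are placeholders, not part of the docs):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class BufferedUploadController : Controller
{
    // Buffered model binding: the whole upload is read into the IFormFile
    // (memory first, spilling to a temp file past 64 KB) before this runs.
    [HttpPost]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file is null || file.Length == 0)
            return BadRequest();

        // Illustrative target; validate the client file name in real code.
        var path = Path.Combine(@"C:\uploads", Path.GetRandomFileName());
        using (var target = System.IO.File.Create(path))
        {
            await file.CopyToAsync(target);
        }
        return Ok();
    }
}
```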
2 - Streaming
The file is received from a multipart request and directly processed or saved by the app. Streaming doesn't significantly improve performance; what it reduces is the demand for memory and disk space when uploading files.
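A stripped-down version of the streaming approach, walking the multipart sections with a MultipartReader (paths, validation, and the form-binding-disabling attribute from the docs sample are omitted for brevity):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

public class StreamedUploadController : Controller
{
    // Streaming: read the multipart sections ourselves and copy each file
    // section straight to disk, so the upload is never buffered as a whole.
    [HttpPost]
    public async Task<IActionResult> UploadLargeFile()
    {
        var boundary = HeaderUtilities.RemoveQuotes(
            MediaTypeHeaderValue.Parse(Request.ContentType).Boundary).Value;
        var reader = new MultipartReader(boundary, Request.Body);

        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            if (ContentDispositionHeaderValue.TryParse(
                    section.ContentDisposition, out var disposition)
                && disposition.FileName.HasValue) // a file, not a form field
            {
                // Illustrative target path; sanitize names in real code.
                var path = Path.Combine(@"C:\uploads", Path.GetRandomFileName());
                using (var target = System.IO.File.Create(path))
                {
                    await section.Body.CopyToAsync(target);
                }
            }
            section = await reader.ReadNextSectionAsync();
        }
        return Ok();
    }
}
```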
For more details: Upload files in ASP.NET Core 5
I think this might help: Upload Large Files To MVC / WebAPI Using Partitioning
If you host your web application on IIS, the problem may be at the IIS level: there is a request filter that does not allow you to upload such large files. You can unlock this filter directly in IIS. If you are using Kestrel, you need to configure it as well. More information can be found here:
https://www.webdavsystem.com/server/documentation/large_files_iis_asp_net/
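For Kestrel specifically, the body-size cap can be lifted in code; a minimal sketch (null means no limit, and the rest of the startup is elided):

```csharp
// Program.cs (.NET 6 minimal hosting): lift Kestrel's request body cap.
// Combine with the IIS request-filtering change if IIS sits in front.
var builder = WebApplication.CreateBuilder(args);
builder.WebHost.ConfigureKestrel(options =>
{
    options.Limits.MaxRequestBodySize = null; // null = unlimited
});

var app = builder.Build();
// ... map endpoints as usual ...
app.Run();
```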
I am using the WebClient.UploadFileAsync function to call a REST web service that uploads files to a server. The uploads can also be done from a web application.
The server-side processing takes milliseconds, so most of the upload time is spent in transport. I am able to upload a 6.28 MB file from the web application in 2 minutes, but the same upload done from my WinForms application using WebClient.UploadFileAsync takes 3 minutes.
The difference between the browser upload and the web service upload is that the former saves the file directly to the server, whereas with the web service the service is called first and the file is then saved to the server.
So, what is the reason for such a huge difference in speed? And how can this difference be reduced?
Update: I tried using Fiddler as suggested and found something interesting. When I uploaded a file while Fiddler was running, I got a huge improvement in upload speed, close to the speed of the web application. When I uploaded while Fiddler wasn't running, the upload was very slow, as before. So there seems to be a bug in the WebClient class. How do I get around this issue?
I can't add comments due to my reputation, so sorry for getting your hopes up in advance, but it would seem that since you have to go through middleware, so to speak, the overall load time is increased. If it's not critical that you go through the web service and you have the right tools, there are many FTP clients and libraries out there that could do this, probably faster than your web server can. If you are required to go through a web server, I don't have much of an answer beyond perhaps trying an external web client that may run slightly faster.
So, to partly answer your question: using a secure FTP library would most likely be faster, and the speed difference is mainly due to the middleware you have to go through before hitting your actual server.
I need low-level control over HTTP file uploads so I can kill an upload early.
Background:
We have an application (an HttpHandler) that serves downloads of media. It can throttle the downloads, and it can also embed some encrypted metadata in the file as it is being served to the client.
Here's what I'm trying to do:
I'm trying to figure out how to do the reverse of what I wrote above. I want to give our users the ability to upload one of these files so that our app can extract the encrypted data, decrypt it, and present it to the user. The problem is that these files can be big (200+ MB). The metadata, though, is at the beginning of the file, and we're able to read it even if the file is just a fragment.
So, does anyone know how I can gain low-level control over the file upload so I can get the first N bytes of the file and then kill the upload? I'm hoping this is similar to the code I wrote for throttling downloads. I'm open to doing this via an HttpHandler, Web API, or any other way that would be easy.
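One possible shape for this, sketched under assumptions: classic ASP.NET can hand you the request body incrementally via HttpRequest.GetBufferlessInputStream, so a handler can read the first N bytes and then drop the connection. The multipart parsing, the decryption step, and the 64 KB figure below are placeholders for your actual logic:

```csharp
using System.IO;
using System.Web;

public class MetadataPeekHandler : IHttpHandler
{
    private const int MetadataBytes = 64 * 1024; // assume the metadata fits here

    public void ProcessRequest(HttpContext context)
    {
        // Bufferless: ASP.NET hands us bytes as they arrive instead of
        // buffering the entire upload first.
        Stream body = context.Request.GetBufferlessInputStream(disableMaxRequestLength: true);

        var buffer = new byte[MetadataBytes];
        int total = 0, read;
        while (total < buffer.Length &&
               (read = body.Read(buffer, total, buffer.Length - total)) > 0)
        {
            total += read;
        }

        // ... extract and decrypt the embedded metadata from buffer here ...

        // Tear the connection down so the client stops sending the rest.
        context.Request.Abort();
    }

    public bool IsReusable { get { return false; } }
}
```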
Is there any easy way to show images in an ASP.NET page from an FTP server?
First, regarding credentials: you really should create an FTP account that only has access to the folder with the images on your FTP server. Do that as soon as possible.
For a better overall solution, either synchronize the images to your web server, or write an HTTP handler that fetches the image server-side and streams the bytes to the client as if the image were on your server. Have a look at System.Net.FtpWebRequest for the second approach.
If you have write access to disk on the web server, you could combine both parts of the solution: when an image is fetched for the first time, write it to disk before sending it to the client, and the next time it's requested, simply redirect the request to the image on disk (or dynamically change the URL of the img tag for that product). This way you build up a cache of the images on your web server over time. Of course, you need to be able to invalidate the cache in case an image is updated.
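A bare-bones sketch of the handler approach (host, credentials, content type, and the query-string contract are all assumptions, and the caching described above is left out):

```csharp
using System.Net;
using System.Web;

public class FtpImageHandler : IHttpHandler
{
    // Fetch the image from the FTP server and stream it to the browser
    // as if it lived on the web server.
    public void ProcessRequest(HttpContext context)
    {
        string name = context.Request.QueryString["name"]; // e.g. product42.jpg

        var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/images/" + name);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("imagesOnlyUser", "password");

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var ftpStream = response.GetResponseStream())
        {
            context.Response.ContentType = "image/jpeg";
            ftpStream.CopyTo(context.Response.OutputStream);
        }
    }

    public bool IsReusable { get { return true; } }
}
```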
<img src="ftp://..."/>
I'm using ASP.NET for file uploads, and I'm hitting a bit of a snag in that data sent by the client as multipart form data is read straight into RAM.
Obviously, this means the maximum file upload size is limited by the available RAM on the machine, and in practice it's much smaller as soon as you have multiple simultaneous uploads.
Is it possible to get ASP.NET to write the data to a temporary file on the hard drive as it is received, rather than reading it into RAM?
ASP.NET saves uploaded data to a temporary file once the request exceeds requestLengthDiskThreshold:
http://msdn.microsoft.com/en-us/library/system.web.configuration.httpruntimesection.requestlengthdiskthreshold.aspx
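For illustration, both settings live on httpRuntime and are specified in KB (the numbers here are examples, not recommendations):

```xml
<system.web>
  <!-- Request data beyond requestLengthDiskThreshold (KB) is buffered
       to a temp file on disk instead of RAM. 2097151 KB is the roughly
       2 GB value commonly used as the practical maxRequestLength. -->
  <httpRuntime maxRequestLength="2097151" requestLengthDiskThreshold="8192" />
</system.web>
```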
Personally I use NeatUpload: http://neatupload.codeplex.com/. It is a custom HttpHandler that writes large uploads to disk; the maximum file size supported is 4 GB.
Additionally, it displays a progress bar while the upload is running, so large uploads don't appear to have hung.