The ASP.NET Core documentation on file uploading describes an option to upload files in an unbuffered way using MultipartReader. However, as I understand it, this is only unbuffered between MultipartSections: a whole MultipartSection has to be read, and buffered in memory, before my code gets called (I have confirmed this by logging; my logging only runs once the HTTP request has finished posting). If you're uploading a large file as a single MultipartSection, this isn't terribly helpful.
Is there a way to do truly unbuffered uploading in ASP.NET Core (or at least have the buffer be something small, like 32 KB), so that as the data comes in from the client it is made available to my code to stream out to disk or upload somewhere else over a fast network connection?
a whole MultipartSection has to be read, and buffered in memory, before my code gets called (I have confirmed this by logging; my logging only runs once the HTTP request has finished posting). If you're uploading a large file as a single MultipartSection, this isn't terribly helpful.
The code gets called once the HTTP request has finished posting. This is expected, but it doesn't mean ASP.NET Core will buffer the data.
As the documentation mentions, there are two general approaches for uploading files in ASP.NET Core: buffering and streaming. With the buffering approach, the entire file is read into an IFormFile. With streaming, ASP.NET Core creates no additional object: your code reads the content directly from HttpContext.Request.Body, rather than having ASP.NET Core build an IFormFile for you.
The goal of the streaming approach is to reduce the demands for memory or disk space when uploading files, as the documentation states:
The file is received from a multipart request and directly processed or saved by the app. Streaming doesn't improve performance significantly. Streaming reduces the demands for memory or disk space when uploading files.
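To make the streaming path concrete, here is a minimal sketch of reading the multipart body directly with MultipartReader and copying each section to disk in small chunks as it arrives. The DisableFormValueModelBinding attribute is the filter from the docs sample that stops MVC from buffering the form before the action runs; the target path and the 32 KB buffer size are illustrative assumptions.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

public class StreamingUploadController : Controller
{
    [HttpPost("upload")]
    [DisableFormValueModelBinding] // filter from the docs sample: skip form buffering
    public async Task<IActionResult> Upload()
    {
        var boundary = HeaderUtilities.RemoveQuotes(
            MediaTypeHeaderValue.Parse(Request.ContentType).Boundary).Value;
        var reader = new MultipartReader(boundary, Request.Body);

        MultipartSection section;
        while ((section = await reader.ReadNextSectionAsync()) != null)
        {
            // section.Body is a forward-only stream over the raw request body;
            // CopyToAsync pulls it through in 32 KB chunks as the client sends it.
            using var target = System.IO.File.Create("/tmp/upload.bin"); // illustrative path
            await section.Body.CopyToAsync(target, bufferSize: 32 * 1024);
        }
        return Ok();
    }
}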
I've been following the file upload samples MS offer, and various other examples on here, and after many days of trying to get this to work I am stuck.
I need to be able to upload files up to 10GB - I am using the streaming (physical storage) method. I have raised the request size limit. I am using IIS, so I have turned off Request Filtering so that a file over 4GB will be accepted. But with any file over 4GB, the controller is hit and then errors with "Unexpected end of stream". I have the DisableFormBinding attribute, I have tried enabling buffering, I have tried ignoring the AntiForgeryToken - I am out of ideas.
Is a file over 4GB impossible to do via streaming? Do I need to fall back to an older chunking method?
If you have files that large, never use byte[] or MemoryStream in your code. Operate only on streams when you download/upload files.
ASP.NET Core supports uploading one or more files using buffered model binding for smaller files and unbuffered streaming for larger files.
File upload scenarios
Two general approaches for uploading files are buffering and streaming.
1 - Buffering
The entire file is read into an IFormFile, which is a C# representation of the file used to process or save the file.
The resources (disk, memory) used by file uploads depend on the number and size of concurrent file uploads. If an app attempts to buffer too many uploads, the site crashes when it runs out of memory or disk space. If the size or frequency of file uploads is exhausting app resources, use streaming.
Any single buffered file exceeding 64 KB is moved from memory to a temp file on disk.
Path.GetTempFileName throws an IOException if more than 65,535 files are created without deleting previous temporary files. The limit of 65,535 files is a per-server limit; for more information on this limit on Windows, see the Windows documentation.
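To make the buffering approach concrete, here is a minimal sketch where model binding has already read the entire upload into an IFormFile before the action body runs. The route and destination path are illustrative assumptions.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class BufferedUploadController : Controller
{
    // By the time this runs, model binding has buffered the whole upload
    // (in memory, or in a temp file on disk once it exceeds 64 KB).
    [HttpPost("upload-buffered")]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        // Path.GetFileName strips any path segments from the client-supplied name.
        var path = Path.Combine(Path.GetTempPath(), Path.GetFileName(file.FileName));
        using var target = System.IO.File.Create(path);
        await file.CopyToAsync(target);
        return Ok(new { file.FileName, file.Length });
    }
}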
2 - Streaming
The file is received from a multipart request and directly processed or saved by the app. Streaming doesn't improve performance significantly. Streaming reduces the demands for memory or disk space when uploading files.
For more details, see Upload files in ASP.NET Core 5.
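On the "Unexpected end of stream" above 4GB specifically: IIS's requestLimits maxAllowedContentLength setting is a uint, so it caps requests just under 4 GB unless raised or removed in web.config. On the ASP.NET Core side, here is a hedged sketch (assuming .NET 6+ minimal hosting) of lifting the server-wide and per-action body size limits; the route is illustrative.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

// Kestrel's default body limit is ~30 MB; null removes the server-side cap.
builder.WebHost.ConfigureKestrel(o => o.Limits.MaxRequestBodySize = null);

var app = builder.Build();
app.MapControllers();
app.Run();

// Per-action alternative: lift the limit only where huge uploads land.
public class BigUploadController : Controller
{
    [HttpPost("upload-big")]
    [DisableRequestSizeLimit] // or [RequestSizeLimit(10_000_000_000)] for an explicit cap
    public IActionResult Upload() => Ok();
}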
I think this might help: Upload Large Files To MVC / WebAPI Using Partitioning
If you host your web application on IIS, note that at the IIS level there is a request filter that does not allow you to upload such large files. You can adjust this filter directly in IIS. If you are using Kestrel, you also need to configure it. More information can be found here:
https://www.webdavsystem.com/server/documentation/large_files_iis_asp_net/
I need to implement the ability to upload files through an ASP.NET application running on our corporate network. The problem is that those files are getting increasingly big. At the moment we're using a very generic asynchronous upload with a limit of 3.9 GB per file, set through maxAllowedContentLength, since the max value of uint won't allow anything more. Soon the files users are supposed to upload will exceed this value and might reach up to 100 GB in size.
I tried looking online for a solution to this problem, but in most articles "large files" means 1 GB at best.
So is there any way to upload really large files (up to 100 GB) through an ASP.NET MVC\WebAPI application, or do I need to look for alternative solutions?
Yes, there is: you need to split the file into smaller parts. See the example here: http://forums.asp.net/t/1742612.aspx?How+to+upload+a+big+file+in+Mvc+
You could consider sending it in chunks. This sidesteps the large-request limit (each request is only the size of the chunk you send), but is slightly more complicated on the client and server side.
I've done something similar for streaming uploaded files over a websocket, but this could easily be done with multiple AJAX requests. In either case you'll want to use the JavaScript File API to read a segment of the file on the client's computer, encode that segment to something you can send (probably Base64), and send that particular segment to the web server. You could also send additional data, such as the file position, to ensure the server is writing the file properly. The server can choose how to respond (it can be as simple as a "true" to acknowledge receipt), after which the client-side JavaScript reads and sends the next chunk of the file; a server-side sketch is shown below.
I have a demo of this using WebSockets on a github repo here (ASP.NET MVC server-side code here) but with a few tweaks you could easily make this into sequential AJAX requests.
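For reference, here is a hedged sketch of what the server side of the chunked approach described above might look like, written as an ASP.NET Core action. The route, parameter names, and target path are illustrative assumptions, not code from the linked repo.

using System;
using System.IO;
using Microsoft.AspNetCore.Mvc;

public class ChunkUploadController : Controller
{
    // Each request delivers one Base64-encoded segment plus its byte offset;
    // writing at the offset makes the endpoint order-tolerant and resumable.
    [HttpPost("upload-chunk")]
    public IActionResult UploadChunk(string fileName, long position, string base64Chunk)
    {
        byte[] chunk = Convert.FromBase64String(base64Chunk);
        var path = Path.Combine(Path.GetTempPath(), Path.GetFileName(fileName));

        using (var fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
        {
            fs.Seek(position, SeekOrigin.Begin); // write the chunk at its offset
            fs.Write(chunk, 0, chunk.Length);
        }

        // Simple acknowledgement; the client reads this and sends the next chunk.
        return Json(true);
    }
}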
I have a ServiceStack client that calls a service which returns a large data file of between 100 MB and 10 GB. Currently this client works perfectly over the LAN, using the Stream.CopyTo method to save to file.
Now that the client is being run over a WAN (in addition to over a LAN), the ability to resume a download if it is stopped or loses its connection, as well as to see its progress, has become important.
The service supports sending partial files, but I am unclear on how to support this in the ServiceStack client, or even whether it is built in and I just haven't realised it.
Also, the stream's CopyTo method does not report progress, so a large file can take hours with no way to tell how far along it is. I have seen posts about a CopyFrom method that reports progress, but I can't find it anywhere on the classes I am using.
My environment is the latest version of ServiceStack and latest version of Mono on Mac OS X and .NET 4.5 on Windows.
Being able to do concurrent downloads of different segments of a file would be ideal, but just being able to resume downloads would be a huge help.
Can someone point out the best way to do this in the ServiceStack client? Most examples I found are for the plain web client and are very old, so I wanted to see what the best way is with the later versions of .NET and Mono, as well as with ServiceStack.
If possible a basic example would be perfect.
ServiceStack supports Partial Content Responses for services that return:
A Physical File
return new HttpResult(new FileInfo(filePath), request.MimeType);
A Memory Stream
return new HttpResult(ms, "audio/mpeg");
Raw Text
return new HttpResult(customText, "text/plain");
Static files served through ServiceStack
Partial Content is also available for static file downloads served directly through ServiceStack, which lets you stream mp3 downloads or, should you ever want to, your static .html, .css, .js, etc. files.
Http Utils
See PartialContentResultTests.cs for examples of how to request partial downloads using ServiceStack's built-in HTTP Utils.
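As for resuming, the client simply has to ask for the missing byte range. Below is a hedged sketch using plain HttpClient against a partial-content-capable service like the ones above; resuming is not a built-in ServiceStack client feature, and the URL, file path, and progress reporting are illustrative assumptions.

using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ResumeDownload
{
    static async Task Main()
    {
        const string url = "http://example.org/files/big.dat"; // illustrative
        const string path = "big.dat";

        // Resume from however many bytes are already on disk.
        long existing = File.Exists(path) ? new FileInfo(path).Length : 0;

        using var http = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Get, url);
        request.Headers.Range = new RangeHeaderValue(existing, null);

        using var response = await http.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
        if (existing > 0 && response.StatusCode != HttpStatusCode.PartialContent)
            throw new InvalidOperationException("Server ignored the Range request.");

        using var body = await response.Content.ReadAsStreamAsync();
        using var file = new FileStream(path, FileMode.Append, FileAccess.Write);

        // Copy in small chunks so progress can be reported as bytes arrive,
        // which CopyTo alone doesn't give you.
        var buffer = new byte[81920];
        int read;
        long total = existing;
        while ((read = await body.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            await file.WriteAsync(buffer, 0, read);
            total += read;
            Console.WriteLine($"Downloaded {total} bytes"); // progress hook
        }
    }
}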
I need low level control over http file uploads so I can kill an upload early.
Background:
We have an application (an HttpHandler) that serves downloads of media. It is able to throttle the downloads, and it can also embed some encrypted metadata in the file as it is being served to the client.
Here's what I'm trying to do:
I'm trying to figure out how to do the reverse of what I wrote above. I want to give our users the ability to upload one of these files so that our app can extract the encrypted data, decrypt it, and present it to the user. The problem is that these files can be big (200+ MB). The metadata, though, is at the beginning of the file, and we're able to read it even if the file is just a fragment.
So, does anyone know how I can gain low level control over the file upload so I can get the first N bytes of the file and then kill the upload? I'm hoping this is similar to the code I wrote for throttling downloads. I'm open to doing this via an HttpHandler, Web API, or any other way that would be easy.
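In classic ASP.NET, one way to get that low-level control is HttpRequest.GetBufferlessInputStream, which hands you the request body as it arrives instead of after it has been buffered. Here is a hedged sketch; the 64 KB metadata size is an illustrative assumption.

using System.IO;
using System.Web;

public class MetadataHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        const int metadataLength = 64 * 1024; // assumed size of the header block
        var head = new byte[metadataLength];

        // Unbuffered, forward-only view of the request body as it arrives.
        using (Stream body = context.Request.GetBufferlessInputStream())
        {
            int offset = 0, read;
            while (offset < metadataLength &&
                   (read = body.Read(head, offset, metadataLength - offset)) > 0)
            {
                offset += read;
            }
        }

        // ... extract and decrypt the metadata from 'head' here ...

        // Tears down the connection so the client stops sending the rest;
        // the decrypted metadata would need to be returned on a separate request.
        context.Request.Abort();
    }
}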
I'm using ASP.NET for file uploads, and I'm hitting a bit of a snag: data sent by the client as multipart form data is read straight into RAM.
Obviously, this means maximum file upload size is limited to the available RAM on the machine, and even then is much smaller as soon as you have multiple simultaneous uploads.
Is it possible to get ASP.NET to write the data to a temporary file on the hard drive as it is received, rather than reading it all into RAM?
ASP.NET saves uploaded data to a temporary file once the request size reaches requestLengthDiskThreshold:
http://msdn.microsoft.com/en-us/library/system.web.configuration.httpruntimesection.requestlengthdiskthreshold.aspx
Personally I use NeatUpload (http://neatupload.codeplex.com/). It is a custom HttpHandler that writes large uploads to disk. The maximum file size supported is 4 GB.
Additionally, it displays a progress bar while the upload is in progress, so that large uploads don't appear to have hung.