Uploading Large Files (Over 4GB) ASP.NET Core 5 - C#

I've been following the file upload samples MS offers, and various other examples on here, and after many days of trying to get this to work I am stuck.
I need to be able to upload files up to 10GB, and I am using the streaming (physical storage) approach. I have changed the request size limit. I am using IIS, so I have turned off Request Filtering to get a file over 4GB accepted. But with any file over 4GB, the controller action is hit and then errors with "Unexpected end of stream". I have the DisableFormBinding attribute, I have tried enabling buffering, I have tried ignoring the AntiForgeryToken - I am out of ideas.
Is a file over 4GB impossible to upload via streaming, or do I need to fall back to an older chunking method?

If you have files that large, never use byte[] or MemoryStream in your code. Operate only on streams when you download/upload files.
ASP.NET Core supports uploading one or more files using buffered model binding for smaller files and unbuffered streaming for larger files.
File upload scenarios
Two general approaches for uploading files are buffering and streaming.
1 - Buffering
The entire file is read into an IFormFile, which is a C# representation of the file used to process or save the file.
The resources (disk, memory) used by file uploads depend on the number and size of concurrent file uploads. If an app attempts to buffer too many uploads, the site crashes when it runs out of memory or disk space. If the size or frequency of file uploads is exhausting app resources, use streaming.
Any single buffered file exceeding 64 KB is moved from memory to a temp file on disk.
Path.GetTempFileName throws an IOException if more than 65,535 files are created without deleting previous temporary files. The limit of 65,535 files is a per-server limit; see the Windows documentation for more information on this limit.
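For reference, a minimal sketch of the buffered approach, where model binding materializes the whole upload as an IFormFile before the action runs (route and target path are illustrative):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class BufferedUploadController : ControllerBase
{
    // Buffered model binding: suitable for small files only, since the entire
    // file is read (into memory, or a temp file past 64 KB) before this runs.
    [HttpPost("upload-small")]
    public async Task<IActionResult> UploadSmall(IFormFile file)
    {
        var path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        using (var target = System.IO.File.Create(path))
        {
            await file.CopyToAsync(target);
        }
        return Ok(file.Length);
    }
}
```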
2 - Streaming
The file is received from a multipart request and directly processed or saved by the app. Streaming doesn't improve performance significantly. Streaming reduces the demands for memory or disk space when uploading files.
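For the streaming approach, here is a minimal sketch of a controller action that walks the multipart body with MultipartReader and copies each file section straight to disk, so nothing passes through model binding (route, attributes and target path are illustrative; the docs' full sample also applies a filter that disables form value model binding):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

public class StreamingUploadController : ControllerBase
{
    [HttpPost("upload-large")]
    [DisableRequestSizeLimit] // lift Kestrel's ~30 MB default for this action
    public async Task<IActionResult> UploadLarge()
    {
        var boundary = HeaderUtilities.RemoveQuotes(
            MediaTypeHeaderValue.Parse(Request.ContentType).Boundary).Value;
        var reader = new MultipartReader(boundary, Request.Body);

        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            if (ContentDispositionHeaderValue.TryParse(
                    section.ContentDisposition, out var disposition)
                && disposition.IsFileDisposition())
            {
                // Copy the section body to disk as it arrives from the client.
                var path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
                using (var target = System.IO.File.Create(path))
                {
                    await section.Body.CopyToAsync(target);
                }
            }
            section = await reader.ReadNextSectionAsync();
        }
        return Ok();
    }
}
```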
For more details: Upload files in ASP.NET Core 5
I think this might help: Upload Large Files To MVC / WebAPI Using Partitioning

If you host your web application on IIS, note that at the IIS level there is a request filter that does not allow you to upload such large files. You can unlock this filter directly in IIS. If you are using Kestrel, you need to configure it as well. More information can be found here:
https://www.webdavsystem.com/server/documentation/large_files_iis_asp_net/
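Note that IIS's requestFiltering maxAllowedContentLength is a uint, so it caps out just under 4GB; for larger files Request Filtering has to be removed, as the question already describes. Kestrel's own limit can be raised in code. A sketch, assuming a standard .NET 5 generic host with a Startup class:

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) =>
        CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                webBuilder.ConfigureKestrel(options =>
                {
                    // null removes Kestrel's ~30 MB default body size limit
                    // server-wide; [RequestSizeLimit] can still override it
                    // per action.
                    options.Limits.MaxRequestBodySize = null;
                });
                webBuilder.UseStartup<Startup>();
            });
}
```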

Related

How to upload large files to GCP Cloud Storage using .Net client library

I need to upload large files to a Google Cloud Storage bucket from the .NET client library, but I can't find any method for multipart upload.
However, I see there is a way to upload large files using parallel composite uploads, but it looks like that's available only for the command-line tools and the JSON and XML APIs.
Is there any implementation of this in the .NET client, or any other alternative?
You can also use resumable upload methods for the JSON and XML APIs, and for client libraries such as Node.js, C++, Go, Java, PHP, Python and Ruby.
As per the comment, see the upload size specifications:
You can upload and store any MIME type of data up to 5 TiB. If you upload from an in-region service that averages 500 Mbps for its upload speed, the cutoff size for files is almost 2 GB.
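For what it's worth, the Google.Cloud.Storage.V1 client performs a resumable upload under the hood when given a stream; a minimal sketch (bucket, object name and chunk size are illustrative):

```csharp
using System.IO;
using Google.Cloud.Storage.V1;

public static class LargeObjectUpload
{
    public static void Upload()
    {
        // Assumes the Google.Cloud.Storage.V1 NuGet package and
        // application default credentials are configured.
        var client = StorageClient.Create();
        var options = new UploadObjectOptions
        {
            // Send the file in 10 MB chunks (must be a multiple of 256 KB);
            // larger chunks mean fewer round trips on a fast link.
            ChunkSize = 10 * 1024 * 1024
        };
        using var source = File.OpenRead("huge-backup.bin");
        client.UploadObject("my-bucket", "huge-backup.bin",
            "application/octet-stream", source, options);
    }
}
```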

Truly unbuffered file uploads in ASP.NET Core?

The ASP.NET Core documentation on file uploading talks about the option to upload files in an unbuffered way using MultipartReader; however, as I understand it, this only reads each MultipartSection in an unbuffered way - it has to read a whole MultipartSection before your code gets called, with the whole section being buffered in memory (I have confirmed this by logging; my logging only gets called once the HTTP request has finished posting). If you're uploading a large file as one MultipartSection, this isn't terribly helpful.
Is there a way to do truly unbuffered uploading in ASP.NET Core (or at least, have the buffer be something small like 32kb)? As the data comes in from the client, it would be made available to my code to stream out to disk or uploaded somewhere else over a fast network connection?
"it has to read a whole MultipartSection before your code gets called with the whole section being buffered in memory (I have confirmed this by logging, and my logging only gets called once the HTTP request has finished posting). If you're uploading a large file as one MultipartSection this isn't terribly helpful."
The code gets called once the HTTP request has finished posting. This is expected, but it doesn't mean ASP.NET Core buffers the data.
As the documentation mentions, there are two general approaches for uploading files in ASP.NET Core: buffering and streaming. With the buffering approach, the entire file is read into an IFormFile. With streaming, no additional object is created by ASP.NET Core; the content is read directly from HttpContext.Request.Body instead of being materialized into an IFormFile.
The goal of the streaming approach is to reduce the demands for memory or disk space when uploading files, as stated below:
The file is received from a multipart request and directly processed or saved by the app. Streaming doesn't improve performance significantly. Streaming reduces the demands for memory or disk space when uploading files.
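To illustrate the difference, here is a minimal sketch that bypasses model binding entirely and copies HttpContext.Request.Body to disk with a small buffer as the bytes arrive (route and path are illustrative; this assumes the client posts the raw file rather than multipart/form-data):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class RawUploadController : ControllerBase
{
    [HttpPost("raw-upload")]
    [DisableRequestSizeLimit]
    public async Task<IActionResult> RawUpload()
    {
        var path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        // 32 KB file buffer; CopyToAsync moves data as it is received,
        // so memory use stays flat regardless of upload size.
        using (var target = System.IO.File.Create(path, bufferSize: 32 * 1024))
        {
            await Request.Body.CopyToAsync(target, HttpContext.RequestAborted);
        }
        return Ok(path);
    }
}
```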

Uploading a large file (up to 100gb) through ASP.NET application

I need to somehow implement the ability to upload files through an ASP.NET application working within our corporate network. The problem is that those files are getting increasingly big. At the moment we're using a very generic asynchronous upload with a max limit of 3.9GB per file, set through maxAllowedContentLength, since the max value of uint won't allow anything more. Soon the files users are supposed to upload will exceed this value and might reach up to 100GB in size.
I tried looking online for a solution to this problem, but in most articles "large files" means 1GB at best.
So is there any way to upload really large files (up to 100GB) through an ASP.NET MVC\WebAPI application, or do I need to look for alternative solutions?
Yes there is: you need to split the file into smaller parts. See the example here: http://forums.asp.net/t/1742612.aspx?How+to+upload+a+big+file+in+Mvc+
You could consider sending it in chunks. This gets around the large-request limit (as each request would only be the size of the chunk you send), but is slightly more complicated on the client and server side.
I've done something similar for streaming uploaded files over a websocket, but this could easily be done with multiple ajax requests. In either case you'll want to use the JavaScript File API to read a segment of the file on the client's computer, encode that segment to something you can send (probably Base64), and send that particular segment to the web server. You could also send additional data such as file position to ensure the server is writing the file properly. The server can choose how to respond (can be as simple as a "true" to acknowledge receipt), after which the client javascript would read and send the next chunk of the file.
I have a demo of this using WebSockets on a github repo here (ASP.NET MVC server-side code here) but with a few tweaks you could easily make this into sequential AJAX requests.
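To sketch the server side of that idea in ASP.NET Core terms (the route, the DTO and all names are hypothetical): each request carries one Base64-encoded segment plus its byte offset, and the server writes it into place:

```csharp
using System;
using System.IO;
using Microsoft.AspNetCore.Mvc;

// Hypothetical payload: one chunk of the file plus where it belongs.
public class FileChunk
{
    public string FileName { get; set; }
    public long Position { get; set; }
    public string Base64Data { get; set; }
}

public class ChunkUploadController : ControllerBase
{
    [HttpPost("chunk")]
    public IActionResult AppendChunk([FromBody] FileChunk chunk)
    {
        // Path.GetFileName guards against path traversal in the client-supplied name.
        var path = Path.Combine(Path.GetTempPath(), Path.GetFileName(chunk.FileName));
        var bytes = Convert.FromBase64String(chunk.Base64Data);

        using (var file = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
        {
            // Seek to the offset the client reported and write the segment.
            file.Seek(chunk.Position, SeekOrigin.Begin);
            file.Write(bytes, 0, bytes.Length);
        }
        // A simple acknowledgement; the client sends the next chunk on receipt.
        return Ok(true);
    }
}
```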

Throttle or kill large file uploads in asp.net (httphandler, webapi, or mvc controller)

I need low level control over http file uploads so I can kill an upload early.
Background:
We have an application (an HttpHandler) that serves downloads of media. It is able to throttle the downloads, and it's also able to embed some encrypted metadata in the file as it is being served to the client.
Here's what I'm trying to do:
I'm trying to figure out how to do the reverse of what I wrote above. I want to give our users the ability to upload one of these files so that our app can extract the encrypted data, decrypt it, and present it to the user. The problem is that these files can be big (200+ MB). The metadata though, is at the beginning of the file and we're able to read it even if the file is just a fragment.
So, does anyone know how I can gain low level control over the file upload so I can get the first N bytes of the file and then kill the upload? I'm hoping this is similar to the code I wrote for throttling downloads. I'm open to doing this via an httphandler, the webapi, or any other way that would be easy.
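A minimal sketch of one way to do this in classic ASP.NET (the handler name and byte count are illustrative): GetBufferlessInputStream yields the request body as it arrives rather than after ASP.NET buffers it, and HttpRequest.Abort drops the connection once the metadata has been read:

```csharp
using System.IO;
using System.Web;

public class MetadataPeekHandler : IHttpHandler
{
    // Assumed size of the encrypted metadata block at the start of the file.
    private const int HeaderBytes = 64 * 1024;

    public void ProcessRequest(HttpContext context)
    {
        // Read the body as bytes arrive, without waiting for the full upload
        // and without enforcing maxRequestLength.
        Stream input = context.Request.GetBufferlessInputStream(
            disableMaxRequestLength: true);

        var header = new byte[HeaderBytes];
        int read = 0;
        while (read < header.Length)
        {
            int n = input.Read(header, read, header.Length - read);
            if (n == 0) break; // client stopped sending early
            read += n;
        }

        // ... decrypt and process the metadata in header[0..read) here ...

        // Forcibly terminate the TCP connection so the client stops uploading.
        context.Request.Abort();
    }

    public bool IsReusable => true;
}
```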

Multipart File Uploads in ASP.NET, can the data be written to disk rather than read into RAM?

I'm using ASP.NET for file uploads, and I'm hitting a bit of a snag in that data sent by the client as multipart form data is read straight into RAM.
Obviously, this means maximum file upload size is limited to the available RAM on the machine, and even then is much smaller as soon as you have multiple simultaneous uploads.
Is it possible to get ASP.NET to write the data to a temporary file on the hard drive as it is received, rather than reading it into RAM?
ASP.NET saves uploaded data to a temporary file once the amount received exceeds requestLengthDiskThreshold (set in kilobytes on the <httpRuntime> element in web.config):
http://msdn.microsoft.com/en-us/library/system.web.configuration.httpruntimesection.requestlengthdiskthreshold.aspx
Personally I use NeatUpload http://neatupload.codeplex.com/. It is a custom HttpHandler that writes large uploads to disk. The maximum file size supported is 4GB.
Additionally, it displays a progress bar while the upload is in progress, so that large uploads will not appear to hang.
