I am trying to send a file to a WebAPI controller that does some processing with the file on a server. Everything seemed to work well until I tried files larger than 2 MB; files above that size throw an odd exception.
Here is the snippet:
var progress = new ProgressMessageHandler();
progress.HttpSendProgress += ProgressEventHandler;
HttpClient client = HttpClientFactory.Create(progress);
client.Timeout = TimeSpan.FromMinutes(20);
try
{
    // (this block runs inside a loop over files, hence the `continue` below)
    using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read,
                                           FileShare.Read, 1024, useAsync: true))
    {
        var content = new StreamContent(fileStream, 1024);
        var address = new Uri(string.Format(
            "{0}api/File/Upload?submittalId={1}&fileName={2}&documentTypeId={3}",
            FileServiceUri, tabTag.submittalId, Path.GetFileName(file), documentTypeId));
        client.MaxResponseContentBufferSize = 2147483647;
        var response = await client.PostAsync(address, content);
        var result = response.Content.ReadAsAsync<object>();
        if (!response.IsSuccessStatusCode)
            continue;
    }
    // (snippet truncated here)
The exception is thrown on the line:
var response = await client.PostAsync(address, content);
and is:
No MediaTypeFormatter is available to read an object of type 'Object' from content with media type 'text/html'
It's not even hitting the breakpoint at the beginning of my service controller, so I didn't include that code (although I can if that's potentially an issue). As I said above, this ONLY happens with files > 2 MB; small files work just fine (thank god, so I have something to show for a demo ^^).
Any help with this would be greatly appreciated.
Cory's observation is right that Web API doesn't have an in-built formatter to serialize or deserialize text/html content. My guess is that you are most probably getting an error response in HTML. If that's indeed the case, you can do the following.
When uploading files to an IIS-hosted Web API application, you need to take care of the following settings.
You need to look for the following two settings in Web.config to increase the allowed upload size.
NOTE (maxRequestLength is in kilobytes):
<system.web>
  <httpRuntime targetFramework="4.5" maxQueryStringLength="" maxRequestLength="" maxUrlLength="" />
</system.web>
NOTE (maxAllowedContentLength is in bytes):
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="" maxQueryString="" maxUrl="" />
    </requestFiltering>
  </security>
</system.webServer>
Also note that the default buffer policy of Web API in IIS-hosted scenarios is buffered, so if you are uploading huge files your request will consume a lot of memory. To prevent that, you can change the policy like the following:
config.Services.Replace(typeof(IHostBufferPolicySelector), new CustomBufferPolicySelector());
//---------------
public class CustomBufferPolicySelector : WebHostBufferPolicySelector
{
    // Returning false makes Web API stream the request body
    // instead of buffering it fully in memory
    public override bool UseBufferedInputStream(object hostContext)
    {
        return false;
    }
}
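For context, here is a minimal sketch of where that replacement typically lives, assuming the standard WebApiConfig.Register entry point (the route setup shown is the usual template, not something from the question):
using System.Web.Http;
using System.Web.Http.Hosting;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Swap the buffer policy before the routes are wired up
        config.Services.Replace(typeof(IHostBufferPolicySelector),
                                new CustomBufferPolicySelector());

        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional });
    }
}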
The response is coming back with a text/html Content-Type, and ReadAsAsync<object>() doesn't know how to deserialize text/html into an object.
Most likely, your web app is configured to accept files only up to a certain size and is returning an error with a friendly HTML message. You should check the response code before trying to deserialize the content:
var response = await client.PostAsync(address, content);
response.EnsureSuccessStatusCode();
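A slightly fuller sketch of the same defensive pattern (hedged; the variable names mirror the question's code): read the body as a plain string on failure, so an HTML error page never reaches the media-type formatter.
var response = await client.PostAsync(address, content);
if (!response.IsSuccessStatusCode)
{
    // An HTML error page reads fine as a string
    var error = await response.Content.ReadAsStringAsync();
    throw new HttpRequestException(string.Format(
        "Upload failed ({0}): {1}", (int)response.StatusCode, error));
}
var result = await response.Content.ReadAsAsync<object>();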
Related
Hello, something strange is happening to me with ASP.NET. The app is configured to receive up to 51 MB, and it works fine when receiving file requests. However, when I try to send the same file through an external REST API I get the error "Request Entity Too Large". The strange thing is that when I send the same file via Postman, the destination server accepts the document and responds fine; I only have the problem when sending the file from my application to the external API.
maxRequestLength:
<httpRuntime maxRequestLength="51200" targetFramework="4.7.2" enableVersionHeader="false" />
maxAllowedContentLength:
<security>
<requestFiltering removeServerHeader="true" >
<requestLimits maxAllowedContentLength="51200" />
</requestFiltering>
</security>
C# code:
public async Task Send(ApiRequest request)
{
    using (var client = new HttpClient())
    {
        string url = GetUrl();
        client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
        var req = new HttpRequestMessage(HttpMethod.Post, url);
        var stream = new MemoryStream(Encoding.UTF8.GetBytes(request.base64doc));
        request.base64doc = null;
        StringContent payloadContent = new StringContent(JsonConvert.SerializeObject(request), Encoding.UTF8, "application/json");
        var content = new MultipartFormDataContent()
        {
            { new StreamContent(stream), "documentName", request.filename },
            { payloadContent, "data" }
        };
        req.Content = content;
        using (HttpResponseMessage response = await client.SendAsync(req))
        {
            // gets "request entity too large"
        }
    }
}
My question is why Postman lets me send the document while my application does not.
Do I have to find out what the maximum size allowed by the external API is?
I had a similar issue: I was able to send requests using Postman but not from my application.
The fix was to update the limits on the server side.
Probably the request size was bigger when it was sent from the application, or nginx was treating the requests differently. Unfortunately, I don't have the details of the server's nginx configuration because it is managed by a separate company.
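For reference, if the server side is indeed nginx, the limit involved is usually the client_max_body_size directive (a hedged example, since the actual configuration was not available; nginx's default is 1 MB, and it answers 413 Request Entity Too Large above it):
# nginx: raise the request-body limit (goes in an http, server, or location block)
client_max_body_size 100m;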
I am trying to upload up to 2 GB of data using an HttpClient.
The data is sent through the request body into an .aspx page where it is read (horrible, I know, but I cannot change this).
The data is placed in a MultipartFormDataContent and posted like this:
var apiRequest = new MultipartFormDataContent();
apiRequest.Add(new StreamContent(file), fileName, fileName);
apiRequest.Headers.ContentDisposition =
    new ContentDispositionHeaderValue("form-data") { Name = fileName, FileName = fileName };

HttpClient client = new HttpClient();
client.Timeout = TimeSpan.FromMinutes(30);
HttpResponseMessage response = null;
try
{
    response = client.PostAsync(apiEndPoint, apiRequest).Result;
    response.EnsureSuccessStatusCode();
    string responseBody = response.Content.ReadAsStringAsync().Result;
}
catch (HttpRequestException e)
{
    log.LogError($"logging here");
}
Things I have tried:
- HttpClient HTTP version 1.0 instead of the default
- HttpClient MaxRequestContentBufferSize
- web.config maxAllowedContentLength
- web.config AspMaxRequestEntityAllowed
- web.config maxRequestLength
Currently, the files get added to the HttpClient correctly, but I cannot get them to post to the web app. I got up to 900 MB through, but anything over that simply redirects to the main page and I get HTML from the web app in the response body.
Thanks in advance!
After a lot of hair pulling, we found the solution by looking at the IIS logs.
The logs revealed that the Microsoft URLScan tool was blocking the requests.
When the request body was not approved by the scan, IIS would redirect straight to the main page with no error.
You have to configure a max request length in the urlscan.ini file.
More info here: https://ajaxuploader.com/large-file-upload-iis-debug.htm
The file is located at C:\WINDOWS\system32\inetsrv\urlscan
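For illustration, the relevant setting looks roughly like this (a hedged sketch; verify the option names against your installed UrlScan version's documentation):
; urlscan.ini
[RequestLimits]
; maximum request body size in bytes (illustrative value: ~2 GB)
MaxAllowedContentLength=2147483647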
I have a file upload function that takes a multipart form data request and puts the file in AWS. It's working fine for small files but fails when the file gets very large. (I haven't tested exactly how big, but we want to be able to handle any size.)
Here's the LINQPad script I'm using to test the upload:
const string filePath = @"C:\Users\josh.bowdish\Pictures\SmallFile.png";
const string contentType = "image/png";
//const string filePath = @"C:\Users\josh.bowdish\Pictures\ReallyBigFile.mp4";
//const string contentType = "video/mp4";
using (var httpClient = new HttpClient())
{
MultipartFormDataContent form = new MultipartFormDataContent();
var bytes = File.ReadAllBytes(filePath);
form.Add(new ByteArrayContent(bytes)
{
Headers =
{
ContentLength = bytes.Length,
ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue(contentType)
}
},"notused","not.used");
using (var response = await httpClient.PostAsync("http://localhost:52655/api/Storage/UploadAttachment", form))
{
response.EnsureSuccessStatusCode();
response.Content.Dump();
}
}
The SmallFile.png uploads just fine, but ReallyBigFile.mp4 doesn't even hit my local service. It does give me a 404 error though, which doesn't make sense to me because the service endpoint doesn't change.
I'd post the receiving method's code (api/Storage/UploadAttachment), but calling it with the large file doesn't even hit the first breakpoint just inside the method.
My googling efforts haven't turned up much of use yet. Any guidance would be much appreciated! Please let me know if there's anything else I can provide to help figure it out!
Thanks,
~Josh
If you're getting a 404 on larger files but not on smaller ones, make sure your web.config settings allow large files:
<!-- maxRequestLength is in kilobytes -->
<system.web>
  <httpRuntime maxRequestLength="1000000" />
</system.web>

<!-- maxAllowedContentLength is in bytes -->
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="1000000000" />
    </requestFiltering>
  </security>
</system.webServer>
When I use Postman to upload a large file to my server (written in .NET Core 2.2), Postman immediately shows the error HTTP Error 404.13 - Not Found: The request filtering module is configured to deny a request that exceeds the request content length.
But when I use my code to upload that large file, it gets stuck at the line to send the file.
My client code:
public async void TestUpload() {
StreamContent streamContent = new StreamContent(File.OpenRead("D:/Desktop/large.zip"));
streamContent.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"large.zip\"");
MultipartFormDataContent multipartFormDataContent = new MultipartFormDataContent();
multipartFormDataContent.Add(streamContent);
HttpClient httpClient = new HttpClient();
Uri uri = new Uri("https://localhost:44334/api/user/testupload");
try {
HttpResponseMessage httpResponseMessage = await httpClient.PostAsync(uri, multipartFormDataContent);
bool success = httpResponseMessage.IsSuccessStatusCode;
}
catch (Exception ex) {
}
}
My server code:
[HttpPost, Route("testupload")]
public async Task UploadFile(IFormFileCollection formFileCollection) {
IFormFileCollection formFiles = Request.Form.Files;
foreach (var item in formFiles) {
using (var stream = new FileStream(Path.Combine("D:/Desktop/a", item.FileName), FileMode.Create)) {
await item.CopyToAsync(stream);
}
}
}
My client code gets stuck at the line HttpResponseMessage httpResponseMessage = await httpClient.PostAsync(uri, multipartFormDataContent), while the server doesn't receive any request (I use a breakpoint to ensure that).
It gets stuck longer if the file is bigger. Looking at Task Manager, I can see my client program using a lot of CPU and disk as it actually uploads the file to the server. After a while, the code moves on to the next line, which is
bool success = httpResponseMessage.IsSuccessStatusCode
Then, by reading the response content, I get exactly the same result as in Postman.
Now I want to know how to get the error immediately so I can notify the user in time; I don't want to wait that long.
Note that when I use Postman to upload large files, my server doesn't receive any request either. I think I am missing something; maybe there is a problem with my client code.
EDIT: Actually, I think it is a client-side error, but even if it is a server-side error it still wouldn't change much for me. Let me clarify my intent: I want to create a little helper class that I can use across projects, and maybe share with my friends too. So, like Postman, it should be able to detect the error as soon as possible. If Postman can do it, I can too.
EDIT 2: It's weird that today I found out Postman does NOT know beforehand whether the server accepts big requests; I uploaded a big file and saw that it actually sent the whole file to the server before it got the response. I don't know why I ever thought Postman knew the error ahead of time. But it does mean I have found a way to do the job even better than Postman, so this question might be useful for someone.
Your issue has nothing to do with your server-side C# code. Your request gets stuck because of what happens between the client and the server (by "server" I mean IIS, Apache, nginx..., not your server-side code).
In HTTP, most clients don't read the response until they have sent all the request data. So even if your server discovers that the request is too large and returns an error response, the client will not read that response until the server has accepted the whole request.
On the server side you can check this question, but I think it is more convenient to handle this on the client side, by checking the file size before sending it to the server (this is basically what Postman is doing in your case).
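A minimal sketch of that client-side check (hedged; MaxUploadBytes is an assumed, application-defined constant that mirrors whatever limit the server is configured with):
// Hypothetical limit mirroring the server configuration, e.g. 50 MB
const long MaxUploadBytes = 50L * 1024 * 1024;

static bool CanUpload(string filePath)
{
    var info = new System.IO.FileInfo(filePath);
    // Reject locally before streaming a single byte over the wire
    return info.Length <= MaxUploadBytes;
}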
Now I am able to do what I wanted. But first I want to thank @Marko Papic; your information helped me think of a way to do what I want.
What I am doing is:
First, create an empty ByteArrayContent request with the ContentLength of the file I want to upload to the server.
Second, surround HttpResponseMessage = await HttpClient.SendAsync(HttpRequestMessage) in a try-catch block. The catch block catches HttpRequestException: because I am sending a request that declares the file's length while the actual content length is 0, the send fails with the message Cannot close stream until all bytes are written.
If the code reaches the catch block, it means the server ALLOWS requests of the file's size or bigger. If there is no exception and the code moves on to the next line, then if HttpResponseMessage.StatusCode is 404 the server DENIES requests bigger than the file size. The case where HttpResponseMessage.StatusCode is NOT 404 should never happen (I'm not sure about this one, though).
My final code up to this point:
private async Task<bool> IsBigRequestAllowed()
{
    FileStream fileStream = File.Open("D:/Desktop/big.zip", FileMode.Open, FileAccess.Read, FileShare.Read);
    if (fileStream.Length == 0)
    {
        fileStream.Close();
        return true;
    }

    // HttpClient, HttpRequestMessage, HttpMethod and HttpResponseMessage
    // are class-level fields named after their types
    HttpRequestMessage = new HttpRequestMessage();
    HttpMethod = HttpMethod.Post;
    HttpRequestMessage.Method = HttpMethod;
    HttpRequestMessage.RequestUri = new Uri("https://localhost:55555/api/user/testupload");
    HttpRequestMessage.Content = new ByteArrayContent(new byte[] { });
    HttpRequestMessage.Content.Headers.ContentLength = fileStream.Length;
    fileStream.Close();

    try
    {
        HttpResponseMessage = await HttpClient.SendAsync(HttpRequestMessage);
        if (HttpResponseMessage.StatusCode == HttpStatusCode.NotFound)
        {
            return false;
        }
        return true; // The code should never reach this line, though
    }
    catch (HttpRequestException)
    {
        return true;
    }
}
NOTE: My approach still has a problem, and it is the ContentLength property: it shouldn't be exactly the length of the file, it should be bigger. For example, if my file is exactly 1000 bytes long and is uploaded successfully, the request the server receives has a greater ContentLength value, because HttpClient doesn't send only the content of the file: it also sends the boundaries, content types, hyphens, line breaks, etc. Generally speaking, you would need to find out beforehand the exact number of bytes HttpClient will send along with your file to make this approach work perfectly (I don't know how yet; I'm running out of time. I will find out and update my answer later).
Now I am able to determine ahead of time whether the server can accept requests as big as the file my user wants to upload.
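One hedged idea for closing that gap (my addition, not verified against the answer above): build the actual MultipartFormDataContent first and read its computed ContentLength, which already accounts for the boundaries and per-part headers, then use that figure for the probe request.
// A sketch, assuming the real multipart body can be constructed up front.
// When every part has a known length, MultipartFormDataContent can compute
// the total on-the-wire length, boundaries and part headers included.
using (var fs = File.Open("D:/Desktop/big.zip", FileMode.Open, FileAccess.Read, FileShare.Read))
{
    var multipart = new MultipartFormDataContent();
    multipart.Add(new StreamContent(fs), "file", "big.zip");
    long? exactBytes = multipart.Headers.ContentLength; // includes the multipart overhead
}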
I have a WCF service that I want to make a POST request to with some parameters, and it will return me a file. The service is fine; I tested it using curl. The file is about 20 MB. I know that BackgroundDownloader is made for such large files, but it does not support POST requests.
My code is as follows:
var requestBody = "my parameters ...";
var handler = new HttpClientHandler { UseDefaultCredentials = true, AllowAutoRedirect = false };
var client = new HttpClient(handler);
HttpContent httpContent = new StringContent(requestBody, Encoding.UTF8, "application/json");
httpContent.Headers.ContentType = new MediaTypeHeaderValue("application/x-www-form-urlencoded");
HttpResponseMessage response = await client.PostAsync("the url...", httpContent);
response.EnsureSuccessStatusCode();
var stream = await response.Content.ReadAsStreamAsync();
// some code to store the stream to a file
The problem is that the code never gets to the ReadAsStreamAsync part; it always fails with an "A task was canceled" exception.
I use similar code to download strings from that service (just using ReadAsStringAsync instead of ReadAsStreamAsync) and it works fine.
What is the problem? Or what is the proper way to do this?
You should reconsider the use of BackgroundDownloader. A background download will continue even if your app suspends, and in a WinRT application you can expect suspension to happen all the time. Forcing the user to keep your app running while downloading is not a good idea.
You can configure a WCF service to accept GET requests by setting the serviceMetadata element in web.config, or via the WebGet or WebInvoke attributes in code. Check Download File using WCF Rest Service for an example that uses the WebGet attribute and returns a Stream object.
Concerning your original question: you should create a proper WCF proxy, as described in Accessing WCF Services with a Windows Store Client App. Web service calls require a lot of header and body setup beyond setting the media type, and the proxy does all of that for you.
Just don't use it for downloading files.
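For illustration, a hedged sketch of what a GET-based WCF download contract can look like (the service and operation names here are made up; the linked article shows the full pattern):
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IFileService
{
    // A GET endpoint like this is usable from BackgroundDownloader
    [OperationContract]
    [WebGet(UriTemplate = "download/{fileName}")]
    Stream Download(string fileName);
}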
It looks like the problem was the file size. I solved it by setting a high Timeout and MaxResponseContentBufferSize on the HttpClient.
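For completeness, a minimal sketch of that fix (the values here are illustrative, not the ones actually used):
// Hedged example: raise the timeout and the response buffer ceiling
var handler = new HttpClientHandler { UseDefaultCredentials = true, AllowAutoRedirect = false };
var client = new HttpClient(handler)
{
    Timeout = TimeSpan.FromMinutes(10),
    MaxResponseContentBufferSize = 64 * 1024 * 1024 // 64 MB, well above the ~20 MB file
};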