I am downloading an .mp4 file from Azure Blob Storage and pushing it to the UI. The download works fine; the issue is that the headers don't appear to be set correctly for Content-Length. As a result you cannot track the download progress, because the browser only shows how much has been downloaded, not how much is left or the estimated time remaining. Is my code for the response wrong, or should I change my request? My code is as follows:
[HttpGet("VideoFileDownload")]
public IActionResult VideoFileDownloadAsync([FromQuery]int VideoId)
{
...code to get blob file
return new FileStreamResult(blob.OpenRead(), new MediaTypeHeaderValue("application/octet-stream"));
}
I have played around with various request and response headers but it makes no difference.
The files are big, and I know the old ASP.NET way of checking for Range headers and then doing a chunked stream, but I want to use the new features in .NET Core, which don't work as expected, or maybe I just don't understand them thoroughly. Can somebody give me a working sample of a file download with ASP.NET Core code?
If you have the file size, you can set the response's content length in the Response object just before returning the FileStreamResult, like this:
[HttpGet("VideoFileDownload")]
public IActionResult VideoFileDownloadAsync([FromQuery]int VideoId)
{
...code to get blob file
long myFileSize = blob.Length; // Or wherever it is you can get your file size from.
this.Response.ContentLength = myFileSize;
return new FileStreamResult(blob.OpenRead(), new MediaTypeHeaderValue("application/octet-stream"));
}
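As an alternative sketch (not from the answer above, and assuming the blob stream is seekable so its length is known): ASP.NET Core 2.1+ can set Content-Length and honor Range requests for you via the File helper's enableRangeProcessing flag:

```csharp
[HttpGet("VideoFileDownload")]
public IActionResult VideoFileDownload([FromQuery] int videoId)
{
    // ...code to get blob file...
    var stream = blob.OpenRead(); // assumed seekable, so Length is available

    // File() sets Content-Length from the stream and, with
    // enableRangeProcessing, answers Range requests with 206 Partial Content,
    // which lets browsers show progress and resume downloads.
    return File(stream, "application/octet-stream", enableRangeProcessing: true);
}
```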
When I use Postman to try uploading a large file to my server (written in .NET Core 2.2), Postman immediately shows the HTTP Error 404.13 - Not Found error: The request filtering module is configured to deny a request that exceeds the request content length
But when I use my code to upload that large file, it gets stuck at the line to send the file.
My client code:
public async Task TestUpload() {
StreamContent streamContent = new StreamContent(File.OpenRead("D:/Desktop/large.zip"));
streamContent.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"large.zip\"");
MultipartFormDataContent multipartFormDataContent = new MultipartFormDataContent();
multipartFormDataContent.Add(streamContent);
HttpClient httpClient = new HttpClient();
Uri uri = new Uri("https://localhost:44334/api/user/testupload");
try {
HttpResponseMessage httpResponseMessage = await httpClient.PostAsync(uri, multipartFormDataContent);
bool success = httpResponseMessage.IsSuccessStatusCode;
}
catch (Exception ex) {
}
}
My server code:
[HttpPost, Route("testupload")]
public async Task UploadFile(IFormFileCollection formFileCollection) {
IFormFileCollection formFiles = Request.Form.Files;
foreach (var item in formFiles) {
using (var stream = new FileStream(Path.Combine("D:/Desktop/a", item.FileName), FileMode.Create)) {
await item.CopyToAsync(stream);
}
}
}
My client code gets stuck at the line HttpResponseMessage httpResponseMessage = await httpClient.PostAsync(uri, multipartFormDataContent), while the server doesn't receive any request (I used a breakpoint to confirm that).
It gets stuck longer if the file is bigger. Looking at Task Manager, I can see my client program using high CPU and disk, as it is actually uploading the file to the server. After a while, the code moves on to the next line, which is
bool success = httpResponseMessage.IsSuccessStatusCode
Then by reading the response content, I get exactly the result as of Postman.
Now I want to know how to get the error immediately, so I can notify the user in time instead of waiting a long while.
Note that when I use Postman to upload large files, my server doesn't receive any request either. I think I am missing something; maybe there is a problem with my client code.
EDIT: Actually, I think it is a client-side error. But even if it were a server-side error, that wouldn't change much for me. Let me clarify my intent: I want to create a little helper class that I can use across projects, and maybe share with my friends too. So, like Postman, it should be able to detect the error as soon as possible. If Postman can do it, so can I.
EDIT 2: It's weird, but today I found out that Postman does NOT know beforehand whether the server accepts big requests. I uploaded a big file and saw that Postman actually sent the whole file to the server before it got the response. I don't know why I thought Postman knew about the error ahead of time. But it does mean I have found a way to do the job even better than Postman, so this question might be useful for someone.
Your issue has nothing to do with your server-side C# code. Your request gets stuck because of what is happening between the client and the server (by "server" I mean IIS, Apache, Nginx..., not your server-side code).
In HTTP, most clients don't read the response until they have sent all of the request data. So even if your server discovers that the request is too large and returns an error response, the client will not read that response until the server has accepted the whole request.
When it comes to server-side, you can check this question, but I think it would be more convenient to handle it on the client side, by checking the file size before sending it to the server (this is basically what Postman is doing in your case).
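A minimal sketch of that client-side check (the 30 MB default used here is IIS's stock maxAllowedContentLength, an assumption on my part; in practice, read the real limit from your server's configuration):

```csharp
using System;
using System.IO;

// Compare the file's size against the server's request size limit before
// sending anything, instead of discovering the 404.13 after a full upload.
static bool IsWithinUploadLimit(string path, long maxAllowedContentLength = 30_000_000)
{
    // IIS's request filtering module rejects any request whose Content-Length
    // exceeds maxAllowedContentLength with HTTP 404.13.
    return new FileInfo(path).Length <= maxAllowedContentLength;
}
```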
Now I am able to do what I wanted. But first I want to thank you, @Marko Papic; your information did help me work out a way to do what I want.
What I am doing is:
First, create an empty ByteArrayContent request, with the ContentLength of the file I want to upload to the server.
Second, wrap HttpResponseMessage = await HttpClient.SendAsync(HttpRequestMessage) in a try-catch block. The catch block catches HttpRequestException: because I am sending a request that declares the length of the file while my actual content length is 0, an HttpRequestException is thrown with the message Cannot close stream until all bytes are written.
If the code reaches the catch block, it means the server ALLOWS requests with the file size or bigger. If there is no exception and the code moves on to the next line, then if HttpResponseMessage.StatusCode is 404, it means the server DENIES requests bigger than the file size. The case when HttpResponseMessage.StatusCode is NOT 404 will never happen (I'm not sure about this one though).
My final code up to this point:
private async Task<bool> IsBigRequestAllowed() {
FileStream fileStream = File.Open("D:/Desktop/big.zip", FileMode.Open, FileAccess.Read, FileShare.Read);
if(fileStream.Length == 0) {
fileStream.Close();
return true;
}
HttpRequestMessage = new HttpRequestMessage();
HttpMethod = HttpMethod.Post;
HttpRequestMessage.Method = HttpMethod;
HttpRequestMessage.RequestUri = new Uri("https://localhost:55555/api/user/testupload");
HttpRequestMessage.Content = new ByteArrayContent(new byte[] { });
HttpRequestMessage.Content.Headers.ContentLength = fileStream.Length;
fileStream.Close();
try {
HttpResponseMessage = await HttpClient.SendAsync(HttpRequestMessage);
if (HttpResponseMessage.StatusCode == HttpStatusCode.NotFound) {
return false;
}
return true; // The code will never reach this line though
}
catch(HttpRequestException) {
return true;
}
}
NOTE: My approach still has a problem. The issue is the ContentLength property: it shouldn't be exactly the length of the file, it should be bigger. For example, if my file is exactly 1000 bytes long and is successfully uploaded, the request the server receives has a greater ContentLength value, because HttpClient doesn't send only the content of the file: it also has to send the boundaries, content types, hyphens, line breaks, and so on. Generally speaking, you need to find out beforehand exactly how many bytes HttpClient will send along with your file to make this approach work perfectly (I haven't worked that out yet; I'm running out of time and will update my answer later).
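One way to close that gap (a sketch of my own, not something verified in the answer above): HttpContent subclasses compute their own Content-Length, so you can build the same MultipartFormDataContent you would actually send, but with a zero-byte part, read its Headers.ContentLength to get the framing overhead, and add the real file length:

```csharp
using System;
using System.Net.Http;

// Predict the total Content-Length HttpClient will send for a one-file
// multipart upload, without touching the file itself.
static long PredictMultipartLength(long fileLength, string fileName)
{
    // A zero-byte part carrying the same headers as the real upload.
    var probe = new ByteArrayContent(Array.Empty<byte>());
    probe.Headers.Add("Content-Disposition",
        $"form-data; name=\"file\"; filename=\"{fileName}\"");

    var form = new MultipartFormDataContent();
    form.Add(probe);

    // With no payload, the form's ContentLength is pure framing overhead
    // (boundaries, part headers, line breaks). The default boundary is a
    // GUID string, so its length is the same from instance to instance.
    long overhead = form.Headers.ContentLength ?? 0;
    return overhead + fileLength;
}
```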
Now I am able to immediately determine ahead of time whether the server can accept requests that are as big as the file my user wants to upload.
I have been developing a OneDrive desktop client app, because the one built into Windows has been failing me for reasons I cannot figure out. I'm using the REST API in C# via an HttpClient.
All requests to the onedrive endpoint work fine (downloading, uploading small files, etc.) and uploading large files worked fine up until recently (about two days ago). I get the upload session URL and start uploading data to it, but after uploading two chunks to it successfully (202 response), the third request and beyond times out (via the HttpClient), whether it be a GET to get the status, or a PUT to upload data. The POST to create the session still works.
I have tried: getting a new ClientId, logging into a new Microsoft account, reverting code to a known working state, and recloning git repository.
In Postman, I can go through the whole process of creating a session and uploading chunks without experiencing this issue, but if I take an upload URL that my application retrieves from the OneDrive API and try to PUT data to it in Postman, the server doesn't respond (unless the request is invalid, in which case it sometimes tells me). Subsequent GET requests to this URL also don't respond.
Here is a log of all requests going to the OneDrive API after authentication: https://pastebin.com/qRrw2Sb5
and here is the relevant code:
//first, create an upload session
var httpResponse = await _httpClient.StartAuthenticatedRequest(url, HttpMethod.Post).SendAsync(ct);
if (httpResponse.StatusCode != HttpStatusCode.OK)
{
return new HttpResult<IRemoteItemHandle>(httpResponse, null);
}
//get the upload URL
var uploadSessionRequestObject = await HttpClientHelper.ReadResponseAsJObjectAsync(httpResponse);
var uploadUrl = (string)uploadSessionRequestObject["uploadUrl"];
if (uploadUrl == null)
{
Debug.WriteLine("Successful OneDrive CreateSession request had invalid body!");
//TODO: what to do here?
}
//the length of the file total
var length = data.Length;
//setup the headers
var headers = new List<KeyValuePair<string, string>>()
{
new KeyValuePair<string, string>("Content-Length", ""),
new KeyValuePair<string, string>("Content-Range","")
};
JObject responseJObject;
//the response that will be returned
HttpResponseMessage response = null;
//get the chunks
List<Tuple<long, long>> chunks;
do
{
HttpResult<List<Tuple<long, long>>> chunksResult;
//get the chunks
do
{
chunksResult = await RetrieveLargeUploadChunksAsync(uploadUrl, _10MB, length, ct);
//TODO: should we delay on failure?
} while (chunksResult.Value == null); //keep trying to get the results until we're successful
chunks = chunksResult.Value;
//upload each fragment
var chunkStream = new ChunkedReadStreamWrapper(data);
foreach (var fragment in chunks)
{
//setup the chunked stream with the next fragment
chunkStream.ChunkStart = fragment.Item1;
//the size is one more than the difference (because the range is inclusive)
chunkStream.ChunkSize = fragment.Item2 - fragment.Item1 + 1;
//setup the headers for this request
headers[0] = new KeyValuePair<string, string>("Content-Length", chunkStream.ChunkSize.ToString());
headers[1] = new KeyValuePair<string, string>("Content-Range", $"bytes {fragment.Item1}-{fragment.Item2}/{length}");
//submit the request until it is successful
do
{
//this should not be authenticated
response = await _httpClient.StartRequest(uploadUrl, HttpMethod.Put)
.SetContent(chunkStream)
.SetContentHeaders(headers)
.SendAsync(ct);
} while (!response.IsSuccessStatusCode); // keep retrying until success
}
//parse the response to see if there are more chunks or the final metadata
responseJObject = await HttpClientHelper.ReadResponseAsJObjectAsync(response);
//try to get chunks from the response to see if we need to retry anything
chunks = ParseLargeUploadChunks(responseJObject, _10MB, length);
}
while (chunks.Count > 0);//keep going until no chunks left
Everything does what the comments say or what the name suggests, but a lot of the methods/classes are my own, so I'd be happy to explain anything that might not be obvious.
I have absolutely no idea what's going on and would appreciate any help. I'm trying to get this done before I go back to school on Saturday and no longer have time to work on it.
EDIT: After waiting a while, requests can be made to the upload URL again via Postman.
EDIT 2: I can no longer replicate this timeout phenomenon in Postman. Whether I get the upload URL from my application, or from another Postman request, and whether or not the upload has stalled in my application, I can seem to upload all the fragments I want to through Postman.
EDIT 3: This not-responding behavior starts before the content stream is read from.
Edit 4: Looking at packet info on WireShark, the first two chunks are almost identical, but only "resend" packets show up on the third.
So after 3 weeks of varying levels of testing, I have finally figured out the issue, and it has almost nothing to do with the OneDrive Graph API. The problem was that when making the HTTP requests, I was using HttpCompletionOption.ResponseHeadersRead but not reading the responses before sending the next one. This meant the HttpClient prevented me from sending more requests until I had read the responses to the old ones. It was strange because it allowed me to send two requests before locking up.
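A sketch of the fix described above (the httpClient, request, and ct names are placeholders, not taken from the project): dispose each ResponseHeadersRead response before issuing the next request, so the connection is released back to the pool.

```csharp
// With ResponseHeadersRead, the connection stays reserved until the
// response body is consumed or the response is disposed; without the
// using block, a later request can stall waiting for a free connection.
using (HttpResponseMessage response = await httpClient.SendAsync(
           request, HttpCompletionOption.ResponseHeadersRead, ct))
{
    // Read whatever is needed from the response here.
    string body = await response.Content.ReadAsStringAsync();
} // Dispose() releases the connection for the next request
```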
I have an image processor app that processes images and retrieves them from Azure Blob Storage. At the moment my blob storage service returns an absolute URL to the blob's image, and my action returns a redirect to that URL. For example:
[Route("/blob-storage/{imageName}")]
[HttpGet]
public async Task<IActionResult> GetImage(string imageName, ImageSize size)
{
var imageUrl = await this.ImageProcessorFacade.GetImageUrl(imageName, size);
return Redirect(imageUrl);
}
Now I want to cache the returned image. Yes, the ResponseCache attribute exists, but it doesn't work for me with a redirect, and I think that is a bad way to solve the problem anyway. To get an image, I call, for example: http://localhost/blob-storage/test.jpeg?size...
And the response is a redirect to blob.windows.net/... etc.
Is there a way to cache it?
Thank you for your time!
You call the Redirect() method in your controller action to redirect to a specified URL, and the HTTP Location header field is returned in the response. The user agent (e.g. a web browser), on receiving a response with a 3xx code, then makes a second request to the new URL given in the Location field. Because this happens on the browser side, you cannot append or modify cache-related headers on that second response.
If you'd like to lower latency and deliver the images stored in Azure Blob Storage faster, you could try using a CDN.
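An alternative to redirecting (a sketch; GetImageBytes is a hypothetical facade method, not part of the question's code) is to proxy the blob content through the action, so cache headers apply to the image response itself:

```csharp
[Route("/blob-storage/{imageName}")]
[HttpGet]
[ResponseCache(Duration = 3600, Location = ResponseCacheLocation.Any)]
public async Task<IActionResult> GetImage(string imageName, ImageSize size)
{
    // Hypothetical: fetch the bytes instead of the URL, so the response
    // body is the image itself and the cache headers take effect.
    byte[] imageBytes = await this.ImageProcessorFacade.GetImageBytes(imageName, size);

    // Content type assumed here; in practice, derive it from the blob's metadata.
    return File(imageBytes, "image/jpeg");
}
```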
First time posting! I've been breaking my head over this particular case. I've got a web application that needs to upload a file to a web API and receive an SVG file (as a string) back.
The web-app uploads the file as follows:
using (var client = new WebClient())
{
var response = client.UploadFile(apiUrl, FileIGotEarlierInMyCode);
ViewBag.MessageTest = response.ToString();
}
The above works, but then we get to the API part:
How do I access the uploaded file? Pseudocode:
public string Post([FromBody]File f)
{
File uploadedFile = f;
String svgString = ConvertDataToSVG(uploadedFile);
return svgString;
}
In other words: How do I upload/send an XML-file to my Web-api, use/manipulate it there and send other data back?
Thanks in advance!
Nick
PS: I tried this answer:
Accessing the exact data sent using WebClient.UploadData on the server
But my code did not compile on Request.InputStream.
The reason Request.InputStream didn't work for you is that the Request property can refer to different types of Request objects, depending on what kind of ASP.NET solution you are developing. There is:
HttpRequest, as available in Web Forms,
HttpRequestBase, as available in MVC Controllers
HttpRequestMessage, as available in Web API Controllers.
You are using Web API, so HttpRequestMessage it is. Here is how you read the raw request bytes using this class:
var data = Request.Content.ReadAsByteArrayAsync().Result;
I have a simple form that uploads an image to a database. Using a controller action, the image can then be served back (I've hard coded to use jpegs for this code):
public class ImagesController : Controller
{
[HttpPost]
public ActionResult Create(HttpPostedFileBase image)
{
var message = new MessageItem();
message.ImageData = new byte[image.ContentLength];
image.InputStream.Read(message.ImageData, 0, image.ContentLength);
this.session.Save(message);
return this.RedirectToAction("index");
}
[HttpGet]
public FileResult View(int id)
{
var message = this.session.Get<MessageItem>(id);
return this.File(message.ImageData, "image/jpeg");
}
}
This works great and directly browsing to the image (e.g. /images/view/1) displays the image correctly. However, I noticed that when FireBug is turned on, I'm greeted with a lovely error:
Image corrupt or truncated: data:image/jpeg;base64,/f39... (followed by the base64 representation of the image).
Additionally in Chrome developer tools:
Resource interpreted as Document but transferred with MIME type image/jpeg.
I checked the headers that are being returned. The following is an example of the headers sent back to the browser. Nothing looks out of the ordinary (perhaps the Cache-Control?):
Cache-Control private, s-maxage=0
Content-Type image/jpeg
Server Microsoft-IIS/7.5
X-AspNetMvc-Version 3.0
X-AspNet-Version 4.0.30319
X-SourceFiles =?UTF-8?B?(Trimmed...)
X-Powered-By ASP.NET
Date Wed, 25 May 2011 23:48:22 GMT
Content-Length 21362
Additionally, I thought I'd mention that I'm running this on IIS Express (even tested on Cassini with the same results).
The odd part is that the image displays correctly but the consoles are telling me otherwise. Ideally I'd like to not ignore these errors. Finally, to further add to the confusion, when referenced as an image (e.g. <img src="/images/view/1" />), no error occurs.
EDIT: It is possible to fully reproduce this without any of the above actions:
public class ImageController : Controller
{
public FileResult Test()
{
// I know this is directly reading from a file, but the whole purpose is
// to return a *buffer* of a file and not the *path* to the file.
// This will throw the error in FireBug.
var buffer = System.IO.File.ReadAllBytes("PATH_TO_JPEG");
return this.File(buffer, "image/jpeg");
}
}
You're assuming the MIME type is always image/jpeg, and you're not using the MIME type of the uploaded image. I've seen these MIME types posted by different browsers for uploaded images:
image/gif
image/jpeg
image/pjpeg
image/png
image/x-png
image/bmp
image/tiff
Maybe image/jpeg is not the correct MIME type for the file and the dev tools are giving you a warning.
Could it be that the session.Save/Get is truncating the jpeg?
Use Fiddler and save this file on the server. Attempt a GET request directly to the image, then attempt the GET to the action method. Compare Fiddler's headers and content for the two (you can save them out and compare with a trial of Beyond Compare). If they match for both GET requests... well, that wouldn't make sense; something would be different in that case, and hopefully it would point to the issue. Something has to be different, but without seeing the Fiddler output it's hard to say. :)
Could it possibly be that the image itself is corrupt? If you save it as a file on your website and access it directly does the error come up? How does that request look compared to your action request in Fiddler? It could be the browsers are trying to get the content type by extension, you could try a route like this to see if there are any changes:
routes.MapRoute(
"JpegImages",
"Images/View/{id}.jpg",
new { controller = "Images", action = "View" }
);
One more thing to check: image.InputStream.Read() returns an integer, which is the actual number of bytes read. It may be that not all the bytes can be read at once; can you record that and throw an error if the numbers don't match?
int bytesRead = image.InputStream.Read(message.ImageData, 0, image.ContentLength);
if (bytesRead != image.ContentLength)
throw new Exception("Invalid length");
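Building on that suggestion: Stream.Read is not guaranteed to fill the buffer in one call, so rather than throwing, a read loop (a generic sketch, not specific to HttpPostedFileBase) guarantees a complete buffer:

```csharp
using System;
using System.IO;

// Read exactly `count` bytes from the stream, looping until the buffer
// is full or the stream ends early.
static byte[] ReadExactly(Stream stream, int count)
{
    var buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        // Read returns however many bytes are available right now,
        // which may be fewer than requested.
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Stream ended before the expected length.");
        offset += read;
    }
    return buffer;
}
```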
I wonder if it is something to do with X-SourceFiles. I'm doing the exact same thing as you with MVC but I am persisting my byte array in the database. The only difference I don't understand in our headers is the X-SourceFiles.
Here is something about what X-SourceFiles does: "What does the X-SourceFiles header do?", which talks about encoding. So maybe?? The answerer notes that this header only appears on localhost, by the way.
As far as I understand it if you are returning a proper byte array that is a jpeg then your code should work fine. That is exactly what I'm doing successfully (without an X-SourceFiles header).
Thanks everyone for all the help. I know this is going to be a very anticlimactic ending to this problem, but I was able to "resolve" the issue. I tried building my code from another machine using the same browser/Firebug versions. Oddly enough, no errors appeared. When I went back to the other machine (cleared all caches and even re-installed the browser/Firebug), it was still getting the error. What's even weirder is that both Chrome and Firefox are now showing the error when I visit other websites.
Again, thanks everyone for all their suggestions!