Get file stream without loading it into memory - C#

How is it possible to get the file that the user is uploading without loading it into memory?
This line makes memory usage go up: var request = await httpContent.ReadAsMultipartAsync();
The client can only upload one file, so is there a way to get the stream without going through ReadAsMultipartAsync?
var request = await httpContent.ReadAsMultipartAsync();
var content = request.Contents.FirstOrDefault();
var stream = await content.ReadAsStreamAsync();
I only want the stream of the content. If I get the stream like this:
var stream = await httpContent.ReadAsStreamAsync();
Then it gives a file that looks like this:
-----------------------------224143682423505141523038143258
Content-Disposition: form-data; name="file"; filename="test.txt"
Content-Type: text/plain
my dummy content
-----------------------------224143682423505141523038143258--

Uploading a file to an ASP.NET web service always bloats the memory, because that is how the request pipeline works. There is an alternative, though, when you are using a cloud service: you can use the valet key pattern to let the client upload the file directly to a storage solution (sketched below). This prevents your service from having to deal with the payload at all.
If you do want your service to handle the request, there is no way of doing so without consuming memory.
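For example, with Azure Blob Storage the valet key can be a short-lived SAS URI that the client uploads to directly. A minimal sketch, assuming the Azure.Storage.Blobs 12.x SDK and a shared-key connection string; the container name, blob name, and connection string are placeholders:
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class ValetKeyExample
{
    // Returns a URI the client can upload the file to directly, so the web
    // service never receives the file payload itself.
    public static Uri GetUploadUri(string connectionString, string containerName, string blobName)
    {
        var container = new BlobContainerClient(connectionString, containerName);
        var blob = container.GetBlobClient(blobName);

        // Short-lived SAS that only allows creating/writing this one blob.
        return blob.GenerateSasUri(
            BlobSasPermissions.Create | BlobSasPermissions.Write,
            DateTimeOffset.UtcNow.AddMinutes(15));
    }
}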

Related

Losing content-type when persisting to Azure blob storage

I've got an Azure function which persists files sent in the request to blob storage. For some reason, even though the files have a content type, they're being persisted with the application/octet-stream content type rather than, say, image/jpeg.
var file = req.Form.Files["File"];
var blobClient = new BlobContainerClient(connection, containerName);
var blob = blobClient.GetBlobClient(fileName);
await blob.UploadAsync(file.OpenReadStream());
Any ideas why this is happening?
The uploaded blob will not automatically figure out the content type (or keep the existing one); it needs to be set manually.
If you don't set it at the blob level, files will be uploaded as application/octet-stream, as you've found.
Luckily it's quite simple to do. With the classic storage SDK (CloudBlockBlob) you access the properties and set them directly, like:
blob.Properties.ContentType = "video/mp4";
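With the newer Azure.Storage.Blobs client used in the question, the equivalent is to pass BlobHttpHeaders when uploading. A minimal sketch, assuming the 12.x SDK and reusing the question's connection, containerName, fileName and file variables:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var blobClient = new BlobContainerClient(connection, containerName);
var blob = blobClient.GetBlobClient(fileName);

// Carry over the content type the browser supplied for the form file
// instead of letting the blob default to application/octet-stream.
var options = new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders { ContentType = file.ContentType }
};
await blob.UploadAsync(file.OpenReadStream(), options);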

Stream file upload from client directly to S3 through ASP.NET Core 2 app

Background: A user has a file to upload, paired with some other multipart form text data. This is POST'd to our ASP.NET Core 2 application, which then has some logic built around successfully receiving that file.
I know I can generate a pre-signed S3 URL for the client to consume but we are looking to avoid that.
The problem I have is being unable to stream the file directly to S3. We can save the file locally and then stream that file to S3 (successfully), but we want to avoid this as we will have scaling issues. We want to minimize buffering of the data being streamed to S3 if possible, and not save any files to local disk for upload.
I've tried multiple flavours of
TransferUtility.UploadAsync(stream, ...)
IAmazonS3.UploadPartAsync(...)
IAmazonS3.UploadObjectFromStreamAsync(...)
https://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
and a few other ways which I can no longer remember.
This is one (if not the major) problem I've come across in most attempts: https://github.com/aws/aws-sdk-net/issues/675
How can I use the HTTP stream provided to upload said data to S3?
This is how I'm handling my file stream in my results controller:
[HttpPost("myupload")]
public async Task<IActionResult> TestBlobAsync()
{
    var result = await this._internMethod(Request);
    return Ok(result);
}
I'm testing my software using the following Postman HTTP request:
POST /api/result/myupload HTTP/1.1
Host: localhost:62788
Content-Type: multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW
Cache-Control: no-cache
Postman-Token: a17bef73-6015-c547-6f41-bc47274cb679
------WebKitFormBoundary7MA4YWxkTrZu0gW
Content-Disposition: form-data; name="result"; filename="dummy-35MB.bin"
Content-Type: application/octet-stream
------WebKitFormBoundary7MA4YWxkTrZu0gW
Content-Disposition: form-data; name="value1"
2
------WebKitFormBoundary7MA4YWxkTrZu0gW
Content-Disposition: form-data; name="value2"
8
------WebKitFormBoundary7MA4YWxkTrZu0gW--
Please let me know if I can provide any other details.
My solution was to change
[HttpPost("myupload")]
public async Task<IActionResult> TestBlobAsync()
{
result = await this._internMethod(Request);
}
to
[HttpPost("myupload")]
public async Task<IActionResult> TestBlobAsync(IFileForm file)
{
...
}
I am now able to use Stream stream = file.OpenReadStream(), pass it to TransferUtility.UploadAsync(stream, ...), and get around the missing content-length issues I was having when handling the Request object manually.
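For reference, a minimal sketch of that shape, assuming the AWSSDK.S3 package; the controller, route, and bucket names are placeholders:
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Transfer;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[Route("api/result")]
public class ResultController : Controller
{
    [HttpPost("myupload")]
    public async Task<IActionResult> TestBlobAsync(IFormFile file)
    {
        // Credentials and region come from the standard AWS SDK configuration.
        var s3Client = new AmazonS3Client();
        var transferUtility = new TransferUtility(s3Client);

        // Stream the form file to S3 without saving it to local disk first.
        using (var stream = file.OpenReadStream())
        {
            await transferUtility.UploadAsync(stream, "my-bucket", file.FileName);
        }

        return Ok();
    }
}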

ASP.NET Core Response headers on a FileStreamResult

I am downloading a .mp4 file from Azure Blob storage and pushing it to the UI. The download works fine; the issue is that the Content-Length header does not appear to be set correctly. As a result you cannot track the download progress, because the browser only shows how much has been downloaded, not how much is left or the estimated time. Is my code for the response wrong, or should I change my request? My code is as follows:
[HttpGet("VideoFileDownload")]
public IActionResult VideoFileDownloadAsync([FromQuery]int VideoId)
{
    // ...code to get blob file
    return new FileStreamResult(blob.OpenRead(), new MediaTypeHeaderValue("application/octet-stream"));
}
I have played around with various request and response headers but it makes no difference.
The files are big. I know the old ASP.NET way of checking for range headers and then doing a chunked stream, but I want to use the new features in .NET Core, which don't work as expected, or maybe I just don't understand them thoroughly. Can somebody give me a working sample of a file download with ASP.NET Core code?
If you have the file size, you can set the response's content length in the Response object just before returning the FileStreamResult, like this:
[HttpGet("VideoFileDownload")]
public IActionResult VideoFileDownloadAsync([FromQuery]int VideoId)
{
    // ...code to get blob file
    long myFileSize = blob.Length; // Or wherever you can get your file size from.
    this.Response.ContentLength = myFileSize;
    return new FileStreamResult(blob.OpenRead(), new MediaTypeHeaderValue("application/octet-stream"));
}
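The question also mentions range headers; in ASP.NET Core 2.1 and later, FileStreamResult can handle those itself via EnableRangeProcessing. A minimal sketch, assuming that version and the same blob variable:
// Opt in to HTTP Range support so the browser can seek/resume within the .mp4.
return new FileStreamResult(blob.OpenRead(), new MediaTypeHeaderValue("application/octet-stream"))
{
    EnableRangeProcessing = true
};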

Artifactory REST Deploy screwing up upload

I am trying to deploy artifacts to Artifactory using their REST API; however, all my files end up having
-------------------------------28947758029299
Content-Disposition: form-data; name="test.txt"; filename="new2.txt"
Content-Type: application/octet-stream
appended to the file. Here is my code (keep in mind this is only me testing the concept...the code will be cleaned after I get a success)
var uriString = "artifactoryuri";
var uri = new Uri(uriString);
var credentialCache = new CredentialCache{{uri, "Basic",new NetworkCredential("UN", "PW")}};
var restClient = new RestClient(uriString);
var restRequest = new RestRequest(Method.PUT){Credentials = credentialCache};
restRequest.AddFile("test.txt", @"pathto\new2.txt");
var restResponse = restClient.Execute(restRequest);
How can I fix this? Is it because it is a text file and Artifactory tends to store executables and such? If so, I can live with that. This will currently be used to upload .chm files.
This is caused by the AddFile method: RestSharp creates a multipart/form-data request by default. I could not find a good way to prevent this behavior, although many people ask about it. You can take a look at "Can RestSharp send binary data without using a multipart content type?" and "PUT method and send raw bytes".
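One way around it is to skip RestSharp for the deploy call and PUT the raw bytes with HttpClient. A minimal sketch, assuming the Artifactory deploy endpoint accepts a plain PUT of the file body; the URI, credentials, and path are placeholders:
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class ArtifactoryDeploy
{
    public static async Task UploadAsync(string uriString, string user, string password, string filePath)
    {
        using (var client = new HttpClient())
        using (var fileStream = File.OpenRead(filePath))
        {
            // Basic authentication header built from the supplied credentials.
            var token = Convert.ToBase64String(Encoding.UTF8.GetBytes($"{user}:{password}"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            // StreamContent sends the file bytes as-is, with no multipart wrapping.
            var content = new StreamContent(fileStream);
            content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

            var response = await client.PutAsync(uriString, content);
            response.EnsureSuccessStatusCode();
        }
    }
}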

Storing an image from a webservice

I am accessing an API that returns a favicon for a specified domain (http://getfavicon.appspot.com/). I have a long list of domains that I want to get icons for and don't want to call the web service every time, so I figured I would get the response and store the image either on the file system or in a DB blob.
However, I don't know how to get something meaningful from the response stream that comes back from the service.
byte[] buf = new byte[8192];
var request = (HttpWebRequest)WebRequest.Create("http://getfavicon.appspot.com/http://stackoverflow.com");
var response = (HttpWebResponse)request.GetResponse();
var resStream = response.GetResponseStream();
I've got as far as getting a response back, but how can I treat this as something I can save to a SQL DB or out to the filesystem?
Am I missing something simple?
Thanks
If you use the System.Net.WebClient class, you can do this a little more easily.
This will download the URL and save it to a local file:
var client = new System.Net.WebClient();
client.DownloadFile(
    // Url to download
    @"http://getfavicon.appspot.com/http://stackoverflow.com",
    // Filename of where to save the downloaded content
    "stackoverflow.com.ico");
If you want a byte[] instead, use the DownloadData method.
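For example, a minimal sketch that fetches the icon as a byte[] ready for a SQL parameter or a file write; the output filename is a placeholder:
var client = new System.Net.WebClient();

// DownloadData returns the response body as a byte[], which can be stored
// in a varbinary/blob column or written out with File.WriteAllBytes.
byte[] iconBytes = client.DownloadData(
    @"http://getfavicon.appspot.com/http://stackoverflow.com");

System.IO.File.WriteAllBytes("stackoverflow.com.ico", iconBytes);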
