I have a web API (ASP.NET Core) that receives a file and posts it to another web API.
For now, I create a FileStream and use HttpClient to POST the file.
But I wonder: is there a two-way Stream that could replace the FileStream, i.e. a Stream whose ReadAsync waits until it has enough bytes to fill the read buffer?
var content = new MultipartFormDataContent();
// the problem is here, the stream must be ready for "read to end",
// so I buffered the uploaded file to a FileStream
content.Add(new StreamContent(filestream), "file1", "myfilename");
await client.PostAsync("url", content);
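If the goal is simply to avoid buffering to disk, one option (a minimal sketch, assuming the action receives the upload as an IFormFile; the URL and field names are placeholders) is to hand the incoming stream straight to StreamContent. A System.IO.Pipelines Pipe would give the "ReadAsync waits until data is available" behaviour if the read and write sides need to run concurrently.
[HttpPost]
public async Task<IActionResult> Forward(IFormFile file)
{
    using (var client = new HttpClient())
    using (var content = new MultipartFormDataContent())
    using (var source = file.OpenReadStream())
    {
        // No FileStream: the uploaded file is streamed straight into the outgoing request.
        content.Add(new StreamContent(source), "file1", file.FileName);
        var response = await client.PostAsync("url", content);
        return StatusCode((int)response.StatusCode);
    }
}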
Related
I send an audio file to a server API as MultipartFormData. For this purpose, I first convert the StorageFile to bytes, then wrap the bytes in a Stream and post it in a MultipartFormData request. The server answers my request in MultipartFormData format with another audio file.
I receive that response in an HttpResponseMessage; my question is how can I convert it to an mp3 file?
I am using Windows IoT with the UWP platform.
multipartContent.Add(new ByteArrayContent(await GetBytesAsync(storageFile)), "audio", "audio.mp3");
request.Content = multipartContent;
var response = await httpClient.SendAsync(request);
var content = new StreamReader(await response.Content.ReadAsStreamAsync()).ReadToEnd();
In UWP, if you want to write to a file using a stream, follow the four-step model:
Open the file to get a stream
Get an output stream.
Create a DataWriter object and call the corresponding Write method.
Commit the data in the data writer and flush the output stream.
Please see Create, write, and read a file and Best practices for writing to files for more information.
The official File access sample for your reference.
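As an illustration, a minimal sketch of those four steps applied to this question (assuming the response bytes should end up in an "audio.mp3" file in the app's local folder):
byte[] bytes = await response.Content.ReadAsByteArrayAsync();
StorageFile file = await ApplicationData.Current.LocalFolder
    .CreateFileAsync("audio.mp3", CreationCollisionOption.ReplaceExisting);
using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.ReadWrite)) // 1. open the file
using (IOutputStream outputStream = stream.GetOutputStreamAt(0))                    // 2. get an output stream
using (var dataWriter = new DataWriter(outputStream))                               // 3. create a DataWriter
{
    dataWriter.WriteBytes(bytes);
    await dataWriter.StoreAsync();                                                   // 4. commit the data
    await outputStream.FlushAsync();                                                 //    and flush the output stream
}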
I got it done with the following extra code.
First, I convert the response to a byte[] array, then I write the bytes to the file inside a new task, because the main (UI) thread won't let another async task run on it.
var response = await httpClient.SendAsync(request);
byte[] x = await response.Content.ReadAsByteArrayAsync();
await Task.Run(() =>
    System.IO.File.WriteAllBytes(storageFile.Path, x));
I want to write an image to a stream and read it afterwards.
I'm on Windows 10 UWP.
My code:
InMemoryRandomAccessStream imrasIn = new InMemoryRandomAccessStream();
await _mediaCapture.CapturePhotoToStreamAsync(ImageEncodingProperties.CreateJpeg(), imrasIn);
DetectedFaces = await _faceClient.DetectAsync(imrasIn.GetInputStreamAt(0).AsStreamForRead());
It does not work; DetectAsync gets an empty stream (error: "Image size is too small").
Do I need other classes? CapturePhotoToStreamAsync wants an IRandomAccessStream and DetectAsync wants a Stream.
I had to rewind the stream before reading (and after writing to it):
imrasIn.Seek(0);
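Putting it together, a sketch of the corrected sequence (same APIs as in the question, only the rewind added):
InMemoryRandomAccessStream imrasIn = new InMemoryRandomAccessStream();
await _mediaCapture.CapturePhotoToStreamAsync(ImageEncodingProperties.CreateJpeg(), imrasIn);

// Rewind: CapturePhotoToStreamAsync leaves the position at the end of the written data.
imrasIn.Seek(0);
DetectedFaces = await _faceClient.DetectAsync(imrasIn.GetInputStreamAt(0).AsStreamForRead());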
I'm having a tough time trying to send large files with HTTP file upload in ASP.NET.
The target is to transfer content larger than 2 GB, compressed to save bandwidth. Sending 3 GB uncompressed works well; all the files are received and saved correctly on disk. Yet when I use compression (either gzip or deflate), I get the following error from the receiving API:
Unexpected end of MIME multipart stream. MIME multipart message is not complete.
The thing is that I only get the exception when sending large requests (approx. 300 MB seems to be the upper limit). Uploading 200 MB compressed works well.
Here's the upload code:
using (var client = new HttpClient())
{
    using (var content = new MultipartFormDataContent())
    {
        client.BaseAddress = new Uri(fileServerUrl);
        client.DefaultRequestHeaders.TransferEncodingChunked = true;

        CompressedContent compressedContent = new CompressedContent(content, "gzip");
        var request = new HttpRequestMessage(HttpMethod.Post, "/api/fileupload/")
        {
            Content = compressedContent
        };

        var uploadResponse = client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead).Result;
    }
}
The CompressedContent class comes from the WebApiContrib set of helpers.
Here's the receiving end:
// CustomStreamProvider is an implementation to store the files uploaded on the disk
var streamProvider = new CustomStreamProvider(uploadPath);
var res = await Request.Content.ReadAsMultipartAsync(streamProvider);
streamProvider.FileData.Select(file => new FileInfo(file.LocalFileName)).Select(fi => fi.FullName)
Can you give me a clue what the problem is? Why does larger content (over 300 MB) appear to be compressed improperly while smaller content is transferred just fine?
Are you compressing the file in chunks? If you are able to send uncompressed files, then clearly the problem is in your compression routine. Using gzip or Deflate out of the box has limitations on file size, so you need to compress in chunks.
Tip:
Debug your compression routine: compress your file, save it to disk, and check whether it is readable.
Take a look at this article.
Check the Content-Length: the server-side routine does not expect a Content-Length that differs from the actual body. You are probably sending the Content-Length of the original file while the body contains compressed content, so the multipart parser expects more data than it receives.
Instead, send a zipped file and extract it on the server side; do the zipping and unzipping at the application level rather than plugging them into the HTTP protocol.
You will have to make sure the order of compression/decompression and the Content-Length are correct on both sides. As far as I am aware, compression on upload is not very common over HTTP.
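For illustration, a minimal sketch of that application-level approach (assuming a local source file path and the same /api/fileupload/ endpoint; the variable names are placeholders): gzip the file to a temporary file first, then upload the compressed result as an ordinary multipart part and decompress it on the server.
using (var client = new HttpClient { BaseAddress = new Uri(fileServerUrl) })
{
    string tempGz = Path.GetTempFileName();

    // Compress at the application level before the upload starts.
    using (var source = File.OpenRead(sourceFilePath))
    using (var target = File.Create(tempGz))
    using (var gzip = new GZipStream(target, CompressionMode.Compress))
    {
        await source.CopyToAsync(gzip);
    }

    // Upload the compressed file as a plain multipart part; no HTTP-level compression involved.
    using (var compressed = File.OpenRead(tempGz))
    using (var content = new MultipartFormDataContent())
    {
        content.Add(new StreamContent(compressed), "file", "upload.gz");
        var response = await client.PostAsync("/api/fileupload/", content);
    }
}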
Try to add a name to your input as below:
<input type="file" id="fileInput" name="fileInput"/>
Or use a custom stream to append the newline that ASP.NET Web API is expecting:
Stream reqStream = Request.Content.ReadAsStreamAsync().Result;
MemoryStream tempStream = new MemoryStream();
reqStream.CopyTo(tempStream);

tempStream.Seek(0, SeekOrigin.End);
StreamWriter writer = new StreamWriter(tempStream);
writer.WriteLine();
writer.Flush();
tempStream.Position = 0;

StreamContent streamContent = new StreamContent(tempStream);
foreach (var header in Request.Content.Headers)
{
    streamContent.Headers.Add(header.Key, header.Value);
}

// Read the form data and return an async task.
await streamContent.ReadAsMultipartAsync(provider);
Consider that we have two methods:
Task DownloadFromAToStreamAsync(Stream destinationStream);
Task UploadToBFromStreamAsync(Stream sourceStream);
Now we need to download content from A and upload it to B in a single operation.
One possible solution:
using (var stream = new MemoryStream())
{
await DownloadFromAToStreamAsync(stream);
stream.Seek(0, SeekOrigin.Begin);
await UploadToBFromStreamAsync(stream);
}
But this solution requires the whole stream content to be loaded into memory.
How can the task be solved more efficiently?
Change the download method to accept an additional size parameter which indicates how much to download. Then loop, downloading and uploading, until a download returns an empty stream.
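A minimal sketch of that loop, assuming hypothetical overloads that take an offset and a maximum chunk size (these signatures are placeholders, not the ones from the question):
const int chunkSize = 4 * 1024 * 1024; // 4 MB per round trip
long offset = 0;
while (true)
{
    using (var chunk = new MemoryStream())
    {
        await DownloadFromAToStreamAsync(chunk, offset, chunkSize);
        if (chunk.Length == 0)
            break; // nothing left to download
        chunk.Seek(0, SeekOrigin.Begin);
        await UploadToBFromStreamAsync(chunk);
        offset += chunk.Length;
    }
}
This way only one chunk is held in memory at a time, at the cost of one download/upload round trip per chunk.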
I am working on developing an HTTP server/client, and I can currently send small files over it, such as .txt files and other easy-to-read files that do not require much memory. However, when I want to send a larger file, say a .exe or a large .pdf, I get memory errors. These occur because I have to specify the size of my byte[] buffer before I try to send or receive a file. Is there a way to get the size of the buffer while reading it from the stream?
I want to do something like this:
//Create the stream.
private Stream dataStream = response.GetResponseStream();
//read bytes from stream into buffer.
byte[] byteArray = new byte[Convert.ToInt32(dataStream.Length)];
dataStream.Read(byteArray, 0, byteArray.Length);
However, calling "dataStream.Length" throws the error:
ExceptionError: This stream does not support seek operations.
Can someone offer some advice on how I can get the length of my byte[] from the stream?
Thanks,
You can use the CopyTo method of the stream:
MemoryStream m = new MemoryStream();
dataStream.CopyTo(m);
byte[] byteArray = m.ToArray();
You can also write directly to a file:
var fs = File.Create("....");
dataStream.CopyTo(fs);
The network layer has no way of knowing how long the response stream is.
However, the server is supposed to tell you how long it is; look in the Content-Length response header.
If that header is missing or incorrect, you're out of luck; you'll need to keep reading until you run out of data.
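A minimal sketch of that read-until-exhausted loop (essentially what CopyTo does internally), using a fixed-size buffer so the full length never needs to be known up front:
byte[] buffer = new byte[81920]; // fixed-size buffer, independent of the response size
using (Stream dataStream = response.GetResponseStream())
using (var result = new MemoryStream())
{
    int bytesRead;
    while ((bytesRead = dataStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        result.Write(buffer, 0, bytesRead);
    }
    byte[] byteArray = result.ToArray();
}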