I send an audio file to a server API as multipart/form-data. To do this, I first convert the StorageFile to a byte array, then wrap the bytes in a stream, and POST them with a MultipartFormDataContent request. The server answers my request in multipart/form-data format with another audio file.
I receive that response in an HttpResponseMessage; my question is: how can I convert it to an .mp3 file?
I am using Windows IoT with the UWP coding platform.
multipartContent.Add(new ByteArrayContent(await GetBytesAsync(storageFile)), "audio", "audio.mp3");
request.Content = multipartContent;
var response = await httpClient.SendAsync(request);
var content = new StreamReader(await response.Content.ReadAsStreamAsync()).ReadToEnd();
In UWP, if you want to write to a file using a stream, you follow a four-step model:
Open the file to get a stream
Get an output stream.
Create a DataWriter object and call the corresponding Write method.
Commit the data in the data writer and flush the output stream.
Please see Create, write, and read a file and Best practices for writing to files for more information.
The official File access sample is also available for your reference.
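The four steps above can be sketched as follows. This is a minimal sketch: `storageFile` is assumed to be a StorageFile you have already created or picked, and `audioBytes` is assumed to be the byte array read from the response (e.g. via `ReadAsByteArrayAsync`):

```csharp
// 1. Open the file to get a stream.
using (IRandomAccessStream fileStream =
    await storageFile.OpenAsync(FileAccessMode.ReadWrite))
// 2. Get an output stream.
using (IOutputStream outputStream = fileStream.GetOutputStreamAt(0))
// 3. Create a DataWriter object and call the corresponding Write method.
using (var dataWriter = new DataWriter(outputStream))
{
    dataWriter.WriteBytes(audioBytes);
    // 4. Commit the data in the data writer and flush the output stream.
    await dataWriter.StoreAsync();
    await outputStream.FlushAsync();
}
```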
I solved it with the following additional code.
First, I convert the response to a byte[] array; then I write the bytes to the file on a worker thread via Task.Run, so that the file I/O does not block the UI thread.
var response = await httpClient.SendAsync(request);
byte[] bytes = await response.Content.ReadAsByteArrayAsync();
await Task.Run(() =>
    System.IO.File.WriteAllBytes(storageFile.Path, bytes));
I have a Web API (ASP.NET Core) that receives a file and posts it to another Web API.
For now, I create a FileStream and use HttpClient to POST the file.
But I wonder: is there a two-way stream that can replace the FileStream? I mean a stream whose ReadAsync waits until enough bytes are available to fill the read buffer.
var content = new MultipartFormDataContent();
// The problem is here: the stream must be ready for "read to end",
// so I buffer the uploaded file into a FileStream.
content.Add(new StreamContent(filestream), "file1", "myfilename");
await client.PostAsync("url", content);
I've looked around for a solution to my issue, but no one seems to be aiming for quite what I'm trying to achieve.
My problem is this: I have zip files stored in Azure Blob storage. For security's sake, we have a Web API 2 controller action that provisions these zip files rather than allowing direct downloads. The action retrieves the blob and downloads it to a stream so that it can be packaged within an HttpResponseMessage.
All of the above works; however, when I attempt to recreate the zip file, I'm informed it's corrupted. For now I'm just attempting to have the server (running on localhost) create the zip file, whereas the endgame is to have remote client applications do this (I'm fairly certain the solution to my issue on the server would be the same).
public class FileActionResult : IHttpActionResult
{
    private HttpRequestMessage _request;
    private ICloudBlob _blob;

    public FileActionResult(HttpRequestMessage request, ICloudBlob blob)
    {
        _request = request;
        _blob = blob;
    }

    public async Task<HttpResponseMessage> ExecuteAsync(System.Threading.CancellationToken cancellationToken)
    {
        var fileStream = new MemoryStream();
        await _blob.DownloadToStreamAsync(fileStream);
        var byteTest = new byte[fileStream.Length];
        var test = fileStream.Read(byteTest, 0, (int)fileStream.Length);
        try
        {
            File.WriteAllBytes(@"C:\testPlatFiles\test.zip", byteTest);
        }
        catch (ArgumentException ex)
        {
            var a = ex;
        }
        var response = _request.CreateResponse(HttpStatusCode.Accepted);
        response.Content = new StreamContent(fileStream);
        response.Content.Headers.ContentLength = _blob.Properties.Length;
        response.Content.Headers.ContentType = new MediaTypeHeaderValue(_blob.Properties.ContentType);
        // set the fileName
        response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = _blob.Name,
            Size = _blob.Properties.Length
        };
        return response;
    }
}
I've looked into zip libraries to see if any offer a solution for converting the stream of a zip back to a zip file, but all I can find is reading zip files into streams, or creating zips in order to provision a file download rather than writing a file to disk.
Any help would be much appreciated. Thank you.
You can use DotNetZip. Its ZipFile class has a static factory method that should do what you want: ZipFile.Read(Stream zipStream) reads the given stream as a zip file and gives you back a ZipFile instance (which you can use however you like).
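As a minimal sketch (assuming the Ionic.Zip namespace from the DotNetZip package, and that `zipStream` holds the raw zip data, rewound to position 0; the extraction path is illustrative):

```csharp
using Ionic.Zip;

// Read the stream as a zip archive and extract its entries.
using (ZipFile zip = ZipFile.Read(zipStream))
{
    zip.ExtractAll(@"C:\testPlatFiles\extracted",
                   ExtractExistingFileAction.OverwriteSilently);
}
```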
However, if your Stream contains the raw zip data and all you want to do is persist it to disk, you should just be able to write the bytes straight to disk.
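For the persist-to-disk route, one common cause of a "corrupted" zip is reading a MemoryStream you have just written into without rewinding it first: after DownloadToStreamAsync, the position sits at the end, so a subsequent read copies nothing. A minimal sketch, where `memStream` is assumed to be the stream the blob was downloaded into:

```csharp
using System.IO;

// Rewind before reading, or the copy below writes zero bytes.
memStream.Position = 0;

using (var fileStream = File.Create(@"C:\testPlatFiles\test.zip"))
{
    memStream.CopyTo(fileStream);
}
```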
If you're getting 'zip file corrupted' errors, I'd look at the content encoding used to send the data to Azure and the content encoding it's sent back with. You should be sending it up to Azure with a content type of application/zip or application/octet-stream and possibly adding metadata to the Azure blob entry to send it down the same way.
Edited to note: DotNetZip used to live at CodePlex. CodePlex has been shut down, though the old archive is still available there. The code has migrated to GitHub:
https://github.com/DinoChiesa/DotNetZip. Looks to be the original author's repo.
https://github.com/haf/DotNetZip.Semverd. This looks to be the currently maintained version. It's also packaged up and available via NuGet at https://www.nuget.org/packages/DotNetZip/
I'm having a tough time trying to send large files with HTTP file upload in ASP.NET.
The target is to transfer content larger than 2 GB, compressed to save bandwidth. Sending 3 GB uncompressed works well; all the files are received and saved correctly on disk. Yet when I use compression (either gzip or deflate), I get the following error from the receiving API:
Unexpected end of MIME multipart stream. MIME multipart message is not complete.
The thing is that I get the exception only when sending large requests (approx. 300 MB seems to be the upper limit). Uploading 200 MB compressed works fine.
Here's the upload code:
using (var client = new HttpClient())
{
    using (var content = new MultipartFormDataContent())
    {
        client.BaseAddress = new Uri(fileServerUrl);
        client.DefaultRequestHeaders.TransferEncodingChunked = true;

        CompressedContent compressedContent = new CompressedContent(content, "gzip");
        var request = new HttpRequestMessage(HttpMethod.Post, "/api/fileupload/")
        {
            Content = compressedContent
        };

        var uploadResponse = client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead).Result;
    }
}
The CompressedContent class comes from the WebApiContrib set of helpers.
Here's the receiving end:
// CustomStreamProvider is an implementation to store the files uploaded on the disk
var streamProvider = new CustomStreamProvider(uploadPath);
var res = await Request.Content.ReadAsMultipartAsync(streamProvider);
streamProvider.FileData.Select(file => new FileInfo(file.LocalFileName)).Select(fi => fi.FullName)
Can you give me a clue what the problem is? Why do larger contents (over 300 MB) appear to be compressed improperly while smaller ones transfer just fine?
Are you compressing the file in chunks? If you are able to send uncompressed files, then clearly the problem is in your compression routine. Using gzip or Deflate out of the box has limitations on file size, so you need to compress in chunks.
Tip:
Debug your compression routine: try compressing your file, save the result to disk, and check whether it is readable.
Take a look at this article
Also check the Content-Length: the server-side routine does not expect you to send a different Content-Length than the body you actually transmit. You are probably sending the Content-Length of the original file while the body is compressed content, and the multipart parser expects data equal to Content-Length.
Alternatively, send a zipped file and extract it on the server side; keep zipping and unzipping at the application level instead of plugging them into the HTTP protocol.
You will have to make sure the order of compression/decompression and the Content-Length are correct on both sides. As far as I am aware, compressing uploads is not very common over HTTP.
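To debug the compression routine in isolation, a simple round trip with the BCL's GZipStream is enough: if decompressing the compressed bytes does not reproduce the original, the problem is the compression, not the upload. A sketch:

```csharp
using System.IO;
using System.IO.Compression;

static byte[] Compress(byte[] data)
{
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            gzip.Write(data, 0, data.Length);
        } // disposing the GZipStream flushes the gzip trailer
        return output.ToArray();
    }
}

static byte[] Decompress(byte[] data)
{
    using (var input = new MemoryStream(data))
    using (var gzip = new GZipStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        gzip.CopyTo(output);
        return output.ToArray();
    }
}

// If Decompress(Compress(fileBytes)) does not equal the original
// bytes, fix the compression side before touching the upload code.
```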
Try to add a name to your input as below:
<input type="file" id="fileInput" name="fileInput"/>
Or use a custom stream to append the newline that ASP.NET Web API is expecting:
Stream reqStream = Request.Content.ReadAsStreamAsync().Result;
MemoryStream tempStream = new MemoryStream();
reqStream.CopyTo(tempStream);

tempStream.Seek(0, SeekOrigin.End);
StreamWriter writer = new StreamWriter(tempStream);
writer.WriteLine();
writer.Flush();
tempStream.Position = 0;

StreamContent streamContent = new StreamContent(tempStream);
foreach (var header in Request.Content.Headers)
{
    streamContent.Headers.Add(header.Key, header.Value);
}

// Read the form data and return an async task.
await streamContent.ReadAsMultipartAsync(provider);
I am working on developing an HTTP server/client, and I can currently send small files over it, such as .txt files and other small files that do not require much memory. However, when I want to send a larger file, say an .exe or a large .pdf, I get memory errors. These occur because, before I try to send or receive a file, I have to specify the size of my byte[] buffer. Is there a way to get the size of the buffer while reading it from the stream?
I want to do something like this:
//Create the stream.
// Create the stream.
private Stream dataStream = response.GetResponseStream();
// Read bytes from the stream into the buffer.
byte[] byteArray = new byte[Convert.ToInt32(dataStream.Length)];
dataStream.Read(byteArray, 0, byteArray.Length);
However, calling "dataStream.Length" throws:
NotSupportedException: This stream does not support seek operations.
Can someone offer some advice as to how I can get the length of my byte[] from the stream?
Thanks,
You can use the CopyTo method of the stream:
MemoryStream m = new MemoryStream();
dataStream.CopyTo(m);
byte[] byteArray = m.ToArray();
You can also write directly to a file:
var fs = File.Create("....");
dataStream.CopyTo(fs);
The network layer has no way of knowing how long the response stream is.
However, the server is supposed to tell you how long it is; look in the Content-Length response header.
If that header is missing or incorrect, you're out of luck; you'll need to keep reading until you run out of data.
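The read-until-you-run-out pattern looks like this (a sketch; `dataStream` is assumed to be the response stream from the question):

```csharp
using System.IO;

const int BufferSize = 81920;
byte[] buffer = new byte[BufferSize];

using (var ms = new MemoryStream())
{
    int bytesRead;
    // Read returns 0 only at end of stream; a positive return value
    // may be less than the buffer size, so loop until it hits 0.
    while ((bytesRead = dataStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ms.Write(buffer, 0, bytesRead);
    }
    byte[] byteArray = ms.ToArray();
}
```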
I want to stream data from a server into a MediaElement in my Windows 8 Store (formerly Metro) app. However, I need to "record" the stream while it is streaming, so it can be served from cache if re-requested, so I don't want to feed the URL directly into the MediaElement.
Currently, the stumbling block is that MediaElement.SetSource() accepts an IRandomAccessStream, not a System.IO.Stream, which is what I get from HttpWebResponse.GetResponseStream().
The code I have now, which does not work:
var request = WebRequest.CreateHttp(url);
request.AllowReadStreamBuffering = false;
request.BeginGetResponse(ar =>
{
    var response = (HttpWebResponse)request.EndGetResponse(ar);
    // this is System.IO.Stream:
    var stream = response.GetResponseStream();
    // this needs IRandomAccessStream:
    MediaPlayer.SetSource(stream, "audio/mp3");
}, null);
Is there a solution that allows me to stream audio, but allows me to copy the stream to disk when it has finished reading from the remote side?
I haven't experimented with the following idea, but it might work: you could start streaming the web data into a file and then, after a few seconds (for buffering), pass that file to the MediaElement.
I have noticed that MediaElement can be picky about opening a file that is being written to, but I have seen it work; though I can't say why it sometimes works and sometimes doesn't.
I guess this would help you convert a Stream to an IRandomAccessStream:
InMemoryRandomAccessStream ras = new InMemoryRandomAccessStream();
using (Stream stream = response.GetResponseStream())
{
    Stream target = ras.AsStreamForWrite();
    await stream.CopyToAsync(target);
    await target.FlushAsync();
}
// Rewind so the consumer reads from the beginning.
ras.Seek(0);