Is HttpClient flawed when sending large files/content? - c#

After reading and googling about HttpClient, I have the impression that this component is not suitable for uploading large files or content to REST services.
It seems that if the upload takes longer than the established timeout, the transmission will fail. Does that make sense? What does this timeout mean?
Getting progress information seems hard or requires add-ons.
So my questions are: is it possible to solve these two issues without too much hassle? Otherwise, what's the best approach when working with large content and REST services?

Yes, if the upload takes longer than the Timeout, the upload will fail. This is a limitation of HttpClient. The most robust solution to this problem is the one Thomas Levesque has written an article about (linked from his comments on your question): use HttpWebRequest instead of HttpClient.
If you want to get progress messages, open the file as a FileStream and manually iterate through it, copying bytes in increments onto the (upload) request stream. As you go, you can calculate your progress relative to the file size.
Thomas Levesque's code example follows. Be sure to read the article, though:
long UploadFile(string path, string url, string contentType)
{
    // Build request
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = WebRequestMethods.Http.Post;
    request.AllowWriteStreamBuffering = false;
    request.ContentType = contentType;
    string fileName = Path.GetFileName(path);
    request.Headers["Content-Disposition"] = string.Format("attachment; filename=\"{0}\"", fileName);
    try
    {
        // Open source file
        using (var fileStream = File.OpenRead(path))
        {
            // Set content length based on source file length
            request.ContentLength = fileStream.Length;
            // Get the request stream with the default timeout
            // (GetRequestStreamWithTimeout is an extension method defined in the article)
            using (var requestStream = request.GetRequestStreamWithTimeout())
            {
                // Upload the file with no timeout
                fileStream.CopyTo(requestStream);
            }
        }
        // Get response with the default timeout, and parse the response body
        // (GetResponseWithTimeout is also from the article; JObject is Json.NET)
        using (var response = request.GetResponseWithTimeout())
        using (var responseStream = response.GetResponseStream())
        using (var reader = new StreamReader(responseStream))
        {
            string json = reader.ReadToEnd();
            var j = JObject.Parse(json);
            return j.Value<long>("Id");
        }
    }
    catch (WebException ex)
    {
        if (ex.Status == WebExceptionStatus.Timeout)
        {
            LogError(ex, "Timeout while uploading '{0}'", fileName);
        }
        else
        {
            LogError(ex, "Error while uploading '{0}'", fileName);
        }
        throw;
    }
}
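To add the progress reporting described above, the fileStream.CopyTo(requestStream) line can be replaced with a manual copy loop. Here is a minimal sketch under that assumption; CopyWithProgress is a hypothetical helper, not part of the article, and the callback could just as well update a UI:

```csharp
using System;
using System.IO;

static class UploadProgress
{
    // Copy the source stream in fixed-size chunks, invoking a callback with the
    // fraction complete (0.0 to 1.0) after each chunk is written.
    public static void CopyWithProgress(Stream source, Stream destination, long totalBytes,
                                        Action<double> onProgress, int bufferSize = 81920)
    {
        var buffer = new byte[bufferSize];
        long written = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            written += read;
            onProgress((double)written / totalBytes); // progress relative to the file size
        }
    }
}
```

In the method above you would call it as CopyWithProgress(fileStream, requestStream, fileStream.Length, p => ...), with the lambda reporting however your application prefers.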

Related

Fill ComboBox with names of files from a directory on FTP server

I have a folder on my FTP server and I want to fill a ComboBox with the contents of that folder. How would I go about doing this?
string result = string.Empty;
//Request location and server name---------->
FtpWebRequest request =
(FtpWebRequest)WebRequest.Create("ftp://*******" +"/" + "Products" + "/");
//Lists directory
request.Method = WebRequestMethods.Ftp.ListDirectory;
// set credentials
request.Credentials = new NetworkCredential("user1","1234");
//initialize response
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
//reader to read response
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
combobox1.Text = FTP_Server();
//data from file.
result = reader.ReadToEnd();
reader.Close();
response.Close();
Thanks! I didn't know if this was even possible!
Read the listing line by line:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://example.com/remote/path/");
request.Method = WebRequestMethods.Ftp.ListDirectory;
request.Credentials = new NetworkCredential("username", "password");

comboBox1.BeginUpdate();
try
{
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(stream))
    {
        while (!reader.EndOfStream)
        {
            comboBox1.Items.Add(reader.ReadLine());
        }
    }
}
finally
{
    comboBox1.EndUpdate();
}
Downloading the whole listing to a string and splitting it afterwards (as suggested by the other answer) can be pretty inefficient if there are a lot of entries.
Without knowing the exact format of your response string, my instinct would be to split it on line breaks:
string[] files = result.Split(new[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries);
Then to iterate over the individual files, adding them to your combobox1's Items:
// Depending on how many items you're adding, you may wish to prevent a repaint until the operation is finished
combobox1.BeginUpdate();
foreach(string file in files)
{
combobox1.Items.Add(file);
}
combobox1.EndUpdate();
That should take care of it for you! There is some excellent (and exhaustive) documentation on MSDN as well, which will often contain some usage examples to help you out further: https://msdn.microsoft.com/en-us/library/system.windows.forms.combobox(v=vs.110).aspx#Examples
Note that, if you end up wanting to display information from a different FTP response, you'll have to clear the combobox1 like so first: combobox1.Items.Clear();

Issue in Resumable Upload on Google Drive

I am trying to upload a large file to Google Drive using a resumable upload.
Here is the code flow.
Step 1: Create the file on Google Drive using the Drive service and initiate the resumable upload session with a PUT request:
String fileID = _DriveService.Files.Insert(googleFileBody).Execute().Id;

// Initiating resumable upload session
String UploadUrl = null;
String _putUrl = "https://www.googleapis.com/upload/drive/v2/files/" + fileID + "?uploadType=resumable";
HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(_putUrl);
httpRequest.Headers["Authorization"] = "Bearer " + AccessToken;
httpRequest.Method = "PUT";
requestStream = httpRequest.GetRequestStream();
_webResponse = (HttpWebResponse)httpRequest.GetResponse();
if (_webResponse.StatusCode == HttpStatusCode.OK)
{
    // Getting response OK
    UploadUrl = _webResponse.Headers["Location"].ToString();
}
Step 2: Upload chunks using UploadUrl. The byte array is a multiple of 256 KB, and this function is called in a loop for every chunk:
private void AppendFileData(byte[] chunk)
{
    try
    {
        HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(UploadUrl);
        httpRequest.ContentLength = chunk.Length;
        httpRequest.Headers["Content-Range"] = "bytes " + startOffset + "-" + endOffset + "/" + sourceFileSize;
        httpRequest.ContentType = MimeType;
        httpRequest.Method = "PUT";
        MemoryStream stream = new MemoryStream(chunk);
        using (System.IO.Stream requestStream = httpRequest.GetRequestStream())
        {
            stream.CopyTo(requestStream);
            requestStream.Flush();
            requestStream.Close();
        }
        HttpWebResponse httpResponse = (HttpWebResponse)(httpRequest.GetResponse());
        // Throws exception as
        // System.Net.WebException: The remote server returned an error: (308) Resume Incomplete.
        //   at System.Net.HttpWebRequest.GetResponse()
        // There is no data getting appended to the file;
        // still executing the append for remaining chunks
    }
    catch (System.Net.WebException ex)
    {
    }
}
For my last chunk, which is not a multiple of 256 KB, I am getting this error:
System.Net.WebException: The remote server returned an error: (400) Bad Request. at System.Net.HttpWebRequest.GetResponse()
What am I doing wrong in this code? Please suggest.
Thanks in advance,
Mayuresh.
Try checking that the last chunk passes the correct size, and not the entire array, as stated in this forum thread. Ali notes there that "one potential issue is this: if you are sending a byte array which is half empty for the last request (i.e. the buffer has been read in less than the chunk size)." Here is a sample implementation of resumable upload. Hope this helps.
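The fix the forum describes can be sketched like this: when reading the source in fixed 256 KB chunks, the final read usually fills only part of the buffer, so the buffer must be trimmed to the bytes actually read before sending; otherwise the declared Content-Range and the body length disagree and the API answers 400 Bad Request. (ReadChunk is an illustrative helper, not part of the Drive API; it assumes a FileStream-like source, where Read fills the buffer except at end of file. Note also that the 308 Resume Incomplete responses on intermediate chunks are the server acknowledging the received range, not a failure.)

```csharp
using System;
using System.IO;

static class ChunkReader
{
    // Read up to chunkSize bytes; return a buffer exactly as long as the data read.
    public static byte[] ReadChunk(Stream source, int chunkSize)
    {
        var buffer = new byte[chunkSize];
        int bytesRead = source.Read(buffer, 0, buffer.Length);
        if (bytesRead == chunkSize)
            return buffer;                   // full intermediate chunk
        var lastChunk = new byte[bytesRead]; // final, shorter chunk: trim it
        Array.Copy(buffer, lastChunk, bytesRead);
        return lastChunk;
    }
}
```

AppendFileData would then always receive an array whose Length matches the range declared in the Content-Range header, including for the final chunk.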

C# Stream Response from 3rd party, minimal buffering

Our ASP.NET MVC endpoint behaves as a proxy to another third-party HTTP endpoint, which returns an XML document of about 400 MB, generated dynamically.
Is there a way for ASP.NET MVC to "stream" that third-party response straight to the user of our endpoint with "minimal" buffering?
At the moment, it looks like System.Web.Mvc.Controller.File() loads the whole file into memory as the response. Not sure how I can confirm this, other than the jump in memory usage?
The IIS AppPool memory usage increases by 400 MB, which is then reclaimed by garbage collection later.
It would be nice if we could avoid System.Web.Mvc.Controller.File() loading the whole 400 MB string into memory by streaming it "almost directly" from the incoming response. Is that possible?
The mock C# LINQPad code is roughly like this:
public class MyResponseItem
{
    public Stream myStream;
    public string metadata;
}

void Main()
{
    Stream stream = MyEndPoint();
    // Now let user download this XML as System.Web.Mvc.FileResult
    System.Web.Mvc.ActionResult fileResult = System.Web.Mvc.Controller.File(stream, "text/xml");
    fileResult.Dump();
}

Stream MyEndPoint()
{
    MyResponseItem myResponse = GetStreamFromThirdParty("https://www.google.com");
    return myResponse.myStream;
}

MyResponseItem GetStreamFromThirdParty(string fullUrl)
{
    MyResponseItem myResponse = new MyResponseItem();
    System.Net.WebResponse webResponse = System.Net.WebRequest.Create(fullUrl).GetResponse();
    myResponse.myStream = webResponse.GetResponseStream();
    return myResponse;
}
You can reduce the memory footprint by not buffering and just copying the stream directly to the output stream. A quick-and-dirty example of this:
public async Task<ActionResult> Download()
{
    using (var httpClient = new System.Net.Http.HttpClient())
    {
        using (var stream = await httpClient.GetStreamAsync(
            "https://ckannet-storage.commondatastorage.googleapis.com/2012-10-22T184507/aft4.tsv.gz"))
        {
            Response.ContentType = "application/octet-stream";
            Response.Buffer = false;
            Response.BufferOutput = false;
            await stream.CopyToAsync(Response.OutputStream);
        }
        return new HttpStatusCodeResult(200);
    }
}
If you want to reduce the footprint even more, you can set a lower buffer size with the CopyToAsync(Stream, Int32) overload; the default is 81,920 bytes.
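As a small illustration of that overload, here is the same copy with a 4 KB buffer, shown between in-memory streams for simplicity; in the action above, the target would be Response.OutputStream:

```csharp
using System.IO;
using System.Threading.Tasks;

static class SmallBufferCopy
{
    // Same CopyToAsync call, but with an explicit 4 KB buffer
    // instead of the 81,920-byte default.
    public static async Task CopyAsync(Stream source, Stream target)
    {
        await source.CopyToAsync(target, 4096);
    }
}
```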
My requirement for a proxy download also needed to ensure that the source ContentType (or any header you need) is forwarded as well. (E.g. if I proxy-download a video from http://techslides.com/sample-webm-ogg-and-mp4-video-files-for-html5, I need the user to see the same browser video-player screen as when opening the link directly, not a file download with a hard-coded ContentType.)
Based on the answer by @devlead plus another post, https://stackoverflow.com/a/30164356/4684232, I adjusted the answer a little to fulfill my need. Here is my adjusted code, in case anyone has the same need.
public async Task<ActionResult> Download(string url)
{
    using (var httpClient = new System.Net.Http.HttpClient())
    {
        using (var response = await httpClient.GetAsync(url, HttpCompletionOption.ResponseHeadersRead))
        {
            response.EnsureSuccessStatusCode();
            using (var stream = await response.Content.ReadAsStreamAsync())
            {
                Response.ContentType = response.Content.Headers.ContentType.ToString();
                Response.Buffer = false;
                Response.BufferOutput = false;
                await stream.CopyToAsync(Response.OutputStream);
            }
        }
        return new HttpStatusCodeResult(200);
    }
}
P.S. HttpCompletionOption.ResponseHeadersRead is the important performance key. Without it, GetAsync will await until the whole source response stream is downloaded, which is much slower.

Copy a PDF Stream to File

I'm calling a routine in PHP (TCPDF) from C# via WebRequest using a StreamReader. The PDF file is returned as a stream and stored in a string (obviously). I know the data being returned in the string is actually a PDF file, as I've tested it in PHP. I'm having a hard time writing the string to a file and actually getting a valid PDF in C#. I know it has something to do with the way I'm trying to encode the file, but the several things I've tried have resulted in 'Not today, Padre' (i.e. they didn't work).
Here's the class I'm using to perform the request (thanks to user 'Paramiliar' for the example I'm using/borrowed/stole):
public class httpPostData
{
    WebRequest request;
    WebResponse response;

    public string senddata(string url, string postdata)
    {
        // create the request to the url passed in the parameters
        request = (WebRequest)WebRequest.Create(url);
        // set the method to POST
        request.Method = "POST";
        // set the content type and the content length
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = postdata.Length;
        // convert the post data into a byte array
        byte[] byteData = Encoding.UTF8.GetBytes(postdata);
        // get the request stream and write the data to it
        Stream dataStream = request.GetRequestStream();
        dataStream.Write(byteData, 0, byteData.Length);
        dataStream.Close();
        // get the response
        response = request.GetResponse();
        dataStream = response.GetResponseStream();
        StreamReader reader = new StreamReader(dataStream);
        // read the response
        string serverresponse = reader.ReadToEnd();
        //Console.WriteLine(serverresponse);
        reader.Close();
        dataStream.Close();
        response.Close();
        return serverresponse;
    }
} // end class httpPostData
...and my call to it
httpPostData myPost = new httpPostData();
// postData defined (not shown)
string response = myPost.senddata("http://www.example.com/pdf.php", postData);
In case it isn't clear, I'm stuck on writing the string response out as a valid .pdf file. I've tried this (thanks to user Adrian):
static public void SaveStreamToFile(string fileFullPath, Stream stream)
{
    if (stream.Length == 0) return;

    // Create a FileStream object to write a stream to a file
    using (FileStream fileStream = System.IO.File.Create(fileFullPath, (int)stream.Length))
    {
        // Fill the byte[] array with the stream data
        byte[] bytesInStream = new byte[stream.Length];
        stream.Read(bytesInStream, 0, (int)bytesInStream.Length);

        // Use the FileStream object to write to the specified file
        fileStream.Write(bytesInStream, 0, bytesInStream.Length);
    }
}
...and the call to it:
string location = "C:\\myLocation\\";
SaveStreamToFile(location, response); // <<-- this throws an error b/c 'response' is a string, not a stream. New to C# and having some basic issues with things like this
I think I'm close...a nudge in the right direction would be greatly appreciated.
You can use WebClient. Use the DownloadFile method, or the async ones.
Have fun!
Fernando.-
Sorry, I hadn't read your comments till now.
I guess you have already done this...
But this may help you (just replace URLs and paths) (from: http://msdn.microsoft.com/en-us/library/ez801hhe.aspx):
string remoteUri = "http://www.contoso.com/library/homepage/images/";
string fileName = "ms-banner.gif", myStringWebResource = null;

// Create a new WebClient instance.
WebClient myWebClient = new WebClient();

// Concatenate the domain with the Web resource filename.
myStringWebResource = remoteUri + fileName;
Console.WriteLine("Downloading File \"{0}\" from \"{1}\" .......\n\n", fileName, myStringWebResource);

// Download the Web resource and save it into the current filesystem folder.
myWebClient.DownloadFile(myStringWebResource, fileName);
Console.WriteLine("Successfully Downloaded File \"{0}\" from \"{1}\"", fileName, myStringWebResource);
Console.WriteLine("\nDownloaded file saved in the following file system folder:\n\t" + Application.StartupPath);
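Since the response body in this question is binary, one way to avoid the string round-trip entirely (the round-trip through StreamReader is what corrupts the PDF) is WebClient.UploadValues, which POSTs the form fields and returns the raw response bytes. A sketch with placeholder URL, field names, and output path:

```csharp
using System.Collections.Specialized;
using System.IO;
using System.Net;

class BinarySafePdfDownload
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Hypothetical POST fields; substitute the real postData pairs.
            var form = new NameValueCollection
            {
                { "someField", "someValue" }
            };

            // UploadValues returns the response body as a byte[], so the PDF
            // bytes are never coerced through a text encoding.
            byte[] pdfBytes = client.UploadValues("http://www.example.com/pdf.php", form);
            File.WriteAllBytes(@"C:\myLocation\output.pdf", pdfBytes);
        }
    }
}
```

The key point is that the response never becomes a string: it goes byte[] to disk, which keeps the PDF valid.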

uploading a file with WCF REST Services

I am new to WCF and REST services, and I have tried to put together an implementation from posts I found on the web, but I am still getting some problems.
So let me explain my scenario.
I have a WPF application with a feedback form, which the client can fill in, attach some screenshots to, and send. My idea was to gather all this info inside an XML file, which I am already doing successfully, and then upload this XML file to a particular folder on my server.
As I understand it, the client app has to POST the stream to the server, and then I should have an .aspx page on the server to decode the stream from the POST, reconstruct my XML file, and save it inside the folder. Correct me if I'm wrong.
At the moment I have implemented the client code as follows:
public static void UploadFile()
{
    serverPath = "http://localhost:3402/GetResponse.aspx";
    filePath = "C:\\Testing\\UploadFile\\UploadFile\\asd_asd_Feedback.xml";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(serverPath);
    //request.MediaType = "text/xml";
    request.ContentType = "text/xml";
    request.Method = "POST";
    //request.ContentLength = contentLength;
    //request.ContentType = "application/x-www-form-urlencoded";

    using (FileStream fileStream = File.OpenRead(filePath))
    using (Stream requestStream = request.GetRequestStream())
    {
        int bufferSize = 1024;
        byte[] buffer = new byte[bufferSize];
        int byteCount = 0;
        while ((byteCount = fileStream.Read(buffer, 0, bufferSize)) > 0)
        {
            requestStream.Write(buffer, 0, byteCount);
        }
    }

    string result = String.Empty;
    try
    {
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            result = reader.ReadToEnd();
        }
    }
    catch (Exception exc)
    {
    }

    if (result == "OK")
    {
    }
    else
    {
        // error...
    }
}
Now how can I pass the requestStream to the GetResponse.aspx page? And is this the correct way to go?
Thanks for your help and time
I don't understand what your code is trying to do. Have you considered actually using a WCF client and a WCF service for doing the actual upload itself?
There is a sample that does this! This blog post details how to use the programming model on the service side, and this follow-up post details how to use it on the client side. I've seen it used quite a bit for file upload and image transfer scenarios, so it might help your situation as well. The example in those blog posts is a file-upload one.
Hope this helps!
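For orientation, the service-side pattern those blog posts cover can be sketched roughly like this; the contract, operation, and folder names here are illustrative (not taken from the posts), and the endpoint would need webHttpBinding with TransferMode.Streamed to accept large bodies as a raw Stream:

```csharp
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IFeedbackUpload
{
    // The raw request body arrives as the Stream parameter;
    // the file name comes from the URI template.
    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "upload/{fileName}")]
    void UploadFile(string fileName, Stream fileContents);
}

public class FeedbackUploadService : IFeedbackUpload
{
    public void UploadFile(string fileName, Stream fileContents)
    {
        // Save the incoming stream into the server's upload folder
        // (the path is a placeholder).
        using (var target = File.Create(Path.Combine(@"C:\Uploads", fileName)))
        {
            fileContents.CopyTo(target);
        }
    }
}
```

With this approach there is no .aspx page at all; the client's existing HttpWebRequest POST loop can write straight to the service URI.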
