I have a rather confusing problem at hand. I am trying to post some JSON to a couple of .php files on a remote server.
The code is the following:
wc = new WebClient();
Uri urlToRequest = new Uri(webserviceUrl + url);
wc.UploadStringCompleted += new UploadStringCompletedEventHandler(wc_DownloadDataCompleted);
wc.Headers.Add(HttpRequestHeader.ContentType, "application/json");
var stopwatch = Stopwatch.StartNew();
wc.UploadStringAsync(urlToRequest, json);
Console.WriteLine("Async time: " + stopwatch.ElapsedMilliseconds);
I am posting the same JSON to 2 .php files on the same server (expecting different results).
When I post to the first file I get this in the console:
Async time: 2576
When I post to the other I get this in the console:
Async time: 0
The JSON I am sending is a really simple {"user":"bob","action":"get"}
When debugging, if I put a breakpoint at, for example, wc = new WebClient(); and Step Over line by line, on the first call Step Over hangs at UploadStringAsync for 2-3 seconds, but on the second call it just jumps over it (as it should).
The .php files are on the same server.
Any ideas why the same call behaves differently in the two cases? And shouldn't UploadStringAsync be asynchronous in all cases?
After reading up on the subject, I came to some conclusions.
From this answer: Why does WebClient.DownloadStringTaskAsync() block ? - new async API/syntax/CTP
wc.Proxy = null;
This solves the problem, and the first request is now down to 10-15 ms, which is acceptable.
The small delay that remains on the first request seems to be a DNS resolution issue, which still runs synchronously according to this:
C# async methods still hang UI
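For reference, the workaround applied to the code from the question looks roughly like this (webserviceUrl, url, json and the completed handler are the names used above); the important part is assigning Proxy = null before the upload call, since the automatic proxy detection is what runs synchronously:
wc = new WebClient();
wc.Proxy = null; // skip automatic proxy detection, the synchronous part
wc.Headers.Add(HttpRequestHeader.ContentType, "application/json");
wc.UploadStringCompleted += new UploadStringCompletedEventHandler(wc_DownloadDataCompleted);
wc.UploadStringAsync(new Uri(webserviceUrl + url), json);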
Is it possible to read API data as it comes in? I wrote the C# code below in my controller, but sometimes the data takes more than 2 minutes to arrive, and I was wondering whether I could show the data on my website as it comes in instead of waiting for all of it. Below is my current code:
private static async Task<List<Model>> GetFlightData()
{
    using (var client = new HttpClient())
    {
        client.Timeout = TimeSpan.FromMilliseconds(Timeout.Infinite);
        var content = await client.GetStringAsync(URL);
        var result = JsonConvert.DeserializeObject<List<Model>>(content);
        return result;
    }
}
The fastest option is to store the data statically and initialize it on start-up. The problem with this solution is that IIS may recycle your website when there is no traffic, so the data will be lost (and the next visitor will have to wait the whole 2 minutes again).
The best suggestion I have is to save the data to Redis (or another cache of your choosing) and then just pull it from there.
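As an illustration only, here is a minimal get-or-refresh sketch using the built-in System.Runtime.Caching.MemoryCache (a Redis client would follow the same shape); FlightDataCache, the "flightData" key and the 30-minute expiry are made up for the example, and the existing GetFlightData method is passed in as the fetch delegate:
using System;
using System.Collections.Generic;
using System.Runtime.Caching;
using System.Threading.Tasks;

public static class FlightDataCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static async Task<List<Model>> GetAsync(Func<Task<List<Model>>> fetch)
    {
        // Serve from the cache when possible so visitors don't wait the full two minutes.
        if (Cache.Get("flightData") is List<Model> cached)
            return cached;

        var fresh = await fetch();
        Cache.Set("flightData", fresh, DateTimeOffset.UtcNow.AddMinutes(30));
        return fresh;
    }
}
Usage in the controller would then be something like: var data = await FlightDataCache.GetAsync(GetFlightData);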
I have a Windows Service that sends json data to a MVC5 WebAPI using the WebClient. Both the Windows Service and the WebClient are currently on the same machine.
After it has run for about 15 minutes with about 10 requests per second each post takes unreasonably long to complete. It can start out at about 3 ms to complete a request and build up to take about 5 seconds, which is way too much for my application.
This is the code I'm using:
private WebClient GetClient()
{
    var webClient = new WebClient();
    webClient.Headers.Add("Content-Type", "application/json");
    return webClient;
}
public string Post<T>(string url, T data)
{
    var sw = new Stopwatch();
    try
    {
        var json = JsonConvert.SerializeObject(data);
        sw.Start();
        var result = GetClient().UploadString(GetAddress(url), json);
        sw.Stop();
        if (Log.IsVerboseEnabled())
            Log.Verbose(String.Format("json: {0}, time(ms): {1}", json, sw.ElapsedMilliseconds));
        return result;
    }
    catch (Exception)
    {
        sw.Stop();
        Log.Debug(String.Format("Failed to send to webapi, time(ms): {0}", sw.ElapsedMilliseconds));
        return "Failed to send to webapi";
    }
}
The result of the request isn't really of importance to me.
The serialized data size varies from just a few bytes to about 1 kB but that does not seem to affect the time it takes to complete the request.
The API controllers that receive the requests complete their execution almost instantly (0-1 ms).
From various questions here on SO and some blog posts I've seen people suggesting the use of HttpWebRequest instead to be able to control options of the request.
Using HttpWebRequest I've tried these things that did not work:
Setting the proxy to an empty proxy.
Setting the proxy to null.
Setting the ServicePointManager.DefaultConnectionLimit to an arbitrary large number.
Disable KeepAlive (I don't want to but it was suggested).
Not opening the response stream at all (had some impact but not enough).
Why are the requests taking so long? Any help is greatly appreciated.
It turned out to be another part of the program that was taking all the available connections; that is, I was out of sockets and had to wait for a free one.
I found this out by monitoring ASP.NET Applications\Requests/Sec in Performance Monitor.
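Unrelated to the actual fix, but worth noting: WebClient is IDisposable, so disposing each instance after the request (for example by wrapping the upload in a using block) returns its resources promptly and makes this kind of connection starvation less likely. A hedged sketch, reusing the Post<T> shape from the question:
public string Post<T>(string url, T data)
{
    var json = JsonConvert.SerializeObject(data);
    using (var client = GetClient()) // dispose the WebClient once the request is done
    {
        return client.UploadString(GetAddress(url), json);
    }
}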
So I am trying to query an API that's accessible via HTTP (no authorization). To speed things up, I tried using a Parallel.ForEach loop, but it seems that the longer it runs, the more errors pop up.
More and more requests fail to be retrieved. I know the API provider isn't limiting me, because I can request the very same blocked URLs in my browser. Also, the failed URLs are different each time, so it doesn't seem to be a case of malformed requests.
The errors don't occur when I use a single-threaded foreach loop.
My malfunctioning loop is below:
Parallel.ForEach(this.urlArray, singleUrl =>
{
    this.apiResponseBlob = new System.Net.WebClient().DownloadString(singleUrl);
    this.responsesDictionary.Add(singleUrl, apiResponseBlob);
});
The normal foreach loop works fine but is very slow:
foreach (string singleUrl in this.urlArray)
{
    this.apiResponseBlob = new System.Net.WebClient().DownloadString(singleUrl);
    this.responsesDictionary.Add(singleUrl, apiResponseBlob);
}
Also: I had a solution in PHP where I spawned several "fetchers" simultaneously, and it never hung up. It seems strange to me that PHP would handle multithreaded retrieval better than C#, so I must obviously be missing something.
What is the fastest way to query the API, without these strange failures?
Hi, did you try to speed up your code with async downloads, as in this question (see the marked answer): DownloadStringAsync wait for request completion?
You could loop through your URIs and get a callback for each successful download.
EDIT: I have seen that you assign every download result to this.apiResponseBlob. With multithreading, every thread tries to write to that shared variable, which could be the reason for your bug. Try using a local variable for the result, or use a lock so that only one thread can write to the shared state at a time:
http://msdn.microsoft.com/de-de/library/c5kehkcz.aspx
For example:
Parallel.ForEach(this.urlArray, singleUrl =>
{
    var apiResponseBlob = new System.Net.WebClient().DownloadString(singleUrl);
    lock (this.responsesDictionary) // lock on one shared object, not on a per-iteration string
    {
        this.responsesDictionary.Add(singleUrl, apiResponseBlob);
    }
});
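As a further illustration (not part of the original suggestion), a single shared HttpClient combined with Task.WhenAll and a ConcurrentDictionary sidesteps both the shared field and the explicit lock; FetchAllAsync is a made-up helper name:
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

static async Task<ConcurrentDictionary<string, string>> FetchAllAsync(IEnumerable<string> urls)
{
    var client = new HttpClient();                   // one client reused for every request
    var responses = new ConcurrentDictionary<string, string>();

    var tasks = urls.Select(async url =>
    {
        var body = await client.GetStringAsync(url); // downloads run concurrently
        responses.TryAdd(url, body);                 // thread-safe, no explicit lock needed
    });

    await Task.WhenAll(tasks);
    return responses;
}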
I have a WinRT application (8.0, not 8.1, so I can't use Windows.Web.HttpClient) where I am uploading large files to a site. I am using System.Net.Http.HttpClient with the System.Net.Http.Handlers.ProgressMessageHandler from the Microsoft.AspNet.WebApi.Client nuget package for the purposes of tracking progress.
No matter how big a file I upload, I always seem to get the HttpSendProgress event called once, and only once, with 100% progress (and totalBytes == sentBytes). However the file doesn't actually complete uploading to the site until sometime after the event fired, depending on file size and whether I've limited the upload speed etc. The upload does work, but the progress reporting is useless.
I used a network monitoring tool and could see the data being transferred slowly after the progress event was called (when I let the app run after stopping on a break point) - but I only got the event raised one time and with 100% progress before the upload finished.
I presume the HttpClient is writing to some kind of buffer, which happens much more quickly than the actual upload, but I can't figure out how to change or prevent that, or what the point of the ProgressMessageHandler class is if it always works this way.
At the moment the code I'm using looks something like the following:
public static async Task<string> UploadDataAsync(string uploadUrl, byte[] data, string contentTypeHeader, string oauthHeader, Action<long, long?> progressCallback)
{
    var ph = new System.Net.Http.Handlers.ProgressMessageHandler();
    if (progressCallback != null)
    {
        ph.HttpSendProgress += (sender, args) =>
        {
            progressCallback(args.BytesTransferred, args.TotalBytes);
        };
    }
    var client = HttpClientFactory.Create(ph);
    client.Timeout = new TimeSpan(0, 20, 0);
    if (!String.IsNullOrEmpty(oauthHeader))
        client.DefaultRequestHeaders.Add("Authorization", oauthHeader);
    var content = new ByteArrayContent(data);
    content.Headers.TryAddWithoutValidation("Content-Type", contentTypeHeader);
    var postResponse = await client.PostAsync(new Uri(uploadUrl), content);
    var result = await postResponse.Content.ReadAsStringAsync();
    if (!postResponse.IsSuccessStatusCode)
    {
        throw new OAuthException(result);
    }
    return result;
}
I ran into the same problem, and the issue was using ByteArrayContent for the Post call. That type doesn't get chunked by the underlying HttpClient (that is my vague understanding of how this works).
You need to use StreamContent to get progress updates. I used it with a FileStream, which worked perfectly. I've seen reports that MemoryStream does not give progress. YMMV.
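A hedged sketch of the suggested change, adapted to the question's UploadDataAsync: pass a readable Stream in place of the byte[] (dataStream below is a hypothetical parameter standing in for however the file is opened) and wrap it in StreamContent so the handler can observe the upload as it is written rather than after it has been buffered:
var content = new StreamContent(dataStream);  // instead of new ByteArrayContent(data)
content.Headers.TryAddWithoutValidation("Content-Type", contentTypeHeader);
var postResponse = await client.PostAsync(new Uri(uploadUrl), content);
// the status check and ReadAsStringAsync stay the same as before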
I recommend chaining the task returned by PostAsync with ContinueWith and using TaskContinuationOptions.OnlyOnRanToCompletion, to give the ProgressMessageHandler the chance to report progress exactly as you want.
Hope this helps.
var postTask = client.PostAsync(new Uri(uploadUrl), content);
postTask.ContinueWith(task =>
{
    if (task.Result.IsSuccessStatusCode)
    {
        // continue here once the post has actually completed successfully
    }
}, TaskContinuationOptions.OnlyOnRanToCompletion);
Earlier I made an HttpWebRequest that worked perfectly fine, and my StreamReader read the HTML of the website perfectly.
But all of a sudden, after I had tested its functionality and confirmed that it worked many times, the program hangs when it reaches the StreamReader line.
I have tried removing this line, and the code continued.
The thing is, I tried inputting a different website from the one I need to use (I put in www.google.com) and it worked perfectly fine. So my conclusion is that it is only the website I need that I can no longer access, which makes me think the other end (the website) is cancelling my connection or blocking me or something. BUT! The HttpWebRequest itself doesn't hang or anything, which must mean it successfully established a request to the website?
Enough chit-chat, here's the code:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("website here");
MessageBox.Show("1"); //This is shown.
string HTMLLink = (new StreamReader(request.GetResponse().GetResponseStream())).ReadToEnd(); //This is where the program hangs....
MessageBox.Show("2"); //This is not shown! Below this isn't being executed.
if (HTMLLink.Length > 0)
{
    HTMLLink = HTMLLink.Substring(HTMLLink.IndexOf("uuu"), HTMLLink.Length - HTMLLink.IndexOf("uuu"));
    HTMLLink = HTMLLink.Substring(0, HTMLLink.IndexOf("\" TARGET="));
    request = (HttpWebRequest)WebRequest.Create(HTMLLink);
    string HTML = (new StreamReader(request.GetResponse().GetResponseStream())).ReadToEnd();
    if (HTML.Length > 0 && HTML.Contains(" </script><br><br><br>") && HTML.Contains(" <br><br><script "))
    {
        HTML = HTML.Substring(HTML.IndexOf(" </script><br><br><br>") + 22, HTML.IndexOf("<br><br><script "));
        HTML = HTML.Substring(0, HTML.IndexOf("<br><br><script "));
        HTML = HTML.Replace("\r\n", "");
        HTML = HTML.Replace("\n", "");
        HTML = HTML.Replace("<br>", "\r\n");
        HTML = HTML.Replace("<BR>", "\r\n");
        HTML = HTML.Replace("<br />", "\r\n");
        HTML = HTML.Replace("<BR />", "\r\n");
        textBox.Text = HTML;
    }
}
Please keep in mind that it worked perfectly earlier, then all of a sudden it started hanging, and that it works fine with www.google.com.
And by the way, yes, I have done many searches. No useful results.
I have already tried setting a timeout; the request does time out.
Maybe the website has blocked my program, thinking it's a spider? What then?
Every time I reach the StreamReader (no matter how I set it up) it starts to hang, and it never delivers any result.
This ONLY happens with lyrics007.com, which is the exact website I need. It works fine with Google.
Help, please!
Thanks in advance!
WebRequest.GetResponse() is a blocking call. It will wait until it can successfully connect and receive the response before it returns control to the caller, or will throw an exception if unsuccessful. This behaviour can't be modified.
You usually don't want your application to sit waiting for something to happen, though, so you usually delegate the GetResponse() call to another thread so that you can continue doing other work in the current thread.
The usual way to overcome this problem is to make the call asynchronously. Rather than calling GetResponse(), you call BeginGetResponse(), passing in a function to be executed when the operation completes (e.g. the remainder of your current method, plus a call to EndGetResponse()). Control returns to the caller while the response is waited for on a background thread, handled for you automatically by the .NET thread pool.
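A minimal sketch of that pattern, reusing the placeholder URL from the question; the callback runs on a thread-pool thread when the response arrives, so any UI update has to be marshalled back to the UI thread:
var request = (HttpWebRequest)WebRequest.Create("website here");
request.BeginGetResponse(asyncResult =>
{
    using (var response = request.EndGetResponse(asyncResult))
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string html = reader.ReadToEnd();
        // parse html here; marshal back to the UI thread before touching textBox.Text
    }
}, null);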
The request is not sent until the call to GetResponse. If that is where it is hanging, I would be inclined to say the site is not responding. Did you try using a web browser to connect to that URL and see if it works?