Assume I have the following code:
private string PostData(string functionName, string parsedContent)
{
    string url = // some url;
    var http = (HttpWebRequest)WebRequest.Create(new Uri(url));
    http.Accept = "application/json";
    http.ContentType = "application/json";
    http.Method = "POST";
    http.Timeout = 15000; // 15 seconds
    byte[] bytes = Encoding.UTF8.GetBytes(parsedContent);
    using (Stream newStream = http.GetRequestStream())
    {
        newStream.Write(bytes, 0, bytes.Length);
    }
    using (WebResponse response = http.GetResponse())
    {
        using (var stream = response.GetResponseStream())
        using (var sr = new StreamReader(stream))
        {
            var content = sr.ReadToEnd();
            return content;
        }
    }
}
I set up a breakpoint over this line of code:
using (Stream newStream = http.GetRequestStream())
before http.GetRequestStream() gets executed. Here is a screenshot of my active threads:
This whole method is running in background thread with ThreadId = 3 as you can see.
After pressing F10 we get http.GetRequestStream() method executed. And here is an updated screenshot of active threads:
As you can see, now we have one extra active thread that is in a waiting state. Probably the method http.GetRequestStream() spawned it. Everything is fine, but this thread keeps waiting like that for the whole app lifecycle, which does not seem to be the intended behaviour.
Am I misusing GetRequestStream somehow?
If I inspect it with ILSpy, it looks like the request is sent asynchronously. That would explain the extra thread.
Looking a little bit deeper, HttpWebRequest creates a static TimerQueue with one thread running a never-ending loop that has a Monitor.WaitAny in it. Every web request in the AppDomain registers a timer callback for timeout handling, and all those callbacks are handled by that thread. Because the TimerQueue is static, that instance will never get garbage collected, and therefore it keeps hold of the thread.
It does register for the AppDomain.Unload event, so if that fires it will clean up its resources, including any threads.
Do notice that these are all internal classes and those implementation details might change at any time.
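The pattern behind that lingering thread can be sketched roughly like this (a simplified illustration with invented names, not the real TimerQueue code):

```csharp
using System.Threading;

// Simplified illustration of why the thread outlives the requests: a static
// holder keeps a background thread alive for the AppDomain's lifetime,
// waiting for timeout work to fire.
static class TimerQueueSketch
{
    static readonly object s_lock = new object();
    static readonly Thread s_thread;

    static TimerQueueSketch()
    {
        s_thread = new Thread(() =>
        {
            lock (s_lock)
            {
                while (true)
                {
                    // Wakes up when a timeout callback is registered or due.
                    Monitor.Wait(s_lock, 1000);
                    // ... fire any due timeout callbacks here ...
                }
            }
        });
        s_thread.IsBackground = true;
        s_thread.Start(); // started on first use; rooted by the static field, never collected
    }

    public static void Register(/* timeout callback */) { }
}
```

Since the thread is rooted by a static field, it stays alive until the AppDomain unloads, which matches the behaviour seen in the debugger.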
I am currently working on a proof of concept application using the Xamarin free trial, and I have hit a rather interesting little problem... Here is the code I am using within a Portable Class Library:
using System;
using System.Net;
using Newtonsoft.Json;
namespace poc
{
    public class CurrentWeatherInformation
    {
        public string WeatherText { get; set; }

        public CurrentWeatherInformation(string cityName)
        {
            // api.openweathermap.org/data/2.5/weather?q=Leeds
            var request = (HttpWebRequest)WebRequest.Create(string.Format("http://api.openweathermap.org/data/2.5/weather?q={0}", cityName));
            request.ContentType = "application/json";
            request.Method = "GET";

            object state = request;
            var ar = request.BeginGetResponse(WeatherCallbackMethod, state);
            var waitHandle = ar.AsyncWaitHandle as System.Threading.ManualResetEvent;
            waitHandle.WaitOne();
        }

        public void WeatherCallbackMethod(IAsyncResult ar)
        {
            object state = ar.AsyncState;
            var request = state as HttpWebRequest;
            var response = request.EndGetResponse(ar);
            var data = new System.IO.StreamReader(response.GetResponseStream()).ReadToEnd();
            this.WeatherText = data;
        }
    }
}
Essentially, I just want to call against a webservice and get a response, but I note with Xamarin that I am unable to do this using the good old GetResponse() method, and have to use BeginGetResponse() and EndGetResponse() instead, with the old IAsyncResult pattern. Shizzle.
Anyway, my problem is that the code following my waiting on the waitHandle is executing BEFORE the code in the callback, and I don't see why. This is precisely what we have the wait handle for!
Can anyone spot what I am sure will prove to be a simple mistake by a simpleton?
On Windows Phone you are forced to use the async API. When you try to wait synchronously on the main thread for the result of an async method, you can end up in a deadlock.
Use async and await when you do expensive things. It's the common pattern for doing asynchronous work.
Take a look at some tutorials:
https://visualstudiomagazine.com/articles/2013/10/01/asynchronous-operations-with-xamarin.aspx
http://developer.xamarin.com/recipes/android/web_services/consuming_services/call_a_rest_web_service/
How to implement Android callbacks in C# using async/await with Xamarin or Dot42?
https://github.com/conceptdev/xamarin-forms-samples/blob/master/HttpClient/HttpClientDemo/GeoNamesWebService.cs
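As a rough sketch of that pattern applied to the question's class (this assumes HttpClient is available on the target platform, e.g. via a NuGet package; GetStringAsync replaces the manual Begin/End plumbing and the wait handle):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

namespace poc
{
    public class CurrentWeatherInformation
    {
        public string WeatherText { get; private set; }

        // Constructors cannot be async, so expose an async factory method
        // instead of blocking inside the constructor.
        public static async Task<CurrentWeatherInformation> CreateAsync(string cityName)
        {
            var info = new CurrentWeatherInformation();
            using (var client = new HttpClient())
            {
                // One awaitable call performs the request and reads the body.
                info.WeatherText = await client.GetStringAsync(
                    string.Format("http://api.openweathermap.org/data/2.5/weather?q={0}", cityName));
            }
            return info;
        }
    }
}
```

A caller would then write `var weather = await CurrentWeatherInformation.CreateAsync("Leeds");` instead of blocking on WaitOne, so no thread is held up while the request is in flight.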
I'm coding a multithreaded web crawler that performs a lot of concurrent HttpWebRequests every second using hundreds of threads. The application works great, but sometimes (randomly) one of the requests hangs on GetResponseStream(), completely ignoring the timeout (this happens when I perform hundreds of requests concurrently), so the crawling process never ends. The strange thing is that with Fiddler this never happens and the application never hangs. It is really hard to debug because it happens randomly.
I've tried to set
Keep-Alive = false
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
but I still get the strange behavior. Any ideas?
Thanks
HttpWebRequest code:
public static string RequestHttp(string url, string referer, ref CookieContainer cookieContainer_0, IWebProxy proxy)
{
    string str = string.Empty;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
    request.UserAgent = randomuseragent();
    request.ContentType = "application/x-www-form-urlencoded";
    request.Accept = "*/*";
    request.CookieContainer = cookieContainer_0;
    request.Proxy = proxy;
    request.Timeout = 15000;
    request.Referer = referer;
    //request.ServicePoint.MaxIdleTime = 15000;
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        using (Stream responseStream = response.GetResponseStream())
        {
            List<byte> list = new List<byte>();
            byte[] buffer = new byte[0x400];
            int count = responseStream.Read(buffer, 0, buffer.Length);
            while (count != 0)
            {
                list.AddRange(buffer.ToList<byte>().GetRange(0, count));
                if (list.Count >= 0x100000)
                {
                    break;
                }
                count = 0;
                try
                {
                    // HERE IT HANGS SOMETIMES
                    count = responseStream.Read(buffer, 0, buffer.Length);
                    continue;
                }
                catch
                {
                    continue;
                }
            }
            //responseStream.Close();
            int num2 = 0x200 * 0x400;
            if (list.Count >= num2)
            {
                list.RemoveRange((num2 * 3) / 10, list.Count - num2);
            }
            byte[] bytes = list.ToArray();
            str = Encoding.Default.GetString(bytes);
            Encoding encoding = Encoding.Default;
            if (str.ToLower().IndexOf("charset=") > 0)
            {
                encoding = GetEncoding(str);
            }
            else
            {
                try
                {
                    encoding = Encoding.GetEncoding(response.CharacterSet);
                }
                catch
                {
                }
            }
            str = encoding.GetString(bytes);
            // response.Close();
        }
    }
    return str.Trim();
}
The Timeout property "Gets or sets the time-out value in milliseconds for the GetResponse and GetRequestStream methods." The default value is 100,000 milliseconds (100 seconds).
The ReadWriteTimeout property, "Gets or sets a time-out in milliseconds when writing to or reading from a stream." The default is 300,000 milliseconds (5 minutes).
You're setting Timeout, but leaving ReadWriteTimeout at the default, so your reads can take up to five minutes before timing out. You probably want to set ReadWriteTimeout to a lower value. You might also consider limiting the size of data that you download. With my crawler, I'd sometimes stumble upon an unending stream that would eventually result in an out of memory exception.
Something else I noticed when crawling is that sometimes closing the response stream will hang. I found that I had to call request.Abort to reliably terminate a request if I wanted to quit before reading the entire stream.
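A minimal sketch of those settings together (the URL and timeout values are illustrative):

```csharp
using System;
using System.Net;

class TimeoutSketch
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/");
        request.Timeout = 15000;          // GetResponse / GetRequestStream: 15 s instead of 100 s
        request.ReadWriteTimeout = 15000; // each Read/Write on the stream: 15 s instead of 5 min

        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            using (var stream = response.GetResponseStream())
            {
                var buffer = new byte[0x400];
                int total = 0, count;
                while ((count = stream.Read(buffer, 0, buffer.Length)) != 0)
                {
                    total += count;
                    if (total >= 0x100000) break; // cap the download size as well
                }
            }
        }
        catch (WebException)
        {
            request.Abort(); // reliably terminates a hung request
        }
    }
}
```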
There is nothing apparent in the code you provided.
Why did you comment response.Close() out?
Documentation hints that connections may run out if not explicitly closed. Disposing the response may close the connection, but just releasing all the resources is not optimal, I think. Closing the response will also close the stream, so that is covered.
The system hanging without a timeout could simply be a network issue making the response object a dead duck, or the problem could be due to the high number of threads resulting in memory fragmentation.
Looking at anything that may produce a pattern may help find the source:
How many threads are typically running (can you bundle request sets into fewer threads)?
How was the network performing at the time the thread stopped?
Is there a specific count or range of requests at which it happens?
What data was processed last when it happened (are there any specific control characters or sequences of data that can upset the stream)?
I'd like to ask more questions, but I don't have enough reputation, so I can only answer.
Good luck!
Below is some code that does something similar; it's also used to access multiple web sites, with each call in a different task. The difference is that I only read the stream once and then parse the results. That might be a way to get around the stream reader locking up randomly, or at least make it easier to debug.
try
{
    _webResponse = (HttpWebResponse)_request.GetResponse();
    if (_request.HaveResponse)
    {
        if (_webResponse.StatusCode == HttpStatusCode.OK)
        {
            var _stream = _webResponse.GetResponseStream();
            using (var _streamReader = new StreamReader(_stream))
            {
                string str = _streamReader.ReadToEnd();
                // parse str here ...
            }
        }
    }
}
catch (WebException)
{
    // handle or log the failure
}
I have this code which is meant to make an async call, but it does not. Please have a look at it and let me know where it is going wrong.
try
{
    byte[] bytes;
    Stream objRequestStream = null;
    bytes = System.Text.Encoding.ASCII.GetBytes(GetJSONforGetMenuDetails(Id, MenuIds));
    wReq = (HttpWebRequest)WebRequest.Create(new Uri("http://" + MobileWLCUrl + urlCreateCacheAPI));
    wReq.ContentLength = bytes.Length;
    wReq.ContentType = "text/x-json";
    wReq.ServicePoint.Expect100Continue = false;
    wReq.Method = "POST";
    objRequestStream = wReq.GetRequestStream();
    objRequestStream.Write(bytes, 0, bytes.Length);
    objRequestStream.Close();
    wReq.BeginGetResponse(new AsyncCallback(FinishWebRequest), null);
    //resp = WebAccess.GetWebClient().UploadString("http://" + MobileWLCUrl + urlCreateCacheAPI, GetJSONforGetMenuDetails(Id, MenuIds));
    //EngineException.CreateLog("Cache Created (for Menus: " + MenuIds + ") in API for LocationId: " + Id);
}
catch (Exception ex) { EngineException.HandleException(ex); }
void FinishWebRequest(IAsyncResult result)
{
    WebResponse wResp = wReq.EndGetResponse(result) as WebResponse;
    StreamReader sr = new StreamReader(wResp.GetResponseStream());
    String res = sr.ReadToEnd();
    EngineException.CreateLog("Cache Created (for Menus: " + MenuIds + ") in API for LocationId: " + LocId);
}
Where is it going wrong? When I debug it, it waits for the call to finish before continuing, but that should not happen.
BeginGetResponse is documented to include a synchronous portion:
The BeginGetResponse method requires some synchronous setup tasks to complete (DNS resolution, proxy detection, and TCP socket connection, for example) before this method becomes asynchronous. As a result, this method should never be called on a user interface (UI) thread because it might take some time, typically several seconds. In some environments where the webproxy scripts are not configured properly, this can take 60 seconds or more.
Apart from that, if you call FinishWebRequest (technically, if you call EndGetResponse) before the request has had time to complete, EndGetResponse will block.
This is expected, since EndGetResponse is required to return an object that you can get the response data from; it cannot return such an object until the request has completed.
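One way to avoid blocking anywhere is to wrap the Begin/End pair in a Task and attach a continuation (a sketch; the request variable stands in for wReq from the question):

```csharp
using System.IO;
using System.Net;
using System.Threading.Tasks;

class FromAsyncSketch
{
    // Wraps the APM pair so nothing blocks while the request completes;
    // the continuation runs once the response is actually available.
    static Task<string> GetResponseBodyAsync(HttpWebRequest wReq)
    {
        return Task.Factory.FromAsync(wReq.BeginGetResponse, wReq.EndGetResponse, null)
            .ContinueWith(t =>
            {
                using (var response = t.Result)
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    return reader.ReadToEnd();
                }
            });
    }
}
```

The caller can await (or attach further continuations to) the returned Task instead of calling EndGetResponse early and blocking on it.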
I'm running into a bit of a problem when trying to call ReadToEnd on an asynchronous web request. Here's the code:
public void GetHTTP(string http)
{
    request = (HttpWebRequest)WebRequest.Create(http);
    RequestState rs = new RequestState(); // Class to hold state. Can ignore...
    Setup(); // contains statements such as request.Accept = ...;
    rs.Request = request;
    IAsyncResult r = (IAsyncResult)request.BeginGetResponse(new AsyncCallback(ResponseSetup), rs);
    allDone.WaitOne();
    Referer = http; // Can ignore this...
}
private void ResponseSetup(IAsyncResult ar)
{
    RequestState rs = (RequestState)ar.AsyncState;
    WebRequest request = rs.Request;
    WebResponse response = request.EndGetResponse(ar);
    Stream ResponseStream = response.GetResponseStream();
    rs.ResponseStream = ResponseStream;
    IAsyncResult iarRead = ResponseStream.BeginRead(rs.BufferRead, 0, BUFFER_SIZE, new AsyncCallback(ReadCallBack), rs);
    StreamReader reader = new StreamReader(ResponseStream);
    information = reader.ReadToEnd();
    // The problem is right here: ReadToEnd.
}
When trying to invoke the readToEnd method, I get this error message: The stream does not support concurrent I/O read or write operations.
I've searched, but I could not find a solution to this problem. How can it be fixed?
That's because you're trying to do two reads. The call to BeginRead initiates a read operation. Then ReadToEnd initiates another read operation on the same stream.
I think what you want is just the ReadToEnd. Remove the call to ResponseStream.BeginRead.
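A sketch of the callback with the extra read removed (keeping the names from the question; allDone and information are assumed to be the fields used by GetHTTP):

```csharp
using System;
using System.IO;
using System.Net;

// Sketch of the corrected callback: one read operation only.
private void ResponseSetup(IAsyncResult ar)
{
    RequestState rs = (RequestState)ar.AsyncState;
    WebRequest request = rs.Request;

    using (WebResponse response = request.EndGetResponse(ar))
    using (Stream responseStream = response.GetResponseStream())
    using (var reader = new StreamReader(responseStream))
    {
        information = reader.ReadToEnd(); // the only read on the stream
    }
    allDone.Set(); // let GetHTTP's WaitOne() return
}
```

With BeginRead gone, ReadToEnd is the only consumer of the stream, so the concurrent I/O error cannot occur.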
here's my method:
private static void UpdatePref(List<EmailPref> prefList)
{
    if (prefList.Count > 0)
    {
        foreach (EmailPref pref in prefList)
        {
            UpdateEmailRequest updateRequest = new UpdateEmailRequest(pref.ID.ToString(), pref.Email, pref.ListID.ToString());
            UpdateEmailResponse updateResponse = (UpdateEmailResponse)updateRequest.SendRequest();
            if (updateResponse.Success)
            {
                Console.WriteLine(String.Format("Update Successful. ListID:{0} Email:{2} ID:{1}", pref.ListID, pref.Email, pref.ID));
                continue;
            }
            Console.WriteLine(String.Format("Update Unsuccessful. ListID:{0} Email:{2} ID:{1}\n", pref.ListID, pref.Email, pref.ID));
            Console.WriteLine(String.Format("Error:{0}", updateResponse.ErrorMessage));
        }
        Console.WriteLine("Updates Complete.");
    }
    Console.WriteLine("Process ended. No records found to update");
}
The list has around 84 valid records that it loops through, sending an API request for each. But it stops on the 3rd API call and only processes 2 out of the 84 records. When I debug to see what's happening, I only see that it stops in my SendRequest method without spitting out any error. It stops at GetRequestStream, and when I step to that line and try to keep stepping, it just stops and my application stops running without any error!
HttpWebRequest request = CreateWebRequest(requestURI, data.Length);
request.ContentLength = data.Length;
request.KeepAlive = false;
request.Timeout = 30000;
// Send the Request
requestStream = request.GetRequestStream();
wtf? Eventually if I let it keep running I do get the error "The Operation Has Timed Out". But then why did the first 2 calls go through and this one timed out? I don't get it.
Also, a second question. Is it inefficient to have it create a new object inside my foreach for sending and receiving? But that's how I stubbed out those classes and required that an email, ListID and so forth be a requirement to send that type of API call. I just didn't know if it's fine or not efficient to create a new instance through each iteration in the foreach. Might be common but just felt weird and inefficient to me.
EDIT: It seems you answered your own question already in the comments.
I don't have personal experience with this, but it seems you need to call Close on the HTTP web response after you've fetched it. There's a default limit of 2 on the number of open connections per host, and a connection isn't freed until you Close(). See http://blogs.msdn.com/feroze_daud/archive/2004/01/21/61400.aspx, which gives the following code to demonstrate the symptoms you're seeing.
for (int i = 0; i < 3; i++) {
    HttpWebRequest r = WebRequest.Create("http://www.microsoft.com") as HttpWebRequest;
    HttpWebResponse w = r.GetResponse() as HttpWebResponse;
    // w is never closed, so the third iteration hangs waiting for a free connection.
}
One possibility for it timing out is that the server you're talking to is throttling you. You might try inserting a delay (a second, maybe?) after each update.
Assuming that UpdateEmailRequest and UpdateEmailResponse are somehow derived from WebRequest and WebResponse respectively, it's not particularly inefficient to create the requests the way you're doing it. That's pretty standard. However, note that WebResponse is IDisposable, meaning that it probably allocates unmanaged resources, and you should dispose of it, either by calling the Dispose method directly or by wrapping it in a using block. Something like this:
UpdateEmailResponse updateResponse = (UpdateEmailResponse)updateRequest.SendRequest();
try
{
    if (updateResponse.Success)
    {
        Console.WriteLine(String.Format("Update Successful. ListID:{0} Email:{2} ID:{1}", pref.ListID, pref.Email, pref.ID));
        continue;
    }
    Console.WriteLine(String.Format("Update Unsuccessful. ListID:{0} Email:{2} ID:{1}\n", pref.ListID, pref.Email, pref.ID));
    Console.WriteLine(String.Format("Error:{0}", updateResponse.ErrorMessage));
}
finally
{
    updateResponse.Dispose();
}
I guess it's possible that not disposing of the response objects keeps an open connection to the server, and the server is timing out because you have too many open connections.
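The same cleanup can be written more compactly with a using block (a sketch using the question's types; it assumes UpdateEmailResponse implements IDisposable, e.g. by wrapping a WebResponse):

```csharp
// Equivalent cleanup inside the foreach loop; Dispose runs automatically
// when control leaves the block, including via continue.
using (var updateResponse = (UpdateEmailResponse)updateRequest.SendRequest())
{
    if (updateResponse.Success)
    {
        Console.WriteLine(String.Format("Update Successful. ListID:{0} Email:{2} ID:{1}",
            pref.ListID, pref.Email, pref.ID));
        continue; // the response is still disposed before the next iteration
    }
    Console.WriteLine(String.Format("Error:{0}", updateResponse.ErrorMessage));
}
```

Either form guarantees the underlying connection is released promptly, which avoids hitting the per-host connection limit.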