We need to call out to a 3rd party to retrieve a value using REST; however, if we do not receive a response within 10ms, I want to use a default value and continue processing.
I'm leaning towards using an asynchronous WebRequest to do this, but I was wondering if there is a trick to doing it with a synchronous request.
Any advice?
If you are making a request and waiting on it to return, I'd say stay synchronous - there's no reason to do an async request if you're not going to do anything else or stay responsive while waiting.
For a sync call:
WebRequest request = WebRequest.Create("http://something.somewhere/url");
WebResponse response = null;
request.Timeout = 10000; // timeout in milliseconds (10 seconds here)

try
{
    response = request.GetResponse();
}
catch (WebException e)
{
    if (e.Status == WebExceptionStatus.Timeout)
    {
        // fall back to the default value and continue processing
    }
}
If doing async:
You will have to call Abort() on the request object yourself - the Timeout property is not honoured for asynchronous requests, so there's no built-in way to enforce a hard timeout; you have to track the time and abort the request when it expires, as sketched below.
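A minimal sketch of that approach, assuming a 10-second limit and a placeholder URL (uses System.Net and System.Threading):

WebRequest request = WebRequest.Create("http://something.somewhere/url");

IAsyncResult result = request.BeginGetResponse(ar =>
{
    try
    {
        using (WebResponse response = request.EndGetResponse(ar))
        {
            // use the response
        }
    }
    catch (WebException)
    {
        // Abort() surfaces here as WebExceptionStatus.RequestCanceled - use the default value instead
    }
}, null);

// Enforce the timeout ourselves: abort the request if it has not completed in time.
ThreadPool.RegisterWaitForSingleObject(
    result.AsyncWaitHandle,
    (state, timedOut) => { if (timedOut) request.Abort(); },
    null,
    TimeSpan.FromSeconds(10),
    true);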
You could encapsulate your call to the 3rd party in a WebService. You could then call this WebService synchronously from your application - the web service reference has a simple Timeout property that you can set to 10 seconds or whatever you need.
Your call to get the 3rd-party data from your WebService will then throw a WebException once the timeout has elapsed; you catch it and use a default value instead, along the lines of the sketch below.
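A minimal sketch of that fallback; ThirdPartyProxy and GetValue are illustrative names for the generated web service reference, not a real API:

var proxy = new ThirdPartyProxy();   // generated web service reference (illustrative name)
proxy.Timeout = 10000;               // milliseconds

string value;
try
{
    value = proxy.GetValue();        // illustrative method on the wrapper service
}
catch (WebException)
{
    value = "default";               // timed out (or failed) - continue with the default value
}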
EDIT: Philip's response above is better. RIF.
Related
I am making HTTP requests to an API which has a limit on how many requests can be made.
These HTTP requests are issued in loops and very quickly, which results in the HttpClient sometimes throwing a '10030 App Rate Limit Exceeded' exception - a 429 HTTP error for too many requests.
I can solve this by doing a Thread.Sleep between each call, however this slows down the application and is not reasonable.
Here is the code I am using:
public static async Task<List<WilliamHillData.Event>> GetAllCompetitionEvents(string compid)
{
    string res = "";
    try
    {
        using (HttpClient client = new HttpClient())
        {
            client.DefaultRequestHeaders.TryAddWithoutValidation("Content-Type", "application/json");
            client.DefaultRequestHeaders.TryAddWithoutValidation("apiKey", "KEY");

            using (HttpResponseMessage response = await client.GetAsync("https://gw.whapi.com/v2/sportsdata/competitions/" + compid + "/events/?&sort=startDateTime"))
            {
                res = await response.Content.ReadAsStringAsync();
            }
        }

        JObject jobject = JObject.Parse(res);
        List<WilliamHillData.Event> list = jobject["events"].ToObject<List<WilliamHillData.Event>>();
        return list;
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
Is there a way I can find out how many requests can be made per second/minute? And, once the limit has been reached, could I throttle down or Thread.Sleep until the limit resets, instead of doing a Thread.Sleep on every call, so the app is only slowed down when it is actually required?
Cheers.
Is there a way I can find out how many requests can be made per second/minute?
Asking the owner of the API or reading the API docs would help. You can also probe it by testing, but you never know whether there are other limits - or even a ban - for exceeding them.
I can solve this by doing a Thread.Sleep between each call, however this slows down the application and is not reasonable.
You need something like it. You can catch the 429 error instead of letting it throw, and then decide what is better and faster: slowing down your API calls so you stay within the limit, or going full speed until you get a 429 and then waiting out that delay - roughly along the lines of the sketch below.
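A rough sketch of the second approach: retry only when the API actually answers 429, honouring a Retry-After header if one is sent. The retry count and the one-second fallback delay are assumptions, and the shared HttpClient avoids exhausting sockets when the method is called in a tight loop:

private static readonly HttpClient client = new HttpClient(); // reuse one client instead of creating one per call

public static async Task<string> GetWithRetryAsync(string url, int maxRetries = 3)
{
    for (int attempt = 0; ; attempt++)
    {
        HttpResponseMessage response = await client.GetAsync(url);

        if ((int)response.StatusCode != 429 || attempt >= maxRetries)
        {
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }

        // Rate limit hit: wait out the server's Retry-After hint if present, otherwise a short default.
        TimeSpan delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(1);
        await Task.Delay(delay);
    }
}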
I have a custom WebUploadTraceListener : TraceListener that I use to send HTTP (and eventually HTTPS) POST data to a web service that writes it to a database.
I have tested doing this with both WebClient and HttpWebRequest and empirically I'm seeing better performance with the latter.
Because of the one-way nature of the data, I don't care about the server response. But I found that if I don't handle the HttpWebResponse my code locks up on the third write. I think this is because of the DefaultConnectionLimit setting and the system not reusing the resource...
Per Jon Skeet
Note that you do need to dispose of the WebResponse returned by request.GetResponse - otherwise the underlying infrastructure won't know that you're actually done with it, and won't be able to reuse the connection.
HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(ServiceURI);
httpRequest.Method = "POST";
httpRequest.ContentType = "application/x-www-form-urlencoded";

try
{
    using (Stream stream = httpRequest.GetRequestStream())
    {
        stream.Write(postBytes, 0, postBytes.Length);
    }

    using (HttpWebResponse response = (HttpWebResponse)httpRequest.GetResponse())
    {
        // discard the response
    }
}
catch (Exception)
{
    // ...
}
I want to maximize the speed of sending the POST data and get back to the main program flow as quickly as possible. Because I'm tracing program flow, a synchronous write is preferable, but not mandatory as I can always add a POST field including a Tick count.
Is HttpWebRequest with stream.Write the quickest way to do this in .NET 4.0?
Is there a cheaper way of discarding the unwanted response?
Actually, httpRequest.GetResponse only posts the data to the server; it doesn't download anything to the client beyond the status information that tells you whether the server processed the request successfully.
You only receive the response body when you call GetResponseStream. And if you don't even want the success/error information, you have no way of telling whether the server processed the request successfully at all.
So the answers are:
Yes, that is about the lowest level for managed code, unless you want to mess with sockets directly (which you shouldn't).
With HttpWebRequest, no - you barely receive any data you don't want.
I currently have a Python server running that handles many different kinds of requests, and I'm trying to call it from C#. My code is as follows:
try
{
    ServicePointManager.DefaultConnectionLimit = 10;
    System.Net.ServicePointManager.Expect100Continue = false;

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Proxy = null;
    request.Method = "GET";

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        response.Close();
    }
}
catch (WebException e)
{
    Console.WriteLine(e);
}
My first GET request is almost instant, but after that each request takes roughly 30 seconds to a minute to go through. I have researched everywhere online and tried changing settings to make it run faster, but nothing seems to work. Are there any more settings I could change to speed it up?
Using my psychic debugging skills, I guess your server only accepts one connection at a time. Your connection is kept alive (the Keep-Alive header), so the server will only accept a new connection once the current one is closed - by default after the keep-alive timeout of 100,000 ms or whatever the server's timeout is, which in your case I'd guess is 30 to 60 seconds. You can start by setting the KeepAlive property to false, as in the sketch below.
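A minimal sketch of that change, assuming the rest of your request code stays as it is:

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "GET";
request.KeepAlive = false;   // ask for the connection to be closed once the response is delivered

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    // read or discard the response; disposing it frees the connection for the next request
}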
This question already has answers here:
C#: How to programmatically check a web service is up and running?
I'm doing the following:
var request = (HttpWebRequest)WebRequest.Create(serviceUrl);
var response = (HttpWebResponse)request.GetResponse();

if (response.StatusCode == HttpStatusCode.OK)
{
    // perform some operations.
}
in order to check whether the service is up and running from my ASP.NET C# 4.0 application.
It gives me the expected results, but if the service is not running it takes too long (nearly a minute) to get the response.
Is there a better way to check whether the service is up and running?
Your request's Timeout property is what keeps the fault response from coming back quickly. Refer to this link, where the timeout scenario is discussed very well; also visit the link provided in the answer. You should also consider using asynchronous calls to the web service (look for "Asynchronous calls to a Web service" here).
You don't have to do much more than you are already doing, other than adding a timeout to your request object. The request timeout is specified in milliseconds, so 5000 gives a 5-second timeout.
Be aware that you will then need to catch the exception to detect the timeout. This is a System.Net.WebException with the message "The operation has timed out" and a Status of WebExceptionStatus.Timeout, so your code will look something like:
var request = (HttpWebRequest)WebRequest.Create(serviceUrl);
request.Timeout = 5000;

try
{
    var response = (HttpWebResponse)request.GetResponse();
    if (response.StatusCode == HttpStatusCode.OK)
    {
        // perform some operations.
    }
}
catch (WebException wex)
{
    if (wex.Status == WebExceptionStatus.Timeout)
    {
        // the service did not respond in time - treat it as down
    }
}
If the website isn't responding after one second or so, it's probably safe to assume that it's a bad link and that I should move on to my next link in a set of "possible links."
How do I tell the WebClient to stop attempting to download after some predetermined amount of time?
I suppose I could use a thread, but if it's taking longer than one second to download, that's ok. I just want to make sure that it's connecting with the site.
Perhaps I should modify the WebClient headers, but I don't see what I should modify.
Perhaps I should use some other class in the System.Net namespace?
If you use the System.Net.WebRequest class you can set the Timeout property to be short and handle timeout exceptions.
try
{
    var request = WebRequest.Create("http://www.contoso.com");
    request.Timeout = 5000; // set the timeout to 5 seconds
    request.Method = "GET";

    using (var response = request.GetResponse())
    {
        // the site responded in time - the link is good
    }
}
catch (WebException webEx)
{
    // there was an error, likely a timeout - try another link here
}
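If you would rather keep using WebClient itself, a common workaround is to derive from it and apply the timeout in GetWebRequest. A minimal sketch - the class name and constructor are my own choices:

public class TimeoutWebClient : WebClient
{
    private readonly int _timeoutMilliseconds;

    public TimeoutWebClient(int timeoutMilliseconds)
    {
        _timeoutMilliseconds = timeoutMilliseconds;
    }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request != null)
        {
            request.Timeout = _timeoutMilliseconds;   // applies to DownloadString, DownloadData, etc.
        }
        return request;
    }
}

Then new TimeoutWebClient(1000).DownloadString(url) will throw a WebException after about a second instead of waiting for the default timeout.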