Asynchronously requesting multiple URLs, but most requests time out - C#

I have multiple URLs and I request them asynchronously, but most of the requests time out. The URLs themselves are reachable: if I asynchronously request just one of them, I get a response. The code looks like this:
foreach (var url in URLs)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    IAsyncResult result = request.BeginGetResponse(new AsyncCallback(RequestCallback), request);
    ThreadPool.RegisterWaitForSingleObject(result.AsyncWaitHandle, new WaitOrTimerCallback(TimeoutCallback), request, request.Timeout, true);
}
Can someone tell me the reason?

Not exactly an answer, but if you have .NET 4.5 and VS2012 you can use async/await, which is much cleaner. The code would be:
var tasks = (from url in URLs
             let request = WebRequest.Create(url)
             select request.GetResponseAsync()).ToArray();
// Task.WaitAll returns void, so collect the results from the tasks afterwards
Task.WaitAll(tasks);
WebResponse[] results = tasks.Select(t => t.Result).ToArray();
However, this isn't a very good way to write the code. You should process each response as soon as possible, because the framework limits the number of HTTP connections your program can have open to the same host at the same time.
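On .NET 4.5 a cleaner sketch (the URL array and the concurrency limit of 4 are illustrative assumptions, not from the question) is to use HttpClient with async/await and cap the number of concurrent requests with a SemaphoreSlim:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class Downloader
{
    // Reuse one HttpClient instance; creating one per request can exhaust sockets.
    static readonly HttpClient client = new HttpClient();

    // Allow at most 4 requests in flight at once.
    static readonly SemaphoreSlim gate = new SemaphoreSlim(4);

    static async Task<string> FetchAsync(string url)
    {
        await gate.WaitAsync();
        try
        {
            return await client.GetStringAsync(url);
        }
        finally
        {
            gate.Release();
        }
    }

    static async Task Main()
    {
        var urls = new[] { "http://example.com/a", "http://example.com/b" }; // placeholders
        string[] bodies = await Task.WhenAll(urls.Select(FetchAsync));
        Console.WriteLine(bodies.Length);
    }
}
```

Because no thread blocks while a request is outstanding, slow responses don't pile up and starve the rest of the batch.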

Related

Should I use synchronous/asynchronous API calls when creating an API that calls 2 other APIs?

I’m creating an API that serves as the bridge between the app and two other APIs, and I want to know what the best way to do this is. I’m using HttpClient. The app has almost a thousand users, so if I use synchronous calls, does that mean that if one user calls the API, the other users have to wait until the first user gets the response before their requests proceed to the other API? Is there a better way of doing an API like this?
Here is a sample of my code using synchronous:
[HttpGet]
[Route("api/apiname")]
public String GetNumberofP([FromUri]GetNumberofPRequest getNPRequest)
{
    var request = JsonConvert.SerializeObject(getNPRequest);
    string errorMessage = "";
    try
    {
        httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token.gettoken());
        var response = httpClient.GetAsync("api/MobileApp/GetNumberP?"
            + "strCardNumber=" + getNPRequest.strCardNumber
            + "&strDateOfBirth=" + getNPRequest.strDateOfBirth).Result;
        return response.Content.ReadAsStringAsync().Result;
    }
    catch (Exception e)
    {
        throw utils.ReturnException("GetNumberofP", e, errorMessage);
    }
}
if I use synchronous calls does that mean that if a user calls the API, then the other users have to wait until the 1st user gets the response before it proceeds to the other API requests
No. When a request comes into the pipeline, the framework dispatches it to a thread-pool thread, so requests are served concurrently. If 1,000 requests come in at the same time, the 1,000th user does not have to wait for the other 999 requests to finish.
You are better off using async code for this anyway. For I/O such as network requests, performance is usually better when no thread is blocked waiting on the result. As a side note, you never want to call .Result, because that blocks until the task completes and effectively makes the async code synchronous.
It's always easy to turn a synchronous call into an asynchronous one, but the other way around is fraught with danger. You should make your API asynchronous:
[HttpGet]
[Route("api/apiname")]
public async Task<string> GetNumberofP([FromUri]GetNumberofPRequest getNPRequest)
{
    httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token.gettoken());
    var response = await httpClient.GetAsync($"api/MobileApp/GetNumberP?strCardNumber={getNPRequest.strCardNumber}&strDateOfBirth={getNPRequest.strDateOfBirth}");
    return await response.Content.ReadAsStringAsync();
}
You should also avoid creating a new HttpClient for each call; reuse a single instance, since instantiating one per request can exhaust the available sockets.
It seems you're missing the async and await keywords.
public async Task<String> GetNumberofP([FromUri]GetNumberofPRequest getNPRequest)
{
    (...)
    var response = await httpClient.GetAsync(...);

Timeout behaviour in HttpWebRequest.GetResponse() vs GetResponseAsync()

When I try the following code:
var request = (HttpWebRequest)HttpWebRequest.Create(url);
request.Timeout = 3; // a small value
var response = request.GetResponse();
Console.WriteLine(response.ContentLength);
for a URL that I know is going to take more than 3 milliseconds to load (I put a Thread.Sleep(110000) in Application_BeginRequest), it works fine and throws a WebException as expected.
Problem is when I switch to async method:
var response = request.GetResponseAsync().Result;
or
var response = await request.GetResponseAsync();
This async version completely ignores any timeout value, including ReadWriteTimeout and ServicePoint.MaxIdleTime. I couldn't find anything about Timeout in MSDN's GetResponseAsync() documentation, so now I'm wondering: is this a bug in GetResponseAsync(), or is something wrong with the way I use async here?
Timeout does not apply to asynchronous HttpWebRequest requests. To quote the docs:
The Timeout property has no effect on asynchronous requests
I recommend you use HttpClient instead, which was designed with asynchronous requests in mind.
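For illustration, a minimal sketch of the HttpClient approach with a timeout (the URL parameter and the 3-second value are placeholders); note that HttpClient reports a timeout by throwing TaskCanceledException:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

static async Task<string> GetWithTimeoutAsync(string url)
{
    using (var client = new HttpClient { Timeout = TimeSpan.FromSeconds(3) })
    {
        try
        {
            return await client.GetStringAsync(url);
        }
        catch (TaskCanceledException)
        {
            // HttpClient surfaces a timeout as TaskCanceledException
            return null;
        }
    }
}
```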
As a workaround, you can wrap the synchronous call (which does honor Timeout) in a task:
await Task.Run(() =>
{
    var varHttpResponse = varWebRequest.GetResponse();
    // process varHttpResponse here
});

Sending multiple requests to a server using multithreading

I have a task where I form thousands of requests which are later sent to a server. The server returns the response for each request and that response is then dumped to an output file line by line.
The pseudo code goes like this:
//requests contains thousands of requests to be sent to the server
string[] requests = GetRequestsString();
foreach (string request in requests)
{
    string response = MakeWebRequest(request);
    ParseandDump(response);
}
Now, as can be seen, the server is handling my requests one by one. I want to make this entire process faster. The server in question is capable of handling multiple requests at a time, so I want to apply multithreading and send, say, 4 requests to the server at a time, dumping each response in the same thread.
Can you please give me any pointers to possible approaches?
You can take advantage of Task from .NET 4.0 and the new HttpClient. The sample code below shows how to send requests in parallel and then dump each response in a continuation by using ContinueWith:
var httpClient = new HttpClient();
var tasks = requests.Select(r => httpClient.GetStringAsync(r).ContinueWith(t =>
{
    ParseandDump(t.Result);
})).ToArray(); // materialize the sequence so the requests are actually started
Task.WaitAll(tasks);
Task uses the ThreadPool under the hood, so you don't need to specify how many threads should be used; the ThreadPool manages this for you in an optimized way.
The easiest way would be to use Parallel.ForEach like this:
string[] requests = GetRequestsString();
Parallel.ForEach(requests, request => ParseandDump(MakeWebRequest(request)));
.NET framework 4.0 or greater is required to use Parallel.
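Since the question asks for roughly four requests at a time, note that Parallel.ForEach also accepts a ParallelOptions argument that caps the parallelism; a sketch, reusing the MakeWebRequest and ParseandDump placeholders from the question:

```csharp
// Limit the loop to at most 4 requests in flight at once
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
Parallel.ForEach(requests, options, request => ParseandDump(MakeWebRequest(request)));
```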
I think this could be done with a producer-consumer pattern. You could use a ConcurrentQueue (from the System.Collections.Concurrent namespace) as a shared resource between the many parallel web requests and the dumping thread.
The pseudo code would be something like:
var requests = GetRequestsString();
var queue = new ConcurrentQueue<string>();
Task.Factory.StartNew(() =>
{
    Parallel.ForEach(requests, currentRequest =>
    {
        queue.Enqueue(MakeWebRequest(currentRequest));
    });
});
Task.Factory.StartNew(() =>
{
    while (true)
    {
        string response;
        if (queue.TryDequeue(out response))
        {
            ParseandDump(response);
        }
    }
});
Maybe a BlockingCollection might serve you even better, depending on how you want to go about synchronizing the threads to signal the end of incoming requests.
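A sketch of that BlockingCollection variant (MakeWebRequest and ParseandDump are the same placeholders as in the question); CompleteAdding signals the consumer that no further responses will arrive, so the consumer loop ends cleanly instead of spinning forever:

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

var queue = new BlockingCollection<string>();

var producer = Task.Factory.StartNew(() =>
{
    Parallel.ForEach(requests, currentRequest =>
    {
        queue.Add(MakeWebRequest(currentRequest));
    });
    queue.CompleteAdding(); // lets GetConsumingEnumerable() terminate
});

var consumer = Task.Factory.StartNew(() =>
{
    // Blocks until an item is available; exits once CompleteAdding has been
    // called and the queue is drained
    foreach (string response in queue.GetConsumingEnumerable())
    {
        ParseandDump(response);
    }
});

Task.WaitAll(producer, consumer);
```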

Should I Use Async Processing?

I've got a Windows service that monitors a table (with a timer) for rows, grabs rows one at a time when they appear, submits the information to a RESTful web service, analyzes the response, and writes some details about the response to a table. Would I gain anything by making this async? My current (stripped down) web service submission code is as follows:
string responseBody = null;
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(new Uri(url));
HttpWebResponse resp;
try
{
    resp = (HttpWebResponse)req.GetResponse();
}
catch (WebException we)
{
    resp = (HttpWebResponse)we.Response;
}
if (resp != null)
{
    using (Stream respStream = resp.GetResponseStream())
    {
        if (respStream != null)
        {
            responseBody = new StreamReader(respStream).ReadToEnd();
        }
    }
    resp.Close();
}
return responseBody;
If you don't care how long it takes to get your response for any individual request, no, there's no particular reason to make it async.
On the other hand, if you're waiting for one request to fully finish before you start your next request you might have trouble handling a large volume. In this scenario you might want to look at parallelizing your code. But that's only worth discussing if you get large numbers of items entered into the database for processing.
By using asynchronous methods you avoid the need for a dedicated thread to block on the result of the async operation. That thread is freed up to work on other, more productive tasks until the async operation completes and there is productive work to do again.
If your service is not in high demand, and is not frequently in a position where you have more work that needs doing than threads to do it, then you don't need to worry. For many business apps they simply have so low usage rates that this isn't needed.
If you have a large-scale app servicing enough users that you see a very high number of concurrent requests (even if only at peak usage times), then it may be worth switching to the asynchronous counterparts.
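If you do switch, an asynchronous counterpart of the submission code above might look like the following sketch (SubmitAsync is a hypothetical name, and HttpClient stands in for the HttpWebRequest in the question):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

static readonly HttpClient client = new HttpClient();

static async Task<string> SubmitAsync(string url)
{
    using (var response = await client.GetAsync(url))
    {
        // No thread is blocked while waiting for the server to respond
        return await response.Content.ReadAsStringAsync();
    }
}
```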

ASP.NET MVC AsyncController and IO-bound requests

I have an AsyncController and a homepage that queries the user's friends list and does some database work with them. I implemented the async action method pattern for any requests that call external web services. Is this an efficient way of handling this situation? During times of high request volume I am seeing IIS being thread-starved at times and I worry that my nested Async magic may somehow be involved in this.
My main questions/talking points are:
Is it safe to nest an IAsyncResult asynchronous web request inside an Async controller action? Or is this just doubling the load somewhere?
Is it efficient to use ThreadPool.RegisterWaitForSingleObject to handle timing out long running web requests or will this eat ThreadPool threads and starve the rest of the app?
Would it be more efficient to just do a synchronous web request inside an Async Controller action?
Example code:
public void IndexAsync()
{
    AsyncManager.OutstandingOperations.Increment();
    User.GetFacebookFriends(friends => {
        AsyncManager.Parameters["friends"] = friends;
        AsyncManager.OutstandingOperations.Decrement();
    });
}
public ActionResult IndexCompleted(List<Friend> friends)
{
    return Json(friends);
}
User.GetFacebookFriends(Action<List<Friend>>) looks like this:
void GetFacebookFriends(Action<List<Friend>> continueWith)
{
    var url = new Uri(string.Format("https://graph.facebook.com/etc etc"));
    HttpWebRequest wc = (HttpWebRequest)HttpWebRequest.Create(url);
    wc.Method = "GET";
    var request = wc.BeginGetResponse(result => QueryResult(result, continueWith), wc);
    // Async requests ignore the HttpWebRequest's Timeout property, so we ask the
    // ThreadPool to register a wait callback to time out the request if needed
    ThreadPool.RegisterWaitForSingleObject(request.AsyncWaitHandle, QueryTimeout, wc, TimeSpan.FromSeconds(5), true);
}
QueryTimeout just aborts the request if it takes longer than 5 seconds.
The fully asynchronous method you describe first is best, as it releases thread-pool threads back to the pool for reuse. It's highly likely that you're performing some other blocking action elsewhere. What happens in QueryResult? Although you fetch the response asynchronously, are you also reading the response stream asynchronously? If not, make it so, and the thread-pool starvation should be reduced.
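For example, reading the body without tying up a pool thread might look like this sketch on .NET 4.5+ (ReadBodyAsync is a hypothetical helper, assumed to be called once the response has been obtained):

```csharp
using System.IO;
using System.Net;
using System.Threading.Tasks;

static async Task<string> ReadBodyAsync(WebResponse response)
{
    using (var stream = response.GetResponseStream())
    using (var reader = new StreamReader(stream))
    {
        // ReadToEndAsync performs the network reads without blocking a pool thread
        return await reader.ReadToEndAsync();
    }
}
```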
