WCF - Wondering about Request Queueing - c#

I have a quick, simple question about requests in WCF. Does WCF automatically queue requests to a service (the service being a singleton) when multiple users request the same operation? For example, say I have a function that takes a while to complete, and two users call it at the same time. Does WCF queue the requests so that processing of the second begins only after the first has finished?
~Just Wondering

The ServiceBehavior attribute on the service class defines how sessions, instances, and concurrency are handled. See http://msdn.microsoft.com/en-us/library/ms731193.aspx for more details.
Basically, you can configure it to (1) handle one request at a time or (2) handle multiple requests at the same time.
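For the singleton case in the question, a minimal sketch of those two configurations might look like this (the contract and class names are purely illustrative, not taken from the question):

using System.ServiceModel;

[ServiceContract]
public interface ILongRunningService
{
    [OperationContract]
    string DoWork(string input);   // stands in for the slow operation in the question
}

// (1) One request at a time: with a singleton and ConcurrencyMode.Single,
// WCF serializes calls, so the second caller waits until the first returns.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class QueuedService : ILongRunningService
{
    public string DoWork(string input)
    {
        // ...long-running work...
        return input;
    }
}

// (2) Multiple requests at the same time: calls run in parallel on the same
// singleton instance, so any shared state in the class must be thread-safe.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class ParallelService : ILongRunningService
{
    public string DoWork(string input)
    {
        // ...long-running work...
        return input;
    }
}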

Related

How to process incoming calls asynchronously to WCF?

I have a WCF service that is being called from a web application (Web App) that is used by multiple users.
It takes some time for the WCF service to process a single request (e.g. 30 seconds).
I have noticed that if I open the Web App in two tabs or in two browser windows and make it send two requests to the WCF service, they are processed sequentially. That is, WCF starts processing the second request only after the first request has been processed to the end.
Is there a way to make the WCF service process incoming requests in parallel?
Just because technology allows you to do something doesn't mean it's a good idea.
Here are the instructions for the dangerous tool you're asking for help developing.
https://learn.microsoft.com/en-us/dotnet/framework/wcf/how-to-implement-an-asynchronous-service-operation
Every programmer should understand the complexity that an accessible, distributed process makes possible. In short, such processes centralize user requirements. Harvard Business Review argues that DRY is a bad practice.
http://www.powersemantics.com/p.html
Proceed at your own discretion. I've given you an alternative.
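For context, a Task-based asynchronous service operation looks roughly like the sketch below; the names are made up for illustration and this is not the code from the linked article:

using System;
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface ISlowService
{
    // A Task-returning operation lets WCF free the dispatcher thread while the
    // slow work is awaited, instead of blocking it for the full 30 seconds.
    [OperationContract]
    Task<string> ProcessAsync(string input);
}

public class SlowService : ISlowService
{
    public async Task<string> ProcessAsync(string input)
    {
        await Task.Delay(TimeSpan.FromSeconds(30));  // stand-in for the real long-running work
        return "done: " + input;
    }
}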

ASP.NET Web API 2 Handling asynchronous requests

I am trying to determine what will happen when my Web API methods are being called simultaneously by two clients. In order to do so I am generating two async requests in Python:
rs = (grequests.post(url, data = json), grequests.post(url, data = json))
grequests.map(rs)
The requests call a non-async method in my Web API which changes my resource, a static class instance. From what I've understood, API requests should not run concurrently, which is what I want, since I don't want the two requests to interfere with each other. However, if I set a breakpoint in my API method, the two requests seem to run concurrently (in parallel). This leads to some unwanted results, since the two requests change the same resource in parallel.
Is there any way to make sure that the Web API handles the requests non-concurrently?
Requests to your Web API methods are handled by the IIS application pool (a pool of threads), which assigns a thread to every synchronous request it receives. There is no way to tell IIS to run these threads non-concurrently.
I believe you have a misunderstanding of what a "non-async" Web API method is. When a Web API method is async, it gives up its application pool thread while it is in a wait state, so that thread can serve other requests. This has the advantage that other requests do not have to spin up a new thread (which is somewhat expensive). It also helps minimize the number of concurrently busy threads, which in turn minimizes the number of requests the application pool has to queue up.
For non-async methods, the request holds on to its application pool thread for the entire duration of the call, and that thread cannot serve any other request until the method returns.
So the short answer to your question is no: there is no way to make sure that Web API requests are handled non-concurrently. Brian Driscoll's comment is correct; you will have to lock your shared resources during a request.
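A minimal sketch of that locking, assuming a hypothetical static resource and controller (neither comes from the question):

using System.Web.Http;

public static class SharedResource
{
    private static readonly object Gate = new object();
    public static int Counter;   // stands in for the static class instance being modified

    public static void Update()
    {
        lock (Gate)      // only one request mutates the resource at a time
        {
            Counter++;   // stand-in for the real modification
        }
    }
}

public class ValuesController : ApiController
{
    public IHttpActionResult Post()
    {
        // Requests still arrive concurrently; only the critical section is serialized.
        SharedResource.Update();
        return Ok();
    }
}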

Queuing concurrent web service requests in ASP.Net

I have an application that sends concurrent web service requests. These requests are then processed by a .NET proxy (which has several methods, depending on the request). When a request is processed, a SoapLogger extension class logs the request and response calls.
The issue I have is that some of the web service requests never receive responses and are stuck. One reason could be that the threads timed out. Is there a way to queue web service method requests and add a "sleep" parameter to the synchronous requests, or is a reverse proxy the last resort?
I am thinking of processing one synchronous web service request at a time and adding a sleep parameter on the interface to give the asynchronous part time to progress, while any new incoming requests would be queued.

Respond in WCF web role after other roles complete

I have a WCF service with public REST and SOAP endpoints and a single post method. Currently the app's flow is as follows:
An object is posted to the service method.
The service method passes the data to a storage queue and then sends an XML response.
A second worker role checks the queue, does some processing, and, depending on the outcome, either adds a record to the DB or rejects the message.
My problem is that the WCF method responds before a decision is made to accept or reject the message. There is a lot of processing, and various worker roles are used after the WCF message is received, depending on customer settings. I would like to be able to make the accept/reject decision and then respond with the relevant data, and I am wondering how to do this.
I could make the service method asynchronous, but then all the code would need to go into the WCF role or a callable library, which isn't ideal. Is there a way to not complete the async End method until there is a response from another worker role (i.e., a message is in the queue)? I realize I could just do a Thread.Sleep and check the queue periodically, but then I would have to add the original request data to the queue, and I really don't think that's the correct way to do things. I Googled this and couldn't find a concrete, non-hackish method.
Let me know if code samples would help, but (I think) it's pretty straightforward.
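One way to picture the "check the queue periodically" option mentioned above is the sketch below, written against a hypothetical IResponseQueue abstraction rather than a specific storage SDK; the correlation id, timeout, and names are all invented for illustration:

using System;
using System.Threading;

public interface IResponseQueue
{
    // Returns the worker role's accept/reject decision for the given
    // correlation id, or null if no decision has been posted yet.
    string TryGetResult(Guid correlationId);
}

public class PostService
{
    private readonly IResponseQueue _responses;

    public PostService(IResponseQueue responses)
    {
        _responses = responses;
    }

    public string Post(Guid correlationId /*, posted object omitted */)
    {
        // ...enqueue the work item for the worker role here...

        var deadline = DateTime.UtcNow.AddMinutes(2);   // arbitrary timeout for the sketch
        while (DateTime.UtcNow < deadline)
        {
            var decision = _responses.TryGetResult(correlationId);
            if (decision != null)
                return decision;        // accept/reject response from the worker role
            Thread.Sleep(500);          // the periodic check the question mentions
        }
        return "timeout";
    }
}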

How to make a call to WCF webservice with more than one client at the same time (in parallel)

I have a C# WCF web service acting as the server, and I have two clients: one Java client and one C++ client. I want both clients to be able to run at the same time. The scenario I can't figure out is this:
My Java client makes a call to the WCF web service, and the processing might take around 10 minutes. Meanwhile, I want my C++ client to make a call to the web service and get its response back. Right now I can make the call from the C++ client while the Java client's request is being processed, but I don't get the response back for the C++ request until the Java request has completed.
Can anyone please suggest how to make this work in parallel? Thanks in advance.
Any "normal" WCF service can most definitely handle more than one client request at any given time.
It all depends on your settings for InstanceContextMode:
PerSession means each session gets its own copy of the service class to handle a number of requests (from that same client).
PerCall means each request gets a fresh copy of the service class to handle it (and that copy is disposed again after handling the call).
Single means you have a singleton: just one copy of your service class.
If you have a singleton, you need to ask yourself: why? PerCall is the recommended setting, and it should easily support quite a few requests at once.
See Understanding Instance Context Mode for a more thorough explanation.
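As a minimal sketch, the recommended PerCall setup is just an attribute on the service class (the contract and class names here are illustrative):

using System.ServiceModel;

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    string ProcessOrder(string order);
}

// Each incoming call gets its own OrderService instance, so a slow call from
// one client does not block calls from another.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall)]
public class OrderService : IOrderService
{
    public string ProcessOrder(string order)
    {
        // ...the 10-minute processing...
        return "processed: " + order;
    }
}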
Use
[ServiceBehavior( ConcurrencyMode = ConcurrencyMode.Multiple )]
attribute on your service class. More on this, for example, here:
http://www.codeproject.com/Articles/89858/WCF-Concurrency-Single-Multiple-and-Reentrant-and
This is peripheral to your question, but have you considered asynchronous callbacks from the method that takes 10+ minutes to return, and then having the process run in a separate thread? It's not really good practice to have a service call wait 10 minutes synchronously, and this might solve your problem, although the service should allow for multiple callers at once anyway (our WCF service takes thousands of simultaneous requests).
When you call a WCF service, you have a choice between calling it synchronously or asynchronously. A synchronous call waits for the response to come back to the caller in the same operation; in the caller it would look like "myresult = svc.DoSomething()". With an asynchronous call, the caller gives the service a function to invoke when the operation completes; it doesn't block waiting for the response and goes about its business.
Your callback will take DoSomethingCompletedEventArgs:
void myCallback(object sender, DoSomethingCompletedEventArgs e)
{
var myResult = e.Result;
//then use the result however you would have before.
}
You register the callback function like an event handler:
svc.DoSomethingCompleted += myCallback;
and then start the call:
svc.DoSomethingAsync();
Note there is no return value from that statement; the service executes myCallback when it completes and passes along the result. (All WCF calls from Silverlight have to be asynchronous, but for other clients this restriction isn't there.)
Here's a codeproject article that demonstrates a slightly different way in detail.
http://www.codeproject.com/Articles/91528/How-to-Call-WCF-Services-Synchronously-and-Asynchr
This keeps the client from blocking during the 10+ minute process but doesn't really change the way the service itself functions.
Now, the second part of what I was mentioning was firing off the 10+ minute process in a separate thread from inside the service. The service methods themselves should be very thin and just call functionality in other libraries. Functions that are going to take a long time should ideally run on their own threads (say a BackgroundWorker, for which you register a completion callback on the service side) and have some sort of persistent system to keep track of their progress and any results that need to go back to the client.

If it were me, I would register the request for the process in a DB and then update that DB on completion. The client would then periodically poll to see whether the process has completed and fetch any results. You might be able to set up a duplex binding to get notified automatically when the process completes, but to be honest it's been a few years since I've done any duplex binding, so I don't remember exactly how it works.
These topics are really too big for me to go into depth here. I would suggest researching multithreaded operations with the BackgroundWorker.
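As a rough sketch of the "record the request, do the work in the background, let the client poll" pattern described above, with an in-memory dictionary standing in for the database and Task.Run standing in for the BackgroundWorker (all names here are invented for illustration):

using System;
using System.Collections.Concurrent;
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface ILongJobService
{
    [OperationContract]
    Guid StartJob(string input);        // returns a ticket immediately

    [OperationContract]
    string GetJobResult(Guid ticket);   // the client polls this until it returns a result
}

public class LongJobService : ILongJobService
{
    // Stand-in for the persistent store that tracks progress and results.
    private static readonly ConcurrentDictionary<Guid, string> Results =
        new ConcurrentDictionary<Guid, string>();

    public Guid StartJob(string input)
    {
        var ticket = Guid.NewGuid();
        Results[ticket] = null;            // mark as "in progress"
        Task.Run(() =>                     // the 10+ minute work runs off the request thread
        {
            var result = DoExpensiveWork(input);
            Results[ticket] = result;      // mark as "completed"
        });
        return ticket;
    }

    public string GetJobResult(Guid ticket)
    {
        string result;
        return Results.TryGetValue(ticket, out result) ? result : null;
    }

    private static string DoExpensiveWork(string input)
    {
        // ...the long-running processing...
        return "done: " + input;
    }
}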
