I have an application that sends concurrent web service requests. These requests are then processed by a .NET proxy (which has several methods, depending on the request). When a request is processed, a SoapLogger extension class logs the request and response calls.
The issue I have is that some of the web service requests never receive responses and are stuck. One reason could be threads timing out. Is there a way that web service method requests can be queued, with a "sleep" parameter added to the synchronous request? Or is a reverse proxy the last resort?
I am thinking of processing one synchronous web service request at a time and adding a sleep parameter to the interface, to give the asynchronous part time to progress while any new incoming requests are queued.
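The queue-plus-sleep idea can be sketched as follows. This is a minimal illustration in Python, not the real .NET proxy: the names, the `payload.upper()` stand-in for the actual proxy call, and the sleep value are all illustrative.

```python
import queue
import threading
import time

# Sketch: incoming synchronous requests are enqueued, and a single worker
# drains them one at a time, sleeping between calls to give the
# asynchronous back end time to progress. Names are illustrative.
request_queue = queue.Queue()
results = {}

def worker(sleep_seconds=0.01):
    while True:
        request_id, payload = request_queue.get()
        if request_id is None:                 # sentinel: stop the worker
            break
        results[request_id] = payload.upper()  # stand-in for the proxy call
        time.sleep(sleep_seconds)              # the proposed "sleep" parameter

t = threading.Thread(target=worker, daemon=True)
t.start()
for i, data in enumerate(["a", "b", "c"]):
    request_queue.put((i, data))
request_queue.put((None, None))
t.join()
```

Because only one worker drains the queue, requests are guaranteed to be processed strictly one at a time, while callers can still enqueue concurrently.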
Related
I have a problem statement where a product makes an outbound call to a self-hosted service synchronously and expects the response synchronously.
In order to provide the response, my self-hosted service needs to do a lot of async operations and goes to on-prem. On-prem sends the response back to a receiver service, and when the response comes back to the self-hosted service, I am not able to identify which request the response belongs to.
I have read about semaphores and was wondering if one could be used to solve this.
For example:
Maintain the state of each thread in a semaphore
Let the self-hosted service do the usual work, then wait for something
Once I have the response from on-prem and SNS sends an event, wake up to do the further processing
High-level architecture
Kindly note: I can't change the flow or the architecture
I am trying to determine what will happen when my Web API methods are being called simultaneously by two clients. In order to do so I am generating two async requests in Python:
rs = (grequests.post(url, data = json), grequests.post(url, data = json))
grequests.map(rs)
The requests call a non-async method in my Web API which changes my resource, a static class instance. From what I've understood, API requests should not run concurrently, which is what I want, since I don't want the two requests to interfere with each other. However, if I set a breakpoint in my API method, the two requests seem to run concurrently (in parallel). This leads to some unwanted results, since the two requests are changing the same resource in parallel.
Is there any way to make sure that the Web API handles the requests non-concurrently?
Requests to your Web API methods are handled by the IIS application pool (a pool of threads), which initializes a thread for every synchronous request it receives. There is no way to tell IIS to run these threads non-concurrently.
I believe you have a misunderstanding of what a "non-async" Web API method is. When a Web API method is async, that means it will share its application pool thread while it's in a wait state. This has the advantage that other requests don't have to initialize a new thread (which is somewhat expensive). It also helps minimize the number of concurrent threads, which in turn minimizes the number of threads the application pool has to queue up.
For non-async methods, the IIS application pool will initialize a new thread for every request even if an inactive thread is available. Also, it will not share that thread with any other requests.
So the short answer to your question is no. There is no way to make sure that the Web API requests are handled non-concurrently. Brian Driscoll's comment is correct. You will have to lock your shared resources during a request.
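The locking advice above can be illustrated with a minimal sketch. The real resource is a C# static class instance, but the principle is the same in any language; here a shared counter stands in for it, and the names are illustrative:

```python
import threading

# Sketch of the suggested fix: guard the shared (static) resource with a
# lock so concurrent requests mutate it one at a time. The counter stands
# in for the shared static class instance.
shared_counter = {"value": 0}
counter_lock = threading.Lock()

def handle_request():
    # Without the lock, the read-modify-write below could interleave
    # between two request threads and lose updates.
    with counter_lock:
        current = shared_counter["value"]
        shared_counter["value"] = current + 1

threads = [threading.Thread(target=handle_request) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The requests still run concurrently; only the critical section that touches the shared resource is serialized, which is usually all you need.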
We have an existing WebSphere MQ queue manager (running fine, no issues). For each "method" it has a pair of queues: Request and Response.
We'd like to put a web service front end over this for the benefit of some apps we have that cannot call MQ but can call web services.
Of course, web services can be synchronous but our MQ is asynchronous... and I am not sure how to get around this.
Example:
App calls the web service; the web service waits for a response.
The web service puts the message on the MQ Request queue.
Of course, the response will arrive on a different channel... so my thinking is that the web service would have to read all the messages on the queue and remove only the correct one (matched by some identifier such as a GUID).
Has anyone got any previous design knowledge on solving this?
The web service does not need to read all the response messages; you can perform a correlated get. When the request is put on the request queue, you use the request's message ID and wait on the response queue for the response message with the matching correlation ID. MQ handles this very efficiently.
Here is another Stack Overflow answer that shows some code for performing a correlated get:
Issue in Correlating request message to resp message in Java Client to access MQ Series
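To make the pattern concrete, here is a language-neutral sketch in Python (not real MQ code) of what a correlated get achieves: taking only the message whose correlation ID matches your request, and leaving the rest on the queue. In real WebSphere MQ you would not loop like this yourself; the queue manager does the matching for you via get-message match options on the correlation ID.

```python
import queue

# Illustrative sketch only: the response queue may hold replies for many
# different requests. We take the one whose correlation ID matches our
# request's message ID and put the others back. WebSphere MQ performs
# this matching inside the queue manager, far more efficiently.
def correlated_get(response_queue, correlation_id):
    put_back = []
    matched = None
    while matched is None:
        msg = response_queue.get(timeout=1.0)  # raises queue.Empty on timeout
        if msg["correl_id"] == correlation_id:
            matched = msg
        else:
            put_back.append(msg)
    for msg in put_back:
        response_queue.put(msg)  # leave other requests' replies in place
    return matched
```

With this in place, the synchronous web service facade is simply: put the request, then block on a correlated get of the response, with a timeout so a lost reply turns into a clean error instead of a hung call.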
I have an ASP.NET application which transforms HTTP GET requests into web service calls.
Currently, each request results in a web method call. I would like to collect or group the data from several requests to execute fewer service calls. So each request should check whether a processor object exists that can accept more data; if so, it passes the data to that object, otherwise it creates a new processor object. The maximum time a request would wait before being passed to the service would normally be in the range of seconds, or maybe a couple of minutes.
Is there a nice pattern for this?
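What is described above is essentially micro-batching: collect items until the batch is full or a deadline passes, then flush them in one call. A minimal sketch, where `call_service`, the class name, and the size/wait limits are all illustrative stand-ins:

```python
import threading
import time

# Sketch of the collect-then-flush pattern: requests add their data to a
# shared batch; the batch is flushed once it is full, or once a maximum
# wait has elapsed, so many HTTP GETs become one service call.
class BatchProcessor:
    def __init__(self, call_service, max_size=10, max_wait=1.0):
        self.call_service = call_service   # stand-in for the real web method
        self.max_size = max_size
        self.max_wait = max_wait
        self.items = []
        self.lock = threading.Lock()
        self.deadline = None

    def add(self, item):
        with self.lock:
            if not self.items:
                # First item of a new batch starts the wait-time clock.
                self.deadline = time.monotonic() + self.max_wait
            self.items.append(item)
            if len(self.items) >= self.max_size:
                self._flush_locked()

    def tick(self):
        # Called periodically (e.g. by a timer) to enforce max_wait.
        with self.lock:
            if self.items and time.monotonic() >= self.deadline:
                self._flush_locked()

    def _flush_locked(self):
        batch, self.items = self.items, []
        self.call_service(batch)
```

In ASP.NET the same idea would hang off a single shared processor instance plus a timer; the tricky parts are exactly the ones sketched here: locking around the batch and resetting the deadline when a new batch starts.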
I have a quick, simple question about requests in WCF. Does WCF automatically queue requests to a service (the service being a singleton) when multiple users request the same operation? Let's say I have a function that takes a while to complete, and two users make a call to this function; does WCF automatically queue the requests so that when the first request is finished, it then starts processing the next?
~Just Wondering
The ServiceBehavior attribute on the service implementation defines how sessions, instances and concurrency are handled. See http://msdn.microsoft.com/en-us/library/ms731193.aspx for more details.
Basically, you can configure it to (1) handle one request at a time or (2) handle multiple requests at the same time.
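The difference between the two settings can be shown with a toy model. This is not WCF code; it is a Python sketch where a lock plays the role of single-concurrency mode on a singleton, and a gauge records how many calls were ever inside the operation at once:

```python
import threading
import time

# Toy model of a singleton service under the two concurrency settings:
# with single concurrency a lock serializes calls (requests effectively
# queue up); with multiple concurrency calls overlap. Names are
# illustrative, not WCF API.
class SingletonService:
    def __init__(self, single_concurrency):
        self.single_concurrency = single_concurrency
        self.lock = threading.Lock()       # serializes calls in single mode
        self.gauge_lock = threading.Lock() # protects the counters below
        self.inside = 0
        self.max_inside = 0

    def slow_operation(self):
        if self.single_concurrency:
            with self.lock:
                self._work()
        else:
            self._work()

    def _work(self):
        with self.gauge_lock:
            self.inside += 1
            self.max_inside = max(self.max_inside, self.inside)
        time.sleep(0.05)                   # stand-in for slow processing
        with self.gauge_lock:
            self.inside -= 1

def run(service, n=4):
    threads = [threading.Thread(target=service.slow_operation) for _ in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return service.max_inside
```

With single concurrency `max_inside` stays at 1, which is the queuing behavior the question asks about; with multiple concurrency the calls overlap.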