I have a WCF service, marked with the OperationContract attribute.
I have a potentially long running task I want to perform when this operation is carried out, but I don't want the caller (in this case Silverlight) to have to wait for that to complete.
What is my best option for this?
I was thinking of either
something like the OnActionExecuted method of ActionFilterAttribute in System.Web.Mvc, but couldn't see an equivalent.
something listening to an event. (The process I want to call is a static method, so I'm not too sure about this approach)
something else:
In the scenario I'm working in, I lock the app so the user cannot make any changes during the save until I get the response (a status code) back.
Keep in mind, Silverlight won't actually have to 'wait' for the call to finish. When you create a service reference within Silverlight you will automatically get async calls.
Assuming you really don't need to wait for the call to finish (i.e. your service method has a 'void' return type) you can mark the service method as one-way via:
[OperationContract(IsOneWay = true)]
void MyServiceMethod(some args);
In general, I suggest having a separate service process handle long-running actions. Create a simple Windows Service, and have it pull requests from an MSMQ queue via WCF. Have the main service post requests to the background service, then return to its caller. If anyone cares about the results, then the results may be placed in an output queue, and the Silverlight application could get them by querying the output queue.
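A minimal sketch of the contract the main service could use to post work to the MSMQ-backed endpoint that the Windows Service reads from (the interface and operation names are hypothetical; netMsmqBinding only supports one-way operations, which fits the fire-and-forget style described here):

using System.ServiceModel;

[ServiceContract]
public interface IBackgroundWork
{
    // One-way is required for queued (netMsmqBinding) operations; the caller
    // returns as soon as the message has been handed to the queue.
    [OperationContract(IsOneWay = true)]
    void Enqueue(string requestXml);
}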
You might also look into Windows Workflow Foundation, which is made to fit very well with WCF. In fact, you can have just this kind of service, where all the logic of the service is in the workflow. If the workflow takes too long, it can be persisted to disk until it's ready to go again.
My suggestion is to go for the netTcpBinding for your distributed computing; try it and you may find it solves your problem.
For netTcpBinding usage, please follow the link below:
http://msdn.microsoft.com/en-us/library/ff183865.aspx
We are trying to consume a REST API for a message processor; some of its operations might take longer than the configured timeout.
We would like to know whether the timeout of the HTTP call to the API will stop the execution of the API, or whether the API will keep executing.
The idea is that we can fire and forget the API call; we are not worried if the API returns a 404 or 503. But we would like to hear whether the API will continue to execute.
Any input or suggestion appreciated.
You should use some kind of background processing to handle the process.
I recommend using Hangfire for it.
https://www.hangfire.io/
Use Hangfire to enqueue a job; it will return a job id. You can return this job id to the client side.
Expose another API to check for the status of this job.
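A minimal sketch of that flow, assuming an ASP.NET Web API host (the controller, the messageId parameter, and MessageProcessor.Run are hypothetical names, not part of the original setup):

using System.Web.Http;
using Hangfire;

public class MessagesController : ApiController
{
    [HttpPost]
    public IHttpActionResult Process(int messageId)
    {
        // Enqueue the long-running work and hand the Hangfire job id back to the client.
        string jobId = BackgroundJob.Enqueue(() => MessageProcessor.Run(messageId));
        return Ok(new { jobId });
    }

    [HttpGet]
    public IHttpActionResult Status(string jobId)
    {
        // Hangfire records the job state (Enqueued, Processing, Succeeded, Failed, ...).
        using (var connection = JobStorage.Current.GetConnection())
        {
            var jobData = connection.GetJobData(jobId);
            return Ok(new { state = jobData == null ? "Unknown" : jobData.State });
        }
    }
}

public static class MessageProcessor
{
    public static void Run(int messageId)
    {
        // the actual long-running processing goes here
    }
}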
A great way to handle this is with the callback/observer pattern. First of all, understand that there are two types of timeout, server and client. You can explicitly specify the client timeout; the server timeout is handled by the server itself.
So, you will need to implement an algorithm such that:
You identify each request uniquely and mark it, before firing, in memory or in a file/db.
You fire the request with an associated callback method.
On the response you then have control to do things like mark the request fulfilled or failed or whatever it is.
Finally, you mark/delete the request data.
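A minimal sketch of those steps (all names here are hypothetical); pending requests are tracked in a thread-safe dictionary and a continuation plays the role of the callback:

using System;
using System.Collections.Concurrent;
using System.Net.Http;

static class RequestTracker
{
    static readonly ConcurrentDictionary<Guid, string> Pending =
        new ConcurrentDictionary<Guid, string>();
    static readonly HttpClient Client = new HttpClient();

    public static Guid Fire(string url, string payload)
    {
        var id = Guid.NewGuid();
        Pending[id] = "pending";                           // mark the request before firing it

        Client.PostAsync(url, new StringContent(payload))  // fire with an associated callback
              .ContinueWith(t =>
              {
                  // on response (or failure) we have control to mark it fulfilled/failed
                  Pending[id] = t.IsFaulted ? "failed" : "fulfilled";
                  // ...or remove the tracked request data entirely once it is handled
              });

        return id;
    }
}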
As you'll most likely figure out, I'm not very experienced with async operations in general (I've only used Android's AsyncTask).
This is the outline of a WCF REST POST method; I'll use this image to hopefully explain what I'm trying to achieve.
The FirstJob saves some stuff to the database.
SecondJob reads what was saved in the database and does some work with the data.
The client does not care about what happens in SecondJob and just wants to receive the response from FirstJob.
So the two jobs don't need to run in parallel as SecondJob depends on FirstJob; the SecondJob would ideally run in a separate thread/context(?) or similar.
From what I've noticed, the second job does start in a separate thread and execution reaches the return statement while the second job is running, but the request does not end until SecondJob finishes.
I'd personally treat the second job as a separate POST operation and call the second job POST from the controller. The controller is the controller for the first job and can return the correct status from the first job; it just happens to call a POST out to a second endpoint while doing it.
The benefit of this approach is that the second job doesn't even need to be on the same IIS (in an NLB farm it could be anywhere) so you get load balancing thrown in for free. Alternatively the "second job server" can be on a specific URL reserved just for this kind of background processing task.
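A minimal sketch of that idea in ASP.NET MVC terms (the controller, the URL, and the helpers are hypothetical, not taken from the question): the first-job controller does its own work, fires a POST at the second-job endpoint without awaiting it, and returns its own status:

using System.Net.Http;
using System.Web.Mvc;

public class FirstJobController : Controller
{
    static readonly HttpClient Client = new HttpClient();

    [HttpPost]
    public ActionResult Create(string payload)
    {
        SaveToDatabase(payload);   // FirstJob's own work (hypothetical helper)

        // Fire-and-forget POST to the second-job endpoint; it could live on any
        // server in the farm, or on a URL reserved for background processing.
        Client.PostAsync("http://jobs.example.com/api/secondjob",
                         new StringContent(payload));

        return new HttpStatusCodeResult(200);   // the caller only sees FirstJob's result
    }

    private void SaveToDatabase(string payload) { /* hypothetical stub */ }
}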
I suggest you not rely on IIS to handle your background task, as it can shut the task down without waiting for it. I suggest you create a Windows service application which will accept the requests for the second jobs, via another WCF binding, database requests, or something else.
You can get the results of the second jobs with another request from your controller, as @PhillipH stated.
The thing I was trying to do was actually working in the first place, but the Visual Studio debugger fooled me. I tested again without the debugger, but with a Thread.Sleep(60000), and it looks like it behaves as expected. The SecondJob keeps running in the background after the API call has returned the response.
I have a long-running operation you might read about in a couple of my other questions (for your reference, here are the first and second).
At the beginning of the whole deal, the project exposes a form in which the user should specify all the necessary information about the XML file and upload the XML file itself. In that method, all the user input data is caught and sent to a WCF service that handles such kind of files. The controller gets only the task id of that processing.
Then the user is redirected to a progress bar page and periodically retrieves the status of task completeness, refreshing the progress bar.
So here is where my issue comes in. When the processing of the XML file is over, how can I get the results back and show them to the user?
I know that HTTP is a stateless protocol, but there is the cookie mechanism that could help in this situation. Of course, I could just save the processing results to some temporary place, like a static class on the WCF server, but there is a high load on the service, so it would eat all of the supplied memory.
In other words, I would like to pass the task to the WCF service (using netNamedPipeBinding) and receive the results back as fast as really possible. I want to avoid temporarily saving the result to some buffer and waiting until the client gathers it back.
The furthest I've gotten is using a temporary buffer, not on the service side but on the client's:
using (XmlProcessingServiceClient client = new XmlProcessingServiceClient())
{
    // Subscribe before starting the call so the completion event is not missed.
    client.AnalyzeXmlCompleted += (sender, e) =>
    {
        System.Web.HttpContext.Current.Application.Lock();
        // Here I just use a single place for all clients. I know it is not right;
        // it is just for illustration purposes.
        System.Web.HttpContext.Current.Application["Result"] = e;
        System.Web.HttpContext.Current.Application.UnLock();
    };
    client.AnalyzeXmlAsync(new Task { fileName = filePath, id = tid });
}
I suggest you use a SignalR hub to address your problem. You have a way to call a method on the client directly to notify it that the operation completed, and this happens without having to deal with the actual infrastructure trouble there is in implementing such strategies. Plus, SignalR plugs easily into an ASP.NET MVC application.
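A minimal sketch of the SignalR side, assuming SignalR 2 hosted in the ASP.NET application (the hub, the method name, and the per-task connection tracking are hypothetical): when the background work completes, push the result straight to the waiting client instead of having it poll:

using Microsoft.AspNet.SignalR;

public class ProgressHub : Hub { }

public static class ProgressNotifier
{
    // Call this from the AnalyzeXmlCompleted handler; connectionId is the SignalR
    // connection id you stored for the task when the upload was started.
    public static void NotifyCompleted(string connectionId, object result)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        hub.Clients.Client(connectionId).analysisCompleted(result);
    }
}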
To be honest I didn't really get the part about the WCF server and stuff, but I think I can give you a more abstract answer. To be sure:
You have a form with some fields + file upload
The user fills in the form and supplies an XML file
You send the XML file to a WCF service which processes it
In the meantime, show a progress bar which updates
After completion show the results
If this is not what you want, or this is not what your question is about, you can skip my answer; otherwise, read on.
Before we begin: step 3 is a bit ambiguous. It could mean that we send the data to the service and wait for it to return the result, or that we send the data to the service and we don't wait for it to return the result.
Situation 1:
Create in a view the form with all the required fields
Create an action in your controller which handles the postback.
The action will send the data to the service and when the service returns the result, your action will render a view with the result.
On the submit button you add a JavaScript onclick event. This will trigger an AJAX call to some server-side code which will return the progress.
The JavaScript shows some sort of status bar with the correct progress and repeats itself every x seconds
When the controller finishes, it will show the result
Situation 2:
Same as step 1 in Situation 1
Same as step 2 in Situation 1
After sending the data to the service, the controller shows a view with the progress bar.
We add a JavaScript event on document ready which checks the status of the XML file and updates a progress bar (same as the onclick event in step 4 of Situation 1).
When the progress bar reaches 100%, it will redirect to a different page which shows the results
Does this answer your question?
Best regards,
BHD
netNamedPipeBinding will not work for cross-machine communication if this is what you have in mind.
If you want to host your service on IIS then you will need one of the bindings that use HTTP as their transport protocol. Have a look at duplex services, which allow both endpoints to send messages; this way the server can send messages to the client anytime it wishes to. You could create a callback interface for progress reporting. If the task is going to take a considerable amount of time to complete, then the overhead of the progress reporting through HTTP might be acceptable.
Also have a look at Building and Accessing Duplex Services if you want to use a duplex communication over HTTP with Silverlight (PollingDuplexHttpBinding).
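A minimal sketch of what such a duplex contract could look like (the interface and method names are hypothetical); the service retrieves the callback channel and reports progress and completion over it:

using System;
using System.ServiceModel;

[ServiceContract(CallbackContract = typeof(IProgressCallback))]
public interface IXmlProcessingService
{
    [OperationContract(IsOneWay = true)]
    void AnalyzeXml(string fileName, Guid taskId);
}

public interface IProgressCallback
{
    [OperationContract(IsOneWay = true)]
    void ReportProgress(Guid taskId, int percentComplete);

    [OperationContract(IsOneWay = true)]
    void Completed(Guid taskId, string result);
}

// Inside the service implementation:
// var callback = OperationContext.Current.GetCallbackChannel<IProgressCallback>();
// callback.ReportProgress(taskId, 42);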
Finally, you could look for a Comet implementation for ASP.NET. On CodeProject you will find at least a couple (CometAsync and PokeIn).
I'm not sure if this is the best solution but I was able to do something similar. This was the general setup:
Controller A initialized a new class with the parameters for the action to be performed and passed the user's session object
The new class called a method in a background thread which updated the user's session as it progressed
Controller B had JSON methods that, when called by client-side JavaScript, checked the user's session data and returned the latest progress.
This thread states that using the session object in such a way is bad, but I'm sure you can do something similar with a thread-safe storage method like SQL or a temp file.
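A minimal sketch of the same shape, but swapping the session object for a static thread-safe store, in line with the caveat above (all names are hypothetical):

using System;
using System.Collections.Concurrent;
using System.Web.Mvc;

public static class ProgressStore
{
    static readonly ConcurrentDictionary<Guid, int> Progress =
        new ConcurrentDictionary<Guid, int>();

    public static void Update(Guid taskId, int percent)
    {
        Progress[taskId] = percent;   // written by the background thread as it progresses
    }

    public static int Get(Guid taskId)
    {
        int percent;
        return Progress.TryGetValue(taskId, out percent) ? percent : 0;
    }
}

public class ProgressController : Controller
{
    // Controller B's JSON method, polled by the client-side JavaScript.
    public JsonResult Check(Guid taskId)
    {
        return Json(new { percent = ProgressStore.Get(taskId) },
                    JsonRequestBehavior.AllowGet);
    }
}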
How would one go about waiting for an asynchronous result in a controller method? It seems like it would be trivial to implement, but so far I have not found a good example that is clean and elegant.
Here is the problem. I have two controller methods. Method A and Method B. Method A starts a long running process via TaskFactory, and uses ContinueWith to update a cached value when the process is finished. There are some intermediate steps between A and B. Now when I get to Method B, I need to check the value in the cache. If the value in the cache has not been updated yet, I need to wait for x amount of seconds and periodically check to see if the value has been updated.
I would prefer to handle all of this in the controller, so the client doesn't have to poll for the result (and I don't have to rewrite the controller to handle the polling). But I haven't been able to figure out a way to implement the polling inside the controller method that doesn't block until the timeout expires (I'm worried about thread pool starvation). Maybe there isn't a clean implementation that I can use here, and if polling is the right or only answer, I will just have to accept it.
HTTP is a stateless protocol. You can't implement such polling on the server without blocking. You have two possibilities:
Polling on the client - have the client hammer your server at regular intervals with AJAX requests
Use push and have the server notify the client(s) when some task completes - check out SignalR
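If you go with client-side polling, the server half can stay small; a minimal sketch (the controller, the cache key, and the taskId parameter are hypothetical) of an action that returns whatever the ContinueWith continuation has placed in the cache:

using System.Web;
using System.Web.Mvc;

public class ResultsController : Controller
{
    // Polled by the client until the background task has written its result.
    public JsonResult Check(string taskId)
    {
        var value = HttpRuntime.Cache["result:" + taskId];
        if (value == null)
            return Json(new { done = false }, JsonRequestBehavior.AllowGet);

        return Json(new { done = true, result = value }, JsonRequestBehavior.AllowGet);
    }
}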
You can wait for MVC 4 to get async controllers, or get the beta of MVC 4 and .NET 4.5.
I have a class library I developed that is rather processing intensive that I currently call through a WCF REST service.
The REST service directly accesses the DLLs for the class library, and the WCF REST service is more or less an interface for the system.
Let's say the following methods are defined:
Create Request
Starts a thread that takes five minutes, but immediately returns a session ID that the process generates; the thread uses this ID to report to the database when it has completed.
Check Status
Accepts a session id and checks the database to see if the process has completed.
I have to think that there is a better way to "manage" the threads running, however, my requirements state that the user should receive an immediate response from the REST service upon issuing a request.
I am using the WCF Message property to return XML to the browser, and as this application can be called from any programming language, I can't use classic WCF and callbacks (I think; correct me if I am wrong).
Sometimes I run into an issue where an error occurs and the IsComplete flag never gets written to the database, and therefore the "Check Status" method says it's processing forever.
Does anyone have any ideas about what is normally done and what can be done in this situation?
Thanks!
Jeffrey Kevin Pry
Your service should return a 202 Accepted at the initial request with a way for the client to check the current status, either through the Location header or as part of the content.
As you indicate the client then polls the URL indicated to check the current status. I would also suggest adding a bit of cache time to this response in case a client just starts looping.
How you handle things on the server is up to you and in no way related to REST. For one thing, I would put all the logic that executes on the background thread in a try/catch so you can return an error status back if an error occurs, and possibly retry the action depending on the circumstances.
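A minimal sketch of that in WCF REST terms (the helpers and the status URL are hypothetical): kick the work off on a background thread wrapped in a try/catch, return 202 Accepted, and point the client at a status URL via the Location header:

using System;
using System.Net;
using System.ServiceModel.Channels;
using System.ServiceModel.Web;
using System.Threading.Tasks;

public class RequestService
{
    public Message CreateRequest(string requestXml)
    {
        string sessionId = Guid.NewGuid().ToString();

        Task.Factory.StartNew(() =>
        {
            try
            {
                ProcessRequest(sessionId, requestXml);   // the five-minute job
                MarkComplete(sessionId);                 // sets IsComplete in the database
            }
            catch (Exception ex)
            {
                MarkFailed(sessionId, ex);               // so "Check Status" can report the error
            }
        });

        var response = WebOperationContext.Current.OutgoingResponse;
        response.StatusCode = HttpStatusCode.Accepted;                     // 202
        response.Headers[HttpResponseHeader.Location] = "/status/" + sessionId;

        return WebOperationContext.Current.CreateTextResponse(
            "<status>accepted</status>", "text/xml");
    }

    // ProcessRequest, MarkComplete, and MarkFailed are hypothetical helpers.
    private void ProcessRequest(string sessionId, string xml) { }
    private void MarkComplete(string sessionId) { }
    private void MarkFailed(string sessionId, Exception ex) { }
}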
I implemented a similar process for importing/processing of large files and, to be honest, I have never had a problem. Perhaps resolving the reason that the IsComplete never gets set will make this more resilient.
Not much of an answer, but still..