I implemented a WCF service that performs a long-running task. It needs to provide the client with notifications about the current progress of that task. It is working well so far, but the problem is:
When the user closes the client app and then opens it again, the client app should resume receiving progress updates from the server about the task that is still running.
There can be multiple tasks started by different users at the same time.
So for example, the client starts a process named "proc1" that will take 3 hours, and after 15 minutes the user closes the app. The process continues to run on the server. After another 30 minutes the user starts the app again, and the client app needs to start receiving notifications about the process it started earlier. How can this be accomplished?
Thanks in advance.
You should save a process id on the client side that can be used later to query the progress of that process. When the client starts again, use that saved id to re-subscribe to the notifications for the running task.
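To make that concrete, here is a minimal sketch of the idea, assuming a duplex WCF contract; the interface and method names (ILongTaskService, SubscribeToProgress, etc.) are illustrative, not from the original service:

```csharp
using System.IO;
using System.ServiceModel;

// Hypothetical duplex WCF contract: the server hands out a task id when the work
// starts, and a client can re-subscribe to progress callbacks for that id later.
[ServiceContract(CallbackContract = typeof(IProgressCallback))]
public interface ILongTaskService
{
    [OperationContract]
    string StartTask(string name);            // returns an id the client persists

    [OperationContract]
    void SubscribeToProgress(string taskId);  // re-attach after the app restarts
}

public interface IProgressCallback
{
    [OperationContract(IsOneWay = true)]
    void ReportProgress(string taskId, int percentComplete);
}

public static class ClientUsage
{
    // First run: start the task and save its id locally.
    public static void StartAndRemember(ILongTaskService client)
    {
        var taskId = client.StartTask("proc1");
        File.WriteAllText("currentTask.id", taskId);
    }

    // After the app is reopened: read the saved id and re-subscribe.
    public static void Resume(ILongTaskService client)
    {
        var savedId = File.ReadAllText("currentTask.id");
        client.SubscribeToProgress(savedId);
    }
}
```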
I'm using SignalR to keep a live connection between clients and the Web APIs on the server. We have a lot of groups, each containing three types of clients: an Android app, a web app, and a Raspberry Pi device (IoT). They have to keep a specific status updated all the time (immediately after the status changes in the database) and check the server for GET and POST commands and responses.
But I think keeping a live connection (WebSocket) open all the time will drain the battery of the Raspberry Pi, so I'm looking for a way to handle the clients' connections manually on an arbitrary time schedule.
Can I open and close clients' connections from the server (hub) on an arbitrary schedule and repeat it every day?
For example, I want the connection of a group of clients to stay open from 3 to 5 pm, and after that to open only every 5 minutes, stay alive for 1 second, and repeat.
Maybe it sounds stupid, but I want SignalR on a scheduled time span. Is there a better way?
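For what it's worth, a schedule like this would have to be driven from the client side, since a SignalR server can close a connection but cannot open one to a device. A rough sketch, assuming the ASP.NET Core SignalR client (Microsoft.AspNetCore.SignalR.Client) and a hypothetical hub URL:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

class ScheduledSignalRClient
{
    // Hypothetical hub URL; replace with the real endpoint.
    private readonly HubConnection _connection = new HubConnectionBuilder()
        .WithUrl("https://example.com/statusHub")
        .Build();

    public async Task RunAsync()
    {
        while (true)
        {
            var now = DateTime.Now;
            if (now.Hour >= 15 && now.Hour < 17)
            {
                // Busy window (3-5 pm): keep the connection open continuously.
                if (_connection.State == HubConnectionState.Disconnected)
                    await _connection.StartAsync();
                await Task.Delay(TimeSpan.FromSeconds(30));
            }
            else
            {
                // Outside the window: connect briefly every 5 minutes.
                await _connection.StartAsync();
                await Task.Delay(TimeSpan.FromSeconds(1));   // stay connected ~1 second
                await _connection.StopAsync();
                await Task.Delay(TimeSpan.FromMinutes(5));
            }
        }
    }
}
```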
We have an ASP.NET MVC app running behind a gateway policy that terminates any web request that takes more than 5 minutes. One of the features is exporting some data, and it has been running just over 5 minutes. Would SignalR help? Would having a persistent connection between the client and server be enough for the gateway to consider the request active and not terminate it?
We faced the same issue in our project, where we have to process some data in the API and the UI can't wait that long for the response.
We use SignalR to notify the requesting UI/client when the data has been processed successfully.
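Roughly, the pattern looks like the sketch below, shown here with ASP.NET Core SignalR; the hub name, method names and export code are placeholders, not the actual project code. The HTTP request returns immediately, the export runs in the background, and the hub notifies the caller when it finishes.

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;

public class ExportHub : Hub { }   // clients connect here and listen for "ExportFinished"

public class ExportController : Controller
{
    private readonly IHubContext<ExportHub> _hub;
    public ExportController(IHubContext<ExportHub> hub) => _hub = hub;

    [HttpPost]
    public IActionResult StartExport(string connectionId)
    {
        // Return right away so the gateway's 5-minute limit never applies to this request.
        _ = Task.Run(async () =>
        {
            var fileUrl = await RunLongExportAsync();               // the > 5 minute work
            await _hub.Clients.Client(connectionId)
                      .SendAsync("ExportFinished", fileUrl);        // tell the caller it's done
        });
        return Accepted();
    }

    // Placeholder for the real export; in production this would run in a proper background job.
    private Task<string> RunLongExportAsync() => Task.FromResult("/downloads/export.csv");
}
```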
I have a C# Windows service installed on a customer's server that does the following tasks:
Listens to a SQL Broker service for any insert/update on 3 tables and then POSTs the data to an API method so the remote server stays up to date (using SqlTableDependency).
Runs a polling method every 5 minutes to verify/validate that these 3 tables hold the same data on the remote server (in case the SQL Broker service is not working).
Starts a self-hosted Web API server (this doesn't work because the customer doesn't allow the server to be exposed to the Internet).
This last self-hosted task was implemented so that an outside application could ask the customer's server to perform some updates on a table.
I would like to know if there is a way to subscribe the Windows service to a real-time broadcast engine/service such as Pusher or AWS SQS. The idea is that I could trigger tasks in the remote customer's Windows service from an outside application.
Is this doable? If it is, I could even get rid of the polling task in the Windows service, because the service could push information to the API based on an event that I trigger from an external source.
This might not be the best workaround, but it seems to be working pretty well.
What I did was implement an infinite loop in the Windows service with a long-polling call to an AWS SQS queue, with the queue's maximum receive message wait time set to 20 seconds. This reduces both the number of empty response messages and the cost, since the service makes only one request to SQS every 20 seconds.
If a message arrives while a long-poll request is being handled, the long poll returns immediately and the message is received.
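A trimmed-down sketch of that loop, using the AWS SDK for .NET; the queue URL and the message handling are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Amazon.SQS;
using Amazon.SQS.Model;

class SqsListener
{
    private static readonly AmazonSQSClient Sqs = new AmazonSQSClient();
    private const string QueueUrl =
        "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"; // placeholder

    public static async Task RunAsync()
    {
        while (true)   // infinite loop inside the Windows service
        {
            // Long poll: the call waits up to 20 seconds, but returns
            // immediately as soon as a message arrives.
            var response = await Sqs.ReceiveMessageAsync(new ReceiveMessageRequest
            {
                QueueUrl = QueueUrl,
                WaitTimeSeconds = 20,
                MaxNumberOfMessages = 1
            });

            foreach (var message in response.Messages)
            {
                HandleCommand(message.Body);                                   // trigger the requested task
                await Sqs.DeleteMessageAsync(QueueUrl, message.ReceiptHandle); // remove it from the queue
            }
        }
    }

    private static void HandleCommand(string body) => Console.WriteLine(body); // placeholder
}
```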
Because the message frequency is not that high (let's say I receive one message every 20 seconds), that works out to:
3 request messages every minute
180 per hour
4,320 per day
about 129,600 per month (30 days)
And AWS pricing is $0.40 per 1 million requests, so that will be practically free.
I've created a WCF solution where I can kick off several simultaneous long-running processes that report progress, with the option to cancel each one individually. I can disconnect the client and the processes keep running, and I can start a new client and add an additional process and they all keep running simultaneously. That part works great.
My question is: how can I connect a second client to all of the already-running callbacks, and also have any processes added from the second client show up in the first client (if it's still running)?
The WCF publish/subscribe pattern is exactly what I needed; I just didn't know how to ask for it.
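For anyone who finds this later, a bare-bones sketch of that pattern with a duplex callback contract (all names are illustrative): each client calls Subscribe, the service keeps the callback channels, and progress updates are broadcast to every subscriber, so a second client sees the processes the first one started.

```csharp
using System.Collections.Generic;
using System.ServiceModel;

public interface IProcessCallback
{
    [OperationContract(IsOneWay = true)]
    void OnProgress(string processId, int percent);
}

[ServiceContract(CallbackContract = typeof(IProcessCallback))]
public interface IProcessService
{
    [OperationContract]
    void Subscribe();

    [OperationContract]
    string StartProcess(string name);
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class ProcessService : IProcessService
{
    private readonly List<IProcessCallback> _subscribers = new List<IProcessCallback>();

    public void Subscribe()
    {
        // Remember the calling client's callback channel.
        var callback = OperationContext.Current.GetCallbackChannel<IProcessCallback>();
        lock (_subscribers) _subscribers.Add(callback);
    }

    public string StartProcess(string name)
    {
        var id = System.Guid.NewGuid().ToString();
        // ... start the long-running work and call Publish(id, percent) as it progresses ...
        return id;
    }

    private void Publish(string processId, int percent)
    {
        lock (_subscribers)
            foreach (var subscriber in _subscribers)
                subscriber.OnProgress(processId, percent);   // every connected client gets the update
    }
}
```

In a real service you would also remove dead callback channels when a client disconnects.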
I have a problem with the .NET client in a Windows service. After some time it freezes.
The use case is this:
A user uploads one or more files to our website. The service detects this and starts to process the files, sending a signal to the website when processing starts and when it ends.
The service checks for new files every 10 seconds. In each iteration we open a new connection to the server and then stop it again; most of the time no messages are sent.
I suspect the connect/disconnect cycle is causing this. Is it better to open the connection when the service starts and then reuse that connection in every iteration? The service must be able to run without restarts for months.
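Assuming this is the SignalR .NET client, one way to avoid the connect/disconnect churn is to build the connection once when the service starts and reuse it in every iteration. A sketch with the ASP.NET Core client (Microsoft.AspNetCore.SignalR.Client); the hub URL and method names are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

class FileProcessingService
{
    private HubConnection _connection;

    public async Task StartAsync()
    {
        // One connection for the lifetime of the service, with automatic reconnect.
        _connection = new HubConnectionBuilder()
            .WithUrl("https://example.com/processingHub")   // placeholder URL
            .WithAutomaticReconnect()
            .Build();

        await _connection.StartAsync();

        while (true)
        {
            foreach (var file in FindNewFiles())
            {
                await _connection.InvokeAsync("ProcessingStarted", file);   // placeholder hub method
                Process(file);
                await _connection.InvokeAsync("ProcessingFinished", file);  // placeholder hub method
            }
            await Task.Delay(TimeSpan.FromSeconds(10));   // check for new files every 10 seconds
        }
    }

    private string[] FindNewFiles() => Array.Empty<string>();   // placeholder
    private void Process(string file) { }                       // placeholder
}
```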