Receive information once the server is updated from ASP.NET Web API - C#

I am new to ASP.NET. I started with Web API and later plan to learn MVC and SPA development.
I am developing a sample API that is accessed by two kinds of desktop application.
App1 uses a POST request to update data through the API, and App2 uses a GET request (run every 30 seconds) to fetch the updated information.
Is there any way the server can send a message to App2 so that it can get the updated data without having to check the server every 30 seconds?
At present I am running a DispatcherTimer to send the request periodically with a 30-second delay.
Thank you in advance.
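One common approach, and the one the related answers below describe, is to have the server push a notification over ASP.NET SignalR when App1 posts new data, so App2 no longer needs a timer. A minimal sketch, assuming the Microsoft.AspNet.SignalR package and illustrative names such as Item, UpdatesHub and ItemsController:

    using System.Web.Http;
    using Microsoft.AspNet.SignalR;

    // Placeholder model for the data App1 posts.
    public class Item
    {
        public int Id { get; set; }
    }

    // Hub that App2 connects to; no server-side methods are needed just to receive broadcasts.
    public class UpdatesHub : Hub
    {
    }

    // Web API controller that App1 posts updates to.
    public class ItemsController : ApiController
    {
        [HttpPost]
        public IHttpActionResult Post(Item item)
        {
            // ... save the item to the data store here ...

            // Push a notification to every connected client instead of waiting for them to poll.
            var hub = GlobalHost.ConnectionManager.GetHubContext<UpdatesHub>();
            hub.Clients.All.dataUpdated(item.Id);

            return Ok();
        }
    }

App2 would then open a HubConnection (Microsoft.AspNet.SignalR.Client), subscribe to the dataUpdated event, and fetch the new data only when that event fires, instead of running the 30-second DispatcherTimer.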

Related

SignalR to prevent gateway timeout

We have an ASP.NET MVC app running behind a gateway policy that terminates any web request that runs longer than 5 minutes. One of the features exports some data, and it has been running just over 5 minutes. Would SignalR help? Would having a persistent connection between the client and the server be enough for the gateway to consider it active and not terminate it?
We faced the same issue in our project, where the API has to process some data and the UI can't wait that long for the response.
We use SignalR to notify the requesting UI/client when the data has been processed successfully.
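For illustration, a rough sketch of that pattern with ASP.NET SignalR 2; ExportHub, ExportNotifier and exportCompleted are illustrative names, and the requesting client's connection id is assumed to have been captured when the export was started:

    using Microsoft.AspNet.SignalR;

    // Hub the browser connects to before kicking off the export.
    public class ExportHub : Hub
    {
    }

    public class ExportNotifier
    {
        // Called by the background export job once processing has finished.
        public void NotifyExportCompleted(string connectionId, string downloadUrl)
        {
            var hub = GlobalHost.ConnectionManager.GetHubContext<ExportHub>();

            // Notify only the client that requested this export.
            hub.Clients.Client(connectionId).exportCompleted(downloadUrl);
        }
    }

The export itself is then started by a quick request that returns immediately, so no single HTTP request has to stay open past the gateway's 5-minute limit.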

C# Windows Service - subscribe to realtime events (notifications) from an external source

I have a C# Windows service installed on a customer's server that does the following tasks:
Listens to a SQL Broker service for any insert/update in 3 tables and then POSTs the data to an API method so the remote server stays up to date (using SqlTableDependency).
Polls every 5 minutes to verify/validate that these 3 tables have the same data on the remote server (in case the SQL Broker service is not working).
Starts a self-hosted Web API server (this doesn't work because the customer doesn't allow the server to be exposed to the Internet).
This last self-hosted task was implemented so that an application could ask the customer's server to perform some updates on a table.
I would like to know if there is a way to subscribe the Windows service to a realtime broadcast engine/service such as Pusher or AWS SQS, etc. The idea is that I can trigger tasks in the remote customer's Windows service from an outside application.
Any idea whether this is doable? If I can do this, I can even get rid of the polling task in the Windows service, because the service can push information to the API based on an event that I trigger from an external source.
This might not be the best workaround, but it seems to be working pretty well.
What I did was implement an infinite loop in the Windows service that makes a long-polling call to an AWS SQS queue, with the queue's maximum receive message wait time set to 20 seconds. This let me reduce the number of empty responses and also reduce the cost of requests to the SQS service to one every 20 seconds.
If a message arrives while the long poll is being handled, the long poll returns immediately and the message is received.
Because messages are not sent very frequently, say I receive at most one every 20 seconds, that works out to:
3 receive requests per minute
180 per hour
4,320 per day
129,600 per month
And AWS pricing is $0.40 per 1 million requests, so it is practically free.
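For reference, here is a minimal sketch of that long-polling loop using the AWS SDK for .NET (AWSSDK.SQS); the queue URL and the message handling are placeholders:

    using System.Threading;
    using System.Threading.Tasks;
    using Amazon.SQS;
    using Amazon.SQS.Model;

    public class SqsListener
    {
        // Runs inside the Windows service; keeps one long-poll request open at a time.
        public async Task PollQueueAsync(string queueUrl, CancellationToken cancellationToken)
        {
            var sqs = new AmazonSQSClient(); // credentials/region come from the service configuration

            while (!cancellationToken.IsCancellationRequested)
            {
                // Long poll: returns as soon as a message arrives, otherwise waits up to 20 seconds.
                var response = await sqs.ReceiveMessageAsync(new ReceiveMessageRequest
                {
                    QueueUrl = queueUrl,
                    WaitTimeSeconds = 20,
                    MaxNumberOfMessages = 10
                }, cancellationToken);

                foreach (var message in response.Messages)
                {
                    // ... trigger the requested task in the service here ...
                    await sqs.DeleteMessageAsync(queueUrl, message.ReceiptHandle, cancellationToken);
                }
            }
        }
    }

With WaitTimeSeconds set to 20 (the SQS maximum), an idle service makes roughly three requests per minute, which matches the cost estimate above.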

Azure web app load balancer timeout issue, getting HTTP status 500 with sub-status 121

We have a web site that loads and analyzes Excel data and reports back to the user. The process of analyzing the Excel data takes, on average, over 5 minutes (depending on the data), during which time the client-server communication seems to be idle.
This web site is hosted on Azure as a web app, and it seems that Azure has a load balancing timeout, according to the following link:
https://azure.microsoft.com/en-us/blog/new-configurable-idle-timeout-for-azure-load-balancer/
The link mentions that:
In its default configuration, Azure Load Balancer has an ‘idle timeout’ setting of 4 minutes. This means that if you have a period of inactivity on your TCP or HTTP sessions for more than the timeout value, there is no guarantee to have the connection maintained between the client and your service.
Because of this issue, the end user constantly gets an HTTP status of 500 with a sub-status of 121.
Currently we can't re-architect the system, nor can we change from deploying as a web app.
We have tried sending jQuery AJAX requests to the server at a set interval, but this doesn't seem to be working.
The article above talks about keeping the TCP session alive using ServicePoint.SetTcpKeepAlive(), but we have no idea how to implement this in an MVC web application (we did not find any samples on the net either).
We really need to resolve this issue because it could make or break our project, so any help is appreciated. Specifically, any working sample code using ServicePoint.SetTcpKeepAlive() in an MVC application would be greatly appreciated.
Thanks in advance.
UPDATE
I tried out what Irb mentioned but still no luck. As you can see in the attached image, I call KeepSessionAlive repeatedly. On every call to KeepSessionAlive I access a session variable, making sure the session does not time out. But the call to Save still returns 500. Again, this only happens in Azure.
A Web API method that returns JSON will not always keep the session alive. If you request something that invokes session state, you can call it on a timed interval from the client. Something as simple as requesting an image through a JPEG web handler could work. Here is a link to something similar.
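For what it's worth, a minimal sketch of the kind of session-touching endpoint that answer describes, with illustrative names; the client would call it on a timer (for example every minute or two) while the long-running analysis is in flight:

    using System;
    using System.Web.Mvc;

    public class KeepAliveController : Controller
    {
        // Called periodically from the browser while the long-running request is running.
        public ActionResult KeepSessionAlive()
        {
            // Touching session state keeps the session (and this part of the pipeline) active.
            Session["KeepAliveTimestamp"] = DateTime.UtcNow;

            return Json(new { alive = true }, JsonRequestBehavior.AllowGet);
        }
    }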

Why does Windows Azure not load balance during Thread.Sleep()?

I am playing with the Windows Azure emulator, running an MVC website with a single controller method that calls Thread.Sleep(5000) before it returns.
On the client I run a loop that sends a POST request to the controller every 1000 ms, receives a reply from the server with the RoleEnvironment.CurrentRoleInstance.Id, and prints it on the screen.
I have 4 instances of my MVC worker role running.
I understand that the Connection: keep-alive HTTP header can keep the browser from making a request to a different instance, because an existing connection is open.
But still, even when loading my site in multiple browser windows, it keeps hanging while waiting for the Thread.Sleep(), and then (most of the time) continues to get replies from the same instance.
Why doesn't Azure's load balancer send subsequent requests to a non-busy worker role instance? Do I need to manually mark it as busy?
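For reference, a rough reconstruction of the controller action being described (the controller and action names are illustrative):

    using System.Threading;
    using System.Web.Mvc;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class HomeController : Controller
    {
        [HttpPost]
        public ActionResult Ping()
        {
            // Simulate a slow request that blocks the thread handling it.
            Thread.Sleep(5000);

            // Report which role instance actually served the request.
            return Content(RoleEnvironment.CurrentRoleInstance.Id);
        }
    }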
You mentioned using the emulator, which doesn't handle load balancing the same way as Azure's real load balancers. See this post for details about the differences. I don't know exactly what is going on in your case, but I'd suggest trying this out in Azure to see whether you get the behavior you're expecting.

How to use SignalR to notify web clients from ASP.NET MVC 3 that MSMQ tasks were completed

How would one use SignalR to implement notifications in a .NET 4.0 system that consists of an ASP.NET MVC 3 application (which uses forms authentication), a SQL Server 2008 database and an MSMQ WCF service (hosted in WAS) to process data? The runtime environment consists of IIS 7.5 running on Windows Server 2008 R2 Standard Edition.
I have only played with the samples and do not have extensive knowledge of SignalR.
Here is some background
The web application accepts data from the user and adds it to a table. It then calls a one-way operation (with the database key) of the WCF service to process the data (a task). The web application returns a page telling the user the data was submitted and that they will be notified when processing is done. The user can look at an "index" page and see which tasks are completed, failed or in progress. They can continue to submit more tasks (which are independent of previous data). They can close their browser and come back later.
The MSMQ-based WCF service reads the record from the database and processes the data. This may take anything from milliseconds to several minutes. When it's done processing the data, the record is updated with the corresponding status (completed or failed) and the results.
Most of the time the WCF service is not performing any processing, but when it is, users generally want to know as soon as possible when it's done. The user will still use other parts of the web application even if they don't have data being processed by the WCF service.
This is what I have done
In the primary navigation bar, I have an indicator (similar to Facebook or Google+) to notify the user when the status of their tasks has changed. When they click on it, they get a summary of what was done and can view the results if they wish.
Using jQuery, I poll the server for changes. The controller action checks whether any processes were modified (completed or failed) and returns them; otherwise it waits a couple of seconds and checks again without returning to the client. To avoid a timeout on the client, it returns after 30 seconds if there were no changes. The jQuery script then waits a while and tries again.
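For context, that polling action looks roughly like the sketch below; ITaskRepository and GetModifiedTasks are hypothetical stand-ins for the status check:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading;
    using System.Web.Mvc;

    // Hypothetical abstraction over the task/status table.
    public interface ITaskRepository
    {
        IEnumerable<object> GetModifiedTasks(string userName);
    }

    public class TasksController : Controller
    {
        private readonly ITaskRepository _taskRepository;

        public TasksController(ITaskRepository taskRepository)
        {
            _taskRepository = taskRepository;
        }

        // Returns as soon as a task changes state, or after ~30 seconds with an empty result
        // so the client-side jQuery call does not time out.
        public ActionResult CheckForUpdates()
        {
            var deadline = DateTime.UtcNow.AddSeconds(30);

            while (DateTime.UtcNow < deadline)
            {
                var changed = _taskRepository.GetModifiedTasks(User.Identity.Name).ToList();
                if (changed.Any())
                    return Json(changed, JsonRequestBehavior.AllowGet);

                Thread.Sleep(TimeSpan.FromSeconds(2)); // wait a couple of seconds and check again
            }

            return Json(new object[0], JsonRequestBehavior.AllowGet);
        }
    }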
The problems
Performance degrades with every user that views a page, even if they are not doing anything in particular. We've also noticed that the memory usage of Firefox 7+ and Safari increases over time.
Using SignalR
I'm hoping that switching to SignalR can reduce polling and thus reduce resource requirements, especially when nothing task-wise has changed in the database. I'm having trouble getting the WCF service to notify clients that it's done processing a task, given that the application uses forms-based authentication.
By asking this question, I hope someone will give me better insight into how they would redesign my notification scheme using SignalR, if at all.
If I understand correctly, you need a way of associating a task with a given user/client so that you can tell the client when their task has completed.
SignalR API documentation tells me you can call JS methods for specific clients based on the client id (https://github.com/SignalR/SignalR/wiki/SignalR-Client). In theory you could do something like:
Store the client id used by SignalR as part of the task metadata, and queue the task as normal.
When the task is processed and de-queued, update your database with the status.
Then, using the client id stored as part of that task, use SignalR to send that client a notification.
You should be able to retrieve the connection that your client is using and send them a message:
string clientId = processedMessage.ClientId; // Stored when you originally queued it.
IConnection connection = Connection.GetConnection<ProcessNotificationsConnection>();
connection.Send(clientId, "Your data was processed");
This assumes you mapped this connection and the client used that connection to start the data processing request in the first place. Your "primary navigation bar" has the JS that started the connection to the ProcessNotificationsConnection endpoint you mapped earlier.
EDIT: From https://github.com/SignalR/SignalR/wiki/Hubs
public class MyHub : Hub
{
    public void Send(string data)
    {
        // Invoke a method on the calling client
        Caller.addMessage(data);

        // Similar to above, the more verbose way
        Clients[Context.ClientId].addMessage(data);

        // Invoke addMessage on all clients in group foo
        Clients["foo"].addMessage(data);
    }
}
