Users will be able to configure a printer/scanner through a web application.
I have a Windows service running on the client machine that communicates with the cloud database through an API, gets the printer/scanner details, and configures the devices accordingly on the local network.
The service is configured to run every 30 minutes, so when a user modifies a printer/scanner property in the web application, the update reaches the client machine only after the service's next run. The maximum delay is therefore 30 minutes.
The Windows service cannot invoke the API more frequently, as that would put too much load on the web server. Updates don't happen often, but when they do, the customer expects their local network printer/scanner to pick up the new configuration immediately.
So the question is: how can I effectively get the cloud data down to the local service so that the two stay in sync as quickly as possible?
Please share if there is any other way to achieve this.
I've heard about message queues/ClickOnce but I am not sure how they could fit here.
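For example, a message queue could turn the 30-minute poll into a near-instant push. Below is a minimal sketch, assuming Azure Service Bus with the older Microsoft.ServiceBus.Messaging client; the queue name and connection string are illustrative, not part of my current setup.

using Microsoft.ServiceBus.Messaging;

class ConfigChangeListener
{
    public void Start()
    {
        // The web app would enqueue a message whenever a user saves a change.
        var client = QueueClient.CreateFromConnectionString(
            "Endpoint=sb://<namespace>...", "printer-config-updates");

        // Fires as soon as a notification arrives, so the local network
        // is updated within seconds instead of up to 30 minutes later.
        client.OnMessage(message =>
        {
            string printerId = message.GetBody<string>();
            // Fetch the full configuration for this one device via the
            // existing API and apply it locally (application-specific).
        });
    }
}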
I am currently creating a Web App that receives updates via GET requests from an ESP8266. The ESP8266 sends updates every 5 minutes, and each update puts a time log in the database through the Web App. The Web App then checks whether the ESP8266 is still "ON" by checking whether it has logged data within the last 5 minutes. However, this only works while the Web App is open in a browser.
Is there a way that even if the Web App is not accessed by any client (browser), it can still check whether the ESP8266 has logged within that 5-minute window? I am thinking of something that runs freely on the server and continuously checks whether the ESP8266 is still "ON", because if the ESP8266 is turned OFF it can no longer send data to the Web App, and the server should automatically update the database to mark the ESP8266 as OFF.
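For what it's worth, here is a minimal sketch of the kind of server-side check I have in mind, written as a small console program that could run as a Windows service or scheduled task (the table and column names are assumptions):

using System;
using System.Data.SqlClient;
using System.Threading;

class DeviceWatchdog
{
    static Timer _timer;

    static void Main()
    {
        // Re-check every minute; mark the device OFF if it has not
        // logged anything in the last 5 minutes.
        _timer = new Timer(_ => CheckDevice(), null,
                           TimeSpan.Zero, TimeSpan.FromMinutes(1));
        Thread.Sleep(Timeout.Infinite);
    }

    static void CheckDevice()
    {
        using (var conn = new SqlConnection("your connection string"))
        {
            conn.Open();
            var cmd = new SqlCommand(
                @"UPDATE Devices SET Status = 'OFF'
                  WHERE LastSeenUtc < DATEADD(minute, -5, GETUTCDATE())", conn);
            cmd.ExecuteNonQuery();
        }
    }
}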
I am having issues running an API on a GoDaddy server. The API constantly sends requests to a website at a certain interval as soon as it starts to operate, driven by a timer created in Application_Start. For some reason, my API stops working after some time if no one makes a request to it. However, I need my API to work all the time, since I need a list of live data collected from another website. Below are the steps I take and the problem I encounter, in detail:
I created my Web API on Visual Studio 2013 written in C#.
I bought the server from Godaddy having Windows Deluxe hosting.
I uploaded my files to httpdocs folder of my server using ftp.
When I call my API by typing "mysite.com/myWebAPI/myList", it starts to work and initially returns an empty list (which is normal, I think).
Then I make the same request 2 seconds later (giving my API time to collect data), and the list I want is returned with live data collected from another website.
After this point, my API should not stop. It has to send a request to the website every X seconds and update the information in the list.
However, after 5 or 10 minutes with no incoming requests, my API stops; it therefore stops collecting information from the other website, and the list is no longer updated.
Then, if another request is made, it becomes active again and starts to work, but now my list is empty once again. This means the list was created all over again, which can only happen if Application_Start was called once more.
Note that when I run this Web API on my localhost server, it works perfectly. It does not stop, and it gathers the information correctly by sending requests to the website every X seconds. Even if I don't make any request for 30 minutes, it still returns the list I want when I finally send one.
So the question is: is there a way to fix this problem and make my API work all the time, without stopping, on a GoDaddy server with Windows Deluxe hosting?
I may have to do something with the IIS application pool, but I am not sure what.
Thank you for your help.
I'm not sure specifically about GoDaddy, but this is also common on Azure App Service, where it is resolved by enabling the "Always On" feature, which presumably automates the job of pinging the API every x minutes. If GoDaddy has a similar feature, enabling it may solve this problem.
Create a scheduled task to ping your API before the IIS idle timeout.
http://www.codeproject.com/Articles/12117/Simulate-a-Windows-Service-using-ASP-NET-to-run-sc
http://www.quartz-scheduler.net/
You can also set up Pingdom to ping your API.
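If GoDaddy has no built-in keep-alive, the scheduled-task idea above can be as small as this sketch (the URL is from the question; the interval must be shorter than the IIS idle timeout, 20 minutes by default):

using System;
using System.Net;

class KeepAlivePing
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Any request keeps the application pool (and its timer) alive.
            client.DownloadString("http://mysite.com/myWebAPI/myList");
            Console.WriteLine("Pinged at " + DateTime.Now);
        }
    }
}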
Currently working on a .NET solution for an application server. I'm using .NET 4.0 running on Windows Server 2008 R2 with IIS 7.5.
My requirements are:
The application server can run multiple Console applications at once on a schedule - Quartz.net looks like a really good solution to this problem - and is working well for me so far
The application server will also host a web application that will report on jobs (what time they ran, what they did, how long they took etc)
I would like to be able to restart the "service" that is running my jobs and trigger ad hoc jobs from the web interface.
The Service that is running my jobs needs to run all the time
Once this is live I will not have direct access to the machine to restart a Windows Service, but I could potentially set up IIS to be able to do this for me.
WCF Services look quite promising to me - but I'm not sure where to host one. My current project uses a WCF Service to run console applications using the Quartz.net plugin. Configuration for what to run and when to run it is stored in an Oracle database, and my WCF service connects directly to the database to retrieve this information (not sure if that is the intended use of WCF).
If I host the WCF Service in IIS / WAS then running the console applications might be a security concern, from what I've read. I can keep the WCF service running all the time using AppFabric, at least. Alternatively, I could host it in a Windows Service and let my web app consume the WCF service to report on the jobs. I'm concerned about using a Windows Service though, as I won't have direct maintenance access to this machine, and if it breaks I'm in trouble. I would really like to be able to do the maintenance from a web application. A Windows Service also feels a little unnecessary given it can be hosted from IIS.
So my question is - is a WCF Service the right approach to this problem or can anyone think of a better approach?
If a WCF service is a good approach - where should I host it so that I can perform maintenance via a web interface given I will not have direct access to the machine itself?
Should the WCF service be the one to start and schedule the jobs?
I think you're overengineering it, possibly.
The Problem: You have a web site which needs to start up jobs on an ad-hoc basis. Other jobs will be run to a fixed schedule. The web site will report on all/any of these jobs.
For running the scheduled jobs, a Windows Service using Quartz is indeed an ideal solution for the fixed-schedule part. However, to report on those jobs, the data must be collected by the Service and made available. A service can be set up to restart on failure, so you can guarantee that it will always be running (barring a minute or two while it restarts after a failure - and why should it fail?). However, any history will be lost unless the Service stores it somewhere it can be retrieved after a restart.
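As a rough illustration, the fixed-schedule part with Quartz.NET might look like the sketch below (2.x-era synchronous API; the job and cron expression are placeholders):

using Quartz;
using Quartz.Impl;

public class ReportJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Do the actual work, then write start/end times and the outcome
        // to the database so the web site can report on it later.
    }
}

public static class JobScheduler
{
    public static void Start()
    {
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<ReportJob>()
                                   .WithIdentity("reportJob")
                                   .Build();
        ITrigger trigger = TriggerBuilder.Create()
                                         .WithCronSchedule("0 0/15 * * * ?") // every 15 minutes
                                         .Build();
        scheduler.ScheduleJob(job, trigger);
    }
}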
The simpler solution for getting the history to the web site is for the Service to write its data to a database. Then a restart doesn't matter: all the history has already been saved, and the web site can read it at any time.
Similarly, if the web site talks directly to the Service (as a WCF Service or otherwise), what happens if the service is not currently running? Requests fail until the restart completes. Frustrating for the user. Alternatively, the web site writes the request into the database. The service monitors the database for requests and starts jobs appropriately when it sees a new one. If a request is written while the service is not running, it will see the request(s) in the DB when it restarts and execute them.
So I think using a WCF service is overkill, and actually introduces some problems: persistence of history, and what to do about requests made while the service is down. These problems don't arise if you go the way I've described.
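A minimal sketch of that monitoring loop, called periodically from the service (the table and column names are assumptions):

using System.Data.SqlClient;

public static class JobRequestPoller
{
    public static void PollForRequests(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "SELECT Id, JobName FROM JobRequests WHERE Status = 'Pending'", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    string jobName = reader.GetString(1);
                    // Trigger the matching Quartz job here, then mark the
                    // row 'Started' so it is not picked up twice.
                }
            }
        }
    }
}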
Cheers -
I created a class with a timer that performs some actions every 10 minutes. However, the class is updated frequently, and it must run every 10 minutes all day!
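To give an idea, the class looks roughly like this sketch (DoWork stands in for the real actions):

using System.Timers;

public class RecurringWorker
{
    private readonly Timer _timer = new Timer(10 * 60 * 1000); // 10 minutes

    public void Start()
    {
        _timer.Elapsed += (s, e) => DoWork();
        _timer.Start();
    }

    private void DoWork()
    {
        // The frequently-updated logic lives here.
    }
}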
How can I ensure that the server is using the latest version without interrupting the schedule?
I am considering creating a Windows service, but it is inconvenient to update on the server.
You have to:
Stop the service
Un-register the service
Uninstall service
Uninstall the application
Install new application
Register the new service
Start the new service
tl;dr: I am looking for an easy way to deploy a Windows service to the server, or some alternative, that is simple and convenient to update.
I am also considering a WCF service, though they are not really made for this purpose; I believe a WCF service can easily be published directly to IIS on the VPS.
Maxim
I want to be able to configure the Azure Load Balancer Emulator in such a way that two consecutive calls to the web app will always result in calls to different instances.
How can I do that?
And how can I verify that the load balancer is working as expected? By using HttpContext.Current.Request.Url and seeing if the endpoint port changes?
Thanks in advance
The default load balancer available to your Windows Azure Web and Worker roles is a software load balancer and is not very configurable; however, it does work in a round-robin fashion. If you want to test this behavior, this is what you need to do:
Create two (or more) instances of your service with RDP access enabled so you can RDP to both instances
RDP into both instances and run NETMON or any other network monitoring tool on each.
Now access your Windows Azure web application from your desktop
You need to understand that when a network connection is made from your desktop, it is kept alive based on network settings (60 seconds by default), so you need to wait until that timeout has passed before accessing your Windows Azure web application again.
When you access the application again, you can verify that the second request went to the other instance. Be sure to wait past the connection timeout; otherwise your requests will keep being handled by the same instance.
Note: If you don't want to use RDP, you can also create a test ASP.NET page with code that reports which instance served it. The best way to do this is to read the instance ID, as below:
string instanceId = RoleEnvironment.CurrentRoleInstance.Id; // note: Id is a string, not an int
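A test page using it could be as simple as this sketch (the handler name is illustrative):

using System.Web;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WhichInstanceHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Two refreshes spaced past the connection timeout should show
        // different instance IDs if round-robin is working.
        context.Response.ContentType = "text/plain";
        context.Response.Write("Served by: " + RoleEnvironment.CurrentRoleInstance.Id);
    }

    public bool IsReusable { get { return true; } }
}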
If you want more control over Windows Azure load balancing, I would suggest using Windows Azure Traffic Manager, which will help you route traffic to your site via round-robin, performance, or failover (backup) based scenarios. More info on using Traffic Manager is in this article.