We want to synchronize the data in our application with the data of an external service (such as accountancy software). This synchronization should be executed every night and whenever the customer wants to.
I am aware that long-running threads don't belong in web applications and that this synchronization should be executed within an external Windows service. But the downside of this approach is that it becomes harder to deploy and maintain, since the application can be installed on the customer's web server too.
Is it possible to completely integrate this synchronization with just a class library project within my solution, which would start up at the Application_Start event?
Since your application is hosted on IIS, it is maintained by the application pool process. If you create an additional module for your task, it will be running within the context of that same process. You have to be sure the process is still alive in the middle of the night, when the application is not being used, in order to perform the synchronization you want. You can use Quartz.NET to schedule your sync task.
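For example, a rough sketch of wiring that up with Quartz.NET (assuming the 2.x fluent API and a hypothetical SyncJob class that wraps your synchronization code) could look like this:

```csharp
using Quartz;
using Quartz.Impl;

// Hypothetical job wrapping the synchronization logic.
public class SyncJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // ... call your synchronization code here ...
    }
}

public static class SyncScheduler
{
    // Call this from Application_Start.
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<SyncJob>()
                                   .WithIdentity("nightlySync")
                                   .Build();

        // Fire every night at 02:00; adjust the cron expression as needed.
        ITrigger trigger = TriggerBuilder.Create()
                                         .WithIdentity("nightlySyncTrigger")
                                         .WithCronSchedule("0 0 2 * * ?")
                                         .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
```

Keep in mind that this only runs as long as the worker process itself is alive, which is exactly the caveat above.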
But still, I think a much better idea is to perform the synchronization from a Windows service. The service should communicate with the application, for example through a database where it logs its current activity. That gives you the possibility to monitor the service's state from the web by connecting to that database. I know a service requires some additional administration effort, but it will be much more reliable and secure. You can also add the ability to start the service from your web application (if the pool process user has access rights to the Windows service) to overcome, or at least minimize, the administration effort connected with restarting the service after a failure.
I've written such functionality, so just to give you an overall look at what I mean by web monitoring of such an external service, check the screenshot below. It can be written with AJAX support to achieve more responsiveness (a polling mechanism), which will be convenient for the end user.
How to host a Windows Service in IIS and keep that service running like it is running on Windows?
Could I use some feature of a WCF service?
I don't have access to Windows itself, only to IIS. Inside that service I'll create a thread which, at a scheduled time, will process some data.
In short, you can't.
A more detailed answer is that there are 2 problems:
IIS worker processes are launched only when an HTTP request comes in. This means you can't start your service together with the system.
IIS worker processes are recycled (i.e. restarted) under several conditions. For example, a worker process is restarted if no HTTP request comes in for a long time. This means you can't control when your service is shut down, unless you have access to the application pool recycling configuration. Keep in mind that the recycling logic only ensures that all pending HTTP requests complete; it does not wait for background threads to finish.
You can come up with a partial solution this way (sketched below):
Create a WCF service method that checks if your long-running thread is alive and if not, starts it.
Create a very simple Windows service that periodically (once every 5 seconds) calls that method. Deploy the service somewhere, e.g. on your own machine.
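A minimal sketch of both halves; the contract name, the address and the worker body are all assumptions:

```csharp
using System;
using System.ServiceModel;
using System.Threading;

// Hosted inside the web application.
[ServiceContract]
public interface IKeepAlive
{
    [OperationContract]
    void EnsureWorkerRunning();
}

public class KeepAliveService : IKeepAlive
{
    private static Thread _worker;
    private static readonly object Sync = new object();

    public void EnsureWorkerRunning()
    {
        lock (Sync)
        {
            // Restart the long-running thread if it is not alive.
            if (_worker == null || !_worker.IsAlive)
            {
                _worker = new Thread(DoLongRunningWork) { IsBackground = true };
                _worker.Start();
            }
        }
    }

    private static void DoLongRunningWork()
    {
        // ... your long-running processing loop ...
    }
}

// The external "pinger": a tiny console app or Windows service.
public static class KeepAlivePinger
{
    public static void Run()
    {
        var factory = new ChannelFactory<IKeepAlive>(
            new BasicHttpBinding(),
            "http://yourserver/app/KeepAlive.svc"); // hypothetical address

        while (true)
        {
            try
            {
                IKeepAlive proxy = factory.CreateChannel();
                proxy.EnsureWorkerRunning();
                ((IClientChannel)proxy).Close();
            }
            catch (Exception)
            {
                // Log and try again on the next tick.
            }
            Thread.Sleep(TimeSpan.FromSeconds(5));
        }
    }
}
```

The call itself also forces IIS to spin the worker process back up if it has been recycled, which is the whole point of the ping.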
The only question that remains is: do you really need to avoid Windows services? Could you find a place to host one? There are use cases where a Windows service is the best, or even the only, way.
You can't, in a nutshell.
However, you can make use of the health monitoring API, specifically the heartbeat functionality. See:
http://msdn.microsoft.com/en-us/library/system.web.management.webheartbeatevent.aspx
for details on the class you will need to implement in order to be called when there is work to do.
Also, this answer on SO might help:
Understanding heartbeat in ASP.NET health monitoring
Once you have implemented a WebHeartbeatEvent-derived class, you can check your DB (or whatever else you want to check) to see if there is work to do.
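A rough sketch of that approach, assuming the default "Heartbeats" event mapping and a custom provider registered in web.config (all names below are hypothetical):

```csharp
using System.Web.Management;

// Registered in web.config under <healthMonitoring> and wired to the built-in
// "Heartbeats" event mapping, e.g.:
//   <healthMonitoring enabled="true" heartbeatInterval="300">
//     <providers>
//       <add name="HeartbeatWork" type="MyApp.HeartbeatWorkProvider, MyApp" />
//     </providers>
//     <rules>
//       <add name="HeartbeatRule" eventName="Heartbeats" provider="HeartbeatWork" />
//     </rules>
//   </healthMonitoring>
public class HeartbeatWorkProvider : WebEventProvider
{
    public override void ProcessEvent(WebBaseEvent raisedEvent)
    {
        if (raisedEvent is WebHeartbeatEvent)
        {
            // Check the database (or whatever you use) for pending work and process it.
        }
    }

    public override void Flush()
    {
    }

    public override void Shutdown()
    {
    }
}
```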
A better solution IMHO is to scrap the service entirely and redesign the system to be 100% web based, as services become a deployment and maintenance nightmare, as I assume you are now finding out...
I am developing a solution in .Net utilising the VMWare Web Service API to create new isolated virtualised development environments.
The front end will be web based. Requestors input details for the specific environment which will be persisted to SQL. So I need to write an engine of some sort to pull the data from SQL and work on the long running task of creating resource pools, switch port groups and cloning existing VM templates etc. As work progresses, events will be raised to write logs and update info back to SQL. This allows requestors to pull data back into a webpage to see how it's progressing or if it's completed.
The thing I am struggling with is how to engineer the engine that will execute the long-running task of creating the environment. I cannot write a Windows service (which I would like) as we work in a very secure environment and it's not possible (group policy etc.). I could write a web service to execute the tasks, extending the httpRuntime executionTimeout to allow the task to complete. But I'm keen to hear what you think may be a better solution (based on .Net 3.5). The solution needs to be service oriented as we may use it on other projects within our org. What about WWF or WCF? I have not used any of the newer technologies available since .Net 2.0, as we've only just been approved to move up from .Net 2.0.
First of all, a Windows Service isn't insecure. One of the devils of software development is discarding a solution out of ignorance, or for lack of investigation, requirement analysis and collaborative decision-making.
Anyway, if you want to do it in a SOA way, WCF is going to be your best friend.
But maybe you can mix a WCF service hosted in Internet Information Services with Microsoft Message Queuing (MSMQ).
A client calls a WCF service operation, which enqueues a message in some MSMQ queue. Later, a Windows scheduled task that runs periodically checks whether the queue has at least one incoming message, and processes it and any others until all messages have been dequeued.
Done this way, the IIS-hosted WCF service never needs an increased execution timeout and can serve more requests, while the heavy work of executing these long tasks is performed by the Windows scheduled task, for example a console application or a PowerShell cmdlet (or just a PowerShell script, why not?).
Check this old but useful MSDN article about MSMQ: http://msdn.microsoft.com/en-us/library/ms978430.aspx
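A minimal sketch of the two sides using System.Messaging; the queue path and message format are assumptions:

```csharp
using System;
using System.Messaging; // add a reference to System.Messaging.dll

public static class EnvironmentRequestQueue
{
    // Hypothetical local private queue.
    private const string QueuePath = @".\private$\EnvironmentRequests";

    // Called from the WCF operation: just enqueue and return immediately.
    public static void Enqueue(string requestXml)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(requestXml, "New environment request");
        }
    }

    // Called from the scheduled console app: drain the queue and do the heavy work.
    public static void ProcessPending()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            while (true)
            {
                Message message;
                try
                {
                    // Zero timeout: throws when the queue is empty, ending this run.
                    message = queue.Receive(TimeSpan.Zero);
                }
                catch (MessageQueueException)
                {
                    break;
                }

                var requestXml = (string)message.Body;
                // ... create resource pools, clone templates, log progress to SQL ...
            }
        }
    }
}
```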
I don't understand your comment regarding services being insecure; I think that is false.
However, I would look into creating something that uses a messaging bus/daemon.
RabbitMQ: http://www.rabbitmq.com/
MassTransit: http://masstransit-project.com/
NServiceBus: http://nservicebus.com/
Then you can basically use something like WCF, a console app, or a service as your daemon.
I have seen examples, sample code, etc. for self-hosting WCF services within a console app, a Windows service, etc.
My question is, how will this work in production? Will it be efficient? Will it scale?
I'm not sure how it will work, so the other question is: will it be single-threaded or multi-threaded? Do I need to manage the multithreading myself? What about AppDomains?
I would prefer hosting via the command line or a Windows service for application-related reasons.
My question is,
Will it be efficient? Will it scale?
Yes and yes. But for really large scale apps you should still consider IIS (+WAS).
so the other question is, will it be single-threaded? multi-threaded?
That is determined by the service configuration, specifically the InstanceContextMode and ConcurrencyMode settings on the ServiceBehavior attribute, not by how the service is hosted.
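For reference, a minimal self-hosting sketch; the contract, the behavior settings and the address are illustrative, not prescriptive:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface ISyncService
{
    [OperationContract]
    string Ping();
}

// Concurrency and instancing are controlled by these settings, not by the host type.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class SyncService : ISyncService
{
    public string Ping()
    {
        return "pong";
    }
}

public static class Program
{
    public static void Main()
    {
        // Self-host inside a console app (or a Windows service's OnStart).
        var baseAddress = new Uri("http://localhost:8080/sync"); // illustrative address
        using (var host = new ServiceHost(typeof(SyncService), baseAddress))
        {
            host.AddServiceEndpoint(typeof(ISyncService), new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service listening. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```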
Will it be efficient?
It depends on the service implementation, on the maximum number of requests it is able to manage within a specific time frame. Efficiency is a relative measure: let's assume your service is able to process 20 messages/sec; if your requirement is to process 10 messages/sec, then your service is efficient. But if the requirement is 30, it is not.
Will it scale?
Once again, it is not related to hosting. Are your services stateless? If not, they probably won't scale much, since load balancing is not possible.
Will it be manageable?
Probably not:
- you need to have a user logged on to the server to run the app
- it does not auto-start with the server
- it cannot auto-restart on failure
- it does not create instances of the service proactively
- it does not provide (without custom code) a way to check the service's health
Single instance? Multiple threads?
If your service does not maintain state between calls per client, then configure it as "one instance per call and no multithreading": no concurrency to manage, high throughput.
If your service does maintain state, then configure it as "one instance per session and multithreading" to allow a client to perform concurrent calls. Be careful about concurrency issues and protect your resources.
If your service does not maintain state per client but keeps some global data shared by all calls, consider "single instance per service and multithreading". Keep in mind the possible concurrency issues. In that case you might as well use "one instance per call" and keep the global storage outside the service.
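Expressed as WCF behavior attributes, those three options look roughly like this (class names are placeholders):

```csharp
using System.ServiceModel;

// 1. No per-client state: one instance per call, no multithreading to manage.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class StatelessService { /* ... */ }

// 2. State kept per client session: one instance per session, concurrent calls
//    allowed, so protect shared members yourself.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerSession,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class SessionfulService { /* ... */ }

// 3. Global data shared by all callers: a single instance with concurrent calls,
//    so every access to the shared data must be synchronized.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class SingletonService { /* ... */ }
```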
A Windows service hosting a WCF endpoint is fine for small services that aren't going to be hit often; you don't have to mess with IIS (which can be a REAL pain IMO). However, there will only be one listener listening, so it's not recommended for a service that is likely to be hit from several places at once (use IIS for that; it sets up an app pool that can handle many simultaneous requests). This model is good for one-on-one interop between two machines; you might set up the service host on a "set and forget" box living out in a warehouse somewhere, and call it to perform simple but custom tasks like rebooting, log dumps, etc.
Avoid having any user app (console or otherwise) host a service endpoint, except for initial proof-of-concept testing. In addition to the single-listener drawback, a user app MUST run in the context of a logged-in user (unlike service accounts, which are "logged in" as part of Windows startup), and it needs custom "keepalive" monitoring; with a service, Windows can be told to simply restart it if it crashes, whereas it doesn't give a toss about a user app crashing, other than preventing the program from taking down the whole OS (and asking the user if they want to report the crash).
I'm writing an ASP.NET MVC site where I need to have a "Tasks" application that runs alongside the website. Such a "Tasks" application would collect data at set intervals and insert it into the database.
Of course, I could write a simple Console Application and use the Windows Task Scheduler to run it, but my site is being hosted by GoDaddy and I only have medium trust permissions.
Are there any methods for implementing such functionality while not violating medium trust permissions?
One method I'm considering is a method in the site itself that gathers data, waits for a long time, and then gathers data again. Would that interrupt users' connections to the site?
You can do it an ugly way.
Spin off a thread that keeps doing the task's job. Initiate the thread with your own custom website request. The thread will keep running in the background.
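A rough sketch of that idea using a ThreadPool timer kicked off from Application_Start (the interval and the GatherData body are placeholders, and note that the timer dies whenever IIS recycles or unloads the application):

```csharp
using System;
using System.Threading;
using System.Web;

public class Global : HttpApplication
{
    // Keep a static reference so the timer is not garbage collected.
    private static Timer _gatherTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Run the gathering job immediately, then every 30 minutes.
        _gatherTimer = new Timer(GatherData, null,
                                 TimeSpan.Zero, TimeSpan.FromMinutes(30));
    }

    private static void GatherData(object state)
    {
        // ... collect data and insert it into the database (your own code) ...
    }
}
```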
I have a data loading application that has to be executed multiple times per day at irregular intervals. I am planning to write a service to kick off the downloads and import the data to a database server. Are there advantages to using a standard service over a webservice or vice versa?
I think you're missing the point here.
Web Services typically are used for a form of communication or remote execution. You call a remote function on a web-service to either adjust the behavior of the machine it's running on or to retrieve data from it.
Windows Services are background processes that run on a machine without requiring any logged-on user. They can perform tasks and do things while the user is at the login screen, or perform elevated operations. You can talk to services to adjust their behavior or retrieve information, but their general purpose is different from that of a web service.
The biggest notable difference here is that web-services must be called, they don't run on their own.
For your application I would suggest using a Windows (standard) service, as you can have it execute code once per day. I would only use a web service if you've got something else to automate the calls to it and you require a response from the server detailing its execution result (success/fail/warning/etc.).
You could also consider writing a normal (windows or console) application that is triggered by a Windows Scheduled Task. What you've described doesn't necessarily sound like something that would require a service.
Sounds like a good use of a windows service to me. Off the top of my head, I'd use a windows service if:
1. Work is performed on a scheduled basis (regular or irregular intervals) in the background;
2. No interaction is needed - work is just done in the background and kicks off based on polling or some other type of trigger (message dropped in a queue, database value trigger, scheduled timespan, etc.);
3. It needs to be monitored (starts/stops along with logging), and you can take advantage of WMI, perfmon and the event log with little effort.
A web service is better for tasks that are interactive (like if you wanted to initiate the download based upon a request received).
Sounds like a windows service is the approach you should take.
Hope this helps. Good luck!
If "irregular interval" does mean, the application is invoked by another application, I would use a web service.
If the action is scheduled, I would use a windows service.
If you are working with SQL Server (scheduled or not), I would also consider SQL Server Integration Services.