I'm looking for a better infrastructure setup for managing and deploying internally developed applications which are executed periodically.
The current setup has grown into an unmonitorable, heterogeneous collection of applications that can only be executed directly on the scheduler VM.
Current situation:
Windows environment
A bunch of jobs written in PowerShell or as C# applications; some contain rather complex logic, some perform ETL operations
Jobs are configured either as services or as console applications triggered by the default Windows Task Scheduler, all running on a dedicated VM
Application-specific logging to log files (only some applications)
Configuration via an app.config file for each C# console application
The Windows Task Scheduler doesn't provide a nice web GUI for watching and monitoring job executions.
Ideal situation:
Central monitoring: an overview of all jobs (when they ran, failures)
Trigger jobs manually via a web frontend
Trigger job execution via an API, with the possibility to check whether the execution succeeded.
Central job configuration (connection strings, configuration parameters)
Constraints:
No cloud: Due to internal restrictions the software has to reside inside our own network. Our company owns a sufficiently dimensioned server rack for hosting required servers internally.
Considered Options
Azure WebJobs
From what I have read, this would be exactly the solution I'm looking for. Due to our "no cloud" policy we'd need to host our own Azure Pack internally, which might require quite some effort to set up and is possibly technical overkill for these requirements.
Self-written Web-API Project
Another option would be to write a dedicated Web API project that contains all job functions, has one central configuration, and exposes the job functions as Web API methods, using Quartz.NET for scheduling.
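Roughly, I imagine each job being exposed like this (a hypothetical sketch; JobRegistry stands in for a central registry that reads the shared configuration):

using System.Web.Http;

public class JobsController : ApiController
{
    // POST /api/jobs/run?jobName=NightlyEtl
    [HttpPost]
    public IHttpActionResult Run(string jobName)
    {
        // JobRegistry is hypothetical: it would look up the job function and run it
        // with the central configuration (connection strings, parameters).
        var result = JobRegistry.Run(jobName);
        return result.Succeeded ? (IHttpActionResult)Ok(result) : InternalServerError();
    }
}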
However, if possible, I'd prefer to use some standard software, so that I won't be responsible for maintaining yet another central piece of our infrastructure.
Which option would you choose? Or are there any better alternatives?
I think what you're looking for is so-called Master Data Management, so you should check out:
http://www.talend.com/
https://www.informatica.com/
http://www.tibco.com/
Related
We have 20+ console applications for background processing (sending emails, tagging customers based on conditions, etc.). Currently, all are configured using the Windows Task Scheduler and hosted on a different server. It's difficult to manage and schedule them manually.
I am new to microservice architecture.
Can I create and run all these jobs with a microservice architecture?
This can be done programmatically, driven by configuration: save the schedule configuration and apply it in the application. You may have to design a user interface or an XML-file-based configuration to store the schedule details for each job. In Java, the Quartz framework is quite popular; for .NET, refer to Quartz.NET cron expressions.
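A minimal sketch of the .NET side, assuming Quartz.NET 3.x (the job class and schedule are made up):

using System;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

// Build and start a scheduler, then register the job with a cron trigger.
var scheduler = await StdSchedulerFactory.GetDefaultScheduler();
await scheduler.Start();
var job = JobBuilder.Create<SendEmailsJob>().WithIdentity("sendEmails").Build();
var trigger = TriggerBuilder.Create()
    .WithCronSchedule("0 0 2 * * ?") // Quartz cron format: sec min hour day-of-month month day-of-week
    .Build();
await scheduler.ScheduleJob(job, trigger);
Console.ReadLine(); // keep the host alive so the trigger can fire

// The job itself; Quartz creates an instance per firing.
public class SendEmailsJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        Console.WriteLine("Sending emails at " + DateTime.Now);
        return Task.CompletedTask;
    }
}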
To create background processes or scheduled jobs, you can do it inside your .NET application or via a Windows Service application.
In your .NET application you can do it with Quartz or Hangfire. Both are reliable and straightforward.
In a microservice environment you can create an independent service and add your new jobs to it.
You can even listen to events inside this dedicated service and fire jobs whenever another service asks for it. That would be quite flexible.
Another option is to use Worker Services in .NET, which help you create and run a dedicated instance of your application (thanks to .NET Core); a minimal sketch follows.
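A minimal sketch of such a Worker Service (the worker name and interval are made up):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class TagCustomersWorker : BackgroundService
{
    private readonly ILogger<TagCustomersWorker> _logger;

    public TagCustomersWorker(ILogger<TagCustomersWorker> logger) => _logger = logger;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            _logger.LogInformation("Tagging customers at {time}", DateTimeOffset.Now);
            // ... job logic here ...
            await Task.Delay(TimeSpan.FromMinutes(15), stoppingToken);
        }
    }
}

// Host setup (Program.cs), registered via AddHostedService:
// Host.CreateDefaultBuilder(args)
//     .ConfigureServices(services => services.AddHostedService<TagCustomersWorker>())
//     .Build()
//     .Run();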
If you ask me which one is the simplest, I would say Quartz.
Currently I have an MVC web application which is hosted on IIS on a Windows server in Azure. On the same VM, I also have a console application (MyConsoleApp.exe).
The web application knows the location of MyConsoleApp.exe. Through the web application, I can create and kill processes. Every time I create a process through the UI, the web application starts a new process running MyConsoleApp.exe. The process information (PID and GUID) is logged to a SQL database, and then through the UI I can decide to kill the process or fetch the information each console app has logged.
The diagram below shows the architecture with 3 processes created running the console app. N number of processes can be created. This is just an example.
I would like to transform my applications into microservices using .NET Core, and I am also replacing SQL with MongoDB.
My question is: how can I replace the part where I create separate processes through the web app? The web app as a microservice will run in its own container. I was thinking of spinning up containers remotely to imitate what each process is currently doing, but it sounds awful.
Could you please suggest an architectural solution or design I could consider implementing?
My vision is something like this, but the processes part is troubling me.
This sort of thing is usually accomplished by spinning up workers at the end of a queue. This is detailed in Microsoft's Reference Architecture, where they talk about Function Apps.
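As a rough illustration of the pattern (using RabbitMQ as a stand-in for the queue; the reference architecture itself uses Azure queues with Function Apps), each worker container would run a consumer loop along these lines:

using System;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();
channel.QueueDeclare(queue: "work-items", durable: true, exclusive: false, autoDelete: false);

var consumer = new EventingBasicConsumer(channel);
consumer.Received += (_, ea) =>
{
    var payload = Encoding.UTF8.GetString(ea.Body.ToArray());
    Console.WriteLine("Processing " + payload); // the work MyConsoleApp.exe used to do
    channel.BasicAck(ea.DeliveryTag, multiple: false);
};
channel.BasicConsume(queue: "work-items", autoAck: false, consumer: consumer);
Console.ReadLine(); // keep the container alive

The web app then publishes one message per job instead of spawning a process, and you scale by running more worker containers.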
I have an ASP.NET MVC application which resides on a server. From this application, I want to start a process which is a rather long-running and resource-intensive operation.
So what I want is to have some user agents, say 3 of them on 3 machines, where each agent uses the resources of its respective machine only.
Like in Hadoop, where tasks run on the individual worker nodes and one master node keeps track of all of them.
In Azure, we have virtual machines on which tasks are run, and if required, Azure can automatically scale horizontally by spinning up new instances to speed up the task.
So I want to create an infrastructure like this, where I can submit my tasks to the 3 user agents from the MVC application, and my application keeps track of these agents: which agent is free, which is occupied, which is not working, and so on.
I would like to receive progress from each of these user agents and show it in my MVC application.
Is there any framework in .NET with which I can manage these background operations (tracking, start, stop, etc.), or what should the approach be?
Update: I don't want to put load on the server for these long-running operations; moreover, I want to keep track of the long-running processes too: what they are doing, where errors occurred, etc.
The following are the approaches I am considering, and I don't know which makes more sense:
1) Install Windows Services as agents on 2-3 on-premises computers to take advantage of their respective resources, and keep a TCP/IP connection open to these agents until the long-running process is complete.
2) Use Hangfire to run the long-running process outside the IIS threads, but I guess this will put load on the server.
I would like to know the possible problems of the above approaches, and whether there are any better approaches.
Hangfire is really a great solution for processing background tasks, and we have used it extensively in our projects.
We have set up our MVC application on a separate IIS server; it is also a Hangfire client and just enqueues the jobs that need to be executed by a Hangfire server. Then we have two Hangfire server instances, which are Windows Service applications. So effectively there is no load on the MVC app server from processing the background jobs, as they are processed by the separate Hangfire servers.
One extremely helpful feature of Hangfire is its out-of-the-box dashboard, which allows you to monitor and control every aspect of background job processing, including statistics, background job history, etc.
Configure Hangfire in the application as well as in the Hangfire servers:
public void Configuration(IAppBuilder app)
{
    GlobalConfiguration.Configuration.UseSqlServerStorage("<connection string or its name>");
    app.UseHangfireDashboard();
    app.UseHangfireServer();
}
Please note that you must use the same connection string across all instances. Use app.UseHangfireServer() only if you want the instance to act as a Hangfire server; in your case you would omit this line from the application server configuration and use it only on the Hangfire servers.
Also, use app.UseHangfireDashboard() in the instance that will serve your Hangfire dashboard, which would probably be your MVC application.
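With that in place, the MVC app (as a Hangfire client) only enqueues work, and a Hangfire server picks it up from the shared storage; the job bodies below are placeholders:

BackgroundJob.Enqueue(() => Console.WriteLine("Fire-and-forget job"));
RecurringJob.AddOrUpdate("nightly-report", () => Console.WriteLine("Nightly report"), Cron.Daily());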
At that time we did it using Windows Services, but if I had to do it now, I would go with an Azure Worker Role, or even better now, Azure WebJobs, to host my Hangfire server and manage things like auto-scaling easily.
Do refer to the Hangfire overview and documentation for more details.
Push messages to MSMQ from your MVC app and have your Windows Services listen (or loop) for new messages entering the queue.
In your MVC app, create an ID per queued message, then make RESTful API calls from your Windows Services back to the MVC app as you make progress on the job.
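The sending side could look roughly like this inside the MVC action (System.Messaging on .NET Framework; the queue path and label are made up):

// using System.Messaging;
const string path = @".\Private$\jobs";
if (!MessageQueue.Exists(path))
    MessageQueue.Create(path);
using (var queue = new MessageQueue(path))
{
    var messageId = Guid.NewGuid(); // the per-message ID used to correlate progress calls
    queue.Send(messageId.ToString(), "StartJob");
}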
Have a look at Hangfire; it can manage background tasks and works across VMs without conflict. We have replaced Windows Services with it and it works well.
https://www.hangfire.io
Give http://easynetq.com/ a try.
EasyNetQ is a simple to use, opinionated, .NET API for RabbitMQ.
EasyNetQ is a collection of components that provide services on top of the RabbitMQ.Client library. These do things like serialization, error handling, thread marshalling, connection management, etc.
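The snippets below assume a bus created with RabbitHutch, pointing at your RabbitMQ host:

var bus = RabbitHutch.CreateBus("host=localhost");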
To publish with EasyNetQ
var message = new MyMessage { Text = "Hello Rabbit" };
bus.Publish(message);
To subscribe to a message we need to give EasyNetQ an action to perform whenever a message arrives. We do this by passing subscribe a delegate:
bus.Subscribe<MyMessage>("my_subscription_id", msg => Console.WriteLine(msg.Text));
Now every time that an instance of MyMessage is published, EasyNetQ will call our delegate and print the message’s Text property to the console.
The performance of EasyNetQ is directly related to the performance of the RabbitMQ broker, which can vary with network and server performance. In tests on a developer machine with a local instance of RabbitMQ, sustained overnight performance of around 5,000 2K messages per second was achieved. Memory use for all the EasyNetQ endpoints was stable for the overnight run.
We want to synchronize the data in our application with the data of an external service (such as accountancy software). This synchronization should be executed every night, and whenever the customer wants to.
I am aware that long-running threads don't belong in web applications and that this synchronization should be executed by an external Windows Service. But the downside of this method is that it becomes harder to deploy/maintain, since the application can also be installed on the customer's web server.
Is it possible to completely integrate this synchronization using just a class library project within my solution, started from the Application_Start event?
Since your application is hosted on IIS, it's maintained by the application pool process. If you create an additional module for your task, it will run within the context of that same process. You have to be sure this process is still alive in the middle of the night, when the application is not being used, in order to perform the synchronization you want. You can use Quartz.NET to schedule your sync task.
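A minimal sketch of that approach in Global.asax, assuming the Quartz.NET 2.x synchronous API and a hypothetical NightlySyncJob class implementing IJob:

// using Quartz; using Quartz.Impl;
protected void Application_Start()
{
    var scheduler = StdSchedulerFactory.GetDefaultScheduler();
    scheduler.Start();
    scheduler.ScheduleJob(
        JobBuilder.Create<NightlySyncJob>().WithIdentity("nightlySync").Build(),
        TriggerBuilder.Create().WithCronSchedule("0 0 3 * * ?").Build()); // every night at 03:00
    // Caveat: this only fires while the app pool process is alive (recycling, idle timeout).
}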
But still, I think a much better idea is to perform the synchronization from a Windows Service. The service should communicate with the application, for example through a database where it logs its current activity. That gives you the possibility to monitor the service's state from the web by connecting to that database. I know a service adds some administration effort, but it will be much more reliable and secure. You can also add the ability to start the service from your web application (if the pool process user has access rights to the Windows Service) to overcome, or at least minimize, the administration effort connected with restarting the service after a failure.
I've written such functionality, so just to give you an overall look at what I mean by web monitoring of such an external service, check the screenshot below. It can be written with AJAX support to achieve more responsiveness (a polling mechanism), which will be convenient for the end user.
I am developing a solution in .NET utilising the VMware Web Service API to create new, isolated, virtualised development environments.
The front end will be web based. Requestors input details for the specific environment, which are persisted to SQL. So I need to write an engine of some sort to pull the data from SQL and work on the long-running task of creating resource pools, switch port groups, cloning existing VM templates, etc. As work progresses, events will be raised to write logs and update info back to SQL. This allows requestors to pull data back into a webpage to see how it's progressing or whether it's completed.
The thing I am struggling with is how to engineer the engine which will execute the long-running task of creating the environment. I cannot write a Windows Service (which I would like) as we work in a very secure environment and it's not possible (group policy, etc.). I could write a web service to execute the tasks, extending the httpRuntime executionTimeout to allow the task to complete. But I'm keen to hear what you guys think may be a better solution (based on .NET 3.5). The solution needs to be service-oriented, as we may be using it on other projects within our org. What about WF, WCF? I have not used any of the newer technologies available since .NET 2.0, as we've only just been approved to move up from .NET 2.0.
First of all, a Windows Service isn't insecure. One of the devils of software development is discarding a solution out of ignorance, or because of a lack of investigation, requirements analysis, and collaborative decision-making.
Anyway, if you want to do it in a SOA way, WCF is going to be your best friend.
But maybe you can mix a WCF service hosted in Internet Information Services with Microsoft Message Queuing (MSMQ).
A client calls a WCF service operation, which enqueues a message to some MSMQ queue. Later, a Windows scheduled task that runs periodically checks whether the queue has at least one incoming message, and processes it and the others until all messages are dequeued.
Done this way, the IIS-hosted WCF service never needs an increased execution timeout and can serve more requests, while the heavy work of executing these long tasks is performed by the Windows scheduled task, for example a console application or a PowerShell cmdlet (or just a PowerShell script, why not?).
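A hedged sketch of that scheduled console task draining the queue (System.Messaging; the queue path is made up):

using (var queue = new MessageQueue(@".\Private$\longTasks"))
{
    queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
    while (true)
    {
        Message message;
        try { message = queue.Receive(TimeSpan.FromSeconds(1)); }
        catch (MessageQueueException) { break; } // receive timed out: queue is drained, exit until the next run
        Console.WriteLine("Processing: " + (string)message.Body);
        // ... perform the long-running work here ...
    }
}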
Check this old but useful MSDN article about MSMQ: http://msdn.microsoft.com/en-us/library/ms978430.aspx
I don't understand your comment regarding services being insecure; I think that is false.
However I would look into creating something that uses a messaging bus/daemon.
RabbitMQ: http://www.rabbitmq.com/
MassTransit: http://masstransit-project.com/
NServiceBus: http://nservicebus.com/
Then you can basically use something like WCF, Console App or Service as your daemon.