Background process to update DB every few seconds with JSON data - C#

I am a complete novice with this but...
I have a small ASP.NET MVC C# application that reads a SQL database, and I would like that database to be updated by a background process making a JSON request, potentially as often as every minute or even every few seconds.
What is the best way to implement the background JSON DB update? Inside the MVC app on a persistent timer (is that even possible?), or completely outside the app in a separate process, either as an executable running in the background with its own programmatic timer or via some kind of scheduler?
EDIT: For the sake of understanding: the JSON string contains market prices, which obviously need to be updated in the DB quite often, i.e. potentially up to every few seconds if desirable or necessary.

I would use a Windows Service combined with the Quartz.NET package.
You can run anything you want, on any schedule.
EDIT: From the discussion above I gather that your job would poll for market prices (a web request) every few seconds and, on getting the result, update your database.
EDIT2:
This would be your Quartz job:
public class FetchAndSaveFinancialData : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // web request to get info
        // save to db
    }
}
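For context, a minimal sketch of what Execute might do, assuming a hypothetical prices endpoint and the illustrative MarketPrice/FinancialContext Entity Framework types sketched further below (none of these names come from your code):

using System.Collections.Generic;
using System.Data.Entity.Migrations; // AddOrUpdate extension (EF 6)
using System.Net;
using Newtonsoft.Json;
using Quartz;

public class FetchAndSaveFinancialData : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        string json;
        using (var client = new WebClient())
        {
            // Hypothetical endpoint; substitute your real price feed URL.
            json = client.DownloadString("https://example.com/api/prices");
        }

        var prices = JsonConvert.DeserializeObject<List<MarketPrice>>(json);

        using (var db = new FinancialContext())
        {
            foreach (var price in prices)
            {
                // Upsert keyed on the ticker symbol.
                db.MarketPrices.AddOrUpdate(p => p.Symbol, price);
            }
            db.SaveChanges();
        }
    }
}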
Then your Windows Service class:
using System.ServiceProcess;
using Quartz;
using Quartz.Impl;

public class YourFinancialServiceBase : ServiceBase
{
    protected override void OnStart(string[] args)
    {
        ServiceMain();
        base.OnStart(args);
    }

    protected override void OnStop()
    {
        base.OnStop();
    }

    protected void ServiceMain()
    {
        var scheduler = StdSchedulerFactory.GetDefaultScheduler();

        var job = JobBuilder.Create<FetchAndSaveFinancialData>()
            .WithIdentity("Job1", "Group1")
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("Trigger1", "Group1")
            .StartNow()
            .WithSimpleSchedule(x => x
                .WithIntervalInSeconds(5)
                .RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);
        scheduler.Start();
    }
}
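To host this, you would run it from a standard service entry point; a minimal sketch, assuming a .NET Framework Windows Service project:

using System.ServiceProcess;

static class Program
{
    static void Main()
    {
        // Hands control of the service to the Service Control Manager.
        ServiceBase.Run(new YourFinancialServiceBase());
    }
}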
EDIT3:
It all depends on what you use to access the data. If you plan on using Entity Framework, then I would keep the MVC project and the Windows Service project in the same solution, both referencing a shared library project. The library project would hold all your models and let you see whether changing something affects one or both components.
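For example, the shared library might contain the EF model both projects compile against; a sketch with illustrative names (the same MarketPrice/FinancialContext pair assumed in the job example above):

using System;
using System.Data.Entity;

public class MarketPrice
{
    public int Id { get; set; }
    public string Symbol { get; set; }
    public decimal Price { get; set; }
    public DateTime RetrievedAtUtc { get; set; }
}

public class FinancialContext : DbContext
{
    public DbSet<MarketPrice> MarketPrices { get; set; }
}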
On the other hand, if you rely heavily on stored procedures in your database, the above is less relevant: you will have to consolidate changes in your stored procedures rather than in code.
In my experience, I'd rather sift through many projects in one solution than hunt down a completely separate solution, in a different language, that is a critical part of your application.

Related

Manually triggering a Quartz job with arguments that is hosted in a Windows Service

I have created a Quartz server running in a Windows Service that has various scheduled jobs.
However, there is one job that I need to be triggered manually from an event in my web application UI.
Quartz.NET job:
public class IntensiveJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Get job parameters here... BUT HOW?!
        // Do some intensive processing here...
    }
}
Action that I need to trigger the job in:
public class SomeController : Controller
{
    [HttpPost]
    public ActionResult Run()
    {
        // Need to be able to trigger the intensive job here...
        // Ideally with some arguments too... E.g.:
        var job = new IntensiveJob();
        job.Execute();
        return RedirectToAction("Index");
    }
}
Any suggestions on the best way to implement this, or alternative approaches, would be great.
The solution that best fits your problem is to create a trigger manually and have the scheduler execute it right away. The complication is when the trigger needs to carry different data each time it is created, to be passed on to the scheduler: the trigger factory Quartz provides will always produce an identical object when asked for a new trigger. Instead, you have to instantiate the trigger implementation yourself with the necessary details and pass that to the scheduler to run. An example in Java would look something like:

JobDetailImpl jdi = new JobDetailImpl();
jdi.setJobClass(SomeClassWithJob); // the class that contains the task to be done
SimpleTriggerImpl sti = new SimpleTriggerImpl();
sti.setStartTime(new Date(System.currentTimeMillis()));
sti.setRepeatInterval(1);
sti.setRepeatCount(0);
context.getScheduler().scheduleJob(jdi, sti); // pseudocode here

The key thing is to use Quartz for the scheduling rather than just inserting data into the database. Also note that each of these items requires a name and a group.
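In Quartz.NET terms, a sketch of the same idea looks like this, assuming your web application can reach the scheduler instance (for example via Quartz remoting or a shared scheduler) and uses the job key registered in the service; the data map key is illustrative:

// In the controller: fire the already-registered job once, with arguments.
var data = new JobDataMap();
data["inputPath"] = @"C:\work\somefile.csv"; // illustrative argument
scheduler.TriggerJob(new JobKey("IntensiveJob", "Group1"), data);

// In the job: read the arguments back out.
public class IntensiveJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        var inputPath = context.MergedJobDataMap.GetString("inputPath");
        // Do some intensive processing here...
    }
}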

Right architecture for using HangFire

I'm about to start using Hangfire in C# in an ASP.NET MVC web application, and I wonder how to create the right architecture.
We are going to use Hangfire as a message queue, so we can process (store in the database) the user data directly and then, for instance, notify other systems and send email later in a separate process.
So our code now looks like this:
public void Xy(Client newClient)
{
    _repository.save(newClient);
    _crmConnector.notify(newClient);
    mailer.Send(repository.GetMailInfo(), newClient);
}
And now we want to put the last two lines 'on the queue'.
So, following the example on the Hangfire site, we could do this:
var client = new BackgroundJobClient();
client.Enqueue(() => _crmConnector.notify(newClient));
client.Enqueue(() => mailer.Send(repository.GetMailInfo(), newClient));
but I was wondering whether that is the right solution.
I once read about putting items on a queue as 'commands': classes created specifically to wrap a task/command/thing-to-do and put it on a queue.
So for the notify the crm connector this would then be
client.Enqueue(() => new CrmNotifyCommand(newClient).Execute());
The CrmNotifyCommand would then receive the new client and have the knowledge to execute _crmConnector.notify(newClient).
In this case all items that are put on the queue (executed by HangFire) would be wrapped in a 'command'.
Such a command would then be a self-contained class that knows how to execute a piece of business functionality. When the command itself uses more than one other class, it could also be considered a facade, I guess.
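For illustration, such a command might look like the following sketch (ConnectorFactory is a hypothetical stand-in for however the command obtains its collaborators):

public class CrmNotifyCommand
{
    private readonly Client _newClient;

    public CrmNotifyCommand(Client newClient)
    {
        _newClient = newClient;
    }

    public void Execute()
    {
        // The command knows how to obtain what it needs;
        // a hypothetical factory stands in for that here.
        var crmConnector = ConnectorFactory.CreateCrmConnector();
        crmConnector.notify(_newClient);
    }
}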
What do you think about such an architecture?
I once read about putting items on a queue and those were called 'commands', and they were classes especially created to wrap a task/command/thing-to-do and put it on a queue.
Yes, your intuition is correct.
You should encapsulate all dependencies and explicit functionality in a separate class, and tell Hangfire to simply execute a single method (or command).
Here is my example, that I derived from Blake Connally's Hangfire demo.
namespace HangfireDemo.Core.Demo
{
    public interface IDemoService
    {
        void RunDemoTask(PerformContext context);
    }

    public class DemoService : IDemoService
    {
        [DisplayName("Data Gathering Task Confluence Page")]
        public void RunDemoTask(PerformContext context)
        {
            Console.WriteLine("This is a task that ran from the demo service.");
            BackgroundJob.ContinueJobWith(context.BackgroundJob.Id, () => NextJob());
        }

        public void NextJob()
        {
            Console.WriteLine("This is my next task.");
        }
    }
}
And then, separately, to schedule that command you'd write something like the following:
RecurringJob.AddOrUpdate("demo-job", () => this._demoService.RunDemoTask(null), Cron.Minutely());
(Hangfire injects the real PerformContext at execution time, which is why null is passed here.)
If you need further clarification, I encourage you to watch Blake Connally's Hangfire demo.

ASP.NET MVC web application access and modify database recurrently best practice

My application's users have a balance attribute that needs to be updated as long as they have services activated. So far, the update functionality uses a .NET WebJob that runs every hour (WebJobs can run hourly at most on shared or basic subscriptions).
Is there a better solution for implementing a balance update feature? I also considered doing it in Application_Start(), the following way:
public class MvcApplication : System.Web.HttpApplication
{
    private ApplicationDbContext db = new ApplicationDbContext();
    private PaymentsController paymentsController = new PaymentsController();

    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();
        FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
        RouteConfig.RegisterRoutes(RouteTable.Routes);
        BundleConfig.RegisterBundles(BundleTable.Bundles);

        var tmr = new System.Timers.Timer();
        tmr.Interval = 60000; // 1 minute
        tmr.Elapsed += updateUsersBalance;
        tmr.Start();
    }

    private void updateUsersBalance(Object source, System.Timers.ElapsedEventArgs e)
    {
        var users = db.Users.ToList();
        foreach (var user in users)
        {
            user.balance -= 1;
            db.Entry(user).State = EntityState.Modified;
        }
        db.SaveChanges(); // save updated balances
    }
}
Is this a reliable mechanism to update the balance every minute? Is it OK to hold a reference to the database and a controller in the Global.asax.cs file?
(Leave aside the precision of the timer.)
In my case this scenario would be preferable to a WebJob because of the limitation that I can run them every hour at most.
No, this would not be a reliable method by itself. It is possible for IIS to shut down the app pool, and then your loop wouldn't be running at all. You could possibly get around that by setting up ASP.NET Auto-Start (in Azure there is an "Always On" switch on the Configuration page to enable it), but really a job runner is probably the better option (in addition to ASP.NET Auto-Start). Maybe check out Hangfire (which is what we are currently using) or Quartz.NET.
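With Hangfire, for instance, the timer above would collapse into a recurring job; a sketch, assuming Hangfire is already configured for the application (BalanceService and UpdateAllBalances are illustrative names):

// At application startup, after Hangfire is configured:
RecurringJob.AddOrUpdate<BalanceService>(
    "update-user-balances",          // stable job id
    svc => svc.UpdateAllBalances(),  // illustrative service method
    Cron.Minutely());                // run every minute

Hangfire persists its jobs in storage (e.g. SQL Server), so a pending run survives an app-pool recycle rather than dying with an in-memory timer.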
@nest I did not understand exactly what your architecture is, but I think I understand what you need.
Updating the balance every minute is somewhat artificial. Think about it: why do you need to update the balance if no one is reading it?
With that in mind, you can assert that the balance is updated whenever someone accesses it, which saves processing resources. So you don't need to bother running the process every minute; you need to run it whenever the underlying values change and, for redundancy, recalculate just before access.
That said, you can use a job scheduler to calculate the balance. I suggest Hangfire: force the job to run every time someone changes the underlying values, also schedule it to run on an interval, and force a run on access if that interval has not been met.
Of course, this way you'll need to change to a Web Role, mainly because Hangfire has a web interface for administering your jobs.

System.Threading.Timer callback not being hit

I have a Windows Service which, among other things, needs to do some database maintenance every 24 hours. (SQL Server Express, so I can't schedule it inside the database.)
For this I have created a Timer with a callback to the database maintenance method, but it appears to only get hit once (upon creation of the timer, I assume).
I assumed this was due to the timer itself going out of scope and being GC'd, but none of the solutions I've tried seem to be working.
The basic layout of the service is like this:
WindowsService.cs:
protected override void OnStart(string[] args)
{
    ThreadPool.QueueUserWorkItem(StartServices);
}

private void StartServices(object state) { Manager.Start(); } // WaitCallback signature
Manager.cs:
public class Manager
{
    private static Timer MaintenanceTimer;

    public static void Start()
    {
        // DoMaintenance must match the TimerCallback signature: void DoMaintenance(object state)
        MaintenanceTimer = new Timer(DoMaintenance);
        // NB: new TimeSpan(24, 0, 0, 0) is 24 *days* (days, hours, minutes, seconds), not 24 hours
        MaintenanceTimer.Change(new TimeSpan(0), new TimeSpan(24, 0, 0, 0)); // I use 1 minute for testing
    }
}
Obviously this code is severely simplified, but this is pretty much what happens.
As stated before, I believe GC is the problem, which made me try the following two things:
1. Use the constructor Timer(callback), so it provides a self-reference to the callback. However, this still would not prevent the timer from going out of scope, so that's a no-go.
2. Define the timer as a static variable inside the Manager class. This should prevent it from ever being GC'd, but it still doesn't appear to be called every 24 hours.
Any help in this matter would be greatly appreciated
In the end I used a regular System.Timers.Timer, which solved my problems.
Still not sure where I went wrong with the System.Threading.Timer, though.
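For reference, a System.Threading.Timer version along these lines should also fire reliably; this is a sketch, and the key points are the rooted static field, the (object state) callback signature, and expressing the period as hours rather than accidentally as days:

using System;
using System.Threading;

public class Manager
{
    // Rooted in a static field so the timer itself is never garbage collected.
    private static Timer MaintenanceTimer;

    public static void Start()
    {
        MaintenanceTimer = new Timer(
            DoMaintenance,            // TimerCallback
            null,                     // state
            TimeSpan.Zero,            // first run immediately
            TimeSpan.FromHours(24));  // then every 24 hours
    }

    private static void DoMaintenance(object state)
    {
        // database maintenance here
    }
}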
Since you cannot use the SQL Server Agent in SQL Server Express, the best solution is to create a SQL script and then run it as a scheduled task.
It is easy to verify and maintain, and you could have multiple scheduled tasks to fit in with your backup schedule/retention.
The command I use in the scheduled task is:
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -i "c:\path\to\sqlbackupScript.sql"

A question about making a C# class persistent during a file load

Apologies for the vague title; however, it's the best I could think of for the moment.
Basically, I've written a singleton class that loads files into a database. These files are typically large and take hours to process. What I am looking for is a way to have this class running and be able to call methods on it, even if its calling class is shut down.
The singleton class is simple. It starts a thread that loads the file into the database, while exposing methods to report on the current status. In a nutshell it's a little like this:
public sealed class BulkFileLoader
{
    static BulkFileLoader instance = null;
    int currentCount = 0;

    BulkFileLoader() { }

    public static BulkFileLoader Instance
    {
        get
        {
            // Instantiate the instance if necessary, and return it
            if (instance == null) instance = new BulkFileLoader();
            return instance;
        }
    }

    public void Go()
    {
        // kick off the 'ProcessFile' thread
    }

    public int GetCurrentCount()
    {
        return currentCount;
    }

    private void ProcessFile()
    {
        while (/* more rows in the import file */)
        {
            // insert the row into the database
            currentCount++;
        }
    }
}
The idea is that you can get an instance of BulkFileLoader and set it to work on a file, while at any time getting real-time updates on the number of rows it has processed so far via the GetCurrentCount() method.
This works fine, except that the calling class needs to stay open the whole time for the processing to continue. As soon as I stop the calling class, the BulkFileLoader instance is removed and it stops processing the file. What I am after is a solution where it continues to run independently, regardless of what happens to the calling class.
I then tried another approach. I created a simple console application that kicks off the BulkFileLoader, and wrapped it as a separate process. This fixes one problem: now when I kick off the process, the file continues to load even if I close the class that started the process. However, now the problem is that I cannot get updates on the current count, because if I try to get the instance of BulkFileLoader (which, as mentioned before, is a singleton), it creates a new instance rather than returning the one in the executing process. It would appear that singletons do not extend across separate processes running on the machine.
In the end, I want to be able to kick off the BulkFileLoader and at any time find out how many rows it has processed, even after I close the application I used to start it.
Can anyone see a solution to my problem?
You could create a Windows Service which exposes, say, a WCF endpoint as its API. Through this API you'll be able to query the service's status and add more files for processing.
You should make your "Bulk Uploader" a service and have your other processes speak to it via IPC.
You need a service because your upload takes hours, and it sounds like you'd like it to run unattended if necessary and to be detached from the calling thread. That's what services do well.
You need some form of inter-process communication because you'd like to send information between processes.
For communicating with your service, see NetNamedPipeBinding.
You can then send "Job Start" and "Job Status" commands and queries to your background service whenever you like.
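A minimal sketch of what that contract and hosting might look like (the interface, implementation, and endpoint names are illustrative, not an existing API):

using System;
using System.ServiceModel;

[ServiceContract]
public interface IBulkLoaderService
{
    [OperationContract]
    void StartJob(string filePath);  // "Job Start"

    [OperationContract]
    int GetCurrentCount();           // "Job Status"
}

// Hosting inside the Windows Service process, where BulkLoaderService is
// an illustrative class implementing IBulkLoaderService around BulkFileLoader:
var host = new ServiceHost(typeof(BulkLoaderService), new Uri("net.pipe://localhost"));
host.AddServiceEndpoint(typeof(IBulkLoaderService), new NetNamedPipeBinding(), "BulkLoader");
host.Open();

A client, such as a console status tool, would then connect over the same binding with a ChannelFactory<IBulkLoaderService> and call GetCurrentCount() whenever it needs a progress update.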
