Scheduling methods to run in ASP.NET - c#

Explanation:
I am developing a simple car business system and I have to implement the following feature:
A very special car model is delivered to a shop, and there are a lot of people on the waiting list for exactly this model.
When the car arrives, the first client on the list receives the right to buy it and has 24 hours to use this opportunity.
I have a special state in the DB that determines whether the user is on the waiting list (I store the exact position as well) or can use the opportunity to buy the car. Whenever the car arrives, I run a method that changes the state of the first client on the waiting list. And here comes the problem:
Problem:
The client can use his or her opportunity during the 24-hour period, but at the end of it I have to check whether he/she has actually bought the car. For this reason, I have to schedule a method to run 24 hours later.
Possible solution:
I am considering two options. The first is using a job scheduler like Hangfire. The problem is that since I do not have any other jobs in my app, I do not want to pull in a whole package for such a small thing. The second is making the checking method asynchronous and letting the thread sleep for 24 hours before proceeding (I do not feel comfortable working with threads, and this is just an idea I got from this article). Keep in mind that more than one car can arrive at more than one shop. Does that mean I should use many threads, and how is that going to affect the performance of the system?
Question:
Which of the two solutions is better?
Is there another possibility that you can suggest in this particular case?

I agree. Importing a whole package for a single job, when you aren't going to use it for anything else, is a bit of overkill.
If you are running SQL Server, I'd recommend writing a .NET console application and running it on a schedule using SQL Server Agent. If you have stored procedures that need to run, you also have the option of running them directly from the SQL Server Agent job if for some reason you don't want to run them from your .NET application.
Since it sounds like you need this to run on a data-driven schedule, you may also consider adding a trigger that starts the job whenever one of those "special" cars is inserted into the database: MSDN SQL Job using Trigger.
I've done something similar to this where every morning, an hour prior to business hours starting, I run a .NET executable that checks the latest record in table A and compares it to a value in table B and determines if the record in table A needs to be updated.
I also use SQL Server to run jobs that send emails on a schedule based on data that has been added or modified in a database.
There are advantages to using SQL Server to run your jobs: there are many options for notifying you of events, retrying failed jobs, logging, and keeping job history. You can specify any type of schedule, from repeating frequently to running only once a week.
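For illustration, a minimal sketch of the kind of console application SQL Server Agent (or Windows Task Scheduler) could run on a schedule. The table and column names (ClientCarOffers, State, OfferedAtUtc) and the connection string are assumptions made for the example, not the asker's actual schema.

// Scheduled check: expire any purchase opportunities older than 24 hours.
// ClientCarOffers, State and OfferedAtUtc are illustrative names only.
using System;
using Microsoft.Data.SqlClient;

class ExpireOffers
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=CarBusiness;Integrated Security=true;TrustServerCertificate=true";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        // Any client whose 24-hour window has passed without a purchase loses
        // the opportunity; promoting the next client on the waiting list could
        // be done here as well, or in a stored procedure.
        using var command = new SqlCommand(@"
            UPDATE ClientCarOffers
            SET    State = 'Expired'
            WHERE  State = 'CanBuy'
              AND  OfferedAtUtc < DATEADD(HOUR, -24, SYSUTCDATETIME());", connection);

        int expired = command.ExecuteNonQuery();
        Console.WriteLine($"Expired {expired} offer(s).");
    }
}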

Related

In C#, how can I get an event from MySQL when somebody inserts, deletes or modifies a record?

I am developing a WPF .NET program, and I need to know when somebody makes a change to any table of the database.
The idea is to receive an event from the database when it changes. I have read a lot of articles, but I can't find a way to solve my problem.
Kind regards
The best solution is to use a message queue. After your app commits a change to the database, the app also publishes a message on the message queue. Other clients then just wait for notifications on that message queue.
There are a few other common solutions, but all of them have disadvantages.
Polling. If a client is interested in recent changes, they run a query searching for new data every N seconds.
The downside is you have to keep polling even during times when there are no changes. You might have to poll very frequently, depending on how promptly you need to notice the changes. This adds to database load just to support the polling queries.
It also costs more if you have many clients all polling. In one system I supported, the database was struggling to process 30,000 queries per second just from clients' polling. (A rough sketch of the polling approach in C# is shown at the end of this answer.)
Change Data Capture. Using the binary log as a de facto message queue, because it records all the changes. Use a client tool such as Debezium, or write your own binlog tail client (this is a lot of work).
The downside is the binlog records all changes, not just those you want to be notified about. You have to filter it somehow. Also you have to learn how to use Debezium or equivalent tool.
Triggers. Write a trigger on the table that invokes a UDF to post notification outside the database. This is a bad idea, because the trigger executes when your insert/update/delete executes, not when the transaction commits. Clients could be notified of changes before the changes are committed, so if they go query the database right after they get the notification, the change is not visible to them yet.
It's also a disadvantage that it requires you to install a UDF extension in MySQL Server; MySQL doesn't normally have any way of posting an external notification.
I'm not a C# developer so I can't suggest specific code. But the general methods above are similar regardless of which language the app is written in.
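Since the question is about C#, here is a rough sketch of the polling option described above; the table name (Orders), its schema and the connection string are placeholders, not anything taken from the question.

// Polling sketch: re-query every N seconds for rows newer than the last one seen.
// Table name, schema and credentials are illustrative placeholders.
using System;
using System.Threading;
using MySql.Data.MySqlClient;

class PollingWatcher
{
    static void Main()
    {
        const string connectionString = "Server=localhost;Database=shop;Uid=app;Pwd=secret";
        long lastSeenId = 0;

        while (true)
        {
            using (var connection = new MySqlConnection(connectionString))
            {
                connection.Open();

                using var command = new MySqlCommand(
                    "SELECT id FROM Orders WHERE id > @lastId ORDER BY id", connection);
                command.Parameters.AddWithValue("@lastId", lastSeenId);

                using var reader = command.ExecuteReader();
                while (reader.Read())
                {
                    lastSeenId = reader.GetInt64(0);
                    Console.WriteLine($"New row detected: {lastSeenId}");
                }
            }

            // The interval is the trade-off: shorter means faster notification
            // but more load on the database.
            Thread.Sleep(TimeSpan.FromSeconds(5));
        }
    }
}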
I don't think this is possible with MySQL; DBs like MongoDB have this sort of feature.
You may like to use the method described in this answer.
Essentially, have date/time fields on your rows so that you can pull data added or changed since a certain date and time. Or you could use a CQRS/event strategy and maybe use a message queue.

NServiceBus batching a long running job

I'm working on a project that is using NSB. I really like it, but it's my first NSB solution, so I'm a bit of a noob. We have a job that needs to run every day to process members. It is not expected to take long, as the work is simple, but it will potentially affect thousands of members, and in the future perhaps tens or hundreds of thousands.
Having it all happen in a single handler in one go feels wrong, but having a handler discover affected members and then fire separate events for each one sounds a bit too much in the opposite direction. I can think of a few other methods of doing it, but was wondering if there is an idiomatic way of dealing with this in NSB?
Edit to clarify: I'm using Schedule to send a command at 3 am; the handler for that will query the SQL DB for a list of members who need to be processed. Processing will involve updating/inserting one or two rows per member. My question is about how to process that potentially large list of members within NSB.
Edit part 2: the job now needs to run monthly, not daily.
I would not use a saga for this. Sagas should be lightweight and are designed for orchestration rather than performing work. They are started by messages rather than scheduled.
You can achieve your ends by using the built-in scheduler. I've not used it, but it looks simple enough.
You could do something like:
configure a command message (eg StartJob) to be sent every day at 0300.
StartJob handler will then query the DB to get the work.
Then, depending on your requirements:
If you need all the work done at once, create a single command with all the work in it, and send it to another endpoint for processing. If you use transactional MSMQ then this will succeed or fail as a unit.
If you don't care whether only some of the work succeeds, then create a command per unit of work and dispatch each one to an endpoint for processing. This has the benefit that you can scale out using the distributor if you need to (see the sketch below).
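A rough sketch of that second option, assuming an NServiceBus 6+ style API; StartJob, ProcessMember and IMemberRepository are illustrative names, not types from the question.

// StartJob handler that fans the work out as one command per member.
// IMemberRepository is a hypothetical data-access abstraction.
using System.Collections.Generic;
using System.Threading.Tasks;
using NServiceBus;

public class StartJob : ICommand { }

public class ProcessMember : ICommand
{
    public string MemberId { get; set; }
}

public interface IMemberRepository
{
    Task<IReadOnlyList<string>> GetMembersDueForProcessing();
}

public class StartJobHandler : IHandleMessages<StartJob>
{
    readonly IMemberRepository members;

    public StartJobHandler(IMemberRepository members) => this.members = members;

    public async Task Handle(StartJob message, IMessageHandlerContext context)
    {
        // One command per member, so individual failures are retried
        // independently and the processing endpoint can be scaled out.
        foreach (var memberId in await members.GetMembersDueForProcessing())
        {
            await context.Send(new ProcessMember { MemberId = memberId });
        }
    }
}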
I'm working on a project that is using NSB...We have a job that needs
to run every day...
Although you can use NSB for this kind of work, it's not really something I would do. There are many other approaches you could use. A SQL job or cron job would be the obvious one (and a hell of a lot quicker to develop, more performant, and simpler).
Even though it does support such use cases, NServiceBus is not really designed for scheduled batch processing. I would seriously question whether you should even use NSB for this task.
You mention a long-running process, and that sounds like a job for a saga (see https://docs.particular.net/nservicebus/sagas/). You can use saga data to persist checkpoints in different storage mediums (SQL, Mongo, etc.). But yes, having something long-running dispatch messages from the saga to individual handlers is definitely something I would do as well.
Something else to consider is message deferral (timeout managers). For example, let's say you process x number of users but want to run the job again later: NServiceBus allows you to defer a message for a defined period, and the message will sit in the queue waiting to be dispatched.
If you need any more info, just shout and I can update my answer.
A real NSB solution would be to get rid of the "batch" job that processes all those records in one run, and instead find out what action(s) cause each of these records to need processing in the first place.
When such an action is performed, publish an NSB event, and refactor the batch job into an NSB handler that subscribes to these events, so the processing happens the moment the action is performed, in parallel with the rest of your process.
This way there would be no need anymore for a scheduled 'start' message at 3 am, because all the work would already have been done.
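A minimal sketch of that event-driven shape, assuming an NServiceBus 6+ style API; MembershipRenewed and the handler body are placeholders for whatever action actually makes a member eligible for processing.

// Event published by the code that performs the action, e.g.
//   await context.Publish(new MembershipRenewed { MemberId = id });
// The former batch job becomes a handler that processes one member at a time.
using System.Threading.Tasks;
using NServiceBus;

public class MembershipRenewed : IEvent
{
    public string MemberId { get; set; }
}

public class MembershipRenewedHandler : IHandleMessages<MembershipRenewed>
{
    public Task Handle(MembershipRenewed message, IMessageHandlerContext context)
    {
        // Do for this single member what the nightly batch used to do for all of them.
        return Task.CompletedTask;
    }
}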
Here is how I might model this idiomatically with NServiceBus: there might be a saga called PointsExpirationPolicy, which would be initiated at the moment that any points are awarded to a user. The saga would store the user ID, and number of points awarded, and also calculate the date/time the points should expire. Then it would request a timeout callback message to be sent at the date/time these points should expire. When that callback arrives, the saga sends a command to expire that number of points from the user's account. This would also give you some flexibility around the logic of exactly when and how points expire, and would eliminate the whole batch process.
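A rough sketch of that saga, using the NServiceBus 6/7 saga API; the message types, the one-year expiry rule and the ExpirePoints command are illustrative assumptions.

// PointsExpirationPolicy saga: started when points are awarded, asks for a
// timeout at the expiry date, then commands the points to be removed.
using System;
using System.Threading.Tasks;
using NServiceBus;

public class PointsAwarded : IEvent
{
    public Guid UserId { get; set; }
    public int Points { get; set; }
    public DateTime AwardedAtUtc { get; set; }
}

public class ExpirePoints : ICommand
{
    public Guid UserId { get; set; }
    public int Points { get; set; }
}

public class PointsExpirationData : ContainSagaData
{
    public Guid UserId { get; set; }
    public int Points { get; set; }
}

public class PointsExpired { } // timeout state

public class PointsExpirationPolicy :
    Saga<PointsExpirationData>,
    IAmStartedByMessages<PointsAwarded>,
    IHandleTimeouts<PointsExpired>
{
    protected override void ConfigureHowToFindSaga(SagaPropertyMapper<PointsExpirationData> mapper)
    {
        // Correlate on UserId for the sake of the sketch.
        mapper.ConfigureMapping<PointsAwarded>(m => m.UserId).ToSaga(s => s.UserId);
    }

    public Task Handle(PointsAwarded message, IMessageHandlerContext context)
    {
        Data.UserId = message.UserId;
        Data.Points = message.Points;

        // Ask NServiceBus to call back when the points should expire
        // (one year after the award is only an assumed rule).
        return RequestTimeout<PointsExpired>(context, message.AwardedAtUtc.AddYears(1));
    }

    public async Task Timeout(PointsExpired state, IMessageHandlerContext context)
    {
        await context.Send(new ExpirePoints { UserId = Data.UserId, Points = Data.Points });
        MarkAsComplete();
    }
}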

Semaphore vs. SQL-Job when trying to remove expired SQL records

I'm using ASP.NET and C# to build a 'social network' web site.
When adding posts, there are two SQL columns that I fill: the date and time when the post was added, and the date and time when the post expires (it varies between different kinds of posts).
I want some process that constantly checks the SQL database and removes posts whose date and time have expired.
I've searched for a solution, and I understand that the two most suitable options are semaphores and SQL jobs (correct me if I'm wrong).
I hope you can give me a hint about which is the best solution (and if it's neither of the two, what is), along with some info about it.
Thanks!
Just hide posts that have expired based on the current time. For example
WHERE ExpiryDateTime > SYSUTCDATETIME()
Then you can clean old posts in the background at any frequency you like. Create a Windows Task Scheduler task that calls a special URL of your website. That URL should perform a database cleanup. This is a very simple and clearly correct solution.
If you don't like Windows Task Scheduler (and who really does like it...) you can use a scheduling library such as Hangfire or Quartz.NET.
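If you do go the Hangfire route, a recurring cleanup job might look roughly like this; PostCleanup is a placeholder for your own data-access code, and this assumes a Hangfire server is already configured in the site.

// Registers an hourly recurring job on an already-configured Hangfire server.
// PostCleanup.PurgeExpired is a placeholder for the actual delete/archive logic.
using Hangfire;

public class PostCleanup
{
    public void PurgeExpired()
    {
        // Delete (or archive) posts whose ExpiryDateTime has passed.
        // Actual data access omitted.
    }
}

public static class CleanupJobRegistration
{
    public static void Register()
    {
        RecurringJob.AddOrUpdate<PostCleanup>(
            "purge-expired-posts",
            cleanup => cleanup.PurgeExpired(),
            Cron.Hourly());
    }
}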
Neither.
A semaphore is a way of controlling resource use between multiple threads.
An SQL job is a somewhat blunt tool designed to allow DB admins to schedule tasks.
I would create a separate program, 'oldDataDeleter': code up your logic about what you want to delete or archive, and after how much time, and then apply that logic in an atomic way. I would run this as a Windows service with a timer, or as a console app run as a scheduled task.
The key is to ensure that the program can run concurrently with itself and only does small atomic changes on a small chunk of data at a time.
You can then fire up multiple instances of this program running at a high frequency.
This ensures you don't lock your database with large 'delete all from table X join table Y' statements, and that your data is constantly trimmed rather than piling up for a big overnight job.
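A rough sketch of that chunked approach against SQL Server; the table and column names, batch size and connection string are all placeholders.

// Deletes expired posts in small batches so each statement is a short
// transaction and never takes a long-lived table lock.
using Microsoft.Data.SqlClient;

class OldDataDeleter
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=Social;Integrated Security=true;TrustServerCertificate=true";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        int deleted;
        do
        {
            using var command = new SqlCommand(@"
                DELETE TOP (500) FROM Posts
                WHERE ExpiryDateTime < SYSUTCDATETIME();", connection);
            deleted = command.ExecuteNonQuery();
        } while (deleted > 0);
    }
}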
Edit for 'all code must be in a single website project' restriction
There is another solution which in some ways is better and works with your (slightly odd and very much not best practice) requirement.
That is, delete old entries whenever you make an insertion. So when your code adds a post ("insert into posts...") it also runs the delete ("delete from posts where...").
This ensures that your program is self-maintaining. However, you do incur a performance hit when adding posts, and given that a large social media site would be continually adding posts and needs to scale with its users, I don't recommend this solution.
However for small projects which don't need to scale it is neater.

Task Scheduling / Load Balance Pattern

I've run into this a few times recently at work: we have to develop an application that completes a series of items on a schedule. Sometimes this schedule is configurable by the end user; other times it's set in a config file. Either way, this task is something that should only be executed once, by a single machine. This isn't generally difficult until you introduce the need for SOA/geo redundancy. In this particular case there are a total of 4 (could be 400) instances of this application running, two in each data center on opposite sides of the US.
I'm investigating successful patterns for this sort of thing. My current solution has each physical location determining whether it should be active or dormant. We do this by checking a session object that is maintained on another server. If data center A is the live setup, then the logic auto-magically prevents the instances in data center B from performing any execution. (We don't want the work to traverse the MPLS link between DCs.)
The two remaining instances in DC A will then query the database for any jobs that need to be executed in the next 3 hours and cache them. A separate timer runs every second, checking for jobs that need to be executed.
If it finds one, it will first execute a stored procedure that forces a full table lock, queries for the job that needs to be executed, and checks the "StartedByInstance" column for a value; if it doesn't find a value, it marks that record as being executed by InstanceX. Only then will it actually execute the job.
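For illustration only, the claim step described above can also be expressed as a single atomic UPDATE rather than a full table lock; this sketch assumes a Jobs table with the StartedByInstance column mentioned above, and the other names are placeholders.

// Atomic claim: only one instance can win the UPDATE, everyone else
// affects zero rows. Jobs/JobId are illustrative names.
using Microsoft.Data.SqlClient;

public static class JobClaimer
{
    public static bool TryClaim(SqlConnection connection, int jobId, string instanceName)
    {
        using var command = new SqlCommand(@"
            UPDATE Jobs
            SET    StartedByInstance = @instance
            WHERE  JobId = @jobId
              AND  StartedByInstance IS NULL;", connection);
        command.Parameters.AddWithValue("@instance", instanceName);
        command.Parameters.AddWithValue("@jobId", jobId);

        return command.ExecuteNonQuery() == 1;
    }
}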
My direct questions are:
Is this a good pattern?
Are there any better patterns?
Are there any libraries/APIs that would be of interest?
Thanks!

Use Quartz.NET to monitor a MySQL table and execute some stuff

Is it possible to accomplish the following using Quartz.NET:
I have a MySQL table named "VODContent" with a datetime field "StreamAt". Is it possible to monitor this table so that, 5 minutes before or after the "StreamAt" datetime value, some process (code) is executed?
I'm available for further details if needed.
Thanks in advance
Sure, this is possible:
Create one Quartz.NET job to poll the VODContent table (there are other ways than polling for other DBMSs, but this is a simple solution that works for MySQL, since the MySQL connector doesn't feature event notification as far as I know). In order to create such a job, you have to implement the IJob interface of Quartz.NET as described here, and your program has to schedule it to run every n seconds/minutes, which is also described in the tutorial.
A short tutorial which shows how to connect to, read from and write to a MySQL database via the MySQL connector can be found here.
How to find out whether you have already started a job for a specific VODContent row depends on your setup. You could, for example, add a bool column "Processed" that gets updated when the job for that row is scheduled. You then only have to query for rows that are not yet processed.
Please read the parts about stateful vs. stateless jobs in Quartz.NET and be sure you understand the difference.
The polling job created above would then check for changes in the table and schedule a new job, which in turn runs your desired process/code according to your rules, using a simple trigger as described in the tutorial.
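A minimal sketch of such a polling setup, assuming the Quartz.NET 3.x API; the actual VODContent query is only hinted at in a comment.

// A job that polls VODContent, scheduled to run once a minute.
using System;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class PollVodContentJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // Query VODContent for rows whose StreamAt is within +/- 5 minutes of now
        // and that are not yet marked as Processed, then run or schedule the work.
        Console.WriteLine($"Polled VODContent at {DateTime.UtcNow:O}");
        return Task.CompletedTask;
    }
}

public static class PollingSetup
{
    public static async Task StartAsync()
    {
        var scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        var job = JobBuilder.Create<PollVodContentJob>()
            .WithIdentity("pollVodContent")
            .Build();

        var trigger = TriggerBuilder.Create()
            .WithIdentity("pollVodContentTrigger")
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(1).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}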
My feeling is that by doing what you asked, you are not using Quartz to its full potential.
An easier way to achieve what you want would be, each time you create an entry in VODContent, to also create a trigger scheduled at StreamAt + 5 minutes. This should be done at the level of your application code, in the process that adds the entry to VODContent.
The trigger you create must have a name and a group (so that you can retrieve it at a later point). In your case, I believe the name could be the Id of your VODContent row. Here is how you create such a trigger:
var simpleTrigger = new SimpleTrigger("yourVODContentId", "VODContentStreamAt", startTime);
Then, if at some point you want to cancel the execution of your trigger (let's say, for example, that you remove an entry from VODContent because your user cancelled his order), you can call UnscheduleJob on the scheduler to remove your trigger:
_scheduler.UnscheduleJob("yourVODContentId", "VODContentStreamAt");
One last important thing is that your Quartz.NET instance should be configured to use AdoJobStore, not RAMJobStore. That way, your triggers will be stored in a database, and even if the scheduler is stopped/restarted you won't lose your scheduled triggers.
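One way to wire up AdoJobStore via configuration properties might look roughly like this; the property keys come from the Quartz.NET configuration scheme, but the provider name, serializer and connection string are assumptions that depend on your Quartz.NET version and installed packages (and AdoJobStore also requires the Quartz database tables to be created).

// Rough AdoJobStore configuration sketch; values are placeholders.
using System.Collections.Specialized;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public static class PersistentSchedulerFactory
{
    public static async Task<IScheduler> CreateAsync()
    {
        var properties = new NameValueCollection
        {
            ["quartz.jobStore.type"] = "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz",
            ["quartz.jobStore.driverDelegateType"] = "Quartz.Impl.AdoJobStore.StdAdoDelegate, Quartz",
            ["quartz.jobStore.dataSource"] = "default",
            ["quartz.dataSource.default.provider"] = "MySql",
            ["quartz.dataSource.default.connectionString"] =
                "Server=localhost;Database=quartz;Uid=app;Pwd=secret",
            ["quartz.serializer.type"] = "json"
        };

        var factory = new StdSchedulerFactory(properties);
        var scheduler = await factory.GetScheduler();
        await scheduler.Start();
        return scheduler;
    }
}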
Basically, you should try to put yourself in a position where you push to Quartz, instead of having Quartz pull from your application.
