Options for users can be turned on or off through checkboxes on the front end of a Web API web app:
Users    Func A   Func B   Func C   Func D
==========================================
James      o        o        o        o
Mary       o        o        o        o
Clicking on a checkbox (o) calls an API endpoint that updates that function setting for a user.
Users need to be sent an email when their options change, BUT sending a separate email every time a checkbox is clicked isn't desirable for obvious reasons. I know I need to implement a per-user timer that delays actually sending the email until, say, a minute after that user's option was last changed.
The code called every time a function is clicked would look something like this:
private void UpDateUserNotificationsSettings(UserModel model)
{
    // Call API endpoint to update the setting.

    // Create a timer if one doesn't exist for this user.
    // (I need the code for this bit)
    _delayTimer = new System.Timers.Timer(); // (create if it doesn't exist)
    _delayTimer.Interval = 60000;
    _delayTimer.Elapsed += (o, e) => sendEmailMethod();
    _delayTimer.Start();
}
However, I (think I) need a separate timer for each user, and I'm not sure how to go about that.
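One way to sketch the per-user part is a thread-safe dictionary of timers keyed by a user id; restarting a user's timer on every change gives the one-minute debounce. This is only a sketch under assumptions: model.UserId and SendEmailMethod are hypothetical names.

    using System.Collections.Concurrent;
    using System.Timers;

    private static readonly ConcurrentDictionary<string, Timer> _delayTimers =
        new ConcurrentDictionary<string, Timer>();

    private void UpDateUserNotificationsSettings(UserModel model)
    {
        // Call API endpoint to update the setting.

        // Get (or lazily create) the timer for this user.
        var timer = _delayTimers.GetOrAdd(model.UserId, id =>
        {
            var t = new Timer(60000) { AutoReset = false };
            t.Elapsed += (o, e) => SendEmailMethod(id); // hypothetical email method
            return t;
        });

        // Restarting the countdown means the email goes out one minute
        // after the user's *last* change, not the first.
        timer.Stop();
        timer.Start();
    }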
I agree with AlwaysLearning. The idea is to create a background service that runs at set intervals (or continuously). You can use Hangfire for this: https://www.hangfire.io/. Hangfire provides a clean way of managing background jobs, and you can schedule a job with a CRON expression; there are tools like http://corntab.com/ to generate the CRON expression you want.
You will have to maintain the state of each user's function settings in the database so you can tell what changed from the previous settings. If changes fall within the window, notify the users.
I hope this helps you reach final working code.
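For instance, a recurring Hangfire job driven by a CRON expression might be sketched like this (the job id and NotifyChangedUsers, which would do the settings diff and send the emails, are placeholder names):

    using Hangfire;

    // Runs every minute; the job compares current settings with the stored
    // previous settings and emails users whose options changed in the window.
    RecurringJob.AddOrUpdate(
        "notify-changed-users",     // hypothetical recurring job id
        () => NotifyChangedUsers(), // hypothetical diff-and-email method
        "*/1 * * * *");             // CRON: every minute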
This description is all theory at the moment; I don't have any code yet. I was hoping to bounce ideas off people.
I have a VueJS app, let's say a To Do app. It lists all of the things I need to do today. When I complete a To Do, I check a box in my Vue app and Axios fires off an HTTP POST to a .NET API endpoint. Let's say that API method has to do several things, like update several databases, execute a few stored procedures, etc., so it can take a few seconds to complete. My Vue app gets a success response and I can then go on to check the next completed item.
If I have several things I've completed it could take several minutes of my day to check, wait, check and wait. Now I want to check several items or maybe even select all. I want to submit a list of items to the API, let them queue up and process in the background while I go about other business in the app. All the while, a panel in the app displays the items still being processed in the background. As each one completes in the API, a push notification occurs and the UI updates, removing the item from the list.
Does this sound doable? Would I have Vue listening for updates from the API? or would Vue periodically have to poll the API to see what it still has outstanding? What is the preferred way? The goal is to free up the user to keep working rather than watch paint dry.
@Connie, I can tell you from experience that it's really possible, with a few tweaks.
The first thing I'd do is put all the logic inside Vuex.
Making it really simple, the steps would be:
1. Create a vuex state called toDos, and I'd assume that each toDo is an object with a format such as:
toDoModel = {
  id: 1,
  completed: false
}
API receives only 1 ID for updating
Create a vuex mutation for updating this toDos state:
updateToDo(state, toDoObject) {
  const toDoObjectIndexOnState = state.toDos.findIndex(toDo =>
    toDo.id === toDoObject.id)
  // ToDo not found on state list
  if (toDoObjectIndexOnState === -1) {
    state.toDos.push(toDoObject)
    return
  }
  // splice keeps the replacement reactive in Vue 2
  // (a direct index assignment is not detected)
  state.toDos.splice(toDoObjectIndexOnState, 1, toDoObject)
}
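Outside Vue, the same find-or-append logic can be exercised as a plain script (state shape as in the toDoModel above):

```javascript
// Same logic as the mutation above, extracted as a plain function
function updateToDo(state, toDoObject) {
  const i = state.toDos.findIndex(toDo => toDo.id === toDoObject.id)
  if (i === -1) {
    state.toDos.push(toDoObject)
    return
  }
  state.toDos.splice(i, 1, toDoObject)
}

const state = { toDos: [{ id: 1, completed: false }] }
updateToDo(state, { id: 1, completed: true })   // replaces the existing item
updateToDo(state, { id: 2, completed: false })  // appends the unknown item
console.log(state.toDos)
// → [ { id: 1, completed: true }, { id: 2, completed: false } ]
```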
Create a vuex action called updateToDoState, to perform the Axios call and update state:
updateToDoState({commit}, toDoId) {
  // Call the API via Axios, assuming `data` is the key carrying the updated toDo
  return axios.post(ENDPOINT, toDoId)
    .then(({data: toDo}) => {
      if (!toDo) return
      // Call mutation
      commit('updateToDo', toDo)
    })
}
Make the call in your main Vue component: dispatch the updateToDoState action on each checkbox click to update that toDo's state
API can receive multiple IDs for updating
You have two approaches:
- Have a mutation change one toDo at a time and let the action loop through them all
- Have the action pass the whole list and let the mutation take care of updating the store object for each returned ID
My examples here assume the first option
Create a vuex mutation for updating this toDos state:
updateToDo(state, toDoObject) {
  const toDoObjectIndexOnState = state.toDos.findIndex(toDo =>
    toDo.id === toDoObject.id)
  // ToDo not found on state list
  if (toDoObjectIndexOnState === -1) {
    state.toDos.push(toDoObject)
    return
  }
  // splice keeps the replacement reactive in Vue 2
  // (a direct index assignment is not detected)
  state.toDos.splice(toDoObjectIndexOnState, 1, toDoObject)
}
Create a vuex action called updateToDosState, to perform the Axios call and update state:
updateToDosState({commit}, toDosIdsList) {
  // Call the API via Axios, assuming `data` is the key carrying the updated toDos
  return axios.post(ENDPOINT, toDosIdsList)
    .then(({data: toDos}) => {
      if (!toDos) return
      // Call the mutation once per returned toDo
      toDos.forEach(toDo => commit('updateToDo', toDo))
    })
}
Make the call in your main Vue component: dispatch the updateToDosState action to batch-update the toDos state
In case any part of this logic / code is not 100% clear, just let me know and I can update here!
Good evening,
In my SignalR application I have a JavaScript timer that runs for all users "simultaneously". At the end of this timer a server function is called, and this is where the problem starts.
As the function is called at the end of the timer, every connected user calls it at the same time, which is unnecessary because it returns the same output for all connected users. Being a logically complex function, having the server run it once per user adds up to a great waste of resources.
How can I make it run only once (maybe the first time it is called, until the next timer fires)?
Thank you in advance
You could make use of GlobalHost.ConnectionManager.GetHubContext. This allows you to get any hub context and then trigger Clients.All.YourFunction on that context, which sends a message to all connected clients subscribed to that hub.
You will need a background process that runs at the time your JavaScript function fires (by the way, relying on all your clients to call a JavaScript function simultaneously is really not a good idea; different client locations and different machine performance mean the calls are unlikely to be simultaneous).
The following assumes you're running this on a single server. If you're going to deploy to a web farm, you'll need to use a database value to ensure you don't repeat the same work, or make one particular server instance responsible for the calls (otherwise you'll end up with one call per server).
Create a process that runs in the background (I'm sticking with a simple thread here; I actually use HangFire for this, but a thread will suffice for the example), e.g. in App_Start:
Thread thread = new Thread(new ThreadStart(YourFunction));
thread.Start();
Then create YourFunction which will be responsible for your client calls:
private bool Cancel = false;

private void YourFunction()
{
    do
    {
        string foo = "Foo";
        IHubContext context = GlobalHost.ConnectionManager.GetHubContext<YourHub>();
        context.Clients.All.SendYourMessage(foo);
        Thread.Sleep(10000);
    } while (!Cancel);
}
And then on the client, just handle the message from the hub:
yourHub.client.sendYourMessage = function (message)
{
    // message == "Foo"
};
I'm trying to find a solution for a send-email action that may take a long time and time out our load balancer, which is on Rackspace. The only question I could find that relates to this specific issue is this:
keep load balancer from timing out during long operation
As I understand it, I need to run another action while the main slow action completes, constantly polling and returning in order to keep things alive. My email action contains the following:
var sendto = db.Users
    .Where(b => b.Id == model.SentTo ||
        (model.SelectedRoles.Any(s => b.Roles.Select(h => h.RoleId).Contains(s))
         && (b.enrollment.Any(h => h.cohort.CourseID == model.CourseID) || model.CourseID == null)
         && (b.OrgID == model.OrgID || model.OrgID == null)))
    .ToList();
foreach (var address in sendto)
{
    string Body = "message goes here";
    EmailConfig.SendMessageViaMailGun(filestoattach, address.Email, null, email.Subject, Body);
}
So a list is created and then looped through, with emails being sent to each person on the list. The async method answer in the question above seems like it would do the trick, but in the comments I can see this is considered a bad idea. It's also out of date in terms of how async works in the latest MVC version.
My question is what is the best way to keep this action from timing out the load balancer whilst it is completing?
This has nothing to do with async really, it is an infrastructure issue.
There are two ways to perform long operations:
The proper way: have a backend server with a worker process running there, and communicate with that process via queuing (or database polling). The worker stores its progress in a database, and the web server reads that progress and updates the client UI. You also need to track progress on the backend so the work can continue after an unexpected shutdown.
The cheap way: spin up a different thread (or task) on the web server, have it perform the operation, and poll that thread's progress from JavaScript. This can, however, get shut down at any minute (web server recycle) and you lose the operation, unless you pick it up and continue. A crude version would be to wrap the whole thing you have in Task.Run and return right away, then query the progress from JavaScript; but as I said above, this is prone to interruptions.
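A hypothetical sketch of the cheap way, with the email loop from the question moved inside Task.Run and progress kept in a static dictionary (BuildRecipientList, the subject and body strings, and the controller names are all placeholders):

    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    private static readonly ConcurrentDictionary<Guid, int> _progress =
        new ConcurrentDictionary<Guid, int>();

    public ActionResult SendEmails(EmailModel model)
    {
        var jobId = Guid.NewGuid();
        _progress[jobId] = 0;

        Task.Run(() =>
        {
            var sendto = BuildRecipientList(model); // the db.Users query above
            for (int i = 0; i < sendto.Count; i++)
            {
                EmailConfig.SendMessageViaMailGun(null, sendto[i].Email, null, "subject", "body");
                _progress[jobId] = (i + 1) * 100 / sendto.Count;
            }
        });

        // Return immediately so the load balancer never waits;
        // JavaScript polls the Progress action below with this id.
        return Json(new { jobId });
    }

    public ActionResult Progress(Guid jobId)
    {
        _progress.TryGetValue(jobId, out var percent);
        return Json(new { percent }, JsonRequestBehavior.AllowGet);
    }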
I've got a routine called GetEmployeeList that loads when my Windows Application starts.
This routine pulls in basic employee information from our Active Directory server and retains this in a list called m_adEmpList.
We have a few Windows accounts set up as Public Profiles that most of our employees on our manufacturing floor use. This m_adEmpList gives our employees the ability to log in to select features using those Public Profiles.
Once all of the Active Directory data is loaded, I attempt to "auto logon" that employee based on the System.Environment.UserName if that person is logged in under their private profile. (employees love this, by the way)
If I do not thread GetEmployeeList, the Windows Form will appear unresponsive until the routine is complete.
The problem with GetEmployeeList is that we have had times when the Active Directory server was down, the network was down, or a particular computer was not able to connect over our network.
To get around these issues, I have included a ManualResetEvent m_mre with the THREADSEARCH_TIMELIMIT timeout so that the process does not go off forever. I cannot login someone using their Private Profile with System.Environment.UserName until I have the list of employees.
I realize I am not showing ALL of the code, but hopefully it is not necessary.
public static ADUserList GetEmployeeList()
{
if ((m_adEmpList == null) ||
(((m_adEmpList.Count < 10) || !m_gotData) &&
((m_thread == null) || !m_thread.IsAlive))
)
{
m_adEmpList = new ADUserList();
m_thread = new Thread(new ThreadStart(fillThread));
m_mre = new ManualResetEvent(false);
m_thread.IsBackground = true;
m_thread.Name = FILLTHREADNAME;
try {
m_thread.Start();
m_gotData = m_mre.WaitOne(THREADSEARCH_TIMELIMIT * 1000);
} catch (Exception err) {
Global.LogError(_CODEFILE + "GetEmployeeList", err);
} finally {
if ((m_thread != null) && (m_thread.IsAlive)) {
// m_thread.Abort();
m_thread = null;
}
}
}
return m_adEmpList;
}
I would like to just put a basic lock using something like m_adEmpList, but I'm not sure if it is a good idea to lock something that I need to populate, and the actual data population is going to happen in another thread using the routine fillThread.
If the ManualResetEvent's WaitOne timer fails to collect the data I need in the time allotted, there is probably a network issue, and m_mre does not have many records (if any). So, I would need to try to pull this information again the next time.
If anyone understands what I'm trying to explain, I'd like to see a better way of doing this.
It just seems too forced, right now. I keep thinking there is a better way to do it.
I think you're going about the multithreading part the wrong way. Threads should cooperate, not compete for resources, and competition is exactly what's troubling you here. Another problem is that your timeout is at once too long (long enough to annoy users) and too short (the AD server may be slow but still there and serving). Your goal should be to let the thread run in the background and, when it is finished, have it update the list. In the meantime, present some fallbacks to the user along with a notification that the user list is still being populated.
A few more notes on your code above:
You have a variable m_thread that is only used locally. Furthermore, your code contains a redundant check of whether that variable is null.
If you create a user list with defaults/fallbacks first and then update it through a function (making sure to check the InvokeRequired flag of the displaying control!), you won't need a lock. The thread then does not access the list stored as a member but a separate list it has exclusive access to (not a member variable). The update function then replaces (!) the member list, so the new list is for exclusive use by the UI.
Lastly, if the AD server is really not there, try to forward the error from the background thread to the UI in some way, so that the user knows what's broken.
If you want, you can add an event to signal the thread to stop, but in most cases that won't even be necessary.
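Put together, the suggestion might be sketched like this, from inside the form so BeginInvoke marshals onto the UI thread (FillFromActiveDirectory, ShowAdError and TryAutoLogon are hypothetical names):

    private void LoadEmployeesInBackground()
    {
        var thread = new Thread(() =>
        {
            // The thread fills a list it has exclusive access to; nothing shares it.
            var list = new ADUserList();
            try
            {
                FillFromActiveDirectory(list); // the AD query from fillThread
            }
            catch (Exception err)
            {
                // Forward the error to the UI instead of timing out silently.
                BeginInvoke((Action)(() => ShowAdError(err)));
                return;
            }
            // Replace (!) the member list on the UI thread; no lock needed
            // because the UI only ever sees a complete list.
            BeginInvoke((Action)(() => { m_adEmpList = list; TryAutoLogon(); }));
        });
        thread.IsBackground = true;
        thread.Start();
    }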
I am working with an ASP site which requires a "reminder" email to be sent out every x, y and z minutes. I have attempted to start a timer upon an event (like a button click or page load), but this proved unreliable, as the timers would be disposed of when the server performed an automatic backup or when the aspx.cs file was updated.
My new idea is to have a timer constantly running (a check performed on page load ensures it's running) and, when it elapses, it checks whether x, y or z minutes have elapsed. So if y elapses, it needs to send out a "reminder" email and then restart y's timer.
void ParentTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    foreach (Timer childTimer in ChildTimerList)
    {
        if (childTimer.Enabled == false) // And therefore has elapsed
        {
            sendReminderEmail(childTimer);
            // Mutate the timer in place; a foreach iteration variable
            // cannot be reassigned in C#.
            checkAndSetCorrectInterval(childTimer);
            childTimer.AutoReset = false;
            childTimer.Enabled = true;
        }
    }
}
The list ChildTimerList would obviously contain x, y and z.
Can anybody foresee me running into any problems with this, or are there better ways to approach it? My perfect solution would be a timer running constantly which doesn't need to be started by an event, but I don't think this is possible with ASP.
Furthermore, where should I initialise my parent timer and childlist variables? In a class within the App_Code folder or, statically, in a code-behind aspx.cs page?
Thanks in advance.
EDIT:
Yes, I do mean ASP.NET... :)
I would probably implement this with a simple console application (responsible for sending e-mails) and Task Scheduler in Windows (responsible for running the application on a schedule). Keep it simple. And robust.
Edit: Provided that you are in control of the server - it will probably not be the best solution in a shared hosting environment where you're only allowed to run web apps.
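For instance, registering such a console app with Windows Task Scheduler from an elevated command line might look like this (the task name, executable path and 5-minute interval are placeholders):

    schtasks /Create /TN "ReminderMailer" /TR "C:\Apps\ReminderMailer.exe" /SC MINUTE /MO 5

The console app then only needs to check which of the x, y and z reminders are due and send those emails; the scheduler survives backups, recycles and code deployments that kill in-process timers.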