I'm working on a project where users can queue up actions to happen in the future. Each future action is saved in the database. I'm trying to find a way to get all the actions that have already passed and act on them. Currently I'm doing this:
public class TaskingManager
{
private static readonly System.Timers.Timer RefreshEventsLoop = new(1000);
public void Initialize()
{
RefreshEventsLoop.Elapsed += RefreshEventsLoop_Elapsed;
RefreshEventsLoop.AutoReset = false;
RefreshEventsLoop.Start();
}
private void RefreshEventsLoop_Elapsed(object? sender, System.Timers.ElapsedEventArgs e)
{
RefreshEventsLoop.Enabled = false;
ProcessEventsQueue();
RefreshEventsLoop.Enabled = true;
}
private void ProcessEventsQueue()
{
var timeNow = DateTime.UtcNow;
var context = new GalaxyDbContext(_dbOptions);
var dbFunctions = new DatabaseFunctions(context);
var eventsToProcess = context.FutureEvents
.Where(futureEvent => futureEvent.TriggerTime <= timeNow)
.ToList();
if (eventsToProcess.Any())
{
context.FutureEvents.RemoveRange(eventsToProcess);
context.SaveChanges();
foreach (var pastEvent in eventsToProcess)
{
_messageMap[pastEvent.MessageType].Invoke(dbFunctions).Execute(pastEvent);
}
}
else
{
dbFunctions.CreateFutureScienceTask();
}
}
}
This seems to work ok. But the problem is that after the app has been running for a while, I can see the LINQ part is taking up a huge amount of memory:
524MB used
And if I leave it running for a few days then it's up in the gigs.
I'm guessing running this query every second is wasteful on resources. But I don't know of a better way to do this. Is there a way to continuously check the database for something like this?
The first thing I can see is that you are not disposing the database context after you use it.
Read this for more info about the Entity Framework context lifetime.
To properly dispose it, use a using statement.
using (var context = new GalaxyDbContext(_dbOptions))
{
//your code that uses the context
}
Or with the newer using declaration syntax:
using var context = new GalaxyDbContext(_dbOptions);
//your code that uses the context
Right now the problem could be that you create a context every time you call the function and never dispose it, so it keeps references to all the data it has loaded.
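For example, here is a minimal sketch of the ProcessEventsQueue method from the question with the context scoped to a using block (same _dbOptions and _messageMap fields assumed):

private void ProcessEventsQueue()
{
    var timeNow = DateTime.UtcNow;

    // Dispose the context (and everything it tracks) as soon as the batch is done.
    using (var context = new GalaxyDbContext(_dbOptions))
    {
        var dbFunctions = new DatabaseFunctions(context);
        var eventsToProcess = context.FutureEvents
            .Where(futureEvent => futureEvent.TriggerTime <= timeNow)
            .ToList();

        if (eventsToProcess.Any())
        {
            context.FutureEvents.RemoveRange(eventsToProcess);
            context.SaveChanges();

            foreach (var pastEvent in eventsToProcess)
            {
                _messageMap[pastEvent.MessageType].Invoke(dbFunctions).Execute(pastEvent);
            }
        }
        else
        {
            dbFunctions.CreateFutureScienceTask();
        }
    }
}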
The following boggles my mind:
I have to bulk insert a lot of changes; some are inserts, some are updates. I am not sure of the best way to do it.
Logic looks something like this:
public class Worker
{
public void Run(){
var mailer = new Mailer();
HashSet<DbElement> dbElementsLookUp = new HashSet<DbElement>(dbContext.DbElements);
List<Element> elements = GetSomeChangesFromSomewhere();
var dbElementsToSave = new List<DbElement>();
foreach(var element in elements)
{
CreateOrUpdateDbElement(element, dbElementsLookUp, dbElementsToSave);
// Sends some data based on the element - due to legacy implementation it uses its own context
mailer.SendSomeLogging(element);
}
try
{
dbContext.ChangeTracker.DetectChanges();
dbContext.Set<DbElement>().AddRange(dbElementsToSave);
dbContext.SaveChanges();
}
catch (Exception e)
{
LogErrors(e);
}
}
private void CreateOrUpdateDbElement(Element element, HashSet<DbElement> lookUp, List<DbElement> dbElementsToSave)
{
var entity = lookUp.FirstOrDefault(e => e.Id == element.Id);
if(entity is not null)
{
entity.SomeProperty = element.SomeProperty;
dbContext.Configuration.AutoDetectChangesEnabled = false;
dbContext.Entry(entity).State = EntityState.Modified;
dbContext.Configuration.AutoDetectChangesEnabled = true;
}
else
{
dbElementsToSave.Add(new DbElement
{
SomeProperty = element.SomeProperty,
CreationDate = DateTime.Now
});
}
}
}
I'm not sure what the best way to do this is, especially regarding DetectChanges. Is it safe to disable AutoDetectChanges and call DetectChanges outside of the foreach? I am working with a lot of data, and due to the legacy implementation it is pretty slow, because for each mail there is a write operation on the database. The mailer actually works on another instance of the context, so it does not interfere with saving the DbElement objects.
Is it better to add the entities to update to another list and do the same as for the adding of entities?
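For the batch update above, here is a rough sketch of one way to do it (assuming EF6, since the code uses dbContext.Configuration.AutoDetectChangesEnabled; the Dictionary lookup keyed by Id and the single DetectChanges call before SaveChanges are my additions, not part of the original code):

public void Run()
{
    var mailer = new Mailer();
    // Assumes Id is unique; a dictionary makes the lookup O(1) instead of scanning the set.
    var dbElementsLookUp = dbContext.DbElements.ToDictionary(e => e.Id);
    List<Element> elements = GetSomeChangesFromSomewhere();
    var newElements = new List<DbElement>();

    // Turn change detection off once for the whole batch instead of toggling it per entity.
    dbContext.Configuration.AutoDetectChangesEnabled = false;
    try
    {
        foreach (var element in elements)
        {
            mailer.SendSomeLogging(element);   // still uses its own context, as in the original

            if (dbElementsLookUp.TryGetValue(element.Id, out var entity))
            {
                // Tracked entity: just mutate it; DetectChanges below will pick it up.
                entity.SomeProperty = element.SomeProperty;
            }
            else
            {
                newElements.Add(new DbElement
                {
                    SomeProperty = element.SomeProperty,
                    CreationDate = DateTime.Now
                });
            }
        }

        dbContext.Set<DbElement>().AddRange(newElements);
        dbContext.ChangeTracker.DetectChanges();   // one scan for the whole batch
        dbContext.SaveChanges();
    }
    catch (Exception e)
    {
        LogErrors(e);
    }
    finally
    {
        dbContext.Configuration.AutoDetectChangesEnabled = true;
    }
}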
I'm writing a program that will analyze changes in the stock market.
Every time the candles on the stock charts are updated, my algorithm scans every chart for certain pieces of data. I've noticed that this process takes about 0.6 seconds each time, freezing my application. It's not getting stuck in a loop, and there are no other problems like exceptions slowing it down. It just takes a bit of time.
To solve this, I'm trying to see if I can thread the algorithm.
In order to call the algorithm to check over the charts, I have to call this:
checkCharts.RunAlgo();
As threads need an object, I'm trying to figure out how to run the RunAlgo(), but I'm not having any luck.
How can I have a thread run this method in my checkCharts object? Due to back propagating data, I can't start a new checkCharts object. I have to continue using that method from the existing object.
EDIT:
I tried this:
M4.ALProj.BotMain checkCharts = new ALProj.BotMain();
Thread algoThread = new Thread(checkCharts.RunAlgo);
On the checkCharts part of checkCharts.RunAlgo, it gives me: "An object reference is required for the non-static field, method, or property 'M4.ALProj.BotMain'."
In a specific if statement, I was going to put the algoThread.Start(); Any idea what I did wrong there?
The answer to your question is actually very simple:
Thread myThread = new Thread(checkCharts.RunAlgo);
myThread.Start();
However, the more complex part is to make sure that when the method RunAlgo accesses variables inside the checkCharts object, this happens in a thread-safe manner.
See Thread Synchronization for help on how to synchronize access to data from multiple threads.
I would rather use Task.Run than Thread. Task.Run utilizes the ThreadPool which has been optimized to handle various loads effectively. You will also get all the goodies of Task.
await Task.Run(() => checkCharts.RunAlgo());
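Note that await has to live in an async method; as a rough sketch of how that might look from wherever you currently call RunAlgo (OnCandlesUpdated is just a hypothetical handler name for illustration):

// Hypothetical handler that fires when the candles update.
private async void OnCandlesUpdated(object sender, EventArgs e)
{
    // Run the algorithm on a thread-pool thread so the UI stays responsive.
    await Task.Run(() => checkCharts.RunAlgo());

    // Execution resumes here on the original context; safe to update the UI with the results.
}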
Try this code block. It's basic boilerplate, but you can build on and extend it quite easily.
//If M4.ALProj.BotMain needs to be recreated for each run then comment this line and uncomment the one in DoRunParallel()
private static M4.ALProj.BotMain checkCharts = new M4.ALProj.BotMain();
private static object SyncRoot = new object();
private static System.Threading.Thread algoThread = null;
private static bool ReRunOnComplete = false;
public static void RunParallel()
{
lock (SyncRoot)
{
if (algoThread == null)
{
System.Threading.ThreadStart TS = new System.Threading.ThreadStart(DoRunParallel);
algoThread = new System.Threading.Thread(TS);
algoThread.Start();
}
else
{
//Received a recalc call while still calculating
ReRunOnComplete = true;
}
}
}
public static void DoRunParallel()
{
bool ReRun = false;
try
{
//If M4.ALProj.BotMain needs to be recreated for each run then uncomment this line and comment private static version above
//M4.ALProj.BotMain checkCharts = new M4.ALProj.BotMain();
checkCharts.RunAlgo();
}
finally
{
lock (SyncRoot)
{
algoThread = null;
ReRun = ReRunOnComplete;
ReRunOnComplete = false;
}
}
if (ReRun)
{
RunParallel();
}
}
I will try to tell my problem in as simple words as possible.
In my UWP app, I am loading the data asynchronously in my MainPage.xaml.cs:
public MainPage()
{
this.InitializeComponent();
LoadVideoLibrary();
}
private async void LoadVideoLibrary()
{
FoldersData = new List<FolderData>();
var folders = (await Windows.Storage.StorageLibrary.GetLibraryAsync
(Windows.Storage.KnownLibraryId.Videos)).Folders;
foreach (var folder in folders)
{
var files = (await folder.GetFilesAsync(Windows.Storage.Search.CommonFileQuery.OrderByDate)).ToList();
FoldersData.Add(new FolderData { files = files, foldername = folder.DisplayName, folderid = folder.FolderRelativeId });
}
}
so this is the code where I am loading up a List of FolderData objects.
Then, in my other page Library.xaml.cs, I am using that data to load up my GridView with bound data.
protected override void OnNavigatedTo(NavigationEventArgs e)
{
try
{
LoadLibraryMenuGrid();
}
catch { }
}
private async void LoadLibraryMenuGrid()
{
MenuGridItems = new ObservableCollection<MenuItemModel>();
var data = MainPage.FoldersData;
foreach (var folder in data)
{
var image = new BitmapImage();
if (folder.files.Count == 0)
{
image.UriSource = new Uri("ms-appx:///Assets/StoreLogo.png");
}
else
{
for (int i = 0; i < folder.files.Count; i++)
{
var thumb = (await folder.files[i].GetThumbnailAsync(Windows.Storage.FileProperties.ThumbnailMode.VideosView));
if (thumb != null) { await image.SetSourceAsync(thumb); break; }
}
}
MenuGridItems.Add(new MenuItemModel
{
numberofvideos = folder.files.Count.ToString(),
folder = folder.foldername,
folderid = folder.folderid,
image = image
});
}
GridHeader = "Library";
}
The problem I am facing is that when I launch my application, wait for a few seconds, and then navigate to my Library page, all the data loads up properly.
But when I try to navigate to the Library page immediately after launching the app, it throws an exception:
"collection was modified so it cannot be iterated"
I used a breakpoint and found that if I give it a few seconds, the FoldersData list is already fully loaded asynchronously, but when I don't give it a few seconds, the async method is only halfway through loading the data, which causes the exception. How can I handle this async situation? Thanks.
What you need is a way to wait for the data to arrive. How you fit that in with the rest of the application (e.g. MVVM or not) is a different story, and not important right now. Don't overcomplicate things. For example, you only need an ObservableCollection if you expect the data to change while the user is looking at it.
Anyway, you need to wait. So how do you wait for that data to arrive?
Use a static class that can be reached from everywhere. In there put a method to get your data. Make sure it returns a task that you cache for future calls. For example:
internal class Data { /* whatever */ }
internal static class DataLoader
{
private static Task<Data> loaderTask;
public static Task<Data> LoadDataAsync(bool refresh = false)
{
if (refresh || loaderTask == null)
{
loaderTask = LoadDataCoreAsync();
}
return loaderTask;
}
private static async Task<Data> LoadDataCoreAsync()
{
// your actual logic goes here
}
}
With this, you can start the download as soon as you start the application.
await DataLoader.LoadDataAsync();
When you need the data in that other screen, just call that method again. It will not download the data again (unless you set refresh to true), but will simply wait for the work that you started earlier to finish, if it is not finished yet.
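Applied to your pages, a rough sketch (here I'm assuming Data stands in for your List<FolderData>, with the LoadVideoLibrary logic moved into LoadDataCoreAsync; the LoadLibraryMenuGrid overload taking the list is also just for illustration):

// MainPage: start loading, but don't block the constructor.
public MainPage()
{
    this.InitializeComponent();
    _ = DataLoader.LoadDataAsync();   // kicks off the load; the Task is cached
}

// Library page: await the same cached Task before building the grid.
protected override async void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    var foldersData = await DataLoader.LoadDataAsync();   // returns immediately if already loaded
    LoadLibraryMenuGrid(foldersData);
}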
I get that you don't have much experience yet. There are multiple issues, and no real solution, in the way you are loading the data.
What you need is a Service that can give you ObservableCollection of FolderData. I think MVVM might be out of bounds at this instance unless you are willing to spend a few hours on it. Though MVVM will make things lot easier in this instance.
The main issue at hand is this
You are using foreach to iterate the folders and the FolderData list. Foreach cannot continue if the underlying collection changes.
Firstly, you need to start using a for loop as opposed to foreach. Secondly, add a state which denotes whether loading has finished or not. Finally, use an observable data source. In my early days I used to create static properties in App.xaml.cs and use them to share and observe data.
I have a user control that displays information from the database. This user control has to update that information constantly (let's say every 5 seconds). A few instances of this user control are generated programmatically at run time on a single page. In the code-behind of this user control I added code that queries the database for the needed information (which means every single instance of the user control is doing this). But this seems to slow down the processing of queries, so I am making a static class that will do the querying, store the information in its variables, and let the instances of my user control access those variables. Now I need this static class to run its queries every 5 seconds to update its variables. I tried using a new thread to do this, but the variables don't seem to be updated, since I always get a NullReferenceException whenever I access them from a different class.
Here's my static class:
public static class SessionManager
{
public static volatile List<int> activeSessionsPCIDs;
public static volatile List<int> sessionsThatChangedStatus;
public static volatile List<SessionObject> allSessions;
public static void Initialize() {
Thread t = new Thread(SetProperties);
t.Start();
}
public static void SetProperties() {
SessionDataAccess sd = new SessionDataAccess();
while (true) {
allSessions = sd.GetAllSessions();
activeSessionsPCIDs = new List<int>();
sessionsThatChangedStatus = new List<int>();
foreach (SessionObject session in allSessions) {
if (session.status == 1) { //if session is active
activeSessionsPCIDs.Add(session.pcid);
}
if (session.status != session.prevStat) { //if current status doesn't match the previous status
sessionsThatChangedStatus.Add(session.pcid);
}
}
Thread.Sleep(5000);
}
    }
}
And this is how I am trying to access the variables in my static class:
protected void Page_Load(object sender, EventArgs e)
{
SessionManager.Initialize();
loadSessions();
}
private void loadSessions()
{ // refresh the current_sessions table
List<int> pcIds = pcl.GetPCIds(); //get the ids of all computers
foreach (SessionObject s in SessionManager.allSessions)
{
SessionInfo sesInf = (SessionInfo)LoadControl("~/UserControls/SessionInfo.ascx");
sesInf.session = s;
pnlMonitoring.Controls.Add(sesInf);
}
}
Any help, please? Thanks
Multiple threads problem
You have one thread that gets created for each and every call to SessionManager.Initialize.
That happens more than once in the lifetime of the process.
IIS recycles your app at some point, after a period of time should you have absolutely no requests.
Until that happens, all your created threads continue to run.
After the first PageLoad you will have one thread which updates stuff every 5 seconds.
If you refresh the page again you'll have two threads, possibly with different offsets in time but each of which, doing the same thing at 5 second intervals.
You should atomically check whether your background thread has already been started. You need at least an extra static bool field and a static object field which you should use as a monitor (with the lock keyword).
You should also stop relying on volatile and simply use lock to make sure that other threads "observe" updated values for your static List<..> fields.
It may be the case that the other threads don't observe a changed field and thus, for them, the field is still null - therefore you get the NullReferenceException.
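A minimal sketch of that start-once check, keeping your SessionManager shape (the _started flag and _syncRoot monitor are the additions; SetProperties is the loop you already have):

using System.Threading;

public static class SessionManager
{
    private static readonly object _syncRoot = new object();
    private static bool _started;

    public static void Initialize()
    {
        lock (_syncRoot)
        {
            if (_started)
            {
                return;            // background thread is already running
            }
            _started = true;
        }

        var t = new Thread(SetProperties) { IsBackground = true };
        t.Start();
    }

    private static void SetProperties()
    {
        // the refresh loop from the question goes here
    }
}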
About volatile
Using volatile is bad, at least in .NET. There is a 90% chance that you think you know what it is doing and that isn't true, and a 99% chance that you feel relieved because you used volatile while not checking for the other multitasking hazards the way you should.
RX to the rescue
I strongly suggest you take a look at this wonderful thing called Reactive Extensions.
Believe me, a couple of days' research, combined with the fact that you're in a perfect position to use Rx, will pay off, not just now but in the future as well.
You get to keep your static class, but instead of materialised values that get stored within that class you create pipes that carry information. The information flows when you want it to flow. You get to have subscribers to those pipes. The number of subscribers does not affect the overall performance of your app.
Your app will be more scalable, and more robust.
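As a rough sketch of such a pipe, assuming the System.Reactive package and the SessionDataAccess / SessionObject types from your code:

using System;
using System.Collections.Generic;
using System.Reactive.Linq;

public static class SessionStream
{
    // Emits a fresh snapshot of all sessions every 5 seconds, but only
    // while at least one subscriber is listening.
    public static readonly IObservable<List<SessionObject>> AllSessions =
        Observable.Interval(TimeSpan.FromSeconds(5))
                  .Select(_ => new SessionDataAccess().GetAllSessions())
                  .Publish()     // share one underlying query loop
                  .RefCount();   // run only while there are subscribers
}

// Example subscriber (e.g. in a page or a cache):
// var subscription = SessionStream.AllSessions
//     .Subscribe(sessions => { /* update whatever needs the data */ });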
Good luck!
There are a few solutions for this:
One of them is:
It's better to create a thread in Global.asax, in Application_Start or Session_Start (depending on your case), to call your method.
Use the code below:
var t = Task.Run(async () => {
    while (true)
    {
        SessionManager.SetProperties();            // assumes SetProperties does a single refresh pass
        await Task.Delay(TimeSpan.FromSeconds(5)); // wait 5 seconds between passes
    }
});
The second solution is to use a job scheduler for ASP.NET (that's my preferred solution).
For more info you can check this link: How to run Background Tasks in ASP.NET.
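One option in that space is HostingEnvironment.QueueBackgroundWorkItem (available from ASP.NET 4.5.2); a rough sketch, assuming SetProperties is trimmed down to a single refresh pass with no inner loop:

using System;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Hosting;

public static class SessionRefreshJob
{
    // Call this once, e.g. from Application_Start.
    public static void Start()
    {
        HostingEnvironment.QueueBackgroundWorkItem(async (CancellationToken ct) =>
        {
            while (!ct.IsCancellationRequested)
            {
                SessionManager.SetProperties();                // one refresh pass (assumed: no inner loop)
                await Task.Delay(TimeSpan.FromSeconds(5), ct); // wait 5 seconds, abort on shutdown
            }
        });
    }
}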
The third solution is to rewrite your static class as follows:
public static class SessionManager
{
public static volatile List<int> activeSessionsPCIDs;
public static volatile List<int> sessionsThatChangedStatus;
public static volatile List<SessionObject> allSessions;
static SessionManager()
{
Initialize();
}
public static void Initialize() {
    var t = Task.Run(async () => {
        while (true)
        {
            SetProperties();                           // single refresh pass
            await Task.Delay(TimeSpan.FromSeconds(5)); // wait 5 seconds between passes
        }
    });
}
public static void SetProperties() {
    SessionDataAccess sd = new SessionDataAccess();
    allSessions = sd.GetAllSessions();
    activeSessionsPCIDs = new List<int>();
    sessionsThatChangedStatus = new List<int>();
    foreach (SessionObject session in allSessions) {
        if (session.status == 1) { //if session is active
            activeSessionsPCIDs.Add(session.pcid);
        }
        if (session.status != session.prevStat) { //if current status doesn't match the previous status
            sessionsThatChangedStatus.Add(session.pcid);
        }
    }
}
}
This solution is a change in approach, but I kept it in Web Forms to make it more directly applicable to your use case.
SignalR is a technology that enables real-time, two-way communication between the server and clients (browsers), which can replace your static session data class. Below, I have implemented a simple example to demonstrate the concept.
As a sample, create a new ASP.NET Web Forms application and add the SignalR package from nuget.
Install-Package Microsoft.AspNet.SignalR
You will need to add a new Owin Startup class and add these 2 lines:
using Microsoft.AspNet.SignalR;
... and within the method
app.MapSignalR();
Add some UI elements to Default.aspx:
<div class="jumbotron">
<H3 class="MyName">Loading...</H3>
<p class="stats">
</p>
</div>
Add the following JavaScript to the Site.Master. This code references SignalR, implements the client-side event handlers, and initiates contact with the SignalR hub from the browser. Here's the code:
<script src="Scripts/jquery.signalR-2.2.0.min.js"></script>
<script src="signalr/hubs"></script>
<script>
var hub = $.connection.sessiondata;
hub.client.someOneJoined = function (name) {
var current = $(".stats").text();
current = current + '\nuser ' + name + ' joined.';
$(".stats").text(current);
};
hub.client.myNameIs = function (name) {
$(".MyName").text("Your user id: " + name);
};
$.connection.hub.start().done(function () { });
</script>
Finally, add a SignalR Hub to the solution and use this code for the SessionDataHub implementation:
[HubName("sessiondata")]
public class SessionDataHub : Hub
{
private ObservableCollection<string> sessions = new ObservableCollection<string>();
public SessionDataHub()
{
sessions.CollectionChanged += sessions_CollectionChanged;
}
private void sessions_CollectionChanged(object sender, NotifyCollectionChangedEventArgs e)
{
if (e.Action == NotifyCollectionChangedAction.Add)
{
Clients.All.someOneJoined(e.NewItems.Cast<string>().First());
}
}
public override Task OnConnected()
{
return Task.Factory.StartNew(() =>
{
var youAre = Context.ConnectionId;
Clients.Caller.myNameIs(youAre);
sessions.Add(youAre);
});
}
public override Task OnDisconnected(bool stopCalled)
{
// TODO: implement this as well.
return base.OnDisconnected(stopCalled);
}
}
For more information about SignalR, go to http://asp.net/signalr
Link to source code: https://lsscloud.blob.core.windows.net/downloads/WebApplication1.zip
The following code doesn't work:
public void Foo()
{
CompanyDataContext db = new CompanyDataContext();
Client client = (from c in db.Clients ....).Single();
Bar(client);
}
public void Bar(Client client)
{
CompanyDataContext db = new CompanyDataContext();
db.Client.Attach(client);
client.SomeValue = "foo";
db.SubmitChanges();
}
This doesn't work; I get the error message "An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported."
How do you work with DataContexts throughout an application so you don't need to pass around a reference?
They really mean it with 'This is not supported.' Attaching an object fetched from another data context is not implemented.
There are a number of workarounds to the problem; the recommended way is serializing the objects, however that is neither easy nor clean.
The simplest approach I found is to use a read-only DataContext for fetching objects, like this:
MyDataContext dataContext = new MyDataContext()
{
DeferredLoadingEnabled = false,
ObjectTrackingEnabled = false
};
The objects obtained from this context can be attached to another context, but this only applies to some scenarios.
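As a rough sketch of that approach applied to the code from the question (c.Id and someId are placeholders for however you actually select the client):

public void Foo()
{
    Client client;

    // Read-only context: no deferred loading, no change tracking.
    using (var readDb = new CompanyDataContext
    {
        DeferredLoadingEnabled = false,
        ObjectTrackingEnabled = false
    })
    {
        client = readDb.Clients.Single(c => c.Id == someId);
    }

    Bar(client);
}

public void Bar(Client client)
{
    using (var db = new CompanyDataContext())
    {
        db.Clients.Attach(client);   // accepted because the first context never tracked the entity
        client.SomeValue = "foo";    // change made after Attach, so this context picks it up
        db.SubmitChanges();
    }
}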
The PLINQO framework generates a Detach method for all entities, making it easy to detach and reattach objects without receiving that error.
public void Foo()
{
CompanyDataContext db = new CompanyDataContext();
Client client = (from c in db.Clients ....).Single();
// makes it possible to call detach here
client.Detach();
Bar(client);
}
public void Bar(Client client)
{
CompanyDataContext db = new CompanyDataContext();
db.Client.Attach(client);
client.SomeValue = "foo";
db.SubmitChanges();
}
Here is the article describing how the detach was implemented:
http://www.codeproject.com/KB/linq/linq-to-sql-detach.aspx
Yep. That's how it works.
You have tagged this asp.net so I guess it's a web app. Maybe you want one datacontext per request?
http://blogs.vertigo.com/personal/keithc/Blog/archive/2007/06/28/linq-to-sql-and-the-quote-request-scoped-datacontext-quote-pattern.aspx
(P.S. It's a lot harder in WinForms!)
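As a rough sketch of that request-scoped pattern (DataContextFactory and the Items key are hypothetical names; disposing the context at Application_EndRequest is left out for brevity):

using System.Web;

public static class DataContextFactory
{
    private const string Key = "CompanyDataContext";

    // One CompanyDataContext per HTTP request, stored in HttpContext.Items.
    public static CompanyDataContext Current
    {
        get
        {
            var items = HttpContext.Current.Items;
            if (items[Key] == null)
            {
                items[Key] = new CompanyDataContext();
            }
            return (CompanyDataContext)items[Key];
        }
    }
}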
I've created data access classes that encapsulate all the communication with Linq2Sql.
These classes have their own datacontext that they use on their objects.
public class ClientDataLogic
{
private DataContext _db = new DataContext();
public Client GetClient(int id)
{
return _db.Clients.SingleOrDefault(c => c.Id == id);
}
public void SaveClient(Client c)
{
if (ChangeSetOnlyIncludesClient(c))
_db.SubmitChanges();
}
}
Of course you will need to keep this object alive as long as you need the entities it returned.
Checking whether only the right object has been changed is also somewhat bothersome; you could make methods like
void ChangeClientValue(int clientId, int value);
but that can become a lot of code.
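For illustration, such a method on the ClientDataLogic class above might look like this (the Value property is hypothetical, purely to match the int parameter):

public void ChangeClientValue(int clientId, int value)
{
    var client = _db.Clients.SingleOrDefault(c => c.Id == clientId);
    if (client == null)
        return;

    client.Value = value;   // 'Value' is a hypothetical numeric column, just for the example
    _db.SubmitChanges();
}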
Attaching and detaching is a somewhat missing feature in Linq2Sql; if you need to use it a lot, you should probably use Linq2Entities instead.
I took a look at this and found that it appears to work fine as long as the original DataContext has been disposed.
Try wrapping the DataContext with using() and make sure your changes occur after you've attached to the second DataContext. It worked for me:
public static void CreateEntity()
{
User user = null;
using (DataClassesDataContext dc = new DataClassesDataContext())
{
user = (from u in dc.Users
select u).FirstOrDefault();
}
UpdateObject(user);
}
public static void UpdateObject(User user)
{
using (DataClassesDataContext dc = new DataClassesDataContext())
{
dc.Users.Attach(user);
user.LastName = "Test B";
dc.SubmitChanges();
}
}
You need to handle object versioning.
An entity can only be attached as modified without original state if it declares a version member or does not have an update check policy.
So, if there's no timestamp member or other 'versioning' mechanism provided, there's no way for LINQ to SQL to determine whether the data has changed - hence the error you are seeing.
I resolved this issue by adding a timestamp column to my tables but there are other ways around it. Rick Strahl has written some decent articles about exactly this issue.
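For reference, a rough sketch of what a version member looks like in a LINQ to SQL mapping, assuming a rowversion column named Version has been added to the table (property names here are illustrative):

using System.Data.Linq;
using System.Data.Linq.Mapping;

[Table(Name = "Clients")]
public partial class Client
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public string SomeValue { get; set; }

    // rowversion column: lets LINQ to SQL attach the entity as modified
    // and perform optimistic concurrency checks without the original values.
    [Column(IsVersion = true, IsDbGenerated = true, AutoSync = AutoSync.Always,
            DbType = "rowversion NOT NULL")]
    public Binary Version { get; set; }
}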
Also, see this and this for a bit more info.