C# Entity Framework not updating values updated from another source

I'm having a strange issue with my code.
Say I have a list of bananas that I'm deleting in my database, and I also send a stored procedure request to delete each one in another DB belonging to another service (that's why I have the SaveChanges and Refresh calls in my code: to avoid the OptimisticConcurrencyExceptions that sometimes happened). We have synchronization tasks that pull changes from the other service and sync the values into our DB:
foreach (var banana in bananaList)
{
    // ...
    // some conditions
    banana.SendDeleteRequestToService(); // after the request, a sync process updates the values in my DB
    banana.Delete();                     // logical delete: sets IsDeleted = 1 in the database
    Connect.Refresh(banana);             // refreshes the context (RefreshMode.Wins) to avoid concurrency errors
    Connect.SaveChanges(true);           // force the deleted bananas to be written
}
After that I get the bananaTrees that are linked to the deleted bananas and change each tree's status to "No bananas":
var bananaTreeList = bananaTree.GetList().Where(bt => bt.IsDeleted == 1).ToList(); // ToList() so ForEach is available
bananaTreeList.ForEach(bt => bt.Status = "No bananas");
Connect.SaveChanges(true);
The issue is that some items from bananaTreeList don't get their status updated, even though all the bananas are deleted and, while debugging, I can see every bananaTree that needs its status changed in the list. The foreach also visits every item I expect, but the status change is not persisted for some of them.
I suspect the main cause is the synchronization task (which I cannot stop or pause): it writes to the database behind the context's back, and EF can't reconcile changes coming from different sources. I want to understand how to refresh the EF context so that all of my bananaTrees get their statuses updated correctly.
How do I handle this kind of concurrency issue in EF? There is a lot of code in my project, but I've tried to show the most important parts in my example.
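For what it's worth, a minimal sketch of one direction to try: refreshing each tree from the database immediately before changing it, so EF's cached copy can't mask the values written by the sync task (this reuses the Connect.Refresh/SaveChanges helpers from the code above; which RefreshMode fits best here is an open question):
foreach (var bt in bananaTreeList)
{
    Connect.Refresh(bt);      // re-read the row the sync task may have touched
    bt.Status = "No bananas"; // then apply the change on top of the fresh values
}
Connect.SaveChanges(true);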

Related

How to... Display Data from Database

I've got an application which currently consists of two parts:
A Viewer that receives data from a database using EF
A Service that manipulates data in the database at runtime.
The logic behind the scenes includes several projects, such as repositories; data access is implemented with a unit of work. The Viewer itself is a WPF form with an underlying ViewModel.
The ViewModel contains an ObservableCollection which is the data source of my Viewer.
Now the question is: how am I able to retrieve the database data every few minutes? I'm aware of the following two problems:
It's not the latest data my repository is "loading". Does EF do "smart" caching and serve data from its local cache? If so, how can I force EF to load the data from the database?
Re-setting the whole ObservableCollection, or adding/removing entities from another thread/BackgroundWorker (with invocation), is not possible. How am I supposed to solve this?
I will add some of my code if needed, but at the moment I don't think it would help.
Edit:
public IEnumerable<Request> GetAllUnResolvedRequests()
{
    return AccessContext.Requests.Where(o => !o.IsResolved);
}
This piece of code doesn't get the latest data: I edit some rows manually (set IsResolved to true), but this method still returns them.
Edit3:
var requests = AccessContext.Requests
    .Where(o => o.Date >= fromDate && o.Date <= toDate)
    .ToList();
foreach (var request in requests)
{
    AccessContext.Entry(request).Reload(); // re-read each row from the database
}
return requests;
Final Question:
The code above "solves" the problem, but in my opinion it's not clean. Is there another way?
When you load an entity from the database, it is cached (and tracked, so that changes your application makes are recorded, unless you specify AsNoTracking).
This has some downsides (for example, performance degrades as the cache grows, and you can see an old version of an entity, which is your case).
For this reason, when using EF you should work with the unit-of-work pattern (i.e. you should create a new context for every unit of work).
You can have a look at this Microsoft article to understand how to implement the unit-of-work pattern:
http://www.asp.net/mvc/overview/older-versions/getting-started-with-ef-5-using-mvc-4/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
In your case, using Reload is not a good choice because it doesn't scale: every Reload issues its own query to the database. If you just need to return the desired entities, the best way is to create a new context:
public IEnumerable<Request> GetAllUnResolvedRequests()
{
    // a fresh context has an empty cache, so nothing stale can be served
    return GetNewContext().Requests.Where(o => !o.IsResolved).ToList();
}
Here is what you can do.
You can define a Task (kept running on the thread pool) that periodically checks the database (keep in mind that periodically making EF reload data has its own cost), as sketched below.
You can also define a SqlDependency on your query, so that when the data changes the database notifies your main thread.
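A hedged sketch of the polling half of that advice (AccessContext and UnresolvedRequests are placeholder names; every tick opens a short-lived context so EF cannot serve cached entities, and the results are marshalled back to the WPF dispatcher for the ObservableCollection):
var cts = new CancellationTokenSource();
Task.Run(async () =>
{
    while (!cts.Token.IsCancellationRequested)
    {
        List<Request> latest;
        using (var ctx = new AccessContext())   // fresh context = no stale cache
        {
            latest = ctx.Requests
                        .AsNoTracking()         // a read-only snapshot is enough here
                        .Where(o => !o.IsResolved)
                        .ToList();
        }

        // WPF: the ObservableCollection may only be modified on the dispatcher thread.
        Application.Current.Dispatcher.Invoke(() =>
        {
            UnresolvedRequests.Clear();
            foreach (var r in latest)
                UnresolvedRequests.Add(r);
        });

        await Task.Delay(TimeSpan.FromMinutes(2), cts.Token);
    }
}, cts.Token);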

Linq To SQL Delete + Insert best practice

As stated in the title, I need to perform a delete + insert, so I do (Entries stands in for the actual Table<T> property on my DataContext):
context.Entries.DeleteAllOnSubmit(deleteQuery);
foreach (var entry in entries)
    context.Entries.InsertOnSubmit(entry);
context.SubmitChanges();
As written in this post:
Linq to SQL: execution order when calling SubmitChanges()
I read that the delete operation is the last one applied, yet at the moment my logic appears to work (and I'm sure delete + insert happens dozens of times per day).
What I need to understand is whether the post is wrong, or whether my logic is and I've just been lucky so far (perhaps because of an update-check flag in the LINQ to SQL data model?).
After that, I would like to know the better pattern for performing an "update" when the record cardinality changes.
I mean that in my table there is a primary key that identifies an entity (an entity has many records) and a sub-key that identifies each record within the same entity (a sub-entity).
I need to regenerate the records (because sub-entities may have been inserted, edited, or deleted), so I use delete + insert (the message from which I write to the DB contains only the entities and sub-entities that still exist, not the deleted ones).
E.g.:
ID  SubID  Data
1   1_0    Father
2   2_0    Father
2   2_1    Child 1
3   3_0    Father
3   3_1    Child 1
3   3_2    Child 2
I have control neither over the table (and the data format inside it) nor over the message (which I use to write to, or delete from, the table displayed above).
I read that the delete operation is the last one applied, yet at the moment my logic appears to work (and I'm sure delete + insert happens dozens of times per day). What I need to understand is whether the post is wrong, or whether my logic is and I've just been lucky so far (perhaps because of an update-check flag in the LINQ to SQL data model?).
The post is correct: the deletes are actually executed last.
Your code works by design, not by chance.
LINQ to SQL loads all the records to be deleted and then deletes them one by one, and this happens last.
This will not fail or delete the wrong records; however, it does have a performance cost. You can refer to the very good MSDN article on this:
Regardless of how many changes you make to your objects, changes are made only to in-memory replicas. You have made no changes to the actual data in the database. Your changes are not transmitted to the server until you explicitly call SubmitChanges on the DataContext.
When you make this call, the DataContext tries to translate your changes into equivalent SQL commands. You can use your own custom logic to override these actions, but the order of submission is orchestrated by a service of the DataContext known as the change processor.
The sequence of events is as follows (see the MSDN documentation):
When you call SubmitChanges, LINQ to SQL examines the set of known objects to determine whether new instances have been attached to them. If they have, these new instances are added to the set of tracked objects. (This is why the insertions come first.)
All objects that have pending changes are ordered into a sequence of objects based on the dependencies between them. Objects whose changes depend on other objects are sequenced after their dependencies. (Then come the updates.)
(After the updates, the deletions are done.)
Immediately before any actual changes are transmitted, LINQ to SQL starts a transaction to encapsulate the series of individual commands.
The changes to the objects are translated one by one to SQL commands and sent to the server.
At this point, any errors detected by the database cause the submission process to stop, and an exception is raised.
All changes to the database are rolled back as if no submissions ever occurred. The DataContext still has a full record of all changes.
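If the delete-last ordering ever did become a problem (for example, re-inserting rows that reuse the primary keys of rows being deleted), a defensive variant is to submit in two steps so the deletes reach the database before the inserts. A sketch, reusing the placeholder names from the question; TransactionScope (from System.Transactions) keeps the two submissions atomic:
using (var scope = new TransactionScope())
{
    context.Entries.DeleteAllOnSubmit(deleteQuery);
    context.SubmitChanges();                    // deletes executed here

    foreach (var entry in entries)
        context.Entries.InsertOnSubmit(entry);
    context.SubmitChanges();                    // inserts executed here

    scope.Complete();                           // commit both steps as one transaction
}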

How to deal with a stale cache in Entity Framework?

I had been getting very strange behavior from Entity Framework. I am coding a WebApi application, so the objects I get from the browser are disconnected/detached. The data I get back is transactional, such that it does not match any given table in the database. I have to do a number of lookups and data manipulations to get the actual updates to be done on the database.
The problem I seem to have is that, in querying the data, I am filling up the change-tracking cache. That didn't seem like a problem to me, since the true source of data should be the database. When I finally make the data changes and call SaveChanges, I get constraint errors. Here are my steps:
Query data.
Create rows to be inserted.
Compare rows to the DB and make DB changes.
After reviewing the data in Ctx.ChangeTracker.Entries(), I found that an entry that was supposed to be deleted was marked as Modified instead. The way I worked around it was by creating a new context for step 3, and it magically started working. I thought that was it, but in my test case I do a last read from the database to verify that my transaction wrote correctly, and I was getting an extra row that should already have been deleted. And in fact it was, when checking the DB directly. Again, a new context for that last read fixed the problem.
I had just assumed the default caching would only be used to track changes, not to speed up queries.
If I use AsNoTracking in my queries I also get into trouble, because if I try to delete a row queried that way I get an error, and in my code I don't know until later whether I am going to delete or modify it. Is there a way to clear the cache so I don't need to create a new context?
Is there a better way to deal with these issues?
EDIT:
AsNoTracking will do the trick, to some extent. I still found myself instantiating more copies of the DbContext in order to prevent errors. Many-to-one entities have to be deleted in order, or null-foreign-key errors are triggered:
var details = oldInvoice.details.ToList();
Context.Entry(oldInvoice).State = EntityState.Unchanged;            // attach the detached invoice first
Context.Entry(oldInvoice).State = EntityState.Deleted;              // then schedule its deletion
details.ForEach(a => Context.Entry(a).State = EntityState.Deleted); // the children must be deleted as well
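A shape that seems to avoid the AsNoTracking delete error is to attach the detached row before marking it Deleted. A minimal sketch (the Invoices set and Id key are illustrative):
var row = Context.Invoices
                 .AsNoTracking()
                 .Single(i => i.Id == invoiceId);

// ...later, once the business logic decides the row must go...

Context.Invoices.Attach(row);                   // bring the detached row under the context's control
Context.Entry(row).State = EntityState.Deleted; // schedule the DELETE
Context.SaveChanges();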
Entity Framework offers an exception, DbUpdateConcurrencyException, that you can catch on your calls to SaveChanges(). You can loop until the save succeeds, resolving the conflicts something like this:
bool saveFailed;
do
{
    saveFailed = false;
    try
    {
        context.SaveChanges();
    }
    catch (DbUpdateConcurrencyException ex)
    {
        saveFailed = true;

        // Get the current entity values and the values in the database
        var entry = ex.Entries.Single();
        var currentValues = entry.CurrentValues;
        var databaseValues = entry.GetDatabaseValues();

        // Choose an initial set of resolved values. In this case we
        // make the default be the values currently in the database.
        var resolvedValues = databaseValues.Clone();

        // Have the user choose what the resolved values should be
        HaveUserResolveConcurrency(currentValues, databaseValues, resolvedValues);

        // Update the original values with the database values and
        // the current values with whatever the user chose.
        entry.OriginalValues.SetValues(databaseValues);
        entry.CurrentValues.SetValues(resolvedValues);
    }
} while (saveFailed);
Also, your update code sounds suspicious. Usually when you pass data out to a client through WebApi or another mechanism, the data that comes back carries no tracking information, so you should check whether the entity already exists and, if so, re-attach it to the context and change its state to EntityState.Modified before calling SaveChanges().
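A hedged sketch of that re-attach pattern for a detached WebApi payload (the Invoice type, Invoices set, and Id key are illustrative):
public void Update(Invoice incoming)
{
    var exists = ctx.Invoices.Any(i => i.Id == incoming.Id);
    if (exists)
    {
        ctx.Invoices.Attach(incoming);                    // re-attach the detached entity
        ctx.Entry(incoming).State = EntityState.Modified; // issue an UPDATE for all mapped columns
    }
    else
    {
        ctx.Invoices.Add(incoming);                       // genuinely new row: INSERT
    }
    ctx.SaveChanges();
}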

How do I get Entity Framework 5 to update stale data

I have an EF5 WPF/MVVM solution that's working without problems. The project is an order-entry system, but loading an order loads lots of related items, so the context is used to track all the changes and then save them off, which means the context is long-lived. If user A loads an order and doesn't do anything with it, and then user B loads that order and updates it, I have a refresh button that was intended to let user A update the stale data. Unfortunately, I can't seem to get EF5 to ignore its cache. I originally thought this would work:
_trackingContext.GetObjectContext().Refresh(RefreshMode.StoreWins, theOrders);
List<OrderLineItem> line_items = theOrders.SelectMany(x => x.OrderLineItems).ToList();
_trackingContext.GetObjectContext().Refresh(RefreshMode.StoreWins, line_items);
Where GetObjectContext() is just a wrapper:
public ObjectContext GetObjectContext()
{
    return (this as IObjectContextAdapter).ObjectContext;
}
It turns out this doesn't update the data. So I thought maybe I had to change the merge option, and I added:
var set = _trackingContext.GetObjectContext().CreateObjectSet<OrderLineItem>();
set.MergeOption = MergeOption.OverwriteChanges;
I also tried it for Orders (and with the PreserveChanges option), but nothing worked. Eventually I just resorted to disposing and recreating the context and then recreating the search selection, but that seems like overkill. Is there an easier way to get EF5 to replace stale data with fresh data from the database?
Update
OK, it turns out this was a testing-methodology problem. After seeing #jure's reply, implementing it, and watching it apparently not work, I finally got smart and broke out SQL Profiler. The right things were happening behind the scenes, but I wasn't doing the right thing to update the view. Once I fixed that, my original code worked.
There's a Reload method on the DbEntityEntry class, so you can do:
dbContext.Entry(entity).Reload();
but that refreshes only a single object in the context from the DB.
Disposing and recreating the context is the way to go.
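For completeness, a sketch of the dispose-and-recreate route (OrderEntryContext is a stand-in for the real context type; the point is simply that a brand-new context starts with an empty cache):
_trackingContext.Dispose();
_trackingContext = new OrderEntryContext();

var freshOrders = _trackingContext.Orders
    .Include(o => o.OrderLineItems)   // eager-load the children too (needs using System.Data.Entity;)
    .Where(o => o.Id == currentOrderId)
    .ToList();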

RavenDB SaveChanges() taking too long to run?

I ran into a weird problem with RavenDB:
public ActionResult Save(RandomModel model)
{
    // Do some stuff, validate the model, etc.
    RavenSession.Store(model);
    RavenSession.SaveChanges();

    var newListOfModels = RavenSession.Query<RandomModel>().ToList();
    return View("randomview", newListOfModels);
}
newListOfModels does not contain the model I just added with the Store method.
However, if I add a Thread.Sleep(100) after SaveChanges, the stored model is included in the list.
Am I storing and saving things to RavenDB the wrong way?
How should I be doing this?
Of course there is a workaround: just add the incoming model to newListOfModels and run SaveChanges afterwards, for example in a base controller's OnActionExecuted method.
My primary concern is why I need to delay the thread before I can query the document session and find my newly added model there.
RavenDB indexes are stale by their nature. From the documentation:
RavenDB performs data indexing in a background thread, which is executed whenever new data comes in or existing data is updated. Running this as a background thread allows the server to respond quickly even when large amounts of data have changed; however, in that case you may query stale indexes.
So when querying, you need to tell RavenDB to wait for the index to be refreshed.
You can do this with the various WaitFor... customizations; you will most probably want the WaitForNonStaleResultsAsOfLastWrite option:
var newListOfModels = RavenSession
    .Query<RandomModel>()
    .Customize(x => x.WaitForNonStaleResultsAsOfLastWrite())
    .ToList();
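If waiting indefinitely is a concern, the same customization also accepts a timeout in the older 2.x/3.x client API (hedged; check the client version you are on, and note the query throws if the index is still stale when the timeout expires):
var newListOfModels = RavenSession
    .Query<RandomModel>()
    .Customize(x => x.WaitForNonStaleResultsAsOfLastWrite(TimeSpan.FromSeconds(5)))
    .ToList();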
