I read a record from the database and dispose of the ObjectContext associated with it.
Later, when the consumer has finished with the record, I need to update it in the database.
How can I do that with a new context?
Here is how I read records from db and put them into BlockingCollection:
var products = from r in context_.Products
               orderby r.id
               select r;

List<Product> pList = products.Take(10).ToList();
Now, how do I update those records in the database with a new context?
Thanks!
I tried the following, but it failed:

SMEntities outContext = new SMEntities();
Product p = outPipe.GetConsumingEnumerable().First();
outContext.Products.Attach(p);
outContext.SaveChanges();

It throws an exception: "An entity object cannot be referenced by multiple instances of IEntityChangeTracker."
UPDATE (why I do not want to reuse the same context):
Product records are consumed by many tasks on different threads, and I can't call context.SaveChanges in the producer thread, because some threads may be in the middle of applying changes to a Product record. What would happen in that case?
Try this:

SMEntities newContext = new SMEntities();
foreach (var product in products)
{
    newContext.Products.Attach(product);
    newContext.Entry(product).State = EntityState.Modified;
}
newContext.SaveChanges();
All changes are held only in the current context. Once it is disposed, all changes made within that context are gone. If you have a local entity with changes, you can attach it to the new context.
context_.Attach(yourContextObject);
Afterwards just call
context_.SaveChanges();
Attach it again, or as an alternative: fetch it again in your new context and update it there (ApplyCurrentValues). See also Entity Framework 4 - ApplyCurrentValues&lt;TEntity&gt; vs. ObjectStateManager.ChangeObjectState
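A hedged sketch of the ApplyCurrentValues alternative (assuming an EF 4 ObjectContext named SMEntities with a Products set keyed on id, as in the question, and a detached modified instance p):

```csharp
using (var newContext = new SMEntities())
{
    // Load the current version so an entity with p's key is
    // tracked by the new context...
    var current = newContext.Products.Single(x => x.id == p.id);

    // ...then copy the scalar values of the detached, modified
    // instance p onto the tracked one and save.
    newContext.Products.ApplyCurrentValues(p);
    newContext.SaveChanges();
}
```

ApplyCurrentValues copies only scalar properties and marks the differing ones as modified, so SaveChanges writes just the changed columns.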
Related
Just a bit of an outline of what I am trying to accomplish.
We keep a local copy of a remote (3rd-party) database within our application. To download the information we use an API.
We currently download the information on a schedule, which then either inserts new records into the local database or updates the existing records.
Here is how it currently works:
public void ProcessApiData(List<Account> apiData)
{
    // get the existing accounts from the local database
    List<Account> existingAccounts = _accountRepository.GetAllList();

    foreach (var account in apiData)
    {
        // check if it already exists in the local database
        var existingAccount = existingAccounts.SingleOrDefault(a => a.AccountId == account.AccountId);

        // if it's null then it's a new record
        if (existingAccount == null)
        {
            _accountRepository.Insert(account);
            continue;
        }

        // otherwise it's an existing record, so it needs updating
        existingAccount.AccountName = account.AccountName;
        // ... continue updating the rest of the properties
    }

    CurrentUnitOfWork.SaveChanges();
}
This works fine, but it feels like it could be improved.
There is one of these methods per entity, and they all do the same thing (inserting a different entity, or updating different properties). Is there any way to make this more generic?
It also seems like a lot of database calls; is there any way to do this in bulk? I've had a look at this package, which I've seen mentioned in a few other posts: https://github.com/loresoft/EntityFramework.Extended
But it seems to focus on bulk-updating a single property with the same value, as far as I can tell.
Any suggestions on how I can improve this would be brilliant. I'm still fairly new to C#, so I'm still learning the best way to do things.
I'm using .NET 4.5.2 and Entity Framework 6.1.3, with MSSQL 2014 as the backend database.
For EF Core you can use this library:
https://github.com/borisdj/EFCore.BulkExtensions
Note: I'm the author of this one.
And for EF 6, this one:
https://github.com/TomaszMierzejowski/EntityFramework.BulkExtensions
Both extend DbContext with bulk operations and share the same call syntax:
context.BulkInsert(entitiesList);
context.BulkUpdate(entitiesList);
context.BulkDelete(entitiesList);
The EF Core version additionally has a BulkInsertOrUpdate method.
Assuming that the classes in apiData are the same as your entities, you should be able to use Attach(newAccount, originalAccount) to update an existing entity.
For bulk inserts I use AddRange(listOfNewEntities). If you have a lot of entities to insert, it is advisable to batch them. You may also want to dispose and recreate the DbContext for each batch so that it does not use too much memory.
var accounts = new List<Account>();
var context = new YourDbContext();
context.Configuration.AutoDetectChangesEnabled = false;

foreach (var account in apiData)
{
    accounts.Add(account);

    if (accounts.Count % 1000 == 0) // play with this number to see what works best
    {
        context.Set<Account>().AddRange(accounts);
        accounts = new List<Account>();

        context.ChangeTracker.DetectChanges();
        context.SaveChanges();

        // Recreate the context per batch to keep memory usage down,
        // and disable automatic change detection again on the new instance
        context.Dispose();
        context = new YourDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;
    }
}

context.Set<Account>().AddRange(accounts);
context.ChangeTracker.DetectChanges();
context.SaveChanges();
context.Dispose();
For bulk updates, there is nothing built in. There are, however, libraries and solutions to address this; see for example here for a solution using expression trees.
List vs. Dictionary
You search the list for each entity to check whether it exists, which is slow (a linear scan per record). You should build a dictionary instead to improve performance.
var existingAccounts = _accountRepository.GetAllList().ToDictionary(x => x.AccountId);
Account existingAccount;
if(existingAccounts.TryGetValue(account.AccountId, out existingAccount))
{
// ...code....
}
Add vs. AddRange
You should be aware of the Add vs. AddRange performance difference when you add multiple records.
Add: calls DetectChanges after every record is added
AddRange: calls DetectChanges once after all records are added
At 10,000 entities, the Add method took 875x more time simply to add the entities to the context.
To fix it:
CREATE a list
ADD entity to the list
USE AddRange with the list
SaveChanges
Done!
In your case, you will need to create an InsertRange method in your repository.
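As a sketch, such an InsertRange method could look like the following (the repository shape is an assumption based on the code in the question):

```csharp
public class Repository<TEntity> where TEntity : class
{
    private readonly DbContext _context;

    public Repository(DbContext context)
    {
        _context = context;
    }

    public void InsertRange(IEnumerable<TEntity> entities)
    {
        // AddRange triggers DetectChanges once for the whole batch
        // instead of once per entity, as repeated Add calls would.
        _context.Set<TEntity>().AddRange(entities);
    }
}
```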
EF Extended
You are right: this library updates all rows with the same value, which is not what you are looking for.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library may be a good fit for your project if you want to improve performance dramatically.
You can easily perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
Example:
public void ProcessApiData(List<Account> apiData)
{
    // Insert or update using the primary key (AccountID)
    CurrentUnitOfWork.BulkMerge(apiData);
}
I need to update an entity that has two children, which in turn share two children that depend on both parents.
Job(PK: Jobid)
Holes(PK: Holeid, FK: Jobid) / Orders(PK: Orderid, FK: Jobid)
Tools(PK: Toolid, FK: Holeid, FK: Orderid) / ToolHoles(PK: Holeid, Orderid)
Tools also has 7 children that inherit from it.
The job will already exist on save. The job may or may not already contain 1 or more of each child entity.
I need to be able to save all this information in one transaction so that partial info is not saved to the database.
My current attempt has been to build up the Job entity with all the relevant information and call SaveChanges. If I'm adding new entities, keys will have to be generated on save for all but Jobid. Is what I'm trying to accomplish even possible?
Making some assumptions here, let me know if I'm off base. If your scenario looks like the below, then you should be fine.
var myHoles = new Holes();
var myOrders = new Orders();
var myTools = new Tools();
var myToolHoles = new ToolHoles();
myJob.Holes.Add(myHoles); //myJob already exists
myJob.Orders.Add(myOrders);
myHoles.Tools.Add(myTools);
myOrders.Tools.Add(myTools);
myHoles.ToolHoles.Add(myToolHoles);
myOrders.ToolHoles.Add(myToolHoles);
db.SaveChanges();
You say "one transaction", and SaveChanges does wrap its work in a single database transaction, but several steps take place within it:
1 - myHoles and myOrders are inserted into the database with their JobId set appropriately.
2 - EF finds out what their generated IDs are.
3 - myTools and myToolHoles are inserted with HoleId and OrderId set to the values found in the previous step.
I have a scenario I could not find a solution for, and I need some help.
How can I achieve this?
I'd like to get the current record for the client, modify it, and then, instead of updating it, add the modified copy as a new record to the table, to keep historical change information.
client c = new client();
using (DBEntities db = new DBEntities())
{
    // get the current record in the client table for this client
    IQueryable<client> co = from p in db.client
                            where p.CUS_NUMBER == scd.cus_number && p.isCurrent == true
                            select p;
    c = co.First();

    // update email and address
    c.EMAIL = Helper.CleanInput("mymail@mm.com");
    c.ADDRESS = Helper.CleanInput("123 Sheppard");

    // instead of updating the current record, I'd like to add a new record
    // to the table to keep historical changes
    db.AddToclient(c);
    db.SaveChanges();

    // I get an error:
    // An object with the same key already exists in the ObjectStateManager.
    // The existing object is in the Modified state. An object can only be added to
    // the ObjectStateManager again if it is in the added state.
}
Complete error
An object with the same key already exists in the ObjectStateManager. The existing object is in the Modified state. An object can only be added to the ObjectStateManager again if it is in the added state.
Remove this line: db.AddToclient(c); the rest is fine. You are already holding a reference to the tracked object, so there is no need to add it again; it will be updated when you call SaveChanges().
Or use cloning if you want to add a new object: c = co.First().Clone(); (assuming your entity provides a Clone method; EF does not generate one by default).
It looks like you are adding the same row to the database, and the error occurs because you are adding a row with the same primary key again, which the DB will not allow.
Try adding a new row instead, and create another table that keeps the historical information of the old row, with a foreign-key reference back to it. You can also add a boolean field, say IsDeleted, to record deletion.
Hope it helps.
Thanks
The reason db.AddToclient(c); gives the error is that this object is already being tracked by the object context, because it was retrieved from the database.
The best way to accomplish what you are trying to do is something like the following:
var newClient = new client()
{
    EMAIL = Helper.CleanInput("mymail@mm.com"),
    ADDRESS = Helper.CleanInput("123 Sheppard"),
    // copy the remaining fields (e.g. CUS_NUMBER) from the original record as needed
};
db.AddToclient(newClient);
db.SaveChanges();
In Entity Framework, all objects retrieved from the database are by default tracked by the ObjectContext instance. Entity Framework internally maps every tracked object by its key. This pattern is called Identity Map, and it means there will be only one instance of an entity per key. So you don't need to call Add again, since the entity is already in EF's map; you just need to call SaveChanges to persist the modified entities.
In your case you are:
1 - Creating a new instance of EF ObjectContext;
2 - Retrieving entities in your LINQ query;
3 - Changing values of properties of the retrieved entity;
4 - Adding it again to the ObjectContext; // error!
5 - Calling SaveChanges()
Step 4 is not necessary because the ObjectContext already knows about the retrieved objects.
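As an illustrative sketch of the identity map (entity and property names taken from the question): two queries for the same key return the same tracked instance, and SaveChanges alone persists the modification:

```csharp
using (DBEntities db = new DBEntities())
{
    var first = db.client.First(p => p.CUS_NUMBER == scd.cus_number);
    var second = db.client.First(p => p.CUS_NUMBER == scd.cus_number);
    // Identity map: both variables reference the same tracked object,
    // so ReferenceEquals(first, second) is true.

    first.EMAIL = "new@mail.com"; // the context tracks this change
    db.SaveChanges();             // no Add call needed; EF issues an UPDATE
}
```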
If I cache an entire table:
static List<Table1> table1Cache = context.Table1.ToList();
Then I use it to set an association:
var context = new Context();
var t2 = new Table2();
t2.MyTable1Reference = table1Cache.Single(x => x.Id == paramIntId);
context.Table2.Add(t2);
context.SaveChanges();
A new row gets inserted into Table1 because of the assignment: EF thinks it is a new entity. I know I can work around this by always attaching the cached entity when I create the context (I have one context per request), or by using MyTable1ReferenceID = table1Cache.Single(x => x.Id == paramIntId).Id;
But that is not safe; I can forget to do it sometimes. Is there a good solution?
Yes, that makes sense: the entity is not associated with the current context, so EF thinks it is transient and saves a new instance.
If you are caching across contexts, then you don't want to store the tracked object itself, since it is tied to its original context. Instead, store the data in the cache, essentially serializing and deserializing the entity. You will also need to associate the entity with the current context, so that the next time it is retrieved from the cache you can save changes to both the cache and the database.
If all this sounds like a lot, it is: keeping two data stores synchronized is not an easy problem to solve. I would take a look at the implementation of the second-level cache for NHibernate.
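One hedged workaround sketch (assuming a DbContext-style Context with Table1/Table2 sets, as in the question): attach the cached entity to the current context before using it in an association, so EF treats it as an existing row rather than a new one:

```csharp
using (var context = new Context())
{
    var cached = table1Cache.Single(x => x.Id == paramIntId);

    // Attach marks the entity as Unchanged instead of Added,
    // so SaveChanges will not INSERT a duplicate Table1 row.
    context.Table1.Attach(cached);

    var t2 = new Table2 { MyTable1Reference = cached };
    context.Table2.Add(t2);
    context.SaveChanges();
}
```

Alternatively, setting only the foreign-key property (MyTable1ReferenceID) avoids the attach entirely, at the cost of remembering to do it everywhere.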
I am trying to increment a counter which is stored in the DB.
So this requires me to do an update using Entity Framework 1 (EF1).
I am doing something like this:
CounterTBL OrderCounter = MyRepository.CounterTableDetails("ORDERID");

Booking Booking = new Booking();
Booking.BookingAdminID = User.ID;
Booking.BookingStatus = 2;

OrderCounter.CounterFLD = OrderCounter.CounterFLD + 1;

using (var ctx = new WhygoContext())
{
    ctx.AddToBookings(Booking);
    ctx.SaveChanges();
}
The Booking is inserted fine, but I expected the existing counter record to be updated as well, which it was not.
A search around StackOverflow and the web shows that I should do something like this:
ctx.CounterTBL.Attach(OrderCounter);
ctx.ApplyCurrentValues("CounterTBLs", OrderCounter);
Or similar, but my IntelliSense doesn't like this and it doesn't build, so I assume these are only available in EF 4.
I am sadly stuck with EF 1. Is there a way to do this?
I'm pretty new to this stuff, so maybe I'm not going about this in the right way...
When you insert the Booking, you create a new instance of the context and call SaveChanges only on that instance. Your OrderCounter was loaded from the repository, which I guess used a different context instance. You should share the context instance between both operations, or you will have to call SaveChanges on both contexts.
By the way, your code is not very reliable if it runs in ASP.NET, because concurrent clients can store the same counter value.
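A sketch of the shared-context version (the CounterTableDetails lookup is replaced by a hypothetical query against a CounterTBLs set, since the repository internals are not shown):

```csharp
using (var ctx = new WhygoContext())
{
    // Load the counter from the same context that will save it,
    // so the increment is tracked as a modification.
    CounterTBL orderCounter = ctx.CounterTBLs
        .Where(c => c.CounterName == "ORDERID") // hypothetical key column
        .First();
    orderCounter.CounterFLD = orderCounter.CounterFLD + 1;

    var booking = new Booking();
    booking.BookingAdminID = User.ID;
    booking.BookingStatus = 2;
    ctx.AddToBookings(booking);

    // One SaveChanges call persists both the new Booking
    // and the counter update.
    ctx.SaveChanges();
}
```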