I am trying to increment a counter which is stored in the DB.
So this requires me to do an update using Entity Framework 1 (EF1).
I am doing something like this:
CounterTBL OrderCounter = MyRepository.CounterTableDetails("ORDERID");
Booking Booking = new Booking();
Booking.BookingAdminID = User.ID;
Booking.BookingStatus = 2;
OrderCounter.CounterFLD = OrderCounter.CounterFLD + 1;
using (var ctx = new WhygoContext())
{
ctx.AddToBookings(Booking);
ctx.SaveChanges();
}
Booking is inserted fine, but I expected the existing counter record to be updated, which it was not.
A search around Stack Overflow and the web shows that I should do something like this:
ctx.CounterTBL.Attach(OrderCounter);
ctx.ApplyCurrentValues("CounterTBLs", OrderCounter);
Or similar, but my IntelliSense doesn't like this and it doesn't build, so I assume these are only part of EF 4.
I am sadly stuck with EF 1. Is there a way to do this?
I'm pretty new to this stuff, so maybe I'm not going about this in the right way...
When you're inserting Booking, you create a new instance of the context and call SaveChanges only on that instance. Your OrderCounter was loaded from the repository, which I guess used a different context instance. You should share the context instance between both operations, or you will have to call SaveChanges on both contexts.
Btw. your code is not very reliable if it runs in ASP.NET, because concurrent clients can read and save the same counter value.
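A minimal sketch of the shared-context approach, assuming the EF1-generated context exposes a CounterTBLs entity set and an AddToBookings method as in the question; the repository call is replaced by a direct query for illustration, and the key field name CounterNameFLD is hypothetical:

```csharp
// One context for both the read and the write: the counter row stays
// tracked, so SaveChanges persists the increment and the insert together.
using (var ctx = new WhygoContext())
{
    // The entity must be loaded (or attached) through the same
    // context that will save it.
    CounterTBL orderCounter = ctx.CounterTBLs
        .First(c => c.CounterNameFLD == "ORDERID");
    orderCounter.CounterFLD = orderCounter.CounterFLD + 1;

    Booking booking = new Booking();
    booking.BookingAdminID = User.ID;
    booking.BookingStatus = 2;
    ctx.AddToBookings(booking);

    ctx.SaveChanges(); // saves both changes in one call
}
```

As the answer notes, this is still race-prone under concurrent requests; incrementing the counter in the database itself (e.g. UPDATE CounterTBL SET CounterFLD = CounterFLD + 1 in a stored procedure) avoids lost updates.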
Just a bit of an outline of what I am trying to accomplish.
We keep a local copy of a remote (3rd-party) database within our application. To download the information we use an API.
We currently download the information on a schedule, which then either inserts new records into the local database or updates the existing records.
Here is how it currently works:
public void ProcessApiData(List<Account> apiData)
{
// get the existing accounts from the local database
List<Account> existingAccounts = _accountRepository.GetAllList();
foreach (var account in apiData)
{
// check if it already exists in the local database
var existingAccount = existingAccounts.SingleOrDefault(a => a.AccountId == account.AccountId);
// if it's null then it's a new record
if (existingAccount == null)
{
_accountRepository.Insert(account);
continue;
}
// otherwise it's an existing record, so it needs updating
existingAccount.AccountName = account.AccountName;
// ... continue updating the rest of the properties
}
CurrentUnitOfWork.SaveChanges();
}
This works fine; however, it feels like it could be improved.
There is one of these methods per entity, and they all do the same thing (just updating different properties, or inserting a different entity). Is there any way to make this more generic?
It also seems like a lot of database calls; is there any way to do this in "bulk"? I've had a look at this package, which I have seen mentioned in a few other posts: https://github.com/loresoft/EntityFramework.Extended
But as far as I can tell, it focuses on bulk-updating a single property with the same value.
Any suggestions on how I can improve this would be brilliant. I'm still fairly new to C#, so I'm still searching for the best way to do things.
I'm using .NET 4.5.2 and Entity Framework 6.1.3 with SQL Server 2014 as the backend database.
For EF Core you can use this library:
https://github.com/borisdj/EFCore.BulkExtensions
Note: I'm the author of this one.
And for EF 6 this one:
https://github.com/TomaszMierzejowski/EntityFramework.BulkExtensions
Both extend DbContext with bulk operations and have the same call syntax:
context.BulkInsert(entitiesList);
context.BulkUpdate(entitiesList);
context.BulkDelete(entitiesList);
The EF Core version additionally has a BulkInsertOrUpdate method.
Assuming that the classes in apiData are the same as your entities, you should be able to attach the incoming entity and mark it as modified (Attach plus setting its entry state to EntityState.Modified) to update an existing entity.
For bulk inserts I use AddRange(listOfNewEntities). If you have a lot of entities to insert, it is advisable to batch them. You may also want to dispose of and recreate the DbContext after each batch so that it doesn't use too much memory.
var accounts = new List<Account>();
var context = new YourDbContext();
context.Configuration.AutoDetectChangesEnabled = false;
foreach (var account in apiData)
{
    accounts.Add(account);
    if (accounts.Count % 1000 == 0) // play with this batch size to see what works best
    {
        context.Set<Account>().AddRange(accounts);
        accounts = new List<Account>();
        context.ChangeTracker.DetectChanges();
        context.SaveChanges();
        context.Dispose();
        context = new YourDbContext();
        // Re-disable automatic change detection on the fresh context.
        context.Configuration.AutoDetectChangesEnabled = false;
    }
}
// Flush the final partial batch.
context.Set<Account>().AddRange(accounts);
context.ChangeTracker.DetectChanges();
context.SaveChanges();
context.Dispose();
For bulk updates, there is nothing built into Entity Framework. There are, however, libraries and solutions to address this; see e.g. here for a solution using expression trees.
List vs. Dictionary
You search a list every time to check whether the entity exists, which is a linear scan per lookup. You should build a dictionary instead to improve performance:
var existingAccounts = _accountRepository.GetAllList().ToDictionary(x => x.AccountId);

Account existingAccount;
if (existingAccounts.TryGetValue(account.AccountId, out existingAccount))
{
    // ...code...
}
Add vs. AddRange
You should be aware of the Add vs. AddRange performance difference when adding multiple records.
Add: calls DetectChanges after every record is added.
AddRange: calls DetectChanges once, after all records are added.
So at 10,000 entities, the Add method can take 875x more time simply to add the entities to the context.
To fix it:
CREATE a list
ADD entity to the list
USE AddRange with the list
SaveChanges
Done!
In your case, you will need to add an InsertRange method to your repository.
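A sketch of such a method, assuming the repository wraps a DbContext exposed here as _context (the names are illustrative, not part of the original repository API):

```csharp
// Hypothetical repository method: adds the whole batch in one call,
// so EF runs change detection once instead of once per entity.
public void InsertRange(IEnumerable<Account> accounts)
{
    _context.Set<Account>().AddRange(accounts);
}
```

The caller then collects the new accounts into a list inside the loop and passes the whole list to InsertRange once, before calling CurrentUnitOfWork.SaveChanges().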
EF Extended
You are right: this library updates all rows with the same value, which is not what you are looking for.
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library may fit your needs well if you want to improve performance dramatically.
You can easily perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
Example:
public void ProcessApiData(List<Account> apiData)
{
// Insert or Update using the primary key (AccountID)
CurrentUnitOfWork.BulkMerge(apiData);
}
I'm quite new at this so forgive me for not knowing how to word this properly.
I have setup Entity Framework with a rather large database and have been trying to learn how to manipulate tables in the database.
My first method worked, but I ran into an error. It would load the data into a DataGridView just fine, but when sliding the scroll bar over to view the columns not on the screen it would throw an error. This is the code that triggered the error:
using (var context = new MydbEntities())
{
    var query = from a in context.Configurations
                select a;
    var result = query.ToList();
    dataGridView1.DataSource = result;
}
Now if I replace the using block with MydbEntities db = new MydbEntities(); I don't get an error. I'm trying to follow online tutorials, but I thought maybe someone could help me understand the difference between these two.
Basically you have run into lazy loading, combined with the fact that the context had already been disposed when the grid tried to query the next batch of records.
You have several options here; the best ones in my opinion are:
Use EntityDataSource. This way the data control will take care of EF context instantiation and disposal. (MSDN has pretty good specs on that.)
Implement a custom ObjectDataSource. Use the EF context within the ObjectDataSource methods, instantiating and disposing it when needed.
(This article on the topic is a little outdated, but you can still get the idea.)
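A third option is to eagerly load whatever the grid will display before the context is disposed. A sketch, assuming the failing columns are lazy-loaded navigation properties on Configuration (the property name below is hypothetical):

```csharp
using (var context = new MydbEntities())
{
    // Include() fetches the related data up front, and ToList()
    // materializes the results, so nothing is lazily loaded after
    // the context has been disposed.
    var result = context.Configurations
        .Include("SomeNavigationProperty") // replace with your actual navigation property
        .ToList();
    dataGridView1.DataSource = result;
}
```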
I have a C# MVC 3 application that is using Entity Framework to pull data from my SQL Server database. I discovered that it seemed to be pulling "old" or "cached" data instead of the data that was currently in the DB.
Once I updated the model it seemed to pull the new data.
My question is how do I make sure that I am always pulling "live" data from the database and not getting "cached" or "old" data?
When I ran the following code and checked the value of tblcompanyinfo.companyname it was returning an old company name (different from what was currently in the DB).
Once I updated the model and re-ran it, it returned the current value of company name.
private static ApptReminderEntities db = new ApptReminderEntities();
tblCompanyInfo tblcompanyinfo = db.tblCompanyInfoes.SingleOrDefault(t => (t.CompanyID == lCompanyID));
Thanks!
This may be due to your shared, static DbContext instance, i.e.:
private static ApptReminderEntities db = new ApptReminderEntities();
Replace it with a using block as below:
using(ApptReminderEntities db = new ApptReminderEntities())
{
tblCompanyInfo tblcompanyinfo = db.tblCompanyInfoes
.SingleOrDefault(t => (t.CompanyID == lCompanyID));
}
With the using statement, you are:
Creating a new ApptReminderEntities instance each time.
Doing whatever you need to in the database.
Automatically closing and disposing of the instance at the end of the block.
So, for each trip to the database, use using so that you create a new instance of your context each time.
The problem is that you're not creating a new context for each query - you've defined it as static.
You should always create a new ApptReminderEntities for each operation you do on the database.
You can use a repository pattern or similar. In my case, for every operation I would do
var employees = new EmployeeRepository().GetEmployees();
and the EmployeeRepository constructor creates a new EmployeeEntities().
This makes it quite easy to unit test too, as I can override my repository and pass in a dummy context.
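That pattern might look roughly like this (a sketch; the constructor overload taking a context is the seam used for testing, and EmployeeEntities is assumed to be the generated context):

```csharp
public class EmployeeRepository : IDisposable
{
    private readonly EmployeeEntities _context;

    // Production path: every repository gets a fresh context, so no
    // stale entities are served from a long-lived change tracker.
    public EmployeeRepository() : this(new EmployeeEntities()) { }

    // Test path: a dummy or mocked context can be injected here.
    public EmployeeRepository(EmployeeEntities context)
    {
        _context = context;
    }

    public List<Employee> GetEmployees()
    {
        return _context.Employees.ToList();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}
```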
I read records from the database and dispose of the ObjectContext associated with them.
Later on, when the consumer is finished with a record, I need to update it in the DB.
How can I do that with a new context?
Here is how I read records from the DB and put them into a BlockingCollection:
var products = from r in context_.Products
orderby r.id
select r;
List<Product> pList = products.Take(10).ToList();
Now, how do I update those records in the database with a new context?
Thanks!
I tried the following, but it failed:
SMEntities outContext = new SMEntities();
Product p = outPipe.GetConsumingEnumerable().First();
outContext.Products.Attach(p);
outContext.SaveChanges();
It fails with an exception: "An entity object cannot be referenced by multiple instances of IEntityChangeTracker".
UPDATE (explanation why I do not want to use the same context):
Product records are consumed by many tasks in different threads, and I won't call context.SaveChanges in the producer thread, as some of the records in the worker threads may be in the middle of having changes applied. What would happen in that case?
Try this:
SMEntities newContext = new SMEntities();
foreach (var product in products)
{
    newContext.Products.Attach(product);
    newContext.ObjectStateManager.ChangeObjectState(product, EntityState.Modified);
}
newContext.SaveChanges();
All changes are only tracked by the current context. Once it is disposed, the pending changes within that context are gone too. If you have a local entity with changes, you can attach it to the new context:
context_.Attach(yourContextObject);
Afterwards just call
context_.SaveChanges();
Attach again, or alternatively: fetch the entity again in your new context and update it then (ApplyCurrentValues). See also Entity Framework 4 - ApplyCurrentValues<TEntity> vs. ObjectStateManager.ChangeObjectState
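A sketch of the fetch-then-ApplyCurrentValues variant, assuming SMEntities is an ObjectContext, "Products" is the entity set name, and p is the detached entity carrying the consumer's changes:

```csharp
using (var newContext = new SMEntities())
{
    // Load the current row so the new context is tracking it.
    Product tracked = newContext.Products.First(x => x.id == p.id);

    // Copy the scalar values from the detached, modified entity onto
    // the tracked one; the tracked entity is now marked Modified.
    newContext.ApplyCurrentValues("Products", p);

    newContext.SaveChanges();
}
```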
If I cache an entire table:
static List<Table1> table1Cache = context.Table1.ToList();
Then I use it to associate:
var context = new Context();
var t2 = new Table2();
t2.MyTable1Reference = table1Cache.Single(x => x.Id == paramIntId);
context.Table2.Add(t2);
context.SaveChanges();
A new row will be inserted into Table1 because of the third line: EF thinks the cached entity is a new one. I know I can work around this, e.g. by always attaching the cached entities when creating the context (I have one context per request), or by setting the foreign key instead: MyTable1ReferenceID = table1Cache.Single(x => x.Id == paramIntId).Id;
But that's not safe, since I can forget to do it sometimes. Is there a good solution?
Yes, that makes sense, because the entity is not currently associated with the current context, so EF thinks it's transient and saves a new instance.
If you are caching across contexts, then you don't want to store the entity object itself, since it is tied to a context. Instead you want to store the data in the cache, essentially serializing and deserializing the entity. You will also need to associate the entity with the current context, so that the next time it's retrieved from the cache you can save changes to both the cache and the database.
If all this sounds like a lot, it is: keeping two data stores synchronized is not an easy problem to solve. I would take a look at the implementation of the second-level cache for NHibernate.
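For the narrower problem in the question, a common workaround is to attach the cached entity to the per-request context before using it as a reference. A sketch using the names from the question (the Local check guards against attaching an entity the context is already tracking):

```csharp
var context = new Context();

// The cached entity is detached; attach it so EF treats it as an
// existing row instead of inserting a duplicate into Table1.
var cachedT1 = table1Cache.Single(x => x.Id == paramIntId);
if (!context.Table1.Local.Any(x => x.Id == cachedT1.Id))
{
    context.Table1.Attach(cachedT1);
}

var t2 = new Table2();
t2.MyTable1Reference = cachedT1;
context.Table2.Add(t2);
context.SaveChanges();
```

Wrapping the Attach in a small helper on the context or repository centralizes it so it cannot be forgotten, though as the answer says, true cross-context caching is better served by caching plain data rather than tracked entities.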