How to access the DbContext used in an EntityManager? - c#

I insert/update a large number of entities (~5000) during a process and this takes a huge amount of time (it times out on a 5-minute transaction).
I read that DbContext.AutoDetectChangesEnabled is ON by default and causes this kind of behavior (http://www.exsertus.be/2014/10/ef-bulk-performance/).
To my understanding, DevForce "kind of" encapsulates a DbContext within each EntityManager. DevForce uses its own implementation unless I define mine, which I did. I would like to know how I can access it to be able to "play" with the AutoDetectChangesEnabled property.
Or is there any other solution for inserting/updating/deleting large amounts of entities with DevForce?
Regards

I have worked with the EF tool "https://www.nuget.org/packages/EFUtilities" and got a big performance improvement with large inserts, as it uses bulk copy instead of one normal insert per entity.
You can check the documentation on GitHub.
I have used it with a 17,000-entity insert transaction and it finished in a few seconds.
Check this to get a better understanding and comparison with EF.
http://blog.credera.com/technology-insights/microsoft-solutions/entity-framework-batch-operations-using-ef-utilities/
A sample of using the utility to insert a list of entities is like this:
using (var db = new YourDbContext())
{
    EFBatchOperation.For(db, db.BlogPosts).InsertAll(list);
}
Hope this helps.

Since you've defined your own DbContext, you can alter this setting in the DbContext constructor with Configuration.AutoDetectChangesEnabled = false;
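As a minimal sketch of what that looks like (EF6-style; the context name and connection string below are placeholders):

using System.Data.Entity;

public class MyDbContext : DbContext
{
    public MyDbContext() : base("name=MyConnectionString")
    {
        // Turn off automatic change detection for every instance of this context.
        // If you rely on EF noticing modifications to already-tracked entities,
        // call ChangeTracker.DetectChanges() yourself before SaveChanges().
        Configuration.AutoDetectChangesEnabled = false;
    }
}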
However, I'm not sure how much this change will help. If your application is n-tier and you're trying to save ~5000 entities across the wire, this will always be slow, and you'll also run into communication timeouts. Generally, if you need to do any bulk operations, DevForce isn't the optimal approach.

Related

What is the best way to use the DbContext when I have a lot of operations (former stored procedures) running async?

I migrated a lot of stored procedures (we want to get rid of them) and coded them in LINQ using Entity Framework Core and SQL Server. First let me explain the two projects that we have in the backend solution:
Repository: we use the repository pattern with UnitOfWork for simple operations and of course CRUD.
Manager: we use the Manager to store all the more complicated queries; you could say the real business logic is there.
So far it's okay, but I only use one instance of my DbContext in both projects, so I'm wondering if it would be better for us to do something like this in each operation instead.
using (var context = new DBContext())
{
    // Perform data access using the context
}
My goal is to make sure we don't get performance issues because we use the same context for too long, and I don't want to keep track of modifications on the data across all operations. Also, if an operation contains a lot of modifications and an error/exception is thrown, I don't want to keep track of anything; I want a complete "rollback". We are working in async, by the way. First time posting here, thanks in advance.
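Purely to illustrate the short-lived-context idea in the snippet above, here is a sketch of one async operation under assumed names (OrdersContext, Orders and the Order properties are hypothetical). Because the context lives only for the duration of the operation, nothing stays tracked afterwards, and an exception thrown before SaveChangesAsync() means none of the pending changes reach the database:

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class OrderMaintenanceService
{
    public async Task ArchiveOldOrdersAsync(DateTime cutoff)
    {
        // Hypothetical context/entity: OrdersContext exposes DbSet<Order> Orders.
        using (var context = new OrdersContext())
        {
            var oldOrders = await context.Orders
                .Where(o => o.CreatedOn < cutoff)
                .ToListAsync();

            foreach (var order in oldOrders)
            {
                order.IsArchived = true;
            }

            // One SaveChanges per operation: if anything throws before this line,
            // the context is disposed and no modifications are persisted.
            await context.SaveChangesAsync();
        }
    }
}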

Transactional Systems: archiving data from databases with Entity Framework

I've written a small tool for archiving data from my Entity Framework code-first driven database.
I'm testing it thoroughly and I'm coming to the point where I'm trying it with large amounts of data, which is where some problems come up. For example, I sometimes get timeouts or exceptions like this:
Deadlock found when trying to get lock; try restarting transaction.
I know what transactions are, and I guess Entity Framework makes one for all of its changes in one DbContext, so if any of them or the entire thing fails when SaveChanges() is called, nothing is actually changed (short side question: can I then simply run SaveChanges() again?).
What I want to know is this: since I need to delete different batches of information throughout my database (after exporting it), I'm constantly creating DbContexts for each of those batches.
Should I create transactions manually for every batch and commit them all at once at the very end?
I'm studying informatics and learn about transactional information systems in one of my courses. How is it possible with Entity Framework to create a meta-transaction around all my single transactions when deleting batches of data, so that the data spread throughout the database is only really deleted when everything worked?
Or is there a better way to solve the entire thing?
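One way to get such an outer "meta" transaction is an ambient TransactionScope around the per-batch contexts. This is only a sketch under assumed names (ArchiveContext, Records, batches), written EF6-style; it assumes all batches go to the same database, and it does nothing about the deadlock itself:

using System.Transactions;

using (var scope = new TransactionScope())
{
    foreach (var batch in batches)
    {
        using (var db = new ArchiveContext())
        {
            db.Records.RemoveRange(batch);
            db.SaveChanges();   // enlists in the ambient transaction
        }
    }

    // Commit everything at once; leaving the block without calling
    // Complete() rolls every batch back.
    scope.Complete();
}

Be aware that opening a new connection per context inside one scope can escalate to a distributed transaction on some providers, in which case sharing a single open connection across the contexts is the usual workaround.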

NHibernate without reflection?

We are currently using NHibernate for accessing our database.
We have a database which stores a "configuration". We have the notion of a "revision", meaning that every table in our database has a revision, and when we make any change (even a small one), every field in our database gets duplicated (except the changed ones, which will not be the same).
The goal is to be able to switch easily from one revision to another, and to be able to delete one configuration and still be able to switch to an earlier or later revision.
It implies that when we change the configuration, we do a lot of writing to the database (and other applications will also have to read it).
This step can take a very long time (5-10 minutes; we have a lot of tables), compared to 10-20 seconds to store it in XML.
After spending some time on analysis, we have the impression that NHibernate has to do a lot of reflection to map the database to C# objects (using our hbm.xml files).
My questions:
How does NHibernate read/write properties on every object? With reflection, right?
Does it use reflection on every write, or is there some optimization (a cache, ...)?
Is there a possibility to avoid this "reflection"? Like having some classes created/compiled at build time (like it's possible with Entity Framework)?
If I have a working NHibernate model, is there something I can do to "tune" the DB access without changing the database?
Thank you very much for your help.
Before assuming it is reflection, I would strongly recommend downloading NHProf (as a free trial at least), or some other database profiler, to see what is taking NHibernate so long.
If I had to guess, it would be the number of database updates required here, but I wouldn't guess about performance without getting some metrics first ;)
For instance, it might be that you need to increase your batch size in NHibernate if you are doing a lot of small updates in one session, which can be done in your NHibernate config file:
<property name="adonet.batch_size">300</property>
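If you configure NHibernate in code rather than in XML, the same setting can be applied on the Configuration object; a small sketch (the value 300 is just an example):

using NHibernate.Cfg;

var cfg = new Configuration();
// Equivalent to <property name="adonet.batch_size">300</property> in the XML config.
cfg.SetProperty(NHibernate.Cfg.Environment.BatchSize, "300");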
You cannot avoid reflection, but it is not a problem. NHibernate uses a lot of reflection to prepare and emit dynamic code, and the generated code is faster than static code because MSIL allows more than C# does.
If the implementation of the reflection usage seems to be your issue, you can extend NHibernate by writing your own bytecode provider.
Custom proxy generators, user types and property accessors can be faster than the built-in implementations.
Generally, performance issues caused by NHibernate come from:
missing fetching
data precision causing unwanted updates
bad mapping (structure or simply datatypes)
bad database/table configuration
a bad commit strategy (flush mode, for example)

Entity Framework and ADO.NET with Unit of Work pattern

We have a system built using Entity Framework 5 for creating, editing and deleting data, but the problem is that sometimes EF is too slow or it simply isn't possible to use Entity Framework (views which build data for tables based on users participating in certain groups in the database, etc.), and we have to use a stored procedure to update the data.
However, we have gotten ourselves into a situation where we have to save the changes through EF in order to have the data in the database and then call the stored procedures. We can't use ITransactionScope as it always escalates to a distributed transaction and/or locks the table(s) for selects during the transaction.
We are also trying to introduce a DomainEvents pattern which will queue events and raise them after the save changes, so we have the data we need in the DB, but then we may end up with the first part succeeding and the second part failing.
Are there any good ways to handle this, or do we need to move away from EF entirely for this scenario?
I had a similar scenario. Later I broke the process into small ones, used EF only, and kept each small process short. Even though the overall time is longer, the system is easier to maintain and scale. I also minimized joins, only update the entity itself, and disabled EF's AutoDetectChangesEnabled and ValidateOnSaveEnabled.
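As a sketch of that batching approach (the context, set and collection names here are placeholders, EF5/EF6-style), each chunk gets its own short-lived context with both flags turned off:

using System.Linq;

const int batchSize = 500;

for (int i = 0; i < items.Count; i += batchSize)
{
    using (var db = new MyDbContext())
    {
        db.Configuration.AutoDetectChangesEnabled = false;
        db.Configuration.ValidateOnSaveEnabled = false;

        foreach (var item in items.Skip(i).Take(batchSize))
        {
            db.Items.Add(item);
        }

        // The change tracker never holds more than one batch,
        // and each SaveChanges is its own small transaction.
        db.SaveChanges();
    }
}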
Sometimes if you look at your problem in different ways, you may find a better solution.
Good luck!

Insert new entity to database with closed tracking change in LinqToSql

The DataContext throws an "object tracking is not enabled for the current datacontext instance" exception when I try to add new entities to the DB as below.
db.Posts.InsertOnSubmit(new Post());
Enabling change tracking is not a solution for me because it is too slow when I have many insert operations.
What is the solution in this case?
You cannot have your cake and eat it too.
Depending on your database structure, you could consider using two DataContexts: one with change tracking enabled, one with it disabled.
However, you will still have one insert statement per record. That is just how LINQ to SQL operates and there is no way around it within L2S. You have to look into the SqlBulkCopy class for bulk insertions.
Typically, enabling and disabling object tracking simply wires up or ignores the change tracking event handlers. If you are trying to insert so many items that it becomes too slow just to wire up these events, you have a much bigger problem.
Remember, LINQ to SQL will issue a separate database request for each record you are adding. The network bottleneck here will surely be a bigger issue than just wiring up the change tracking events. LINQ to SQL isn't the best choice for bulk inserts. Consider using SSIS/bulk copy for that kind of operation.
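For completeness, a minimal SqlBulkCopy sketch (the table, column and variable names here are made up); all rows are streamed to the server in a single bulk operation instead of one INSERT per record:

using System.Data;
using System.Data.SqlClient;

// Build an in-memory table whose columns match the destination table.
var table = new DataTable();
table.Columns.Add("Title", typeof(string));
table.Columns.Add("Body", typeof(string));

foreach (var post in posts)   // 'posts' is an assumed in-memory collection
{
    table.Rows.Add(post.Title, post.Body);
}

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.Posts";
    bulkCopy.WriteToServer(table);   // one bulk insert instead of a round trip per row
}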
