I need to update an existing record in LINQ if the record exists, else add a new one.
Will SaveChanges() work for both? If yes, how do I differentiate between an update and an insert?
Thanks in advance.
SaveChanges() persists all the changes you have made to the database since the last call. This includes:
Adding new items to a collection
Deleting items from a collection
Changing properties
So you either have to add the record to a collection, or get the existing one and modify its properties; there is no general method that does this for you. After you have made the changes, call SaveChanges() to save them.
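For example, a minimal upsert sketch - the context, entity, and property names here are assumptions, so adjust them to your model:

using System.Linq;

// Hypothetical MyDbContext with a Customers set and a Customer entity keyed by Id.
using (var db = new MyDbContext())
{
    var existing = db.Customers.SingleOrDefault(c => c.Id == incoming.Id);
    if (existing != null)
    {
        // Record exists: change its properties; the context tracks the modification.
        existing.Name  = incoming.Name;
        existing.Email = incoming.Email;
    }
    else
    {
        // Record does not exist: add it to the collection.
        db.Customers.Add(incoming);
    }
    db.SaveChanges(); // issues an UPDATE or an INSERT accordingly
}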
LINQ in general is for querying, not for modification (it stands for Language Integrated Query, after all) - ideally you do not want to create any side effects. Updating and inserting differ in that for an update you usually have to query for the existing record to, well, update it - for an insert you just add it. And yes, SaveChanges() will work for both, in that it commits your changes and additions to the underlying data store.
Your question is very broad - without the particular code you are struggling with, it's hard to give a more detailed answer.
I'm making an API call with C#, getting back a JSON, breaking it down into nested objects, breaking each object into fields and putting the fields into an SQL Server table.
There is one field (OnlineURL) which should be unique.
What is an efficient way of achieving this? I currently make a database call for every nested object I pull out of the JSON and then use an if statement, but this is not efficient.
Database Layer
Creating a unique index/constraint on the OnlineURL field in the database will enforce uniqueness no matter what system or codebase references the table. Applications will then get an error when inserting a new record whose OnlineURL already exists, or when updating record X to an OnlineURL already used by record Y.
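A hedged sketch of the one-time setup (the table and index names are assumptions):

using System.Data.SqlClient;

// Run once against the database; afterwards SQL Server itself rejects duplicate OnlineURLs.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "CREATE UNIQUE INDEX UX_Articles_OnlineURL ON dbo.Articles (OnlineURL);", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}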
Application Layer
What is the rule when the OnlineURL already exists? Do you reject the data? Do you update the matching row? You may want to leverage a stored procedure that inserts a new row or updates the existing one based on OnlineURL. This turns a two-query process into a single query, which matters for large-scale inserts.
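A hedged sketch of that single-query upsert (table and column names are assumptions; the same MERGE could live inside a stored procedure instead):

using System.Data.SqlClient;

const string upsert = @"
MERGE dbo.Articles AS target
USING (SELECT @OnlineURL AS OnlineURL, @Title AS Title) AS source
    ON target.OnlineURL = source.OnlineURL
WHEN MATCHED THEN
    UPDATE SET Title = source.Title
WHEN NOT MATCHED THEN
    INSERT (OnlineURL, Title) VALUES (source.OnlineURL, source.Title);";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(upsert, conn))
{
    cmd.Parameters.AddWithValue("@OnlineURL", item.OnlineURL);
    cmd.Parameters.AddWithValue("@Title", item.Title);
    conn.Open();
    cmd.ExecuteNonQuery(); // one round trip per record instead of two
}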
Assuming your application is serial and the only one working against the database, you could also keep a local cache of OnlineURLs for use during your loop: read the list from the database once, check each incoming record against it, and add each new OnlineURL you insert to the list. Reading the initial list costs a single query, and every comparison after that is done in memory.
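A hedged sketch of that cache, assuming an EF-style context (the entity names and the ToEntity mapper are hypothetical):

using System;
using System.Collections.Generic;
using System.Linq;

// Read the existing URLs once, then do every duplicate check in memory.
var seen = new HashSet<string>(
    db.Articles.Select(a => a.OnlineURL),
    StringComparer.OrdinalIgnoreCase);

foreach (var item in incomingItems)
{
    if (seen.Add(item.OnlineURL))        // Add returns false when the URL is already present
        db.Articles.Add(ToEntity(item)); // ToEntity is a hypothetical JSON-to-entity mapper
}
db.SaveChanges();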
Create a unique index on that field and it will be.
Checking uniqueness can't be done without querying the data, which means examining the whole column. Your first option is to back the check with an index; a fill factor of 80 helps you avoid unnecessary page splits caused by the inserts.
Another option is caching, and that depends on your setup.
You could load the entire column into memory and check uniqueness there, or use a distributed cache like Redis. Either way, analyze the complexity and cost, and you'll probably find that the index is the most ergonomic option.
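For the distributed-cache route, a hedged sketch using StackExchange.Redis (the key name and connection string are assumptions):

using StackExchange.Redis;

var redis = ConnectionMultiplexer.Connect("localhost");
var cache = redis.GetDatabase();

// A Redis set gives O(1) membership checks shared across all application instances.
if (!cache.SetContains("online-urls", item.OnlineURL))
{
    cache.SetAdd("online-urls", item.OnlineURL);
    // ... insert the row into the database here ...
}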
I have a scenario where I need to synchronize a database table with a list (XML) from an external system.
I am using EF but am not sure which would be the best way to achieve this in terms of performance.
As I see it, there are two ways to do this, but neither seems efficient to me.
Call Db each time
- Read each entry from the XML.
- Try to retrieve the matching entry from the database.
- If no entry is found, add the entry.
- If found, update its timestamp.
- At the end of the loop, delete all entries with older timestamps.
Load All Objects and work in memory
- Read all EF objects into a list.
- Delete all EF objects.
- Add an item for each item in the XML.
- Save changes to the db.
The lists are not that long - I estimate around 70k rows. I don't want to clear the db table before inserting the new rows, because this table is a data source for a webservice and I don't want to lock it while it can still be queried.
If I were doing this in T-SQL, I would most likely insert the rows into a temp table and join against it to find missing and deleted entries, but I have no idea of the best way to handle this in Entity Framework.
Any suggestions / ideas ?
The general problem with Entity Framework is that, when changing data, it fires a separate query for each changed record, regardless of lazy or eager loading. So by nature it will be extremely slow (think a factor of 1000+).
My suggestion is to use a stored procedure with a table-valued parameter and ignore Entity Framework altogether. Inside the procedure you could use a MERGE statement.
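A hedged sketch of the C# side - the table type dbo.SyncRowType and the procedure dbo.SyncRows (which would wrap the MERGE) are assumptions; create them in the database first:

using System.Data;
using System.Data.SqlClient;

// Ship all rows to the server in a single round trip.
var table = new DataTable();
table.Columns.Add("ExternalId", typeof(int));
table.Columns.Add("Name", typeof(string));
foreach (var row in xmlRows)
    table.Rows.Add(row.ExternalId, row.Name);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.SyncRows", conn) { CommandType = CommandType.StoredProcedure })
{
    var p = cmd.Parameters.AddWithValue("@rows", table);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.SyncRowType";
    conn.Open();
    cmd.ExecuteNonQuery();
}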
70k rows is not much, but 70k insert/update/delete statements is always going to be very slow.
You could test it and see whether the performance is manageable, but intuition says Entity Framework is not the way to go.
I would iterate over the elements in the XML and update the corresponding row in the DB one at a time - I guess that's what you meant by your first option? As long as you have a good query plan for selecting each row, that should be reasonably efficient. Like you said, 70k rows isn't that much, so you are better off keeping the code straightforward rather than doing something less readable for a little more speed.
It depends. It's fine to use EF if there won't be many changes (say, fewer than a few hundred). Otherwise, bulk-insert the data into the database and merge the rows inside the database.
I need help:
I'm a beginner with NHibernate.
I have created a WPF application that loads a DataGrid bound to an ObservableCollection.
This collection is loaded through the repository pattern, with NHibernate querying the database.
I want to modify this collection through the UI (edit, add, delete).
When I click my save button, I want to persist my changes to the db table.
I read the NHibernate documentation and learned that there are two levels of cache; my idea is to modify objects in the first-level cache and, once I am sure of my changes, persist them.
Are there best practices for doing this?
How do I mark an object for deletion or update, and then delete or update it after the "save changes" click?
This should be an interesting read: Building a Desktop To-Do Application with NHibernate
Basically, you should use the ISession object's methods and perform the operations inside a transaction, i.e. ISession.BeginTransaction().
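A minimal sketch of that pattern (the entity variables are placeholders):

using (var tx = session.BeginTransaction())
{
    session.SaveOrUpdate(editedItem);  // inserts a new entity or updates an existing one
    session.Delete(removedItem);       // marks the entity for deletion
    tx.Commit();                       // flushes all pending changes to the database
}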
It depends on how you get your entities. If they are root entities (e.g. Employee), then when you delete an entity from the grid you should keep track of these deleted entities and call Delete on all of them. You should also keep track of the added entities.
What you are left with then are the updated entities; NHibernate tracks state and knows whether an entity was modified.
For all of these there are ISession.Save/Update/Delete.
When you have done this for every modified entity, call Commit on the transaction. This saves the changes to the database.
If your entities are not roots but, e.g., an employee's addresses, then calling Save on the employee is enough - provided your mappings are correct.
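A hedged sketch of that bookkeeping in a view model (Employee, the collection, and the field names are assumptions):

using System.Collections.Generic;

private readonly List<Employee> _deleted = new List<Employee>();

// Called by the grid's delete command.
public void RemoveEmployee(Employee e)
{
    Employees.Remove(e);  // Employees is the bound ObservableCollection
    _deleted.Add(e);
}

// Called by the save button.
public void SaveChanges()
{
    using (var tx = _session.BeginTransaction())
    {
        foreach (var e in _deleted)
            _session.Delete(e);
        foreach (var e in Employees)
            _session.SaveOrUpdate(e); // NHibernate knows which ones were actually modified
        tx.Commit();
    }
    _deleted.Clear();
}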
DataContext throws an "object tracking is not enabled for the current datacontext instance" exception when I try to add new entities to the db, as below.
db.Posts.InsertOnSubmit(newEntity);
Enabling change tracking is not a solution for me because it is too slow when I have many insert operations.
What is the solution in this case?
You cannot have your cake and eat it too.
Depending on your database structure, you could consider using two DataContexts:
one with change tracking enabled, one with it disabled.
However, you will still get one INSERT statement per record. That is just how LINQ to SQL operates, and there is no way around it within L2S. For bulk insertions you have to look into the SqlBulkCopy class.
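A hedged sketch of such a bulk insert (table and column names are assumptions; the DataTable columns must match the destination table):

using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
table.Columns.Add("Title", typeof(string));
table.Columns.Add("Body", typeof(string));
foreach (var post in newPosts)
    table.Rows.Add(post.Title, post.Body);

using (var conn = new SqlConnection(connectionString))
using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Posts" })
{
    conn.Open();
    bulk.WriteToServer(table); // one bulk operation instead of one INSERT per row
}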
Typically, enabling and disabling object tracking simply wires up or ignores the change-tracking event handlers. If you are inserting so many items that wiring up these events becomes too slow, you have a much bigger problem.
Remember, LINQ to SQL issues a separate database request for each record you add. The network round trips will surely be a bigger bottleneck than wiring up the change-tracking events. LINQ to SQL isn't the best choice for bulk inserts; consider SSIS or bulk copy for that kind of operation.
I have inherited a project that uses LLBLGen Pro for the DB layer. The DB model requires that when an entry is deleted, a flag is set instead (DeletedDate is set to the current time). The last programmer ignored this requirement and used regular deletes throughout the entire application.
Is there a way to configure the code generator to do this automatically, or do I have to overload each delete operation for the entities that require it?
I implemented this in SQL Server 2005 using INSTEAD OF DELETE triggers on every soft-delete table. The triggers set the delete flag and perform any clean-up. The beauty of this solution is that it correctly handles deletes issued by any system that accesses the database. INSTEAD OF triggers are relatively new in SQL Server; I know Oracle has an equivalent.
This solution also plays nicely with our O/R mapper: I created views that filter out soft-deleted records and mapped those. The views are also used for all reporting.
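A hedged sketch of such a trigger (the table, key, and column names are assumptions), shown as DDL executed once from C#:

using System.Data.SqlClient;

const string ddl = @"
CREATE TRIGGER trg_Orders_SoftDelete ON dbo.Orders
INSTEAD OF DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Turn the DELETE into a soft delete.
    UPDATE o
    SET    o.DeletedDate = GETDATE()
    FROM   dbo.Orders o
    JOIN   deleted d ON o.Id = d.Id;
END";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(ddl, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}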
You could create a custom task in LLBLGen that overrides those operations for you when entities are generated. Check out their Template Studio and the template examples on the website.
It depends on whether you are using self-servicing or adapter. With self-servicing, you will need to modify the template so that it sets the flag rather than deleting the entity.
With adapter, you can inherit from DataAccessAdapter and override the delete methods to set the flag rather than delete the entities.
It's generally a crappy solution for performance though, as every query then needs to filter out "deleted" entities. And because the selectivity of the "deleted" column won't be very high (all of your "undeleted" records are NULL - I'm guessing this will be the majority of them), indexing it doesn't gain you a great deal; you will end up with a lot of table scans.