I have one DataTable with 26 columns in it.
I need to update a specific column based on a filter, but I don't want to do it with iteration because the table has thousands of records and that will affect performance.
Is there any way to do that?
I am new to LINQ, so I searched for a solution but haven't found a proper one. There are some solutions out there, but I can't understand them.
Does anyone have a solution?
This is where you have to either drop into ADO.NET or seriously customize LINQ or EF.
Bulk inserts and updates are not something they do nicely.
Is batch or bulk insert possible in Linq 2 Sql?
The same goes for EF.
Multiple-row updates are not supported by EF. For this you are better off using a stored procedure; that is why EF provides support for executing stored procedures. Use it and enjoy :)
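To make that concrete, here is a minimal sketch of the raw-SQL/stored-procedure route, assuming EF 6; the dbo.Orders table and its Status/Shipped columns are hypothetical. A single set-based UPDATE runs on the server instead of iterating thousands of rows on the client, and a mapped stored procedure could be called the same way:

    using System.Data.Entity;        // EF 6
    using System.Data.SqlClient;

    public class ShopContext : DbContext { }

    public static class OrderUpdater
    {
        // One set-based UPDATE on the server instead of looping
        // over thousands of entities on the client.
        public static int MarkShipped(ShopContext db, int statusFilter)
        {
            return db.Database.ExecuteSqlCommand(
                "UPDATE dbo.Orders SET Shipped = 1 WHERE Status = @status",
                new SqlParameter("@status", statusFilter));
        }
    }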
I currently have a populated DataTable, but I'm having trouble inserting it into my existing (empty) database.
I have looked into SqlBulkCopy as well, but haven't had much luck.
When using Entity Framework, I would expect:
_db.TableName.AddRange(dt);
to properly insert the new data from my DataTable.
Where am I going wrong here?
You are missing a few steps. First you need to create the table in the database; this is generally not done at run time, for lots of complex reasons. Research Entity Framework Code First for more details on how to do this.
Secondly if you truly want to use Entity Framework you will need to convert your data table into a set of objects of the correct type.
Honestly, though, I would avoid Entity Framework here: embed a CREATE TABLE script (or, even better, define the table before you execute) and then use SqlBulkCopy. Just note that SqlBulkCopy needs the table defined ahead of time.
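A minimal sketch of that approach; the dbo.Target table and its two columns are placeholders for whatever schema matches your DataTable:

    using System.Data;
    using System.Data.SqlClient;

    public static class BulkLoader
    {
        // Create the destination table up front (SqlBulkCopy will not
        // create it for you), then stream the DataTable into it.
        public static void Load(string connectionString, DataTable dt)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // Hypothetical schema; match it to your DataTable's columns.
                using (var create = new SqlCommand(
                    "IF OBJECT_ID('dbo.Target') IS NULL " +
                    "CREATE TABLE dbo.Target (Id INT, Name NVARCHAR(100))", conn))
                {
                    create.ExecuteNonQuery();
                }

                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.Target";
                    bulk.WriteToServer(dt);
                }
            }
        }
    }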
When using LINQ to SQL and updating something like a cross-reference table, there could already be records in the table that just need to stay there, records that will change, and records that should be removed. What is the best practice for handling this? I am thinking of deleting everything and recreating it. Is that bad?
Should I delete the reference records and repopulate all of them, naturally removing what is no longer needed and creating what is needed?
or
Should I attempt to perform some kind of check and reconcile what is old with what is being added?
or
Is there a better way?
LINQ to SQL is not the best tool for scenarios like this.
Basically, you will have to write the update/insert/delete logic yourself. You can use exists(), any(), etc. to create the sets, but the resulting SQL will still be individual inserts and updates.
It is a query language, after all.
In your case, I would do a merge through a stored procedure and call that.
A MERGE statement (StackOverFlow Merge Example) might work out best in this situation. It will allow for multiple rows to be manipulated at the same time at your discretion.
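As a rough illustration of calling such a merge from the client, here is a hedged sketch. The stored procedure dbo.MergeCrossRef and the user-defined table type dbo.CrossRefRowType are hypothetical; the procedure body would contain the actual MERGE (WHEN MATCHED / WHEN NOT MATCHED BY TARGET / WHEN NOT MATCHED BY SOURCE THEN DELETE) logic:

    using System.Data;
    using System.Data.SqlClient;

    public static class CrossRefMerger
    {
        // Sends the whole desired set of (ParentId, ChildId) pairs in one
        // call; the stored procedure decides what to insert, keep or delete.
        public static void Merge(string connectionString, DataTable pairs)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.MergeCrossRef", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;

                var p = cmd.Parameters.Add("@rows", SqlDbType.Structured);
                p.TypeName = "dbo.CrossRefRowType";   // user-defined table type
                p.Value = pairs;                      // table-valued parameter

                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }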
I have a scenario where I need to synchronize a database table with a list (XML) from an external system.
I am using EF but am not sure which would be the best way to achieve this in terms of performance.
There are two ways to do this as I see it, but neither seems efficient to me.
Call the DB each time
- Read each entry from the XML
- Try to retrieve the entry from the list
- If no entry is found, add the entry
- If found, update the timestamp
- At the end of the loop, delete all entries with an older timestamp

Load all objects and work in memory
- Read all EF objects into a list
- Delete all EF objects
- Add an item for each item in the XML
- Save the changes to the DB
The lists are not that long, estimating around 70k rows. I don't want to clear the db table before inserting the new rows, as this table is a source for data from a webservice, and I don't want to lock the table while its possible to query it.
If I was doing this in T-SQL I would most likely insert the rows into a temp table and join to find missing and deleted entries, but I have no idea what the best way to handle this in Entity Framework would be.
Any suggestions / ideas ?
The general problem with Entity Framework is that, when changing data, it will fire a query for each changed record anyway, regardless of lazy or eager loading. So by nature it will be extremely slow (think a factor of 1000+).
My suggestion is to use a stored procedure with a table-valued parameter and ignore Entity Framework altogether. You could use a MERGE statement.
70k rows is not much, but 70k individual insert/update/delete statements are always going to be very slow.
You could test it and see if the performance is manageable, but intuition says Entity Framework is not the way to go.
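One way to sidestep EF entirely, close to the temp-table-and-join idea from the question, is to bulk-load the parsed XML rows into a staging table and let a single MERGE reconcile inserts, updates and deletes on the server. A rough sketch, with hypothetical dbo.Items and dbo.StagingItems tables and an Id/Name schema:

    using System.Data;
    using System.Data.SqlClient;

    public static class XmlSync
    {
        public static void Sync(string connectionString, DataTable xmlRows)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // Stage the parsed XML rows in bulk.
                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.StagingItems";
                    bulk.WriteToServer(xmlRows);
                }

                // One server-side MERGE handles insert/update/delete.
                const string merge = @"
                    MERGE dbo.Items AS target
                    USING dbo.StagingItems AS source ON target.Id = source.Id
                    WHEN MATCHED THEN
                        UPDATE SET target.Name = source.Name
                    WHEN NOT MATCHED BY TARGET THEN
                        INSERT (Id, Name) VALUES (source.Id, source.Name)
                    WHEN NOT MATCHED BY SOURCE THEN
                        DELETE;
                    TRUNCATE TABLE dbo.StagingItems;";

                using (var cmd = new SqlCommand(merge, conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }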
I would iterate over the elements in the XML and update the corresponding row in the DB one at a time. I guess that's what you meant with your first option? As long as you have a good query plan to select each row, that should be pretty efficient. Like you said, 70k rows isn't that much so you are better off keeping the code straightforward rather than doing something less readable for a little more speed.
It depends. It's OK to use EF if there won't be many changes (say, fewer than a few hundred). Otherwise you need to bulk insert into the DB and merge the rows inside the database.
In the past I have always used the SqlBulkCopy class to do bulk inserts, but I don't know how to implement this using LINQ to SQL. If the rows are inserted one by one, efficiency will be very low.
Any good ideas?
Thanks in advance and sorry for my poor English.
Simple: you don't. You just use SqlBulkCopy. LINQ-to-SQL is simply a tool. SqlBulkCopy is a tool. Use the right tool for each job. Sometimes that means using something that isn't LINQ-to-SQL. This might mean creating a DataTable (or a spoof IDataReader if you are feeling ambitious) to represent the data; look perhaps at Convert generic List/Enumerable to DataTable? to get from your typed objects to a DataTable you can feed to SqlBulkCopy.
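As a rough sketch of that last step, here is a simplified reflection-based conversion (column types and null handling may need adjusting for your model) feeding straight into SqlBulkCopy:

    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Data.SqlClient;

    public static class BulkInsert
    {
        // Flattens the public properties of T into DataTable columns so
        // typed objects can be fed to SqlBulkCopy.
        public static DataTable ToDataTable<T>(IEnumerable<T> items)
        {
            var table = new DataTable(typeof(T).Name);
            var props = TypeDescriptor.GetProperties(typeof(T));

            foreach (PropertyDescriptor prop in props)
                table.Columns.Add(prop.Name,
                    Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

            foreach (var item in items)
            {
                var row = table.NewRow();
                foreach (PropertyDescriptor prop in props)
                    row[prop.Name] = prop.GetValue(item) ?? (object)DBNull.Value;
                table.Rows.Add(row);
            }
            return table;
        }

        public static void Write<T>(string connectionString, string tableName, IEnumerable<T> items)
        {
            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = tableName;   // e.g. "dbo.Customers"
                bulk.WriteToServer(ToDataTable(items));
            }
        }
    }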
See this article on SubmitChanges().
Your changes are not transmitted to the server until you explicitly call SubmitChanges on the DataContext.
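A minimal LINQ to SQL sketch of that behaviour, using a hand-mapped entity (the dbo.Customers table and its columns are hypothetical):

    using System.Data.Linq;
    using System.Data.Linq.Mapping;

    // Minimal hand-mapped LINQ to SQL entity; table/column names are hypothetical.
    [Table(Name = "dbo.Customers")]
    public class Customer
    {
        [Column(IsPrimaryKey = true, IsDbGenerated = true)]
        public int Id { get; set; }

        [Column]
        public string Name { get; set; }
    }

    public static class SubmitChangesDemo
    {
        public static void Run(string connectionString)
        {
            using (var db = new DataContext(connectionString))
            {
                var customers = db.GetTable<Customer>();

                customers.InsertOnSubmit(new Customer { Name = "Contoso" });
                // Nothing has been sent to the server yet...

                db.SubmitChanges();   // ...the INSERT happens here.
            }
        }
    }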
I am inserting records into the database (100, 1,000, 10,000 and 100,000 rows) using two methods
(it is a table with no primary key and no index):
- using a for loop and inserting one by one
- using a stored procedure
The times are, of course, better using the stored procedure.
My questions are: 1) if I use an index, will the operation go faster, and 2) is there any other way to do the insertion?
PS: I am using iBATIS as the ORM, if that makes any difference.
Check out SqlBulkCopy.
It's designed for fast insertion of bulk data. I've found it to be fastest when using the TableLock option and setting a BatchSize of around 10,000, but it's best to test the different scenarios with your own data.
You may also find the following useful.
SQLBulkCopy Performance Analysis
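A short sketch of those settings; dbo.Target is a placeholder and the batch size is only a starting point to test against your own data:

    using System.Data;
    using System.Data.SqlClient;

    public static class FastInsert
    {
        public static void Run(string connectionString, DataTable rows)
        {
            // Take a bulk-update lock on the destination table and
            // commit in batches of 10,000 rows.
            using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
            {
                bulk.DestinationTableName = "dbo.Target";
                bulk.BatchSize = 10000;
                bulk.BulkCopyTimeout = 0;   // no timeout for large loads
                bulk.WriteToServer(rows);
            }
        }
    }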
No, I suspect that if you use an index it will actually go slower, because the index has to be updated as well as the data inserted.
If you're reasonably certain that the data won't have duplicate keys, add the index after you've inserted all the rows. That way it's built once rather than being added to and re-balanced on every insert.
That's a function of the DBMS. I know it's true for the one I use frequently (which is not SQL Server).
I know this is slightly off-topic, but it's a shame you're not using SQL Server 2008, as there has been a massive improvement in this area with the advent of the MERGE statement and user-defined table types (which allow you to pass a 'table' of data to a stored procedure or statement so you can insert/update many records in one go).
For some more information, have a look at http://www.sql-server-helper.com/sql-server-2008/merge-statement-with-table-valued-parameters.aspx
This was already discussed: Insert data into SQL server with best performance.