Azure SQL handling table locks for huge inserts - c#

In my application, I have a Windows service (hosted as a Quartz job on an Azure web app) which runs at a particular time, reads a file, and inserts the data. The data can be of any length, so the column type in the DB is "text". All records are inserted into one table.
The problem is that the service might run in parallel and try to insert records into the table at the same time. Since the data might be huge and I also care about performance, I want to let the service run in parallel. I am using EF 6.0 and LINQ. Is there a way to insert huge amounts of data without locking the table?
Note: Bulk insert might not work, as the data to insert includes a 'text' column.

I am assuming that there is one column of type "text" in your table and nothing else. Based on that assumption, I would not expect any fundamental semantic problems with running this in parallel.
There are a couple of ways of doing this:
Use the SqlBulkCopy class with row batches. This copies batches of rows and loads them into the table, and it can be done in parallel (see the sketch below).
Use a temporary staging table for each service instance, and then run INSERT INTO ... SELECT for each staging table. The INSERT portion will run serially because it takes an X lock on the table, but this may still be faster than loading one row at a time.
Neither of these methods uses EF 6.0 and LINQ, but they will get the job done.
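As a rough illustration of the first option, here is a minimal SqlBulkCopy sketch. The table and column names (dbo.Records, Data) and the batch size are assumptions for illustration; SqlBulkCopy itself handles text/varchar(max) columns.

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    public static void Load(string connectionString, IEnumerable<string> lines)
    {
        DataTable buffer = new DataTable();
        buffer.Columns.Add("Data", typeof(string));     // maps to the text column

        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.Records";
            bulkCopy.BatchSize = 5000;                   // rows sent per batch
            bulkCopy.ColumnMappings.Add("Data", "Data");

            foreach (string line in lines)
            {
                buffer.Rows.Add(line);
                if (buffer.Rows.Count == bulkCopy.BatchSize)
                {
                    bulkCopy.WriteToServer(buffer);      // push a full batch
                    buffer.Clear();
                }
            }
            if (buffer.Rows.Count > 0)
                bulkCopy.WriteToServer(buffer);          // flush the final partial batch
        }
    }
}

Each parallel service instance can run this independently against its own file.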
Hope this helps!

Related

How to update huge sql data in asp .net application using c#

I am building a web application using C# .NET with SQL Server 2008 as the back end. The application reads data from Excel and inserts it into a SQL table. For this I have used the SqlBulkCopy function, which works very well. The SQL table has 50 fields, two of which are system_error and mannual_error. After inserting records into the other 48 columns, I need to re-check all of those records and update the two error columns with specific messages, e.g. "Name field contains a number", "Qty not specified", etc. For this I have to check each column by fetching the rows into a DataTable and using a for loop.
It works very well when there are 1,000 to 5,000 records, but it takes a huge amount of time, around 50 minutes, when there are 100,000 records or more.
Initially I used a simple SQL UPDATE query and then a stored procedure, but both take about the same time.
How can I increase the performance of the application? What are other ways of updating a huge amount of data? Please suggest.
I suppose this is why people use MongoDB and NoSQL systems. You can update huge data sets by optimizing your query. Read more here:
http://www.sqlservergeeks.com/blogs/AhmadOsama/personal/450/sql-server-optimizing-update-queries-for-large-data-volumes
Also check: Best practices for inserting/updating large amount of data in SQL Server 2008
One thing to consider is that iterating over a database table row by row, rather than performing set-based update operations, will incur a significant performance hit.
If you are in fact performing set-based updates on your data and still have significant performance problems, you should look at the execution plan of your queries so that you can work out where and why they are performing so badly.
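To make the row-by-row vs. set-based point concrete, here is a minimal sketch that replaces the per-row loop with a single UPDATE. The table name (ImportedData) and the Name/Qty checks are assumptions for illustration; only the system_error and mannual_error columns come from the question.

using System.Data.SqlClient;

class SetBasedUpdateExample
{
    public static void FlagErrors(string connectionString)
    {
        // One statement flags every imported row instead of updating rows one at a time.
        const string sql = @"
            UPDATE ImportedData
            SET system_error  = CASE WHEN Name LIKE '%[0-9]%' THEN 'Name field contains a number' ELSE system_error END,
                mannual_error = CASE WHEN Qty IS NULL THEN 'Qty not specified' ELSE mannual_error END;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0;   // large updates can exceed the default 30-second timeout
            connection.Open();
            command.ExecuteNonQuery();    // one round trip instead of one per row
        }
    }
}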

SQL Server - insert multiple values - what is the right way?

I have a .NET application that works against SQL Server. This app gets data from a remote third-party API, and I need to insert that data into my database in a transaction.
First I delete all existing data from the tables, then I insert each row of data that I get from the API.
I wrote a stored procedure that accepts parameters and does the insert. Then I call that stored procedure in a loop, inside a transaction, from .NET.
I'm guessing there's a smarter way to do this?
Thanks
If you're inserting thousands or maybe even tens of thousands of rows, you can probably do best with table-valued parameters (see the sketch below).
If you're doing more than that, you should probably look at SQL Server's dedicated bulk insert feature. That might not work great transactionally, if I remember correctly.
Either way, TRUNCATE is much faster than DELETE.
What I've done in the past to avoid needing transactions is to create two tables and use a third to decide which one is the active one. That way you always have a table with valid data and no write locks.
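Here is a minimal sketch of the table-valued-parameter approach. It assumes a user-defined table type (dbo.ItemTableType) and a stored procedure (dbo.InsertItems) already exist on the server; those names, and the expected row shape, are placeholders.

using System.Data;
using System.Data.SqlClient;

class TvpInsertExample
{
    public static void InsertAll(string connectionString, DataTable items)
    {
        // "items" must have the same columns, in the same order, as dbo.ItemTableType.
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            using (var command = new SqlCommand("dbo.InsertItems", connection, transaction))
            {
                command.CommandType = CommandType.StoredProcedure;

                var parameter = command.Parameters.AddWithValue("@Items", items);
                parameter.SqlDbType = SqlDbType.Structured;
                parameter.TypeName = "dbo.ItemTableType";

                command.ExecuteNonQuery();   // one call for the whole set instead of one per row
                transaction.Commit();
            }
        }
    }
}

The stored procedure body can then be a single INSERT ... SELECT (or a MERGE) from the @Items parameter.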

Efficient Update of Table from One SQL Server to Another, Same Table Structure

I have one database server, acting as the main SQL Server, containing a table that holds all of the data. Other database servers (different instances of SQL Server) come and go. When they come online they need to download data from the main table (for a given time period), they then generate their own additional data into the same table in their local SQL Server database, and then they want to update the main server with only the new data, using a C# program run by a scheduled service every so often. Multiple additional servers could be generating data at the same time, although there won't be that many of them.
The main table will always be online. The additional, non-main database table is not always online and is not an identical copy of the main table: first it contains a subset of the main data, then it generates its own additional data locally and updates the main table with those additions every so often. A decent number of rows could be generated and/or downloaded, so an efficient algorithm is needed to copy from the extra database to the main table.
What is the most efficient way to transfer this in C#? SqlBulkCopy doesn't look like it will work, because I can't have duplicate entries on the main server and it would fail the constraint checks since some entries already exist.
You could do it in the DB or in C#. Either way, you will need to do something like Using FULL JOINs to Compare Datasets. You know that already.
The most important thing is to do it in a transaction. If you have 100k rows, split them into batches of, say, 1,000 rows per transaction, or experiment to find what number of rows per transaction works best for you.
Use Dapper. It's really fast.
If you have all your data in C#, use a TVP to pass it to a DB stored procedure. In the stored procedure, use MERGE to UPDATE/DELETE/INSERT the data (a sketch follows below).
And lastly, in C# use Dictionary<TKey, TValue> or something else with O(1) access time.
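Here is a rough sketch of the TVP + MERGE idea, run as a parameterized batch from C#. The names dbo.MainTable, dbo.MainTableType and the Id/Payload columns are placeholders, and in practice you would probably wrap the MERGE in a stored procedure instead.

using System.Data;
using System.Data.SqlClient;

class MergeUpsertExample
{
    // Upserts the new/changed rows into the main table in one statement.
    const string MergeSql = @"
        MERGE dbo.MainTable AS target
        USING @Rows AS source
            ON target.Id = source.Id
        WHEN MATCHED THEN
            UPDATE SET target.Payload = source.Payload
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (Id, Payload) VALUES (source.Id, source.Payload);";

    public static void Upsert(string connectionString, DataTable rows)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(MergeSql, connection))
        {
            var parameter = command.Parameters.AddWithValue("@Rows", rows);
            parameter.SqlDbType = SqlDbType.Structured;
            parameter.TypeName = "dbo.MainTableType";   // user-defined table type created beforehand

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}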
SqlBulkCopy is the fastest way to insert data into a table from a C# program. I have used it to copy data between databases, and so far nothing beats it speed-wise. Here is a nice generic example: Generic bulk copy.
I would use an IsProcessed flag in the main server's table and keep track of the main table's primary keys when you download data to the local DB server. Then you should be able to do a delete and an update against the main server again.
Here's how I would do it:
Create a stored procedure in the main database which receives a user-defined table type parameter with the same structure as the main table.
It should do something like:
INSERT INTO yourtable SELECT * FROM tablevar
Or you could use the MERGE statement for insert-or-update functionality.
In code (a Windows service), load all (or part) of the data from the secondary table and send it to the stored procedure as a table-valued parameter.
You could do it in batches of 1,000, and each time a batch is uploaded you should mark it in the source table / source updater code.
Can you use linked servers for this? If so, it will make copying data to and from the main server much easier.
When copying data back to the main server, I'd use IF EXISTS before each INSERT statement to additionally make sure there are no duplicates, and I'd wrap all of the INSERT statements in a transaction so that if an error occurs the transaction is rolled back.
I also agree with others on doing this in batches of 1,000 or so records, so that if something goes wrong you can limit the damage.

Batch Updates using DataAdapter

I have a situation where I have a bunch of SQL UPDATE commands that all need to be executed. I know that DataSets can do batch updates, but the only way I've been able to accomplish it is to load the whole table into a DataSet first. What if I want to update only a subset of the records in a table?
The easiest way out here is to load the table with only the required rows (the rows you want to update). Double-check that the RowState is "Added".
Assign the InsertCommand property of the adapter to your stored procedure (wrapped in a SqlCommand) that does the "update"; this tweak ensures that all the rows present in the table get updated.
The fundamental point is that the DataAdapter runs the UpdateCommand for rows whose state is Modified, the InsertCommand for rows whose state is Added, and the DeleteCommand for rows whose state is Deleted.
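For the more common case, where you load just the subset, modify the rows, and let the adapter batch the UPDATEs, a rough sketch might look like this. The table, columns and WHERE filter are placeholders; UpdateBatchSize is what turns per-row round trips into batches.

using System.Data;
using System.Data.SqlClient;

class BatchUpdateExample
{
    public static void MarkRegionProcessed(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(
            "SELECT Id, Status FROM dbo.Orders WHERE Region = 'EU'", connection))
        {
            // Generate an UPDATE command from the SELECT above (the SELECT must include the primary key).
            var builder = new SqlCommandBuilder(adapter);
            adapter.UpdateCommand = builder.GetUpdateCommand();

            // Batching requires that no per-row values flow back from the server.
            adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;
            adapter.UpdateBatchSize = 500;    // statements sent per round trip

            var table = new DataTable();
            adapter.Fill(table);              // only the subset matching the WHERE clause

            foreach (DataRow row in table.Rows)
                row["Status"] = "Processed";  // each changed row becomes Modified

            adapter.Update(table);            // batched UPDATEs instead of one round trip per row
        }
    }
}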
EDITED: Based on your comment, I'd recommend using the bulk copy method (SqlBulkCopy) to load a staging table first. Then you can do a single update on your real table based on the staging table.
=========
One way is to build the SQL command yourself; however, I'd recommend reading about SQL injection to protect yourself. Depending on your situation and your platform there are other options.
For example, if you are dealing with a lot of data, you could do a bulk import into a holding table and then issue a single UPDATE command off of it. I've also had good success passing records in as XML (I found in my environment that I needed at least 50 rows to offset the cost of loading the DOM, and I knew that scalability was not a factor in my case).
Something else I've seen is people packing the updates into a binary field (for example, one binary field per column) and then using a function to unpack it. Again, you will want to test this in your environment.
With that said, have you verified that simply calling a single update command or stored procedure for each update you need is not sufficient? You might not even need to batch the commands up.
Josh
Watch out for Premature Optimization, it can be a killer!

Maintain a local copy of a table from an external database table, ADO.NET

We have built an application which needs a local copy of a table from another database. I would like to write an ADO.NET routine which will keep the local table in sync with the master. We are using .NET 2.0, C# and ADO.NET.
Please note I really have no control over the master table, which belongs to a third-party, mission-critical app I don't wish to mess with.
For example, here is the master data table:
ProjectCodeId Varchar(20) [PK]
ProjectCode Varchar(20)
ProjectDescrip Varchar(50)
OtherUneededField int
OtherUneededField2 int
The local table we need to keep in sync...
ProjectCodeId Varchar(20) [PK]
ProjectCode Varchar(20)
ProjectDescrip Varchar(50)
Perhaps a better way to put this question is: what have you done in the past for this type of problem? What has worked best for you, or what should be avoided at all costs?
My goal with this question is to determine a good way to handle this. I am so often combining data from two or more disjointed data sources. I haven't specified database platforms for this reason; it really shouldn't matter. In the current situation both databases are MSSQL, but I would prefer a solution that doesn't use linked databases, DTS, etc.
Sure, truncating the local table and refilling it each time from the master is an option, but with thousands of rows I don't think that's very efficient. Do you?
EDIT: First, recognize that what you are doing is hand-rolled replication, and replication is never simple.
You need to track and apply all of the CRUD state changes. That said, ADO.NET can do this.
To track changes to the source you can use Query Notifications with your source database. This requires special permission against the database, so the owner of the source database will need to take action to enable this solution. I haven't used this technique myself, but here is a description of it.
See "Query Notifications in SQL Server (ADO.NET)"
Query notifications were introduced in Microsoft SQL Server 2005 and the System.Data.SqlClient namespace in ADO.NET 2.0. Built upon the Service Broker infrastructure, query notifications allow applications to be notified when data has changed. This feature is particularly useful for applications that provide a cache of information from a database, such as a Web application, and need to be notified when the source data is changed.
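To give a flavor of the client side, here is a rough sketch of registering a query notification with SqlDependency. The column names come from the master table above; the table name dbo.ProjectCodes and the refresh callback are placeholders, and Service Broker must be enabled on the source database for this to work.

using System;
using System.Data;
using System.Data.SqlClient;

class ProjectCodeWatcher
{
    public static void Watch(string connectionString, Action refreshLocalCopy)
    {
        // Call once per connection string; call SqlDependency.Stop at shutdown.
        SqlDependency.Start(connectionString);

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT ProjectCodeId, ProjectCode, ProjectDescrip FROM dbo.ProjectCodes",
            connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (sender, e) =>
            {
                // Notifications are one-shot: refresh the local copy, then re-register.
                refreshLocalCopy();
            };

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // The query must execute for the notification to be registered;
                // the results could be used here to prime the local cache.
            }
        }
    }
}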
To apply changes from the source DB table, you need to retrieve the data from the target DB table, apply the changes to the target rows, and post the changes back to the target DB.
To apply the changes you can either
1) Delete and reinsert all of the rows (simple), or
2) Merge row-by-row changes (hard).
Delete and reinsert is self-explanatory, so I won't go into detail on that.
For row-by-row change tracking, here is an approach. (I am assuming here that Query Notifications don't give you row-by-row change information, so you have to calculate it.)
You need to determine which rows were modified and identify inserted and deleted rows. Create a DataView with a sort for each table to get a Find method you can use to look up matching rows by ID.
Identify modified rows by using a datetime/timestamp column, or by comparing all field values. Copy modified values to the target row.
Identify added and deleted rows by looping over each table's DataView and using the other DataView's Find method to spot rows that do not appear in the first table. Insert or delete rows in the target table as required. (The Delete method doesn't remove the row; it marks it for deletion, which the adapter's Update applies.) A condensed sketch follows below.
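A condensed sketch of that DataView-based merge, assuming both DataTables have already been filled and share the ProjectCodeId key from the schema above:

using System.Data;

class TableSync
{
    // Merges freshly downloaded master rows (source) into the local DataTable (target)
    // so that a DataAdapter.Update call can push the changes to the local database.
    public static void Merge(DataTable source, DataTable target)
    {
        DataView sourceView = new DataView(source);
        sourceView.Sort = "ProjectCodeId";
        DataView targetView = new DataView(target);
        targetView.Sort = "ProjectCodeId";

        // New and modified rows: present in source, missing or different in target.
        foreach (DataRowView sourceRow in sourceView)
        {
            int index = targetView.Find(sourceRow["ProjectCodeId"]);
            if (index < 0)
            {
                target.Rows.Add(sourceRow.Row.ItemArray);                 // RowState becomes Added
            }
            else
            {
                DataRow targetRow = targetView[index].Row;
                if (!Equals(targetRow["ProjectCode"], sourceRow["ProjectCode"]) ||
                    !Equals(targetRow["ProjectDescrip"], sourceRow["ProjectDescrip"]))
                {
                    targetRow["ProjectCode"] = sourceRow["ProjectCode"];   // RowState becomes Modified
                    targetRow["ProjectDescrip"] = sourceRow["ProjectDescrip"];
                }
            }
        }

        // Deleted rows: present in target but no longer in source.
        for (int i = targetView.Count - 1; i >= 0; i--)
        {
            if (sourceView.Find(targetView[i]["ProjectCodeId"]) < 0)
                targetView[i].Row.Delete();   // marked Deleted; the adapter's Update removes it
        }
    }
}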
Good luck!
+tom
I would push in the direction of having the application that inserts the data insert into one DB/table and then the other in the same function. Make the application do the work, and the data will already be pushed to both places.
Some questions: what DB platform? How are you using the data?
I'm going to assume you're just using this data as a lookup... and since you have no timestamp and no ability to modify the existing table, I'd just blow away the local copy periodically and pull it down from the master table again (a sketch follows below).
Unless you've got a hell of a lot of data, the overhead for this should be pretty small.
If you need to sync back to the master table, you'll need to do something a bit more exotic.
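A minimal sketch of that periodic blow-away-and-refill approach, using TRUNCATE plus SqlBulkCopy; the dbo.ProjectCodes table name on both sides and the connection strings are placeholders.

using System.Data.SqlClient;

class LookupRefresher
{
    public static void Refresh(string masterConnectionString, string localConnectionString)
    {
        using (SqlConnection master = new SqlConnection(masterConnectionString))
        using (SqlConnection local = new SqlConnection(localConnectionString))
        {
            master.Open();
            local.Open();

            // Empty the local copy.
            using (SqlCommand truncate = new SqlCommand("TRUNCATE TABLE dbo.ProjectCodes", local))
                truncate.ExecuteNonQuery();

            // Stream the master rows straight into the local table.
            using (SqlCommand select = new SqlCommand(
                "SELECT ProjectCodeId, ProjectCode, ProjectDescrip FROM dbo.ProjectCodes", master))
            using (SqlDataReader reader = select.ExecuteReader())
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(local))
            {
                bulkCopy.DestinationTableName = "dbo.ProjectCodes";
                bulkCopy.WriteToServer(reader);
            }
        }
    }
}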
Can you use SQL replication? It would be preferable to writing code to do it, no?
