C# ANSI Database Bulk Insert

I am looking for a way to do multi row inserts with C# which can be universal enough so that the same procedures can be applied to more than just SQL Server.
Right now, we have an application that can use SQLite, PostgreSQL, and SQL Server as a back end to store data, with the possibility of adding Oracle into the mix as well. For that reason, it would be great to have a single procedure that can insert multiple rows in one round trip, and for that same procedure to work regardless of which data client is being used. (For this reason, I am not using a SQL Server stored procedure.)
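One portable approach is to build a single parameterized multi-row INSERT ... VALUES (...), (...) statement against the abstract DbConnection/DbCommand classes, so the same code runs on any ADO.NET provider. A minimal sketch, with the table and column names left to the caller; note that Oracle does not accept the multi-row VALUES syntax and would need a different statement shape:

```csharp
using System.Collections.Generic;
using System.Data.Common;
using System.Text;

public static class BulkInsertHelper
{
    // Inserts all rows in one round trip using a parameterized
    // multi-row VALUES list. Works with any DbConnection-based
    // provider that supports multi-row VALUES (SQL Server,
    // SQLite, PostgreSQL).
    public static void InsertRows(DbConnection conn, string table,
        string[] columns, IEnumerable<object[]> rows)
    {
        using (DbCommand cmd = conn.CreateCommand())
        {
            var sql = new StringBuilder();
            sql.Append("INSERT INTO ").Append(table)
               .Append(" (").Append(string.Join(", ", columns))
               .Append(") VALUES ");

            int i = 0;
            var tuples = new List<string>();
            foreach (object[] row in rows)
            {
                var names = new List<string>();
                foreach (object value in row)
                {
                    string name = "@p" + i++;
                    names.Add(name);
                    DbParameter p = cmd.CreateParameter();
                    p.ParameterName = name;
                    p.Value = value;
                    cmd.Parameters.Add(p);
                }
                tuples.Add("(" + string.Join(", ", names) + ")");
            }
            sql.Append(string.Join(", ", tuples));

            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery();
        }
    }
}
```

Be aware that providers cap the number of parameters per command (SQL Server at 2100, for example), so a large row set has to be split into batches below that limit.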

Related

How to insert bulk record in SQLite table from result of a SQL Server query through C# or directly from database

I have thousands of records from a SQL Server 2014 stored procedure result set, and I insert them one by one into a SQLite DB table through C# code, which takes around 4-5 minutes. I need to reduce this time.
I am looking for something like:
insert into 'sqlite_table'
select *
from 'sql_server_table'
Any answer with C# code or anything direct from SQL Server script can be helpful
In your C# code, use a SqlTransaction to speed up the insertion of records from one table to another.
I have done the same thing going from SQL Server to MySQL. First I disabled autocommit (SET autocommit = 0), and after the query ran I issued the commit command. It gave me good performance. You can try the same and see if it works for you.
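In SQLite, wrapping all the single-row inserts in one explicit transaction makes the database issue a single disk sync instead of one per row, which is usually what turns minutes into seconds. A rough sketch using the System.Data.SQLite provider; the table, columns, and row shape are placeholders for whatever your stored procedure returns:

```csharp
using System.Data.SQLite;

public static class SqliteCopy
{
    // Copies pre-fetched rows into SQLite inside one transaction,
    // reusing a single prepared command for every row.
    public static void CopyRows(string sqlitePath, (long Id, string Name)[] rows)
    {
        using (var conn = new SQLiteConnection("Data Source=" + sqlitePath))
        {
            conn.Open();
            using (SQLiteTransaction tx = conn.BeginTransaction())
            using (SQLiteCommand cmd = conn.CreateCommand())
            {
                cmd.Transaction = tx;
                cmd.CommandText =
                    "INSERT INTO sqlite_table (id, name) VALUES (@id, @name)";
                cmd.Parameters.Add("@id", System.Data.DbType.Int64);
                cmd.Parameters.Add("@name", System.Data.DbType.String);

                foreach (var row in rows)
                {
                    cmd.Parameters["@id"].Value = row.Id;
                    cmd.Parameters["@name"].Value = row.Name;
                    cmd.ExecuteNonQuery(); // no per-row commit
                }

                tx.Commit(); // one commit for the whole batch
            }
        }
    }
}
```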

Entity Framework and SQL Server temp tables

We have a legacy application which we are moving to C#, and I would like to use Entity Framework (EF) for data access. SQL Server 2012 is the database, and currently I am using EF 5.0. The original application did a lot of data intensive processing. As part of that processing, the original application made extensive use of “temp” tables. Most of these “temp” tables were not actually SQL Server temp tables (e.g. #sometable in tempdb), but real tables created and destroyed in their own “temp” database in SQL Server.
Now that I am working with C# and EF, I want to be able to use temp tables (of either type) as part of my data processing. I have done some googling on using temp tables with EF, and so far I have found that you can populate tempdb temp tables using SqlBulkCopy, but there is no way to query them through LINQ to Entities, as they are not part of the mapped model. The other option I have read about is using a stored procedure to do the processing and passing it a table-valued parameter. This would force us to have thousands of stored procedures and put a great deal of the business logic in sprocs.
Is there any other way to create and use temporary SQL Server tables? Does EF 6 have any added capabilities in this area?
I've never seen temp tables used in EF.
If the data process in question is that intense, it's probably better to leave it as a stored procedure (I'm assuming this is how the legacy code worked). EF can run stored procedures with no problem; you can even get the results back as a model entity.
If you really don't want to use a stored procedure, you could also simulate a temp table in EF by using a regular table and just filtering it to the current session (through the use of a GUID or some other throw-away synthetic key). This technique works fairly well, but has the disadvantage of needing to clean the garbage data out of the table when your procedure is done.
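The session-keyed technique described above can be sketched as follows. The entity, context, and table names are all hypothetical, and RemoveRange assumes EF 6 or later (in EF 5 you would remove the rows one at a time):

```csharp
using System;
using System.Data.Entity;
using System.Linq;

// Hypothetical entity standing in for the shared "pseudo temp" table.
public class WorkItem
{
    public int Id { get; set; }
    public Guid SessionId { get; set; }   // throw-away synthetic key
    public decimal Amount { get; set; }
}

public class MyDbContext : DbContext
{
    public DbSet<WorkItem> WorkItems { get; set; }
}

public static class PseudoTempTable
{
    public static decimal Process()
    {
        Guid session = Guid.NewGuid();

        using (var db = new MyDbContext())
        {
            // Stage intermediate rows tagged with this session's key.
            db.WorkItems.Add(new WorkItem { SessionId = session, Amount = 10m });
            db.SaveChanges();

            // Query only "our" rows, as if the table were private.
            decimal total = db.WorkItems
                .Where(w => w.SessionId == session)
                .Sum(w => w.Amount);

            // Clean the garbage data out when the procedure is done.
            db.WorkItems.RemoveRange(
                db.WorkItems.Where(w => w.SessionId == session));
            db.SaveChanges();

            return total;
        }
    }
}
```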

SQL Server - insert multiple values - what is the right way?

I have a .NET application that works against a SQL Server. This app gets data from a remote third party API, and I need to insert that data to my database in a transaction.
First I delete all existing data from the tables, then I insert each row of data that I get from the API.
I wrote a stored procedure that accepts parameters and does the insert. Then I call that stored procedure in a loop, inside a transaction, from .NET.
I'm guessing there's a smarter way to do this?
Thanks
If you're doing thousands or maybe even tens of thousands of rows, you can probably do best with table-valued parameters.
If you're doing more than that, then you should probably look at SQL Server's dedicated bulk insert support (SqlBulkCopy from .NET). That might not work great transactionally, if I remember correctly.
Either way, TRUNCATE is much faster than DELETE.
What I've done in the past to avoid needing transactions is create two tables, and use another for deciding which is the active one. That way you always have a table with valid data and no write locks.
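For what it's worth, SqlBulkCopy can enlist in an external transaction if you hand one to its constructor, so the delete-and-reload can still be atomic. A sketch, with the connection string and table name as placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class AtomicReload
{
    // Clears the table and bulk-loads the replacement rows inside
    // one transaction, so readers never see it half-empty.
    public static void ReplaceAll(string connString, DataTable rows)
    {
        using (var conn = new SqlConnection(connString))
        {
            conn.Open();
            using (SqlTransaction tx = conn.BeginTransaction())
            {
                using (var clear = new SqlCommand(
                    "DELETE FROM dbo.ApiData", conn, tx))
                {
                    clear.ExecuteNonQuery();
                }

                using (var bulk = new SqlBulkCopy(
                    conn, SqlBulkCopyOptions.Default, tx))
                {
                    bulk.DestinationTableName = "dbo.ApiData";
                    bulk.WriteToServer(rows);
                }

                tx.Commit();
            }
        }
    }
}
```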

insert directly or via a stored procedure

I am using SQL Server and WinForms for my application. Data is inserted into the database tables every minute by pressing a button on a form.
For this, I am using an INSERT query.
But if I create a stored procedure and put the same INSERT query inside it, would that be more efficient? What would the difference be?
Using stored procedures is more secure
A stored procedure would generally be quicker as the query plan is stored and does not need to be created for each call. If this is a simple insert the difference would be minimal.
A stored procedure can be run with execute permissions which is more secure than giving insert permissions to the user.
It depends on what you mean by 'efficient'.
Execution time - if you're only saving to the database every couple of seconds, then any speed difference between SPs and INSERT is most likely insignificant. If the volume were especially high, you would probably set up something like a command queue on the server before fine-tuning at this level.
Development time
Using INSERT means you can write your SQL directly in your codebase (in a repository or similar). I've seen that described as poor design, but I think that as long as you have integration tests around the query, there's no real problem.
Stored Procedures can be more difficult to maintain - you need to have a plan to deploy the new SP to the database. Benefits are that you can implement finer-grained security on the database itself (as #b-rain and #mark_s have said) and it is easy to decide between INSERT and UPDATE within the SP, whereas to do the same in code means making certain assumptions.
Personally (at the moment) I use inline SQL for querying and deleting, and stored procedures for inserting. I have a script and a set of migration files that I can run against the production database to deploy table and SP changes, which seems to work pretty well. I also have integration tests around both the inline SQL and the SP calls. If you go for inline SQL you definitely should use parameterised queries, it helps against SQL injection attacks and it is also easier to read and program.
If your DBA is even allowing you to do this without a stored procedure I'd be very suspicious...
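A parameterized inline INSERT of the kind recommended above might look like this; the table, columns, and connection string are placeholders:

```csharp
using System.Data.SqlClient;

public static class ReadingRepository
{
    public static void SaveReading(string connString, int sensorId, double value)
    {
        const string sql =
            "INSERT INTO dbo.Readings (SensorId, Value, TakenAt) " +
            "VALUES (@sensorId, @value, SYSUTCDATETIME())";

        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            // Parameters keep user input out of the SQL text,
            // guarding against injection and easing plan reuse.
            cmd.Parameters.AddWithValue("@sensorId", sensorId);
            cmd.Parameters.AddWithValue("@value", value);

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```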

Passing a dataset to a SQL Server stored procedure

I am facing a problem: I want to pass a dataset to a SQL Server stored procedure, and I have no idea how to do it (and I don't think there is an alternative). Let me explain what I want.
I have an Excel file to read. I read it successfully, and all the data from the workbook is imported into a DataSet. Now this data needs to be inserted into two different tables, and there are too many rows in the Excel workbook to insert them one at a time from the code-behind. That's why I want to pass this DataSet to a stored procedure and go from there.
Please suggest a solution.
Not knowing what database version you're working with, here are a few hints:
if you need to read the Excel file regularly, and split it up into two or more tables, maybe you need to use something like SQL Server Integration Services for this. With SSIS, you should be able to achieve this quite easily
you could load the Excel file into a temporary staging table, and then read the data from that staging table inside your stored procedure. This works, but it gets a bit messy when there's a chance that multiple concurrent calls need to be handled
if you're using SQL Server 2008 and up, you should look at table-valued parameters - you basically load the Excel file into a .NET DataSet and pass that to the stored proc as a special parameter. Works great, but wasn't available in SQL Server before the 2008 release
if you're stuck on SQL Server 2005, where table-valued parameters aren't available, you might want to look at Erland Sommarskog's excellent article Arrays and Lists in SQL Server 2005 - depending on how big your data set is, one of his approaches might work for you (e.g. passing the data as XML which you parse/shred inside the stored proc)
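The table-valued-parameter route from the SQL Server 2008+ bullet can be sketched like this. The procedure name, table type, and columns are all hypothetical, and matching T-SQL must already exist on the server (a CREATE TYPE ... AS TABLE definition and a proc with a READONLY parameter of that type):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class ExcelImport
{
    // Ships an entire DataTable to the server in one call; the
    // stored procedure splits it into the two destination tables.
    public static void SendToProc(string connString, DataTable importRows)
    {
        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand("dbo.ImportFromExcel", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            // The DataTable's columns must match the SQL table type
            // (dbo.ImportRowType) by position and type.
            SqlParameter p = cmd.Parameters.AddWithValue("@rows", importRows);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.ImportRowType";

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```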
