Passing a dataset to a SQL Server stored procedure - C#

I am facing a problem: I want to pass a DataSet to a SQL Server stored procedure, and I have no idea how to do it (and I don't think there is an alternative). Let me explain what I want.
I have an Excel file to read. I read it successfully, and all the data from this Excel workbook is imported into a DataSet. Now this data needs to be inserted into two different tables, and there are too many rows in the Excel workbook for it to be practical to insert them one by one from code-behind. That's why I want to pass this DataSet to a stored procedure instead.
Please suggest a solution.

Not knowing what database version you're working with, here are a few hints:
if you need to read the Excel file regularly and split it up into two or more tables, you may want to use something like SQL Server Integration Services (SSIS) for this. With SSIS, you should be able to achieve this quite easily
you could load the Excel file into a temporary staging table and then read the data from that staging table inside your stored procedure. This works, but it gets a bit messy when multiple concurrent calls need to be handled
if you're using SQL Server 2008 and up, you should look at table-valued parameters: you basically load the Excel file into a .NET DataTable and pass that to the stored proc as a special parameter (see the sketch after this list). Works great, but wasn't available in SQL Server before the 2008 release
if you're on SQL Server 2005 and table-valued parameters aren't available, have a look at Erland Sommarskog's excellent article Arrays and Lists in SQL Server 2005; depending on how big your data set is, one of his approaches might work for you (e.g. passing the data as XML which you parse/shred inside the stored proc)
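For the table-valued-parameter route, a minimal sketch might look like the following. It assumes SQL Server 2008+ and that a user-defined table type dbo.ImportRowType and a stored procedure dbo.ImportRows already exist on the server; both names (and the columns) are hypothetical.

// Assumed to exist on the server (names are made up):
//   CREATE TYPE dbo.ImportRowType AS TABLE (Col1 INT, Col2 NVARCHAR(100));
//   CREATE PROCEDURE dbo.ImportRows @Rows dbo.ImportRowType READONLY AS ...
using System.Data;
using System.Data.SqlClient;

static void SendRows(DataTable rows, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.ImportRows", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // SqlDbType.Structured marks the parameter as a TVP;
        // TypeName must match the user-defined table type.
        SqlParameter p = cmd.Parameters.AddWithValue("@Rows", rows);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName = "dbo.ImportRowType";

        conn.Open();
        cmd.ExecuteNonQuery();
    }
}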

Related

Load .sql file to a DataSet or DataTable with C#

Is it possible to load a .sql file into a DataSet or DataTable without first executing it against a SQL Server database and then reading the values back out?
I know how to load the .sql file into SQL Server and then read out the values, but is there a way to skip this step? I don't have the option to set up a SQL Server instance on the PC at the moment.
The .sql file contains a CREATE TABLE statement and an INSERT INTO query with some data.
Greetz.
In-memory SQLite or SQL Server CE maybe? Depends on what's in your SQL file, and whether there are any SQL Server-specific statements.
In that case it pretty much boils down to:
an in-memory database (this would be my preferred option)
parsing the file directly using a custom adapter. Maybe it's a good idea for you to start an open source project, and ask for help once you come across hiccups ;-)
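For the in-memory option, a rough sketch, assuming the System.Data.SQLite package and a script that uses only syntax SQLite understands (the file and table names are placeholders):

using System.Data;
using System.Data.SQLite;
using System.IO;

string script = File.ReadAllText("data.sql");

using (var conn = new SQLiteConnection("Data Source=:memory:"))
{
    conn.Open();

    // Runs the CREATE TABLE and INSERT statements from the script.
    using (var cmd = new SQLiteCommand(script, conn))
    {
        cmd.ExecuteNonQuery();
    }

    // Read the populated table back into a DataTable.
    var table = new DataTable();
    using (var adapter = new SQLiteDataAdapter("SELECT * FROM MyTable", conn))
    {
        adapter.Fill(table);
    }
}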

Better way to perform bulk CRUD operations in ASP.NET

I am currently working on an ASP.NET application which allows a user to upload a CSV file and then read and update values in a SQL table. Typically there could be 500 records and 8 parameters.
Using SqlBulkCopy, the CSV file can be uploaded directly into a table, but this option requires writing all the logic in another stored procedure and calling it after the bulk copy.
If there are any other approaches I could follow to achieve this, please let me know.
Note: a table-valued parameter is not an option, since it's SQL Server 2005.
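For reference, the SqlBulkCopy-then-stored-procedure flow described above might look roughly like this; the staging table dbo.CsvStaging and the proc dbo.MergeCsvStaging are hypothetical names:

using System.Data;
using System.Data.SqlClient;

static void UploadAndMerge(DataTable csvRows, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // Bulk-load the parsed CSV rows into a staging table.
        using (var bcp = new SqlBulkCopy(conn))
        {
            bcp.DestinationTableName = "dbo.CsvStaging";
            bcp.WriteToServer(csvRows);
        }

        // Let a stored procedure apply the update logic server-side.
        using (var cmd = new SqlCommand("dbo.MergeCsvStaging", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.ExecuteNonQuery();
        }
    }
}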

How to input data into SQL Server for the first-time start of a C# application

I created a C# program using SQL Server 2008 Express.
There is some data that must exist in the database table initially, for the C# application to run correctly.
I would like to transfer this data into the table the first time the C# application executes.
But I don't want this record data hard-coded in my C# source.
How does SQL Server store its record data, and how can I transfer the initial data into the database?
I'd try not to rely on a database being existent for your application to work; you're coupling yourself to the database, which probably isn't a good idea. Could you perhaps execute a stored procedure from your application which would fill out the tables you rely on? That way your DB creation code wouldn't live in your application. This would, however, still mean your application is dependent on the database to function correctly.
I would recommend one of two plans, the latter being the cleaner.
Plan 1
Create a SQL script which will create and insert all the required data for the application to work, and place this in a stored procedure
When your application starts for the first time, it will check a configuration file to see if the program has run before; if not, it executes the stored procedure to create the required data
Alter an external configuration file (which could be password-protected) to indicate that the stored procedure has already been run; this could just be a simple bool
Each subsequent time the application runs, it will check whether it has run before, and won't execute the stored proc if it has (a sketch follows below)
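A rough sketch of that first-run check, assuming the flag lives in a plain appSettings entry; the key "SeedCompleted" and the proc dbo.SeedInitialData are made-up names, and the key is assumed to already exist in App.config:

using System.Configuration;
using System.Data;
using System.Data.SqlClient;

static void SeedIfFirstRun(string connectionString)
{
    bool seeded = bool.Parse(ConfigurationManager.AppSettings["SeedCompleted"] ?? "false");
    if (seeded) return;

    // First run: execute the seeding stored procedure.
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.SeedInitialData", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        conn.Open();
        cmd.ExecuteNonQuery();
    }

    // Persist the flag so subsequent runs skip the proc.
    Configuration cfg = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
    cfg.AppSettings.Settings["SeedCompleted"].Value = "true";
    cfg.Save(ConfigurationSaveMode.Modified);
}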
Plan 2
Create a SQL script which will create and insert all the required data for the application to work and place this in a stored procedure
Use an installer to deploy your application with a custom action to create the database, explained here
You definitely need to have the SQL insert/create script at least partially hard-coded somewhere: either in a SQL script sitting in your app's directory (not recommended, because the user might tamper with it) or in your application's resources (a bit more secure, as the user would need to hex-edit the EXE or DLL).
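If you go the embedded-resource route, executing the script could look something like the sketch below. The resource name is hypothetical, and note that GO separators are an SSMS/sqlcmd convention, not T-SQL, so a script containing them would need to be split into batches first:

using System.Data.SqlClient;
using System.IO;
using System.Reflection;

static void RunEmbeddedScript(string connectionString)
{
    // "MyApp.Scripts.Seed.sql" is a made-up embedded resource name.
    string script;
    using (Stream s = Assembly.GetExecutingAssembly()
                              .GetManifestResourceStream("MyApp.Scripts.Seed.sql"))
    using (var reader = new StreamReader(s))
    {
        script = reader.ReadToEnd();
    }

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(script, conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery(); // assumes the script contains no GO separators
    }
}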
Tried this? http://social.msdn.microsoft.com/Forums/en/adodotnetdataproviders/thread/43e8bc3a-1132-453b-b950-09427e970f31
I think your question is quite simple: you want to fill the tables before you run the program.
Simply open SQL Server Management Studio and connect with your instance name, then expand the Databases list and find the database you created. Expand that, and in the Tables tree find the table you want to add data to; right-click it and choose Edit.
Also, Redgate has a product for filling tables, SQL Data Generator, which you can use to fill your tables with plenty of data.

Bulk insert SQL Server millions of records

I have a Windows Service application that receives a stream of data with the following format
IDX|20120512|075659|00000002|3|AALI |Astra Agro Lestari Tbk. |0|ORDI_PREOPEN|12 |00000001550.00|00000001291.67|00001574745000|00001574745000|00500|XDS1BXO1| |00001574745000|ݤ
IDX|20120512|075659|00000022|3|ALMI |Alumindo Light Metal Industry Tbk. |0|ORDI |33 |00000001300.00|00000001300.00|00000308000000|00000308000000|00500|--U3---2| |00000308000000|õÄ
This data comes in millions of rows, in sequence (00000002 ... 00198562), and I have to parse it and insert it into a database table in that order.
My question is: what is the most efficient way to insert this data into my database? I have tried a simple method (open a SqlConnection, generate a string of SQL INSERT statements, and execute it using a SqlCommand), but this is taking too long.
I read that I can use BULK INSERT, but it has to read from a text file. Is it possible to use BULK INSERT in this scenario? (I have never used it before.)
Thank you
Update: I'm aware of SqlBulkCopy, but I thought it requires a DataTable first. Is that good for performance? If possible I want to insert directly from my data source to SQL Server without building an in-memory DataTable.
If you are writing this in C# you might want to look at the SqlBulkCopy class.
Lets you efficiently bulk load a SQL Server table with data from another source.
First, download the free LumenWorks.Framework.IO.Csv library.
Second, use code like this:
var sr = new StreamReader(yourStream);
var sbc = new SqlBulkCopy(connectionString);
sbc.DestinationTableName = "YourTable"; // the target table must be specified
// CsvReader(reader, hasHeaders, delimiter): the feed above has no headers and is pipe-delimited
sbc.WriteToServer(new LumenWorks.Framework.IO.Csv.CsvReader(sr, false, '|'));
Yeah, it is really that easy.
You can use SSIS (SQL Server Integration Services) to move data from a source data flow to a destination data flow.
The source can be a text file and the destination can be a SQL Server table, and the transfer executes in bulk-insert mode.

Copy a table from one database to another in C#

Using C# (VS2005), I need to copy a table from one database to another. Both database engines are SQL Server 2005. For the remote database (the source), I only have execute access to a stored procedure that returns the data I need to bring locally.
I have more control over the local database, as it's used by the ASP.NET application which needs a local copy of this remote table. We would like it local for easier lookups and joins with other tables, etc.
Could you please suggest an efficient method of copying this data to our local database?
The local table can be created with the same schema as the remote one, if it makes things simpler. The remote table has 9 columns, none of which are identity columns. There are approximately 5400 rows in the remote table, and this number grows by about 200 a year. So not a quickly changing table.
Perhaps SqlBulkCopy; use SqlCommand.ExecuteReader to get the reader that you pass to SqlBulkCopy.WriteToServer. This is the same as a bulk insert, so very quick. It should look something like this (untested):
using (SqlConnection connSource = new SqlConnection(csSource))
using (SqlCommand cmd = connSource.CreateCommand())
using (SqlBulkCopy bcp = new SqlBulkCopy(csDest))
{
    bcp.DestinationTableName = "SomeTable";
    cmd.CommandText = "myproc";
    cmd.CommandType = CommandType.StoredProcedure;
    connSource.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        bcp.WriteToServer(reader);
    }
}
The bulk copy feature of ADO.NET might help you; take a look at:
MSDN - Multiple Bulk Copy Operations (ADO.NET)
An example article
I would first look at using SQL Server Integration Services (SSIS, née Data Transformation Services (DTS)).
It is designed for moving/comparing/processing/transforming data between databases, and IIRC allows an arbitrary expression for the source. You would need it installed on your database server (that shouldn't be a problem; it is part of a default install).
Otherwise, as a code solution, given the small data size: pull all the data from the remote system into an internal structure, then look for the rows which don't exist locally and insert those (see the sketch below).
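A sketch of that compare-then-insert approach, reasonable at ~5,400 rows; the proc, table, and column names are made up:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static void SyncTable(string csRemote, string csLocal)
{
    // Pull all remote rows via the stored procedure.
    var remote = new DataTable();
    using (var conn = new SqlConnection(csRemote))
    using (var cmd = new SqlCommand("dbo.GetRemoteRows", conn) { CommandType = CommandType.StoredProcedure })
    using (var da = new SqlDataAdapter(cmd))
    {
        da.Fill(remote);
    }

    // Collect the keys that already exist locally.
    var localKeys = new HashSet<int>();
    using (var conn = new SqlConnection(csLocal))
    using (var cmd = new SqlCommand("SELECT KeyCol FROM LocalCopy", conn))
    {
        conn.Open();
        using (var r = cmd.ExecuteReader())
            while (r.Read())
                localKeys.Add(r.GetInt32(0));
    }

    // Insert only the rows that are missing locally.
    using (var conn = new SqlConnection(csLocal))
    {
        conn.Open();
        foreach (DataRow row in remote.Rows)
        {
            if (localKeys.Contains((int)row["KeyCol"]))
                continue;

            using (var ins = new SqlCommand(
                "INSERT INTO LocalCopy (KeyCol, Col2) VALUES (@k, @c)", conn))
            {
                ins.Parameters.AddWithValue("@k", row["KeyCol"]);
                ins.Parameters.AddWithValue("@c", row["Col2"]);
                ins.ExecuteNonQuery();
            }
        }
    }
}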
If you can avoid it, don't do this with a program. If you have any way of talking to someone who controls the source server, see if they will set up some sort of export of the data. If the data is as small as you say, then XML or CSV output would be 100x better than writing something in C# (or any language).
So let's assume they can't export. Still, avoid writing a program. You say you have more control over the destination: can you set up an SSIS package, or set up a linked server? If so, you'll have a much easier time migrating the data.
If you set up, at bare minimum, the source as a linked server, you could write a small T-SQL batch:
TRUNCATE TABLE DestTable;

INSERT INTO DestTable
SELECT * FROM [SourceServer].[SourceDatabase].[Schema].[Table];
This wouldn't be as nice as SSIS (where you get more visibility into what's happening), but the T-SQL above is pretty clear.
Since I would not take the programming route, the best solution I could give you, if you absolutely had to, would be:
Use the SqlClient namespace.
Create two SqlConnections and two SqlCommands, and get an instance of one SqlDataReader.
Iterate through the source reader, and execute the destination SqlCommand insert for each row it returns.
It'll be ugly, but it'll work.
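Spelled out, that two-connection loop might look like this; every name here is hypothetical:

using System.Data;
using System.Data.SqlClient;

// Row-by-row fallback: read from the source proc, insert each row
// into the destination. Slow but simple.
using (var src = new SqlConnection(csSource))
using (var dst = new SqlConnection(csDest))
using (var read = new SqlCommand("dbo.GetSourceRows", src) { CommandType = CommandType.StoredProcedure })
{
    src.Open();
    dst.Open();

    using (SqlDataReader r = read.ExecuteReader())
    {
        while (r.Read())
        {
            using (var ins = new SqlCommand(
                "INSERT INTO DestTable (Col1, Col2) VALUES (@c1, @c2)", dst))
            {
                ins.Parameters.AddWithValue("@c1", r["Col1"]);
                ins.Parameters.AddWithValue("@c2", r["Col2"]);
                ins.ExecuteNonQuery();
            }
        }
    }
}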
It doesn't seem to be a huge quantity of data you have to synchronize. Under the conditions you described (only a stored procedure to access the remote DB, and no way to get anything else), you can go with Marc Gravell's solution.
If the data can only grow and existing data cannot be changed, you can compare the record counts on the remote and local DBs to optimize the operation: if there is no change in the remote DB, there is no need to copy.
