SQL Server 2000 - Bulk insert from a dataset or C# collection - c#

I know SQL Server 2000 has a bulk insert. Does it support bulk insert from a C# collection, such as a dataset?
I need to insert 30 rows at a time, fairly regularly. I don't want to create 30 DB connections for this if I don't have to.

Have a look at SqlBulkCopy (which, according to forum posts, supports SQL Server 2000). It's easy to use: just provide it with a DataTable (or a data reader) and it will copy the rows from that source into your destination table.
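A minimal sketch of that approach (the connection string and table name are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkInsertExample
{
    // Copies all rows of an in-memory DataTable into an existing table
    // in one bulk operation over a single connection.
    static void BulkInsert(DataTable table, string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                // "dbo.MyTable" is a placeholder destination table name.
                bulkCopy.DestinationTableName = "dbo.MyTable";
                bulkCopy.WriteToServer(table);
            }
        }
    }
}
```

Note that SqlBulkCopy is part of .NET 2.0 and later; SQL Server 2000 only needs to be the destination server.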

You can insert using a DataSet in SQL 2000. I've never tried it myself, since I don't use DataSets, but http://www.dotnet247.com/247reference/msgs/3/16570.aspx has a good post on it:
(From the article)
Steps involved:
1. Create a SqlDataAdapter with the proper SELECT statement.
2. Create a DataSet and fill it with the SqlDataAdapter.
3. Add rows to the table in the DataSet (for all the actions you mentioned, like a radio button being selected or a check box being enabled).
4. Use the SqlCommandBuilder helper object to generate the update statements. The command builder is very easy to use: it takes just one call to the SqlCommandBuilder constructor.
5. Once you are done adding rows to the DataTable in the DataSet, call SqlDataAdapter.Update and pass the modified DataSet as a parameter. This automatically adds the rows from the DataSet to the database (if no database error occurs).
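The steps above might look like this in code (the connection string, table, and column names are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class CommandBuilderExample
{
    static void AddRows(string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            // Step 1: adapter with a SELECT statement.
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT Id, Name FROM dbo.MyTable", connection);
            // Step 4: the command builder derives the INSERT/UPDATE/DELETE
            // commands from the SELECT; one constructor call is all it takes.
            SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

            // Step 2: fill a DataSet.
            DataSet dataSet = new DataSet();
            adapter.Fill(dataSet, "MyTable");

            // Step 3: add rows to the in-memory table.
            DataRow row = dataSet.Tables["MyTable"].NewRow();
            row["Name"] = "example";
            dataSet.Tables["MyTable"].Rows.Add(row);

            // Step 5: push all pending changes in a single call.
            adapter.Update(dataSet, "MyTable");
        }
    }
}
```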
Have you considered XML?
Working with XML in SQL 2000 isn't as nice as in 2008, but it is still doable:
http://www.codeproject.com/KB/database/insxmldatasqlsvr.aspx
http://www.codeproject.com/KB/database/generic_OpenXml.aspx
http://support.microsoft.com/default.aspx?scid=kb;en-us;315968
Another option you could look at would be to:
1. Open a connection.
2. Iterate through the inserts.
3. Close the connection.
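For around 30 rows, that can be as simple as one open connection and one reused parameterized command (the names below are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class LoopInsertExample
{
    static void InsertRows(DataTable table, string connectionString)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open(); // one connection for all the inserts
            using (SqlCommand command = new SqlCommand(
                "INSERT INTO dbo.MyTable (Name) VALUES (@name)", connection))
            {
                command.Parameters.Add("@name", SqlDbType.VarChar, 50);
                foreach (DataRow row in table.Rows)
                {
                    // Reuse the same command, just swap the parameter value.
                    command.Parameters["@name"].Value = row["Name"];
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}
```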

Related

SQL/C# - Multi-stage query, massive dataset

I am using a huge dataset (>10M records * ~16k) on a locally-stored MySQL database.
The user will filter by, say, fields A/B/C, returning between 1 and 200k records. This query takes up to a minute.
With this set of data, I want to do further analysis in SQL; i.e. dynamically vary a further set of fields, say D/E/F, depending on settings in the UI, running these further queries only on the smaller dataset.
My question is, conceptually, how best to approach this in C#/MySQL:
Can I keep the original query 'open' on the MySQL server, and dynamically adjust that to suit?
Do I need to take the whole dataset from the original query into memory and then filter it further in C#?
Should I copy the relevant data into a temporary table, and perform queries on that table?

Copy C# datatable into mysql database table

I have created one DataTable in C# dynamically. I want to insert this whole DataTable in MySql database table.
I am using MySQL, so SqlBulkCopy is not an option for me.
I have tried MySqlBulkLoader, but it seems it only inserts data from a file.
I could iterate through every DataRow of the DataTable and insert each one into the database.
Is there another way I can achieve multiple inserts into the database?
Thanks for any help!
If the number of rows is small, you can simply create a multi-row insert statement:
INSERT INTO table VALUES (1,2,3), (4,5,6), (7,8,9);
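One way to build such a statement from a DataTable is to generate numbered parameters per row. This is a sketch assuming the MySql.Data package and placeholder table/column names:

```csharp
using System.Data;
using System.Text;
using MySql.Data.MySqlClient;

class MultiRowInsertExample
{
    // Builds a single parameterized multi-row INSERT for all rows of "table".
    static MySqlCommand BuildInsert(DataTable table)
    {
        StringBuilder sql = new StringBuilder("INSERT INTO mytable (a, b, c) VALUES ");
        MySqlCommand command = new MySqlCommand();
        for (int i = 0; i < table.Rows.Count; i++)
        {
            if (i > 0) sql.Append(", ");
            // One numbered parameter set per row: (@a0, @b0, @c0), (@a1, ...)
            sql.AppendFormat("(@a{0}, @b{0}, @c{0})", i);
            command.Parameters.AddWithValue("@a" + i, table.Rows[i]["a"]);
            command.Parameters.AddWithValue("@b" + i, table.Rows[i]["b"]);
            command.Parameters.AddWithValue("@c" + i, table.Rows[i]["c"]);
        }
        command.CommandText = sql.ToString();
        // Assign command.Connection and call ExecuteNonQuery() to run it.
        return command;
    }
}
```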
Another article at http://www.codeproject.com/Articles/19727/ADO-NET-Generic-Copy-Table-Data-Function may help.
If the number of rows is high, though, you can create a temporary text file and then use MySqlBulkLoader.
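A rough sketch of that route (CSV escaping is simplified; the table name is a placeholder):

```csharp
using System.Data;
using System.IO;
using MySql.Data.MySqlClient;

class BulkLoaderExample
{
    static void Load(DataTable table, string connectionString)
    {
        // Dump the DataTable to a temporary comma-delimited file.
        string path = Path.GetTempFileName();
        using (StreamWriter writer = new StreamWriter(path))
        {
            foreach (DataRow row in table.Rows)
                writer.WriteLine(string.Join(",", row.ItemArray));
        }

        // Load the file with MySqlBulkLoader (wraps LOAD DATA INFILE).
        using (MySqlConnection connection = new MySqlConnection(connectionString))
        {
            connection.Open();
            MySqlBulkLoader loader = new MySqlBulkLoader(connection)
            {
                TableName = "mytable",
                FileName = path,
                FieldTerminator = ","
            };
            loader.Load();
        }
        File.Delete(path);
    }
}
```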

sync data between datagrid and SQL Compact table

I have a project that is linked to a SQL Compact database (an .sdf file), and I need to properly transfer data between my DataGridView and a table within the database. This may not be how I should do it, but I've spent hours trying to parameterize a whole bunch of queries to do this row by row (without success). I've been reading article after article and maybe I'm searching for the wrong thing.
My grid is bound to tableNameBindingSource, so I cannot programmatically add rows to the grid. I'm looking for the easiest way to add hundreds of rows to my SQL table at once and wanted to use the DataGridView to accomplish that.
Can I (unbind the SQL table and) add the rows as needed to the DataGridView, then force its contents to become the contents of the database table? Is there some kind of translator? (I've used SqlDataAdapter to go from the SQL table to the DataGridView.)
As long as your grid view's source data has the same structure (columns, data types) as your SQL table, you should be able to use SqlBulkCopy to quickly copy the data.
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
I've used this to copy 100,000s of rows in a relatively small amount of time.
Check out this CodeProject and the link above:
http://www.codeproject.com/Articles/18418/Transferring-Data-Using-SqlBulkCopy
(edit) I just noticed you said SQL Compact. I'm not sure that SqlBulkCopy will work with the Compact edition. There is a CodePlex project, though, called SqlCeBulkCopy.
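If SqlCeBulkCopy mirrors the SqlBulkCopy API (as its project page suggests), usage would look roughly like this; the class and namespace come from that third-party library, so treat the details as assumptions:

```csharp
using System.Data;
using ErikEJ.SqlCe; // third-party SqlCeBulkCopy library (assumed namespace)

class SqlCeBulkCopyExample
{
    static void Copy(DataTable dataTable)
    {
        // The connection string points at the .sdf file.
        using (SqlCeBulkCopy bulkCopy = new SqlCeBulkCopy("Data Source=mydb.sdf"))
        {
            bulkCopy.DestinationTableName = "MyTable";
            bulkCopy.WriteToServer(dataTable);
        }
    }
}
```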
Alternatively, you can use a loop with an INSERT query:
SqlConnection cn = new SqlConnection(yourConnectionString);
cn.Open(); // open once, outside the loop
foreach (GridViewRow gvr in GridView1.Rows)
{
    Label lblname = (Label)gvr.FindControl("lblname");
    Label lbllogin = (Label)gvr.FindControl("lbllogin");
    Label lblemail = (Label)gvr.FindControl("lblemail");
    SqlCommand command = new SqlCommand("INSERT INTO Employee(Name,Login_Id,Email_Id) VALUES(@lblname,@lbllogin,@lblemail)", cn);
    command.Parameters.AddWithValue("@lblname", lblname.Text);
    command.Parameters.AddWithValue("@lbllogin", lbllogin.Text);
    command.Parameters.AddWithValue("@lblemail", lblemail.Text);
    command.ExecuteNonQuery();
}
cn.Close();
Put this code in a button click event (a save button or similar).

Insert data from DataTable to database table

What is the easiest and most efficient way to insert data from a DataTable into a SQL Server database table? For what it's worth, the DataTable and the database table have matching columns.
What I am doing is transferring data from one database to another (separate instances, so I will be using my application as the intermediate "receiver/sender" of data). I can easily populate a DataTable with a SqlDataAdapter.Fill() call from the source database table. But I'm trying to find the most efficient way to send that DataTable's data to the final destination database table.
Any suggestions/advice/opinions are much appreciated.
EDIT: the destination database table already exists.
You should take a look at the SqlBulkCopy class, particularly the overload of the WriteToServer method that takes a DataTable as a parameter.
If you want to be even more efficient, and you don't need to materialize the entire table into a DataSet (or you can process the contents in a forward-only manner as you move them), then use the overload of WriteToServer that takes an IDataReader: run your query using the ExecuteReader method on the SqlCommand class instead of using a SqlDataAdapter to load the entire table into memory.
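Streaming the transfer could look like this sketch (connection strings and table names are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class StreamingCopyExample
{
    static void Transfer(string sourceConnectionString, string destConnectionString)
    {
        using (SqlConnection source = new SqlConnection(sourceConnectionString))
        using (SqlConnection destination = new SqlConnection(destConnectionString))
        {
            source.Open();
            destination.Open();

            SqlCommand command = new SqlCommand("SELECT * FROM dbo.SourceTable", source);
            using (SqlDataReader reader = command.ExecuteReader())
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destination))
            {
                bulkCopy.DestinationTableName = "dbo.DestinationTable";
                // Rows stream from the reader into the bulk copy without
                // ever being materialized in a DataTable.
                bulkCopy.WriteToServer(reader);
            }
        }
    }
}
```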

Import DataSet into SQL Server 2008 Express

I have a very large DataSet containing about 160,000 records. If I loop through the DataSet and import every record individually, it can take about 20 minutes before the complete DataSet is imported into SQL Server.
Isn't there a faster way to import the DataSet into the database all at once?
The DataSet is created from a user-provided file that I process. I have one table, let's call it "ImportTable", containing about 14 columns. The columns correspond with the columns in the DataSet.
I use Visual Studio 2010 professional with c#.
Thanks in advance!
You should take a close look at the SqlBulkCopy class.
It's a C# component (no external app) that takes a DataSet or DataTable as input and copies it into SQL Server in bulk. It should be significantly faster than a row-by-agonizing-row (RBAR) insert operation.
Better yet: you don't even need to import your entire data set into memory - you can define a SqlDataReader on your base data, and pass that to SqlBulkCopy to read from the SqlDataReader and bulk insert into SQL Server.
You may want to take a look at the bcp command line utility. It lets you load data directly from a file into a table in the database. Depending on how the user generated file looks, you may need to re-format it, but if it has a simple delimited format you can probably use it as-is with bcp.
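For a comma-delimited file, an invocation might look like this (the server, database, table, and file names are placeholders):

```shell
# -S server, -T trusted connection, -c character mode, -t field terminator
bcp MyDatabase.dbo.ImportTable in userfile.csv -S myserver -T -c -t,
```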
You can turn the DataSet into XML with the DataSet.GetXml method and pass that as an input parameter to a stored procedure.
For example:
Create PROCEDURE dbo.MyInsertSP
(
@strXML varchar(1000)
)
AS
Begin
Declare @intPointer int
exec sp_xml_preparedocument @intPointer OUTPUT, @strXML
Insert into publishers
Select * from OpenXml(@intPointer,'/root/publisher',2)
With (pub_id char(4), pub_name varchar(40), city varchar(20),
state char(2), country varchar(20))
exec sp_xml_removedocument @intPointer
RETURN
End
Hope this makes sense.
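On the C# side, calling a procedure like the one sketched above could look like this (the connection string and names are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class XmlImportExample
{
    static void Import(DataSet dataSet, string connectionString)
    {
        // Serialize the whole DataSet to XML in one call.
        string xml = dataSet.GetXml();

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("dbo.MyInsertSP", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@strXML", xml);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```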
