Bulk Update in SQL Server from C#

I have to update multiple records in a SQL Server table from C#. Below are the steps I follow and the code I am using.
The code works, but the process takes much longer than expected.
I need a fast way to update 10,000 records, and I am not sure whether bulk copy works for updates.
I have seen other answers that bulk insert into a temp table and then update, but those use a single UPDATE statement, whereas here I need to update records in the database based on Excel data, which means looping over each Excel record. How can I make the update faster?
1) Read the Excel data and copy it into a DataTable:
string strDirectory = string.Empty;
strDirectory = System.IO.Directory.GetCurrentDirectory() + "\\" + "Filename.xlsx";
string connectionString = "Provider=Microsoft.ACE.OLEDB.12.0; Data Source=" + strDirectory + "; Extended Properties=\"Excel 12.0;HDR=YES;IMEX=1\"";
DataTable dt = new DataTable();
using (OleDbConnection conn = new OleDbConnection(connectionString))
{
    conn.Open();
    // Get the name of the first worksheet from the schema table
    DataTable schemaTable = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });
    DataRow schemaRow = schemaTable.Rows[0];
    string sheet = schemaRow["TABLE_NAME"].ToString();
    string query = "SELECT * FROM [" + sheet + "]";
    OleDbDataAdapter daexcel = new OleDbDataAdapter(query, conn);
    daexcel.Fill(dt);
    conn.Close();
}
2) Do some manipulation of the DataTable data, then update the table:
string strsqlst = string.Empty;
using (SqlConnection sqlConn = new SqlConnection(Connectionstring))
{
    sqlConn.Open();
    SqlCommand cmd;
    StringBuilder sb = new StringBuilder();
    sb.AppendLine("DataTable content:");
    foreach (DataRow row in dt.Rows)
    {
        if (row.ItemArray[0].ToString() == "")
            break;
        // One round trip per row - this is what makes the process slow
        strsqlst = "UPDATE table SET col1 = " + row.ItemArray[4].ToString() + ", col2 = " + row.ItemArray[5].ToString() + " WHERE <Condition>";
        cmd = new SqlCommand(strsqlst, sqlConn);
        cmd.CommandType = CommandType.Text;
        cmd.ExecuteNonQuery();
    }
    sqlConn.Close();
}

A SqlCommand can contain a whole SQL batch; it is not limited to a single statement. So you can create one large batch with 10,000 UPDATE statements, or divide it into, for example, 20 batches of 500 each.
In other words, you can create a single command with CommandText like this:
UPDATE [T] SET Col1='Value1', Col2='Value2' WHERE [Id] = 1;
...
UPDATE [T] SET Col1='Value999', Col2='Value1000' WHERE [Id] = 500;
That said, you should use parameters for all data values, to ensure SQL injection is not possible.
If you want to handle errors (updates failing due to invalid data) you will need something a bit more sophisticated.
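As a rough sketch only of that batching approach combined with parameters (the table name [MyTable], the column names and the [Id] key below are placeholders, and dt/sqlConn are the DataTable and open connection from the question):

const int batchSize = 500;
for (int offset = 0; offset < dt.Rows.Count; offset += batchSize)
{
    StringBuilder sql = new StringBuilder();
    using (SqlCommand cmd = new SqlCommand())
    {
        cmd.Connection = sqlConn;
        int rowsInBatch = Math.Min(batchSize, dt.Rows.Count - offset);
        for (int i = 0; i < rowsInBatch; i++)
        {
            DataRow row = dt.Rows[offset + i];
            // Unique parameter names per row, so one batch can hold many UPDATE statements
            sql.AppendLine("UPDATE [MyTable] SET col1 = @c1_" + i + ", col2 = @c2_" + i + " WHERE [Id] = @id_" + i + ";");
            cmd.Parameters.AddWithValue("@c1_" + i, row[4]);
            cmd.Parameters.AddWithValue("@c2_" + i, row[5]);
            cmd.Parameters.AddWithValue("@id_" + i, row[0]);
        }
        cmd.CommandText = sql.ToString();
        cmd.ExecuteNonQuery(); // one round trip for up to 500 updates
    }
}

Keep the batch small enough to stay under SQL Server's limit of 2,100 parameters per command (here 500 rows x 3 parameters = 1,500).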

Related

How to insert data from excel to datatable which has headers?

I am reading an Excel file that has around a million records. First I query my table (which has no records) to get a DataTable whose column names match those defined in my Excel sheet, using aliases.
var dal = new clsConn();
var sqlQuery = "SELECT FETAPE_THEIR_TRANDATE \"Date\" ,ISSUER Issuer, ISSU_BRAN Branch , STAN_NUMB STAN, TERMID TermID, ACQUIRER Acquirer,DEBIT_AMOUNT Debit,CREDIT_AMOUNT Credit,CARD_NUMB \"Card Number\" , DESCRIPTION Description FROM ALLTRANSACTIONS";
var returntable = dal.ReadData(sqlQuery);
DataRow ds = returntable.NewRow();
var dtExcelData = returntable;
So my DataTable looks like this.
Then I read the records from the Excel sheet:
OleDbConnection con = null;
if (ext == ".xls")
{
    con = new OleDbConnection(@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + filepath + ";Extended Properties=Excel 8.0;");
}
else if (ext == ".xlsx")
{
    con = new OleDbConnection(@"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filepath + ";Extended Properties=\"Excel 12.0;IMEX=2;HDR=NO;TypeGuessRows=0;ImportMixedTypes=Text\"");
}
con.Open();
dt = con.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
string getExcelSheetName = dt.Rows[0]["Table_Name"].ToString();
//OleDbCommand ExcelCommand = new OleDbCommand(@"SELECT * FROM [" + getExcelSheetName + @"]", con);
OleDbCommand ExcelCommand = new OleDbCommand("SELECT F1, F2, F3, F4, F5, F6, F7, F8, F9, F10 FROM [Sheet1$]", con);
OleDbDataAdapter ExcelAdapter = new OleDbDataAdapter(ExcelCommand);
try
{
    ExcelAdapter.Fill(dtExcelData); // Here I pass the DataTable I built previously
}
catch (Exception ex)
{
    //lblAlert2.CssClass = "message-error";
    //lblAlert2.Text = ex.Message;
}
It reads successfully and fills the DataTable, but it creates its own columns (F1 to F10) instead of using my predefined columns. How can I make the data land in the columns I already defined in my DataTable?
How can I stop it from creating the extra columns (F1, F2, ..., F10)?
Any workaround is appreciated, or please explain what I am doing wrong and how I can achieve this.
UPDATE :
My Excel file looks like this
The Microsoft.ACE.OLEDB.12.0 driver will handle both types of Excel spreadsheet using the same Extended Properties, i.e. "Excel 12.0" will open both .xls and .xlsx.
Leave HDR=NO, because OLEDB expects the headers in the first row, but yours are actually in row 11.
Sadly, "TypeGuessRows=0;ImportMixedTypes=Text" is completely ignored by Microsoft.ACE.OLEDB.12.0; you would have to play around with the registry (yuk). Change your IMEX=2 to IMEX=1 to ensure that mixed data types are handled as text.
Change back to "SELECT * FROM [Sheet1$]", and then I'm afraid you are going to have to handle the source data manually.
OleDbConnection con = new OleDbConnection(@"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filepath + ";Extended Properties=\"Excel 12.0;IMEX=1;HDR=NO\"");
con.Open();
DataTable dt = con.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
string getExcelSheetName = (string)dt.Rows[0]["Table_Name"];
DataTable xlWorksheet = new DataTable();
xlWorksheet.Load(new OleDbCommand("SELECT * FROM [" + getExcelSheetName + "]", con).ExecuteReader());
// More than 11 rows implies 11 header rows and at least 1 data row
if (xlWorksheet.Rows.Count > 11 && xlWorksheet.Columns.Count >= 10)
{
    for (int nRow = 11; nRow < xlWorksheet.Rows.Count; nRow++)
    {
        DataRow returnRow = returntable.NewRow();
        for (int nColumn = 0; nColumn < 10; nColumn++)
        {
            // Note: you will probably get conversion problems here that you will have to handle
            returnRow[nColumn] = xlWorksheet.Rows[nRow].ItemArray[nColumn];
        }
        returntable.Rows.Add(returnRow);
    }
}
I'm guessing you simply want to add the Excel data to your ALLTRANSACTIONS table? You don't say, but it seems the likely goal. If so, this is a terrible way to do it: you don't need to read the whole table into memory, append data, and then update the database. All you need to do is read the Excel file and insert the data into the Oracle table.
Some thoughts: your returntable will already contain data, so if you just want the structure of the table, add "WHERE ROWNUM = 0" to the SELECT statement. To add the data to your Oracle database you could 1) switch to the Oracle Data Provider (ODP) and use the OracleBulkCopy class, or 2) simply modify the code above to insert row by row as you read the data; as long as you don't have a LOT of data in your Excel spreadsheet, that will work just fine. Having said that, a million rows is a LOT, so it is perhaps not the best option. You will also need to validate the input, as Excel is not the best data source really.
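As a rough, untested sketch of option 1 (OracleBulkCopy ships with ODP.NET; the connection string, the dtExcelData table and the column mappings below are assumptions based on the question):

// requires: using Oracle.DataAccess.Client; (ODP.NET)
using (OracleConnection oraConn = new OracleConnection(oracleConnectionString))
{
    oraConn.Open();
    using (OracleBulkCopy bulk = new OracleBulkCopy(oraConn))
    {
        bulk.DestinationTableName = "ALLTRANSACTIONS";
        bulk.BatchSize = 5000; // tune to your data volume
        // Map the DataTable columns (your aliases) back to the real table columns
        bulk.ColumnMappings.Add("Date", "FETAPE_THEIR_TRANDATE");
        bulk.ColumnMappings.Add("Issuer", "ISSUER");
        // ... remaining column mappings ...
        bulk.WriteToServer(dtExcelData); // loads all rows in batches, no per-row INSERTs
    }
}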

C# sql IF NOT EXIST statement not working?

Not sure if this is written correctly, but it looks correct to me. I want to update a record if the ID already exists and insert it if not.
DataSet ds = new DataSet();
ds.ReadXml(XDocument.Load(Application.StartupPath + @"\xml1.xml").CreateReader());
using (var conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" + Application.StartupPath + "\\Database3.mdb"))
{
    conn.Open();
    // make two commands here
    var commInsert = new OleDbCommand("INSERT INTO Table1 (description, active) VALUES (@iq_question, @active);", conn);
    var commUpdate = new OleDbCommand("UPDATE Table1 SET description=@iq_question, active=@active WHERE ID=@question_id;", conn);
    // here add your parameters with no value
    //string question_id = row[0].ToString();
    //string iq_question = row[1].ToString();
    //string active = row[4].ToString();
    commInsert.Parameters.Add(new OleDbParameter("@iq_question", OleDbType.VarChar));
    commInsert.Parameters.Add(new OleDbParameter("@active", OleDbType.VarChar));
    commUpdate.Parameters.Add(new OleDbParameter("@question_id", OleDbType.Integer)); // Access AutoNumber columns map to Integer
    commUpdate.Parameters.Add(new OleDbParameter("@iq_question", OleDbType.Text));
    commUpdate.Parameters.Add(new OleDbParameter("@active", OleDbType.Text));
    foreach (DataTable table in ds.Tables)
    {
        foreach (DataRow row in table.Rows)
        {
            // here only reset the values
            commUpdate.Parameters["@question_id"].Value = row[0].ToString();
            commUpdate.Parameters["@iq_question"].Value = row[1].ToString();
            commUpdate.Parameters["@active"].Value = row[4].ToString();
            int recs = commUpdate.ExecuteNonQuery();
            if (recs < 1) // when no records updated do insert
            {
                commInsert.Parameters["@iq_question"].Value = row[1].ToString();
                commInsert.Parameters["@active"].Value = row[4].ToString();
                commInsert.ExecuteNonQuery();
            }
        }
    }
    commInsert.Dispose();
    commUpdate.Dispose();
    conn.Close();
}
System.Windows.Forms.MessageBox.Show("Updated Latest Data Was Successful");
I either get an error on the insert saying it will create duplicate content, or it creates more rows with different data. Say I should be getting 10 rows from the XML file: the first time I run it I get the 10 rows with the correct data, but if I run it again I end up with 10 more (20 total) and the last 10 rows show different data. I don't think I am identifying the rows in the XML file correctly, and I need to do some research on that part.
There is no EXISTS for MS Access. The engine is much more primitive than SQL Server. See here: Microsoft Access SQL. I think what you can do is:
myCommand.CommandText = "UPDATE Table1 SET description=@iq_question, active=@active WHERE ID=@currentRow";
......
int recs = myCommand.ExecuteNonQuery();
if (recs < 1) // when no records updated do insert
{
myCommand.Parameters.Clear();
myCommand.CommandText = "Insert INTO Table1 VALUES(#iq_question,#active)";
.....
}
This is still two statements, but you save some coding by not doing a SELECT first, because ExecuteNonQuery tells you whether you updated anything.
Another thing: your code is a bit inefficient. You have a nested loop in which you can reuse the same command and connection. You can do this:
using (var conn = new OleDbConnection(.......))
{
    conn.Open();
    // make two commands here
    var commInsert = new OleDbCommand(.....);
    var commUpdate = new OleDbCommand(.....);
    // here add your parameters with no value
    commInsert.Parameters.Add(new OleDbParameter(....));
    .......
    foreach (....)
    {
        foreach (....)
        {
            // here only reset the values
            commUpdate.Parameters[0].Value = ...
            ...
            int recs = commUpdate.ExecuteNonQuery();
            if (recs < 1) // when no records updated do insert
            {
                commInsert.Parameters[0].Value = iq_question;
                .....
            }
        }
    }
    commInsert.Dispose();
    commUpdate.Dispose();
}
You can also use nested usings for the commands. Just resetting the parameter values, rather than recreating the commands, will be more efficient than what you do right now.

C# and SQLCE Database Not Updating

OK, I have been having a problem the last few days with my database not updating. I can read the data fine, and I'm not getting any exceptions either. I try to update the database and then read the values again (during the same run), and they still hold the original values, so it doesn't seem to be an issue with the database being copied to another folder (I'm using Copy if newer, yet neither database is being updated).
Here is the code I'm using. As you can see I tried a few different approaches, none of which worked yet.
public void UpdateDatabaseInStock(string itemName, string tableName)
{
    DataSet data = new DataSet("Items");
    int val;
    // get the file path to the database as a string
    string dbfile =
        new System.IO.FileInfo(System.Reflection.Assembly.GetExecutingAssembly().Location).DirectoryName +
        "\\Database\\GameData.sdf";
    // connect to the database
    using (SqlCeConnection cntn = new SqlCeConnection("Data Source=" + dbfile))
    {
        // create an adapter to pull all data from the table
        using (SqlCeDataAdapter adpt = new SqlCeDataAdapter
            ("SELECT * FROM " + tableName + " WHERE Name LIKE '%" + itemName + "%'", cntn))
        {
            // put the data into a DataSet
            adpt.Fill(data);
            cntn.Close();
        }
        // fill the data from the Items table into a DataTable to return.
        DataTable itemTable = data.Tables[0];
        DataRow a = itemTable.Rows[0];
        val = (short)a.ItemArray[3] - 1;
        dbfile = "";
        data.Dispose();
        itemTable.Dispose();
        SqlCeCommand cmd = new SqlCeCommand();
        cmd.Connection = cntn;
        cntn.Open();
        cmd.CommandType = CommandType.Text;
        cmd.CommandText = "UPDATE " + tableName + " SET [In Stock] = @Value WHERE [Name] = '@ItemName'";
        //cmd.Parameters.Add("@Value", SqlDbType.SmallInt);
        //cmd.Parameters["@Value"].Value = val;
        //cmd.Parameters.Add("@ItemName", SqlDbType.NChar, 75);
        //cmd.Parameters["@ItemName"].Value = itemName;
        cmd.Parameters.AddWithValue("@Value", val);
        cmd.Parameters.AddWithValue("@ItemName", itemName);
        cmd.ExecuteNonQuery();
        // close the connection
        cntn.Close();
        cmd.Dispose();
    }
}
Any ideas to get it to actually update?
Just a hunch (can't corroborate this on msdn): could it be that using nchar(75) adds spaces to the parameter, thereby causing the WHERE clause to fail?

how to reference columns in sql server connection in C#

I'm trying to translate some Perl code into C# and I'm having some trouble with the following.
After establishing a SQL Server connection and executing the SELECT statement, how do I reference the different elements in the result columns? For example, in Perl it looks like:
my $dbh = DBI -> connect( NAME, USR, PWD )
or die "Failed to connect to database: " . DBI->message;
my $dbname = DB_NAME;
my $dbschema = DB_SCHEMA;
my $sql = qq{select a,b,c,d,e,f,g,h,i,...
from $dbname.$dbschema.package p
join $dbname.$dbschema.package_download pd on p.package_id = pd.package_id
join $dbname.$dbschema.download d on pd.download_id = d.download_id
where p.package_name = '$package'
--and ds.server_address like 'tcp/ip'
order by a,b,c,d,..};
my $sth = $dbh -> prepare( $sql )
or die "Failed to prepare statement: " . $dbh->message;
$sth -> execute()
or die "Failed to execute statement: " . $sth->message;
#now to go through each row in result table
while ( @data = $sth->fetchrow_array() )
{
print "$data[0]";
# If source server FTP is not already open, make new FTP
if ( $data[0] != $src_id )
{
if ( $src_ftp )
{ $src_ftp -> quit; }
$src_ftp = make_ftp( $data[1], $data[2], $data[3], $data[18], $data[19], $data[20] );
$src_id = $data[0];
}
}
So far I've got it down to:
string db = NAME;
string myConnectionString = "Data Source=ServerName;" + "Initial Catalog=" + db + ";User Id=" + ODBC_USR + ";Password=" + PWD;
SqlConnection myConnection = new SqlConnection(myConnectionString);
string myInsertQuery = @"select a,b,c,d,e,f,g,h,i,...
    from $dbname.$dbschema.package p
    join $dbname.$dbschema.package_download pd on p.package_id = pd.package_id
    join $dbname.$dbschema.download d on pd.download_id = d.download_id
    where p.package_name = '$package'
    --and ds.server_address like 'tcp/ip'
    order by a,b,c,d,..";
SqlCommand myCommand = new SqlCommand(myInsertQuery);
myCommand.Connection = myConnection;
myConnection.Open();
myCommand.ExecuteNonQuery();
myCommand.Connection.Close();
but how do I reference the columns like data[0] and data[1] in C#? Sorry, I'm new to both languages so my background is severely lacking. Thanks!
You can reference a column directly by its column name or by its numeric order (the first column is 0), through a DataTable, DataSet, DataReader or a specific DataRow.
For the sake of example I'll use a DataTable here, which I will call dt. Say we want to reference the first row; you would reference it with the following syntax/format:
dt.Rows[RowNumber]["ColumnName or ColumnNumber"].ToString();
For example:
dt.Rows[0]["a"].ToString();
Or by number, the first column being 0:
dt.Rows[0][0].ToString();
And use parameters, by the way, because without them the query is susceptible to SQL injection. Here's more complete code below:
string db = NAME;
string myConnectionString = "Data Source=ServerName;" + "Initial Catalog=" + db + ";User Id=" + ODBC_USR + ";Password=" + PWD;
using (SqlConnection connection = new SqlConnection(myConnectionString))
{
    string mySelectQuery = @"SELECT a,b,c,d,e,f,g,h,i,...
        FROM package p
        JOIN package_download pd ON p.package_id = pd.package_id
        JOIN download d ON pd.download_id = d.download_id
        WHERE p.package_name = @PackageName
        AND ds.server_address LIKE 'tcp/ip%'
        ORDER BY a,b,c,d";
    try
    {
        connection.Open();
        using (SqlDataAdapter da = new SqlDataAdapter(mySelectQuery, connection))
        {
            da.SelectCommand.Parameters.AddWithValue("@PackageName", txtPackage.Text);
            DataTable dt = new DataTable();
            da.Fill(dt);
            if (dt.Rows.Count > 0) // Make sure there is something in your DataTable
            {
                string aVal = dt.Rows[0]["a"].ToString();
                string bVal = dt.Rows[0]["b"].ToString();
                // You'll be the one to fill up the rest
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
I changed your LIKE 'tcp/ip' to LIKE 'tcp/ip%' by the way, which is the more appropriate way of using LIKE.
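If you want to walk the rows one at a time, the way the Perl fetchrow_array loop does, a DataReader (one of the options mentioned above) is the closer analogue. A minimal sketch, assuming the same connection string and query as above and a package variable holding the package name:

using (SqlConnection connection = new SqlConnection(myConnectionString))
using (SqlCommand cmd = new SqlCommand(mySelectQuery, connection))
{
    cmd.Parameters.AddWithValue("@PackageName", package);
    connection.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Columns by ordinal, like $data[0], or by the alias used in the SELECT
            string first = reader[0].ToString();
            string second = reader["b"].ToString();
        }
    }
}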
You can use an ADO.NET Entity Data Model to reference the tables in your SQL Server database. I don't know if that is exactly what you're asking, but it may help, because directly referencing SQL Server tables is not possible as far as I know.

SQLite, Copy DataSet / DataTable to DataBase file

I have filled a DataSet with a table that was created from another database file. The table does NOT exist in the database file I want to copy it to.
Now I want to save all those records (the DataTable) to a newly created SQLite database file.
How can I do that?
Also, I would really like to avoid loops if possible.
The best answer is by me :) so I'll share it. It uses a loop, but it writes 100k entries in 2-3 seconds.
int n = 0;
using (DbTransaction dbTrans = kaupykliuduomConn.BeginTransaction())
{
    downloadas.Visible = true; // my progress bar
    downloadas.Maximum = dataSet1.Tables["duomenys"].Rows.Count;
    using (DbCommand cmd = kaupykliuduomConn.CreateCommand())
    {
        cmd.CommandText = "INSERT INTO duomenys(Barkodas, Preke, kiekis) VALUES(?,?,?)";
        DbParameter Field1 = cmd.CreateParameter();
        DbParameter Field2 = cmd.CreateParameter();
        DbParameter Field3 = cmd.CreateParameter();
        cmd.Parameters.Add(Field1);
        cmd.Parameters.Add(Field2);
        cmd.Parameters.Add(Field3);
        while (n != dataSet1.Tables["duomenys"].Rows.Count)
        {
            Field1.Value = dataSet1.Tables["duomenys"].Rows[n]["Barkodas"].ToString();
            Field2.Value = dataSet1.Tables["duomenys"].Rows[n]["Preke"].ToString();
            Field3.Value = dataSet1.Tables["duomenys"].Rows[n]["kiekis"].ToString();
            downloadas.Value = n;
            n++;
            cmd.ExecuteNonQuery();
        }
    }
    dbTrans.Commit();
}
In this case dataSet1.Tables["duomenys"] is already filled with all the data I need to transfer to the other database. I used a loop to fill the DataSet too.
When you load the DataTable from the source database, set the AcceptChangesDuringFill property of the data adapter to false, so that loaded records are kept in the Added state (assuming that the source database is SQL Server)
var sqlAdapter = new SqlDataAdapter("SELECT * FROM the_table", sqlConnection);
DataTable table = new DataTable();
sqlAdapter.AcceptChangesDuringFill = false;
sqlAdapter.Fill(table);
Create the table in the SQLite database by executing the CREATE TABLE statement directly with SQLiteCommand.ExecuteNonQuery.
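For example, a minimal sketch of that step (the table and column definitions below are placeholders; they must match the schema of the table you loaded from the source database):

string createSql = "CREATE TABLE IF NOT EXISTS the_table (Id INTEGER PRIMARY KEY, Col1 TEXT, Col2 INTEGER)";
using (var createCmd = new SQLiteCommand(createSql, sqliteConnection))
{
    createCmd.ExecuteNonQuery(); // the target table must exist before the adapter Update below
}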
Create a new DataAdapter for the SQLite database connection, and use it to Update the db:
var sqliteAdapter = new SQLiteDataAdapter("SELECT * FROM the_table", sqliteConnection);
var cmdBuilder = new SQLiteCommandBuilder(sqliteAdapter);
sqliteAdapter.Update(table);
If the source and target tables have the same column names and compatible types, it should work fine...
Importing SQL data to SQLite this way can take a long time; when you are importing millions of rows, it will take a lot of time. So the shortest and easiest way is to fetch the data from the SQL database into a DataTable and then insert all of its rows into the SQLite database.
public bool ImportDataToSQLiteDatabase(string Proc, string SQLiteDatabase, params object[] obj)
{
    DataTable result = null;
    SqlConnection conn = null;
    SqlCommand cmd = null;
    try
    {
        result = new DataTable();
        using (conn = new SqlConnection(ConStr))
        {
            using (cmd = CreateCommand(Proc, CommandType.StoredProcedure, obj))
            {
                cmd.Connection = conn;
                conn.Open();
                result.Load(cmd.ExecuteReader());
            }
        }
        using (SQLiteConnection con = new SQLiteConnection(string.Format("Data Source={0};Version=3;New=False;Compress=True;Max Pool Size=100;", SQLiteDatabase)))
        {
            con.Open();
            using (SQLiteTransaction transaction = con.BeginTransaction())
            {
                foreach (DataRow row in result.Rows)
                {
                    using (SQLiteCommand sqlitecommand = new SQLiteCommand("insert into table(fh,ch,mt,pn) values ('" + Convert.ToString(row[0]) + "','" + Convert.ToString(row[1]) + "','"
                        + Convert.ToString(row[2]) + "','" + Convert.ToString(row[3]) + "')", con))
                    {
                        sqlitecommand.ExecuteNonQuery();
                    }
                }
                transaction.Commit();
                new General().WriteApplicationLog("Data successfully imported.");
                return true;
            }
        }
    }
    catch (Exception ex)
    {
        result = null;
        return false;
    }
    finally
    {
        if (conn != null && conn.State == ConnectionState.Open)
            conn.Close();
    }
}
It takes very little time compared to the answers given above.
