SqlDataAdapter fails to get row after insert - C#

I'm trying to use SqlDataAdapter to insert a row, and then immediately get that same row back. I followed the advice in this post and used SCOPE_IDENTITY, but it doesn't work. Here is my code:
using (var conn = _db.OpenConnection())
{
    var sqlQuery = "SELECT * FROM RotationItem WHERE [RotationItemId] = SCOPE_IDENTITY()";
    var adapter = new SqlDataAdapter(sqlQuery, conn);
    var builder = new SqlCommandBuilder(adapter);
    var dataSet = new DataSet();
    adapter.Fill(dataSet);
    Debug.WriteLine("First fill, rows " + dataSet.Tables[0].Rows.Count);
    var table = dataSet.Tables[0];
    table.Rows.InsertAt(table.NewRow(), 0);
    CopyJsonToRow(table, 0, item);
    if (adapter.Update(dataSet) != 1)
    {
        throw new InvalidOperationException("Insert failed");
    }
    // After insert, fetch the new record
    dataSet.Clear();
    adapter.Fill(dataSet);
    Debug.WriteLine("Second fill, rows " + dataSet.Tables[0].Rows.Count);
}
My output is:
First fill, rows 0
Second fill, rows 0 <== this is NOT what I expect
Why does the second fill fail? Shouldn't it get the row that I just inserted?!
I am not using any transactions. The definition of the table is below:
CREATE TABLE [dbo].[RotationItem] (
    [RotationItemId] INT NOT NULL IDENTITY(1,1),
    [RotationScheduleId] INT NOT NULL,
    [QuestionnaireId] INT NOT NULL,
    [Order] INT NOT NULL,
    PRIMARY KEY CLUSTERED ([RotationItemId] ASC)
);

Here's what I found after several hours of messing around:
- You should not use IDENT_CURRENT, because it is unreliable under concurrent inserts.
- You cannot use SCOPE_IDENTITY after SqlDataAdapter.Update (as in the OP), because Update closes the scope.
- When you call new SqlCommandBuilder(adapter), it does secret voodoo to your adapter that makes it impossible to customize. In particular, any changes you make to the InsertCommand are ignored.
Here is my clever workaround (some might say horrible hack). It works perfectly, even if I hammer it with many concurrent clients.
using (var conn = _db.OpenConnection())
{
    var sqlQuery = "SELECT * FROM RotationItem WHERE [RotationItemId] = SCOPE_IDENTITY()";
    // Create a dummy adapter, so we can generate an appropriate INSERT statement
    var dummy = new SqlDataAdapter(sqlQuery, conn);
    var insert = new SqlCommandBuilder(dummy).GetInsertCommand();
    // Append the SELECT to the end of the INSERT command,
    // and set a flag to copy the result back into the dataSet table row
    insert.UpdatedRowSource = UpdateRowSource.FirstReturnedRecord;
    insert.CommandText += ";" + sqlQuery;
    // Now proceed as usual...
    var adapter = new SqlDataAdapter(sqlQuery, conn);
    adapter.InsertCommand = insert;
    var dataSet = new DataSet();
    adapter.Fill(dataSet);
    Debug.WriteLine("First fill, rows " + dataSet.Tables[0].Rows.Count);
    var table = dataSet.Tables[0];
    table.Rows.InsertAt(table.NewRow(), 0);
    CopyJsonToRow(table, 0, item);
    if (adapter.Update(dataSet) != 1)
    {
        throw new InvalidOperationException("Insert failed");
    }
    // After insert, the table AUTOMATICALLY has the new row ID (and any other computed columns)
    Debug.WriteLine("The new id is " + table.Rows[0].ItemArray[0]);
}
The negative point of this hack is that I need two SqlDataAdapters, one of them a dummy just to feed the SqlCommandBuilder. Also, using string concatenation to glue two SQL queries together is very dodgy; I'm not sure the internal security audit will allow it, because of the injection issue. Anyone have a better idea?
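One possible alternative (a sketch only, untested; it reuses the column names from the table definition above): skip the SqlCommandBuilder entirely and write the InsertCommand by hand with a SCOPE_IDENTITY() output parameter. UpdateRowSource.OutputParameters copies the generated key back into the inserted row, and no SQL strings are concatenated at runtime.

using (var conn = _db.OpenConnection())
{
    var adapter = new SqlDataAdapter("SELECT * FROM RotationItem", conn);
    // Hand-written INSERT; the identity comes back through an output parameter.
    var insert = new SqlCommand(
        "INSERT INTO RotationItem ([RotationScheduleId], [QuestionnaireId], [Order]) " +
        "VALUES (@schedule, @questionnaire, @order); SET @id = SCOPE_IDENTITY();", conn);
    insert.Parameters.Add("@schedule", SqlDbType.Int, 0, "RotationScheduleId");
    insert.Parameters.Add("@questionnaire", SqlDbType.Int, 0, "QuestionnaireId");
    insert.Parameters.Add("@order", SqlDbType.Int, 0, "Order");
    var idParam = insert.Parameters.Add("@id", SqlDbType.Int, 0, "RotationItemId");
    idParam.Direction = ParameterDirection.Output;
    // Copy the output parameter back into the inserted DataRow after Update().
    insert.UpdatedRowSource = UpdateRowSource.OutputParameters;
    adapter.InsertCommand = insert;
    // ... Fill, add the new row, and call adapter.Update as before.
}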

Related

How to import only non-existing rows from CSV and update existing rows in a SQL database (C#)

With the help of the Sylvan.Data.Csv package I'm now able to apply the schema from the SQL table to the CSV. But a problem arises when I want to check whether a row from the CSV already exists in the SQL database: if it does, it needs to be updated, and if it does not, only the non-existing rows should be imported. With a bulk copy alone this is not possible.
I've got the following code:
static void LoadTableCsv(SqlConnection conn, string tableName, string csvFile)
{
    // read the column schema of the target table
    var cmd = conn.CreateCommand();
    conn.Open();
    cmd.CommandText = $"select top 0 * from {tableName}"; // beware of sql injection
    var reader = cmd.ExecuteReader();
    var colSchema = reader.GetColumnSchema();
    reader.Close();
    // apply the column schema to the csv reader.
    var csvSchema = new CsvSchema(colSchema);
    var csvOpts = new CsvDataReaderOptions { Schema = csvSchema };
    using var csv = CsvDataReader.Create(csvFile, csvOpts);
    // Initialize SqlCommand for checking if the record already exists.
    using var checkCommand = new SqlCommand($"SELECT COUNT(*) FROM {tableName} WHERE TicketID = @value", conn);
    checkCommand.Parameters.Add("@value", SqlDbType.Int, 10, "TicketID");
    using var bulkCopy = new SqlBulkCopy(conn);
    bulkCopy.DestinationTableName = tableName;
    bulkCopy.EnableStreaming = true;
    // Iterate through the records in the CSV file.
    while (csv.Read())
    {
        // Set the value of the "@value" parameter
        checkCommand.Parameters["@value"].Value = csv["TicketID"].ToString();
        // Execute the check command to see if the record already exists.
        var checkResult = (int)checkCommand.ExecuteScalar();
        if (checkResult == 0)
        {
            // The record does not exist; write it to the SQL database using SqlBulkCopy.
            bulkCopy.WriteToServer(new[] { csv });
        }
        else
        {
            // The record already exists; update it using an UPDATE statement.
            using var updateCommand = new SqlCommand($"UPDATE {tableName} SET Column1 = @col1, Column2 = @col2, Column3 = @col3, Column4 = @col4, Column5 = @col5, Column6 = @col6 WHERE TicketID = @value", conn);
            // Add parameters for each column you want to update, using the names and types of the columns in the target table.
            updateCommand.Parameters.Add("@col1", SqlDbType.Int, 10, "TicketID");
            updateCommand.Parameters.Add("@col2", SqlDbType.NVarChar, 50, "TicketTitle");
            updateCommand.Parameters.Add("@col3", SqlDbType.NVarChar, 50, "TicketStatus");
            updateCommand.Parameters.Add("@col4", SqlDbType.NVarChar, 50, "CustomerName");
            updateCommand.Parameters.Add("@col5", SqlDbType.NVarChar, 50, "TechnicianFullName");
            updateCommand.Parameters.Add("@col6", SqlDbType.DateTime, 50, "TicketResolvedDate");
            updateCommand.Parameters.Add("@value", SqlDbType.Int, 10, "TicketID");
            // Set the values of the parameters to the values in the current row of the CSV file.
            updateCommand.Parameters["@col1"].Value = int.Parse(csv["TicketID"].ToString());
            updateCommand.Parameters["@col2"].Value = csv["TicketTitle"].ToString();
            updateCommand.Parameters["@col3"].Value = csv["TicketStatus"].ToString();
            updateCommand.Parameters["@col4"].Value = csv["CustomerName"].ToString();
            updateCommand.Parameters["@col5"].Value = csv["TechnicianFullName"].ToString();
            updateCommand.Parameters["@col6"].Value = DateTime.Parse(csv["TicketResolvedDate"].ToString());
            updateCommand.Parameters["@value"].Value = int.Parse(csv["TicketID"].ToString());
            // Execute the update command.
            updateCommand.ExecuteNonQuery();
        }
    }
    conn.Close();
}
But this gives me an error, because SqlBulkCopy can't write just one data row at a time this way.
Siggermannen's comment was the correct suggestion: bulk load all of the data into a temp table, then use SQL command(s) to merge the data from the temp table into the destination table. Ideally, you'd do this with a T-SQL MERGE statement; you could also use separate UPDATE and INSERT statements. This requires knowledge of the table columns to create the commands that merge the data. You can do this dynamically, by querying the INFORMATION_SCHEMA views to determine the tables' columns and keys and using that to construct the merge statements. Or, if you know the schema at compile time, you can hard-code the statements, which will be significantly easier to develop and test.
using Sylvan.Data.Csv;
using System.Data.SqlClient;

static void LoadTableCsv(SqlConnection conn, string tableName, string csvFile)
{
    // read the column schema of the target table
    var cmd = conn.CreateCommand();
    cmd.CommandText = $"select top 0 * from {tableName}"; // beware of sql injection
    var reader = cmd.ExecuteReader();
    var colSchema = reader.GetColumnSchema();
    reader.Close();
    // create a temp table to load the data into,
    // using the destination table as a template
    cmd.CommandText = $"select top 0 * into #load from {tableName}";
    cmd.ExecuteNonQuery();
    // apply the column schema to the csv reader.
    var csvSchema = new CsvSchema(colSchema);
    var csvOpts = new CsvDataReaderOptions { Schema = csvSchema };
    using var csv = CsvDataReader.Create(csvFile, csvOpts);
    // push *all* data into the temp table
    using var bulkCopy = new SqlBulkCopy(conn);
    bulkCopy.DestinationTableName = "#load";
    bulkCopy.EnableStreaming = true;
    bulkCopy.WriteToServer(csv);
    // use sql commands to "MERGE" the data into the destination table
    cmd.CommandText = $"""
        insert into {tableName}
        select * from #load l
        where not exists (select * from {tableName} d where d.Id = l.Id)
        """;
    cmd.ExecuteNonQuery();
}
Doing INSERT/UPDATE statements in a loop is exactly what you're trying to avoid. It produces "chatty" communication with the database, where performance becomes dominated by the overhead of each operation. Instead, you want to keep your operations as "chunky" as possible, which SqlBulkCopy and MERGE provide.
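For the MERGE variant mentioned above, the command text could look roughly like this (a sketch that assumes TicketID is the key and borrows the column names from the question; if TicketID is an identity column, you would drop it from the insert list or enable IDENTITY_INSERT):

cmd.CommandText = $"""
    merge {tableName} as d
    using #load as l on d.TicketID = l.TicketID
    when matched then update set
        d.TicketTitle = l.TicketTitle,
        d.TicketStatus = l.TicketStatus,
        d.CustomerName = l.CustomerName,
        d.TechnicianFullName = l.TechnicianFullName,
        d.TicketResolvedDate = l.TicketResolvedDate
    when not matched then insert
        (TicketID, TicketTitle, TicketStatus, CustomerName, TechnicianFullName, TicketResolvedDate)
        values (l.TicketID, l.TicketTitle, l.TicketStatus, l.CustomerName, l.TechnicianFullName, l.TicketResolvedDate);
    """;
cmd.ExecuteNonQuery();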

Bulk Update in SQL Server from C#

I have to update multiple records in a SQL Server table from C#. Below are the steps I have to follow, and below is the code.
The code is working, but the process is taking much longer than expected.
I need a quick way to update 10,000 records; I'm not sure whether bulk copy would work for updates.
I have seen other answers that bulk insert into a temp table and then update, but that update is a single statement, whereas here I need to update the records in the DB based on Excel data, and for that I have to loop over each Excel record. So how can I achieve a faster update?
1) Read the Excel data and copy it into a DataTable:
string strDirectory = System.IO.Directory.GetCurrentDirectory() + "\\" + "Filename.xlsx";
string connectionString = "Provider=Microsoft.ACE.OLEDB.12.0; Data Source=" + strDirectory + "; Extended Properties=\"Excel 12.0;HDR=YES;IMEX=1\"";
DataTable dt = new DataTable();
using (OleDbConnection conn = new OleDbConnection(connectionString))
{
    conn.Open();
    DataTable schemaTable = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });
    DataRow schemaRow = schemaTable.Rows[0];
    string sheet = schemaRow["TABLE_NAME"].ToString();
    string query = "SELECT * FROM [" + sheet + "]";
    OleDbDataAdapter daexcel = new OleDbDataAdapter(query, conn);
    daexcel.Fill(dt);
    conn.Close();
}
2) Do some manipulation of the DataTable data before updating the table:
string strsqlst = string.Empty;
using (SqlConnection sqlConn = new SqlConnection(Connectionstring))
{
    sqlConn.Open();
    SqlCommand cmd;
    StringBuilder sb = new StringBuilder();
    sb.AppendLine("DataTable content:");
    foreach (DataRow row in dt.Rows)
    {
        if (row.ItemArray[0].ToString() == "")
            break;
        strsqlst = "Update table Set col1 = " + row.ItemArray[4].ToString() + ", col2 = " + row.ItemArray[5].ToString() + " where <Condition>";
        cmd = new SqlCommand(strsqlst, sqlConn);
        cmd.CommandType = CommandType.Text;
        cmd.ExecuteNonQuery();
    }
    sqlConn.Close();
}
The SqlCommand can be a whole SQL batch and is not limited to a single statement. So you can create a single large batch with 10,000 UPDATE statements, or divide it into, for example, 20 batches of 500 each.
In other words, you can create a single command with CommandText like this:
UPDATE [T] SET Col1='Value1', Col2='Value2' WHERE [Id] = 1;
...
UPDATE [T] SET Col1='Value999', Col2='Value1000' WHERE [Id] = 500;
That said, you should use parameters for all data values (to ensure SQL injection is not possible).
If you want to handle any errors (updates failing due to invalid data) you will need something a bit more sophisticated.
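A parameterized version of that batching idea might look like the sketch below. It assumes the dt DataTable and sqlConn connection from the question; [T], Col1, Col2 and the column indexes are placeholders. With 500 rows per batch and 3 parameters per row you stay well under SQL Server's limit of 2100 parameters per command.

const int batchSize = 500;
for (int offset = 0; offset < dt.Rows.Count; offset += batchSize)
{
    using (var cmd = sqlConn.CreateCommand())
    {
        var sb = new StringBuilder();
        int count = Math.Min(batchSize, dt.Rows.Count - offset);
        for (int i = 0; i < count; i++)
        {
            DataRow row = dt.Rows[offset + i];
            // One UPDATE per row, but all sent to the server as a single batch.
            sb.AppendLine($"UPDATE [T] SET Col1 = @c1_{i}, Col2 = @c2_{i} WHERE [Id] = @id_{i};");
            cmd.Parameters.AddWithValue($"@c1_{i}", row[4]);
            cmd.Parameters.AddWithValue($"@c2_{i}", row[5]);
            cmd.Parameters.AddWithValue($"@id_{i}", row[0]);
        }
        cmd.CommandText = sb.ToString();
        cmd.ExecuteNonQuery();
    }
}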

Getting concurrency error on updating record with data adapter

This is my table:
Student: StudentId int PK autoincrement, Name varchar(20)
When I try to update the last added record, I get this error:
Error: Concurrency violation: the UpdateCommand affected 0 of the expected 1 records.
This is my code:
using (var connection = new SqlConnection("MyConnectionstring"))
{
    connection.Open();
    SqlDataAdapter adapter = new SqlDataAdapter();
    SqlCommandBuilder builder = new SqlCommandBuilder(adapter);
    adapter.SelectCommand = new SqlCommand("select * from Student", connection);
    DataTable dt = new DataTable();
    adapter.Fill(dt);
    DataRow row = dt.NewRow();
    row["Name"] = "Abc";
    dt.Rows.Add(row);
    var addedRecords = dt.GetChanges(DataRowState.Added);
    adapter.Update(dt);
    dt.AcceptChanges();
    DataRow lastRow = dt.Rows[dt.Rows.Count - 1];
    row["Name"] = "Pqr";
    adapter.Update(dt); // Error here
    connection.Close();
}
Can anybody please tell me why this is happening and what the workaround for this problem might be?
As described in the Generating Commands with CommandBuilders MSDN topic, the automatic commands generated by the command builders do not retrieve the identity fields for the inserted records:
You might want to map output parameters back to the updated row of a DataSet. One common task would be retrieving the value of an automatically generated identity field or time stamp from the data source. The DbCommandBuilder will not map output parameters to columns in an updated row by default. In this instance you must specify your command explicitly.
Looking at the Retrieving Identity or Autonumber Values topic, it turns out that basically you need to generate the insert command manually.
Here is how you can do that for your table (see the comments inside the code):
using (var connection = new SqlConnection("MyConnectionstring"))
{
    connection.Open();
    // Create data adapter with the specified SelectCommand
    var adapter = new SqlDataAdapter("select * from Student", connection);
    // Build InsertCommand
    var insertCommand = new SqlCommand(
        "insert into Student (Name) values (@Name); SET @Id = SCOPE_IDENTITY()",
        connection);
    insertCommand.Parameters.Add("@Name", SqlDbType.VarChar, 20, "Name");
    var parameter = insertCommand.Parameters.Add("@Id", SqlDbType.Int, 0, "StudentId");
    parameter.Direction = ParameterDirection.Output;
    insertCommand.UpdatedRowSource = UpdateRowSource.OutputParameters;
    adapter.InsertCommand = insertCommand;
    // Auto-build the other commands
    var builder = new SqlCommandBuilder(adapter);
    // Read the data
    var dt = new DataTable();
    adapter.Fill(dt);
    // Insert a new record
    var row = dt.NewRow();
    row["Name"] = "Abc";
    dt.Rows.Add(row);
    adapter.Update(dt);
    // Update the just-inserted record
    row["Name"] = "Pqr";
    adapter.Update(dt);
    connection.Close();
}
When you do the first update, the row is written to the database and gets a primary key per the AUTOINCREMENT. However, the row inside the DataTable does not reflect the generated ID. Therefore, when you try to do the second update, the adapter can't find the row you intend to update (the ID in the DataTable doesn't match the ID in the database), and consequently you get a concurrency error.
That's why, in order to get the ID into the DataTable, you need to refresh the contents of the DataTable before doing the second update.
To refresh the DataTable, call:
adapter.Fill(dt)
For more information, read Merging DataSet Contents.
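In code, that refresh is just a clear-and-refill between the two updates (a minimal sketch based on the question's code above):

adapter.Update(dt);   // writes the row; dt still holds the placeholder ID

dt.Clear();
adapter.Fill(dt);     // re-reads the table, so dt now has the real StudentId

DataRow lastRow = dt.Rows[dt.Rows.Count - 1];
lastRow["Name"] = "Pqr";
adapter.Update(dt);   // now the generated UPDATE finds the row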

How do I commit the changes I have made to a DataTable to the Table from which I grabbed it?

I'm working on a minimalist database interface that grabs all relevant information from all relevant tables and stores that information in data tables locally so that the information can be modified, changed, updated, and what not, as needed without maintaining a constant connection to the database.
The method works perfectly, all changes are reflected and stored in the table, and that's great.
Now I need to know how to go about reflecting the changes made locally in the data tables to the database that I grabbed the tables from.
Everything I've read seems to say "you need to maintain a connection to do that", but since enacting the changes to the DataTables may take some time, and I've heard it's best practice to establish a connection, do what you need to, and get out, I'm seeing a conflict here.
The articles I'm referencing are this one, and this one.
This is the code for getting the table(s):

public static DataTable GetTable(string Table)
{
    string Query = "SELECT * FROM " + Table;
    return SQLLib.GetDataTable(Query, null);
}

private static DataTable GetDataTable(string CMD, object[] Params)
{
    DataSet DS = new DataSet();
    using (MySqlConnection MSQCon = new MySqlConnection(SQLLib.sqlconstr))
    {
        try { MSQCon.Open(); }
        catch (MySqlException) { Console.WriteLine("Failed to open SQL Connection"); }
        MySqlDataAdapter MSQDA = new MySqlDataAdapter();
        MySqlCommand MSQCom = new MySqlCommand(CMD, MSQCon);
        if (Params != null)
            for (int c = 0; c < Params.Length; c++)
                MSQCom.Parameters.AddWithValue("@param_val_" + (c + 1), Params[c]);
        MSQCom.CommandType = CommandType.Text;
        MSQDA.SelectCommand = MSQCom;
        MSQDA.Fill(DS);
        MSQCon.Close();
    }
    try { return DS.Tables[0]; }
    catch (IndexOutOfRangeException) { return SQLLib.GetDataTable(CMD, Params); }
}
So, now, is there a method through which I can update the source table with the DataTable I keep locally until it's ready to be committed?
It is not about keeping the connection open; you just have to use the command builder. It's the same with MySQL, I believe.
private MySqlDataAdapter adapt = new MySqlDataAdapter();
private DataSet someDataSet = new DataSet();

public DataSet GetCustomerData(int customerId)
{
    using (MySqlConnection connect = new MySqlConnection(ConnString))
    {
        connect.Open();
        MySqlCommand comm = new MySqlCommand("SELECT * FROM customers WHERE Id = @0", connect);
        comm.Parameters.AddWithValue("@0", customerId);
        someDataSet.Tables.Add("CustomersTable");
        adapt.SelectCommand = comm;
        adapt.Fill(someDataSet.Tables["CustomersTable"]);
    }
    return someDataSet;
}
Now for the updating: you could use a new adapter as well, but then you have to give it a SELECT command; based on that, the command builder will generate the INSERT, UPDATE, and DELETE commands.
public void UpdateTable(DataTable table, int customerId)
{
    using (MySqlConnection connect = new MySqlConnection(ConnString))
    {
        connect.Open();
        adapt.SelectCommand = new MySqlCommand("SELECT * FROM customers WHERE Id = " + customerId, connect); // or use Parameters.AddWithValue
        MySqlCommandBuilder commbuilder = new MySqlCommandBuilder(adapt); // must come after SelectCommand is set
        adapt.Update(table);
    }
}
You should create a MySqlCommand for updating your database and set it as your MySqlDataAdapter's UpdateCommand property value, exactly as you do for the SelectCommand.
When you are ready to commit, you just call the Update() method of your MySqlDataAdapter.
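A hand-written UpdateCommand might look like this (a sketch; the Name column is an assumption, since only Id appears in the question):

var update = new MySqlCommand(
    "UPDATE customers SET Name = @Name WHERE Id = @Id", connect);
// Map each parameter to its source column in the DataTable.
update.Parameters.Add("@Name", MySqlDbType.VarChar, 50, "Name");
update.Parameters.Add("@Id", MySqlDbType.Int32, 0, "Id");
adapt.UpdateCommand = update;

// Commit the locally buffered changes back to MySQL.
adapt.Update(table);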

C# SQL IF NOT EXISTS statement not working?

Not sure if this is written correctly, but it looks right to me. I want to update a record if the ID already exists and insert if not.
DataSet ds = new DataSet();
ds.ReadXml(XDocument.Load(Application.StartupPath + @"\xml1.xml").CreateReader());
using (var conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" + Application.StartupPath + "\\Database3.mdb"))
{
    conn.Open();
    // make two commands here
    var commInsert = new OleDbCommand("Insert INTO Table1 (description, active) VALUES (@iq_question, @active);", conn);
    var commUpdate = new OleDbCommand("UPDATE Table1 SET description=@iq_question, active=@active WHERE ID=@question_id;", conn);
    // here add your parameters with no value
    //string question_id = row[0].ToString();
    //string iq_question = row[1].ToString();
    //string active = row[4].ToString();
    commInsert.Parameters.Add(new OleDbParameter("@iq_question", OleDbType.VarChar));
    commInsert.Parameters.Add(new OleDbParameter("@active", OleDbType.VarChar));
    commUpdate.Parameters.Add(new OleDbParameter("@question_id", OleDbType.Integer)); // AutoNumber columns are integers
    commUpdate.Parameters.Add(new OleDbParameter("@iq_question", OleDbType.VarChar));
    commUpdate.Parameters.Add(new OleDbParameter("@active", OleDbType.VarChar));
    foreach (DataTable table in ds.Tables)
    {
        foreach (DataRow row in table.Rows)
        {
            // here only reset the values
            commUpdate.Parameters["@question_id"].Value = row[0].ToString();
            commUpdate.Parameters["@iq_question"].Value = row[1].ToString();
            commUpdate.Parameters["@active"].Value = row[4].ToString();
            int recs = commUpdate.ExecuteNonQuery();
            if (recs < 1) // when no records updated do insert
            {
                commInsert.Parameters["@iq_question"].Value = row[1].ToString();
                commInsert.Parameters["@active"].Value = row[4].ToString();
                commInsert.ExecuteNonQuery();
            }
        }
    }
    commInsert.Dispose();
    commUpdate.Dispose();
    conn.Close();
}
System.Windows.Forms.MessageBox.Show("Update of latest data was successful");
I either get an error on the insert saying it would create duplicate content, or it creates more rows with different data. Say I should be getting 10 rows from the XML file: the first time I run it, I get the 10 rows with the correct data. If I run it again, I end up with 10 more (20 total), and the last 10 rows show different data. I don't think I am identifying the rows in the XML file correctly, and I need to do some research on that part.
There is no EXISTS for MS Access. The engine is much more primitive than SQL Server; see here: Microsoft Access SQL. I think what you can do is:
myCommand.CommandText = "UPDATE Table1 SET description=@iq_question, active=@active WHERE ID=@currentRow";
......
int recs = myCommand.ExecuteNonQuery();
if (recs < 1) // when no records updated do insert
{
    myCommand.Parameters.Clear();
    myCommand.CommandText = "Insert INTO Table1 VALUES(@iq_question, @active)";
    .....
}
This is still two statements, but you can save some coding by not doing a SELECT first, because ExecuteNonQuery tells you whether you updated anything.
Another thing: your code is a bit inefficient. You have a nested loop where you can reuse the same command and connection. You can do this:
using (var conn = new OleDbConnection(.......))
{
    conn.Open();
    // make two commands here
    var commInsert = new OleDbCommand(.....);
    var commUpdate = new OleDbCommand(.....);
    // here add your parameters with no value
    commInsert.Parameters.Add(new OleDbParameter(....));
    .......
    foreach (....)
    {
        foreach (....)
        {
            // here only reset the values
            commUpdate.Parameters[0].Value = ...
            ...
            int recs = commUpdate.ExecuteNonQuery();
            if (recs < 1) // when no records updated do insert
            {
                commInsert.Parameters[0].Value = iq_question;
                .....
            }
        }
    }
    commInsert.Dispose();
    commUpdate.Dispose();
}
You can also use nested using blocks for the commands. Resetting only the values each iteration is more efficient than what you do right now.
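For example (a sketch of the nested using shape, where connString stands in for the connection string from the question):

using (var conn = new OleDbConnection(connString))
using (var commInsert = new OleDbCommand("Insert INTO Table1 (description, active) VALUES (@iq_question, @active);", conn))
using (var commUpdate = new OleDbCommand("UPDATE Table1 SET description=@iq_question, active=@active WHERE ID=@question_id;", conn))
{
    conn.Open();
    // add the parameters once, then loop and reset only the values,
    // exactly as in the skeleton above; no explicit Dispose calls needed.
}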
