I'm not sure whether this is written correctly, although it looks right to me. I want to update a record if the ID already exists and insert it if it doesn't.
DataSet ds = new DataSet();
ds.ReadXml(XDocument.Load(Application.StartupPath + @"\xml1.xml").CreateReader());
using (var conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" + Application.StartupPath + "\\Database3.mdb"))
{
conn.Open();
// make two commands here
var commInsert = new OleDbCommand("Insert INTO Table1 (description, active) VALUES (#iq_question,#active);", conn);
var commUpdate = new OleDbCommand("UPDATE Table1 SET description=#iq_question,active=#active WHERE ID=#question_id;", conn);
// here add your parameters with no value
//string question_id = row[0].ToString();
//string iq_question = row[1].ToString();
//string active = row[4].ToString();
commInsert.Parameters.Add(new OleDbParameter("#iq_question", OleDbType.VarChar));
commInsert.Parameters.Add(new OleDbParameter("#active", OleDbType.VarChar));
commUpdate.Parameters.Add(new OleDbParameter("#question_id", OleDbType.AutoNumber));
commUpdate.Parameters.Add(new OleDbParameter("#iq_question", OleDbType.Text));
commUpdate.Parameters.Add(new OleDbParameter("#active", OleDbType.Text));
foreach (DataTable table in ds.Tables)
{
foreach (DataRow row in table.Rows)
{
// here only reset the values
commUpdate.Parameters["#question_id"].Value = row[0].ToString();
commUpdate.Parameters["#iq_question"].Value = row[1].ToString();
commUpdate.Parameters["#active"].Value = row[4].ToString();
int recs = commUpdate.ExecuteNonQuery();
if (recs < 1) // when no records updated do insert
{
commInsert.Parameters["#iq_question"].Value = row[1].ToString();
commInsert.Parameters["#active"].Value = row[4].ToString();
commInsert.ExecuteNonQuery();
}
}
}
commInsert.Dispose();
commUpdate.Dispose();
conn.Close();
}
System.Windows.Forms.MessageBox.Show("Updated Latest Data Was Successful");
I either get an error on the insert saying it will create duplicate content, or it creates more rows with different data. Say I should be getting 10 rows from the XML file: the first time I run it I get the 10 rows with the correct data, but if I run it again I end up with 10 more (20 in total), and the last 10 rows show different data. I don't think I am identifying the rows in the XML file correctly, so I need to do some research on that part.
There is no Exists/upsert for MS Access; the engine is much more primitive than SQL Server. See here: Microsoft Access SQL. I think what you can do is:
myCommand.CommandText = "UPDATE Table1 SET description=#iq_question,active=#active WHERE ID=#currentRow";
......
int recs = myCommand.ExecuteNonQuery();
if (recs < 1) // when no records updated do insert
{
myCommand.Parameters.Clear();
myCommand.CommandText = "Insert INTO Table1 VALUES(#iq_question,#active)";
.....
}
This is still 2 statements, but you save some coding by not doing a SELECT first, because ExecuteNonQuery will tell you whether you updated anything.
Another thing is that your code is a bit inefficient. You have a nested loop where you can reuse the same command and connection. You can do this:
using (var conn = new OleDbConnection(.......))
{
conn.Open();
// make two commands here
var commInsert = new OleDbCommand(.....);
var commUpdate = new OleDbCommand(.....);
// here add your parameters with no value
commInsert.Parameters.Add(new OleDbParameter(....));
.......
foreach (....)
{
foreach (....)
{
// here only reset the values
commUpdate.Parameters[0].Value = ...
...
int recs = commUpdate.ExecuteNonQuery();
if (recs < 1) // when no records updated do insert
{
commInsert.Parameters[0].Value = iq_question;
.....
}
}
}
commInsert.Dispose();
commUpdate.Dispose();
}
You can also use nested using blocks for the commands, as in the sketch below. Only resetting the parameter values each iteration will be more efficient than what you do right now.
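For example, a minimal sketch of the nested-using layout, reusing the connection string and SQL from the question (parameter setup and the loops are elided, same as in the skeleton above):
using (var conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" + Application.StartupPath + "\\Database3.mdb"))
using (var commInsert = new OleDbCommand("INSERT INTO Table1 (description, active) VALUES (@iq_question, @active);", conn))
using (var commUpdate = new OleDbCommand("UPDATE Table1 SET description=@iq_question, active=@active WHERE ID=@question_id;", conn))
{
    conn.Open();
    // add the parameters once here, then inside the loops only reset the .Value properties;
    // both commands and the connection are disposed automatically when the block ends
}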
I'm trying to use SqlDataAdapter to insert a row and then immediately get that same row back. I followed the advice on this post and used SCOPE_IDENTITY, but it doesn't work. Here is my code...
using (var conn = _db.OpenConnection())
{
var sqlQuery = "SELECT * FROM RotationItem WHERE [RotationItemId] = SCOPE_IDENTITY()";
var adapter = new SqlDataAdapter(sqlQuery, conn);
var builder = new SqlCommandBuilder(adapter);
var dataSet = new DataSet();
adapter.Fill(dataSet);
Debug.WriteLine("First fill, rows " + dataSet.Tables[0].Rows.Count);
var table = dataSet.Tables[0];
table.Rows.InsertAt(table.NewRow(), 0);
CopyJsonToRow(table, 0, item);
if (adapter.Update(dataSet) != 1)
{
throw new InvalidOperationException("Insert failed");
}
// After insert, fetch the new record
dataSet.Clear();
adapter.Fill(dataSet);
Debug.WriteLine("Second fill, rows " + dataSet.Tables[0].Rows.Count);
}
My output is:
First fill, rows 0
Second fill, rows 0 <== this is NOT what I expect
Why does the second fill fail? Shouldn't it get the row that I just inserted?!
I am not using any Transactions. The definition of the table is below...
CREATE TABLE [dbo].[RotationItem] (
[RotationItemId] INT NOT NULL IDENTITY(1,1),
[RotationScheduleId] INT NOT NULL,
[QuestionnaireId] INT NOT NULL,
[Order] INT NOT NULL,
PRIMARY KEY CLUSTERED ([RotationItemId] ASC)
);
Here's what I found after several hours of messing around.
You should not use IDENT_CURRENT because it is very unreliable
You cannot use SCOPE_IDENTITY after SqlDataAdapter.Update (as in OP) because it closes the scope.
When you call new SqlCommandBuilder(adapter), it does secret voodoo to your adapter that makes it impossible to customize. In particular, any changes you make to the InsertCommand are ignored.
Here is my clever workaround (some might say horrible hack). It works perfectly, even if I hammer it with many concurrent clients.
using (var conn = _db.OpenConnection())
{
var sqlQuery = "SELECT * FROM RotationItem WHERE [RotationItemId] = SCOPE_IDENTITY()";
// Create a dummy adapter, so we can generate an appropriate INSERT statement
var dummy = new SqlDataAdapter(sqlQuery, conn);
var insert = new SqlCommandBuilder(dummy).GetInsertCommand();
// Append the SELECT to the end of the INSERT command,
// and set a flag to copy the result back into the dataSet table row
insert.UpdatedRowSource = UpdateRowSource.FirstReturnedRecord;
insert.CommandText += ";" + sqlQuery;
// Now proceed as usual...
var adapter = new SqlDataAdapter(sqlQuery, conn);
adapter.InsertCommand = insert;
var dataSet = new DataSet();
adapter.Fill(dataSet);
Debug.WriteLine("First fill, rows " + dataSet.Tables[0].Rows.Count);
var table = dataSet.Tables[0];
table.Rows.InsertAt(table.NewRow(), 0);
CopyJsonToRow(table, 0, item);
if (adapter.Update(dataSet) != 1)
{
throw new InvalidOperationException("Insert failed");
}
// After insert, the table AUTOMATICALLY has the new row ID (and any other computed columns)
Debug.WriteLine("The new id is " + table.Rows[0].ItemArray[0]);
}
The negative points of this hack are that I need to create two SqlDataAdapters, one of which is a dummy fed only to the SqlCommandBuilder, and that using string concatenation to glue two SQL queries together is very dodgy. I'm not sure the internal security audit will allow me to do that, because of the injection issue. Anyone have a better idea?
I have a SQL Server table EmpDetails that contains two rows. I want to save these values into another table, but with my code I can only get the first row's values. How can I get the values of all rows from my EmpDetails table?
My sample code is:
SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["ApplicationServices"].ConnectionString);
SqlCommand cmd = new SqlCommand ("select e.MachID,e.Name,e.EmpCode,T.TypeName from EmpDetails e,EmpType t where e.EtypeID=t.EtypeID and e.Deptid='" + dddep.SelectedItem.Value + "'",conn);
SqlDataAdapter sda = new SqlDataAdapter(cmd);
DataSet ds = new DataSet();
sda.Fill(ds);
string name = Convert.ToString(ds.Tables[0].Rows[0]["Name"]);
string code = Convert.ToString(ds.Tables[0].Rows[0]["EmpCode"]);
string type = Convert.ToString(ds.Tables[0].Rows[0]["TypeName"]);
You could try using a loop, because currently you only get the values of the first row when you use index [0].
Example:
// loop over every row in the table
for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
string name = Convert.ToString(ds.Tables[0].Rows[i]["Name"]);
string code = Convert.ToString(ds.Tables[0].Rows[i]["EmpCode"]);
string type = Convert.ToString(ds.Tables[0].Rows[i]["TypeName"]);
}
Hope it helps
What you can do is iterate the Rows in each Table using foreach. Since Tables[0].Rows return a DataRowCollection, you can directly put the rows in a loop:
var rows = ds.Tables[0].Rows;
foreach (DataRow row in rows)
{
var name = row["Name"];
var code= row["EmpCode"];
var type= row["TypeName"];
}
You can also use a collection to store the values of each row, for example a list of a simple data model, as sketched below.
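A minimal sketch of that idea (the Employee class and the list are hypothetical, not part of the original code):
// Hypothetical simple data model for one EmpDetails row
public class Employee
{
    public string Name { get; set; }
    public string EmpCode { get; set; }
    public string TypeName { get; set; }
}
// ...
var employees = new List<Employee>();
foreach (DataRow row in ds.Tables[0].Rows)
{
    employees.Add(new Employee
    {
        Name = Convert.ToString(row["Name"]),
        EmpCode = Convert.ToString(row["EmpCode"]),
        TypeName = Convert.ToString(row["TypeName"])
    });
}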
Hope that works!
if (ds != null && ds.Tables.Count > 0)
{
foreach (DataRow row in ds.Tables["EmpDetails"].Rows)
{
var name = row["Name"];
var EmpCode= row["EmpCode"];
var TypeName= row["TypeName"];
}
}
I have to insert 90 MB of data into MySQL tables and I'm using the INSERT IGNORE command to avoid duplicate-key errors. The throughput is about 8 records per second, which seems very slow. Can I speed it up?
P.S. I'm inserting record by record since I read the data from a SQL Compact database.
using (SqlCeConnection sqlConnection = new SqlCeConnection(connectionstrCe))
{
sqlConnection.Open();
SqlCeCommand cmdCe = sqlConnection.CreateCommand();
{
{
mySQLConnection.Open();
foreach (KeyValuePair<string, List<string>> t in tablesCeNames) //reading the tables property from the dictionary - column names and datatypes
{
string tableData = t.Key;
List<string> columnData = t.Value;
//get the values from the table I want to transfer the data
cmdText = "SELECT * FROM " + tableData;
//compose the mysql command
cmdCe.CommandText = cmdText;
SqlCeDataReader dataReader = cmdCe.ExecuteReader(); //read
//InsertTables is a method that takes the data reader and converts all the data from this table into a list of value strings to insert
inputValues = InsertTables(dataReader);
MySql.Data.MySqlClient.MySqlTransaction transakcija;
transakcija = mySQLConnection.BeginTransaction();
worker.ReportProgress(4, inputValues.Count);
foreach (string val in inputValues)//foreach row of values of the data table
{
cmdSqlText = "INSERT IGNORE INTO " + tableData + "("; //compose the command for sql
foreach (string cName in columnData) //for each column in the table
{
string[] data = cName.Split(' ');
if (!data[0].ToString().Equals("Id"))
{
cmdSqlText += data[0].ToString() + ","; //write the column names of the values that will be inserted
}
}
cmdSqlText = cmdSqlText.TrimEnd(',');
cmdSqlText += ") VALUES (";
//val contains the values of this current record that i want to insert
cmdSqlText += val; //final command with insert ignore and the values of one record
if (!val.Equals(""))
{
try
{
new MySql.Data.MySqlClient.MySqlCommand(cmdSqlText, mySQLConnection, transakcija).ExecuteNonQuery(); //execute insert on sql database
WriteToTxt("uspješno upisano u mysql" + t.Key);
}
catch (MySql.Data.MySqlClient.MySqlException sqlEx)
{
}
}
}
if (TablicaSveOK)
{
transakcija.Commit();
}
else
{
transakcija.Rollback();
}
}
}
if (mySQLConnection.State != System.Data.ConnectionState.Closed)
{
mySQLConnection.Close();
}
}
What about getting the data from SQL Compact into a file and using LOAD DATA?
http://dev.mysql.com/doc/refman/5.0/es/load-data.html
Rather than sending multiple calls, you can send one call to insert all the records. Insert it like this:
INSERT INTO yourtable (col1, col2)
SELECT 'Name1', 'Location1'
UNION ALL
SELECT 'Name2', 'Location2'
UNION ALL
SELECT 'Name3', 'Location3'
Apart from this, it is possible that your code is the bottleneck rather than the insert statement, so I would recommend you first check where the problem lies and then go for the solution.
The latest MySQL Connector/NET has a class called MySqlBulkLoader that can be used to call MySQL's LOAD DATA syntax in a more .NET-oriented way. The class requires a delimited text file to load from and is extremely fast.
So your job would be to read all of your SQL Compact records into a DataTable and then, using a StreamWriter or a specialized CSV writer, write everything out to a file.
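For the export step, a minimal sketch (the file path, the dataTable variable, and the tab delimiter are assumptions chosen to match the FieldTerminator used below):
// Sketch: write every row of the DataTable as one tab-separated line (needs System.IO)
using (var writer = new StreamWriter(@"c:\temp\export.tsv"))
{
    foreach (DataRow row in dataTable.Rows)
    {
        writer.WriteLine(string.Join("\t", row.ItemArray));
    }
}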
Then the code to load your data into a MySQL table is as simple as this:
string connStr = "server=localhost;user=root;database=........";
using(MySqlConnection conn = new MySqlConnection(connStr))
{
MySqlBulkLoader bl = new MySqlBulkLoader(conn);
bl.TableName = "yourdestinationtable";
bl.FieldTerminator = "\t";
bl.LineTerminator = "\n";
bl.FileName = "path_to_your_comma_separated_value_file";
try
{
conn.Open();
int count = bl.Load();
}
}
I am using System.Data.SQLite for my database, and my select statements are very slow. It takes around 3-5 minutes to query around 5000 rows of data. Here is the code I am using:
string connectionString;
connectionString = string.Format(#"Data Source={0}", documentsFolder + ";Version=3;New=False;Compress=True;");
//Open a new SQLite Connection
SQLiteConnection conn = new SQLiteConnection(connectionString);
conn.Open();
SQLiteCommand cmd = new SQLiteCommand();
cmd.Connection = conn;
cmd.CommandText = "Select * From urls";
//Assign the data from urls to dr
SQLiteDataReader dr = cmd.ExecuteReader();
SQLiteCommand com = new SQLiteCommand();
com.Connection = conn; // the second command also needs the connection before ExecuteReader
com.CommandText = "Select * From visits";
SQLiteDataReader visit = com.ExecuteReader();
List<int> dbID2 = new List<int>();
while (visit.Read())
{
dbID2.Add(int.Parse(visit[1].ToString()));
}
//Read from dr
while (dr.Read())
{
string url = dr[1].ToString();
string title = dr[2].ToString();
long visitlong = Int64.Parse(dr[5].ToString());
string browser = "Chrome";
int dbID = int.Parse(dr[0].ToString());
bool exists = dbID2.Any(item => item == dbID);
int frequency = int.Parse(dr["visit_count"].ToString());
bool containsBoth = url.Contains("file:///");
if (exists)
{
if (containsBoth == false)
{
var form = Form.ActiveForm as TestURLGUI2.Form1;
URLs.Add(new URL(url, title, browser, visited, frequency));
Console.WriteLine(String.Format("{0} {1}", title, browser));
}
}
}
//Close the connection
conn.Close();
And here is another example that takes a long time:
IEnumerable<URL> ExtractUserHistory(string folder, bool display)
{
// Get User history info
DataTable historyDT = ExtractFromTable("moz_places", folder);
// Get visit Time/Data info
DataTable visitsDT = ExtractFromTable("moz_historyvisits",
folder);
// Loop each history entry
foreach (DataRow row in historyDT.Rows)
{
// Select entry Date from visits
var entryDate = (from dates in visitsDT.AsEnumerable()
where dates["place_id"].ToString() == row["id"].ToString()
select dates).LastOrDefault();
// If history entry has date
if (entryDate != null)
{
// Obtain URL and Title strings
string url = row["Url"].ToString();
string title = row["title"].ToString();
int frequency = int.Parse(row["visit_count"].ToString());
string visit_type;
//Add a URL to list URLs
URLs.Add(new URL(url, title, browser, visited, frequency));
// Add entry to list
// URLs.Add(u);
if (title != "")
{
Console.WriteLine(String.Format("{0} {1}", title, browser));
}
}
}
return URLs;
}
DataTable ExtractFromTable(string table, string folder)
{
SQLiteConnection sql_con;
SQLiteCommand sql_cmd;
SQLiteDataAdapter DB;
DataTable DT = new DataTable();
// FireFox database file
string dbPath = folder + "\\places.sqlite";
// If file exists
if (File.Exists(dbPath))
{
// Data connection
sql_con = new SQLiteConnection("Data Source=" + dbPath +
";Version=3;New=False;Compress=True;");
// Open the Connection
sql_con.Open();
sql_cmd = sql_con.CreateCommand();
// Select Query
string CommandText = "select * from " + table;
// Populate Data Table
DB = new SQLiteDataAdapter(CommandText, sql_con);
DB.Fill(DT);
// Clean up
sql_con.Close();
}
return DT;
}
Now, how can I optimize these so that they are faster?
In addition to moving more of the data aggregation to SQL as joins, you might also consider getting your SQLiteDataReader to provide the data types instead of always parsing the values.
For example, you have the line:
long visitlong = Int64.Parse(dr[5].ToString());
dr[5] is a Sqlite value which you are first converting to a string, then parsing it to a long. These parse operations take time. Why not instead do:
long visitlong = dr.GetInt64(5);
Or:
long visitlong = dr.GetInt64(dr.GetOrdinal("columnName"));
Check out the various methods that SqliteDataReader offers and utilize them whenever possible instead of parsing values.
Edit:
Note that this requires the data be stored as the correct type. If everything in the database is stored as a string, some parsing will be unavoidable.
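As for pushing the aggregation into SQL, here is a hedged sketch that replaces the two separate reads with one joined query. The column names are assumptions based on the Chrome History schema (visits.url references urls.id); adjust them to your actual tables:
// Sketch: let SQLite filter and join instead of loading both tables into memory
string sql = @"SELECT u.id, u.url, u.title, u.visit_count
               FROM urls u
               WHERE u.url NOT LIKE 'file:///%'
                 AND EXISTS (SELECT 1 FROM visits v WHERE v.url = u.id)";
using (var cmd = new SQLiteCommand(sql, conn))
using (SQLiteDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        long id = reader.GetInt64(0);
        string url = reader.GetString(1);
        string title = reader.IsDBNull(2) ? "" : reader.GetString(2);
        int frequency = reader.GetInt32(3);
        // build your URL objects here
    }
}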
Make sure that you've recently run the SQL command "ANALYZE {db|table|index};".
I recently ran into a situation where queries ran fast (<1 sec) in my ER software (Navicat), i.e. not debugging, but were very slow (>1 min) when debugging in Visual Studio. It turned out that because I did my database design in Navicat (SQLite v3.7), the statistics were not the same as those used by System.Data.SQLite in Visual Studio (v3.8). Running "ANALYZE;" on the entire database file from Visual Studio updated the [sqlite_statX] tables used by v3.8. Both places were the same speed after that.
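If you want to run it from code rather than from a shell, a minimal sketch against an already open System.Data.SQLite connection (the conn variable is assumed):
using (var analyze = new SQLiteCommand("ANALYZE;", conn))
{
    analyze.ExecuteNonQuery(); // refreshes the sqlite_stat* tables used by the query planner
}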
I have a DataTable with a few records. I want to insert all those records into a remote database. What would be the easiest way to do it? I read that most people iterate over the rows of the DataTable and insert record by record. I would like to make just 1 connection to the remote server and do a bulk insert. Is it possible? I am using C# and MySQL.
Although Kemal Taşkın's answer is an elegant solution, it performs horribly with a large DataTable.
I tried it with a 37,500-record insert and it took over 15 minutes.
It seems to insert one record at a time.
I found that if I generate a MySQL INSERT statement string with 1000 records in it and loop over the data until it's complete, I can reduce my insert time down to 6 seconds. It's not BULK LOADING, it's CHUNK LOADING. If anyone can come up with a better solution, please let me know.
public void writeToDBTable(DataTable dt)
{
MySqlConnection conn = new MySqlConnection(globalClass.connString);
conn.Open();
String sql = null;
String sqlStart = "insert into MyTable (run_id, model_id, start_frame,water_year, state_id, obligateCover, DTWoodyCover, perennialGrowth, clonalCover) values ";
Console.WriteLine("Write to DB - Start. Records to insert = {0}", dt.Rows.Count);
int x = 0;
foreach (DataRow row in dt.Rows)
{
x += 1;
if (x == 1)
{
sql = String.Format(#"({0},{1},{2},{3},{4},{5},{6},{7},{8})",
row["runId"],
row["modelId"],
row["startFrame"],
row["waterYear"],
row["currentFrame"],
row["obligateCover"],
row["DTWoodyCover"],
row["perennialGrowth"],
row["clonalCover"]
);
}
else
{
sql = String.Format(sql + #",({0},{1},{2},{3},{4},{5},{6},{7},{8})",
row["runId"],
row["modelId"],
row["startFrame"],
row["waterYear"],
row["currentFrame"],
row["obligateCover"],
row["DTWoodyCover"],
row["perennialGrowth"],
row["clonalCover"]
);
}
if (x == 1000)
{
try
{
sql = sqlStart + sql;
MySqlCommand cmd = new MySqlCommand(sql, conn);
cmd.ExecuteNonQuery();
Console.WriteLine("Write {0}", x);
x = 0;
}
catch (Exception ex)
{
Console.WriteLine(sql);
Console.WriteLine(ex.ToString());
}
}
}
// get any stragglers
if (x > 0)
{
try
{
sql = sqlStart + sql;
MySqlCommand cmd = new MySqlCommand(sql, conn);
cmd.ExecuteNonQuery();
Console.WriteLine("Write {0}", x);
x = 0;
}
catch (Exception ex)
{
Console.WriteLine(sql);
Console.WriteLine(ex.ToString());
}
}
conn.Close();
Console.WriteLine("Write to DB - End.");
}
I don't know whether this answer is too late or not :)
You can do something like this:
// assume you have a table with one column;
string commandText = "insert into t_test1 (myid) values (#tempid)";
using (MySqlConnection cn = new MySqlConnection(myConnectionString))
{
cn.Open();
using (MySqlCommand cmd = new MySqlCommand(commandText, cn))
{
cmd.UpdatedRowSource = UpdateRowSource.None;
cmd.Parameters.Add("?tempid", MySqlDbType.UInt32).SourceColumn = "tempid";
MySqlDataAdapter da = new MySqlDataAdapter();
da.InsertCommand = cmd;
// assume DataTable dt contains one column with name "tempid"
int records = da.Update(dt);
}
cn.Close();
}
For Kemal Taşkın's solution, the RowState of the rows must be DataRowState.Added.
If that is not the case, do this:
foreach (DataRow row in dt.Rows)
row.SetAdded();
For Mr.Black's approach, it is recommended to use SQL parameters rather than concatenating the data values directly into the statement, as sketched below.
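A minimal parameterized sketch of that idea (the table and column names are reused from the chunk-loading answer above; the dt and conn variables are assumed):
// Sketch: one parameterized INSERT reused for every row, instead of string concatenation
var cmd = new MySqlCommand(
    "INSERT IGNORE INTO MyTable (run_id, model_id) VALUES (@runId, @modelId)", conn);
cmd.Parameters.Add("@runId", MySqlDbType.Int32);
cmd.Parameters.Add("@modelId", MySqlDbType.Int32);
foreach (DataRow row in dt.Rows)
{
    cmd.Parameters["@runId"].Value = row["runId"];
    cmd.Parameters["@modelId"].Value = row["modelId"];
    cmd.ExecuteNonQuery();
}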
When importing data into InnoDB, turn off autocommit mode, because it performs a log flush to disk for every insert. To disable autocommit during your import operation, surround it with SET autocommit and COMMIT statements:
SET autocommit=0;
... SQL import statements ...
COMMIT;
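From C#, the same statements can be executed on the existing connection; a minimal sketch (the mySQLConnection variable is assumed):
// Sketch: wrap the import in a single commit to avoid a log flush per insert
new MySqlCommand("SET autocommit=0;", mySQLConnection).ExecuteNonQuery();
// ... run the INSERT IGNORE statements here ...
new MySqlCommand("COMMIT;", mySQLConnection).ExecuteNonQuery();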
Performance test: insertion of 5400 rows in 2 tables
insertion from CSV file : 3 seconds
LOAD DATA INFILE 'data.csv' INTO TABLE myTable FIELDS TERMINATED BY '\t';
insertion by using Kemal Taşkın solution: 32 seconds
MySqlDataAdapter.Update (DataTable)
insertion row by row: 41 seconds
INSERT INTO table (columns) VALUES (values);
INSERT INTO table (columns) VALUES (values);
...
insertion of all rows in one query: 143 seconds
INSERT INTO table (columns) VALUES (values), (values), ...;
=> LOAD DATA is the most performant by far!
You can also check this article:
https://dev.mysql.com/doc/refman/8.0/en/insert-optimization.html