I have a handheld device that connects to a SQL Server database, reads the SQL Server data, and copies it into a SQL Server Compact database located on the device. This is my code:
public void InsertData() // inserts data into the SQL Compact database
{
    dt = new DataTable();
    dt = SqlServer_GetData_dt("Select id_price, price, id_item from prices", SqlCeConnection); // get data from SQL Server
    if (dt.Rows.Count > 0)
    {
        for (int i = 0; i < dt.Rows.Count; i++)
        {
            string sql = "";
            sql = "insert into prices" +
                  " ( id_prices, price,id_item) values('"
                  + dt.Rows[i]["id_price"].ToString().Trim() + "', '"
                  + dt.Rows[i]["price"].ToString().Trim() + "', '"
                  + dt.Rows[i]["id_item"].ToString().Trim() + "')";
            obj.SqlCE_WriteData_bit(sql, connection.ConnectionString); // insert into SQL Compact
        }
    }
}
public DataTable SqlServer_GetData_dt(string query, string conn)
{
    try
    {
        DataTable dt = new DataTable();
        string SqlCeConnection = conn;
        SqlConnection sqlConnection = new SqlConnection(SqlCeConnection);
        sqlConnection.Open();
        {
            SqlDataReader darSQLServer;
            SqlCommand cmdCESQLServer = new SqlCommand();
            cmdCESQLServer.Connection = sqlConnection;
            cmdCESQLServer.CommandType = CommandType.Text;
            cmdCESQLServer.CommandText = query;
            darSQLServer = cmdCESQLServer.ExecuteReader();
            dt.Load(darSQLServer);
            sqlConnection.Close();
        }
        return dt;
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
        DataTable dt = new DataTable();
        return dt;
    }
}
public object SqlCE_WriteData_bit(string query, string conn)
{
    try
    {
        string SqlCeConnection = conn;
        SqlCeConnection sqlConnection = new SqlCeConnection(SqlCeConnection);
        if (sqlConnection.State == ConnectionState.Closed)
        {
            sqlConnection.Open();
        }
        SqlCeCommand cmdCESQLServer = new SqlCeCommand();
        cmdCESQLServer.Connection = sqlConnection;
        cmdCESQLServer.CommandType = CommandType.Text;
        cmdCESQLServer.CommandText = query;
        object i = cmdCESQLServer.ExecuteScalar();
        sqlConnection.Close();
        return i;
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
        return 0;
    }
}
This all works fine, but the problem is that it is very slow. I have 20,000 rows that need to be inserted into the SQL Compact database.
Is there any way to insert faster?
Thanks.
Aside from the obvious poor usage of the connection for every call, you can greatly improve things by also eliminating the query processor altogether. That means: don't use SQL. Instead, open the destination table with TableDirect and a SqlCeResultSet, then iterate through the source data (a DataTable is a bad idea, but that's a completely different matter) and use a series of CreateRecord, SetValues and Insert calls.
A pretty good example can be found here (though again, I'd use SetValues to set the entire row, not each individual field).
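For illustration, a minimal sketch of that approach, reusing the dt and connection string from the question (the column order passed to SetValues is an assumption and must match the actual schema of prices):

using System.Data;
using System.Data.SqlServerCe;

using (var conn = new SqlCeConnection(connection.ConnectionString))
{
    conn.Open();
    using (SqlCeCommand cmd = conn.CreateCommand())
    {
        cmd.CommandType = CommandType.TableDirect;
        cmd.CommandText = "prices"; // a table name, not a SQL statement
        using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable))
        {
            foreach (DataRow row in dt.Rows)
            {
                SqlCeUpdatableRecord rec = rs.CreateRecord();
                // set the whole row at once instead of field by field
                rec.SetValues(new object[] { row["id_price"], row["price"], row["id_item"] });
                rs.Insert(rec);
            }
        }
    }
}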
Reuse your connection and don't create a new connection for every INSERT statement.
Instead of passing a connection string to your SqlCE_WriteData_bit method, create the connection once in the InsertData method, and pass the connection object to SqlCE_WriteData_bit.
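A sketch of that refactor (BuildInsertSql is a hypothetical helper standing in for the string concatenation in the question):

public object SqlCE_WriteData_bit(string query, SqlCeConnection conn)
{
    using (SqlCeCommand cmd = new SqlCeCommand(query, conn))
    {
        return cmd.ExecuteScalar();
    }
}

// In InsertData: open the connection once and reuse it for all 20,000 inserts.
using (SqlCeConnection conn = new SqlCeConnection(connection.ConnectionString))
{
    conn.Open();
    for (int i = 0; i < dt.Rows.Count; i++)
    {
        obj.SqlCE_WriteData_bit(BuildInsertSql(dt.Rows[i]), conn); // hypothetical helper
    }
}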
Put all the data into a DataTable and then use a SqlCeDataAdapter to save all the data with one call to Update. You may have to fiddle with the UpdateBatchSize to get the best performance.
Looking more closely, I see that you already have a DataTable. Looping through it yourself is therefore ludicrous. Just note that, as you have it, the RowState of every DataRow will be Unchanged, so they will not be inserted. I think that you can call DataTable.Load such that all RowState values are left as Added but, if not, then use a SqlDataAdapter instead, set AcceptChangesDuringFill to false and call Fill.
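A sketch of that adapter-based route, using the table and column names from the question (the parameter types and sizes are assumptions and would need to match the real schema):

using System.Data;
using System.Data.SqlClient;
using System.Data.SqlServerCe;

public void CopyPrices(string sqlServerConnString, string sqlCeConnString)
{
    var dt = new DataTable();

    // Fill from SQL Server, leaving every row's RowState as Added.
    using (var da = new SqlDataAdapter(
        "SELECT id_price, price, id_item FROM prices", sqlServerConnString))
    {
        da.AcceptChangesDuringFill = false;
        da.Fill(dt);
    }

    // Push all the Added rows into SQL CE with a single Update call.
    using (var conn = new SqlCeConnection(sqlCeConnString))
    using (var ceDa = new SqlCeDataAdapter())
    {
        conn.Open();
        ceDa.InsertCommand = new SqlCeCommand(
            "INSERT INTO prices (id_price, price, id_item) VALUES (@id_price, @price, @id_item)", conn);
        ceDa.InsertCommand.Parameters.Add("@id_price", SqlDbType.NVarChar, 50, "id_price");
        ceDa.InsertCommand.Parameters.Add("@price", SqlDbType.NVarChar, 50, "price");
        ceDa.InsertCommand.Parameters.Add("@id_item", SqlDbType.NVarChar, 50, "id_item");
        ceDa.Update(dt);
    }
}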
Is there any way for faster insert?
Yes, but it probably won't be "acceptably fast enough" when we're talking about inserting 20k rows.
The problem I can see is that you are opening a connection for every single row retrieved from SqlServer_GetData_dt; that is, you open a connection to insert data 20k times, and opening a connection is an expensive operation. You should build the whole query using a StringBuilder object and then execute all the insert statements in one batch.
This will bring some performance gains, but don't expect it to solve your problem; inserting 20k rows will still take some time, especially if indexes need to be rebuilt. My suggestion is that you thoroughly analyse your requirements and be a bit smarter about how you approach it. Options are:
bundle a pre-populated database if possible so your app doesn't have to suffer the population process performance penalties
if not, run the insert process in the background and access the data only when the pre-population is finished
Currently I'm working with my local MSSQL database, and when I make a connection everything works fine. However, that is not my question: I want to make my code cleaner, without duplicated or nearly duplicated code.
For now I'm working with one large class that holds all the methods for selecting, creating, updating and deleting a user, but I think it could be written better by passing in a string that overrides the SQL string inside the code.
The only thing is that I'm (for now) a complete noob and have no idea how to accomplish this... please help? As an example I've set out the two regions (I might change them to classes) below.
#region Select Data from Database
public DataTable Select()
{
    // static method to connect to database
    SqlConnection conn = new SqlConnection(myconnstring);
    // to hold the data from database
    DataTable dt = new DataTable();
    try
    {
        // sql query to get data from database
        String sql = "SELECT * FROM tbl_users";
        // for executing command
        SqlCommand cmd = new SqlCommand(sql, conn);
        // getting data from database
        SqlDataAdapter adapter = new SqlDataAdapter(cmd);
        // database connection open
        conn.Open();
        // fill data in datatable
        adapter.Fill(dt);
    }
    catch (Exception ex)
    {
        // show message if any error occurs
        MessageBox.Show(ex.Message);
    }
    finally
    {
        // closing connection
        conn.Close();
    }
    // return value in datatable
    return dt;
}
#endregion
#region Search User on Database using KeyWords
public DataTable Search(string keywords)
{
    // static method to connect to database
    SqlConnection conn = new SqlConnection(myconnstring);
    // to hold the data from database
    DataTable dt = new DataTable();
    try
    {
        // sql query to get data from database
        String sql = "SELECT * FROM tbl_users WHERE id LIKE '%" + keywords + "%' OR first_name LIKE '%" + keywords + "%' OR last_name LIKE '%" + keywords + "%' OR username LIKE '%" + keywords + "%'";
        // for executing command
        SqlCommand cmd = new SqlCommand(sql, conn);
        // getting data from database
        SqlDataAdapter adapter = new SqlDataAdapter(cmd);
        // database connection open
        conn.Open();
        // fill data in datatable
        adapter.Fill(dt);
    }
    catch (Exception ex)
    {
        // show message if any error occurs
        MessageBox.Show(ex.Message);
    }
    finally
    {
        // closing connection
        conn.Close();
    }
    // return value in datatable
    return dt;
}
#endregion
Step 1: read https://xkcd.com/327/ and, whatever solution you go with, fix id LIKE '%"+keywords+"%'.
I encourage you to research an object mapper, like Dapper, which will make your methods return typed objects (e.g. User) rather than raw DataTables. An ORM can help push you into the pit of success.
As for reuse you can notice that your methods that do SELECT look very similar so you could make a helper method DataTable ExecuteSelect(string sql) which you could reuse from your Search and Select methods.
You really must fix this '%"+keywords+"%' issue. SQL injection is no joke.
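For illustration, a sketch of the Dapper route, which also fixes the injection; the User class and the column aliases are assumptions based on the query above:

using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class User
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Username { get; set; }
}

public IEnumerable<User> Search(string keywords)
{
    using (var conn = new SqlConnection(myconnstring))
    {
        // parameterized LIKE removes the injection hole; aliases map columns to properties
        return conn.Query<User>(
            @"SELECT id AS Id, first_name AS FirstName, last_name AS LastName, username AS Username
              FROM tbl_users
              WHERE first_name LIKE @pattern OR last_name LIKE @pattern OR username LIKE @pattern",
            new { pattern = "%" + keywords + "%" });
    }
}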
Think about changing your entire approach and working with Entity Framework.
That way you create a connection to the DB and a class for each main entity you have there.
Afterwards all your selects, updates, deletes, etc. are handled automatically, with all the functionality you have in MSSQL.
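A minimal Entity Framework 6 sketch of what that looks like; the table and column names are assumptions taken from the question:

using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;
using System.Linq;

[Table("tbl_users")]
public class User
{
    public int id { get; set; }          // picked up as the key by convention
    public string first_name { get; set; }
    public string last_name { get; set; }
    public string username { get; set; }
}

public class UsersContext : DbContext
{
    public DbSet<User> Users { get; set; }
}

// usage: LINQ instead of hand-written SQL strings
// using (var db = new UsersContext())
// {
//     var matches = db.Users.Where(u => u.username.Contains(keywords)).ToList();
// }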
(This would be a mess as a comment)
First of all, you should write code that is trustworthy and doesn't carry unnecessary baggage, i.e.:
#region Select Data from Database
public DataTable Select(string sql)
{
    DataTable dt = new DataTable();
    try
    {
        // getting data from database
        SqlDataAdapter adapter = new SqlDataAdapter(sql, myconnstring);
        // fill data in datatable
        adapter.Fill(dt);
    }
    catch (Exception ex)
    {
        // show message if any error occurs
        MessageBox.Show(ex.Message);
    }
    // return value in datatable
    return dt;
}
#endregion
#region Search User on Database using KeyWords
public DataTable Search(string keywords)
{
    // to hold the data from database
    DataTable dt = new DataTable();
    String sql = @"SELECT *
                   FROM tbl_users
                   WHERE id LIKE @pattern OR
                         first_name LIKE @pattern OR
                         last_name LIKE @pattern OR
                         username LIKE @pattern";
    try
    {
        using (SqlConnection conn = new SqlConnection(myconnstring))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@pattern", SqlDbType.VarChar).Value = "%" + keywords + "%";
            // database connection open
            conn.Open();
            // fill data in datatable
            dt.Load(cmd.ExecuteReader());
        }
    }
    catch (Exception ex)
    {
        // show message if any error occurs
        MessageBox.Show(ex.Message);
    }
    // return value in datatable
    return dt;
}
#endregion
First, spare some time to understand why you should use parameters and how to use them.
Next, you will quickly see that this pattern is not flexible and not worth making into a class as is.
You should think about getting away from DataTable and DataSet as well. Look into LINQ and Entity Framework (LINQ to EF); others have already invented the wheel for you. Read about different patterns, like the Repository pattern. And also read about different backends and their advantages/disadvantages, weighing them against your use cases, before coding specifically for one of them.
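For a taste of the Repository pattern mentioned above, the surface might look like this (a sketch only; the User entity is assumed, and the implementation would sit on top of EF or raw ADO.NET):

using System.Collections.Generic;

public interface IUserRepository
{
    User GetById(int id);
    IEnumerable<User> Search(string keywords);
    void Add(User user);
    void Remove(User user);
    void Save();
}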
I am using a SqlDataAdapter to pull the result from a stored procedure that takes up to 5 minutes to execute and return the result.
I am setting
da.SelectCommand.CommandTimeout = 1800;
but the timeout does not work: the code does not honor it in practice and fails earlier.
Any idea how to fix this timeout?
This is my code:
var cpdbconn = new SqlConnection(ConfigurationManager.ConnectionStrings["SQL"].ConnectionString);
using (SqlCommand cmd = new SqlCommand())
{
    cmd.Connection = cpdbconn;
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandText = readStoredProcedureName;
    using (SqlDataAdapter da = new SqlDataAdapter(cmd))
    {
        try
        {
            da.SelectCommand.CommandTimeout = 1800;
            da.Fill(dt);
            // Check whether the datatable has rows
            if (dt != null && dt.Rows.Count > 0)
            {
                foreach (DataRow dataRow in dt.Rows)
                {
                    lstring.Add(Convert.ToString(dataRow["ServerName"]));
                }
            }
            // Join with "','" to convert the result to the nested-query format
            InnerQryResultStr = string.Join("','", lstring.ToArray());
            if (multinestedQry != null)
            {
                combinedQry = qryName;
                qryName = multinestedQry + "('" + InnerQryResultStr + "')";
            }
            else
            {
                qryName = qryName + "('" + InnerQryResultStr + "')";
            }
        }
        catch (SqlException e)
        {
            Logger.Log(LOGTYPE.Error, String.Format("Inserting Data Failed for server {0} with Exception {1}", "DiscreteServerData", e.Message));
            if (e.Number == -2)
            {
                Logger.Log(LOGTYPE.Error, String.Format("TimeOut occurred while executing SQL query / stored procedure ", "DiscreteServerData", e.Message));
            }
            strMsg = e.Message.ToString();
            file.WriteLine(strMsg.ToString());
        }
    }
}
Did you try to set the timeout on your SqlCommand? cmd.CommandTimeout = 300;
With Database Operations, there is a whole bunch of timeouts to consider:
The Command has a timeout.
The connection has a timeout.
Every layer of the Networking part has a timeout.
The server on the other end has a timeout.
The server on the other end might have slow loris protection.
The locks and transaction might time out (because at some point, somebody else might want to work with that table too).
According to your comments, your stored procedure returns "millions of records", which is probably where the issue lies. That is not a usable query: it is so much data that it goes beyond a mere network or DB problem, and possibly becomes a memory problem on the client.
It is a common mistake to retrieve too much, even if I cannot remember anything quite on this scale (the biggest I've seen was in the 100k's). There is no way for a user to process this kind of information, so there has to be more filtering, pagination and the like. Never do those steps in the client: at best you transfer useless amounts of data over the network; at worst you run into timeouts and concurrency issues. Always do as much filtering, pagination, etc. in the query, as in the sketch below.
You want to retrieve as little data as possible overall. Bulk operations like merging, backups, etc. should generally be done in the DBMS; if you move them to the client, you only add another layer of failure, and the data has to cross the network twice.
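To make "do it in the query" concrete, here is a minimal paging sketch using SQL Server 2012+ OFFSET/FETCH; the table name, column, and the pageIndex/pageSize variables are placeholders, not taken from the original stored procedure:

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"SELECT ServerName
      FROM dbo.Servers
      ORDER BY ServerName
      OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY", conn))
{
    cmd.Parameters.AddWithValue("@offset", pageIndex * pageSize);
    cmd.Parameters.AddWithValue("@pageSize", pageSize);
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            lstring.Add(reader.GetString(0)); // only one page crosses the network
        }
    }
}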
For a better answer, I will need more precise information about the problem.
I have an SQLite database consisting of 50 columns and more than 1.2 million rows. I'm working with System.Data.SQLite in Visual Studio 2013, in C#.
I used a very simple piece of code to retrieve my data from the database, but it is taking too much time.
private SQLiteConnection sqlite;

public MySqlite(string path)
{
    sqlite = new SQLiteConnection("Data Source=" + path + "\\DBName.sqlite");
}

public DataTable selectQuery(string query)
{
    SQLiteDataAdapter ad;
    DataTable dt = new DataTable();
    try
    {
        SQLiteCommand cmd;
        sqlite.Open();
        cmd = sqlite.CreateCommand();
        cmd.CommandText = query; // set the passed query
        ad = new SQLiteDataAdapter(cmd);
        ad.Fill(dt); // fill the datasource
    }
    catch (SQLiteException ex)
    {
        // exception code here.
    }
    sqlite.Close();
    return dt;
}
And the select statement is:
select * from table
As I said, it is very simple code.
I want to know how to boost the select performance: with this code the process takes up to 1 minute, and I want to get it down to less than 1 second.
Another thing: there seem to be some settings for configuring an SQLite database, but I don't know where to apply them. Could someone tell me how to configure an SQLite database with System.Data.SQLite?
Consider narrowing your result set by getting necessary columns or paging.
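For illustration, a sketch of both ideas with System.Data.SQLite: select only the columns you need and page with LIMIT/OFFSET. The column names, table name, and connection string are placeholders:

using System.Data;
using System.Data.SQLite;

public DataTable SelectPage(string connString, int pageIndex, int pageSize)
{
    var dt = new DataTable();
    using (var conn = new SQLiteConnection(connString))
    using (var cmd = new SQLiteCommand(
        "SELECT col1, col2, col3 FROM mytable LIMIT @limit OFFSET @offset", conn))
    {
        cmd.Parameters.AddWithValue("@limit", pageSize);
        cmd.Parameters.AddWithValue("@offset", pageIndex * pageSize);
        conn.Open();
        // pragmas (the configuration settings mentioned) are issued per connection, e.g.:
        // new SQLiteCommand("PRAGMA cache_size = 10000;", conn).ExecuteNonQuery();
        using (var ad = new SQLiteDataAdapter(cmd))
        {
            ad.Fill(dt); // only one page of narrow rows is materialized
        }
    }
    return dt;
}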
I have a question regarding performance. This is my scenario.
I have a MySQL database and an application that, from time to time, moves records matching the criteria from a query from one table to another. The way this is done is:
foreach (object obj in list)
{
    string id = obj.ToString().Split(',')[0].Trim();
    string query = " insert into old_records select * from testes where id='" +
                   id + "';" + " delete from testes where id='" + id + "'";
    DB _db = new DB();
    _db.DBConnect(query);
}
This is how I connect to the database:
DataTable _dt = new DataTable();
MySqlConnection _conn = new MySqlConnection(connectionString);
MySqlCommand _cmd = new MySqlCommand
{
    Connection = _conn,
    CommandText = query
};
MySqlDataAdapter _da = new MySqlDataAdapter(_cmd);
MySqlCommandBuilder _cb = new MySqlCommandBuilder(_da);
_dt.Clear();
try
{
    _conn.Open();
    _cmd.ExecuteNonQuery();
    _da.Fill(_dt);
}
catch (MySqlException ex)
{
    Console.WriteLine(ex.Message);
}
finally
{
    if (_conn != null) _conn.Close();
}
return _dt;
So my question is: I have around 4,000 rows in the table, and it takes a lot of time to move all the records from one table to another, especially across a network. Is there a way to make this faster?
I have been doing some reading, and there are several options for handling data from a DB: data adapters, readers, sets, and tables. Which one is fastest for this case? Should I be using a different method?
Two things I see: first, you're opening and closing your connection for each insert; that's usually your most expensive operation, so you won't want to do that. You can also try batching the statements rather than executing them one at a time. When you do that you have to be careful, because things could break in the middle of a large update, so you'll want to run things in a transaction. Without knowing too much about what your data structure looks like, I refactored your method to batch 100 at a time. First, create a little helper method called MoveItems that takes a connection and a list of IDs. Don't put a try/catch in it; you'll see why later.
Note: this method doesn't use parameters. I highly recommend you change it so that it does.
private static void MoveItems(MySqlConnection conn, List<string> moveList)
{
    string query = string.Format("insert into old_records select * from testes where id IN({0});" +
                                 " delete from testes where id IN({0})", string.Join(",", moveList.ToArray()));
    var cmd = new MySqlCommand
    {
        Connection = conn,
        CommandText = query
    };
    cmd.ExecuteNonQuery();
}
Next, change your main method to open the database connection once and then call this method 100 IDs at a time. The main method has a try/catch, so if the call to MoveItems throws an exception, it will be caught there.
// the using statement will call your dispose method
using (var conn = new MySqlConnection(connectionString))
{
    // open the connection and start the transaction
    conn.Open();
    var transaction = conn.BeginTransaction();
    // create a list to temporarily store the ids
    List<string> moves = new List<string>();
    try
    {
        // clean the list: trim and keep everything that's not null or empty
        var cleanList = list.Select(obj => obj.ToString().Split(',')[0].Trim()).Where(s => !string.IsNullOrEmpty(s));
        // loop over the clean list
        foreach (string id in cleanList)
        {
            // add the id to the move list
            moves.Add("'" + id + "'");
            // batch 100 at a time
            if (moves.Count % 100 == 0)
            {
                // when we reach 100, execute them and clear the list out
                MoveItems(conn, moves);
                moves.Clear();
            }
        }
        // the list count might not be n (mod 100), so move whatever is left
        if (moves.Count > 0)
        {
            MoveItems(conn, moves);
            moves.Clear();
        }
        // woohoo! commit the transaction
        transaction.Commit();
    }
    catch (MySqlException ex)
    {
        // oops! something happened; roll back everything
        transaction.Rollback();
        Console.WriteLine(ex.Message);
    }
    finally
    {
        conn.Close();
    }
}
You may have to play with that batch size of 100. I remember when I worked with MySQL a lot, I saw some performance differences between using IN and giving it a list of OR clauses (id = 'ID1' OR id = 'ID2' ...). But executing 40 or 80 statements will certainly perform better than 4,000, and opening the database connection once instead of 4,000 times should also give you much better performance.
I might be wrong, but there is not much you can do to make it faster; after all, you want to take the entire table's data and insert it into another table. The process will take some time if your table isn't small. However, you can try the code below. It should do the trick and save some time.
INSERT INTO TABLE2 (FIELDNAME_IN_TABLE2, FIELDNAME2_IN_TABLE2)
SELECT FIELDNAME_IN_TABLE1, FIELDNAME2_IN_TABLE1
FROM TABLE1
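Driven from C# (MySql.Data assumed), and including the delete half of the move from the question, one way to keep both statements atomic is a sketch like this (it assumes the connector allows multiple statements per command, which Connector/NET does by default):

using MySql.Data.MySqlClient;

using (var conn = new MySqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    using (var cmd = new MySqlCommand(
        "INSERT INTO old_records SELECT * FROM testes; DELETE FROM testes;", conn, tx))
    {
        cmd.ExecuteNonQuery(); // both statements run in one round trip
        tx.Commit();           // the move is atomic: all rows or none
    }
}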
I am guessing that not all SQL is created equal. I am diving into the world of DSNs and ODBC drivers in C# and having a bit of a go at it. I am trying to get all the tables in a database that is defined by a DSN; all I know about it is that it uses a Transoft ODBC driver. I can connect to it and get back tables using this code:
public void ConnectToData(String dsn)
{
    System.Data.Odbc.OdbcConnection conn =
        new System.Data.Odbc.OdbcConnection();
    //conn.ConnectionString = "FIL=MS Access;DSN=" + dsn;
    conn.ConnectionString = "DSN=" + dsn; // dsn equals "Company_Shared"
    try
    {
        conn.Open();
        MessageBox.Show("Connected!");
        lstBoxLogs.Items.Add("Connected");
        DataTable tableschema = conn.GetSchema("TABLES");
        DataSet set = tableschema.DataSet;
        // first column name
        for (int i = 0; i < tableschema.Columns.Count; i++)
        {
            lstBoxLogs.Items.Add(tableschema.Columns[i].ColumnName);
        }
        lstBoxLogs.Refresh();
        MessageBox.Show(tableschema.Columns.Count + " tables found");
    }
    catch (Exception ex)
    {
        MessageBox.Show("Failed to connect to data source: " + ex.GetBaseException().Message);
    }
    finally
    {
        conn.Close();
    }
}
It connects fine and reports tables, but not the tables I know I am looking for in the database. What comes back is the following:
TABLE_QUALIFIER
TABLE_OWNER
TABLE_NAME
TABLE_TYPE
REMARKS
I am not sure how to get the actual table names from this information so I can just dump all the data in every table (which is all I want to do). Is this because I have to read up on what kind of SQL a Transoft database uses, and does that render the conn.GetSchema("TABLES") call useless?
Would this work?
using (DataTable tableschema = conn.GetSchema("TABLES"))
{
    // list each table's name (iterate the rows, not the schema columns)
    foreach (DataRow row in tableschema.Rows)
    {
        lstBoxLogs.Items.Add(row["TABLE_NAME"].ToString());
    }
}
EDIT: Fixed the code to not use the DataSet.
EDIT: Updated the code for future readers to implement the best practice of disposing of the DataTable.
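As a follow-up sketch for the "dump all the data in every table" goal (it assumes the Transoft driver accepts plain SELECT statements; identifier quoting rules can differ per driver):

using (DataTable tableschema = conn.GetSchema("TABLES"))
{
    foreach (DataRow row in tableschema.Rows)
    {
        string tableName = row["TABLE_NAME"].ToString();
        using (var da = new System.Data.Odbc.OdbcDataAdapter("SELECT * FROM " + tableName, conn))
        {
            var data = new DataTable(tableName);
            da.Fill(data); // one DataTable per source table
            // ... process data ...
        }
    }
}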