I am using a SqlDataAdapter to pull results from a stored procedure that takes up to 5 minutes to execute and return the result.
I am setting
da.SelectCommand.CommandTimeout = 1800;
but the timeout is not honored at run time; the call fails well before 1800 seconds have elapsed.
Any idea how to fix this timeout?
This is my code:
var cpdbconn = new SqlConnection(ConfigurationManager.ConnectionStrings["SQL"].ConnectionString);
using (SqlCommand cmd = new SqlCommand())
{
    cmd.Connection = cpdbconn;
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandText = readStoredProcedureName;
    using (SqlDataAdapter da = new SqlDataAdapter(cmd))
    {
        try
        {
            da.SelectCommand.CommandTimeout = 1800;
            da.Fill(dt);
            // Check whether the DataTable contains any rows
            if (dt != null && dt.Rows.Count > 0)
            {
                foreach (DataRow dataRow in dt.Rows)
                {
                    lstring.Add(Convert.ToString(dataRow["ServerName"]));
                }
            }
            // Join with "','" so the result can be used inside a nested query
            InnerQryResultStr = string.Join("','", lstring.ToArray());
            if (multinestedQry != null)
            {
                combinedQry = qryName;
                qryName = multinestedQry + "('" + InnerQryResultStr + "')";
            }
            else
            {
                qryName = qryName + "('" + InnerQryResultStr + "')";
            }
        }
        catch (SqlException e)
        {
            Logger.Log(LOGTYPE.Error, String.Format("Inserting data failed for server {0} with exception {1}", "DiscreteServerData", e.Message));
            if (e.Number == -2)
            {
                Logger.Log(LOGTYPE.Error, String.Format("Timeout occurred while executing SQL query / stored procedure for server {0}: {1}", "DiscreteServerData", e.Message));
            }
            strMsg = e.Message;
            file.WriteLine(strMsg);
        }
    }
}
Did you try to set the timeout on your SqlCommand? cmd.CommandTimeout = 300;
With database operations, there is a whole bunch of timeouts to consider (see the sketch after this list):
The Command has a timeout.
The connection has a timeout.
Every layer of the Networking part has a timeout.
The server on the other end has a timeout.
The server on the other end might have slow loris protection.
The locks and transaction might time out (because at some point, somebody else might want to work with that table too).
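To make the first two concrete, here is a minimal sketch, reusing connectionString and readStoredProcedureName from the question: the connection string's Connect Timeout only governs how long opening the connection may take, while CommandTimeout governs the execution of each individual command, so a long-running procedure needs the latter raised.
using System.Data;
using System.Data.SqlClient;

// Connect Timeout covers only SqlConnection.Open(); CommandTimeout (in seconds)
// covers the execution of this one command. 0 means "wait indefinitely".
var csb = new SqlConnectionStringBuilder(connectionString) { ConnectTimeout = 30 };
using (var conn = new SqlConnection(csb.ConnectionString))
using (var cmd = new SqlCommand(readStoredProcedureName, conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandTimeout = 1800;
    using (var da = new SqlDataAdapter(cmd))
    {
        var dt = new DataTable();
        da.Fill(dt); // Fill opens and closes the connection itself if it is closed
    }
}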
According to your comments, your stored procedure returns "millions of records", which is probably where the issue lies. That is not a usable query. That is so much data it exceeds being a mere network or DB problem - and possibly becomes a memory problem on the client.
It is a common mistake to retrieve too much, even if I cannot remember anything quite on this scale (the biggest I have seen was in the 100k's). There is no way for a user to process this kind of information, so there has to be more filtering, pagination and the like. Never do those steps in the client. At best you transfer useless amounts of data over the network. At worst you run into timeouts and concurrency issues. Always do as much filtering, pagination, etc. in the query.
You want to retrieve as little data as possible overall. Bulk operations like merging, backup, etc. should generally be done in the DBMS. If you move them to the client, you only add another layer of failure and two network trips for the data.
For a better answer, I will need more precise information of the problem.
Related
I have a handheld device that connects to a SQL Server database, reads the SQL Server data and copies it into the SQL Compact database located on the device. This is my code:
public void InsertData() // Function insert data into SQL commapct database
{
dt = new DataTable();
dt = SqlServer_GetData_dt("Select id_price, price, id_item from prices", SqlCeConnection); // get data from SQL Server
if (dt.Rows.Count > 0)
{
for (int i = 0; i < dt.Rows.Count; i++)
{
string sql = "";
sql = "insert into prices" +
" ( id_prices, price,id_item) values('"
+ dt.Rows[i]["id_price"].ToString().Trim() + "', '"
+ dt.Rows[i]["price"].ToString().Trim() + "', '"
+ dt.Rows[i]["id_item"].ToString().Trim() + "')";
obj.SqlCE_WriteData_bit(sql, connection.ConnectionString);//insert into sql compact
}
}
}
public DataTable SqlServer_GetData_dt(string query, string conn)
{
try
{
DataTable dt = new DataTable();
string SqlCeConnection = conn;
SqlConnection sqlConnection = new SqlConnection(SqlCeConnection);
sqlConnection.Open();
{
SqlDataReader darSQLServer;
SqlCommand cmdCESQLServer = new SqlCommand();
cmdCESQLServer.Connection = sqlConnection;
cmdCESQLServer.CommandType = CommandType.Text;
cmdCESQLServer.CommandText = query;
darSQLServer = cmdCESQLServer.ExecuteReader();
dt.Load(darSQLServer);
sqlConnection.Close();
}
return dt;
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
DataTable dt = new DataTable();
return dt;
}
}
public object SqlCE_WriteData_bit(string query, string conn)
{
try
{
string SqlCeConnection = conn;
SqlCeConnection sqlConnection = new SqlCeConnection(SqlCeConnection);
if (sqlConnection.State == ConnectionState.Closed)
{
sqlConnection.Open();
}
SqlCeCommand cmdCESQLServer = new SqlCeCommand();
cmdCESQLServer.Connection = sqlConnection;
cmdCESQLServer.CommandType = CommandType.Text;
cmdCESQLServer.CommandText = query;
object i = cmdCESQLServer.ExecuteScalar();
sqlConnection.Close();
return i;
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
return 0;
}
}
This all works fine, but the problem is that it is very slow. I have 20,000 rows that need to be inserted into the SQL Compact database.
Is there any way to make the insert faster?
Thanks.
Aside from the obvious poor usage of the connection for every call, you can greatly improve things by eliminating the query processor altogether. That means don't use SQL. Instead, open the destination table with TableDirect and a SqlCeResultSet, then iterate through the source data (a DataTable is a bad idea, but that's a completely different thing) and use a series of CreateRecord, SetValues and Insert calls.
A pretty good example can be found here (though again, I'd use SetValues to set the entire row, not each individual field).
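A minimal sketch of that approach, assuming dt is the source DataTable and its column order matches the prices table (ceConnString is a placeholder name):
using System.Data;
using System.Data.SqlServerCe;

using (var conn = new SqlCeConnection(ceConnString))
{
    conn.Open();
    using (var cmd = conn.CreateCommand())
    {
        cmd.CommandType = CommandType.TableDirect; // bypass the query processor
        cmd.CommandText = "prices";                // table name, not SQL
        using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable))
        {
            foreach (DataRow row in dt.Rows)
            {
                SqlCeUpdatableRecord rec = rs.CreateRecord();
                rec.SetValues(row.ItemArray); // set the whole row at once
                rs.Insert(rec);
            }
        }
    }
}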
Reuse your connection and don't create a new connection for every INSERT statement.
Instead of passing a connection string to your SqlCE_WriteData_bit method, create the connection once in the InsertData method, and pass the connection object to SqlCE_WriteData_bit.
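As a rough sketch of that reshaping (the column types and sizes are guesses; the point is one open connection and one prepared, parameterized command reused for every row):
using System.Data;
using System.Data.SqlServerCe;

private static void InsertPrices(DataTable dt, string ceConnString)
{
    using (var conn = new SqlCeConnection(ceConnString))
    using (var cmd = new SqlCeCommand(
        "insert into prices (id_price, price, id_item) values (@id_price, @price, @id_item)", conn))
    {
        conn.Open();
        cmd.Parameters.Add("@id_price", SqlDbType.NVarChar, 50);
        cmd.Parameters.Add("@price", SqlDbType.NVarChar, 50);
        cmd.Parameters.Add("@id_item", SqlDbType.NVarChar, 50);
        cmd.Prepare(); // compile the plan once, execute it 20k times

        foreach (DataRow row in dt.Rows)
        {
            cmd.Parameters["@id_price"].Value = row["id_price"].ToString().Trim();
            cmd.Parameters["@price"].Value = row["price"].ToString().Trim();
            cmd.Parameters["@id_item"].Value = row["id_item"].ToString().Trim();
            cmd.ExecuteNonQuery();
        }
    }
}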
Put all the data into a DataTable and then use a SqlCeDataAdapter to save all the data with one call to Update. You may have to fiddle with the UpdateBatchSize to get the best performance.
Looking more closely, I see that you already have a DataTable. Looping through it yourself is therefore ludicrous. Just note that, as you have it, the RowState of every DataRow will be Unchanged, so they will not be inserted. I think that you can call DataTable.Load such that all RowState values are left as Added but, if not, then use a SqlDataAdapter instead, set AcceptChangesDuringFill to false and call Fill.
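Sketched out, with serverConnString and ceConnString as placeholder names:
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlServerCe;

var dt = new DataTable();

// Fill from SQL Server, leaving every row's RowState as Added
using (var serverConn = new SqlConnection(serverConnString))
using (var da = new SqlDataAdapter("select id_price, price, id_item from prices", serverConn))
{
    da.AcceptChangesDuringFill = false;
    da.Fill(dt);
}

// One Update call pushes all the Added rows into SQL Compact
using (var ceConn = new SqlCeConnection(ceConnString))
using (var ceDa = new SqlCeDataAdapter("select id_price, price, id_item from prices", ceConn))
using (var builder = new SqlCeCommandBuilder(ceDa))
{
    ceDa.InsertCommand = builder.GetInsertCommand();
    ceDa.Update(dt);
}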
Is there any way to make the insert faster?
Yes, but it probably won't be "acceptably fast" when we're talking about inserting 20k rows.
The problem I can see is that you are opening a connection for every single row you retrieved via SqlServer_GetData_dt; that is, you open a connection to insert data 20k times... and opening a connection is an expensive operation. You should build the whole query using a StringBuilder object and then execute all the insert statements in one batch.
This will bring some performance gains, but don't expect it to solve your problem; inserting 20k rows will still take some time, especially if indexes need to be rebuilt. My suggestion is that you should thoroughly analyse your requirements and be a bit smarter about how you approach it. Options are:
bundle a pre-populated database if possible so your app doesn't have to suffer the population process performance penalties
if not, run the insert process in the background and access the data only when the pre-population is finished
I have a data reader object which reads through, say, 4 rows, and I will be looping through the rows. While reading the third row, I insert a row into the same table; will my data reader be able to read the newly inserted row? If not, how can I achieve this functionality?
Here is the code I tried.
AseCommand sessionCmd = null;
//AseCommand selectCmd = null;
AseCommand insertCmd = null;
AseDataReader reader = null;
string retCode = "Nothing returned from the Server";
string insertStatement;
AseConnection conn = null;
conn = new AseConnection("Data Source='" + host + "';Port='" + port + "';UID='" + user + "';PWD='" + password + "';Database=" + db + ";");
conn.Open();
sessionCmd = new AseCommand("select * from dbo.DummyTable", conn);
try
{
reader = sessionCmd.ExecuteReader();
int count = 0;
while (reader.Read())
{
Console.WriteLine(reader.GetString(0));
count++;
if (count == 3)
{
//insert into table
insertCmd = new AseCommand("insert into DummyTable values (5)", conn);
insertCmd.ExecuteReader();
}
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
I found your question interesting, because I thought the context would be along the lines of: if a row were being added from another process, would the current reader see it?
If I had written your test and it did not produce the result I expected, I might come to Stack Overflow and ask: is it possible to read the new row and I just did not do it right? Or maybe even get an answer like: it might work sometimes but not always, because of xyz...
However, you got answer three, which was 'don't bother us, we didn't sleep well last night.'
In general terms, a DataReader is a read-only, forward-only stream over a tabular result set. It is streaming the results of a query, not monitoring a table... so the query results have already been produced by the database and are served as you read, but not updated.
Incidentally, what you did was insert a new row using an ExecuteReader call, which returned a new reader... it doesn't modify the outer reader you are looping through. You could read the results of the insert from the new reader, although it's not quite clear why you would want to.
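For completeness, an insert returns no result set, so the idiomatic call is ExecuteNonQuery; a minimal correction of that line:
// Inserts return no result set, so ExecuteNonQuery is the idiomatic call
insertCmd = new AseCommand("insert into DummyTable values (5)", conn);
int rowsAffected = insertCmd.ExecuteNonQuery(); // 1 if the insert succeeded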
Although probably beyond the scope of your test, you might be interested in the concept of Multiple Active Result Sets (MARS), which does allow you to go back and forth between two result sets on the same connection.
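MARS is a SQL Server feature (the ASE provider used here does not offer it); a minimal SqlClient sketch, with an illustrative connection string, showing what enabling it permits:
using System.Data.SqlClient;

// MultipleActiveResultSets=True lets this one connection run other commands
// while a reader is still open (otherwise ADO.NET throws).
var cs = "Data Source=.;Initial Catalog=Test;Integrated Security=true;" +
         "MultipleActiveResultSets=True";
using (var conn = new SqlConnection(cs))
{
    conn.Open();
    using (var select = new SqlCommand("select * from dbo.DummyTable", conn))
    using (var reader = select.ExecuteReader())
    {
        if (reader.Read())
        {
            // Legal under MARS; the already-open reader still won't
            // pick up the newly inserted row.
            using (var insert = new SqlCommand("insert into DummyTable values (5)", conn))
            {
                insert.ExecuteNonQuery();
            }
        }
    }
}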
I am trying to retrieve a list of records from one table and write it to another table. I've used a simple query to retrieve the values into a SqlDataReader, then loaded them into a DataTable. Using a DataTableReader, I go through the entire data set saved in the DataTable. The problem is that while reading each record I try to insert those values into another table using a stored procedure, but it only inserts the first row of values; from the second row onward it throws an exception saying "procedure or function has too many arguments specified".
string ConStr = ConfigurationManager.ConnectionStrings["ConString"].ConnectionString;
SqlConnection NewCon = new SqlConnection(ConStr);
NewCon.Open();
SqlCommand NewCmd3 = NewCon.CreateCommand();
NewCmd3.CommandType = CommandType.Text;
NewCmd3.CommandText ="select * from dbo.Request_List where group_no ='" +group_no+ "'";
NewCon.Close();
NewCon.Open();
SqlDataReader dr = (SqlDataReader)NewCmd3.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr);
DataTableReader reader = new DataTableReader(dt);
NewCmd3.Dispose();
NewCon.Close();
NewCon.Open();
SqlCommand NewCmdGrpReqSer = NewCon.CreateCommand();
NewCmdGrpReqSer.CommandType = CommandType.StoredProcedure;
NewCmdGrpReqSer.CommandText = "Voucher_Request_Connection";
if (reader.HasRows)
{
int request_no = 0;
while (reader.Read())
{
request_no = (int)reader["request_no"];
NewCmdGrpReqSer.Parameters.Add("@serial_no", serial_no);
NewCmdGrpReqSer.Parameters.Add("@request_no", request_no);
try
{
NewCmdGrpReqSer.ExecuteNonQuery();
MessageBox.Show("Connection Updated");//just to check the status.tempory
}
catch (Exception xcep)
{
MessageBox.Show(xcep.Message);
}
MessageBox.Show(request_no.ToString());//
}
NewCmdGrpReqSer.Dispose();
NewCon.Close();
}
Any Solutions ?
As @Sparky suggests, the problem is that you continue to add parameters to the insertion command. There are several other ways in which the code could be improved, however. These improvements would remove the need to clear the parameters and would help to make sure you don't leave disposable resources undisposed.
First, use the using statement for your disposable objects. This removes the need for the explicit Close (by the way, only one of Close/Dispose is needed for the connection, as I believe Dispose calls Close). Second, simply create a new command for each insertion. This will prevent complex logic around resetting the parameters and, possibly, handling error states for the command. Third, check the results of the insertion to make sure it succeeds. Fourth, explicitly catch a SqlException - you don't want to accidentally hide unexpected errors in your code. If it's necessary to make sure all exceptions don't bubble up, consider using multiple exception handlers and "doing the right thing" for each case - say, logging with different error levels or categories, aborting the entire operation rather than just this insert, etc. Lastly, I would use better variable names. In particular, avoid appending numeric identifiers to generic variable names. This makes the code harder to understand, both for others and for yourself after you've let the code sit for a while.
Here's my version. Note there are several other things that I might do, such as turn the string literals into appropriately named constants, introduce a strongly-typed wrapper around the ConfigurationManager object to make testing easier, and remove the underscores from the variable names in favour of camelCase. Those are more stylistic in nature, but you might want to consider them as well.
var connectionString = ConfigurationManager.ConnectionStrings["ConString"].ConnectionString;
using (var newConnection = new SqlConnection(connectionString))
{
newConnection.Open();
using (var selectCommand = newConnection.CreateCommand())
{
selectCommand.CommandType = CommandType.Text;
selectCommand.CommandText = "select request_no from dbo.Request_List where group_no = @groupNumber";
selectCommand.Parameters.AddWithValue("@groupNumber", group_no);
using (var reader = selectCommand.ExecuteReader())
{
while (reader.HasRows && reader.Read())
{
using (var insertCommand = newConnection.CreateCommand())
{
insertCommand.CommandType = CommandType.StoredProcedure;
insertCommand.CommandText = "Voucher_Request_Connection";
var request_no = (int)reader["request_no"];
insertCommand.Parameters.Add("@serial_no", serial_no);
insertCommand.Parameters.Add("@request_no", request_no);
try
{
if (insertCommand.ExecuteNonQuery() == 1)
{
MessageBox.Show("Connection Updated");//just to check the status.tempory
}
else
{
MessageBox.Show("Connection was not updated " + request_no);
}
}
catch (SqlException xcep)
{
MessageBox.Show(xcep.Message);
}
MessageBox.Show(request_no.ToString());//
}
}
}
}
}
Try clearing your parameters each time...
while (reader.Read())
{
request_no = (int)reader["request_no"];
// Add this line
NewCmdGrpReqSer.Parameters.Clear();
NewCmdGrpReqSer.Parameters.Add("@serial_no", serial_no);
NewCmdGrpReqSer.Parameters.Add("@request_no", request_no);
try
{
I have a question regarding performance. This is my scenario.
I have a MySQL database and an application that from time to time moves records, according to the criteria from a query, from one table to another. The way this is done is:
foreach (object obj in list)
{
    string id = obj.ToString().Split(',')[0].Trim();
    string query = " insert into old_records select * from testes where id='" +
        id + "';" + " delete from testes where id='" + id + "'";
    DB _db = new DB();
    _db.DBConnect(query);
}
This is the way I connect to the database:
DataTable _dt = new DataTable();
MySqlConnection _conn = new MySqlConnection(connectionString);
MySqlCommand _cmd = new MySqlCommand
{
Connection = _conn,
CommandText = query
};
MySqlDataAdapter _da = new MySqlDataAdapter(_cmd);
MySqlCommandBuilder _cb = new MySqlCommandBuilder(_da);
_dt.Clear();
try
{
_conn.Open();
_cmd.ExecuteNonQuery();
_da.Fill(_dt);
}
catch (MySqlException ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (_conn != null) _conn.Close();
}
return _dt;
So my question is: I have about 4,000 rows in the table, and it takes a lot of time to move all the records from one table to the other, especially across a network. Is there a way to make this faster?
I have been doing some reading and there are several options for handling data from the DB, like data adapters, readers, sets, and tables. Which one is fastest for this case? Should I be using a different method?
Two things I see: first, you're opening and closing your connection for each insert; that's usually your most expensive operation, so you won't want to do that. Second, you can try batching the statements rather than executing them one at a time. When you do that you have to be careful, because things could break in the middle of a large update, so you will want to do the work in a transaction. Without knowing too much about what your data structure looks like, I refactored your method to batch 100 ids at a time. First create a little helper method called MoveItems that takes a connection and a list of ids. Don't put a try/catch in this; you'll see why later.
Note: This method doesn't use parameters; I highly recommend you change it to do that (see the parameterized variant after the helper below).
private static void MoveItems(MySqlConnection conn, List<string> moveList)
{
string query = string.Format("insert into old_records select * from testes where id IN({0});" + " delete from testes where id IN({0})", string.Join(",", moveList.ToArray()));
var cmd = new MySqlCommand
{
Connection = conn,
CommandText = query
};
cmd.ExecuteNonQuery();
}
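As mentioned in the note above, here is a hedged sketch of a parameterized variant (the @p0..@pN parameter names are invented; it assumes the ids are added to the list unquoted):
using System.Collections.Generic;
using System.Linq;
using MySql.Data.MySqlClient;

private static void MoveItemsParameterized(MySqlConnection conn, List<string> moveList)
{
    // Build one named parameter per id so the values never touch the SQL text
    var names = moveList.Select((_, i) => "@p" + i).ToArray();
    string inList = string.Join(",", names);
    string query = string.Format(
        "insert into old_records select * from testes where id IN({0});" +
        "delete from testes where id IN({0})", inList);

    using (var cmd = new MySqlCommand(query, conn))
    {
        for (int i = 0; i < moveList.Count; i++)
        {
            cmd.Parameters.AddWithValue(names[i], moveList[i]);
        }
        cmd.ExecuteNonQuery();
    }
}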
Next, change your main method to open the database connection once and then call this helper 100 ids at a time. The main method has a try/catch, so if the call to MoveItems throws an exception it will be caught there.
// the using statement will call your dispose method
using (var conn = new MySqlConnection(connectionString))
{
// open the connection and start the transaction
conn.Open();
var transaction = conn.BeginTransaction();
// create a list to temporarily store the ids
List<string> moves = new List<string>();
try
{
// clean the list, do the trim and get everything that's not null or empty
var cleanList = list.Select(obj => obj.ToString().Split(',')[0].Trim()).Where(s => !string.IsNullOrEmpty(s));
// loop over the clean list
foreach (string id in cleanList)
{
// add the id to the move list
moves.Add("'" + id + "'");
// batch 100 at a time
if (moves.Count % 100 == 0)
{
// when I reach 100 execute them and clear the list out
MoveItems(conn, moves);
moves.Clear();
}
}
// The list count might not be n (mod 100) therefore see if there's anything left
if (moves.Count > 0)
{
MoveItems(conn, moves);
moves.Clear();
}
// woohoo! commit the transaction
transaction.Commit();
}
catch (MySqlException ex)
{
// oops! something happened roll back everything
transaction.Rollback();
Console.WriteLine(ex.Message);
}
finally
{
conn.Close();
}
}
You may have to play with that 100 number. I remember when I worked with MySQL a lot, I saw some performance differences between using an IN and giving it a list of OR clauses (id = 'ID1' OR id = 'ID2' ...). But executing 40 or 80 statements will certainly perform better than 4,000, and opening the database connection once instead of 4,000 times should also give you much better performance.
I might be wrong, but there is not much you can do to make it faster. After all, you want to get the entire table's data and insert it into another table. The process will take some time if your table isn't small. However, you can try the code below. It should do the trick and save some time.
INSERT INTO TABLE2 (FIELDNAME_IN_TABLE2, FIELDNAME2_IN_TABLE2)
SELECT FIELDNAME_IN_TABLE1, FIELDNAME2_IN_TABLE1
FROM TABLE1
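Applied to this scenario from C#, a minimal sketch (connectionString and id come from the question's code; the two-statement batch is accepted by Connector/NET, as the original code already relies on):
using MySql.Data.MySqlClient;

// Copy the matching rows and delete them in one round trip, inside a transaction
using (var conn = new MySqlConnection(connectionString))
using (var cmd = new MySqlCommand(
    "insert into old_records select * from testes where id = @id;" +
    "delete from testes where id = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", id);
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        cmd.Transaction = tx;
        cmd.ExecuteNonQuery();
        tx.Commit();
    }
}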
My C# code below checks a SQL database to see if a record matches a ClientID and a user name. If 15 or more matching records are found, the CPU on my Windows 2008 server peaks at about 78% while the code below executes. The SQL Server 2008 database and software are located on another server, so the problem is not SQL Server spiking the CPU. The problem is with my software that is executing the code below; I can see the executable containing this C# code spike to 78% while the query runs and the records are found.
Can someone please tell me if there is something wrong with my code that is causing the CPU to spike when 15 or more matching records are found? Can you also please tell/show me how to optimize my code?
Update: If it finds 10 records, the CPU only spikes at 2-3 percent. It is only when it finds 15 or more records that the CPU spikes at 78% for two to three seconds.
//ClientID[0] will contain a ClientID of 10 characters
//output[0] will contain a User Name
char[] trimChars = { ' ' };
using (var connection = new SqlConnection(string.Format(GlobalClass.SQLConnectionString, "History")))
{
connection.Open();
using (var command = new SqlCommand())
{
command.CommandText = string.Format(@"SELECT Count(*) FROM Filelist WHERE [ToAccountName] = '" + output[0] + @"'");
command.Connection = connection;
var rows = (int) command.ExecuteScalar();
if (rows >= 0)
{
command.CommandText = string.Format(@"SELECT * FROM Filelist WHERE [ToAccountName] = '" + output[0] + @"'");
using (SqlDataReader reader = command.ExecuteReader())
{
if (reader.HasRows)
{
while (reader.Read())
{
//Make sure ClientID does NOT exist in the ClientID field
if (reader["ClientID"].ToString().TrimEnd(trimChars).IndexOf(ClientID[0]) !=
-1)
{
//If we are here, then do something
}
}
}
reader.Close();
reader.Dispose();
}
}
// Close the connection
if (connection != null)
{
connection.Close();
}
}
}
You can decrease the number of database accesses from two to one if you remove the first query; it is not necessary.
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = connection.CreateCommand())
{
command.CommandText = "SELECT ClientID FROM dbo.Filelist WHERE ToAccountName = #param"; // note single column in select clause
command.Parameters.AddWithValue("#param", output[0]); // note parameterized query
connection.Open();
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read()) // reader.HasRows is not necessary here
{
// logic goes here
// but it's better to perform it on data layer too
// or return all clients first, then perform client-side logic
yield return reader.GetString(0);
}
} // note that using block calls Dispose()/Close() automatically
}
Change this:
SELECT * FROM Filelist
To this:
SELECT ClientID FROM Filelist
And check for performance.
I suspect there is a blob field in your select.
Also, select * is not recommended; list only the fields you are actually interested in.
Nothing looks obviously CPU intensive, but one problem does stand out.
You are running a query to count how many records there are
"SELECT Count(*) FROM Filelist WHERE [ToAccountName] = '" + output[0] + #"'"
Then, if more than 0 is returned, you are running another query to get the data.
"SELECT * FROM Filelist WHERE [ToAccountName] = '" + output[0] + #"'"
This is redundant. Get rid of the first query, and just use the second one, checking to see if the reader has data. You can also get rid of the HasRows call and just do
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read())
{
}
}
Please consider what has already been said about parameterized queries.
Beside that, I think that the only big issue could arise in the following block:
while (reader.Read())
{
//Make sure ClientID does NOT exist in the ClientID field
if (reader["ClientID"].ToString().TrimEnd(trimChars).IndexOf(ClientID[0]) != -1)
{
//If we are here, then do something
}
}
So try to just cache your reader.Read() data in some local variable, releasing the SQL resources as soon as possible, and then work on the data you just retrieved. E.g.:
List<string> myRows = new List<string>();
while (reader.Read())
{
myRows.Add(reader["ClientID"].ToString());
}
// quit the using clause
// now elaborate on what you got in myRows
There is nothing in the code to indicate a performance problem.
What does SQL Profiler show?
(Both in terms of query plan, and server resources used.)
Edit: To make this clearer: you have one measurement that might indicate an issue. You now need to measure more deeply to understand if it really is a problem, only you can do this (no one else has access to the hardware).
I strongly recommend that you get a copy of dotTrace from JetBrains.
At the very least, profiling the client code will help you identify/eliminate the source of the CPU spike.
I recommend using parameters as suggested, however, I have seen performance problems where the type of the string column does not match the C# string. In these cases, I suggest specifying the type explicitly.
Like this:
command.CommandText = "SELECT ClientID FROM dbo.Filelist WHERE ToAccountName = #accountName";
command.Parameters.Add("#accountName", SqlDbType.NVarChar, 16, output[0]);
Or this:
SqlParameter param = command.Parameters.Add(
    "@accountName", SqlDbType.NVarChar);
param.Size = 16; // optional
param.Value = output[0];