Why is my code maxing out the CPU while querying the database? - C#

My C# code below checks a SQL database to see if a record matches a ClientID and a User Name. If 15 or more matching records are found, the CPU on my Windows 2008 server peaks at about 78% while the code below executes. The SQL Server 2008 database and software are located on another server, so the problem is not SQL Server spiking the CPU. The problem is with my C# software that is executing the code below: I can see the executable containing this code spike to 78% while the query runs and the records are found.
Can someone please tell me if there is something wrong with my code that is causing the CPU to spike when 15 or more matching records are found? Can you also please tell/show me how to optimize my code?
Update: If it finds 10 records, the CPU only spikes at 2-3 percent. It is only when it finds 15 or more records that the CPU spikes at 78% for two to three seconds.
//ClientID[0] will contain a ClientID of 10 characters
//output[0] will contain a User Name
char[] trimChars = { ' ' };
using (var connection = new SqlConnection(string.Format(GlobalClass.SQLConnectionString, "History")))
{
connection.Open();
using (var command = new SqlCommand())
{
command.CommandText = string.Format(#"SELECT Count(*) FROM Filelist WHERE [ToAccountName] = '" + output[0] + #"'");
command.Connection = connection;
var rows = (int) command.ExecuteScalar();
if (rows >= 0)
{
command.CommandText = string.Format(#"SELECT * FROM Filelist WHERE [ToAccountName] = '" + output[0] + #"'");
using (SqlDataReader reader = command.ExecuteReader())
{
if (reader.HasRows)
{
while (reader.Read())
{
//Make sure ClientID does NOT exist in the ClientID field
if (reader["ClientID"].ToString().TrimEnd(trimChars).IndexOf(ClientID[0]) !=
-1)
{
//If we are here, then do something
}
}
}
reader.Close();
reader.Dispose();
}
}
// Close the connection
if (connection != null)
{
connection.Close();
}
}
}

You can decrease the number of database accesses from 2 to 1 by removing the first query; it is not necessary.
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = connection.CreateCommand())
{
command.CommandText = "SELECT ClientID FROM dbo.Filelist WHERE ToAccountName = #param"; // note single column in select clause
command.Parameters.AddWithValue("#param", output[0]); // note parameterized query
connection.Open();
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read()) // reader.HasRows is hardly necessary here
{
// logic goes here
// but it's better to perform it on data layer too
// or return all clients first, then perform client-side logic
yield return reader.GetString(0);
}
} // note that using block calls Dispose()/Close() automatically
}
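One caveat worth stating: yield return only compiles inside an iterator method, so the snippet above assumes it lives in a method returning IEnumerable&lt;string&gt;. A minimal sketch of such a wrapper (the method name and parameters are illustrative, not from the original post; requires using System.Collections.Generic and System.Data.SqlClient):
private static IEnumerable<string> GetClientIdsForAccount(string connectionString, string accountName)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = connection.CreateCommand())
    {
        command.CommandText = "SELECT ClientID FROM dbo.Filelist WHERE ToAccountName = @param";
        command.Parameters.AddWithValue("@param", accountName);
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // each row is handed back lazily to the caller
                yield return reader.GetString(0);
            }
        }
    }
}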

Change this:
SELECT * FROM Filelist
To this:
SELECT ClientID FROM Filelist
And check for performance.
I suspect there is a blob field in your select.
Also, SELECT * is not recommended; list the exact fields you are interested in.

Nothing looks obviously CPU intensive, but one problem does stand out.
You are running a query to count how many records there are
"SELECT Count(*) FROM Filelist WHERE [ToAccountName] = '" + output[0] + #"'"
Then, if more than 0 is returned, you are running another query to get the data.
"SELECT * FROM Filelist WHERE [ToAccountName] = '" + output[0] + #"'"
This is redundant. Get rid of the first query, and just use the second one, checking to see if the reader has data. You can also get rid of the HasRows call and just do
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read())
{
}
}

Please consider what has already been said about parameterized queries.
Besides that, I think the only big issue could arise in the following block:
while (reader.Read())
{
//Make sure ClientID does NOT exist in the ClientID field
if (reader["ClientID"].ToString().TrimEnd(trimChars).IndexOf(ClientID[0]) != -1)
{
//If we are here, then do something
}
}
So try to just cache your reader.Read() data in a local variable, releasing the SQL resources as soon as possible, then work on the data you just retrieved. E.g.:
List<string> myRows = new List<string>();
while (reader.Read())
{
myRows.Add(reader["ClientID"].ToString());
}
// quit the using block
// now process what you got in myRows (a sketch of that follows below)
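For completeness, a hedged sketch of what that post-loop processing could look like, simply reusing the ClientID check from the question (trimChars and ClientID[0] come from the asker's snippet; the actual "do something" logic is application-specific):
foreach (string clientId in myRows)
{
    // Make sure ClientID does NOT exist in the ClientID field
    if (clientId.TrimEnd(trimChars).IndexOf(ClientID[0]) != -1)
    {
        // If we are here, then do something
    }
}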

There is nothing in the code to indicate a performance problem.
What does SQL Profiler show?
(Both in terms of query plan, and server resources used.)
Edit: To make this clearer: you have one measurement that might indicate an issue. You now need to measure more deeply to understand if it really is a problem, only you can do this (no one else has access to the hardware).

I strongly recommend that you get a copy of dotTrace from JetBrains.
At the very least, profiling the client code will help you identify/eliminate the source of the CPU spike.

I recommend using parameters as suggested, however, I have seen performance problems where the type of the string column does not match the C# string. In these cases, I suggest specifying the type explicitly.
Like this:
command.CommandText = "SELECT ClientID FROM dbo.Filelist WHERE ToAccountName = #accountName";
command.Parameters.Add("#accountName", SqlDbType.NVarChar, 16, output[0]);
Or this:
SqlParameter param = command.Parameters.Add(
"@accountName", SqlDbType.NVarChar);
param.Size = 16; //optional
param.Value = output[0];
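As a follow-up, if the ToAccountName column is actually varchar rather than nvarchar, it is worth matching the parameter type exactly; an nvarchar parameter compared against a varchar column can force an implicit conversion on the server. A hedged sketch (the column type and the size of 50 are assumptions, not taken from the question):
// match the column's actual type to avoid an implicit conversion in the WHERE clause
SqlParameter param = command.Parameters.Add("@accountName", SqlDbType.VarChar, 50);
param.Value = output[0];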

Related

No value given for one or more required parameters. C#

I have a C# program and I wrote code to subtract the amount of sold products from the amount of those products in stock (an Access data table), so I used this code:
foreach (DataGridViewRow r in dataGridView1.Rows)
{
OleDbCommand command = new OleDbCommand("Update BookInserting set Amount = Amount - '" + Convert.ToInt32(r.Cells[1].Value) + "' where BookName = " + r.Cells[0].Value.ToString() + "", connection);
connection.Open();
command.ExecuteNonQuery();
connection.Close();
}
but when I run it, it gives me this error:
(No value given for one or more required parameters).
I tried solving it several times but failed, I hope you can help me solve it.
Thanks in advance.
Your problem is probably caused by Access being unable to recognize some part of your query as an object of the underlying table (or the table itself).
This problem, and a more serious one called SQL injection, could be avoided by using parameters. (And as a side benefit, your code becomes a lot clearer without all those string concatenations.)
So let's try to change your code in this way:
// String sql with parameters placeholders
string cmdText = #"Update BookInserting
set Amount = Amount - #amt
where BookName = #book";
connection.Open();
// Build the command just one time outside the loop and
// add the two required parameters (without a value, and in the exact order
// expected by the placeholders)
OleDbCommand command = new OleDbCommand(cmdText, connection);
command.Parameters.Add("#amt", OleDbType.Integer);
command.Parameters.Add("#book", OleDbType.VarWChar);
// Inside the loop just change the parameters values and execute
foreach (DataGridViewRow r in dataGridView1.Rows)
{
// If the cell with the parameter for the WHERE
// clause is invalid skip the update
if(!r.IsNewRow && r.Cells[0].Value != null
&& r.Cells[0].Value.ToString() != "")
{
cmd.Parameters["#amt"].Value = Convert.ToInt32(r.Cells[1].Value);
cmd.Parameters["#book"].Value = r.Cells[0].Value.ToString();
command.ExecuteNonQuery();
}
}
connection.Close();
Final note. A connection object should be created each time you require it. From your code it is not clear if this is the case. Use the following pattern. (Using Statement)
using(OleDbConnection connection = new OleDbConnection(....))
{
... code that uses the connection ....
} // <- here the connection is closed and disposed

Getting Database values into variable form c#

I am developing a cricket simulation and I need to retrieve certain statistics from a player's data. I've got the following code.
public List<float> BattingData()
{
con.ConnectionString = ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString.ToString();
string query = "SELECT [INNS], [NOT OUTS], [AVG] FROM [" + batTeam + "] WHERE [Player] = '" + name + "';";
SqlCommand com = new SqlCommand(query, con);
con.Open();
using (SqlDataReader reader = com.ExecuteReader())
{
if(reader.HasRows)
{
while (reader.NextResult())
{
Innings = Convert.ToInt32(reader["INNS"]);
NotOuts = Convert.ToInt32(reader["NOT OUTS"]);
Avg = Convert.ToSingle(reader["AVG"]);
}
}
}
con.Close();
OutRatePG = (Innings = NotOuts) / Innings;
OutRatePB = OutRatePG / 240;
RunsPB = Avg / 240;
battingData.Add(OutRatePB);
battingData.Add(RunsPB);
return battingData;
}
The error that I am getting is that when I try to divide by 'Innings' it says cannot divide by zero, so I think the variables are being returned as zero and no data is being assigned to them.
This line is the issue:
while (reader.NextResult())
What this does is move the reader to the next resultset, ignoring the rest of the rows unread. To advance a reader to the next row, you need to call reader.Read() instead.
You have some other issues with your code:
You appear to have a separate table for each team. This is incorrect database design. You should create a Team table, with each team in it, and then foreign key your TeamResults table to it. Query it using INNER JOIN.
You are concatenating user-entered values to your query. This leaves you open to SQL injection attacks. Use parameters instead. (You cannot parameterize a table name, another reason you should do as above 1.)
You do not need to check for HasRows. If there are no rows, Read() will return false.
It looks like you only want one row. If that is the case you don't want a while(reader.Read()) loop, but an if(reader.Read()) instead. (If you only need a single value, you can refactor the code to use command.ExecuteScalar().) A minimal sketch combining these points follows this list.
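Here is a hedged sketch of points 2-4 applied to the original method, keeping the per-team table name for now since a table name cannot be parameterized (column and variable names are taken from the question):
string query = "SELECT [INNS], [NOT OUTS], [AVG] FROM [" + batTeam + "] WHERE [Player] = @player;";
using (SqlCommand com = new SqlCommand(query, con))
{
    com.Parameters.AddWithValue("@player", name);
    con.Open();
    using (SqlDataReader reader = com.ExecuteReader())
    {
        // Read() advances to the first row and returns false if there are none
        if (reader.Read())
        {
            Innings = Convert.ToInt32(reader["INNS"]);
            NotOuts = Convert.ToInt32(reader["NOT OUTS"]);
            Avg = Convert.ToSingle(reader["AVG"]);
        }
    }
    con.Close();
}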
Check whether any database records have a value of 0 for Innings.
You can also guard the calculation before performing any operation:
if (Innings > 0) { OutRatePG = (Innings - NotOuts) / Innings; }
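One more detail worth noting, assuming Innings and NotOuts are ints (they are filled with Convert.ToInt32 in the question): (Innings - NotOuts) / Innings is integer division and truncates to 0 or 1. A hedged sketch of the cast that keeps the fractional part:
if (Innings > 0)
{
    // cast one operand to float so the division is not truncated to an integer
    OutRatePG = (Innings - NotOuts) / (float)Innings;
}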

SqlDataAdapter object CommandTimeOut does not work in C#

I am using a SqlDataAdapter to pull a result from a stored procedure that takes up to 5 minutes to execute and return the result.
I am using a
da.SelectCommand.CommandTimeout = 1800;
setting, but the timeout does not work. The code does not actually honor the timeout; it fails earlier.
Any idea how to fix this timeout?
This is my code:
var cpdbconn = new SqlConnection(ConfigurationManager.ConnectionStrings["SQL"].ConnectionString);
using (SqlCommand cmd = new SqlCommand())
{
cmd.Connection = cpdbconn;
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = readStoredProcedureName;
using (SqlDataAdapter da = new SqlDataAdapter(cmd))
{
try
{
da.SelectCommand.CommandTimeout = 1800;
da.Fill(dt);
// Check datatable is null or not
if (dt != null && dt.Rows.Count > 0)
{
foreach (DataRow dataRow in dt.Rows)
{
lstring.Add(Convert.ToString(dataRow["ServerName"]));
}
}
// Add "','" in each row to convert the result to support nested query format
InnerQryResultStr = string.Join("','", lstring.ToArray());
if (multinestedQry != null)
{
combinedQry = qryName;
qryName = multinestedQry + "('" + InnerQryResultStr + "')";
}
else
{
qryName = qryName + "('" + InnerQryResultStr + "')";
}
}
catch (SqlException e)
{
Logger.Log(LOGTYPE.Error, String.Format("Inserting Data Failed for server {0} with Exception {1}", "DiscreteServerData", e.Message));
if(e.Number == -2)
{
Logger.Log(LOGTYPE.Error, String.Format("TimeOut occurred while executing SQL query / stored procedure ", "DiscreteServerData", e.Message));
}
strMsg = e.Message.ToString();
file.WriteLine(strMsg.ToString());
}
}
}
Did you try to set the timeout on your SqlCommand? cmd.CommandTimeout = 300;
With database operations, there is a whole bunch of timeouts to consider (see the sketch after this list):
The Command has a timeout.
The connection has a timeout.
Every layer of the Networking part has a timeout.
The server on the other end has a timeout.
The server on the other end might have slow loris protection.
The locks and transaction might time out (because at some point, somebody else might want to work with that table too).
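Of these, the first two are the ones you control directly from ADO.NET; a short sketch to illustrate the distinction (server, database, and timeout values are placeholders):
// connection timeout: how long ADO.NET waits while *opening* the connection (a connection string setting)
var conn = new SqlConnection("Data Source=myServer;Initial Catalog=myDb;Integrated Security=true;Connect Timeout=30");

// command timeout: how long ADO.NET waits for the *query* itself to finish
var cmd = new SqlCommand("dbo.MyLongProcedure", conn)
{
    CommandType = CommandType.StoredProcedure,
    CommandTimeout = 1800
};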
According to your comments, your stored procedure returns "millions of records", which is probably where the issue lies. That is not a usable query. That is so much data it exceeds being a mere network or DB problem - and possibly becomes a memory problem on the client.
It is a common mistake to retrieve too much, though I cannot remember anything quite on this scale (the biggest I have seen was in the 100k range). There is no way for a user to process this kind of information, so there has to be more filtering, pagination and the like. Never do those steps in the client. At best you transfer useless amounts of data over the network. At worst you run into timeouts and concurrency issues. Always do as much filtering, pagination, etc. in the query.
You want to retrieve as little data as possible overall. Bulk operations like merging, backup, etc. should generally be done in the DBMS. If you move them to the client, you only add another layer of failure and two network trips for the data. A sketch of query-side pagination follows.
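To make "do the filtering and pagination in the query" concrete, here is a hedged sketch assuming SQL Server 2012 or later (OFFSET/FETCH); the table, column, and paging variables are hypothetical, not from the question:
// fetch one page of results instead of millions of rows in a single Fill
cmd.CommandType = CommandType.Text;
cmd.CommandText = @"SELECT ServerName
                    FROM dbo.ServerData
                    ORDER BY ServerName
                    OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY";
cmd.Parameters.AddWithValue("@offset", pageIndex * pageSize);
cmd.Parameters.AddWithValue("@pageSize", pageSize);
da.Fill(dt); // fills only the current page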
For a better answer, I will need more precise information of the problem.

Best way to move data between tables in MYSQL

I have a question regarding performance. This is my scenario.
I have a MySQL database and an application that from time to time moves records, according to the criteria from a query, from one table to another. The way this is done is:
foreach(object obj in list)
{
string id = obj.ToString().Split(',')[0].Trim();
string query = " insert into old_records select * from testes where id='" +
id + "';" + " delete from testes where id='" + id +"'";
DB _db = new DB();
_db.DBConnect(query);
This is the way I connect to the database:
DataTable _dt = new DataTable();
MySqlConnection _conn = new MySqlConnection(connectionString);
MySqlCommand _cmd = new MySqlCommand
{
Connection = _conn,
CommandText = query
};
MySqlDataAdapter _da = new MySqlDataAdapter(_cmd);
MySqlCommandBuilder _cb = new MySqlCommandBuilder(_da);
_dt.Clear();
try
{
_conn.Open();
_cmd.ExecuteNonQuery();
_da.Fill(_dt);
}
catch (MySqlException ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (_conn != null) _conn.Close();
}
return _dt;
So my question is: I have about 4000 rows in the table, and it takes a lot of time to move all the records from one table to another, especially across a network. Is there a way to make this faster?
I have been doing some reading and there are several options to handle data from the DB, like data adapters, readers, sets, and tables. Which one is faster for this case? Should I be using a different method?
Two things I see: first, you're opening and closing your connection for each insert; that's usually your most expensive operation, so you won't want to do that. You can also try batching them rather than executing them one at a time. When you do that you have to be careful, because things could break in the middle of a large update, so you would want to do it in a transaction. Without knowing too much about what your data structure looks like, I refactored your method to batch 100 at a time. First create a little helper method called MoveItems that takes a connection and a list of ids. Don't put a try/catch in it; you'll see why later.
Note: this method doesn't use parameters; I highly recommend you change it to do that.
private static void MoveItems(MySqlConnection conn, List<string> moveList)
{
string query = string.Format("insert into old_records select * from testes where id IN({0});" + " delete from testes where id IN({0})", string.Join(",", moveList.ToArray()));
var cmd = new MySqlCommand
{
Connection = conn,
CommandText = query
};
cmd.ExecuteNonQuery();
}
Next, change your main method to open the database connection once and then call this method 100 ids at a time. The main method has a try/catch, so if the call to MoveItems throws an exception it will be caught there.
// the using statement will call your dispose method
using (var conn = new MySqlConnection(connectionString))
{
// open the connection and start the transaction
conn.Open();
var transaction = conn.BeginTransaction();
// create a list to temporarily store the ids
List<string> moves = new List<string>();
try
{
// clean the list, do the trim and get everything that's not null or empty
var cleanList = list.Select(obj => obj.ToString().Split(',')[0].Trim()).Where(s => !string.IsNullOrEmpty(s));
// loop over the clean list
foreach (string id in cleanList)
{
// add the id to the move list
moves.Add("'" + id + "'");
// batch 100 at a time
if (moves.Count % 100 == 0)
{
// when I reach 100 execute them and clear the list out
MoveItems(conn, moves);
moves.Clear();
}
}
// The list count might not be a multiple of 100, so see if there's anything left
if (moves.Count > 0)
{
MoveItems(conn, moves);
moves.Clear();
}
// woohoo! commit the transaction
transaction.Commit();
}
catch (MySqlException ex)
{
// oops! something happened; roll back everything
transaction.Rollback();
Console.WriteLine(ex.Message);
}
finally
{
conn.Close();
}
}
You may have to play with that 100 number. I remember when I worked with MySQL a lot that I saw some performance differences between using IN and giving it a list of OR statements (Id = 'ID1' OR id = 'ID2' ...). But executing 40 or 80 statements will certainly perform better than 4000, and opening the database connection once instead of 4000 times should also give you much better performance.
I might be wrong, but there is not much you can do in order to make it faster. After all you want to get the entire table data and insert its information into another table. The process will take some time if your table isn't small. However, you can try using the code below. It should do the trick and save some time.
INSERT INTO TABLE2 (FIELDNAME_IN_TABLE2, FIELDNAME2_IN_TABLE2)
SELECT FIELDNAME_IN_TABLE1, FIELDNAME2_IN_TABLE1
FROM TABLE1
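The same idea can be driven from C# in a single transaction, combining the INSERT ... SELECT with the matching DELETE so there is no per-row round trip. A hedged sketch, reusing the table names and id filter from the question (connectionString and id are supplied by the caller; requires using MySql.Data.MySqlClient):
using (var conn = new MySqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    using (var cmd = conn.CreateCommand())
    {
        cmd.Transaction = tx;
        cmd.Parameters.AddWithValue("@id", id);

        // copy the matching rows into the archive table entirely on the server side
        cmd.CommandText = "INSERT INTO old_records SELECT * FROM testes WHERE id = @id";
        cmd.ExecuteNonQuery();

        // then remove them from the source table
        cmd.CommandText = "DELETE FROM testes WHERE id = @id";
        cmd.ExecuteNonQuery();

        tx.Commit(); // both statements succeed together or are rolled back
    }
}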

SQL Server CE WHERE statement behaving wrong? very confused

I am developing a windows mobile app. Right now I am just testing that it can correctly query the local SQL Server CE database. It works fine until I put a WHERE statement in.
Here is my code:
private void buttonStart_Click(object sender, EventArgs e)
{
System.Data.SqlServerCe.SqlCeConnection conn = new System.Data.SqlServerCe.SqlCeConnection(
("Data Source=" + (System.IO.Path.Combine(System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase), "ElectricReading.sdf") + ";Max Database Size=2047")));
try
{
// Connect to the local database
conn.Open();
System.Data.SqlServerCe.SqlCeCommand cmd = conn.CreateCommand();
SqlCeParameter param = new SqlCeParameter();
param.ParameterName = "#Barcode";
param.Value = "%" + textBarcode.Text.Trim() + "%";
// Insert a row
cmd.CommandText = "SELECT * FROM Main2 WHERE Reading LIKE #Barcode";
cmd.Parameters.Add(param);
cmd.ExecuteNonQuery();
DataTable data = new DataTable();
using (SqlCeDataReader reader = cmd.ExecuteReader())
{
if (reader.Read())
{
data.Load(reader);
}
}
if (data != null)
{
this.dataGrid1.DataSource = data;
}
}
finally
{
conn.Close();
}
The database contains this data:
Okay, so you can see I changed the WHERE statement to use the Reading column just for testing purposes. When I enter "111" into the textbox and run, it returns only the row where Reading = "1111" and not the row that contains "111".
If I enter "1111" it does not return any data.
If I enter "1" it will return both the "1111" row and the "111" row which is the correct behavior.
However if I enter "11" it once again only returns the "1111" row.
Any other data entry of 2's or 9's attempting to return those rows does not work.
I'm not sure what is going on? This does not make any sense. It is not behaving like I would expect in any way shape or form. I know this must be a little confusing to read. I hope it makes enough sense to get some answers. Please help!
NOTE: I added the "%" before and after the text in an attempt to get better results. This is not desired.
EDIT: I did have Reading = @Barcode; I just accidentally typed Location for this question. That is not the problem.
Firstly, some things to note:
1) As other commentators have noted, use the Reading column, not the Location column. I know you have mentioned you are testing, but swapping around column names and then changing code isn't the easiest way to troubleshoot these things. Try to only change one thing at a time.
2) If Reading is numeric, you are going to have to convert the column value first.
So your query becomes:
"SELECT * FROM Main2 WHERE CONVERT(varchar, Reading) LIKE #Barcode";
Also see How to use parameter with LIKE in Sql Server Compact Edition for more help with working with parameters in SqlServerCE.
3) Set a parameter type on your SqlCEParameter. I've linked to the appropriate page in the code example below.
4) You are using ExecuteNonQuery for no reason. Just get rid of it in this context. It's for when you want to make a change to the database (like an insert, update, or delete) or execute something (like a stored proc that can also insert, update, delete, etc.) that returns no rows. You've probably cut and pasted this code from another place in your app :-)
5) Use using on disposable objects (see example below). This will make managing your connection lifecycle much simpler. It's also more readable (IMO) and will take care of issues when exceptions occur.
6) Use using directives to import the namespaces you need into your current file:
Add the following using directives to the top of your class file (.cs). This makes using all of the .NET classes a lot simpler (it is much easier to read and less wear on your keyboard ;-)
using System.Data.SqlServerCe;
using System.IO;
using System.Reflection;
A more complete example would look like the following
private void buttonStart_Click(object sender, EventArgs e)
{
using(SqlCeConnection conn = new SqlCeConnection(
("Data Source=" + (Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().GetName().CodeBase), "ElectricReading.sdf") + ";Max Database Size=2047"))))
{
// Connect to the local database
conn.Open();
using(SqlCeCommand cmd = conn.CreateCommand())
{
SqlCeParameter param = new SqlCeParameter();
param.ParameterName = "#Barcode";
param.DBType = DBType.String; //Intellisense is your friend here but See http://msdn.microsoft.com/en-US/library/system.data.sqlserverce.sqlceparameter.dbtype(v=VS.80).aspx for supported types
param.Value = "%" + textBarcode.Text.Trim() + "%";
// SELECT rows
cmd.CommandText = "SELECT * FROM Main2 WHERE CONVERT(varchar, Reading) LIKE #Barcode";
cmd.Parameters.Add(param);
//cmd.ExecuteNonQuery(); //You don't need this line
DataTable data = new DataTable();
using (SqlCeDataReader reader = cmd.ExecuteReader())
{
data.Load(reader); //SqlCeDataReader does not support the HasRows property.
if(data.Rows.Count > 0)
{
this.dataGrid1.DataSource = data;
}
}
}
}
}
Intellisense should be able to clean up any errors with the above but feel free to ask for more help.
Finally, you also might be able to set the data source of the grid directly to a datareader, try it!
using (SqlCeDataReader reader = cmd.ExecuteReader())
{
dataGrid1.DataSource = reader;
}
You can then get rid of the DataTable.
Change the following line:
cmd.CommandText = "SELECT * FROM Main2 WHERE Location LIKE #Barcode";
to
cmd.CommandText = "SELECT * FROM Main2 WHERE Reading LIKE #Barcode";
You are comparing the wrong columns.
