HasRows is false despite valid query and connection - c#

For some reason, HasRows is false despite the connection and query seeming valid. There is a record in the table.
There are no errors when opening the connection, so I assume it's valid.
I can't see why HasRows would be false.
string connectionString = @"Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=C:\Projects\app\app.mdf;Integrated Security=True;Connect Timeout=30";
string sql = "SELECT * FROM SavedSettings";
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, con))
{
    con.Open();
    using (var reader = cmd.ExecuteReader())
    {
        if (reader.HasRows)
        {
            while (reader.Read())
            {
                if (reader["ignoreCase"].ToString() == "1")
                {
                    ignoreCase.Checked = true;
                }
                else
                {
                    ignoreCase.Checked = false;
                }
            }
        }
    }
}

Don't always trust HasRows: there are times when the reader has data but HasRows returns false, and it has been a little wonky since it was introduced. Try this instead:
DataTable dt = new DataTable();
dt.Load(cmd.ExecuteReader());
if (dt.Rows.Count > 0)
{
    // Has rows.
}
else
{
    // No rows.
}
Alternatively, you don't even need to check if (reader.HasRows) at all. If your table has no rows, the program simply skips the while (reader.Read()) loop with no problem. Just use:
while (reader.Read())
{
    if (reader["ignoreCase"].ToString() == "1")
    {
        ignoreCase.Checked = true;
    }
    else
    {
        ignoreCase.Checked = false;
    }
}
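Incidentally, if ignoreCase is stored as a bit column, you can read it in a typed way instead of comparing strings. This is a sketch under that assumption about the column type:

```csharp
while (reader.Read())
{
    // GetOrdinal looks up the column index by name;
    // GetBoolean reads the bit value directly, no string comparison needed.
    int ord = reader.GetOrdinal("ignoreCase");
    ignoreCase.Checked = !reader.IsDBNull(ord) && reader.GetBoolean(ord);
}
```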
Finally, restart Visual Studio, reconnect to the database, and then select Show Table Data.

Disconnected then reconnected to the database and opened the data viewer and it shows my changes. I'm using Visual Studio 2017.
Refreshing the database and refreshing the data viewer did nothing.
Incidentally, it's still not syncing (see below), and I have to keep connecting/reconnecting to see the changes. I'm going to use a different editor.
Before reconnecting (includeSubFolders field shows on left, not right):
After reconnecting:

Related

How to stay connected to database until screen close?

I developed a C# program for the Windows CE platform. The program opens and closes the connection to the database for every single interaction. See the code below.
Button click:
private void btnStkIn_Click(object sender, EventArgs e)
{
    formStockIn = new frmStkIn();
    formStockIn.Show();
}
Select Data:
try
{
    using (SqlConnection sqlConn = new SqlConnection(<connection-string>))
    {
        sqlConn.Open();
        // Execute command
        SqlCommand sqlCmd = new SqlCommand(<Select Query>, sqlConn);
        SqlDataReader sqlReader = sqlCmd.ExecuteReader();
        while (sqlReader.Read())
        {
            <Statement here>
        }
    }
}
catch
{
    // SQL command error
    itemDT.ErrorMessage = "Select process is failed.Please contact system admin.";
    itemDT.Result = 12;
}
Update Data:
try
{
    using (SqlConnection sqlConn = new SqlConnection(<connection-string>))
    {
        sqlConn.Open();
        // Execute command
        SqlCommand sqlCmd = new SqlCommand(<Update Query>, sqlConn);
        if (sqlCmd.ExecuteNonQuery() <= 0)
        {
            // No rows affected
            return -99;
        }
        else
        {
            // Completed
            return 0;
        }
    }
}
catch
{
    // Sql command error
    return 99;
}
I would like to connect to the database once (when the form is shown), then select, insert, and update data using the same connection, and close the connection when I close the screen. At run time, some screens can select and update more than once.
What should I do?
What you are doing is fine. It is good practice to keep the connection open for the shortest time possible and then dispose it. This is what you are doing, and that's good.
If you keep it open and the user goes off to lunch or on vacation and clicks nothing else, you are holding a connection for no good reason.
If you need to do multiple things at the same time, then open one connection, execute the queries, and close the connection right away.
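The reason open-per-operation is cheap is ADO.NET connection pooling: with pooling enabled (the default), Open() usually hands back an existing physical connection rather than establishing a new one, and Dispose() returns it to the pool. A minimal sketch of the pattern; the table, column, and helper names here are placeholders, not from the question:

```csharp
// Open() normally reuses a pooled physical connection, so doing this
// per operation is inexpensive.
private static int GetStockCount(string connectionString, string model)
{
    using (var con = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM Stock WHERE ModelNumber = @model", con))
    {
        cmd.Parameters.AddWithValue("@model", model);
        con.Open();
        return (int)cmd.ExecuteScalar();
    } // Dispose() returns the connection to the pool, not tears it down.
}
```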

Getting Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool

I have been facing this timeout problem more and more frequently. I am using a MySQL database in a Windows application. I even tried using a HAVING clause, but faced the same situation.
public bool VerifyStock(string serialnumber)
{
    con = new MySqlConnection(connstring);
    string readData = "select * from Fn_Inventory where ModelNumber = '" + serialnumber + "'";
    cmd = new MySqlCommand(readData, con);
    cmd.Parameters.AddWithValue("@ModelNumber", serialnumber);
    con.Open();
    dr = cmd.ExecuteReader();
    if (dr.HasRows)
    {
        //while (dr.Read())
        if (dr.Read())
        {
            decimal invquntity = Convert.ToDecimal(dr["AvailableQuantity"].ToString());
            decimal quantity = Convert.ToDecimal(txtQuantity.Text);
            decimal sinvquntity = invquntity - quantity;
            if (sinvquntity >= 0)
            {
                return false;
            }
            else
            {
                return true;
            }
        }
        else
        {
            return false;
        }
    }
    else
    {
        return false;
    }
    con.Close();
}
Increasing the connection pool size and the timeout will be a quick fix; you can refer here for more information.
Also, properly close opened connections at the end, outside the conditions. Use a try/catch/finally block and put this code in the finally block so that it always executes:
if (con.State == ConnectionState.Open)
{
    con.Close();
}
You are opening the connection but not closing it: every branch of your conditions exits with a return statement before con.Close() is ever reached. Either close the connection before each return statement or use a using block. When connections remain open, there is no free connection to reuse, so the pool fills up.
Always close connections when you are done reading values from the data reader.
using (SqlConnection con = new SqlConnection()) { /* your code here */ }
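To illustrate, here is a sketch of the question's VerifyStock restructured with using blocks so the connection is released on every path, including exceptions. The query, table, and column names come from the question; swapping the reader for ExecuteScalar and passing the quantity in as a parameter are my changes, not the original code:

```csharp
public bool VerifyStock(string serialNumber, decimal quantity)
{
    using (var con = new MySqlConnection(connstring))
    using (var cmd = new MySqlCommand(
        "SELECT AvailableQuantity FROM Fn_Inventory WHERE ModelNumber = @ModelNumber", con))
    {
        cmd.Parameters.AddWithValue("@ModelNumber", serialNumber);
        con.Open();

        // First column of the first row, or null when no row matched.
        object result = cmd.ExecuteScalar();
        if (result == null || result == DBNull.Value)
            return false;

        decimal available = Convert.ToDecimal(result);
        // Same logic as the question: true when stock is insufficient.
        return available - quantity < 0;
    } // Dispose() closes the connection here on every return path.
}
```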

SqlDataReader and database access concurrency

The easiest way to illustrate my question is with this C# code:
using (SqlCommand cmd = new SqlCommand("SELECT * FROM [tbl]", connection))
{
    using (SqlDataReader rdr = cmd.ExecuteReader())
    {
        // Somewhere at this point a concurrent thread,
        // or another process, changes the [tbl] table data.

        // Begin reading
        while (rdr.Read())
        {
            // Process the data
        }
    }
}
So what would happen with the data in rdr in such situation?
I actually tested this. Test code:
using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["test"].ConnectionString))
{
    conn.Open();
    using (SqlCommand comm = new SqlCommand("select * from test", conn))
    {
        using (var reader = comm.ExecuteReader())
        {
            int i = 0;
            while (reader.Read())
            {
                if ((string)reader[1] == "stop")
                {
                    throw new Exception("Stop was found");
                }
            }
        }
    }
}
To test, I initialized the table with some dummy data (making sure that no row with the value 'stop' was included). Then I put a break point on the line int i = 0;. While the execution was halted on the break point, I inserted a line in the table with the 'stop' value.
The result was that depending on the amount of initial rows in the table, the Exception was thrown/not thrown. I did not try to pin down where exactly the row limit was. For ten rows, the Exception was not thrown, meaning the reader did not notice the row added from another process. With ten thousand rows, the exception was thrown.
So the answer is: It depends. Without wrapping the command/reader inside a Transaction, you cannot rely on either behavior.
Obligatory disclaimer: This is how it worked in my environment...
EDIT:
I tested using a local Sql server on my dev machine. It reports itself as:
Microsoft SQL Server 2008 R2 (SP1) - 10.50.2550.0 (X64)
Regarding transactions:
Here's code where I use a transaction:
using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["test"].ConnectionString))
{
    conn.Open();
    using (var trans = conn.BeginTransaction())
    using (SqlCommand comm = new SqlCommand("select * from test", conn, trans))
    {
        using (var reader = comm.ExecuteReader())
        {
            int i = 0;
            while (reader.Read())
            {
                i++;
                if ((string)reader[1] == "stop")
                {
                    throw new Exception("Stop was found");
                }
            }
        }
        trans.Commit();
    }
}
In this code, I create the transaction without explicitly specifying an isolation level. That usually means that System.Data.IsolationLevel.ReadCommitted will be used (I think the default isolation level can be set in the Sql Server settings somewhere). In that case the reader behaves the same as before. If I change it to use:
...
using (var trans = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
...
the insert of the "stop" record is blocked until the transaction is committed. This means that while the reader is active, no changes to the underlying data are allowed by Sql Server.
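If blocking writers for the duration of the read is unacceptable, SQL Server also offers snapshot isolation, where the reader sees a consistent view of the data as of the transaction start while writers proceed unblocked. This is a sketch, and it assumes the database has snapshot isolation enabled (ALTER DATABASE ... SET ALLOW_SNAPSHOT_ISOLATION ON), which is off by default:

```csharp
using (var trans = conn.BeginTransaction(System.Data.IsolationLevel.Snapshot))
using (var comm = new SqlCommand("select * from test", conn, trans))
using (var reader = comm.ExecuteReader())
{
    while (reader.Read())
    {
        // Rows reflect the state at transaction start:
        // concurrent inserts neither appear in the results nor block.
    }
}
```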

Multiples Table in DataReader

I normally use DataSet because it is very flexible. Recently I was assigned a code-optimization task, and to reduce hits to the database I am combining two queries into one procedure: one query returns the count and the other returns the actual data. That is, my stored procedure returns two tables. Now, I know how to read both tables using DataSets, but I need to read both tables using a DataReader. In search of that I found this.
I follow the article and wrote my code like this:
dr = cmd.ExecuteReader();
while (dr.Read())
{
}
if (dr.NextResult()) // this line throws exception
{
    while (dr.Read())
    {
    }
}
But I am getting an exception at dr.NextResult. The exception is:
Invalid attempt to call NextResult when reader is closed.
I also googled the above error, but still was not able to solve the issue.
Any help will be much appreciated. I need to read multiple tables using datareader, is this possible?
Try this, because it will close the connection, data reader, and command once the task is over, so you will not get the "data reader closed" exception.
Also check if (reader.NextResult()) to verify there is a next result:
using (SqlConnection connection = new SqlConnection("connection string here"))
{
    using (SqlCommand command = new SqlCommand(
        "SELECT Column1 FROM Table1; SELECT Column2 FROM Table2", connection))
    {
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                MessageBox.Show(reader.GetString(0), "Table1.Column1");
            }
            if (reader.NextResult())
            {
                while (reader.Read())
                {
                    MessageBox.Show(reader.GetString(0), "Table2.Column2");
                }
            }
        }
    }
}
I have tried to reproduce this issue (also because I haven't used multiple tables in a reader before), but it works as expected, so I assume you've omitted the related code.
Here's my test code:
using (var con = new SqlConnection(Properties.Settings.Default.ConnectionString))
{
    using (var cmd = new SqlCommand("SELECT TOP 10 * FROM tabData; SELECT TOP 10 * FROM tabDataDetail;", con))
    {
        int rowCount = 0;
        con.Open();
        using (IDataReader rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                String object1 = String.Format("Object 1 in Row {0}: '{1}'", ++rowCount, rdr[0]);
            }
            if (rdr.NextResult())
            {
                rowCount = 0;
                while (rdr.Read())
                {
                    String object1 = String.Format("Object 1 in Row {0}: '{1}'", ++rowCount, rdr[0]);
                }
            }
        }
    }
}
I built on Pranay Rana's answer because I like keeping it as small as possible.
string rslt = "";
using (SqlDataReader dr = cmd.ExecuteReader())
{
    do
    {
        while (dr.Read())
        {
            rslt += $"ReqID: {dr["REQ_NR"]}, Shpr: {dr["SHPR_NR"]}, MultiLoc: {dr["MULTI_LOC"]}\r\n";
        }
    } while (dr.NextResult());
}
The question is old but I find the answers are not correct.
Here's how I do it:
List<DataTable> dataTables = new();
using IDataReader dataReader = command.ExecuteReader();
do
{
    DataTable dataTable = new();
    dataTable.Load(dataReader);
    dataTables.Add(dataTable);
}
while (!dataReader.IsClosed);

Very slow performance when updating records

I am writing a C# application that connects to a database, does a couple of selects, and then inserts the record back to the server on the network.
But I have around 40k records, and my program takes about 1 second to process each record.
I don't know how to improve the performance. Here are my SQL getter and inserter. Any suggestions?
public bool insert_and_ConfirmSQL(String Query, String comments)
{
    bool success = false;
    NpgsqlCommand cmd = new NpgsqlCommand();
    NpgsqlConnection mycon = new NpgsqlConnection();
    string connstring = String.Format("Server={0};Port={1}; User Id={2};Password={3};Database={4};", tbHost, tbPort, tbUser, tbPass, tbDataBaseName);
    mycon.ConnectionString = connstring;
    cmd = mycon.CreateCommand();
    cmd.CommandText = Query;
    mycon.Open();
    int temp = 0;
    try
    {
        temp = cmd.ExecuteNonQuery();
        success = true;
    }
    catch
    {
        success = false;
    }
    finally
    {
        if (mycon.State == ConnectionState.Open)
        {
            mycon.Close();
        }
    }
    return success;
}
public String getString(String sql, NpgsqlConnection conn)
{
    using (DataSet ds = new DataSet())
    {
        using (NpgsqlDataAdapter da = new NpgsqlDataAdapter(sql, conn))
        {
            da.Fill(ds);
            if (ds.Tables.Count > 0)
            {
                DataTable dt = ds.Tables[0];
                // Check count of rows
                if (dt.Rows.Count > 0)
                {
                    object o = dt.Rows[0][0];
                    if (o != DBNull.Value && o != null)
                    {
                        return o.ToString();
                    }
                }
            }
        }
    }
    // Return default value
    return "";
}
There is no way to help with your SQL statements themselves, because we don't know them.
The only visible problem (guessing) is that you loop over 40K records with a new connection each time: both routines in your code do exactly this. Do you really need 40K calls to insert_and_ConfirmSQL and/or another 40K calls to getString?
If you really do loop, update your code to use only one connection, without closing it between calls; the same can be done with the DataSet (clear it before reuse with ds.Clear()).
Needless to say, if your queries are huge (in terms of data), and/or indexes do not cover the queries, delays are expected.
Try this approach and let us know.
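To make the one-connection idea concrete, here is a sketch of a batched insert: one open NpgsqlConnection, one prepared parameterized command, and one transaction around the whole batch so 40k inserts are not 40k individual commits. The table, column, and record names are hypothetical, since the question's actual queries are unknown:

```csharp
using (var con = new NpgsqlConnection(connstring))
{
    con.Open();
    using (var tx = con.BeginTransaction())
    using (var cmd = new NpgsqlCommand(
        "INSERT INTO records (id, val) VALUES (@id, @val)", con, tx))
    {
        cmd.Parameters.Add(new NpgsqlParameter("id", NpgsqlTypes.NpgsqlDbType.Integer));
        cmd.Parameters.Add(new NpgsqlParameter("val", NpgsqlTypes.NpgsqlDbType.Text));
        cmd.Prepare(); // plan the statement once, reuse it 40k times

        foreach (var rec in records) // the 40k-row loop
        {
            cmd.Parameters["id"].Value = rec.Id;
            cmd.Parameters["val"].Value = rec.Val;
            cmd.ExecuteNonQuery();
        }
        tx.Commit(); // one commit for the whole batch
    }
}
```

For the fastest bulk path, recent Npgsql versions also expose PostgreSQL's COPY protocol (conn.BeginBinaryImport), which is typically faster still for large loads.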
Sometimes you can improve performance by using a DataReader instead of DataSets.
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqldatareader(v=vs.100).aspx#Y0
You can then inspect the data while you are reading it from the database.
So I'd implement the above code with a DataReader and re-time it.
Edit: especially that getString method. If that da.Fill was pulling 40k rows, it could be the cause of your performance problem.
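In particular, the getString helper fills an entire DataSet just to read one value; ExecuteScalar does the same job with far less overhead. A sketch against the question's helper signature (my rewrite, not the original code):

```csharp
public string GetString(string sql, NpgsqlConnection conn)
{
    using (var cmd = new NpgsqlCommand(sql, conn))
    {
        // ExecuteScalar returns the first column of the first row,
        // or null when the query produces no rows.
        object o = cmd.ExecuteScalar();
        return (o == null || o == DBNull.Value) ? "" : o.ToString();
    }
}
```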