SqlDataReader hangs on Dispose() - c#

I use the following approach to execute queries over database and read data:
using (SqlConnection connection = new SqlConnection("Connection string"))
{
    connection.Open();
    using (SqlCommand command = new SqlCommand("SELECT * FROM TableName", connection))
    {
        using (SqlDataReader reader = command.ExecuteReader())
        {
            // read and process data somehow (possible source of exceptions)
        } // <- reader hangs here if exception occurs
    }
}
While reading and processing the data, exceptions can occur. The problem is that when an exception is thrown, the DataReader hangs on the Close() call. Any ideas why, and how to solve this properly? The problem went away when I replaced the using block with try..catch..finally and called command.Cancel() before disposing the reader in the finally block.
Working version:
using (SqlConnection connection = new SqlConnection("Connection string"))
{
    connection.Open();
    using (SqlCommand command = new SqlCommand("SELECT * FROM TableName", connection))
    {
        SqlDataReader reader = command.ExecuteReader();
        try
        {
            // read and process data somehow (possible source of exceptions)
        }
        catch (Exception ex)
        {
            // handle exception somehow
        }
        finally
        {
            command.Cancel(); // !!!
            reader.Dispose();
        }
    }
}

When an exception occurs you stop processing data before all data is received. You can reproduce this issue even without exceptions if you abort processing after a few rows.
When the command or reader is disposed, the query is still running on the server. ADO.NET just reads all remaining rows and result sets like mad and throws them away. It does that because the server is sending them and the protocol requires receiving them.
Calling SqlCommand.Cancel sends an "attention" to SQL Server causing the query to truly abort. It is the same thing as pressing the cancel button in SSMS.
To summarize, this issue occurs whenever you stop processing rows although many more rows are inbound. Your workaround (calling SqlCommand.Cancel) is the correct solution.
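To make this concrete, here is a minimal sketch of the early-exit case (the table name and connection string are the question's placeholders; the helper name is made up for illustration, and it needs using System.Data.SqlClient;):

static void ReadFirstRows(string connectionString, int maxRows)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (SqlCommand command = new SqlCommand("SELECT * FROM TableName", connection))
        using (SqlDataReader reader = command.ExecuteReader())
        {
            int rows = 0;
            while (rows < maxRows && reader.Read())
            {
                // process the row, then stop early
                rows++;
            }
            // Send the attention signal so the server aborts the query;
            // otherwise Dispose() drains every remaining row over the wire.
            command.Cancel();
        } // reader.Dispose() now returns quickly
    }
}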

About the Dispose method of the SqlDataReader, MSDN (link) has this to say:
Releases the resources used by the DbDataReader and calls Close.
Emphasis on "calls Close" added by me. And if you then go look at the Close method (link), it states this:
The Close method fills in the values for output parameters, return values and RecordsAffected, increasing the time that it takes to close a SqlDataReader that was used to process a large or complex query. When the return values and the number of records affected by a query are not significant, the time that it takes to close the SqlDataReader can be reduced by calling the Cancel method of the associated SqlCommand object before calling the Close method.
So if you need to stop iterating through the reader, it's best to cancel the command first just like your working version is doing.

I would not format it that way.
Open() is not in a try block, and it can throw an exception.
ExecuteReader() is not in a try block, and it can throw an exception.
I like reader.Close() because that is what I see in the MSDN samples.
And I catch SqlException separately because it carries an error number (for example, for a timeout) - see the sketch after the code below.
SqlConnection connection = new SqlConnection("Connection string");
SqlDataReader reader = null;
try
{
    connection.Open(); // you are missing this as a possible source of exceptions
    SqlCommand command = new SqlCommand("SELECT * FROM TableName", connection);
    reader = command.ExecuteReader(); // you are missing this as a possible source of exceptions
    // read and process data somehow (possible source of exceptions)
}
catch (SqlException ex)
{
    // inspect ex.Number here (e.g. to detect a timeout)
}
catch (Exception ex)
{
    // handle exception somehow
}
finally
{
    if (reader != null) reader.Close();
    connection.Close();
}

Related

There is already an open DataReader associated with this Command, without nested datareaders

I'm getting the following error intermittently.
There is already an open DataReader associated with this Command which must be closed first.
I read that this can happen when there are nested DataReaders on the same connection, but in my case I'm using the following code to execute all queries.
private SqlTransaction Transaction { get; set; }
private SqlConnection Connection { get; set; }
private DbRow Row { get; set; }

public Row Exec(string sql)
{
    SqlDataReader reader = null;
    try
    {
        // Begin connection/transaction
        Connection = new SqlConnection(connectionString);
        Connection.Open();
        Transaction = Connection.BeginTransaction("SampleTransaction");
        // create command
        SqlCommand command = new SqlCommand(sql, Connection);
        command.Transaction = Transaction;
        // execute reader and close it
        // HERE IS THE PROBLEM, THE READER ALWAYS READS UNTIL THE END
        // BEFORE ANOTHER CAN BE OPENED
        reader = command.ExecuteReader();
        while (reader.Read())
        {
            object[] value = new object[reader.FieldCount];
            reader.GetValues(value);
            List<object> values = new List<object>(value);
            Rows.Add(values);
        }
        reader.Close();
        Transaction.Commit();
        Connection.Dispose();
        Connection = null;
    }
    catch
    {
        Transaction.Rollback();
        Connection.Dispose();
        Connection = null;
    }
    finally
    {
        if (reader != null && !reader.IsClosed) reader.Close();
    }
}
This way, the result is stored in an object and there are no nested readers.
I also read that adding 'MultipleActiveResultSets=True' to the connection string can solve the problem when using nested readers.
Does this solution also solve my problem?
As the error is intermittent and only happens in the production environment, I can't test it many times.
There is already an open DataReader associated with this Command which must be closed first.
   at System.Data.SqlClient.SqlInternalConnectionTds.ValidateConnectionForExecute(SqlCommand command)
   at System.Data.SqlClient.SqlInternalTransaction.Rollback()
   at System.Data.SqlClient.SqlTransaction.Rollback()
   at Application.Lib.DB.DBSQLServer.Rollback()
   at Application.Lib.DB.DBSQLServer.Execute(String sql, Dictionary`2 parameters, Nullable`1 timeout, Boolean useTransaction)
   at Application.UtilDB.Execute(String sql, Dictionary`2 parameters, Nullable`1 timeout, Boolean useTransaction) in c:\Application\DBUtil.cs:line 37
   at Application.A.CollectionFromDataBase(Int32 cenId, IDB db, Int32 includeId, Boolean allStatus) in c:\Application\Activities.cs:line 64
   at Application.ActivitiesController.CheckForConflictsBeforeSave(String aulId, String insId) in c:\Application\AlocController.cs:line 212
The problem was that, when a query fails, the transaction can't be rolled back because the data reader is already open to process the query.
A second exception is thrown and the first one is lost.
I just placed the rollback inside a try catch block and used the AggregateException class to throw both exceptions.
try
{
    Transaction.Rollback();
    Connection.Dispose();
    Connection = null;
}
catch (Exception ex2)
{
    // e is the original exception caught by the surrounding catch block
    throw new AggregateException(new List<Exception>() { e, ex2 });
}
Although the transaction will be rolled back anyway, you can also close the data reader before the rollback, which should let the rollback itself succeed:
if (reader != null && !reader.IsClosed)
    reader.Close();
Transaction.Rollback();
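Putting both suggestions together, the catch block in the question's Exec method could end up looking something like this (a sketch; it assumes the outer catch captures the original exception as ex):

catch (Exception ex)
{
    // Close the reader first so the rollback is not blocked by the open reader.
    if (reader != null && !reader.IsClosed)
        reader.Close();

    try
    {
        Transaction.Rollback();
    }
    catch (Exception rollbackEx)
    {
        // Surface both the original failure and the rollback failure.
        throw new AggregateException(new List<Exception>() { ex, rollbackEx });
    }
    finally
    {
        Connection.Dispose();
        Connection = null;
    }

    throw; // re-throw the original exception once cleanup has succeeded
}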
Since this happens only in production, it's more likely that the bug is outside the code you attached.
The most common way to prevent it is to always code in the following fashion:
reader = command.ExecuteReader();
try
{
    for (int i = 0; i < reader.FieldCount; i++)
    {
        dbResult.Columns.Add(reader.GetName(i));
        dbResult.Types.Add(reader.GetDataTypeName(i));
    }
    while (reader.Read())
    {
        object[] value = new object[reader.FieldCount];
        reader.GetValues(value);
        List<object> values = new List<object>(value);
        Rows.Add(values);
    }
}
finally
{
    reader.Close();
}
Notice the finally block: it makes sure the reader is closed no matter what. I am under the impression that something in your code leaves a reader open, but the bug isn't visible in the code you've posted.
I recommend you enclose the reader in the above try/finally block; your bug is quite likely to be resolved.
Edit, to clarify: this may not fix whatever bug exists outside the scope of the code you showed, but it will prevent data readers from being left open. The finally block I suggested won't swallow any exceptions; they will propagate to whatever handler you employ outside of it.
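Equivalently, a using statement gives the same guarantee with less ceremony; a sketch reusing the names from the snippet above:

using (SqlDataReader reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        object[] value = new object[reader.FieldCount];
        reader.GetValues(value);
        Rows.Add(new List<object>(value));
    }
} // the reader is closed here even if an exception is thrown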

ThreadPool.QueueUserWorkItem versus BeginExecuteNonQuery

I have a problem at work with a simple insert method occasionally timing out due to a scheduled clean-up task on a database table. This task runs every ten minutes and during its execution my code often records an error in the event log due to 'the wait operation timed out'.
One of the solutions I'm considering is to make the code calling the stored procedure asynchronous, and in order to do this I first started looking at the BeginExecuteNonQuery method.
I've tried using the BeginExecuteNonQuery method but have found that it quite often does not insert the row at all. The code I've used is as follows:
SqlConnection conn = daService.CreateSqlConnection(dataSupport.DBConnString);
SqlCommand command = daService.CreateSqlCommand("StoredProc");

try {
    command.Connection = conn;
    command.Parameters.AddWithValue("page", page);
    command.Parameters.AddWithValue("Customer", customerId);
    conn.Open();

    command.BeginExecuteNonQuery(delegate(IAsyncResult ar) {
        SqlCommand c = (SqlCommand)ar.AsyncState;
        c.EndExecuteNonQuery(ar);
        c.Connection.Close();
    }, command);
} catch (Exception ex) {
    LogService.WriteExceptionEntry(ex, EventLogEntryType.Error);
} finally {
    command.Connection.Close();
    command.Dispose();
    conn.Dispose();
}
Obviously, I'm not expecting an instant insert but I am expecting it to be inserted after five minutes on a low usage development database.
I've now switched to the following code, which does do the insert:
System.Threading.ThreadPool.QueueUserWorkItem(delegate {
    using (SqlConnection conn = daService.CreateSqlConnection(dataSupport.DBConnString)) {
        using (SqlCommand command = daService.CreateSqlCommand("StoredProcedure")) {
            command.Connection = conn;
            command.Parameters.AddWithValue("page", page);
            command.Parameters.AddWithValue("customer", customerId);
            conn.Open();
            command.ExecuteNonQuery();
        }
    }
});
I've got a few questions, some of them are assumptions:
As my insert method's signature is void, I'm presuming code that calls it doesn't wait for a response. Is this correct?
Is there a reason why BeginExecuteNonQuery doesn't run the stored procedure? Is my code wrong?
Most importantly, if I use QueueUserWorkItem (or a well-behaved BeginExecuteNonQuery), am I right in thinking this will have the desired result? That is, an attempt to run the stored procedure while the scheduled task is running will execute after the task completes, rather than timing out as it does now?
Edit
This is the version I'm using now in response to the comments and answers I've received.
SqlConnection conn = daService.CreateSqlConnection(
    string.Concat("Asynchronous Processing=True;", dataSupport.DBConnString));
SqlCommand command = daService.CreateSqlCommand("StoredProc");

command.Connection = conn;
command.Parameters.AddWithValue("page", page);
command.Parameters.AddWithValue("customer", customerId);
conn.Open();

command.BeginExecuteNonQuery(delegate(IAsyncResult ar) {
    SqlCommand c = (SqlCommand)ar.AsyncState;
    try {
        c.EndExecuteNonQuery(ar);
    } catch (Exception ex) {
        LogService.WriteExceptionEntry(ex, EventLogEntryType.Error);
    } finally {
        c.Connection.Close();
        c.Dispose();
        conn.Dispose();
    }
}, command);
Is there a reason why BeginExecuteNonQuery doesn't run the stored procedure? Is my code wrong?
Probably you didn't add Asynchronous Processing=True to the connection string.
Also, there can be a situation where, by the time the response from SQL Server is ready, the ASP.NET response has already been sent.
That's why you need to use Page.RegisterAsyncTask (together with AsyncTimeout).
(If you use WebForms asynchronous pages, you should also add Async="true" to the page directive.)
P.S. ThreadPool.QueueUserWorkItem is dangerous in ASP.NET apps: you have to make sure the response has not already been sent.
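For reference, the Page.RegisterAsyncTask pattern referred to above looks roughly like this on a classic WebForms page (a sketch only: the field and handler names are made up, the page directive must include Async="true", and AsyncTimeout can be set in the same directive):

// The response is not completed until EndInsert has run (or the timeout fires).
private SqlConnection _conn;
private SqlCommand _command;

protected void Page_Load(object sender, EventArgs e)
{
    Page.RegisterAsyncTask(new PageAsyncTask(BeginInsert, EndInsert, InsertTimeout, null));
}

private IAsyncResult BeginInsert(object sender, EventArgs e, AsyncCallback cb, object state)
{
    _conn = daService.CreateSqlConnection("Asynchronous Processing=True;" + dataSupport.DBConnString);
    _command = daService.CreateSqlCommand("StoredProc");
    _command.Connection = _conn;
    _conn.Open();
    return _command.BeginExecuteNonQuery(cb, state);
}

private void EndInsert(IAsyncResult ar)
{
    try { _command.EndExecuteNonQuery(ar); }
    finally { _command.Dispose(); _conn.Dispose(); }
}

private void InsertTimeout(IAsyncResult ar)
{
    // the page's AsyncTimeout elapsed before the command completed
    _command.Dispose();
    _conn.Dispose();
}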

ADO.NET Funny Connection Pool Behaviour when Burying a SQL Exception

I am catching a sql exception and not rethrowing it. This seems to mean that the connection is not returned to the pool as I would expect. Is this possible?
using (IDbCommand paymentCommand = this.Connection.CreateCommand())
{
    try
    {
        // database stuff
    }
    catch (SqlException ex)
    {
        // LOG CALL
    }
}
Why don't you put the using (...) { } block inside the try { } block? That way, even if an exception is thrown, the using block will still dispose of the IDbCommand object.
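For illustration, that arrangement would look something like this (a sketch reusing the snippet from the question):

try
{
    using (IDbCommand paymentCommand = this.Connection.CreateCommand())
    {
        // database stuff
    } // the command is disposed here even if an exception escapes the block
}
catch (SqlException ex)
{
    // LOG CALL
}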
It's not clear from your question how you are creating the connection, but you do need to make sure you Open it and then Close it, regardless of whether an error occurs.
Typically I'll do something like this:
SqlConnection connection = null;
try {
    connection = new SqlConnection(connectionString); // create the connection however your code obtains it
    connection.Open();
    // Do stuff like run a query, setup your IDbCommand, etc.
} catch (Exception ex) {
    // Log error
} finally {
    if (connection != null) {
        connection.Close();
    }
}
This way, no matter what happens, your connection will be closed and returned to the pool. If you fail to Close(), you'll "leak" that connection and eventually run out of pooled connections to draw from. The lifetime of the connection should generally only be as long as it takes to issue your sql command, at which point you should be closing it.
It's not clear what you are experiencing with the connection pool. However, I would definitely wrap your connection in a using statement.
This is what I usually use (note that dac.GetConnection() is simply a helper that centralizes the code to create the connection object):
using (SqlConnection connection = dac.GetConnection())
{
    using (SqlCommand command = new SqlCommand("myProc", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        try
        {
            connection.Open();
            // add params, run query
        }
        catch (Exception ex)
        {
            // handle/log error
        }
        finally
        {
            if (connection.State == ConnectionState.Open)
                connection.Close();
        }
    }
}

SQL server and .NET memory constraints, allocations, and garbage collection

I am running .NET 3.5 (C#) and SQL Server 2005 (for our clients). The code that we run does some regression math and is a little complicated. I get the following error when I run multiple pages on our site:
.NET Framework execution was aborted by escalation policy because of out of memory.
System.InvalidOperationException: There is already an open DataReader associated with this Command which must be closed first.
I'm trying to figure out the root cause: is it a database issue or my C# code? Is it concurrency with locks when running queries? Or something else?
The code is erroring here:
Server.ScriptTimeout = 300;
string returnCode = string.Empty;

using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["MainDll"].ToString())) {
    connection.Open();
    using (SqlCommand command = new SqlCommand(sql.ToString(), connection)) {
        command.CommandType = CommandType.Text;
        command.CommandTimeout = 300;
        returnCode = (string)command.ExecuteScalar();
        //Dispose();
    }
    //Dispose();
}
Our contractor wrote a bunch of code to help with SQL connections in an App_Code/sqlHelper.cs file. Some of it looks like this:
public static SqlDataReader GetDataReader(string sql, string connectionString, int connectionTime) {
    lock (_lock) {
        SqlConnection connection = null;
        try {
            connection = GetConnection(connectionString);
            //connection.Open();
            using (SqlCommand cmd = new SqlCommand(sql, connection)) {
                cmd.CommandTimeout = connectionTime;
                WriteDebugInfo("GetDataReader", sql);
                return cmd.ExecuteReader(CommandBehavior.CloseConnection);
            }
        }
        catch (Exception e) {
            if (connection != null)
                connection.Dispose();
            throw new DataException(sql, connectionString, e);
        }
    }
}
Should there be some deallocation of memory somewhere?
The problem is that, for some reason, your DataReader isn't being closed. An exception? The method user didn't remember to close the DataReader?
A function that returns a DataReader to be used outside its body leaves the responsibility of closing it to outer code, so there's no guarantee that the Reader will be closed. If you don't close the reader, you cannot reuse the connection in which it was opened.
So returning a DataReader from a function is a very bad idea!
You can see a whole discussion on this subject here.
Look for the usages of this function (GetDataReader) and check whether there's any guarantee that the reader gets closed. Most importantly, check that there is no possibility that the code re-enters and uses the same connection to open a new DataReader before the first one is closed. (Don't be misled by CommandBehavior.CloseConnection. That only takes care of closing the connection when the DataReader is closed... and only if you don't fail to close the reader.)
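One common way to avoid handing readers around at all is to keep the reader inside the helper and pass each row to a callback; a rough sketch of that idea (the method name and signature are illustrative, not the contractor's actual API):

// The reader never leaves this method, so it is always closed,
// even if the row-processing callback throws.
public static void ForEachRow(string sql, string connectionString, int commandTimeout,
                              Action<IDataRecord> processRow)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(sql, connection))
    {
        cmd.CommandTimeout = commandTimeout;
        connection.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                processRow(reader);
        }
    }
}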
This is because your data reader is still open. It's always better to release the data reader, command, data set and data table, and to close the connection, in a finally block.
Make use of the Dispose() and Close() methods.

Retrying method to call database

I'm making a system which should run 24/7, with timers to control it. There are many calls to the database, and at some point two methods try to open a connection at the same time and one of them fails. I've tried to write a retry method so that my calls would succeed. With help from Michael S. Scherotter's and Steven Sudit's methods in Better way to write retry logic without goto, my method looks like this:
int MaxRetries = 3;
Product pro = new Product();
SqlConnection myCon = DBcon.getInstance().conn();
string barcod = barcode;
string query = string.Format("SELECT * FROM Product WHERE Barcode = @barcode");

for (int tries = MaxRetries; tries >= 0; tries--) // <-- 'tries' at the end, are unreachable?
{
    try
    {
        myCon.Open();
        SqlCommand com = new SqlCommand(query, myCon);
        com.Parameters.AddWithValue("@barcode", barcode);
        SqlDataReader dr = com.ExecuteReader();
        if (dr.Read())
        {
            pro.Barcode = dr.GetString(0);
            pro.Name = dr.GetString(1);
        }
        break;
    }
    catch (Exception ex)
    {
        if (tries == 0)
            Console.WriteLine("Exception: " + ex);
        throw;
    }
}
myCon.Close();
return pro;
When running the code, the program stops at the for (...) loop with the exception: "The connection was not closed. The connection's current state is open." This problem is the reason I'm trying to write this retry method in the first place! If anyone knows how to resolve this problem, please write. Thanks
You do
myCon.Open();
inside the for loop, but
myCon = DBcon.getInstance().conn();
outside of it. This way you try to open the same connection multiple times. If you want to protect against losing the DB connection, you need to put both inside the loop.
You should either move the call to myCon.Open() outside the for statement, or wrap myCon.Open() in a check of the connection state before re-opening the connection:
if (myCon.State != ConnectionState.Open)
{
    myCon.Open();
}
Edited for new information:
How about using transactions to preserve data integrity, getting connections on the fly for multiple access, and wrapping them in using statements to ensure the connections are closed? For example:
using (SqlConnection myCon = new SqlConnection("ConnectionString"))
{
    myCon.Open();
    var transaction = myCon.BeginTransaction();
    try
    {
        // ... do some DB stuff - build your command with SqlCommand but use your transaction and your connection
        var sqlCommand = new SqlCommand(CommandString, myCon, transaction);
        sqlCommand.Parameters.Add(new SqlParameter()); // Build up your params
        sqlCommand.ExecuteNonQuery(); // Or whatever type of execution is best
        transaction.Commit(); // Yayy!
    }
    catch (Exception ex)
    {
        transaction.Rollback(); // D'oh!
        // ... Some logging
    }
    myCon.Close();
}
This way, even if you forget to call Close, the connection will still be closed implicitly when it reaches the end of its using block.
Have you tried adding
myCon.Close();
into a finally block? It looks like it is never reached when you have an exception. I would also highly recommend wrapping the connection, command object, etc. in using statements. This will ensure they are disposed of properly and the connection is closed.
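Putting those suggestions together, the retry method could be sketched like this (it assumes DBcon.getInstance().conn() hands out a new, unopened SqlConnection on every call; if it returns a shared instance, replace that call with new SqlConnection(...)):

const int MaxRetries = 3;
Product pro = new Product();
string query = "SELECT * FROM Product WHERE Barcode = @barcode";

for (int tries = 1; tries <= MaxRetries; tries++)
{
    try
    {
        // each attempt gets its own connection, so a failed attempt can never
        // leave a shared connection open for the next one
        using (SqlConnection myCon = DBcon.getInstance().conn())
        using (SqlCommand com = new SqlCommand(query, myCon))
        {
            com.Parameters.AddWithValue("@barcode", barcode);
            myCon.Open();
            using (SqlDataReader dr = com.ExecuteReader())
            {
                if (dr.Read())
                {
                    pro.Barcode = dr.GetString(0);
                    pro.Name = dr.GetString(1);
                }
            }
        }
        break; // success
    }
    catch (SqlException)
    {
        if (tries == MaxRetries)
            throw; // give up after the last attempt
    }
}
return pro;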
