I'm running a large number of extracts from a SQL database via a C# Script Task within an SSIS package. The connection to the source is acquired from a connection manager within the package:
object rawConnection = Dts.Connections[sqlSpecItems["ConnectionManager"]].AcquireConnection(Dts.Transaction);
SqlConnection connectionFromCM = (SqlConnection)rawConnection;
(sqlSpecItems is a Dictionary that supplies the name of the connection manager to use.)
The ConnectionTimeout property of the connection manager is set to 0. The connection string generated for the CM is:
Data Source=MyDatabase;User ID=MyUserName;Initial Catalog=MyDatabaseName;Persist Security Info=True;Asynchronous Processing=True;Connect Timeout=0;Application Name=MyPackageApplicationName;
The connection is used to return a SqlDataReader object as follows:
private SqlDataReader GetDataReaderFromQuery(string sqlQueryToExecute)
{
    // connect to server
    SqlConnection sqlReaderSource = GetSourceSQLConnection();

    // create command
    SqlCommand sqlReaderCmd = new SqlCommand(sqlQueryToExecute, sqlReaderSource)
    {
        CommandType = CommandType.Text,
        CommandTimeout = 0
    };

    // execute query to return data to reader
    SqlDataReader sqlReader = sqlReaderCmd.ExecuteReader();
    return sqlReader;
}
The operation fails with the error message:
Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Logged start and end times for the operation are typically around 50s apart, but can be up to 120s.
The source database is a Business Central cloud-hosted SQL database. The failures occur on a small number of specific extracts, namely the larger ones (although these are by no means large in absolute terms, around 20k rows). When attempting to query these via SSMS, there is typically a delay as the data has to be fetched from the source into memory, which I suspect is the reason for the timeout (notwithstanding the timeout being set to 0 for both the connection and the command). Note that once the failure has happened, the data is in memory, so it doesn't repeat if I restart the job.
I've looked at various answers on this site, as well as across other sites. Nothing I have seen so far has given me any idea of what I might try to resolve this problem. Any help would be appreciated.
So, I found the problem. Just in case it helps anyone else: the timeout wasn't being thrown by the reader. Rather, it was the SqlBulkCopy operation downstream that I pass the reader into. Adding:
bulkCopy.BulkCopyTimeout = 0;
has cleared the problem.
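For anyone hitting the same thing, a minimal sketch of the fix in context (destinationConnection and the destination table name are assumed placeholders, not names from the original package):

// Sketch only: destinationConnection and "dbo.MyTargetTable" are placeholders.
using (SqlDataReader sqlReader = GetDataReaderFromQuery(sqlQueryToExecute))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
{
    bulkCopy.DestinationTableName = "dbo.MyTargetTable";
    bulkCopy.BulkCopyTimeout = 0;      // 0 = no timeout for the bulk load itself
    bulkCopy.WriteToServer(sqlReader); // stream the reader straight into the target table
}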
Related
I have set the command timeout to 0 as per the SQL Server documentation. I'm indexing a large table and still get the exception "Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding." The server is responding, though, as I watch it through the SQL Server Monitor.
Here is the pertinent code:
private void ExecuteQuery(string qStr)
{
    using (SqlConnection cnx = new SqlConnection(_ConnectionString))
    {
        cnx.Open();
        using (SqlCommand cmd = new SqlCommand(qStr, cnx))
        {
            cmd.CommandTimeout = 0;
            cmd.ExecuteNonQuery();
        }
    }
}
This is the connection string:
Data Source='tcp:aplace.database.windows.net,1433';Initial Catalog='SQL-Dev';User Id='user#place';Password='password';Connection Timeout=360
Why am I getting an execution timeout? I have the connection timeout set to 7200 seconds as well. I am indexing a 31-million-row table on one column.
First, connection timeout and command timeout are not the same thing. Make sure you understand the difference and are using them correctly.
Second, if this is from a web page, you also need to consider timeout values relating to the web server, etc.
Third, to verify that it is in fact a timeout issue, execute your index statement in SSMS and find out how long it takes. Since the actual indexing takes place on the SQL server no matter where it is called from, the indexing time should be roughly equal whether running from SSMS or your application.
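To make the first point concrete, here is a rough sketch of the question's method with both timeouts annotated (SqlConnectionStringBuilder is used here just for readability; the values are the ones from the question):

private void ExecuteQueryWithExplicitTimeouts(string qStr)
{
    // Connection Timeout limits how long Open() may take;
    // CommandTimeout limits how long the statement itself may run.
    var builder = new SqlConnectionStringBuilder
    {
        DataSource = "tcp:aplace.database.windows.net,1433",
        InitialCatalog = "SQL-Dev",
        UserID = "user#place",
        Password = "password",
        ConnectTimeout = 360        // seconds allowed for Open()
    };

    using (var cnx = new SqlConnection(builder.ConnectionString))
    using (var cmd = new SqlCommand(qStr, cnx))
    {
        cmd.CommandTimeout = 0;     // 0 = the statement may run indefinitely
        cnx.Open();                 // governed by ConnectTimeout
        cmd.ExecuteNonQuery();      // governed by CommandTimeout
    }
}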
I'm new to C# and SQL Server; I wrote a simple stored procedure in T-SQL, and I call it from C# with the code below:
da = new SqlDataAdapter("select_equals_Cycle", con);
da.SelectCommand.CommandType = CommandType.StoredProcedure;
ds = new DataSet();
da.Fill(ds, "select_equals_Cycle");
but on this line:
da.Fill(ds, "select_equals_Cycle");
I get this error:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
My connection string is this:
string conn = "Data Source=.;Initial Catalog=ClubEatc;Integrated Security=True;Connect Timeout=30;";
How can I solve this? Thanks.
Use CommandTimeout or optimize the StoredProcedure
da.SelectCommand.CommandTimeout = 180; // default is 30 seconds
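Put together with the question's code, that looks roughly like this (conn is the connection string variable from the question; 180 seconds is only an example value):

using (var con = new SqlConnection(conn))
using (var da = new SqlDataAdapter("select_equals_Cycle", con))
{
    da.SelectCommand.CommandType = CommandType.StoredProcedure;
    da.SelectCommand.CommandTimeout = 180; // seconds; the default is 30

    var ds = new DataSet();
    da.Fill(ds, "select_equals_Cycle");    // Fill opens and closes the connection itself
}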
Don't change the timeout value if you don't know how to configure it; Microsoft has given the option a sensible, well-defined default value.
The timeout question has two items you have to check: the database configuration and the client (C#) configuration.
Database (e.g. SQL Server):
-- Run the statements below in a SQL Server query window.
-- First, enable the advanced options so that 'query wait' can be configured.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- 'query wait' is the number of seconds a query waits for memory before timing out.
-- With the default (-1), SQL Server waits 25 times the estimated query cost;
-- if the memory is still not available after that, a timeout exception is thrown.
EXEC sp_configure 'query wait', 200;
RECONFIGURE;
// The client-side (C#) timeout: SqlConnection.ConnectionTimeout is read-only,
// so the value is set through the connection string, e.g. "...;Connect Timeout=200;".
// Per the documentation, it is the time (in seconds) to wait while trying to
// establish a connection before terminating the attempt and generating an error.
// A value of 0 means wait indefinitely.
After you scan the code snippets above, you will see that two things can cause the exception:
your code could not establish the connection in time, or
the machine running SQL Server could not free enough memory to run the big query.
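As a concrete illustration of the client side: ConnectionTimeout cannot be assigned in code, so it has to come from the connection string (the server and database names below are placeholders):

// Sketch: "Connect Timeout" in the connection string drives the read-only
// SqlConnection.ConnectionTimeout property (0 = wait indefinitely).
var connectionString =
    "Data Source=MyServer;Initial Catalog=MyDatabase;Integrated Security=True;" +
    "Connect Timeout=200;";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    Console.WriteLine(connection.ConnectionTimeout); // prints 200
}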
I know this question has been asked many times however none of the answers fit my issue.
I have a thread timer firing every 30 seconds that queries an MSSQL database that is under heavy load. If I need to update the data in the console app I'm using, I use LINQ to SQL to update the data stored in memory.
My problem is that sometimes I get the error "ExecuteReader requires an open and available Connection."
The code from the thread timer fires a Thread.Run(reload());
The connection string is
//example code
void reload(...
string connstring = string.Format("Data Source={0},{1};Initial Catalog={2};User ID={3};Password={4};Application Name={5};Connect Timeout=120;MultipleActiveResultSets=True;Max Pool Size=1524;Pooling=true;"
settings = new ConnectionStringSettings("sqlServer", connstring, "System.Data.SqlClient");
using (var tx = new TransactionScope(TransactionScopeOption.Required,
new TransactionOptions() { IsolationLevel = IsolationLevel.ReadUncommitted }))
{
using (SwitchDataDataContext data = new SwitchDataDataContext(settings.ConnectionString))
{
data.CommandTimeout = 560;
Then I do many LINQ to SQL searches. The exceptions happen from time to time, but not always on the same queries; it's as if the connection is opened and then forcibly closed.
Sometimes the exception says the current state is Open, Closed, or Connecting. I added a larger ThreadPool to the SQL db, but nothing seems to help.
I also use ADO.NET in other parts of the program without any issues.
I believe that your problem is that the transaction scope also has a timeout. The default timeout is 1 minute according to this answer, so the transaction times out long before your command does (560 seconds is roughly 9.3 minutes). You will need to set the Timeout property on the TransactionOptions instance you are creating:
new TransactionOptions()
{
IsolationLevel = IsolationLevel.ReadUncommitted,
Timeout = new TimeSpan(0,10,0) /* 10 Minutes */
}
You can verify that this is indeed the issue by setting the TransactionScope timeout to a small value to force it to time out.
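For example, a sketch based on the code in the question (SwitchDataDataContext and settings come from there; the ten-minute value is only an illustration, and note that machine.config caps the maximum transaction timeout at 10 minutes by default):

var options = new TransactionOptions
{
    IsolationLevel = IsolationLevel.ReadUncommitted,
    Timeout = TimeSpan.FromMinutes(10) // keep this at least as long as the command timeout
};

using (var tx = new TransactionScope(TransactionScopeOption.Required, options))
using (var data = new SwitchDataDataContext(settings.ConnectionString))
{
    data.CommandTimeout = 560;
    // ... run the LINQ to SQL queries here ...
    tx.Complete();                     // commit the scope once the work has succeeded
}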
I changed LINQ to SQL to Entity Framework and received the same type of message. I believe the issue is lazy loading: I was using the collections on a different thread before they were ready. I just added .Include("Lab") to my query to load the related data eagerly, and it seems to have fixed the issue.
One of the first things people learn in their early use of MySQL is that closing a connection right after its use is important, but why is this so important? Well, if we do it on a website it can save some server resources (as described here), but why should we do that in a .NET desktop application? Does it share the same issues as a web application, or are there others?
If you use connection pooling, you don't close the physical connection by calling con.Close; you just tell the pool that the connection can be reused. If you run database code in a loop and don't close the connections, you'll quickly get exceptions like "too many open connections".
Check this:
for (int i = 0; i < 1000; i++)
{
    var con = new SqlConnection(Properties.Settings.Default.ConnectionString);
    con.Open();
    var cmd = new SqlCommand("Select 1", con);
    var rd = cmd.ExecuteReader();
    while (rd.Read())
        Console.WriteLine("{0}) {1}", i, rd.GetInt32(0));
}
One of the possible exceptions:
Timeout expired. The timeout period elapsed prior to obtaining a
connection from the pool. This may have occurred because all pooled
connections were in use and max pool size was reached.
By the way, the same is true for a MySqlConnection.
This is the correct way; use the using statement on all types implementing IDisposable:
using (var con = new SqlConnection(Properties.Settings.Default.ConnectionString))
{
    con.Open();
    for (int i = 0; i < 1000; i++)
    {
        using (var cmd = new SqlCommand("Select 1", con))
        using (var rd = cmd.ExecuteReader())
        {
            while (rd.Read())
                Console.WriteLine("{0}) {1}", i, rd.GetInt32(0));
        }
    }
} // no need to close it with the using statement, will be done in connection.Dispose
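For the MySqlConnection case mentioned above, the same pattern looks like this (a sketch assuming the MySql.Data provider; the connection string is a placeholder):

using (var con = new MySqlConnection("Server=localhost;Database=test;Uid=user;Pwd=password;"))
{
    con.Open();
    using (var cmd = new MySqlCommand("Select 1", con))
    using (var rd = cmd.ExecuteReader())
    {
        while (rd.Read())
            Console.WriteLine(rd.GetInt32(0));
    }
} // Dispose returns the underlying connection to the pool here as well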
Yes, I think it is important to close your connection rather than leaving it open or allowing the garbage collector to eventually handle it. There are a couple of reasons why you should do this, and below them I'll describe the best method for how.
WHY:
So you've opened a connection to the database, sent some data back and forth along this pipeline, and now have the results you were looking for. Ideally at this point you do something else with the data and the end result of your application is achieved.
Once you have the data from the database you don't need the connection anymore; its part in this is done, so leaving it open does nothing but hold up memory, increase the number of connections the database and your application have to keep track of, and possibly push you closer to your maximum connection limit.
"But wait! I have to make a lot of database calls in rapid
succession!"
Okay, no problem: open the connection, run your calls, and then close it out again. Opening a connection to a database in a "modern" application isn't going to cost you a significant amount of computing power or time, while explicitly closing a connection does nothing but help (it frees up memory and lowers your number of current connections).
So that is the why, here is the how
HOW:
So depending on how you are connecting to your MySQL database, you are probably using an IDisposable object to help manage the connection. Here is what MSDN has to say on using an IDisposable:
As a rule, when you use an IDisposable object, you should declare and
instantiate it in a using statement. The using statement calls the
Dispose method on the object in the correct way, and (when you use it
as shown earlier) it also causes the object itself to go out of scope
as soon as Dispose is called. Within the using block, the object is
read-only and cannot be modified or reassigned.
Here is my personal take on the subject:
A using block helps to keep your code cleaner (readability).
A using block helps to keep your code clean memory-wise; it will "automagically" clean up unused items.
A using block helps to prevent a previous connection from being used accidentally, as it automatically closes the connection when you are done with it.
In short, I think it is important to close connections properly, preferably with a con.Close() call in combination with a using block.
As pointed out in the comments this is also a very good question/answer similar to yours: Why always close Database connection?
I have an ASP.NET-based application.
The CPU on the SQL Server box is constantly ~90 - 100%
There are a lot of inefficient queries, which I am currently working on. However, looking at the code from a previous coder, he never seemed to close (or dispose) the SqlConnection.
When I run the following query, I get around 450 connections that are "Awaiting Command":
SELECT Count(*) FROM
MASTER.DBO.SYSPROCESSES WHERE
DB_NAME(DBID) = 'CroCMS' AND DBID != 0
AND cmd = 'AWAITING COMMAND'
Is this likely to be causing a problem?
I read this and it seems to relate:
http://www.pythian.com/news/1270/sql-server-understanding-and-controlling-connection-pooling-fragmentation/
We are also getting a lot of timeouts, specifically when replication is enabled. I'm not sure if this is related. I have disabled (transactional) replication for now and it seems OK.
(This server is a subscriber to our in-office database server.)
Would disposing of the SQL connection object help?
Yes, dispose them. Otherwise ignore them for now. Possibly the pool is that large because the statements are slow. I would rather suggest:
Fixing the statements.
Check that the application only uses one connection PER REQUEST (i.e. it does not open multiple at the same time).
If the problem does not get better after optimizing the SQL, you can revisit the pool.
You should always dispose the command object when you're done with it; that way connection pooling can be used more effectively.
The easiest way is to use the using statement.
using (var connection = new SqlConnection("connectionstring"))
using (
    var sqlCommand = new SqlCommand(
        "storedprocname",
        connection)
    { CommandType = CommandType.StoredProcedure })
{
    connection.Open();
    // do what you should.. setting params, executing, etc.
}