I have an application server that connects to a SQL Server database and returns query results to client applications via WCF. Most of the code there looks like this:
using (SqlConnection sqlConnection = GetConnectionProvider().NewConnection())
{
    using (SqlCommand sqlCommand = new SqlCommand(data.SelectStoredProcedureName, sqlConnection))
    {
        sqlCommand.CommandType = CommandType.StoredProcedure;
        sqlCommand.CommandTimeout = 600;
        // ... add params
        using (SqlDataReader sqlReader = sqlCommand.ExecuteReader())
        {
            // ... fill data
        }
        return serializedData;
    }
}
In 99.99% of cases it works well, but in some rare cases the application begins to work very slowly. The average response time is 13 ms; after one of these freezes the response time grows to 300-400 ms, and the application begins to use 1 GB of memory and 100% CPU.
I'm not sure, but it may be related to long-running SQL queries. For example, during ordinary work some user runs a long-running report, and that may cause the server to freeze.
What could be the reason for this behavior? Any thoughts on how I can diagnose it?
We see that creating new connections takes about 250 milliseconds, which is much slower than expected.
From our application server to the SQL Server we have about a 1 millisecond ping time, and we are operating inside a pretty fast LAN.
Our connection string is:
Data Source=SQLServer;Initial Catalog=Database;Integrated Security=True
I have measured the elapsed time around this simple statement:
var conn = new SqlConnection(_connectionString);
if (conn.State == ConnectionState.Closed)
{
    var sw = Stopwatch.StartNew();
    await conn.OpenAsync().ConfigureAwait(false);
    sw.StopAndLogIfDelay(20, _log, "Open SQL Server connection");
}
I would not expect 250 milliseconds to connect in a fast LAN. Does anyone have ideas or experience as to where the problem could be?
Setting the minimum connections to 50 in the SQL Server connection pool solved the immediate problem.
What remains is to investigate why opening DB connections takes such a long time in our organisation. I will work with our network team to see if we can locate the problem.
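For reference, the pool minimum can be set directly in the connection string; a sketch based on the connection string above (server and database names are placeholders):

```
Data Source=SQLServer;Initial Catalog=Database;Integrated Security=True;Min Pool Size=50
```

With a non-zero Min Pool Size, the pool pre-opens that many physical connections, so most `Open()` calls just hand back an already-established connection instead of paying the full connect cost.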
I have set the command timeout to 0 as per the SQL Server documentation. I'm indexing a large table, and I still get an exception: "Execution timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." The server is responding, though, as I can watch it through the SQL Server monitor.
Here is the pertinent code:
private void ExecuteQuery(string qStr)
{
    using (SqlConnection cnx = new SqlConnection(_ConnectionString))
    {
        cnx.Open();
        using (SqlCommand cmd = new SqlCommand(qStr, cnx))
        {
            cmd.CommandTimeout = 0;
            cmd.ExecuteNonQuery();
        }
    }
}
This is the connection string:
Data Source='tcp:aplace.database.windows.net,1433';Initial Catalog='SQL-Dev';User Id='user#place';Password='password';Connection Timeout=360
Why am I getting an execution timeout? I have the connection timeout set to 7200 seconds as well. I am indexing a 31 million row table on one column.
First, connection timeout and command timeout are not the same thing. Make sure you understand the difference and are using them correctly.
Second, if this is from a web page, you also need to consider timeout values relating to the web server, etc.
Third, to verify that it is in fact a timeout issue, execute your index statement in SSMS and find out how long it takes. Since the actual indexing takes place on the SQL server no matter where it is called from, the indexing time should be roughly equal whether running from SSMS or your application.
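To illustrate the first point, the two timeouts live in different places: the connect timeout belongs to the connection string and only governs `Open()`, while the command timeout is set per command and governs execution. A minimal sketch (server, database, and index statement are placeholders):

```csharp
// Connect Timeout (connection string): how long Open() may wait.
var builder = new SqlConnectionStringBuilder
{
    DataSource = "tcp:aplace.database.windows.net,1433",
    InitialCatalog = "SQL-Dev",
    ConnectTimeout = 360        // seconds allowed for establishing the connection
};

using (var cnx = new SqlConnection(builder.ConnectionString))
{
    cnx.Open();
    using (var cmd = new SqlCommand("CREATE INDEX IX_Example ON dbo.BigTable (SomeColumn)", cnx))
    {
        // CommandTimeout (per command): how long execution may run.
        cmd.CommandTimeout = 0; // 0 = wait indefinitely for this command
        cmd.ExecuteNonQuery();
    }
}
```

Note that raising the connect timeout has no effect on a long-running command, and vice versa; each must be set on the object it applies to.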
First of all, I realize that my question MAY be broad; please bear with me, as I've been thinking about how to phrase it for a month and I'm still not 100% sure how to express my issues.
I'm currently developing a website that will be used by many thousands of users daily. The bottleneck is the communication with the database.
Each and every conversation with the tables is done through stored procedures, whose calls look like this:
public void storedProcedure(int id, out DataSet ds)
{
    ds = new DataSet("resource");
    SqlCommand cmd = new SqlCommand("storedProcedure", DbConn.objConn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@id", id));
    openConnection(cmd);
    SqlDataAdapter objDataAdapter = new SqlDataAdapter();
    objDataAdapter.SelectCommand = cmd;
    objDataAdapter.Fill(ds);
    cmd.Connection.Close();
}
Or
public void anotherStoredProcedure(int var1, int var2, int var3, int var4, string var5, out DataSet ds)
{
    ds = new DataSet("ai");
    SqlCommand cmd = new SqlCommand("anotherStoredProcedure", DbConn.objConn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@var1", var1));
    cmd.Parameters.Add(new SqlParameter("@var2", var2));
    cmd.Parameters.Add(new SqlParameter("@var3", var3));
    cmd.Parameters.Add(new SqlParameter("@var4", var4));
    cmd.Parameters.Add(new SqlParameter("@var5", var5));
    openConnection(cmd);
    SqlDataAdapter objDataAdapter = new SqlDataAdapter();
    objDataAdapter.SelectCommand = cmd;
    objDataAdapter.Fill(ds);
    cmd.Connection.Close();
}
My objConn is defined as following:
public static string DatabaseConnectionString = System.Configuration.ConfigurationManager.ConnectionStrings["objConnLocal"].ConnectionString;
public static SqlConnection objConn = new SqlConnection(DatabaseConnectionString);
And of course in web.config I have:
<add name="objConnLocal" connectionString="Initial Catalog=t1;Data Source=1.2.3.4;Uid=id;pwd=pwd;Min Pool Size=20;Max Pool Size=200;" providerName="SQLOLEDB.1"/>
Now the issue is: on every page_load there are a few SP calls (above), and when the user starts navigating through the page, more calls are made.
At the moment only the development and testing team are on the site, and at times the speed is really slow. Frequently a page keeps loading until it times out (error 504).
Another problem (only every now and then, but certainly frequent enough to be noticeable) on first user login: it keeps trying to run a call, but the connection claims to already be open, even though it shouldn't be. A barely working workaround is:
private void openConnection(SqlCommand cmd)
{
    if (cmd.Connection.State != ConnectionState.Closed)
    {
        cmd.Connection.Close();
    }
    if (cmd.Connection.State == ConnectionState.Open)
    {
        cmd.Connection.Close();
    }
    try
    {
        cmd.Connection.Open();
    }
    catch (Exception ex)
    {
        System.Threading.Thread.Sleep(1000);
        HttpContext.Current.Response.Redirect("/");
    }
}
Which makes connecting slow but at least doesn't show the YSOD.
So, what am I doing wrong in my SQL calls that makes it so slow for only 5-10 users? What I have so far:
I've read on Stack Overflow that using "using" is quite nice, but I'm not entirely sure why or how, as it was a single-line comment under an answer. Another idea for improvement was to use several connection strings instead of only one.
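For what it's worth, the "using" suggestion would look roughly like this: a fresh SqlConnection per call, disposed as soon as the call completes, with ADO.NET connection pooling keeping the repeated opens cheap. This is only a sketch (the procedure name and parameter are taken from the first example above, and the return type is changed from an out parameter for brevity):

```csharp
public DataSet storedProcedure(int id)
{
    var ds = new DataSet("resource");
    // A new connection object per call; the pool reuses the
    // underlying physical connection, so this is cheap.
    using (var conn = new SqlConnection(DbConn.DatabaseConnectionString))
    using (var cmd = new SqlCommand("storedProcedure", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add(new SqlParameter("@id", id));
        using (var adapter = new SqlDataAdapter(cmd))
        {
            adapter.Fill(ds); // Fill opens and closes the connection itself
        }
    }
    return ds;
}
```

Because nothing shares a single static SqlConnection, two simultaneous requests can no longer step on each other's connection state, which is exactly the symptom described above.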
Resolved:
Changing the way the connection is established in the connection string, from username/password to Integrated Security, resolved the issue. If anyone's having a similar issue, refer to http://www.codeproject.com/Articles/17768/ADO-NET-Connection-Pooling-at-a-Glance
You're right - it's a broad question!
For context - many "thousands of users daily" isn't huge from a performance point of view. A well-built ASP.Net application can typically support hundreds of concurrent users on a decently specified developer laptop; assuming 10K users per day, you probably only have a few dozen concurrent users at peak times (of course this depends entirely on the application domain).
The first thing to do is to use a profiler on your running code to see where the performance bottleneck is. This is available in VS, and there are several 3rd party solutions (I like RedGate and JetBrains).
The profiler will tell you where your code is slow - it should be pretty obvious if it's taking seconds for pages to render.
At first glance, it looks like you have a problem with the database. So you can also use the SQL Server Activity Monitor to look at long-running queries.
Now the issue is: On every page_load there a few sp calls (above), and
when the user starts navigating through the page, more calls are made.
This sounds like you've written webpages which won't display anything until the Stored Procedure calls have completed. This is never a good idea.
Shift these SP calls into a background thread, so the user at least sees something when they go onto the webpage (like a "Please wait" message). This can also help prevent timeout messages.
One other thing: you don't say why your SPs take so long to run.
If you're dealing with lots of records, it's worth running a SQL script (described at the link below) to check for missing SQL Server indexes.
Finding missing indexes
This script shows the missing indexes that would have the most impact for your users, and also gives you the syntax of the CREATE INDEX command you'd need to run to add them.
Apologies if the subject does not accurately reflect my exact issue; I'm struggling to explain the problem I'm having, although it seems quite straightforward.
I've built a simple "db helper" class which executes sql statements for me, given some parameters, etc. Here's the code block:
public DataSet selectSprocData(string sprocName, SqlParameter[] parameterArray, out int returnValue)
{
    // Processes the specified SELECT stored procedure based on the parameter array provided;
    // this is the only place anywhere in the application we do a simple SELECT via a sproc.
    DataSet dataset = new DataSet();
    using (SqlConnection cn = new SqlConnection(ConfigurationManager.ConnectionStrings["MyServer"].ToString()))
    using (SqlDataAdapter adapter = new SqlDataAdapter(sprocName, cn))
    {
        cn.Open();
        adapter.SelectCommand.CommandType = CommandType.StoredProcedure;
        adapter.SelectCommand.Parameters.AddRange(parameterArray);
        SqlParameter retValParam = adapter.SelectCommand.Parameters.Add("@RETURN_VALUE", SqlDbType.Int);
        retValParam.Direction = ParameterDirection.ReturnValue;
        adapter.SelectCommand.CommandTimeout = 600;
        adapter.Fill(dataset);
        returnValue = (int)retValParam.Value;
    }
    return dataset;
}
When I take a long-running sproc and execute it within SSMS, it will run and eventually time out. Meanwhile I can open another query window in SSMS and execute any other SELECT or query against my db.
Now, when I call this sproc through my web app using the code block above, the page spins and loads until eventually (a few minutes later) the process times out.
However, during this web-based call I can NOT open any other window and execute any other UI functions that use the same db code to call other sprocs.
Essentially, one user executing a long-running sproc/function from the UI seems to block everyone else from doing anything on my app.
I understand that first and foremost I need better queries that don't time out, but is there something I'm missing or not doing right in .NET/C# that would cause all other connection or command attempts to be blocked until the first one has finished or timed out?
My web.config connection string has no special parameters, simply:
Persist Security Info=False;User ID=sa;Password=xxxx;Initial Catalog=db_live;Data Source=my.host.com
Any help would be greatly appreciated.
You can't really compare a Windows application (SQL Server Management Studio) to an ASP.NET web app (running in IIS).
All of your ASP.NET code for a given request runs on a single thread, so that thread has to wait for the database code to complete, which blocks everything else until it's done.
Use an ASP.NET UpdatePanel to perform your long-running stored procedure call and UI binding.
ASP.NET Update Panel
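Another option, assuming .NET 4.5+ and an async-enabled page, is ADO.NET's async API, so the long call doesn't tie up the request thread at all. A sketch (connection string and sproc name are placeholders):

```csharp
public async Task<DataTable> RunSprocAsync(string connectionString, string sprocName)
{
    var table = new DataTable();
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sprocName, conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandTimeout = 600;
        await conn.OpenAsync();                 // frees the thread while connecting
        using (var reader = await cmd.ExecuteReaderAsync())
        {
            table.Load(reader);                 // stream the rows into the table
        }
    }
    return table;
}
```

This doesn't make the sproc itself any faster, but it stops one slow request from holding a worker thread hostage while it waits on the database.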
I've noticed some peculiar behaviour getting SQL data into a C# application. I used a dataset XSD, but this began to time out. I then changed my approach: I already had a class which would generate and return a DataTable for other operations, so I tried that. This too timed out.
I had opened the Activity Monitor in SSMS to get an idea of what was happening in SQL Server when the code ran, and noticed that whichever way I run it, the Fill command causes SQL Server to peak at 100% CPU and stay there until the command is cancelled. This is a good server with plenty of oomph: 2.40 GHz processors and 30 GB RAM. The query is not exactly zippy, but it returns 100k rows in under 3 seconds in SSMS.
Here is my code for the dataset:
public DataTable UKDataRefresh()
{
    UKREFRESH.UKNewContactsDataTable dt =
        new UKREFRESH.UKNewContactsDataTable();
    UKREFRESHTableAdapters.UKNewContactsTableAdapter ta =
        new UKREFRESHTableAdapters.UKNewContactsTableAdapter();
    ta.Connection.ConnectionString += ";Password = xxxxxx;";
    try
    {
        ta.Fill(dt, global::SilverPopTransfer.Properties.Settings.Default.UKDATELASTACTION);
    }
    catch (SqlException)
    {
        throw;
    }
    return dt;
}
Here is my code for building it on the fly:
public DataTableOperations(string strconnection, string strSelect, string tablename)
{
    SqlConnection c = new SqlConnection(strconnection);
    connection = c;
    SqlDataAdapter da = new SqlDataAdapter(strSelect, connection);
    DataSet ds = new DataSet();
    // Added this to see what would happen.
    da.SelectCommand.CommandTimeout = 0;
    connection.Open();
    da.Fill(ds, tablename);
    connection.Close();
    Datatable = ds.Tables[tablename];
    _disposed = false;
}
I'm looking for clues as to what might cause the problem, not a full solution.
Incidentally, I ran a similar function in a pre-existing console application before posting, and it connected and ran without error; the query was almost identical.
The sole difference is that the pre-existing application uses integrated security, whereas in this case I specify the user and password. I have confirmed the login credentials.
Check your indexes, and SET NOCOUNT ON before execution. SSMS is not a good performance indicator: it fetches only partial data and executes asynchronously. Try retrieving a small subset of the data, or running the query without filling the table; the Fill itself may be the bottleneck.
If you are running the SELECT with parameters, SQL Server will execute it via sp_executesql and may compile a different plan for it than the one SSMS gets, which can create CPU peaks.
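To check whether the Fill is the bottleneck, one approach is to time a plain ExecuteReader that reads and discards the rows, and compare that with the adapter Fill. A sketch (the connection string and query variables are placeholders for your own):

```csharp
// Measure query + transfer time only, without building a DataTable.
var sw = Stopwatch.StartNew();
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(selectQuery, conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        int rows = 0;
        while (reader.Read()) rows++;   // read and discard every row
        Console.WriteLine($"Raw read: {rows} rows in {sw.ElapsedMilliseconds} ms");
    }
}
// If this is fast but DataAdapter.Fill is slow, the time is going into
// building the DataTable (or into a different query plan), not the query itself.
```

If the raw read is also slow from the application but fast in SSMS, that points toward a plan difference (parameters, SET options) rather than client-side processing.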