I've noticed some peculiar behaviour when getting SQL data into a C# application. I used a dataset XSD, but this began to time out. I then changed my approach: I already had a class which would generate and return a DataTable for other operations, so I tried that. This too timed out.
I opened the Activity Monitor in SSMS to get an idea of what was happening in SQL Server when the code ran, and noticed that whichever way I run it, the Fill command causes SQL Server to peak at 100% CPU and stay there until the command is cancelled. This is a good server with plenty of oomph: 2.40 GHz processors and 30 GB RAM. The query is not exactly zippy, but it returns 100k rows in under 3 seconds in SSMS.
Here is my code for the dataset:
public DataTable UKDataRefresh()
{
    UKREFRESH.UKNewContactsDataTable dt =
        new UKREFRESH.UKNewContactsDataTable();
    UKREFRESHTableAdapters.UKNewContactsTableAdapter ta =
        new UKREFRESHTableAdapters.UKNewContactsTableAdapter();

    // Append the password to the designer-generated connection string.
    ta.Connection.ConnectionString += ";Password = xxxxxx;";
    try
    {
        ta.Fill(dt, global::SilverPopTransfer.Properties.Settings.Default.UKDATELASTACTION);
    }
    catch (SqlException)
    {
        // Rethrow unchanged so the caller sees the original SqlException.
        throw;
    }
    return dt;
}
Here is my code for building it on the fly:
public DataTableOperations(string strconnection, string strSelect, string tablename)
{
    SqlConnection c = new SqlConnection(strconnection);
    connection = c;
    SqlDataAdapter da = new SqlDataAdapter(strSelect, connection);
    DataSet ds = new DataSet();

    // Added this to see what would happen: 0 means the command never times out.
    da.SelectCommand.CommandTimeout = 0;

    connection.Open();
    da.Fill(ds, tablename);
    connection.Close();

    Datatable = ds.Tables[tablename];
    _disposed = false;
}
I'm looking for clues as to what might cause the problem, not a full solution.
Incidentally, I ran a similar function in a pre-existing console application before posting, and it connected and ran without error; the query was almost identical.
The sole difference is that the pre-existing application uses integrated security, whereas here I specify the user and password. I have confirmed the login credentials.
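Since the credentials are the only difference, it may be worth ruling out the way the password is appended to the connection string in the first snippet. A minimal sketch using SqlConnectionStringBuilder, with placeholder server, database, and login names (they are assumptions, not taken from the question):

using System.Data.SqlClient;

// Sketch only: DataSource, InitialCatalog and UserID below are placeholders.
var builder = new SqlConnectionStringBuilder
{
    DataSource = "myserver",
    InitialCatalog = "UKRefreshDb",
    UserID = "appuser",
    Password = "xxxxxx"
};

using (var connection = new SqlConnection(builder.ConnectionString))
{
    connection.Open();
    // run the query / fill the table here
}

Building the string explicitly avoids any surprises from concatenation (duplicate or conflicting keys) and makes it easy to compare the exact string the application uses with the one SSMS uses.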
Check your indexes and SET NOCOUNT ON before execution. SSMS performance is not a good indicator: it fetches only partial data and executes asynchronously. Try pulling a small subset of the data, or running the query without filling the table; the fill itself can be the bottleneck.
If you are running the select with parameters, SQL Server will execute it via sp_executesql and compile a plan for it, which can create CPU peaks.
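As a way to follow that advice, here is a minimal diagnostic sketch; the table and column names are assumptions, not taken from the question. It times a TOP-limited version of the query with an explicitly typed parameter and streams the rows through a reader instead of filling a DataTable, which helps separate query cost from fill cost:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

class FillDiagnostics
{
    static void Main()
    {
        const string connectionString = "...";   // use the application's real connection string here
        // Hypothetical table/column names; substitute the real query with a TOP clause added.
        const string sql = "SELECT TOP (1000) * FROM dbo.UKNewContacts WHERE LastAction > @lastAction";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            // Typing the parameter explicitly avoids implicit conversions that can force a scan.
            command.Parameters.Add("@lastAction", SqlDbType.DateTime).Value = new DateTime(2012, 1, 1);

            connection.Open();
            var watch = Stopwatch.StartNew();
            int rows = 0;
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read()) rows++;   // stream rows without building a DataTable
            }
            Console.WriteLine("{0} rows in {1} ms", rows, watch.ElapsedMilliseconds);
        }
    }
}

If the TOP-limited reader is fast but the full Fill is not, the time is likely going into transferring and materialising the rows rather than into the plan itself.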
Related
I have a C# program that connects to a remote server to query data. The data is quite big, so the query takes about 2 minutes to finish. During this 2-minute window the internet connection went down, which left the job unable to finish: the program got stuck in the data-retrieval routine.
It established the connection, but was cut off during the select query. Setting the command timeout to 30 seconds did not help. I need the query to fail when it hits this error, because the program can handle failure but it cannot handle being stuck. Thanks!
UPDATE: included code
OracleConnection connection = new OracleConnection(connectionstring);
OracleDataAdapter oracleDataAdapter = new OracleDataAdapter(new OracleCommand(query, connection));
oracleDataAdapter.SelectCommand.CommandTimeout = 30;
DataSet dataSet = new DataSet();
try
{
    oracleDataAdapter.Fill(dataSet, table); // Hangs on this line when the connection is lost
    return dataSet;
}
catch
{
    throw;
}
finally
{
    dataSet.Dispose();
    oracleDataAdapter.Dispose();
}
UPDATE AGAIN:
What I need is a way to handle this situation, because I don't want a dangling process.
The simplest behaviour would be for the program to throw an error once the connection is lost; that is the part I don't know how to do. I assumed CommandTimeout would cover it, but it did not.
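One hedged approach, shown here only as a sketch under the assumption that the provider honours OracleCommand.Cancel() (the helper name is made up): run the Fill on a worker task, wait with a wall-clock timeout, and cancel the command if it has not finished, so the caller gets an exception instead of hanging.

using System;
using System.Data;
using System.Data.OracleClient;   // assumption: the question's provider; ODP.NET exposes the same members
using System.Threading.Tasks;

static class QueryGuard
{
    // Sketch: enforce our own wall-clock timeout around the blocking Fill call.
    public static DataSet FillWithWallClockTimeout(OracleDataAdapter adapter, string table, TimeSpan timeout)
    {
        var dataSet = new DataSet();
        Task fillTask = Task.Run(() => adapter.Fill(dataSet, table));

        if (!fillTask.Wait(timeout))
        {
            // Best effort: ask the driver to abort the command; how quickly a dead
            // connection is noticed still depends on the provider and OS socket timeouts.
            adapter.SelectCommand.Cancel();
            // Observe any later fault from the abandoned task so it is not left unhandled.
            fillTask.ContinueWith(t => { var ignored = t.Exception; }, TaskContinuationOptions.OnlyOnFaulted);
            throw new TimeoutException("Query did not complete within " + timeout);
        }
        return dataSet;
    }
}

The caller can then treat the TimeoutException like any other failure and restart or skip the job.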
There are a few duplicates reporting this problem, e.g.: System being hang when the connection is lost while adapter is filling the datatables
I found a good thread on MSDN where the OP answers:
I have solved this problem a while back, sorry I forgot to come and let you all know. I worked out that the code stopped executing at that line because (for some reason) there was already an open connection to the database.
Since DA.Fill would open a connection itself if there wasn't one previously opened, it was having a hissy fit and bombing out.
I solved this by putting Connection.Close(); before and after any connection to the database is needed.
Based on this we can see that you are not explicitly opening a connection to the database. I suggest you do a:
connection.Open();
Also follow Steve Py's answer and use using blocks to make sure you are closing the connection and disposing of unmanaged resources.
I see a couple of issues with your statement, assuming it's using ODP.NET. Try the following:
DataSet dataSet = new DataSet();
using (OracleConnection connection = new OracleConnection(connectionstring))
{
    using (OracleDataAdapter oracleDataAdapter = new OracleDataAdapter(new OracleCommand(query, connection)))
    {
        oracleDataAdapter.Fill(dataSet, table);
    }
}
return dataSet;
The using blocks will handle disposing of the connection and the data adapter. In your example the connection never got disposed, which may have been part of your issue. Additionally, I don't think you want to dispose the DataSet if you intend to return it.
Since you were bubbling the exception up with a throw, I removed the exception handling. Keep in mind that the exception will still bubble, so somewhere in your calling code chain you will need to catch and handle it. If the app is just sitting there, be wary of any empty catch blocks eating exceptions.
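For illustration only (the wrapper and consumer names below are hypothetical, not from the question), catching at the call site might look like this fragment:

try
{
    DataSet result = FetchReport(connectionstring, query, table);   // hypothetical wrapper around the Fill code above
    ProcessReport(result);                                          // hypothetical consumer
}
catch (OracleException ex)
{
    // Log and decide whether to retry, alert, or abort; never leave the catch block empty.
    Console.Error.WriteLine("Query failed: " + ex.Message);
}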
Updated answer:
DataSet dataset = new DataSet();
using (OracleConnection connection = new OracleConnection(connectionstring))
{
    using (OracleDataAdapter oracleDataAdapter = new OracleDataAdapter(new OracleCommand(query, connection)))
    {
        oracleDataAdapter.SelectCommand.CommandTimeout = 30;
        connection.Open();
        oracleDataAdapter.Fill(dataset, table);
    }
}
return dataset;
First of all, I realize that my question MAY be broad; please bear with me, as I've been thinking about how to phrase it for a month and I'm still not 100% sure how to express my issues.
I'm currently developing a website that will be used by many thousands of users daily. The bottleneck is the communication with the database.
Each and every conversation with the tables is done through stored procedures, whose calls look like this:
public void storedProcedure(int id, out DataSet ds)
{
    ds = new DataSet("resource");
    SqlCommand cmd = new SqlCommand("storedProcedure", DbConn.objConn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@id", id));

    openConnection(cmd);

    SqlDataAdapter objDataAdapter = new SqlDataAdapter();
    objDataAdapter.SelectCommand = cmd;
    objDataAdapter.Fill(ds);

    cmd.Connection.Close();
}
Or
public void anotherStoredProcedure(int var1, int var2, int var3, int var4, string var5, out DataSet ds)
{
    ds = new DataSet("ai");
    SqlCommand cmd = new SqlCommand("anotherStoredProcedure", DbConn.objConn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@var1", var1));
    cmd.Parameters.Add(new SqlParameter("@var2", var2));
    cmd.Parameters.Add(new SqlParameter("@var3", var3));
    cmd.Parameters.Add(new SqlParameter("@var4", var4));
    cmd.Parameters.Add(new SqlParameter("@var5", var5));

    openConnection(cmd);

    SqlDataAdapter objDataAdapter = new SqlDataAdapter();
    objDataAdapter.SelectCommand = cmd;
    objDataAdapter.Fill(ds);

    cmd.Connection.Close();
}
My objConn is defined as follows:
public static string DatabaseConnectionString = System.Configuration.ConfigurationManager.ConnectionStrings["objConnLocal"].ConnectionString;

// A single static SqlConnection shared by every request in the application.
public static SqlConnection objConn = new SqlConnection(DatabaseConnectionString);
And of course in web.config I have
<add name="objConnLocal" connectionString="Initial Catalog=t1;Data Source=1.2.3.4;Uid=id;pwd=pwd;Min Pool Size=20;Max Pool Size=200;" providerName="SQLOLEDB.1"/>
Now the issue is: on every Page_Load there are a few SP calls (above), and when the user starts navigating through the page, more calls are made.
At the moment only the development and testing team are on the site, and at times the speed is really slow. Frequently it keeps loading until it times out (error 504).
Another problem (only every now and then, but certainly frequent enough to be noticeable) on first user login is that it keeps trying to run a call, but the connection claims to already be open even though it shouldn't be. A fairly not-working workaround is:
private void openConnection(SqlCommand cmd)
{
    if (cmd.Connection.State != ConnectionState.Closed)
    {
        cmd.Connection.Close();
    }
    if (cmd.Connection.State == ConnectionState.Open)
    {
        cmd.Connection.Close();
    }
    try
    {
        cmd.Connection.Open();
    }
    catch (Exception ex)
    {
        // If the shared connection cannot be opened, wait a second and send the user back to the home page.
        System.Threading.Thread.Sleep(1000);
        HttpContext.Current.Response.Redirect("/");
    }
}
This makes connecting slow, but at least it doesn't show the YSOD.
So, what am I doing wrong in my SQL calls that makes it so slow for only 5-10 users? What I have so far:
I've read on Stack Overflow that wrapping connections in using blocks is quite nice, but I'm not entirely sure why or how, as it was a single-line comment under an answer. Another idea for improvement was to use several connection strings instead of only one.
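For what it's worth, here is a minimal sketch of that using pattern, reusing the stored-procedure call from the first snippet above (it assumes the same System.Data / System.Data.SqlClient namespaces): each call creates its own SqlConnection and relies on ADO.NET connection pooling to reuse the physical connection, instead of sharing one static SqlConnection across all requests.

public DataSet GetResource(int id)
{
    var ds = new DataSet("resource");

    // A new SqlConnection per call is cheap: the pool keeps the physical connection alive.
    using (var connection = new SqlConnection(DbConn.DatabaseConnectionString))
    using (var cmd = new SqlCommand("storedProcedure", connection))  // proc name taken from the snippet above
    using (var adapter = new SqlDataAdapter(cmd))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add(new SqlParameter("@id", id));

        connection.Open();
        adapter.Fill(ds);
    }   // the connection is closed and returned to the pool here

    return ds;
}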
Resolved:
Changing the way the connection is established in the connection string, from username/password to Integrated Security, resolved the issue. If anyone is having a similar issue, refer to http://www.codeproject.com/Articles/17768/ADO-NET-Connection-Pooling-at-a-Glance
You're right - it's a broad question!
For context - many "thousands of users daily" isn't huge from a performance point of view. A well-built ASP.Net application can typically support hundreds of concurrent users on a decently specified developer laptop; assuming 10K users per day, you probably only have a few dozen concurrent users at peak times (of course this depends entirely on the application domain).
The first thing to do is to use a profiler on your running code to see where the performance bottleneck is. This is available in VS, and there are several 3rd party solutions (I like RedGate and JetBrains).
The profiler will tell you where your code is slow - it should be pretty obvious if it's taking seconds for pages to render.
At first glance, it looks like you have a problem with the database. So you can also use the SQLServer activity monitor to look at long-running queries.
Now the issue is: on every Page_Load there are a few SP calls (above), and when the user starts navigating through the page, more calls are made.
This sounds like you've written web pages which won't display anything until the stored procedure calls have completed. This is never a good idea.
Shift these SP calls onto a background thread, so the user at least sees something (like a "Please wait" message) when they open the page. This can also help prevent timeout errors.
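As a rough illustration of that idea (and only one variation of it), the data access can be made asynchronous so the request thread is not pinned while SQL Server works; in WebForms this would be wired up with Page.RegisterAsyncTask and Async="true" on the page. The method below is a hypothetical async version of one of the calls from the question and assumes .NET 4.5+:

// Hypothetical async variant of the data-access call; requires .NET 4.5+ and
// using System.Data; using System.Data.SqlClient; using System.Threading.Tasks;
public static async Task<DataSet> GetResourceAsync(int id)
{
    var ds = new DataSet("resource");

    using (var connection = new SqlConnection(DbConn.DatabaseConnectionString))
    using (var cmd = new SqlCommand("storedProcedure", connection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add(new SqlParameter("@id", id));

        await connection.OpenAsync();
        using (var reader = await cmd.ExecuteReaderAsync())
        {
            // SqlDataAdapter has no async Fill, so load the reader into a DataTable instead.
            var table = new DataTable("resource");
            table.Load(reader);
            ds.Tables.Add(table);
        }
    }
    return ds;
}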
One other thing: you don't say why your SPs take so long to run.
If you're dealing with lots of records, it's worth running a SQL script (described at the link below) to check for missing SQL Server indexes.
Finding missing indexes
This script shows the missing indexes which have had the most impact on your users, and also gives you the syntax of the CREATE INDEX command you'd need to run to add them.
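For reference, here is a sketch of the kind of DMV query such scripts are built on, run from C# against the database in the question's connection string; it is a simplified variant for illustration, not the exact script from the linked page:

using System;
using System.Data.SqlClient;

class MissingIndexReport
{
    // Simplified missing-index query over the standard DMVs; the linked script is more complete.
    const string Sql = @"
        SELECT TOP (20)
               mid.statement AS table_name,
               mid.equality_columns,
               mid.inequality_columns,
               mid.included_columns,
               migs.user_seeks,
               migs.avg_total_user_cost * migs.avg_user_impact * migs.user_seeks AS improvement_measure
        FROM sys.dm_db_missing_index_details AS mid
        JOIN sys.dm_db_missing_index_groups AS mig ON mig.index_handle = mid.index_handle
        JOIN sys.dm_db_missing_index_group_stats AS migs ON migs.group_handle = mig.index_group_handle
        ORDER BY improvement_measure DESC;";

    static void Main()
    {
        using (var connection = new SqlConnection("Data Source=1.2.3.4;Initial Catalog=t1;Integrated Security=True"))
        using (var command = new SqlCommand(Sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: equality [{1}], include [{2}]",
                        reader["table_name"], reader["equality_columns"], reader["included_columns"]);
                }
            }
        }
    }
}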
I have an application server that connects to a SQL Server database and returns query results to client applications via WCF. Most of the code there looks like this:
using (SqlConnection sqlConnection = GetConnectionProvider().NewConnection())
{
    SqlCommand sqlCommand = new SqlCommand(data.SelectStoredProcedureName, sqlConnection);
    sqlCommand.CommandType = CommandType.StoredProcedure;
    sqlCommand.CommandTimeout = 600;
    // ... add params
    using (SqlDataReader sqlReader = sqlCommand.ExecuteReader())
    {
        // ... fill data
    }
    return serializedData;
}
In 99.99% of cases it works well, but in some rare cases the application begins to work very slowly. The average response time is 13 ms; after such a freeze the response time grows to 300-400 ms, the application begins to use 1 GB of memory, and CPU usage hits 100%.
I'm not sure, but it may be related to long-running SQL queries. For example, during ordinary work some user runs a report that takes a long time, and that may be what triggers the freeze.
What can be the reason for this behaviour? Any thoughts on how I can diagnose it?
Apologies if the subject does not accurately reflect my exact issue; I'm struggling to explain the problem I'm having, although it seems quite straightforward.
I've built a simple "db helper" class which executes SQL statements for me, given some parameters, etc. Here's the code block:
public DataSet selectSprocData(string sprocName, SqlParameter[] parameterArray, out int returnValue)
{
    // Processes the specified SELECT stored procedure based on the parameter array provided;
    // this is the only place anywhere in the application we do a simple SELECT using a sproc.
    DataSet dataset = new DataSet();
    using (SqlConnection cn = new SqlConnection(ConfigurationManager.ConnectionStrings["MyServer"].ToString()))
    {
        cn.Open();
        SqlDataAdapter adapter = new SqlDataAdapter(sprocName, cn);
        adapter.SelectCommand.CommandType = CommandType.StoredProcedure;
        adapter.SelectCommand.Parameters.AddRange(parameterArray);

        SqlParameter retValParam = adapter.SelectCommand.Parameters.Add("@RETURN_VALUE", SqlDbType.Int);
        retValParam.Direction = ParameterDirection.ReturnValue;

        adapter.SelectCommand.CommandTimeout = 600;
        adapter.Fill(dataset);

        returnValue = (int)retValParam.Value;
        adapter.Dispose();
        cn.Close();
    }
    return dataset;
}
When I take a long-running sproc and execute it within SSMS, it will run and eventually time out. Meanwhile I can open another query window in SSMS and execute any other select or query against my db.
Now, when I call this sproc through my web app using the code block above, the page spins and loads and loads until eventually (a few minutes later) the process times out.
However, during this web-based call I can NOT open any other window and execute any other UI functions that use the same db code to call other sprocs.
Essentially, one user executing a long-running sproc/function from the UI seems to block everyone else from doing anything in my app.
I understand that first and foremost I need better queries that don't time out, but is there something I'm missing or not doing right in .NET/C# that would cause all other connections or command attempts to be blocked until the other one has finished or timed out?
My web.config connection string has no special parameters, simply:
Persist Security Info=False;User ID=sa;Password=xxxx;Initial Catalog=db_live;Data Source=my.host.com
Any help would be greatly appreciated.
You can't really compare a Windows application (SQL Server Management Studio) to an ASP.NET web app (running in IIS).
All of your ASP.NET code is running on a single thread, so that thread has to wait for the database code to complete, which blocks everything else until it's done.
Use an ASP.NET UpdatePanel to perform your long-running stored procedure and UI binding.
ASP.NET Update Panel
I have a C# ASP.NET project that sends a DataSet to my connection class, where I update the database accordingly.
The DataSet I send through is populated with data from 2 tables in my database, so I used a join to get the data (hence the individual updates).
Now I have made the changes I want to the DataSet and want to update the database. Both section 1 and section 2 of my code work IF only one is run at a time (i.e. either section 1 or 2 is commented out).
But when I try to run both, it only updates the database with the first part (no error is thrown, and the code does execute).
Why does this happen? I've also closed and re-opened my connection after the first update to see if that made any difference.
public void udpateCourse(DataSet dataSetEmp)
{
    try
    {
        conn.Open();

        // SECTION 1 -- THE FIRST UPDATE
        da = new SqlDataAdapter("select * from EthicsManagement", conn);
        var builderForTable1 = new SqlCommandBuilder(da);
        da.Update(dataSetEmp, "Table");

        // SECTION 2 -- THE SECOND UPDATE
        da = new SqlDataAdapter("select employeeId, name as [Employee Name] from EmployeeTable", conn);
        builderForTable1 = new SqlCommandBuilder(da);
        da.Update(dataSetEmp, "Table");

        conn.Close();
    }
    catch (Exception ex)
    {
        throw new Exception("Problem with the SQL connection " + ex);
    }
}
--Update--
What I've Tried
Closing and opening the connection again
New Instances of the adapter and builders
Even putting them in a separate method
Having one query (a join on two tables)
I know it's an old question, but perhaps it helps someone.
MSDN says the following happens when calling Update:
When using Update, the order of execution is as follows:
The values in the DataRow are moved to the parameter values.
The OnRowUpdating event is raised.
The command executes.
If the command is set to FirstReturnedRecord, then the first returned result is placed in the DataRow.
If there are output parameters, they are placed in the DataRow.
The OnRowUpdated event is raised.
AcceptChanges is called.
The second Update() has no effect because, by the time it runs, AcceptChanges() has already been called as part of the first Update(), so the rows are no longer marked as changed.
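One possible way around this, sketched against the code in the question (treat it as a sketch: whether it fits a joined DataTable depends on your keys and on which columns each SELECT exposes), is to stop the first adapter from calling AcceptChanges by using DataAdapter.AcceptChangesDuringUpdate, so the row states survive for the second Update:

// Sketch: keep row states alive across both updates, then let the second update accept them.
conn.Open();

// SECTION 1 -- first update, but do NOT accept the changes yet
var da1 = new SqlDataAdapter("select * from EthicsManagement", conn);
var builder1 = new SqlCommandBuilder(da1);
da1.AcceptChangesDuringUpdate = false;   // rows stay flagged as Added/Modified/Deleted
da1.Update(dataSetEmp, "Table");

// SECTION 2 -- the second update still sees the pending changes
var da2 = new SqlDataAdapter("select employeeId, name as [Employee Name] from EmployeeTable", conn);
var builder2 = new SqlCommandBuilder(da2);
da2.Update(dataSetEmp, "Table");         // default behaviour calls AcceptChanges afterwards

conn.Close();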