I have a stored procedure that executes an INSERT statement, and we are seeing significant delays when it runs. When calling it from our C# .NET application to insert 30 records in a row, it takes roughly 4 seconds total to complete (counting only the time spent in the SqlCommand.ExecuteNonQuery() method). However, calling the exact same stored procedure the same number of times from within SQL Server Management Studio takes only about 0.4 seconds. I can't figure out what's different between the two setups that would cause a 10x difference in speed.
I have tried all of the following with no noticeable change in speed:
Creating the stored procedure "WITH RECOMPILE"
Checking all of the SET options configured in SSMS and in C#. The only one that differed was SET ARITHABORT, which was ON in SSMS and OFF when called from the .NET application. Adding SET ARITHABORT ON to the start of the stored procedure made no difference, though. (A sketch of how to compare the session options from both sides follows this list.)
Removing all default values from the sproc parameters
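Here is that sketch: a minimal example of reading the session-level SET options from the .NET connection so they can be compared against the same query run in an SSMS window. The connection string is a placeholder, and sys.dm_exec_sessions is available from SQL Server 2005 onward.
using System;
using System.Data.SqlClient;

class SetOptionCheck
{
    static void Main()
    {
        // Placeholder connection string; substitute your own.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Report the SET options SQL Server sees for this session.
            // Running the same query from an SSMS window lets you diff the two sides.
            const string sql = @"SELECT arithabort, ansi_nulls, ansi_padding,
                                        ansi_warnings, quoted_identifier,
                                        concat_null_yields_null
                                 FROM sys.dm_exec_sessions
                                 WHERE session_id = @@SPID;";

            using (var cmd = new SqlCommand(sql, conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        Console.WriteLine("{0} = {1}", reader.GetName(i), reader.GetValue(i));
                    }
                }
            }
        }
    }
}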
The code used to call the stored procedure from the .NET application is:
using (SqlConnection newConn = new SqlConnection(connectionString))
{
    using (SqlCommand uCmd = new SqlCommand("sproc_name", newConn))
    {
        uCmd.CommandType = CommandType.StoredProcedure;
        uCmd.Connection.Open();

        //About 15 parameters added using:
        uCmd.Parameters.AddWithValue("@ParamName", value);
        ...

        //One output parameter
        SqlParameter paramOUT = new SqlParameter("@OutPutKey", SqlDbType.UniqueIdentifier);
        paramOUT.Direction = ParameterDirection.Output;
        uCmd.Parameters.Add(paramOUT);

        uCmd.ExecuteNonQuery();
        uCmd.Connection.Close();
    }
}
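One note on the snippet above: AddWithValue infers the SqlDbType (and string length) from the .NET value. For a stored procedure call the declared parameter types in the proc ultimately govern, but declaring the types explicitly on the client removes one source of implicit conversion and makes the call match the proc's signature. A sketch only; the parameter names, types, and value variables below are illustrative, since the full parameter list isn't shown:
// Illustrative only: substitute the real parameter names, types, and values.
uCmd.Parameters.Add("@Field1", SqlDbType.UniqueIdentifier).Value = field1Value;
uCmd.Parameters.Add("@Field2", SqlDbType.DateTime).Value = field2Value;
// For string parameters, give an explicit length that matches the proc definition.
uCmd.Parameters.Add("@SomeText", SqlDbType.NVarChar, 50).Value = someTextValue;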
The stored procedure itself consists of a few SET commands (SET ANSI_NULLS ON, SET QUOTED_IDENTIFIER ON, SET ARITHABORT ON), a list of parameters without defaults, and the assignment of the output variable to a new uniqueidentifier that will be inserted as the primary key of the table, followed by the INSERT statement itself.
The application is built on .NET 4 and the database server is MS SQL Server 2005.
Here is an example of the insert stored procedure it's calling:
alter procedure InsertStuff
    @Field1 uniqueidentifier,
    @Field2 datetime,
    ...
    @CreateDate datetime,
    @PrimaryKEY uniqueidentifier OUTPUT
AS
    declare @newCreateDate datetime
    set @newCreateDate = getDate()
    set @PrimaryKEY = NEWID()

    INSERT INTO [dbo].[Table]
    (
        Field1,
        Field2,
        ...
        CreateDate,
        PrimaryKEY
    )
    VALUES
    (
        @Field1,
        @Field2,
        ...
        @newCreateDate,
        @PrimaryKEY
    )
Likely the issue is that every ExecuteNonQuery call incurs its own network round trip, whereas SSMS sends the 30 calls to the server at once as a single batch. I believe by default SSMS submits all 30 statements as one batch, though other settings you may have changed could affect that as well.
Also, make sure you're not opening and closing the connection for each call. While connection pooling may make that a non-issue, I wouldn't leave it to chance.
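A minimal sketch of what that looks like: one connection opened once, one command whose parameter values are updated per row. The parameter names and the rowsToInsert collection are illustrative, since the real schema isn't shown above.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("sproc_name", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@Field1", SqlDbType.UniqueIdentifier);
    cmd.Parameters.Add("@Field2", SqlDbType.DateTime);
    SqlParameter outKey = cmd.Parameters.Add("@OutPutKey", SqlDbType.UniqueIdentifier);
    outKey.Direction = ParameterDirection.Output;

    conn.Open();
    foreach (var row in rowsToInsert)   // rowsToInsert: whatever holds the 30 records
    {
        cmd.Parameters["@Field1"].Value = row.Field1;
        cmd.Parameters["@Field2"].Value = row.Field2;
        cmd.ExecuteNonQuery();

        Guid newKey = (Guid)outKey.Value;   // the output parameter is refreshed on each execution
        // ... use newKey as needed
    }
}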
Related
I have a stored procedure with a couple of parameters. My table type has 2 columns (int, nvarchar).
When I run this stored procedure and pass an IEnumerable<SqlDataRecord> of that type, the query is about 9 times slower on my machine than the same stored procedure without that parameter.
The stored procedure doesn't touch this parameter. It is only passed in.
It looks like something (SQL Server?) is doing work on data passed as a structured (table-valued) value.
Maybe I am missing something. Maybe there is a special switch to:
turn off any kind of validation
anything else?
Type:
CREATE TYPE dbo.MyData AS TABLE
(
[Ver] INT NOT NULL,
[Name] NVARCHAR(225) NOT NULL
)
Stored procedure:
CREATE PROCEDURE [dbo].[SaveData]
    (@Id UNIQUEIDENTIFIER, @Data MyData READONLY)
AS
BEGIN
    SET NOCOUNT ON;
END
UPDATE 1:
I've changed the query. This stored procedure does nothing; the only difference is whether the table-valued parameter is passed or null.
UPDATE 2:
Added stored procedure and type definition.
UPDATE 3:
I'm using SQL Server 2014 Express.
UPDATE 4:
5000 iterations with the table-valued parameter take 11281 ms (443/sec); without it, 1029 ms (4856/sec).
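For reference, since the calling code isn't shown above, here is a rough sketch of how such a table-valued parameter is typically passed from C#. The variable and method names are illustrative, and the SqlMetaData must match dbo.MyData exactly.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

static class TvpExample
{
    // Wrap (int, string) rows as SqlDataRecord instances matching dbo.MyData.
    static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<Tuple<int, string>> rows)
    {
        var meta = new[]
        {
            new SqlMetaData("Ver", SqlDbType.Int),
            new SqlMetaData("Name", SqlDbType.NVarChar, 225)
        };
        foreach (var row in rows)
        {
            var rec = new SqlDataRecord(meta);
            rec.SetInt32(0, row.Item1);
            rec.SetString(1, row.Item2);
            yield return rec;
        }
    }

    static void Save(string connectionString, IEnumerable<Tuple<int, string>> myRows)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.SaveData", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@Id", SqlDbType.UniqueIdentifier).Value = Guid.NewGuid();

            // The parameter must be marked as Structured and given the type name.
            SqlParameter tvp = cmd.Parameters.AddWithValue("@Data", ToRecords(myRows));
            tvp.SqlDbType = SqlDbType.Structured;
            tvp.TypeName = "dbo.MyData";

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}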
I would advise running SQL Profiler to examine your queries on the database. You can also look into SSMS Activity Monitor (right-click in Object Explorer and choose 'Activity Monitor'); you may get lucky and spot a 'recent expensive query'.
hth
I am struggling to understand why a certain stored procedure has stopped working in a few of my databases, but not in others. I am hoping that someone can help me resolve this issue.
Introduction
I inherited an existing C# application that connects to a choice of SQL Server databases depending on the culture parameter supplied to the program. Example: Passing "en-CA" causes the program to connect to the database with English-Canada content. Passing "fr-CA" causes the program to connect to the database with French-Canada content. The databases are derived from a common root database. The databases are essentially identical except for the contents of many of the NVARCHAR fields. (This variety of databases is used solely during development for testing various cultures.)
Both databases use the following collation: SQL_Latin1_General_CP1_CI_AS
Issue
I am not sure when this issue started, but the current situation is that if I call a certain stored procedure from the fr-CA database, it is not executed at all. (I will explain this in more detail below.) No error code is returned to the program; the program simply acts as if no record was found.
However, if I call the same stored procedure from the en-CA database, then it functions as expected and a record is returned to the program.
Attempted Steps
If I run the stored procedure from SSMS, then it executes properly.
I have attempted copying the definition of the stored procedure from the database where it is executing properly to the database where it is not executing properly. This did not resolve the issue.
I did try debugging with SQL Profiler. When I ran the stored procedure against both databases, I saw an entry in the trace for each, and I did not see any errors listed. I will admit that I am a newbie when it comes to using the Profiler.
When I say that the stored procedure is not being executed, I base this on the following test. I created a debug table with a couple of fields:
create table DEBUG
(
Id INTEGER,
Line NVARCHAR(100)
);
At the top of the stored procedure, in both databases, I inserted as the very first line the following statement:
INSERT INTO dbo.DEBUG VALUES (1, 'Top of Atms_Get_Tray_Infos');
When my code calls the stored procedure, I expect to see a line in the DEBUG table.
If I run the program against the en-CA database, I do see the expected line in the DEBUG table.
If I empty the DEBUG table and then run the program against the fr-CA database, the DEBUG table remains empty. This fact leads me to believe that the stored procedure is not being executed.
Database details
Here is the definition of the stored procedure with the debug line:
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[Atms_Get_Tray_Infos]
    @TrayNo AS NVARCHAR(10)
AS
BEGIN
    -- DEBUG
    INSERT INTO dbo.DEBUG VALUES (1, 'Top of Atms_Get_Tray_Infos');

    -- SET NOCOUNT ON added to prevent extra result sets from interfering with SELECT statements.
    SET NOCOUNT ON;

    BEGIN TRY
        SELECT HTRAY.SEQ_HIST_PLATEAU AS TRAYNO,
               HTRAY.DATE_EXPIRATION_DATE AS EXPIRY,
               HTRAY.DATE_UTILISATION_DATE AS DATEUSED,
               HTRAY.LADATE_LAVAGE AS WASHDATE,
               HSTE.SEQ_CODE_QUAL_STERIL AS QLTYCODE,
               HSTE.NO_CHARGE AS CHGNO,
               HSTE.TEST_BIO_BON AS BIOTEST,
               FRML.CODE AS FORMULACODE,
               TRAY.NO_TYPE_PLATEAU AS TRAYCODE,
               TRAY.DESCRIPTION_S,
               TRAY.EstUrgent AS URGENT
        FROM dbo.HIST_PLAT HTRAY
        LEFT JOIN dbo.HIST_CHARG_STE HSTE ON HTRAY.LAST_SEQ_HIST_CHARGE_STERIL = HSTE.SEQ_HIST_CHARGE_STERIL
        INNER JOIN dbo.PLATEAUX TRAY ON TRAY.SEQ_PLATEAU = HTRAY.NO_SEQ_PLATEAU
        INNER JOIN dbo.FORMULE FRML ON HSTE.SEQ_FORMULE = FRML.SEQ_FORMULE
        WHERE HTRAY.SEQ_HIST_PLATEAU = @TrayNo
    END TRY
    BEGIN CATCH
        DECLARE @ErrorMessage NVARCHAR(4000);
        DECLARE @ErrorSeverity INT;
        DECLARE @ErrorState INT;

        SELECT @ErrorMessage = ERROR_MESSAGE(),
               @ErrorSeverity = ERROR_SEVERITY(),
               @ErrorState = ERROR_STATE();

        RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);
    END CATCH
END
I appreciate any bit of assistance that will lead me to a resolution of this issue. Thanks!
Paolo's comment, above, caused me to investigate the actual C# code that calls the stored procedure.
The code is convoluted for the sake of being convoluted, in my opinion.
There is a method in some class that handles all calls to stored procedures. I replaced that code with this basic code:
DataSet dataSet = new DataSet("ReturnDs");
using (var connection = new System.Data.SqlClient.SqlConnection(theConnectStg))
{
    using (var command = new System.Data.SqlClient.SqlCommand(theStoreProcName, connection))
    {
        using (var dataAdapter = new System.Data.SqlClient.SqlDataAdapter(command))
        {
            command.CommandType = CommandType.StoredProcedure;
            if (theParameterList != null)
            {
                foreach (String str1 in theParameterList.ToArray())
                {
                    String parameterName = str1.Substring(0, str1.IndexOf(":"));
                    String str2 = str1.Substring(str1.IndexOf(":") + 1);
                    dataAdapter.SelectCommand.Parameters.Add(new SqlParameter(parameterName, SqlDbType.VarChar, 128));
                    dataAdapter.SelectCommand.Parameters[parameterName].Value = (object)str2;
                }
            }
            dataAdapter.Fill(dataSet);
        }
    }
}
return dataSet;
To satisfy your curiosity, the theParameterList parameter is an array of parameters, each in the form "@variable:value". I'm not a fan, but I am stuck with the existing code for now.
So, why did the previous code fail for certain databases? I still do not know. I am curious, but I do not wish to spend any more time on this issue. My brain is tired.
Thanks for the clue, Paolo!
I am currently trying to stabilize an asp.net 2.0 website.
I am about 95% sure that the main problem in the stability of the system is that the C# code is leaking SQL connections.
The accepted answer on this post describes exactly my problem:
Why is my SqlCommand returning a string when it should be an int?
That being said, I am currently running this SQL statement to pinpoint the possible problem:
SELECT S.spid, login_time, last_batch, status, hostname, program_name, cmd,
(
select text from sys.dm_exec_sql_text(S.sql_handle)
) as last_sql
FROM sys.sysprocesses S
where dbid > 0
and DB_NAME(dbid) = 'db'
and loginame = 'login'
order by last_batch asc
What I find weird is that the login used to connect to the DB from the website keeps returning last_sql as:
CREATE PROCEDURE name
-- Add the parameters for the stored procedure here
...
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
... Procedure code
The question is, why would a create procedure statement be run over and over?
Also, is it a bad practice to have 3-4 (not so active) website connecting to DB using the same connection string?
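For what it's worth, the usual way to rule out leaked connections is to make sure every SqlConnection (and SqlDataReader) is wrapped in a using block, so it is returned to the pool even when an exception is thrown. A minimal sketch, assuming the usual System.Data / System.Data.SqlClient usings; the procedure name and connection string are placeholders:
public static int GetSomeCount(string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.SomeProcedure", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        conn.Open();
        // The connection is closed (returned to the pool) when the using block exits,
        // even if ExecuteScalar throws.
        return (int)cmd.ExecuteScalar();
    }
}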
I am trying to insert > 8000 characters (submitted from a web page) via ExecuteNonQuery (and DatabaseFactory.CreateDatabase() from the Microsoft patterns & practices Enterprise Library). The stored procedure defines the parameter as VARCHAR(MAX). The column is VARCHAR(MAX). In theory, up to 2 GB of data can be passed.
What can I do to pass data > 8000 characters? I set a breakpoint, and the string.Length is indeed > 8K.
public static void UpdateTerms(string terms)
{
    Database db = DatabaseFactory.CreateDatabase();
    db.ExecuteNonQuery("uspUpdateTerms", terms);
}
Stored procedure:
ALTER PROCEDURE [dbo].[uspUpdateTerms]
    @Terms VARCHAR(MAX)
AS
    SET NOCOUNT ON

    INSERT INTO tblTerms(Terms)
    VALUES(@Terms)
Table (just to show that everything is varchar(max)):
CREATE TABLE [dbo].[tblTerms](
[ID] [int] IDENTITY(1,1) NOT NULL,
[Terms] [varchar](max) NULL,
[DateUpdated] [datetime] NULL,
.
Update:
I just changed the code, and this seems to work, though I am not sure what the difference is:
public static void UpdateTerms(string terms)
{
    Database db = DatabaseFactory.CreateDatabase();
    DbCommand cmd = db.GetStoredProcCommand("uspUpdateTerms");
    db.AddInParameter(cmd, "Terms", DbType.String, terms);
    db.ExecuteNonQuery(cmd);
}
The issue may not be the storage of the data, it may be the retrieval.
If you are trying to determine whether or not more than 8000 chars were stored in the DB through enterprise manager, then you are out of luck if you just select the contents of the columns and look at the text length: enterprise manager limits the column output.
To determine how much data is actually stored in the column, execute the following query:
SELECT DATALENGTH(Terms) FROM tblTerms
This will tell you how much text was stored.
EDIT:
Another thought just occurred to me: the Enterprise Library caches stored procedure parameters in order to improve performance. If you tested with the parameter defined as nvarchar(8000) and then switched it to nvarchar(max) without restarting the application (if IIS-hosted, an iisreset or touching web.config), you will still be using the old stored procedure parameter definition.
REPLICATE returns the input type irrespective of later assignment. It's annoying, but to avoid silent truncation, try:
SET @x = REPLICATE(CONVERT(VARCHAR(MAX), 'a'), 10000);
This is because SQL Server performs the REPLICATE operation before it considers what you're assigning it to or how many characters you're trying to expand it to. It only cares about the input expression to determine what it should return, and if the input is not a max type, it assumes it is meant to fit within 8,000 bytes. This is explained in Books Online:
If string_expression is not of type varchar(max) or nvarchar(max), REPLICATE truncates the return value at 8,000 bytes. To return values greater than 8,000 bytes, string_expression must be explicitly cast to the appropriate large-value data type.
Your sample code can be fixed by doing:
declare @x varchar(max)
set @x = replicate(cast('a' as varchar(max)), 10000)
select @x, len(@x)
You haven't shown the code where you are trying to use ExecuteNonQuery. Note that you should use parameters.
using (var con = new SqlConnection(conString))
using (var cmd = new SqlCommand("storedProcedureName", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // Size -1 maps to nvarchar(max).
    cmd.Parameters.Add("@text", SqlDbType.NVarChar, -1);
    cmd.Parameters["@text"].Value = yourVeryLongText;
    con.Open();
    cmd.ExecuteNonQuery();
}
Since yesterday I've been facing a problem: when I call a stored proc from C#, it takes > 5 minutes, but when I execute it directly from SSMS (on the server machine) it takes less than 30 seconds.
I have searched the forums and went through this great article http://www.sommarskog.se/query-plan-mysteries.html but with no result.
The script contained in my proc retrieves 10 columns, among them a column called "article" of type nvarchar(max).
When I remove the article column from my SELECT, my proc executes quickly.
To confirm this, I created a new stored proc retrieving just the primary key column and the nvarchar(max) column.
I'm reproducing the same behaviour. Here is my new proc, MyNewProc (it takes > 5 minutes when called from C# and about 0 seconds on the server from SSMS):
CREATE PROCEDURE Student.GetStudents
AS
BEGIN
SET NOCOUNT ON
-----------------
SELECT StudentId,Article
FROM Students
WHERE Degree=1
END
MyNewProc returns just 2500 rows.
Is that normal? How can I improve it?
SELECT SUM(DATALENGTH(Article)) FROM Students WHERE Degree=1
returns 13885838 (roughly 13 MB).
You're probably transferring a lot of data over the network. That takes time.
Instead of returning article, try returning LEFT(article, 50) to see whether the issue is the volume of data or not.
One thing to note is that SSMS will begin populating the results immediately while a C# application probably will not.
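If you want to measure this from the C# side, a rough sketch is to time the first row separately from draining the whole result set: if the first row arrives quickly but reading everything is slow, the time is going into transferring the nvarchar(max) data rather than into executing the query. The connection string is a placeholder.
static void TimeProc(string connectionString)
{
    var sw = System.Diagnostics.Stopwatch.StartNew();
    using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
    using (var cmd = new System.Data.SqlClient.SqlCommand("Student.GetStudents", conn))
    {
        cmd.CommandType = System.Data.CommandType.StoredProcedure;
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            bool firstRowLogged = false;
            long totalChars = 0;
            while (reader.Read())
            {
                if (!firstRowLogged)
                {
                    System.Console.WriteLine("First row after {0} ms", sw.ElapsedMilliseconds);
                    firstRowLogged = true;
                }
                // Column 1 is Article in Student.GetStudents above.
                totalChars += reader.IsDBNull(1) ? 0 : reader.GetString(1).Length;
            }
            System.Console.WriteLine("All rows read ({0} Article chars) after {1} ms",
                totalChars, sw.ElapsedMilliseconds);
        }
    }
}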
In SSMS, go to the following: Tools -> Options
Then go to Query Execution -> SQL Server -> Advanced
From here, look at which check boxes are checked. If an option is checked here, SSMS applies it automatically when you execute a sproc from inside it, but when you execute the sproc from C# (or whatever client you're using) it won't be applied.
I had this same issue and found out that I needed to include the following line at the top of my sproc and it worked perfectly:
SET ARITHABORT ON;
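If modifying the procedure isn't convenient, the same setting can be applied per connection from the client before calling the proc. A rough sketch, assuming the usual System.Data / System.Data.SqlClient usings; the connection string and procedure name are placeholders:
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // Align this session's setting with what SSMS uses by default.
    using (var set = new SqlCommand("SET ARITHABORT ON;", conn))
    {
        set.ExecuteNonQuery();
    }

    using (var cmd = new SqlCommand("dbo.MyProc", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.ExecuteNonQuery();
    }
}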