Strange C# to SQL Dynamic Query Execution

All, I have a dynamic SQL query that I am executing from a C# application. The problem query is an INSERT statement, run from within a C# loop and executed sequentially against many databases to create a single data warehouse database. I have run this code on 100+ databases in a single batch without problem; however, I have just come across one specific database where the query
DECLARE @DbName NVARCHAR(128);
SET @DbName = (SELECT TOP 1 [DbName]
FROM [IPACostAdmin]..[TmpSpecialOptions]);
DECLARE @FilterSql NVARCHAR(MAX);
SET @FilterSql = (SELECT TOP 1 [AdditionalSQL]
FROM [IPACostAdmin]..[TmpSpecialOptions]);
DECLARE @SQL NVARCHAR(MAX);
DECLARE @SQL1 NVARCHAR(MAX);
DECLARE @SQL2 NVARCHAR(MAX);
SET @SQL1 =
'INSERT INTO [' + @DbName + ']..[Episode] WITH(TABLOCK)
([EstabID],..., [InclFlag]) ';
SET @SQL2 =
'SELECT
[EstabID],..., [InclFlag]
FROM [B1A] ' + @FilterSql + ';';
SET @SQL = @SQL1 + @SQL2;
EXEC sp_executesql @SQL;
goes from taking roughly three seconds for an insert of 20,000-30,000 records to 40+ minutes! Now, after long deliberation and experimentation, I have worked out the fix; it is to use
EXEC sp_executesql @SQL WITH RECOMPILE;
This brings it back down to < 2s for the insert.
This SQL is executed from the application once for each database in the batch. The current execution of this statement should be totally separate from the preceding ones as far as the server is concerned (as I understand it), but it is not; it seems SQL Server is caching the dynamic SQL in this case.
I would like to know what is happening here for this single site? Where will I need to ensure I use the RECOMPILE option in future to prevent such issues?
Thanks for your time.
_Note. I appreciate that this recompiles the query, but I am baffled as to why the server is reusing the same execution plan in the first place. Each time this query is run, it is against a different database, using a different Initial Catalog and a different SqlConnection.

When you use RECOMPILE, SQL Server generates a new execution plan each time and executes it. Otherwise, it tries to reuse an existing execution plan from the plan cache, which may be wrong for the current query: with dynamic SQL, the conditions and parameters change on each execution.
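If recompiling the whole batch with RECOMPILE feels heavy-handed, the hint can instead be scoped to the one statement by appending OPTION (RECOMPILE) to the dynamic SQL text before executing it. A minimal C# sketch, assuming a hypothetical connection string and a simplified column list (the table names come from the question):

```csharp
// Sketch only: appends a statement-level OPTION (RECOMPILE) hint so this
// INSERT...SELECT always gets a plan compiled for the current database,
// instead of reusing a cached plan from a previous iteration of the loop.
using System.Data.SqlClient;

class WarehouseLoader
{
    public static void CopyEpisodes(string connectionString, string dbName, string filterSql)
    {
        string sql =
            "INSERT INTO [" + dbName + "]..[Episode] WITH(TABLOCK) ([EstabID], [InclFlag]) " +
            "SELECT [EstabID], [InclFlag] FROM [B1A] " + filterSql + " " +
            "OPTION (RECOMPILE);";  // per-statement recompile hint

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```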

Related

Getting a timeout error from a stored procedure in C#

I have this stored procedure to retrieve data from a database (dynamic query). I am calling this stored procedure from C# code, passing two parameters to this stored procedure:
ALTER PROCEDURE [dbo].[GetCompleteCPTDetails]
@Practice_Short_Name varchar(50),
@Uploaded_Date nvarchar(max)
AS
BEGIN
DECLARE @CPTtablename nvarchar(300)
DECLARE @vQuery NVARCHAR(max)
DECLARE @upldate nvarchar(100)
SET @upldate = @Uploaded_Date
SET @CPTtablename = 'ACER_CLAIMS_MASTER_DETAIL_Hist_' + @Practice_Short_Name
SET @vQuery = 'SELECT Practice_Short_Name, Service_Date_From, Carrier_Name,
Location_Description, Patient_Number, Patient_First_Name,
Patient_Last_Name, Voucher_Number, Procedure_Code, Service_Fees,
Service_Payments, Service_Adjustments, Acer_Status, Acer_Allowed_Amount
FROM ' + @CPTtablename + '
WHERE Uploaded_Date = ''' + @upldate + '''
ORDER BY acer_status ASC, Service_Date_From DESC, Patient_First_Name ASC'
EXEC (@vQuery)
END
But when I run this query from C#, I get a timeout error. If I assign values to the parameters inside the stored procedure and run it from a query window, it shows the correct data.
Can anyone explain why I get a timeout error when calling it from C#?
That is a pretty simple WHERE and ORDER BY.
Unless that is just a massive table with no indexes, it should be fast.
Is there an index on Uploaded_Date, and is it fragmented?
An index covering the sort columns would also help.
Are you loading everything into a DataTable?
If so, try loading into a DataReader.
Try a TOP 1 and remove the ORDER BY.
If that does not return, then you have a connection issue, as there is no way that query should time out.
The other thing to try is WITH (NOLOCK) to see if it is a locking problem.
Why is @Uploaded_Date nvarchar(max)?
Is that a date or not?
There can be many solutions to this problem, as the problem area can differ in each case.
But most commonly:
Check and increase the SqlCommand timeout in your application.
Try calling this SP asynchronously.
Also, I would like to know: is your application on the same machine where the DB resides?
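On the first point, note that the timeout that matters here is SqlCommand.CommandTimeout (default 30 seconds), not the connection string's Connect Timeout, which only governs opening the connection. A C# sketch; the connection string, timeout value, and variable names are placeholder assumptions:

```csharp
using System.Data;
using System.Data.SqlClient;

class CptReportRunner
{
    public static DataTable GetCompleteCptDetails(
        string connectionString, string practiceShortName, string uploadedDate)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetCompleteCPTDetails", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 120;  // seconds; the ADO.NET default is 30
            cmd.Parameters.AddWithValue("@Practice_Short_Name", practiceShortName);
            cmd.Parameters.AddWithValue("@Uploaded_Date", uploadedDate);

            // Fill a DataTable; for very large results a DataReader that
            // streams rows would put less pressure on memory.
            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(table);
            }
            return table;
        }
    }
}
```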

Dynamic SQL INSERT Takes 30x Longer than it Should

We have built a C# .NET system that can be used to create data warehouses. This system takes selected databases and runs a script against them to create a combined database/warehouse.
Now, I have three databases to be compiled into a single database, and I am copying two tables from each (table [XI] and table [XII], which have a one-to-many relationship but have no constraints set up at the time of the copy/INSERT INTO). The timings for the script run and the relevant sizes for each table are below:
The executed script consists of 30 SQL queries.
DatabaseA:
Table [XI] 29,026 Rows (size 20,128Kb).
Table [XII] 531,958 Rows (size 50,168Kb).
Time taken for entire script: 1.51s.
DatabaseB:
Table [XI] 117,877 Rows (size 17,000Kb).
Table [XII] 4,000,443 Rows (size 512,824Kb).
Time taken for entire script: 2.04s.
These both run fine and fast. The next is almost exactly the same size as the first but takes 40x as long!
DatabaseC:
Table [XI] 29,543 Rows (size 20,880Kb).
Table [XII] 538,302 Rows (size 68,000Kb).
Time taken for entire script: 44.38s.
I cannot work out why this is taking so long. I have used SQL Server Profiler and Performance Monitor, but I cannot nail down the reason for this massive change in performance.
The query being used to do the insert is dynamic and is shown at the bottom of this question; it is large due to the explicit reference to the required columns. My question is: what could be causing this inordinate increase in execution time?
Any clues would be greatly appreciated.
SQL:
DECLARE @DbName NVARCHAR(128);
SET @DbName = (SELECT TOP 1 [DbName]
FROM [IPACostAdmin]..[TmpSpecialOptions]);
DECLARE @FilterSql NVARCHAR(MAX);
SET @FilterSql = (SELECT TOP 1 [AdditionalSQL]
FROM [IPACostAdmin]..[TmpSpecialOptions]);
DECLARE @SQL NVARCHAR(MAX);
DECLARE @SQL1 NVARCHAR(MAX);
DECLARE @SQL2 NVARCHAR(MAX);
SET @SQL1 =
'INSERT INTO [' + @DbName + ']..[Episode]
([Fields1], ..., [FieldN])';
SET @SQL2 =
'SELECT
[Fields1], ..., [FieldN]
FROM [B1A] ' + @FilterSql + ';';
SET @SQL = @SQL1 + @SQL2;
EXEC(@SQL);
GO
Note: I am splitting the dynamic SQL into @SQL1 and @SQL2 for clarity. Also note that I have not shown all columns, due to space and the fact that it would largely be redundant.
Edit1.
1. The databases are on the same server.
2. The database files, including logs are in the same directory on the same drive.
3. There are no primary/foreign keys or constraints set up on the source databases (DatabaseA/B/C) or the data warehouse database at the time of this INSERT INTO.
Edit2. I have run the above query in Management Studio and it took 5s!?
Edit3. I have added a temporary CLUSTERED INDEX in the hope that this would assist the query; this has not helped either.
Some information would be good to know:
1: Are the databases on the same server?
2: Are the db file and the log file on the same drive in the case of A and C?
(Once I had a problem with two databases where one was on an SSD drive and the other on an HDD. That caused a problem reading the data.)
3: What do the DB statistics say about fragmentation? (The tables have no constraints, but are indexes defined?)
This was caused by a DELETE query being run before a preceding CREATE CLUSTERED INDEX operation had finished updating the entire table. The solution was to use the BEGIN TRANSACTION and COMMIT keywords. This forces SQL Server to finish the indexing before attempting any other operations.
Note that this problem is only likely to arise when following a CREATE CLUSTERED INDEX query with a dynamic SQL statement that modifies the existing table.
I hope this helps someone else.
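As a sketch of the fix described above (the table, index, and column names are placeholders, not the real schema), the index build and the follow-up modification can be sent as one explicit transaction:

```csharp
using System.Data.SqlClient;

class IndexThenModify
{
    public static void Run(SqlConnection openConnection)
    {
        // Placeholder object names. The point is that the DELETE is not
        // attempted until the CREATE CLUSTERED INDEX has completed and
        // both commit as a single unit of work.
        const string sql = @"
BEGIN TRANSACTION;
CREATE CLUSTERED INDEX [IX_B1A_EstabID] ON [B1A] ([EstabID]);
DELETE FROM [B1A] WHERE [InclFlag] = 0;
COMMIT TRANSACTION;";

        using (var cmd = new SqlCommand(sql, openConnection))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
```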

Creating a dynamic Stored Procedure on MSSQL SERVER 2008 R2

I have a system that fetches its data from different DBs on the same server. These DBs are newly attached to the server annually; e.g., at the beginning of 2013, a DB called 2012 is attached.
So I want to create a stored procedure (SP) that takes the user's input, which can be any year from 2005 onward. Based on the year the user enters, the SP should go to that DB (whose name will be the year the user entered) and search for the data inside it; the DB will also have a table with the same name as the DB (i.e., the table name is also the year).
Hope this makes sense
It would be a good idea to parameterize the query.
e.g.
CREATE PROC usp_bar
(
@ID INT
)
AS
BEGIN
DECLARE @SQL NVARCHAR(100)
DECLARE @Params NVARCHAR(100)
SET @SQL = N'SELECT * FROM [Table] WHERE ID = @ID'
SET @Params = N'@ID INT'
EXEC sp_executesql @SQL, @Params, @ID = 5
END
Check out this
I am no DBA so take this with a grain of salt and understand there may be a better way to do this but what you will probably have to do is something like this:
CREATE PROCEDURE usp_foo
@Year varchar(4)
AS
BEGIN
DECLARE @sql nvarchar(255)
SET @sql = N'SELECT * FROM [' + @Year + '].[owner].[table]'
EXEC sp_executesql @sql
END
Of course the user calling this sproc will have to A) have permissions to call system sprocs B) have permission to access the yearly database
Additionally, instead of going the SQL route, you could just make a dynamic connection string that you could populate with the correct catalog then issue your SQL queries directly to the database. Personally I would prefer that over using dynamic SQL.
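That application-side switching can be sketched with SqlConnectionStringBuilder; the server name and security settings below are placeholder assumptions:

```csharp
using System.Data.SqlClient;

class YearDbFactory
{
    public static SqlConnection OpenForYear(string year)
    {
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "MyServer",      // placeholder server name
            InitialCatalog = year,        // e.g. "2012": the per-year database
            IntegratedSecurity = true     // placeholder auth choice
        };

        var conn = new SqlConnection(builder.ConnectionString);
        conn.Open();
        return conn;
    }
}
```

Because the yearly table shares its name with the database, the table reference itself still has to be built dynamically (or aliased via a synonym), but the database switch no longer needs dynamic SQL.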
One thing you could do, is take a look at synonyms, you could create one for each year:
http://sommarskog.se/dynamic_sql.html#Dyn_DB
CREATE SYNONYM otherdbtbl FOR otherdb.dbo.tbl
I'd recommend his site, it's full of good stuff, it's a great read :)
As to whether this works well for you, I'd expect it depends on how many tables you have in each DB. If it's a few, this may work; if it's hundreds, another approach may work better, like abszero's suggestion of doing the switching in the application tier.

Handling paging and "lazy-loading" with DataSets?

My company uses raw, untyped DataSets filled via Stored Procedures exclusively. I have been tasked with finding a way to retrieve very large result sets (paging) and ways to get Lazy Loading functionality (at least I think this is lazy loading; I'm still learning that stuff to be honest) so we aren't pulling back tens of thousands of rows in one batch and hogging server resources.
I personally am not that familiar with DataSets as I avoid them whenever possible, and I would rather get rid of them entirely here, but saying "Change everything to use LINQ/EF" isn't going to be a valid answer since there's no business value to management (and it would take too long to redo things, so the idea would be shot down immediately).
Are there some resources I can look into to get this same kind of functionality but using standard untyped DataSets?
EDIT: Also, I need a solution that can work with dynamically created SQL that does not use a stored procedure.
All you need to do is to modify your stored procedure to page the result set. This of course will also mean that you'll have to pass as parameters certain criteria such as page number etc. Assuming you're using SQL Server 05 or newer, take a look at the following:
http://www.codeproject.com/KB/database/PagingResults.aspx
You'll need to implement paging inside your stored procedures. I assume you're using Sql Server, so here's a link:
http://www.davidhayden.com/blog/dave/archive/2005/12/30/2652.aspx
Note that this has nothing to do with DataSets per se. Presumably, your code generates a DataSet from a stored procedure call. If you rewrite your procs to do paging, your code will then generate a DataSet that contains only the requested page's records.
You could use the DataSet returned by your original proc to implement paging, by caching the DataSet and returning only selected rows to the client (or more accurately, using only selected rows of the DataSet to generate the client HTML), but this is a super-duper, really bad idea.
I had the same problem with an ASP.NET 2.0 website; there is no "lazy-loading" solution to this. In order to paginate the data sets, I use two sprocs that wrap paging functionality around every select I do.
CREATE PROCEDURE [dbo].[Generic_Counting]
@tables VARCHAR(MAX),
@filter VARCHAR(MAX) = '1=1'
AS
BEGIN
SET NOCOUNT ON;
DECLARE @strQuery VARCHAR(8000)
SET @strQuery = ' SELECT COUNT(*) FROM '+ @tables +'
WHERE '+ @filter
EXECUTE (@strQuery)
IF @@ERROR<>0
BEGIN
--error on generic count
SET NOCOUNT OFF
RETURN 10067
END
SET NOCOUNT OFF
RETURN 0
END
GO
CREATE PROCEDURE [dbo].[Generic_Paging]
@tables VARCHAR(1000),
@pk VARCHAR(100),
@pageNumber INT = 1,
@pageSize INT = 10,
@fields VARCHAR(MAX) = '*',
@filter VARCHAR(MAX) = '1=1',
@orderBy VARCHAR(MAX) = NULL
AS
BEGIN
SET NOCOUNT ON;
DECLARE @strQuery VARCHAR(8000)
DECLARE @strMinRecord VARCHAR(12);
DECLARE @strMaxRecord VARCHAR(12);
SET @strMinRecord = CONVERT(VARCHAR(12), ((@pageNumber - 1) * @pageSize + 1))
SET @strMaxRecord = CONVERT(VARCHAR(12), (@pageNumber * @pageSize))
-- Use the ROW_NUMBER function
SET @strQuery ='
WITH Generic_CTE AS
(
SELECT ''RowNumber'' = ROW_NUMBER() OVER(ORDER BY ' +
ISNULL(@orderBy, @pk) + '),' +
@fields +
' FROM ' + @tables +
' WHERE ('+ @filter +')
)
SELECT ' + @fields + '
FROM Generic_CTE
WHERE RowNumber BETWEEN ' + @strMinRecord +' AND '+ @strMaxRecord
--print @strQuery
EXECUTE (@strQuery)
IF @@ERROR<>0
BEGIN
--error on generic paging
SET NOCOUNT OFF
RETURN 10066
END
SET NOCOUNT OFF
RETURN 0
END
GO
You could take a look at the Value List Handler pattern, designed to be used where "the client requires a list of items ... for presentation. The number of items in the list is unknown and can be quite large in many instances."
The examples (in the link above and here) are for Java but should translate to asp.net fairly readily.

How do I get Linq to SQL to recognize the result set of a dynamic Stored Procedure?

I'm using Linq-to-SQL with a SQL Server backend (of course) as an ORM for a project. I need to get the result set from a stored procedure that returns its results from a dynamically created table. Here's what the proc looks like:
CREATE procedure [RetailAdmin].[TitleSearch] (
@isbn varchar(50), @author varchar(50),
@title varchar(50))
as
declare @L_isbn varchar(50)
declare @l_author varchar(50)
declare @l_title varchar(50)
declare @sql nvarchar(4000)
set @L_isbn = rtrim(ltrim(@isbn))
set @l_author = rtrim(ltrim(@author))
set @l_title = rtrim(ltrim(@title))
CREATE TABLE #mytemp(
[storeid] int not NULL,
[Author] [varchar](100) NULL,
[Title] [varchar](400) NULL,
[ISBN] [varchar](50) NULL,
[Imprint] [varchar](255) NULL,
[Edition] [varchar](255) NULL,
[Copyright] [varchar](100) NULL,
[stockonhand] [int] NULL
)
set @sql = 'select a.storeid, Author,Title, thirteendigitisbn ISBN,
Imprint,Edition,Copyright ,b.stockonhand from ods.items a join ods.inventory b on
a.itemkey = b.itemkey where b.stockonhand <> 0 '
if len(@l_author) > 0
set @sql = @sql + ' and author like ''%'+@l_author+'%'''
if len(@l_title) > 0
set @sql = @sql + ' and title like ''%'+@l_title+'%'''
if len(@L_isbn) > 0
set @sql = @sql + ' and thirteendigitisbn like ''%'+@L_isbn+'%'''
print @sql
if len(@l_author) <> 0 or len(@l_title) <> 0 or len(@L_isbn) <> 0
begin
insert into #mytemp
EXECUTE sp_executesql @sql
end
select * from #mytemp
drop table #mytemp
I didn't write this procedure, but may be able to influence a change if there's a really serious problem.
My present problem is that when I add this procedure to my model, the designer generates this function:
[Function(Name="RetailAdmin.TitleSearch")]
public int TitleSearch([Parameter(DbType="VarChar(50)")] string isbn,
[Parameter(DbType="VarChar(50)")] string author,
[Parameter(DbType="VarChar(50)")] string title)
{
IExecuteResult result = this.ExecuteMethodCall(this,
((MethodInfo)(MethodInfo.GetCurrentMethod())), isbn, author, title);
return ((int)(result.ReturnValue));
}
which doesn't look anything like the result set I get when I run the proc manually.
Can anybody tell me what's going wrong here?
This is basically the same problem as this question but due to the poor phrasing from the OP it was never really answered.
Thanks Marc for your reply. I will see about making the changes you suggested.
The problem was the temp table. Linq to Sql just doesn't know what to do with them. This was particularly difficult to diagnose, because Visual Studio caches information about stored procs, so when it initially failed to find a result set it set the return as a default integer type and didn't update when I made changes to the stored proc. Getting VS to recognize a change requires you to:
Delete proc from the dbml
delete the server connection from Server Explorer
save the dbml to force a recompile
close the project and restart VS
recreate the server connection and import the proc
You might not have to do every one of those steps, but that's what worked for me. What you need to do, if you must use a temp table, is to create a barebones proc that simply returns the correct schema, and then alter it to do what you want after you've imported it into the OR Designer.
First - IMPORTANT - your SQL is vulnerable to injection; the inner command should be parameterized:
if len(@l_author) > 0
set @sql = @sql + ' and author like ''%''+@author+''%'''
EXECUTE sp_executesql @sql, N'@author varchar(100)', @l_author
This passes the value of @l_author in as the @author parameter in the dynamic command - preventing injection attacks.
Second - you don't really need the temp table. It isn't doing anything for you... you just INSERT and SELECT. Perhaps just EXEC and let the results flow to the caller naturally?
In other circumstances a table-variable would be more appropriate, but this doesn't work with INSERT/EXEC.
Are the columns the same for every call? If so, either write the dbml manually, or use a temp SP (just with "WHERE 1=0" or something) so that SET FMTONLY ON can work.
If not (different columns per usage), then there isn't an easy answer. Perhaps use regular ADO.NET in this case (ExecuteReader/IDataReader - and perhaps even DataTable.Fill).
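The plain ADO.NET fallback suggested above can be sketched as follows; the connection string is a placeholder, and the proc name matches the question:

```csharp
using System.Data;
using System.Data.SqlClient;

class TitleSearchClient
{
    public static DataTable Search(
        string connectionString, string isbn, string author, string title)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("RetailAdmin.TitleSearch", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@isbn", isbn);
            cmd.Parameters.AddWithValue("@author", author);
            cmd.Parameters.AddWithValue("@title", title);

            // DataTable.Fill infers the columns at execution time, so the
            // dynamically shaped result set is not a problem here.
            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(table);
            }
            return table;
        }
    }
}
```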
Of course, you could let LINQ take the strain... (C#):
...
if(!string.IsNullOrEmpty(author)) {
query = query.Where(row => row.Author.Contains(author));
}
...
etc
There's no real easy way to do this. I've had the same problem in the past. I think the issue is that Linq to Sql has no way of "figuring out" which type will be returned since you're building up the SELECT statement at execution time. What I did to get around this, was in the stored proc, I did just a select and selected all the columns that I possibly needed. Then, I had Linq to Sql generate the function based on that. Then, I went back to SQL and changed the stored proc back to the way it's supposed to be. The trick here is not to regenerate your DBML.

Categories

Resources