I am currently passing an SQL Parameter with one value.
So right now I have:
SqlParameter sqlParameter = new SqlParameter("@Parameter", SqlDbType.VarChar);
sqlParameter.Value = ParameterValue;
and this is working fine in my WHERE clause in my SQL query.
However, now I want to leave open the option of passing multiple values into my WHERE clause.
Instead of passing a regular string, I was thinking of passing a string with commas to separate the values.
So SqlParameter.Value = "value1, value2, value3";
and I want it to act like
WHERE Parameter = value1 OR Parameter = value2 OR Parameter = value3
Is there an easy way to do this where I don't actually have to modify my SQL query?
Bottom line: you're going to have to change either the SQL Statement or Stored Procedure to support what you are trying to do.
There are many different approaches to do what you are trying to accomplish but none are ideal, in my opinion. Erland Sommarskog wrote a great article explaining the many ways to pass in arrays and lists to SQL Server (http://www.sommarskog.se/arrays-in-sql-2005.html) which I recommend reading. A clean approach, if you are using SQL Server 2008 or greater, is using Table Valued Parameters (http://www.sommarskog.se/arrays-in-sql-2008.html). With this approach, you are basically passing an array of values into SQL Server.
If you go the Table Valued Parameters approach, your parameter will behave like a table where you can SELECT values from. So, for instance, you might modify your Stored Procedure (or SQL Statement) like so:
CREATE PROCEDURE get_products @Parameter myCustomParameterType READONLY AS
SELECT p.ProductID, p.ProductName
FROM Products p
WHERE p.ProductID IN (SELECT n FROM @Parameter)
There is another SO question/answer which provides more detail on this approach here: How to pass table value parameters to stored procedure from .net code
More info on Table Valued Parameters can be found here: http://msdn.microsoft.com/en-us/library/bb675163.aspx
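If it helps, here is a rough C# sketch of the calling side for the procedure above. It assumes myCustomParameterType is a table type with a single int column named n (matching the SELECT n above); treat it as a sketch rather than a drop-in implementation.

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

class ProductRepository
{
    // Builds the in-memory table that maps onto the myCustomParameterType table type.
    static DataTable ToIdTable(IEnumerable<int> ids)
    {
        var table = new DataTable();
        table.Columns.Add("n", typeof(int));
        foreach (var id in ids)
            table.Rows.Add(id);
        return table;
    }

    public static void GetProducts(string connectionString, IEnumerable<int> ids)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("get_products", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            var parameter = command.Parameters.AddWithValue("@Parameter", ToIdTable(ids));
            parameter.SqlDbType = SqlDbType.Structured;
            parameter.TypeName = "dbo.myCustomParameterType"; // name of the user-defined table type

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // reader["ProductID"], reader["ProductName"], ...
                }
            }
        }
    }
}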
Not if your query is "where parameter = @parameter".
Either change your query to "where parameter in..."
Or get your values into another table/table variable and join them.
If you want to pass in a comma-separated list of values to check against, you can split the list and insert it into a temporary table or a table variable; then you can use
all the normal table operations such as JOIN, IN, and EXISTS.
Here is a good article on how to take a comma-separated string and turn it into a table:
http://blogs.msdn.com/b/amitjet/archive/2009/12/11/sql-server-comma-separated-string-to-table.aspx
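For completeness, the calling side then only needs to pass a single varchar parameter. Here is a minimal C# sketch; the procedure name get_items_by_ids, the parameter @IdList, and the connection string are placeholders for whatever you call the procedure that does the splitting described in the article.

using System.Data;
using System.Data.SqlClient;

// Placeholder names: get_items_by_ids / @IdList stand in for your own procedure,
// which splits the list into a temp table or table variable as described above.
string connectionString = "...your connection string...";
string[] values = { "value1", "value2", "value3" };

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("get_items_by_ids", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@IdList", SqlDbType.VarChar, 8000).Value = string.Join(",", values);

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // consume the filtered rows
        }
    }
}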
I have a stored procedure like this
create proc usp_ProjectName_DBQuery
    @strDBQuery varchar(8000)
as
begin
    exec (@strDBQuery)
end
So this will accept any DB query and execute it on the server.
Now my problem is that this will return n columns, depending on the query.
For example, if I pass in select x, y from db it will return two columns, but if I pass in select * from db it will return n columns, and it sometimes may not return anything at all.
So how can I define a complex type for this stored procedure? Please help.
@marc_s is right: this is not something you can do in EF (without a third-party library).
Entity Framework doesn't really like that.
You have 2 options:
Use ADO.NET instead, as suggested in the comments (see the sketch after this list)
Change your stored procedure to always return the same number of columns, even if some of them are empty.
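For option 1, here is a rough ADO.NET sketch (the connection string handling is up to you) that calls the procedure from the question and lets a DataTable absorb whatever columns come back:

using System.Data;
using System.Data.SqlClient;

// Sketch: call the dynamic-query procedure through plain ADO.NET; the DataTable
// will take whatever columns the query happens to return.
static DataTable RunDbQuery(string connectionString, string query)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("usp_ProjectName_DBQuery", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add("@strDBQuery", SqlDbType.VarChar, 8000).Value = query;

        var table = new DataTable();
        using (var adapter = new SqlDataAdapter(command))
        {
            adapter.Fill(table); // column count/names depend entirely on the query text
        }
        return table;
    }
}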
I am building a website in ASP.NET 2.0. Some description of the page I am working on:
a ListView displaying a table (of posts) from my Access DB, and a ListBox with multiple-select mode used to filter rows (by forum name, value = forumId).
I am converting the ListBox selected values into a List, then running the following query.
Parameter:
OleDbParameter("@Q", list.ToString());
Procedure:
SELECT * FROM sp_feedbacks WHERE forumId IN ([@Q])
The problem is, well, it doesn't work. Even when I run it from MSACCESS 2007 with the string 1,4, "1","4" or "1,4" I get zero results. The query works when only one forum is selected. (In (1) for instance).
SOLUTION?
So I guess I could use WHERE with many OR's but I would really like to avoid this option.
Another solution is to convert the DataTable into a list and then filter it using LINQ, which seems like a very messy option.
Thanks in advance,
BBLN.
I see 2 problems here:
1) list.ToString() doesn't do what you expect. Try this:
List<int> foo = new List<int>();
foo.Add(1);
foo.Add(4);
string x = foo.ToString();
The value of "x" will be "System.Collections.Generic.List`1[System.Int32]" not "1,4"
To create a comma-separated list, use string.Join().
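For example (string.Join(",", foo) also works directly on .NET 4 and later; on 2.0/3.5 convert to a string array first):

// Produces "1,4" from the list built above.
string x = string.Join(",", foo.ConvertAll(i => i.ToString()).ToArray());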
2) OleDbParameter does not understand arrays or lists. You have to do something else. Let me explain:
Suppose that you successfully use string.Join() to create the parameter. The resulting SQL will be:
SELECT * FROM sp_feedbacks WHERE forumId IN ('1,4')
The OLEDB provider knows that strings must have quotation marks around them. This is to protect you from SQL injection attacks. But you didn't want to pass a string: you wanted to pass either an array, or a literal unchanged value to go into the SQL.
You aren't the first to ask this question, but I'm afraid OLEDB doesn't have a great solution. If it were me, I would discard OLEDB entirely and use dynamic SQL. However, a Google search for "parameterized SQL array" resulted in some very good solutions here on Stack Overflow:
WHERE IN (array of IDs)
Passing an array of parameters to a stored procedure
Good Luck! Post which approach you go with!
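That said, if you do stay with OleDb, one workable pattern is to generate one ? placeholder per selected forum and add the parameters in the same order, since OleDb binds parameters by position. A rough sketch (method and variable names are illustrative, and it assumes at least one id is selected):

using System.Collections.Generic;
using System.Data;
using System.Data.OleDb;
using System.Text;

static DataTable GetFeedbacks(string connectionString, IList<string> forumIds)
{
    // One "?" placeholder per value; OleDb binds parameters by position, not by name.
    var sql = new StringBuilder("SELECT * FROM sp_feedbacks WHERE forumId IN (");
    for (int i = 0; i < forumIds.Count; i++)
        sql.Append(i == 0 ? "?" : ",?");
    sql.Append(")");

    using (var connection = new OleDbConnection(connectionString))
    using (var command = new OleDbCommand(sql.ToString(), connection))
    {
        foreach (var id in forumIds)
            command.Parameters.AddWithValue("?", id); // the name is ignored; order matters

        var table = new DataTable();
        using (var adapter = new OleDbDataAdapter(command))
            adapter.Fill(table);
        return table;
    }
}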
When you have:
col in ('1,4')
This tests that col is equal to the string '1,4'. It is not testing for the values individually.
One way to solve this is using like:
where ',' & @Q & ',' like '*,' & col & ',*'
The idea is to add delimiters to each string. So, a value of "1" becomes ",1," in the column. A value of "1,4" for @Q becomes ",1,4,". Now when you do the comparison, there is no danger that "1" will match "10".
Note (for those who do not know). The wildcard for like is * rather than the SQL standard %. However, this might differ depending on how you are connecting, so use the appropriate wildcard.
Passing such a condition to a query has always been a problem; with a stored procedure it is worse, because you can't even adjust the query to suit. Two options currently:
use a table valued parameter and pass in multiple values that way (a bit of a nuisance to be honest)
write a "split" multi-value function as either a UDF or via SQL/CLR and call that from the query
For the record, "dapper" makes this easy for raw commands (not sprocs) via:
int[] ids = ...
var list = conn.Query<Foo>(
"select * from Foo where Id in #ids",
new { ids } ).ToList();
It figures out how to turn that into parameters etc for you.
Just in case anyone is looking for an SQL Server Solution:
CREATE FUNCTION [dbo].[SplitString]
(
    @Input NVARCHAR(MAX),
    @Character CHAR(1)
)
RETURNS @Output TABLE (
    Item NVARCHAR(1000)
)
AS BEGIN
    DECLARE @StartIndex INT, @EndIndex INT
    SET @StartIndex = 1

    -- make sure the input ends with the delimiter so the last item is captured
    IF RIGHT(@Input, 1) <> @Character
    BEGIN
        SET @Input = @Input + @Character
    END

    WHILE CHARINDEX(@Character, @Input) > 0
    BEGIN
        SET @EndIndex = CHARINDEX(@Character, @Input)

        INSERT INTO @Output (Item)
        SELECT SUBSTRING(@Input, @StartIndex, @EndIndex - 1)

        SET @Input = SUBSTRING(@Input, @EndIndex + 1, LEN(@Input))
    END

    RETURN
END
Given an array of strings, I convert it to a comma-separated string using the following code:
var result = string.Join(",", arr);
Then I could pass the parameter as follows
Command.Parameters.AddWithValue("@Parameter", result);
Then, in the stored procedure definition, I would use the parameter from above as follows:
select * from [dbo].[WhateverTable] where [WhateverColumn] in (select Item from dbo.SplitString(@Parameter, ','))
I am working on implementing a search feature in my MVC3 application. I'm looking to pass two parameters into, and execute, a stored procedure that will look basically something like this:
create procedure MyProc
(
@FirstParam nvarchar(50),
@SecondParam nvarchar(20)
)
as select * from MyTable where @FirstParam like @SecondParam
MyTable has about 30 fields that will be returned for each object and I need to create a procedure like this for several tables, so I am trying to avoid using a SqlDataReader and converting the returned Sql data to C#.
I would like to use something like this method but I am not sure if this can be done with multiple parameters.
Ideally I would like to use EF4, but I have not found any good information on executing stored procedures while using EF4.
Any insight on the most painless way and/or best practice for executing this task will be greatly appreciated.
My suggestion is to use Dynamic LINQ (and here, and here). You can pass valid LINQ expressions as regular strings:
var column = "Name";
var value = "Marvin";
var query = DbCtx.MyEntity.Where(column + " == @0", value);
The benefit (IMO) is that you can keep the search logic in the application and, if you need to do this for many tables, you can create a T4 template to generate the bootstrap code for you.
What you are suggesting can indeed be done through parameters, and you should be using an ORM like EF4 for your data access. Like most ORMs that support stored procedures, it lets you pass multiple parameters to the stored procedure.
The issue you will find, however, is that you can't have dynamic column names in SQL Server (or any other SQL database that I am aware of): you can't supply a column name in a variable.
You will need to use dynamic SQL to achieve this, either within the stored procedure or otherwise.
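If you build the dynamic SQL outside the procedure, a hedged C# sketch of that route (table and column names here are purely illustrative) is to whitelist the column name and keep the search value as a real parameter:

using System;
using System.Data;
using System.Data.SqlClient;

// The column name cannot be a parameter, so validate it against a whitelist and
// splice it into the text; only the search value is passed as a real parameter.
static SqlCommand BuildSearchCommand(SqlConnection connection, string column, string value)
{
    string[] allowedColumns = { "FirstName", "LastName", "City" }; // illustrative whitelist

    if (Array.IndexOf(allowedColumns, column) < 0)
        throw new ArgumentException("Unknown search column: " + column);

    var command = new SqlCommand(
        "SELECT * FROM MyTable WHERE [" + column + "] LIKE @SecondParam", connection);
    command.Parameters.Add("@SecondParam", SqlDbType.NVarChar, 20).Value = value;
    return command;
}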
I have an ArrayList that holds a subset of names found in my database. I need to write a query to get a count of the people in the ArrayList for certain sections, i.e.:
there is a field "City" in my database, and for the people in the ArrayList of names I want to know how many of them live in Chicago, how many live in New York, etc.
Can someone help me set up an SQL statement to handle this? I think I have to pass the subset of names to SQL somehow.
Here is a sample of how I am writing my SQL in my code:
Public Shared Function GetCAData(ByVal employeeName As String) As DataTable
Dim strQuery As String = "SELECT EMPLID, EMPLNME, DISP_TYPE, BEGIN_DTE FROM Corr WHERE (EMPLNME = @name)"
Dim cmd As New SqlCommand(strQuery)
cmd.Parameters.Add("@name", SqlDbType.VarChar)
cmd.Parameters("@name").Value = employeeName
Dim dt As DataTable = GenericDataAccess.GetData(cmd)
Return dt
End Function
I need a way to create a function, using your SQL statement, that returns a DataTable object of names; the parameters would be the city and the list of names.
The above example isn't the SQL I am looking for; it's just a skeleton of what the function I want to create would look like.
You would then use the function by iterating through all the cities, passing in the same set of names each time in the front end.
Your SQL statement will need to look like:
select city, count(*)
from table
where city in ('Chicago', 'New York')
group by city
where the list of cities comes from your ArrayList. You could pass this into a stored procedure as a variable, or you could build the SQL string dynamically within your code.
If you are using SQL Server 2008, you can create a stored proc with a table variable as the input parameter. Then inside the proc you can just join to the input parameter table to get what you want.
You could use table valued parameters, but that's SQL Server 2008 and up only.
If the number of names is limited and you need to support older versions of SQL Server, you can use multiple parameters. That way you're still safe from SQL injection. I'm not entirely sure on what query you want to do, so I'll give an example based on your code:
Public Shared Function GetCAData(ByVal employeeName() As String) As DataTable
Dim sql As StringBuilder = New StringBuilder()
sql.Append("SELECT EMPLID, EMPLNME, DISP_TYPE, BEGIN_DTE FROM Corr WHERE EMPLNME IN (")
Dim cmd As New SqlCommand()
For I As Integer = 0 To employeeName.Length - 1
sql.Append("@name").Append(I).Append(",")
cmd.Parameters.AddWithValue("@name" & I, employeeName(I))
Next
sql.Remove(sql.Length - 1, 1).Append(")")
cmd.CommandText = sql.ToString()
Return GenericDataAccess.GetData(cmd)
End Function
(I'm sorry if my VB looks a little odd, I never use it anymore)
It actually builds a SQL statement dynamically, but the "dynamic" part is just a bunch of generated parameter names, which you then set. The maximum number of allowed parameters is about 2,100.
The best approach would depend on how big the subset of names is.
The very simplest, and probably worst, approach would be to just create some dynamic SQL like this:
"select city, count(*) from table where user in ('name1','name2','name3') GROUP BY city"
where the IN section is generated from the ArrayList, perhaps like the following:
private string CreateSQL(List<string> names)
{
    var sb = new StringBuilder();
    sb.Append("select city, count(*) from table where user in (");
    foreach (var name in names)
    {
        sb.Append("'");
        sb.Append(name);
        sb.Append("',");
    }
    sb.Remove(sb.Length - 1, 1); // remove trailing ,
    sb.Append(") GROUP BY city");
    return sb.ToString();
}
The WHERE IN clause has a 100-item limit, I believe. If your subset is anything like that length, then you really need a better approach, like putting the users in another table and doing a join. In fact, if the subset of users is taken from the same database by another query, then post that and we'll write a single query.
Edit: I actually don't know how to do "where user in" type queries as parameterised commands, so be wary of SQL injection!
One way of doing this is to pass it in as an XML parameter. The XML would be something like
@Cities = '<xml><city>Chicago</city><city>New York</city></xml>'
And this can be selected against as a table. This would have a slightly different behaviour than Macros's in that it will return rows for cities where the population is zero.
SELECT
tempTable.item.value('.', 'varchar(50)') AS City,
COUNT(DISTINCT people) AS [Population]
FROM @Cities.nodes('//city') tempTable(item)
LEFT OUTER JOIN peopleTable
ON tempTable.item.value('.', 'varchar(50)') = peopleTable.City
The above is the first time I have attempted that approach though so I'd be glad to have it critiqued!
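On the calling side, here is a rough C# sketch of building that XML from a list of cities and passing it as a single parameter; the procedure name GetPopulationByCity and the connection string are placeholders for your own:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Xml.Linq;

string connectionString = "...your connection string...";
var cities = new[] { "Chicago", "New York" };

// Builds <xml><city>Chicago</city><city>New York</city></xml>
var doc = new XElement("xml", Array.ConvertAll(cities, c => new XElement("city", c)));

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("GetPopulationByCity", connection)) // hypothetical proc wrapping the query above
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@Cities", SqlDbType.Xml).Value = doc.ToString();

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // reader["City"], reader["Population"]
        }
    }
}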
My habitual approach is to pass it in as a comma-delimited list and use a split function to get it into a table format that can be joined against. An example split function is here.
I have a table; the schema is very simple: an ID column as the unique primary key (uniqueidentifier type) and some other nvarchar columns. My current goal is, for 5000 inputs, to work out which ones are already contained in the table and which are not. The inputs are strings, and I have a C# function which converts a string into a uniqueidentifier (GUID). My logic is: if the ID already exists, then I treat the string as already contained in the table.
My question is: if I need to find out which of the 5000 input strings are already contained in the DB and which are not, what is the most efficient way?
BTW: my current implementation is to convert the string to a GUID using C# code, then invoke a stored procedure which queries whether the ID exists in the database and returns the result back to the C# code.
My working environment: VSTS 2008 + SQL Server 2008 + C# 3.5.
My first instinct would be to pump your 5000 inputs into a single-column temporary table X, possibly index it, and then use:
SELECT X.thecol
FROM X
JOIN ExistingTable ON ExistingTable.thecol = X.thecol
to get the ones that are present, and (if both sets are needed)
SELECT X.thecol
FROM X
LEFT JOIN ExistingTable ON ExistingTable.thecol = X.thecol
WHERE ExistingTable.thecol IS NULL
to get the ones that are absent. Worth benchmarking, at least.
Edit: as requested, here are some good docs & tutorials on temp tables in SQL Server. Bill Graziano has a simple intro covering temp tables, table variables, and global temp tables. Randy Dyess and SQL Master discuss performance issues for and against them (but remember that if you're getting performance problems you do want to benchmark alternatives, not just go on theoretical considerations!-).
MSDN has articles on tempdb (where temp tables are kept) and optimizing its performance.
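For what it's worth, here is a rough C# sketch of that route, using SqlBulkCopy to get the 5000 GUIDs into the temp table quickly. Table and column names follow the SQL above (X, thecol), and ExistingTable stands in for your real table:

using System;
using System.Data;
using System.Data.SqlClient;

static DataTable FindExisting(string connectionString, Guid[] ids)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // Temp table lives for the lifetime of this connection.
        using (var create = new SqlCommand(
            "CREATE TABLE #X (thecol UNIQUEIDENTIFIER PRIMARY KEY)", connection))
        {
            create.ExecuteNonQuery();
        }

        // Bulk-load the 5000 candidate GUIDs.
        var staging = new DataTable();
        staging.Columns.Add("thecol", typeof(Guid));
        foreach (var id in ids)
            staging.Rows.Add(id);

        using (var bulk = new SqlBulkCopy(connection) { DestinationTableName = "#X" })
            bulk.WriteToServer(staging);

        // The "present" query from above; swap in the LEFT JOIN version for the absent ones.
        var result = new DataTable();
        const string query =
            "SELECT X.thecol FROM #X AS X JOIN ExistingTable ON ExistingTable.thecol = X.thecol";
        using (var adapter = new SqlDataAdapter(query, connection))
            adapter.Fill(result);
        return result;
    }
}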
Step 1: make sure you have a problem to solve. Five thousand inserts isn't a lot to do one at a time in many contexts.
Are you certain that the simplest way possible isn't sufficient? What performance issues have you measured so far?
What do you need to do with those entries that do or don't exist in your table?
Depending on what you need, maybe the new MERGE statement in SQL Server 2008 could fit your bill - update what's already there, insert new stuff, all wrapped neatly into a single SQL statement. Check it out!
http://blogs.conchango.com/davidportas/archive/2007/11/14/SQL-Server-2008-MERGE.aspx
http://www.sql-server-performance.com/articles/dba/SQL_Server_2008_MERGE_Statement_p1.aspx
http://blogs.msdn.com/brunoterkaly/archive/2008/11/12/sql-server-2008-merge-capability.aspx
Your statement would look something like this:
MERGE INTO
(your target table) AS t
USING
(your source table, e.g. a temporary table) AS s
ON t.ID = s.ID
WHEN NOT MATCHED THEN -- row does not exist in base table
....(do whatever you need to do)
WHEN MATCHED THEN -- row exists in base table
... (do whatever else you need to do)
;
To make this really fast, I would load the "new" records from e.g. a TXT or CSV file into a temporary table in SQL server using BULK INSERT:
BULK INSERT YourTemporaryTable
FROM 'c:\temp\yourimportfile.csv'
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR =' |\n'
)
BULK INSERT combined with MERGE should give you the best performance you can get on this planet :-)
Marc
PS: here's a note from TechNet on MERGE performance and why it's faster than individual statements:
In SQL Server 2008, you can perform multiple data manipulation language (DML) operations in a single statement by using the MERGE statement. For example, you may need to synchronize two tables by inserting, updating, or deleting rows in one table based on differences found in the other table. Typically, this is done by executing a stored procedure or batch that contains individual INSERT, UPDATE, and DELETE statements. However, this means that the data in both the source and target tables are evaluated and processed multiple times; at least once for each statement.
By using the MERGE statement, you can replace the individual DML statements with a single statement. This can improve query performance because the operations are performed within a single statement, therefore, minimizing the number of times the data in the source and target tables are processed. However, performance gains depend on having correct indexes, joins, and other considerations in place. This topic provides best practice recommendations to help you achieve optimal performance when using the MERGE statement.
Try to ensure you end up running only one query - i.e. if your solution consists of running 5000 queries against the database, that'll probably be the biggest consumer of resources for the operation.
If you can insert the 5000 IDs into a temporary table, you could then write a single query to find the ones that don't exist in the database.
If you want simplicity, since 5000 records is not very many, then from C# just use a loop to generate an insert statement for each of the strings you want to add to the table. Wrap each insert in a TRY...CATCH block and send them all up to the server in one batch, like this:
BEGIN TRY
INSERT INTO table (theCol, field2, field3)
SELECT theGuid, value2, value3
END TRY BEGIN CATCH END CATCH
BEGIN TRY
INSERT INTO table (theCol, field2, field3)
SELECT theGuid, value2, value3
END TRY BEGIN CATCH END CATCH
BEGIN TRY
INSERT INTO table (theCol, field2, field3)
SELECT theGuid, value2, value3
END TRY BEGIN CATCH END CATCH
If you have a unique index or primary key defined on your GUID column, then the duplicate inserts will fail. Checking ahead of time to see whether a record exists just duplicates work that SQL is going to do anyway.
If performance is really important, then consider downloading the 5000 GUIDs to your local workstation and doing all the analysis locally. Reading 5000 GUIDs should take much less than 1 second. This is simpler than bulk importing into a temp table (which is the only way you will get performance from a temp table) and doing an update using a join to the temp table.
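A rough C# sketch of that "check locally" idea, reading the existing IDs once into a HashSet and splitting the candidates in memory (table and column names are illustrative):

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Read the existing IDs once, then test the 5000 candidates against the in-memory set.
static void SplitByExistence(string connectionString, IEnumerable<Guid> candidates,
                             out List<Guid> existing, out List<Guid> missing)
{
    var known = new HashSet<Guid>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT ID FROM dbo.MyTable", connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
            while (reader.Read())
                known.Add(reader.GetGuid(0));
    }

    existing = new List<Guid>();
    missing = new List<Guid>();
    foreach (var id in candidates)
        (known.Contains(id) ? existing : missing).Add(id);
}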
Since you are using SQL Server 2008, you could use table-valued parameters. It's a way to provide a table as a parameter to a stored procedure.
Using ADO.NET you could easily pre-populate a DataTable and pass it as a SqlParameter.
Steps you need to perform:
Create a custom SQL type
CREATE TYPE MyType AS TABLE
(
UniqueId INT NOT NULL,
Column NVARCHAR(255) NOT NULL
)
Create a stored procedure which accepts the Type
CREATE PROCEDURE spInsertMyType
@Data MyType READONLY
AS
xxxx
Call using C#
SqlCommand insertCommand = new SqlCommand("spInsertMyType", connection);
insertCommand.CommandType = CommandType.StoredProcedure;

SqlParameter tvpParam = insertCommand.Parameters.AddWithValue("@Data", dataTable);
tvpParam.SqlDbType = SqlDbType.Structured;
Links: Table-valued parameters in SQL Server 2008
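For completeness, here is a minimal sketch of building the dataTable used above so its columns match MyType, and of making the call end to end; the connection string is a placeholder:

using System.Data;
using System.Data.SqlClient;

string connectionString = "...your connection string...";

// Columns mirror MyType: UniqueId INT, Column NVARCHAR(255).
var dataTable = new DataTable();
dataTable.Columns.Add("UniqueId", typeof(int));
dataTable.Columns.Add("Column", typeof(string));
dataTable.Rows.Add(1, "first value");
dataTable.Rows.Add(2, "second value");

using (var connection = new SqlConnection(connectionString))
using (var insertCommand = new SqlCommand("spInsertMyType", connection))
{
    insertCommand.CommandType = CommandType.StoredProcedure;

    SqlParameter tvpParam = insertCommand.Parameters.AddWithValue("@Data", dataTable);
    tvpParam.SqlDbType = SqlDbType.Structured;
    tvpParam.TypeName = "dbo.MyType"; // the table type created in step 1

    connection.Open();
    insertCommand.ExecuteNonQuery();
}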
Definitely do not do it one-by-one.
My preferred solution is to create a stored procedure with one parameter that takes XML in the following format:
<ROOT>
<MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000000"/>
<MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000001"/>
....
</ROOT>
Then in the procedure, with the argument of type NVARCHAR(MAX), you convert it to XML, after which you use it as a table with a single column (let's call it @FilterTable). The stored procedure looks like:
CREATE PROCEDURE dbo.sp_MultipleParams(@FilterXML NVARCHAR(MAX))
AS BEGIN
    SET NOCOUNT ON

    DECLARE @x XML
    SELECT @x = CONVERT(XML, @FilterXML)

    -- table variable (must have it, because we cannot join directly on the XML)
    DECLARE @FilterTable TABLE (
        "ID" UNIQUEIDENTIFIER
    )

    -- insert into the table variable
    -- important: XML is case-sensitive
    INSERT @FilterTable
    SELECT x.value('@ID', 'UNIQUEIDENTIFIER')
    FROM @x.nodes('/ROOT/MyObject') AS R(x)

    SELECT o.ID,
           SIGN(SUM(CASE WHEN t.ID IS NULL THEN 0 ELSE 1 END)) AS FoundInDB
    FROM @FilterTable o
    LEFT JOIN dbo.MyTable t
        ON o.ID = t.ID
    GROUP BY o.ID
END
GO
You run it as:
EXEC sp_MultipleParams '<ROOT><MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000000"/><MyObject ID="60EAD98F-8A6C-4C22-AF75-000000000002"/></ROOT>'
And your results look like:
ID FoundInDB
------------------------------------ -----------
60EAD98F-8A6C-4C22-AF75-000000000000 1
60EAD98F-8A6C-4C22-AF75-000000000002 0