I'm trying to figure out how best to organise an inline C# page that makes a number of DB connections and passes values to strings, and I need some advice.
So basically, the CMS in use at my place of work only allows for inline code, so I'm doing a course search page that hooks up to some stored procedures and passes the values to strings.
What would be the best way to handle three different stored procedure calls that output different bits of information to strings? In the old VB version I was passing info to strings and then outputting them as one large string, which probably isn't the best way to handle this.
The code currently goes in this rough format
Stored Procedure 1
Pass x values to string
string = "<p> + xString +</p>"
Stored Procedure 2
Pass y values to string
string = "<p> + yString +</p>"
Is there a smarter way for me to close off sections, as each procedure section usually has a table involved or appends to one larger table? I'm just trying to see what people would suggest as best practice.
Please note I'm really not much of a programmer, just dipping my toes in, so apologies if this is a schoolboy mistake.
There are a number of ways of concatenating strings:
You could use a StringBuilder. This type allows you to Append and Insert text and then output all the text with the ToString() method.
StringBuilders are usually used when adding text to a string from within a loop. (I've read that using a StringBuilder is encouraged once you have over 10 concatenations)
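As a minimal sketch (courseRows here is just a placeholder for whatever your stored procedure returned):
using System.Text;

string[] courseRows = { "Course A", "Course B", "Course C" }; // placeholder data
var sb = new StringBuilder();
foreach (string row in courseRows)
{
    sb.Append("<p>").Append(row).Append("</p>");
}
string html = sb.ToString(); // one string built at the end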
Since you only have to append text to your string 3 times you can just use:
String.Concat or yourString += string.Format("<p>{0}</p>", stringToBeAdded);
In your case the texts may not be related to each other, in which case there are other options available:
// Using an array (fixed size)
string[] yourStrings = new string[3];
yourStrings[0] = "first string";
yourStrings[2] = "last string";
// Using a list (dynamic sized)
var yourStrings = new List<string>();
yourStrings.Add(storedProcResult.ToString());
I would suggest that you try to make the question clearer; however, if I understand correctly, I think it would probably be best to create a stored procedure which runs the other stored procedures and stores the results in temp tables, e.g.
-- each #temp table must first be created to match its procedure's result set
INSERT INTO #MyTable1 EXEC procedure1 @param
INSERT INTO #MyTable2 EXEC procedure2 @param
INSERT INTO #MyTable3 EXEC procedure3 @param
Then in the same stored procedure run a simple select statement that concatenates the strings as desired and returns them to your inline code e.g.
SELECT CONCAT( #MyTable1.emp_name, #MyTable2.emp_middlename, #MyTable3.emp_lastname ) AS Result
I want to write a SQL command to a text file with its parameter values filled in.
The following is my code for replacing each parameter with the appropriate value.
string commandText = commandInfo.CommandText;
if (commandInfo.Parameters.Count > 0)
{
    foreach (SqlParameter parameter in commandInfo.Parameters)
    {
        commandText = commandText.Replace(
            parameter.ParameterName,
            parameter.Value == DBNull.Value
                ? string.Empty
                : "'" + parameter.Value.ToString() + "'");
    }
}
The catch is that although all other parameter values are replaced correctly, those having null values come out blank, i.e. 'parameter1',,'param2', where the gap between the two commas is the null-valued parameter in the final string.
What can be the alternative?
Frankly, replacing the parameters with values is (IMO) the wrong thing to do; what we do in mini-profiler is to spoof declare statements at the top of the output, so that you can copy and paste it into SSMS, without needing to worry about what is a parameter and what was hard-coded in the original TSQL. For example, glancing at the mini-profiler output for this page, I see
DECLARE @id int = 18121022,
        @type tinyint = 10;
(and then lots of tsql that is very specific to us)
You can glance at the mini-profiler code to see how we output this, but basically it just involves walking over the parameters, writing a simple declare. If you are using something like ASP.NET, mini-profiler also avoids the need to write a file (instead making it available live on the site, to your developers).
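For illustration only (this is not mini-profiler's actual code, and the type names it prints are simplified), walking the parameters might look something like this:
using System;
using System.Data.SqlClient;
using System.Text;

static string SpoofDeclares(SqlCommand cmd)
{
    var sb = new StringBuilder();
    foreach (SqlParameter p in cmd.Parameters)
    {
        // NULLs stay NULL; everything else is quoted naively
        string value = p.Value == DBNull.Value
            ? "NULL"
            : "'" + p.Value.ToString().Replace("'", "''") + "'";
        sb.AppendFormat("DECLARE {0} {1} = {2};", p.ParameterName, p.SqlDbType, value);
        sb.AppendLine();
    }
    return sb.ToString() + cmd.CommandText;
}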
This is a "best practice" question. We are having internal discussions on this topic and want to get input from a wider audience.
I need to store my data in a traditional MS SQL Server table with normal columns and rows. I sometimes need to return a DataTable to my web application, and other times I need to return a JSON string.
Currently, I return the table to the middle layer and parse it into a JSON string. This seems to work well for the most part, but does occasionally take a while on large datasets (parsing the data, not returning the table).
I am considering revising the stored procedures to selectively return a DataTable or a JSON string. I would simply add an @isJson bit parameter to the SP.
If the user wanted the string instead of the table the SP would execute a query like this:
DECLARE @result varchar(MAX)
SELECT @result = COALESCE(@result + ',', '') + '{id:"' + colId + '",name:"' + colName + '"}'
FROM MyTable
SELECT @result
This produces something like the following:
{id:"1342",name:"row1"},{id:"3424",name:"row2"}
Of course, the user can also get the table by passing false to the @isJson parameter.
I want to be clear that the data storage isn't affected, nor are any of the existing views and other processes. This is a change to ONLY the results of some stored procedures.
My questions are:
Has anyone tried this in a large application? If so, what was the result?
What issues have you seen/would you expect with this approach?
Is there a better faster way to go from table to JSON in SQL Server other than modifying the stored procedure in this way or parsing the string in the middle tier?
I personally think the best place for this kind of string manipulation is in program code in a fully expressive language that has functions and can be compiled. Doing this in T-SQL is not good. Program code can have fast functions that do proper escaping.
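For instance, a hedged sketch assuming the Json.NET library is available in the middle tier (it ships with a converter for DataTable; myDataTable stands in for whatever your data layer returns):
using System.Data;
using Newtonsoft.Json;

// Json.NET handles escaping, Unicode and nulls for you.
string json = JsonConvert.SerializeObject(myDataTable);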
Let's think about things a bit:
When you deploy new versions of the parts and pieces of your application, where is the best place for this functionality to be?
If you have to restore your database (and all its stored procedures) will that negatively affect anything? If you are deploying a new version of your web front end, will the JSON conversion being tied into the database cause problems?
How will you escape characters properly? Are you sending any dates through? What format will date strings be in and how will they get converted to actual Date objects on the other end (if that is needed)?
How will you unit test it (and with automated tests!) to prove it is working correctly? How will you regression test it?
SQL Server UDFs can be very slow. Are you content to use a slow function, or for speed hack into your SQL code things like Replace(Replace(Replace(Replace(Value, '\', '\\'), '"', '\"'), '''', '\'''), Char(13), '\n')? What about Unicode, \u and \x escaping? How about splitting '</script>' into '<' + '/script>'? (Maybe that doesn't apply, but maybe it does, depending on how you use your JSON.) Is your T-SQL procedure going to do all this, and be reusable for different recordsets, or will you rewrite it each time into each SP that you need to return JSON?
You may only have one SP that needs to return JSON. For now. Some day, you might have more. Then if you find a bug, you have to fix it in two places. Or five. Or more.
It may seem like you are making things more complicated by having the middle layer do the translation, but I promise you it is going to be better in the long run. What if your product scales out and starts going massively parallel—you can always throw more web servers at it cheaply, but you can't so easily fix database server resource saturation! So don't make the DB do more work than it should. It is a data access layer, not a presentation layer. Make it do the minimum amount of work possible. Write code for everything else. You will be glad you did.
Speed Tips for String Handling in a Web Application
Make sure your web string concatenation code doesn't suffer from Schlemiel the Painter's Algorithm. Either directly write to the output buffer as JSON is generated (Response.Write), or use a proper StringBuilder object, or write the parts of the JSON to an array and Join() it later (see the sketch after this list). Don't do plain vanilla concatenation to a longer and longer string over and over.
Dereference objects as little as possible. I don't know your server-side language, but if it happens to be ASP Classic, don't use field names--either get a reference to each field in a variable or at the very least use integer field indexes. Dereferencing a field based on its name inside a loop is (much) worse performance.
Use pre-built libraries. Don't roll your own when you can use a tried and true library. Performance should be equal to or better than your own, and (most importantly) it will be tested and correct.
If you're going to spend the time doing this, make it abstract enough to handle converting any recordset, not just the one you have now.
Use compiled code. You can always get the fastest code when it is compiled, not interpreted. If you identify that the JSON-conversion routines are truly the bottleneck (and you MUST prove this for real, do not guess) then get the code into something that is compiled.
Reduce string lengths. This is not a big one, but if at all possible use one-letter json names instead of many-letter. For a giant recordset this will add up to savings on both ends.
Ensure it is GZipped. This is not so much a server-side improvement, but I couldn't mention JSON performance without being complete.
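As a tiny sketch of the Join approach from the first tip (the anonymous rows here are placeholder data):
using System;
using System.Collections.Generic;

var rows = new[] { new { Id = 1342, Name = "row1" }, new { Id = 3424, Name = "row2" } };
var parts = new List<string>();
foreach (var row in rows)
    parts.Add("{id:\"" + row.Id + "\",name:\"" + row.Name + "\"}");
string json = string.Join(",", parts); // one pass at the end, no quadratic regrowth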
Passing Dates in JSON
What I recommend is to use a separate JSON schema (itself in JSON, defining the structure of the virtual recordset to follow). This schema can be sent as a header to the "recordset" to follow, or it can be already loaded in the page (included in the base javascript files) so it doesn't have to be sent each time. Then, in your JSON parse callback (or post-callback on the final resultant object) look in the schema for the current column and do conversions as necessary. You might consider using ISO format since in ECMAScript 5 strict mode there is supposed to be better date support and your code can be simplified without having to change the data format (and a simple object detect can let you use this code for any browser that supports it):
Date
Dates are now capable of both parsing and outputting ISO-formatted dates.
The Date constructor now attempts to parse the date as if it was ISO-formatted, first, then moves on to the other inputs that it accepts.
Additionally, date objects now have a new .toISOString() method that outputs the date in an ISO format.
var date = new Date("2009-05-21T16:06:05.000Z");
print( date.toISOString() );
// 2009-05-21T16:06:05.000Z
I wouldn't do it the way you are doing it (concatenating).
You can try creating a CLR SQL function that uses JSON.net and returns a varchar.
See here how to create SQL CLR Functions:
http://msdn.microsoft.com/en-us/library/w2kae45k(v=vs.80).aspx
Something like this (untested code)
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

[SqlFunction(DataAccess = DataAccessKind.Read)] // needed to use the context connection
public static SqlString MyFunctionName(int id) {
    // Put your code here (maybe find the object you want to serialize using the id passed?)
    using (var cn = new SqlConnection("context connection=true")) {
        cn.Open();
        // get your data into an object
        var myObject = new { Name = "My Name" };
        return new SqlString(Newtonsoft.Json.JsonConvert.SerializeObject(myObject));
    }
}
I am programming a project in C# where many records are generated and need to be stored in a database. At the moment what I do (which is VERY slow) is store all of these results as a list of structs, then at the end iterate through the list and add all of the records to an SQL query string. The issue with this is that it takes ages to iterate through a list when it contains hundreds of thousands of items, and similar-sized inserts need to be performed several times in the simulation. I've considered skipping the list and appending each record to the string directly as it is generated, and also storing the records in a temporary file and using SQL COPY. I don't really have much experience with dealing with this amount of data, so your feedback will be appreciated.
Thanks in advance
What you should try is populating a file with your data and then using the built-in COPY command. This is the recommended method of populating a database.
http://www.postgresql.org/docs/8.3/interactive/sql-copy.html
When building the CSV temp file, take care to follow the CSV spec. If your column data contains newlines (\n, \r), commas (,) or quotes ("), then escape the quotes (") by doubling them
data=data.Replace("\"", "\"\"");
and surround the data with quotes
data="\""+data+"\"";
Something like
public String CSVEscape(String columnData)
{
    if (columnData.Contains("\n") || columnData.Contains("\r") ||
        columnData.Contains("\"") || columnData.Contains(","))
    {
        return "\"" + columnData.Replace("\"", "\"\"") + "\"";
    }
    return columnData;
}
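Putting the COPY command and the escaping helper together, a hedged sketch using the Npgsql driver (BeginTextImport exists in Npgsql 3+; table, column and record names are placeholders):
using Npgsql;

using (var conn = new NpgsqlConnection(connString))
{
    conn.Open();
    // Stream rows straight to the server; no giant SQL string is ever built.
    using (var writer = conn.BeginTextImport(
        "COPY my_records (col_a, col_b) FROM STDIN (FORMAT CSV)"))
    {
        foreach (var rec in records) // your generated structs
            writer.WriteLine(CSVEscape(rec.ColA) + "," + CSVEscape(rec.ColB));
    }
}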
If I'm reading your question correctly, you're sending the PostgreSQL server a string that looks something like this:
INSERT INTO mytable (x, y, z) VALUES (1,2,3), (4,5,6), ...
What you should do instead is
start a transaction
prepare the statement INSERT INTO mytable (x, y, z) VALUES ($1, $2, $3)
for each struct in your list, execute the prepared statement with the
appropriate fields
commit the transaction
(Sorry, no code, because I don't know C#'s DB APIs.)
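A hedged sketch of those steps, assuming the Npgsql driver (the table, column and struct names are placeholders, not the asker's actual schema):
using System.Collections.Generic;
using Npgsql;
using NpgsqlTypes;

struct Record { public int X, Y, Z; } // stand-in for the asker's struct

static void InsertAll(NpgsqlConnection conn, List<Record> records)
{
    using (var tx = conn.BeginTransaction()) // 1. start a transaction
    using (var cmd = new NpgsqlCommand(
        "INSERT INTO mytable (x, y, z) VALUES (@x, @y, @z)", conn, tx))
    {
        var px = cmd.Parameters.Add("x", NpgsqlDbType.Integer);
        var py = cmd.Parameters.Add("y", NpgsqlDbType.Integer);
        var pz = cmd.Parameters.Add("z", NpgsqlDbType.Integer);
        cmd.Prepare(); // 2. prepare the statement once

        foreach (var r in records) // 3. execute it for each struct
        {
            px.Value = r.X; py.Value = r.Y; pz.Value = r.Z;
            cmd.ExecuteNonQuery();
        }
        tx.Commit(); // 4. commit the transaction
    }
}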
I wouldn't bother figuring out COPY IN unless the approach I described above is still way too slow. I get nervous when inserting data into a database requires any sort of text munging on my part.
If you are seeing low performance from the OOP approach (structs/classes), the first thing to do is measure and optimize the code as much as possible.
If the performance is still not good in your specific context even after optimization, I would leave the OOP approach and move to raw SQL.
One solution could be, as you said in your post, to append the string for every single entity to a big file as it is generated, so that at the end of generation you have the complete huge SQL string. The problem here is the testability of the solution.
But, you know, somewhere you have to pay. You cannot have both comfort and performance at the same time on such a scale.
My program generates a list of values at runtime. I need to send a SQL query that looks in a table for entries that have a column containing one of the values in the list. I can't use the usual chain of ORs because I don't know how big the list will be. Is there a nice way to use an array or some IEnumerable to build a SQL statement that makes a big chain of ORs for me? Using C#, BTW.
I'm using SQL Server but I'd prefer something that works across all databases if such a thing exists.
Thanks!
Read Erland Sommarskog's excellent Arrays and Lists in SQL Server article - he explains in great detail, and with great insight, what can be done in which version of SQL Server.
I also believe this is going to be database-specific; I don't see any approaches that would be ANSI SQL compliant or work on other databases too - string and XML handling is just too specific to each database, I guess.
Try this:
SELECT * FROM table WHERE column IN (@param)
And make sure you turn @param into the form 'p1', 'p2', 'p3' by using your chosen language's implode (join) function.
The query should be standard across all the SQL dialects I know.
Alternatively, if you want to do a LIKE comparison:
SELECT * FROM table WHERE @param LIKE '%' <concat> column <concat> '%'
@param is still a quoted string, but here <concat> stands for your database's string concatenation (SQL Server: '+', Oracle and standard SQL: '||'; MySQL uses the CONCAT() function instead of an operator).
Example PHP implode code:
$ps = array('1', 'test', '323');
$param = implode(',', $ps);
Example C# "implode" code:
string[] ps = new string[] { "test", "blah", "boo" };
string param = string.Join(",", ps);
One thing you can try is table-valued parameters in stored procedures in MS-SQL:
http://www.sqlteam.com/article/sql-server-2008-table-valued-parameters
Not only is it tidy, since all you're going to do is add rows to your table-valued parameter for each value in your list, but it also executes quickly compared to the alternatives, since the SQL DBMS can reuse its execution plan every time you run the query, given that the number of arguments never changes.
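A hedged sketch of the client side (it assumes a user-defined table type such as dbo.IdList, with a single Id column, already exists on the server; table and column names are placeholders):
using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
table.Columns.Add("Id", typeof(long));
foreach (long id in idsToQuery) // your runtime-generated list
    table.Rows.Add(id);

using (var cmd = new SqlCommand(
    "SELECT m.* FROM MyTable m JOIN @ids i ON i.Id = m.SomeId", conn))
{
    var p = cmd.Parameters.AddWithValue("@ids", table);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.IdList"; // the server-side table type
    using (var reader = cmd.ExecuteReader())
    {
        // read matching rows here
    }
}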
I have a list of msgs (it could be 24 or 100+).
I would need to remove many of these messages. I was wondering, what is the best way?
I don't like the idea of building names for 100+ elements and doing something like doRemove = Request["msgId" + i];
I would prefer to receive an array such as long[] and do something like
long[] removeThese = Request["remove"];
foreach ....
//DELETE msgId WHERE userId=@userId
However I don't know if creating an array and POSTing it is possible with HTML (and basic JavaScript?), or if it's the best solution. How should I do this?
I would slap an <asp:CheckBoxList /> on there and, when you need to submit changes, just collect the selected values from its Items and pass them through into your SQL parameter (wherever that is done).
If you are on SQL 2008, you can take advantage of "Table-Valued Parameters" (just google it), where you pass an IEnumerable (pretty much any collection) in the variable and you can just JOIN to it in your UPDATE/DELETE/etc. query.
Previous versions of SQL, the approach I use is very similar except just the string/VARCHAR is passed in to the query and you have to create a table variable or temp table to hold the values inserted from a split procedure. There are many ways to create the split procedure, but I've found the numbers-table approach works in all versions of SQL and has pretty good performance. See http://www.sommarskog.se/arrays-in-sql-2005.html for exhaustive reference of the possible performance implications of each approach.
Just say no to ' IN (' + @MyValuesCSV + ')'
:-)
You could create checkboxes with the same name:
<input type="checkbox" name="mycheckbox" value="hello" />
<input type="checkbox" name="mycheckbox" value="world" />
On the server side, you can use Request("mycheckbox"), and this should give you an array of the selected items.
Alternatively, you could use the ASP.NET CheckBoxList control.
link - http://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.checkboxlist(VS.71).aspx
If you're looking for a "lightweight" solution, you could always have a hidden HTML variable and assign a JavaScript event to each checkbox that adds its index/id to the variable, creating a comma-delimited list of IDs to process.
Then when the submit/confirm occurs, you can have your code-behind page grab this string, convert it into a List/Array/whatever works best for your processing, and finish things up.