How should I post/process 100 checkboxes? (HTML + C#/ASP.NET)

I have a list of messages (it could be 24 or 100+).
I need to remove many of these messages, and I was wondering: what is the best way?
I don't like the idea of building names for 100+ elements and doing something like doRemove = Request["msgId" + i];
I'd prefer to receive an array such as long[] and do something like
long[] removeThese = Request["remove"];
foreach ....
//DELETE msgId WHERE userId=@userId
However, I don't know whether creating an array and POSTing it is possible with plain HTML (and basic JavaScript?), or whether it's even the best solution. How should I do this?

I would slap an <asp:CheckBoxList /> on there and, when you need to submit changes, iterate its Items collection, collect the values of the selected items, and pass them through into your SQL parameter (wherever that is done). (Note that .SelectedValue alone only returns the first selected item.)
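A minimal sketch of that iteration (cblMessages is a hypothetical control ID):
// Collect the checked ids from <asp:CheckBoxList ID="cblMessages" runat="server" />.
// Requires System.Collections.Generic and System.Web.UI.WebControls.
var ids = new List<long>();
foreach (ListItem item in cblMessages.Items)
{
    if (item.Selected)
        ids.Add(long.Parse(item.Value));
}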
If you are on SQL 2008, you can take advantage of "Table-Valued Parameters" (just google it), where you pass pretty much any collection (anything enumerable) in as the variable and can just JOIN to it in your UPDATE/DELETE/etc. query.
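For illustration, a rough, untested sketch of the TVP route; the dbo.IdList type, the Messages table, and the connectionString/removeThese/userId names are all assumptions here:
// Delete the selected message ids via a table-valued parameter (SQL 2008+).
// Assumes the type exists: CREATE TYPE dbo.IdList AS TABLE (Id bigint);
// Requires System.Data and System.Data.SqlClient.
var ids = new DataTable();
ids.Columns.Add("Id", typeof(long));
foreach (long msgId in removeThese)
    ids.Rows.Add(msgId);

using (var cn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "DELETE m FROM Messages m JOIN @ids t ON t.Id = m.MsgId WHERE m.UserId = @userId", cn))
{
    var p = cmd.Parameters.AddWithValue("@ids", ids);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.IdList";
    cmd.Parameters.AddWithValue("@userId", userId);
    cn.Open();
    cmd.ExecuteNonQuery();
}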
In previous versions of SQL Server, the approach I use is very similar, except a plain string/VARCHAR is passed in to the query and you have to create a table variable or temp table to hold the values inserted by a split procedure. There are many ways to write the split procedure, but I've found the numbers-table approach works in all versions of SQL Server and has pretty good performance. See http://www.sommarskog.se/arrays-in-sql-2005.html for an exhaustive reference on the performance implications of each approach.
Just say no to ' IN (' + @MyValuesCSV + ')'
:-)

You could create checkboxes that all share the same name:
<input type="checkbox" name="mycheckbox" value="hello" />
<input type="checkbox" name="mycheckbox" value="world" />
On the server side, Request.Form["mycheckbox"] gives you the selected values as a comma-separated string, and Request.Form.GetValues("mycheckbox") gives them as a string[].
Alternatively, you could use the ASP.NET CheckBoxList control.
link - http://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.checkboxlist(VS.71).aspx
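A small sketch of the server side, mapping the posted values back to the long[] the asker wanted (assuming the checkboxes are named "remove" and carry message ids):
// In the code-behind: read every checked value and parse it to a long.
// Requires System.Linq. GetValues returns null when nothing is checked.
string[] raw = Request.Form.GetValues("remove") ?? new string[0];
long[] removeThese = raw.Select(long.Parse).ToArray();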

If you're looking for a "lightweight" solution, you could always have a hidden HTML field and assign a JavaScript event to each checkbox that appends its index/id to the field, building a comma-delimited list of ids to process.
Then, when the submit/confirm occurs, your code-behind can grab this string, convert it into a List/Array/whatever works best for your processing, and finish things up.
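A sketch of that code-behind step; "hdnSelectedIds" is a hypothetical hidden-field name:
// Parse the comma-delimited id list accumulated in the hidden field.
// Requires System, System.Collections.Generic, and System.Linq.
string csv = Request.Form["hdnSelectedIds"] ?? "";
List<long> ids = csv.Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries)
                    .Select(long.Parse)
                    .ToList();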

Related

Building an SQL IN list using a C# variable

I have an SQL where clause:
where table1.resource in @ListOfRes
I'm trying to add the parameter to the SQL command like this:
command.Parameters.AddWithValue("@ListOfRes", "'1','2'");
The result I'm looking for is:
where table1.resource in ('1','2')
But I can't seem to build a string this way; is there a different data type I need to be using?
Nope, not working. This is not how SQL works, regardless of what you do.
IN (@variable) takes the content of the variable as ONE element. There is no way to put multiple elements in it. Live with it. You need one parameter for every element, or another approach (a temp table and a join, etc.).
This simply is not a supported approach.
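A sketch of the one-parameter-per-element route (conn and the surrounding query are assumptions):
// Build "IN (@p0, @p1, ...)" with one parameter per value.
// Requires System.Linq and System.Data.SqlClient.
var values = new[] { "1", "2" };
var names = values.Select((v, i) => "@p" + i).ToArray();
var cmd = new SqlCommand(
    "select * from table1 where table1.resource in (" + string.Join(",", names) + ")",
    conn);
for (int i = 0; i < values.Length; i++)
    cmd.Parameters.AddWithValue(names[i], values[i]);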

ParseObject queries with relative search parameters

I'm rather new to Parse and Cloud Code, and I'm having trouble writing a certain query script.
I have a table of Salespeople, who have two integer fields: dailySold and dailyQuota.
The dailySold is reset to 0 each day, and the dailyQuota is defined by upper management.
Now, I'd like to make queries that pull out bulks of users. Say, all users whose dailySold is below their dailyQuota. In MySQL it would just look like this:
select * from salespeople where dailySold < dailyQuota
But in Parse / Cloud Code I have been unable to find something like this. Currently, I'm loading all the entries and going through them one by one, populating a large array client-side. This feels like absolutely the wrong way to do it.
And the query.WhereNotEqualTo() function (and its siblings) seem only able to compare a key with a static value.
Does anyone know how to put together a query that optimizes this? It needs to go through thousands of records, and often it's only 10-20 results I'm interested in. If nothing else, I'll have to make a Cloud Code function that iterates for me server-side, but I still feel like there should be some function I can use to make a leaner query.
You can't compare two columns in a query. You can only compare a key with a provided object. If the dailyQuota is set by upper management, I'm assuming this is the same for all salespeople, or for groups of people. I'd suggest first making a query for the daily quota and then either use
whereKey:matchesKey:inQuery
or just fetch the dailyQuota first and then use that value in the second query.
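A sketch of the fetch-first approach with the Parse .NET SDK; the Settings class holding the shared quota is an assumption (and this needs an async context):
// Fetch the shared quota once, then query only salespeople under it.
var settings = await ParseObject.GetQuery("Settings").FirstAsync();
int quota = settings.Get<int>("dailyQuota");
var underQuota = await ParseObject.GetQuery("Salesperson")
    .WhereLessThan("dailySold", quota)
    .FindAsync();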

Use SQL to return a JSON string

This is a "best practice" question. We are having internal discussions on this topic and want to get input from a wider audience.
I need to store my data in a traditional MS SQL Server table with normal columns and rows. I sometimes need to return a DataTable to my web application, and other times I need to return a JSON string.
Currently, I return the table to the middle layer and parse it into a JSON string. This seems to work well for the most part, but does occasionally take a while on large datasets (parsing the data, not returning the table).
I am considering revising the stored procedures to selectively return a DataTable or a JSON string. I would simply add an @isJson bit parameter to the SP.
If the user wanted the string instead of the table the SP would execute a query like this:
DECLARE @result varchar(MAX)
SELECT @result = COALESCE(@result + ',', '') + '{id:"' + colId + '",name:"' + colName + '"}'
FROM MyTable
SELECT @result
This produces something like the following:
{id:"1342",name:"row1"},{id:"3424",name:"row2"}
Of course, the user can also get the table by passing false to the @isJson parameter.
I want to be clear that the data storage isn't affected, nor are any of the existing views and other processes. This is a change to ONLY the results of some stored procedures.
My questions are:
Has anyone tried this in a large application? If so, what was the result?
What issues have you seen/would you expect with this approach?
Is there a better, faster way to go from table to JSON in SQL Server, other than modifying the stored procedure in this way or parsing the string in the middle tier?
I personally think the best place for this kind of string manipulation is in program code in a fully expressive language that has functions and can be compiled. Doing this in T-SQL is not good. Program code can have fast functions that do proper escaping.
Let's think about things a bit:
When you deploy new versions of the parts and pieces of your application, where is the best place for this functionality to be?
If you have to restore your database (and all its stored procedures) will that negatively affect anything? If you are deploying a new version of your web front end, will the JSON conversion being tied into the database cause problems?
How will you escape characters properly? Are you sending any dates through? What format will date strings be in and how will they get converted to actual Date objects on the other end (if that is needed)?
How will you unit test it (and with automated tests!) to prove it is working correctly? How will you regression test it?
SQL Server UDFs can be very slow. Are you content to use a slow function, or, for speed, to hack things like Replace(Replace(Replace(Replace(Value, '\', '\\'), '"', '\"'), '''', '\'''), Char(13), '\n') into your SQL code? What about Unicode, \u and \x escaping? How about splitting '</script>' into '<' + '/script>'? (Maybe that doesn't apply, but maybe it does, depending on how you use your JSON.) Is your T-SQL procedure going to do all this and be reusable for different recordsets, or will you rewrite it each time into each SP that needs to return JSON?
You may only have one SP that needs to return JSON. For now. Some day, you might have more. Then if you find a bug, you have to fix it in two places. Or five. Or more.
It may seem like you are making things more complicated by having the middle layer do the translation, but I promise you it is going to be better in the long run. What if your product scales out and starts going massively parallel? You can always throw more web servers at it cheaply, but you can't so easily fix database server resource saturation! So don't make the DB do more work than it should. It is a data access layer, not a presentation layer. Make it do the minimum amount of work possible. Write code for everything else. You will be glad you did.
Speed Tips for String Handling in a Web Application
Make sure your web string-concatenation code doesn't suffer from Schlemiel the Painter's Algorithm. Either write directly to the output buffer as the JSON is generated (Response.Write), use a proper StringBuilder object, or write the parts of the JSON to an array and Join() it later. Don't do plain vanilla concatenation to a longer and longer string over and over (see the sketch after this list).
Dereference objects as little as possible. I don't know your server-side language, but if it happens to be ASP Classic, don't use field names: either get a reference to each field in a variable or, at the very least, use integer field indexes. Dereferencing a field by its name inside a loop performs (much) worse.
Use pre-built libraries. Don't roll your own when you can use a tried and true library. Performance should be equal to or better than your own and, most importantly, it will be tested and correct.
If you're going to spend the time doing this, make it abstract enough to handle converting any recordset, not just the one you have now.
Use compiled code. You can always get the fastest code when it is compiled, not interpreted. If you identify that the JSON-conversion routines are truly the bottleneck (and you MUST prove this for real, do not guess) then get the code into something that is compiled.
Reduce string lengths. This is not a big one, but if at all possible use one-letter json names instead of many-letter. For a giant recordset this will add up to savings on both ends.
Ensure it is GZipped. This is not so much a server-side improvement, but I couldn't mention JSON performance without being complete.
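As promised above, a small sketch of the StringBuilder variant; rows and Escape() are assumed stand-ins for your recordset and escaping routine:
// O(n) JSON assembly; naive += re-copies the whole string on every pass.
var sb = new StringBuilder();
sb.Append('[');
for (int i = 0; i < rows.Count; i++)
{
    if (i > 0) sb.Append(',');
    sb.Append("{\"id\":\"").Append(rows[i].Id)
      .Append("\",\"name\":\"").Append(Escape(rows[i].Name)).Append("\"}");
}
sb.Append(']');
string json = sb.ToString();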
Passing Dates in JSON
What I recommend is to use a separate JSON schema (itself in JSON, defining the structure of the virtual recordset to follow). This schema can be sent as a header to the "recordset" that follows, or it can already be loaded in the page (included in the base JavaScript files) so it doesn't have to be sent each time. Then, in your JSON parse callback (or a post-callback on the final resultant object), look up the current column in the schema and do conversions as necessary. You might consider using ISO format, since ECMAScript 5 is supposed to bring better date support, so your code can be simplified without having to change the data format (and a simple object detect lets you use this code in any browser that supports it):
Date
Dates are now capable of both parsing and outputting ISO-formatted dates.
The Date constructor now attempts to parse the date as if it was ISO-formatted, first, then moves on to the other inputs that it accepts.
Additionally, date objects now have a new .toISOString() method that outputs the date in an ISO format.
var date = new Date("2009-05-21T16:06:05.000Z");
print( date.toISOString() );
// 2009-05-21T16:06:05.000Z
I wouldn't do it the way you are doing it (concatenating).
You can try creating a CLR SQL function that uses JSON.net and returns a varchar.
See here how to create SQL CLR Functions:
http://msdn.microsoft.com/en-us/library/w2kae45k(v=vs.80).aspx
Something like this (untested code):
// Read access is required to use the context connection.
// Requires System.Data.SqlClient and System.Data.SqlTypes.
[Microsoft.SqlServer.Server.SqlFunction(DataAccess = DataAccessKind.Read)]
public static SqlString MyFunctionName(int id)
{
    using (var cn = new SqlConnection("context connection=true"))
    {
        // Get your data into an object here (maybe find the object you
        // want to serialize using the id passed in?).
        var myObject = new { Name = "My Name" };
        return new SqlString(Newtonsoft.Json.JsonConvert.SerializeObject(myObject));
    }
}

Using C# to generate and insert 100,000s of records into a Postgres database

I am programming a project in C# where many records are generated and need to be stored in a database. At the moment what I do (which is VERY slow) is store all of these results as a list of structs, then at the end iterate through the list and add all of the records to an SQL query string. The issue is that iterating through a list takes ages when it contains 100,000s of items, and similar-sized inserts need to be performed several times in the simulation. I've considered building the string from the start, rather than storing the records in a list first, and also storing them in a temporary file and using SQL COPY. I don't really have much experience dealing with this amount of data, so your feedback will be appreciated.
Thanks in advance
What you should try is populating a file with your data and then using the built-in COPY command. This is the recommended method of populating a database.
http://www.postgresql.org/docs/8.3/interactive/sql-copy.html
When building the CSV temp file, take care to follow the CSV spec: if your column data contains newlines (\n, \r), commas (,) or quotes ("), then escape the quotes (") with quotes
data=data.Replace("\"", "\"\"");
and surround the data with quotes
data="\""+data+"\"";
Something like:
public String CSVEscape(String columnData)
{
    // Quote the field only when it contains a character that needs quoting.
    if (columnData.Contains("\n") || columnData.Contains("\r") ||
        columnData.Contains("\"") || columnData.Contains(","))
    {
        return "\"" + columnData.Replace("\"", "\"\"") + "\"";
    }
    return columnData;
}
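For completeness, a hedged sketch of streaming rows straight into COPY with Npgsql; BeginTextImport is an assumption (it exists in newer driver versions), and records/rec.X/Y/Z are stand-ins:
// Stream CSV rows into COPY ... FROM STDIN, reusing CSVEscape above.
using (var writer = conn.BeginTextImport(
    "COPY mytable (x, y, z) FROM STDIN (FORMAT csv)"))
{
    foreach (var rec in records)
        writer.WriteLine(string.Join(",",
            CSVEscape(rec.X), CSVEscape(rec.Y), CSVEscape(rec.Z)));
}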
If I'm reading your question correctly, you're sending the PostgreSQL server a string that looks something like this:
INSERT INTO mytable (x, y, z) VALUES (1,2,3), (4,5,6), ...
What you should do instead is
start a transaction
prepare the statement INSERT INTO mytable (x, y, z) VALUES ($1, $2, $3)
for each struct in your list, execute the prepared statement with the
appropriate fields
commit the transaction
(Sorry, no code, because I don't know C#'s DB APIs.)
I wouldn't bother figuring out COPY IN unless the approach I described above is still way too slow. I get nervous when inserting data into a database requires any sort of text munging on my part.
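For what it's worth, a sketch of those steps with the Npgsql driver; the driver choice, table, and column types are assumptions, since the answer names no API:
// One prepared INSERT executed per record, all inside a single transaction.
using (var conn = new NpgsqlConnection(connString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    using (var cmd = new NpgsqlCommand(
        "INSERT INTO mytable (x, y, z) VALUES (@x, @y, @z)", conn, tx))
    {
        cmd.Parameters.Add(new NpgsqlParameter("x", NpgsqlTypes.NpgsqlDbType.Integer));
        cmd.Parameters.Add(new NpgsqlParameter("y", NpgsqlTypes.NpgsqlDbType.Integer));
        cmd.Parameters.Add(new NpgsqlParameter("z", NpgsqlTypes.NpgsqlDbType.Integer));
        cmd.Prepare();
        foreach (var rec in records)
        {
            cmd.Parameters["x"].Value = rec.X;
            cmd.Parameters["y"].Value = rec.Y;
            cmd.Parameters["z"].Value = rec.Z;
            cmd.ExecuteNonQuery(); // one round-trip per record, but no re-planning
        }
        tx.Commit();
    }
}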
If you see low performance with the OOP approach (structs/classes), the first thing to do is measure and optimize the code as much as possible.
If the performance is still not good in your specific context after optimization, I would leave the OOP approach and move to raw SQL.
One solution could be, as you said in your post, to append the string for every single entity to a big file as it is generated, so at the end of generation you have the complete huge SQL string. The problem here is the testability of the solution.
But, you know, somewhere you have to "pay". You cannot have both comfort and performance at that scale.

Fixing SQL injection forms in a big asp.net C# web application

I have to fix a project that is vulnerable to SQL injection.
None of the forms on any page of the project use parameterized queries; they simply concatenate query strings.
For example, on the search page, looking at the code-behind I see there is a CreateQuery() method that builds the query from the text fields, for example:
string sQuery = "";
sQuery += "b.name like '%" + txtName.Text + "%'";
Then in btnSearch_Click() I have the method that runs the query:
query = CreateQuery();
var totalList = GetAllBlaBla(query);
My question is:
Since I have hundreds of forms and thousands of text fields and values to FIX, is there a "quick" solution to implement, like:
1. A global function that parameterizes the query or handles the situation in some way?
2. Since in every class the query is executed in the SubmitButton_Click() code-behind method, can I handle the situation there, of course in every class?
3. Should I modify every form and every entry in the form code-behind to parameterize the SQL string? That is going to take one million years.
4. (Edit) What about encoding/decoding the input values, so that the example above becomes:
string sQuery = "";
var txt = HttpUtility.HtmlEncode(txtName.Text);
sQuery += "b.name like '%" + txt + "%'";
Is this a possible temporary patch?
5. (Edit) Is this a possible solution, or does it simply not change anything?
cmd.Parameters.Add("@txtNameParameter", SqlDbType.VarChar);
cmd.Parameters["@txtNameParameter"].Value = txtName.Text;
sQuery += "b.name like '%" + (string)cmd.Parameters["@txtNameParameter"].Value + "%'";
The problem is that I have to return a string, because the logic that handles the query is defined in another business class that takes a string as the query; I cannot give it a CommandType or SqlDataAdapter...
Suggestions?
Thanks in advance.
You already know there is a problem; IMO, any "quick" fix here is likely to reduce the attack surface, but is not likely to prevent determined abuse; simply put, blacklisting is truly hard, and there are some really bizarre inputs readily available on black-hat (and, as samples, on white-hat) sites. These are not always easily recognizable as abusive. It isn't all ' drop table Customers -- ;p
WHATEVER you do, I would advise doing it properly: parameters. Tools like dapper might reduce the code you need, though:
sQuery += "b.name like '%'+@text+'%'";
...
conn.Execute(sQuery, new { text = txtName.Text });
(which is easier than handling all the parameters etc. manually)
Modify every query + validate every input.
It takes time, but think about the guy who will maintain and add features to that "big web application"
(it may be you).
<customErrors mode="On"/>
This will prevent users from seeing detailed errors, so a potential attacker gets very few clues about how to exploit this security hole from the error messages he/she sees.
Add Elmah to log errors.
Rewrite every query to use parameters or use an ORM (see the sketch below).
Any JavaScript-based solution is useless, since a "hacker" certainly knows how to disable it.
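As referenced above, a sketch of the search snippet from the question rewritten with a plain SqlCommand parameter; the table name, column size, and conn are assumptions:
// The LIKE filter, parameterized: user input never touches the SQL text.
using (var cmd = new SqlCommand(
    "select b.* from books b where b.name like '%' + @name + '%'", conn))
{
    cmd.Parameters.Add("@name", SqlDbType.VarChar, 200).Value = txtName.Text;
    // ...execute as before
}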
The Plain Way
Estimate the number of replacements by searching the project for string sQuery = ". Multiply it by the time you plan to spend fixing a single query (e.g. one fix per 5 minutes, plus a coffee break every 10 fixes).
Add time for testing the whole website.
Then tell management the fix is going to be huge, give them the estimate and just do it.
Surely you'll have headaches for a couple of days but at least you'll have the thing working.
The Creative Way
If you literally mean hundreds and thousands of such forms with nearly identical code (e.g. query always created in CreateQuery, executed in SubmitButton_Click), I would consider learning to use Visual Studio regular expression syntax and crafting a couple of very accurate search & replace patterns.
This saved me hours of work in one project, but you'll need to be really precise with regexps and make sure you understand what you're doing.
Another option, again, when you're sure it is worth it, is to write a tool that will rewrite C# sources.
If all you need is a simple transform like the one Marc mentioned, it could take a couple hours of work.
But you can fail miserably here so it's a risky route.
Reduce the permissions of the database account that your application uses to access data. Hopefully it doesn't have sysadmin. Remove permissions to drop tables. If the account is used only for data retrieval, remove all permissions to update data. You might even consider setting up views, locking them down, and using them instead of direct table access.
Turn on ASP.NET Request Validation, described here. This automatically checks all traffic for malicious character sequences.
If possible, consider adding an event handler to Global for OnBeginRequest that inspects the incoming data and performs white-list checks on all inputs. Not sure how well this maps to your problem, but the nice thing is you only have to do it in one place and it affects the whole site.
Ensure that all of your pages call Page.Validate to ensure that the client-side validation is also enforced on the server side.
Begin the long, hard work of adding field-specific white-list validation to every control, and figure out a long-term plan to move to parameterized database calls.
Find all textbox controls on each page during page load and disable special key-press handling.
