What is the best way to evaluate a SQL statement in C# to determine whether it does more than just select? I.e., check whether values would be changed (insert, update, delete, modify, or drop) if the statement were later executed.
Any ideas as far as out-of-the-box C# DLLs/functions I can use, or is this something I should code myself using string-parsing techniques?
Thanks.
I would use db permissions. Create a database user with read-only access; then any query that does anything other than SELECT will fail.
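For example (a rough sketch; the login name and connection string are made up for illustration, and statementToCheck is the SQL text you want to test), you can execute the statement under that restricted login and treat a permission error as "this does more than select":
// "report_reader" is a hypothetical SQL login that is a member of db_datareader
// (SELECT only) and has no write permissions.
using (var conn = new SqlConnection("Server=.;Database=MyDb;User Id=report_reader;Password=...;"))
using (var cmd = new SqlCommand(statementToCheck, conn))
{
    conn.Open();
    try
    {
        using (var reader = cmd.ExecuteReader())
        {
            // a plain SELECT succeeds and returns a result set
        }
    }
    catch (SqlException)
    {
        // a permission error here means the statement tried to modify data
        // (or touch something the read-only login may not)
    }
}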
One method would be to use a transaction and then roll it back. This has obvious limitations: it may not work on complex batches (for example, ones returning multiple result sets with updates in between), and it will not work on statements that cannot be rolled back, such as some DBCC commands, so you may want to catch exceptions for those cases as well:
using (SqlConnection sqlConn = new SqlConnection(CONNECTION_STRING))
{
    sqlConn.Open();
    using (SqlTransaction trans = sqlConn.BeginTransaction())
    {
        // execute the statement inside the transaction and see if any rows are affected
        var query = " ... ";
        var cmd = new SqlCommand(query, sqlConn, trans); // the command must be enlisted in the transaction
        var rowsAffected = cmd.ExecuteNonQuery();
        if (rowsAffected > 0) { ... }
        // roll back the transaction so nothing is persisted
        trans.Rollback();
    }
}
Related
I'm using MySQL to try and add a new user to my database. A user has an Id, a First Name, a Last Name, and a Date of Birth. But when I run the code below (and call conn.Close() after I'm done), the database tells me (using HeidiSQL) that in the Table Overview there is now a new row in the table, but when I open the Data tab to look at the rows, there is nothing. It's empty. Running a COUNT(*) also returns 0.
using (MySqlTransaction transaction = conn.BeginTransaction())
{
using (MySqlCommand cmd = conn.CreateCommand())
{
cmd.CommandText = "INSERT INTO USERS(NAME_FIRST,NAME_LAST,DATE_OF_BIRTH) VALUES(@nameFirst,@nameLast,@dateOfBirth)";
cmd.Transaction = transaction;
cmd.Parameters.AddWithValue("@nameFirst", user.NameFirst);
cmd.Parameters.AddWithValue("@nameLast", user.NameLast);
cmd.Parameters.AddWithValue("@dateOfBirth", user.DateOfBirth);
cmd.Prepare();
cmd.ExecuteNonQuery();
lastInsertId = (uint)cmd.LastInsertedId;
}
}
I get no errors. Nothing shows up in any log and everyone sees the same as me.
What am I doing wrong?
It sounds like it's the use of BeginTransaction, which starts a transaction; this means autocommit=false for the entirety of the transaction.
After ExecuteNonQuery, do a transaction.Commit(); and see if the rows show up.
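A minimal sketch of the fix, based on the code from the question:
using (MySqlTransaction transaction = conn.BeginTransaction())
{
    using (MySqlCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "INSERT INTO USERS(NAME_FIRST,NAME_LAST,DATE_OF_BIRTH) VALUES(@nameFirst,@nameLast,@dateOfBirth)";
        cmd.Transaction = transaction;
        cmd.Parameters.AddWithValue("@nameFirst", user.NameFirst);
        cmd.Parameters.AddWithValue("@nameLast", user.NameLast);
        cmd.Parameters.AddWithValue("@dateOfBirth", user.DateOfBirth);
        cmd.ExecuteNonQuery();
        lastInsertId = (uint)cmd.LastInsertedId;
    }
    // commit, otherwise the INSERT is rolled back when the transaction is disposed
    transaction.Commit();
}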
More Info Here
I am trying to retrieve a list of records from one table and write them to another table. I've used a simple query to retrieve the values into a SqlDataReader, then loaded them into a DataTable. Using a DataTableReader, I am going through the entire data set saved in the DataTable. The problem is that, while reading each record, I am trying to insert those values into another table using a stored procedure, but it only inserts the first row of values; from the second row onward it throws an exception saying "procedure or function has too many arguments specified".
string ConStr = ConfigurationManager.ConnectionStrings["ConString"].ConnectionString;
SqlConnection NewCon = new SqlConnection(ConStr);
NewCon.Open();
SqlCommand NewCmd3 = NewCon.CreateCommand();
NewCmd3.CommandType = CommandType.Text;
NewCmd3.CommandText ="select * from dbo.Request_List where group_no ='" +group_no+ "'";
NewCon.Close();
NewCon.Open();
SqlDataReader dr = (SqlDataReader)NewCmd3.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr);
DataTableReader reader = new DataTableReader(dt);
NewCmd.Dispose();
NewCon.Close();
NewCon.Open();
SqlCommand NewCmdGrpReqSer = NewCon.CreateCommand();
NewCmdGrpReqSer.CommandType = CommandType.StoredProcedure;
NewCmdGrpReqSer.CommandText = "Voucher_Request_Connection";
if (reader.HasRows)
{
int request_no = 0;
while (reader.Read())
{
request_no = (int)reader["request_no"];
NewCmdGrpReqSer.Parameters.Add("@serial_no", serial_no);
NewCmdGrpReqSer.Parameters.Add("@request_no", request_no);
try
{
NewCmdGrpReqSer.ExecuteNonQuery();
MessageBox.Show("Connection Updated");//just to check the status.tempory
}
catch (Exception xcep)
{
MessageBox.Show(xcep.Message);
}
MessageBox.Show(request_no.ToString());//
}
NewCmdGrpReqSer.Dispose();
NewCon.Close();
}
Any solutions?
As @Sparky suggests, the problem is that you keep adding parameters to the insertion command. There are several other ways in which the code could be improved, however. These improvements would remove the need to clear the parameters and would help make sure you don't leave disposable resources undisposed.
First, use the using statement for your disposable objects. This removes the need for the explicit Close (by the way, only one of Close/Dispose is needed for the connection; I believe Dispose calls Close).
Second, simply create a new command for each insertion. This avoids complex logic around resetting the parameters and, possibly, handling error states for the command.
Third, check the result of the insertion to make sure it succeeded.
Fourth, explicitly catch SqlException - you don't want to accidentally hide unexpected errors in your code. If you need to make sure no exception bubbles up, consider using multiple exception handlers and "doing the right thing" for each case - say, logging with different error levels or categories, aborting the entire operation rather than just this insert, etc.
Lastly, I would use better variable names. In particular, avoid appending numeric identifiers to generic variable names; it makes the code harder to understand, both for others and for yourself after you've let the code sit for a while.
Here's my version. Note there are several other things I might do, such as making the string literals into appropriately named constants, introducing a strongly-typed wrapper around the ConfigurationManager object to make testing easier, and removing the underscores from the variable names in favor of camelCase. Those are more stylistic in nature, but you might want to consider them as well.
var connectionString = ConfigurationManager.ConnectionStrings["ConString"].ConnectionString;
using (var newConnection = new SqlConnection(connectionString))
{
    newConnection.Open();
    using (var selectCommand = newConnection.CreateCommand())
    {
        selectCommand.CommandType = CommandType.Text;
        selectCommand.CommandText = "select request_no from dbo.Request_List where group_no = @groupNumber";
        selectCommand.Parameters.AddWithValue("@groupNumber", group_no);
        // note: running the insert command while this reader is open requires
        // MARS (MultipleActiveResultSets=True) in the connection string
        using (var reader = selectCommand.ExecuteReader())
        {
            while (reader.HasRows && reader.Read())
            {
                using (var insertCommand = newConnection.CreateCommand())
                {
                    insertCommand.CommandType = CommandType.StoredProcedure;
                    insertCommand.CommandText = "Voucher_Request_Connection";
                    var request_no = (int)reader["request_no"];
                    insertCommand.Parameters.AddWithValue("@serial_no", serial_no);
                    insertCommand.Parameters.AddWithValue("@request_no", request_no);
                    try
                    {
                        if (insertCommand.ExecuteNonQuery() == 1)
                        {
                            MessageBox.Show("Connection Updated"); // just to check the status, temporary
                        }
                        else
                        {
                            MessageBox.Show("Connection was not updated " + request_no);
                        }
                    }
                    catch (SqlException xcep)
                    {
                        MessageBox.Show(xcep.Message);
                    }
                    MessageBox.Show(request_no.ToString());
                }
            }
        }
    }
}
Try clearing your parameters each time...
while (reader.Read())
{
request_no = (int)reader["request_no"];
// Add this line
NewCmdGrpReqSer.Parameters.Clear();
NewCmdGrpReqSer.Parameters.Add("@serial_no", serial_no);
NewCmdGrpReqSer.Parameters.Add("@request_no", request_no);
try
{
I have a question regarding performance. This is my scenario.
I have a MySQL database and an application that, from time to time, moves records matching the criteria of a query from one table to another. The way this is done is:
foreach(object obj in list)
{
string id = obj.ToString().Split(',')[0].Trim();
string query = " insert into old_records select * from testes where id='" +
id + "';" + " delete from testes where id='" + id +"'";
DB _db = new DB();
_db.DBConnect(query);
}
This is the way I connect to the database:
DataTable _dt = new DataTable();
MySqlConnection _conn = new MySqlConnection(connectionString);
MySqlCommand _cmd = new MySqlCommand
{
Connection = _conn,
CommandText = query
};
MySqlDataAdapter _da = new MySqlDataAdapter(_cmd);
MySqlCommandBuilder _cb = new MySqlCommandBuilder(_da);
_dt.Clear();
try
{
_conn.Open();
_cmd.ExecuteNonQuery();
_da.Fill(_dt);
}
catch (MySqlException ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (_conn != null) _conn.Close();
}
return _dt;
So my question is: I have about 4000 rows in the table, and it takes a lot of time to move all the records from one table to another, especially across a network. Is there a way to make this faster?
I have been doing some reading and there are several options for handling data from the DB, like data adapters, readers, sets, and tables. Which one is fastest for this case? Should I be using a different method?
Two things I see: first, you're opening and closing your connection for each insert, and that's usually your most expensive operation, so you won't want to do that. You can also try batching the moves rather than doing them one at a time. When you do that you have to be careful, because things could break in the middle of a large update, so you'll want to do the work in a transaction. Without knowing too much about what your data structure looks like, I refactored your method to batch 100 at a time. First, create a little helper method called MoveItems that takes a connection and a list of ids. Don't put a try/catch in this; you'll see why later.
Note: this method doesn't use parameters; I highly recommend you change it so that it does.
private static void MoveItems(MySqlConnection conn, List<string> moveList)
{
string query = string.Format("insert into old_records select * from testes where id IN({0});" + " delete from testes where id IN({0})", string.Join(",", moveList.ToArray()));
var cmd = new MySqlCommand
{
Connection = conn,
CommandText = query
};
cmd.ExecuteNonQuery();
}
Next, change your main method to open the database connection once and then call this method 100 IDs at a time. This method has a try/catch, so if the call to MoveItems throws an exception it will be caught in this main method.
// the using statement will call your dispose method
using (var conn = new MySqlConnection(connectionString))
{
// open the connection and start the transaction
conn.Open();
var transaction = conn.BeginTransaction();
// create a list to temporarily store the ids
List<string> moves = new List<string>();
try
{
// clean the list, do the trim and get everything that's not null or empty
var cleanList = list.Select(obj => obj.ToString().Split(',')[0].Trim()).Where(s => !string.IsNullOrEmpty(s));
// loop over the clean list
foreach (string id in cleanList)
{
// add the id to the move list
moves.Add("'" + id + "'");
// batch 100 at a time
if (moves.Count % 100 == 0)
{
// when I reach 100 execute them and clear the list out
MoveItems(conn, moves);
moves.Clear();
}
}
// The list count might not be n (mod 100) therefore see if there's anything left
if (moves.Count > 0)
{
MoveItems(conn, moves);
moves.Clear();
}
// wohoo! commit the transaction
transaction.Commit();
}
catch (MySqlException ex)
{
// oops! something happened roll back everything
transaction.Rollback();
Console.WriteLine(ex.Message);
}
finally
{
conn.Close();
}
}
You may have to play with that 100 number. I remember when I worked with MySQL a lot, I saw some performance differences between using IN and giving it a list of OR conditions (id = 'ID1' OR id = 'ID2' ...). But executing 40 or 80 statements will certainly perform better than 4000, and opening the database connection once instead of 4000 times should also give you much better performance.
I might be wrong, but there is not much you can do to make it faster. After all, you want to take the entire table's data and insert it into another table, and that process will take some time if your table isn't small. However, you can try the code below. It should do the trick and save some time.
INSERT INTO TABLE2 (FIELDNAME_IN_TABLE2, FIELDNAME2_IN_TABLE2)
SELECT FIELDNAME_IN_TABLE1, FIELDNAME2_IN_TABLE1
FROM TABLE1
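If you still need to drive this from the C# side, the whole move can then be a single command instead of one per row (a sketch; it assumes you want to move every row, as above, and reuses the testes/old_records tables and the MySqlConnection setup from the question):
using (var conn = new MySqlConnection(connectionString))
using (var cmd = new MySqlCommand(
    "INSERT INTO old_records SELECT * FROM testes; DELETE FROM testes;", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery(); // both statements run in one round trip
}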
Updated Question: Is there a way to force a DataAdapter to accept only commands that do not include any UPDATE/DROP/CREATE/DELETE/INSERT statements, other than verifying the Command.Text myself before handing it to the DataAdapter (and throwing an exception otherwise)? Is there any such built-in functionality provided by .NET in the DataReader, DataAdapter, or anything else?
Note: a DataReader returns results, but it also accepts an update query and returns a result. (I might be overlooking some mistake, but I run my update command just before executing the reader and then show a message after it succeeds, and all of that works fine.)
Could you search the string for some keywords, like CREATE, UPDATE, INSERT, or DROP, or check whether the query does not start with SELECT? Or is that too flimsy?
You might also want to create a login for this that the application uses which only has read capability. I don't know if the object has such a property, but you can make the server refuse the transaction.
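A very rough sketch of that keyword check (the class name, method name, and keyword list are illustrative only; note that this is easy to defeat with comments, string literals, EXECUTE, batches, and so on, so the read-only login is still the safer option):
using System.Linq;
using System.Text.RegularExpressions;

static class SqlTextGuard
{
    static readonly string[] WriteKeywords =
        { "INSERT", "UPDATE", "DELETE", "DROP", "ALTER", "CREATE", "TRUNCATE", "MERGE", "EXEC" };

    // naive check: the text must start with SELECT and contain no write keywords as whole words
    public static bool LooksLikeSelectOnly(string sql)
    {
        var upper = sql.Trim().ToUpperInvariant();
        if (!upper.StartsWith("SELECT")) return false;
        return !WriteKeywords.Any(k => Regex.IsMatch(upper, @"\b" + k + @"\b"));
    }
}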
All you need to do is ensure there are no INSERT, UPDATE, or DELETE statements prepared for the DataAdapter. Your code could look something like this:
var dataAdapter = new SqlDataAdapter("SELECT * FROM table", "connection string");
OR
var dataAdapter = new SqlDataAdapter("SELECT * FROM table", sqlConnectionObject);
And bam, you have a read-only data adapter.
If you just wanted a DataTable then the following method is short and reduces complexity:
public DataTable GetDataForSql(string sql, string connectionString)
{
using(SqlConnection connection = new SqlConnection(connectionString))
{
using(SqlCommand command = new SqlCommand())
{
command.CommandType = CommandType.Text;
command.Connection = connection;
command.CommandText = sql;
connection.Open();
using(SqlDataReader reader = command.ExecuteReader())
{
DataTable data = new DataTable();
data.Load(reader);
return data;
}
}
}
}
usage:
try{
DataTable results = GetDataForSql("SELECT * FROM Table;", ApplicationSettings["ConnectionString"]);
}
catch(Exception e)
{
//Logging
//Alert to user that command failed.
}
There isn't really a need to use the DataAdapter here - it's not really meant for what you want. Why go to the bother of catching exceptions, etc., when UPDATE, DELETE, or INSERT commands are used? It's not a great fit for what you want to do.
It's important to note that the SelectCommand property doesn't do anything special - when the SelectCommand is executed, it will still run whatever command is passed to it - it just expects a resultset to be returned and if no results are returned then it returns an empty dataset.
This means that (and you should do this anyway) you should explicitly grant only SELECT permissions to the tables you want people to be able to query.
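For example (a sketch; SomeTable and the connection string are placeholders, not from the question), a data adapter whose "select" text also contains a DELETE will happily run the DELETE when you call Fill:
var adapter = new SqlDataAdapter(
    "DELETE FROM SomeTable WHERE Id = 1; SELECT * FROM SomeTable;", connectionString);
var table = new DataTable();
adapter.Fill(table); // the DELETE executes too; Fill just loads whatever result set comes back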
EDIT
To answer your other question, SqlDataReaders are read-only because they work via a read-only, forward-only ("firehose") cursor. What this effectively means is:
while(reader.Read()) //Reads a row at a time moving forward through the resultset (`cursor`)
{
//Allowed
string name = reader.GetString(reader.GetOrdinal("name"));
//Not Allowed - the read only bit means you can't update the results as you move through them
reader.GetString(reader.GetOrdinal("name")) = name;
}
It's read only because it doesn't allow you to update the records as you move through them. There is no reason why the sql they execute to get the resultset can't update data though.
If you have a read-only requirement, have your TextBox use a connection string that uses an account with only db_datareader permissions on the SQL database.
Otherwise, what's stopping the developer who is consuming your control from just connecting to the database and wreaking havoc using SqlCommand all on their own?
I have this code and it always returns -1. I have three tables (a picture would be more suggestive):
I want to see if the row is already in the ReservationDetails table and, if it's not, to insert it.
try
{
SqlConnection conn = new SqlConnection...
SqlCommand slct = new SqlCommand("SELECT * FROM ReservationDetails WHERE rID=@rID AND RNumber=@RNumber", conn);
slct.Parameters.AddWithValue("@rID", (int)comboBox1.SelectedValue);
slct.Parameters.AddWithValue("@RNumber", dataGridView1.SelectedRows[0].Cells[0].Value);
int noRows;//counts if we already have the entry in the table
conn.Open();
noRows = slct.ExecuteNonQuery();
conn.Close();
MessageBox.Show("The result of select="+noRows);
if (noRows ==0) //we can insert the new row
Have you read the documentation of SqlCommand.ExecuteNonQuery?
For UPDATE, INSERT, and DELETE statements, the return value is the number of rows affected by the command. When a trigger exists on a table being inserted or updated, the return value includes the number of rows affected by both the insert or update operation and the number of rows affected by the trigger or triggers. For all other types of statements, the return value is -1. If a rollback occurs, the return value is also -1.
And your query is SELECT.
You should
1) Change your TSQL to
SELECT COUNT(*) FROM ReservationDetails WHERE ...
(better still, use IF EXISTS ...)
2) and use ExecuteScalar():
noRows = (int) slct.ExecuteScalar();
Also: you will need to use a transaction (or some other atomic technique), or else someone could insert a row in-between you testing and trying to insert it...
All that said, it would be better to create a stored procedure that given your parameters, atomically tests and inserts into the table, returning 1 if successful, or 0 if the row already existed.
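For example, the call from C# might look something like this (the procedure name and its 1/0 return convention are hypothetical, connectionString stands in for however the connection is built in the question, and the parameter values are taken from the question's code):
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("ReservationDetails_InsertIfMissing", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@rID", (int)comboBox1.SelectedValue);
    cmd.Parameters.AddWithValue("@RNumber", dataGridView1.SelectedRows[0].Cells[0].Value);
    // read the procedure's RETURN value (1 = inserted, 0 = row already existed)
    var returnValue = cmd.Parameters.Add("@ReturnValue", SqlDbType.Int);
    returnValue.Direction = ParameterDirection.ReturnValue;
    conn.Open();
    cmd.ExecuteNonQuery();
    bool inserted = (int)returnValue.Value == 1;
}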
It is better to do it in a single query so that you do not need to hit the server twice.
Create a procedure like this and call it from the code:
IF NOT EXISTS (SELECT 1 FROM ReservationDetails WHERE rID = @rID AND RNumber = @RNumber)
BEGIN
    INSERT INTO ReservationDetails VALUES(@rID, @RNumber)
END
As per Microsoft:
You can use the ExecuteNonQuery to perform catalog operations (for example, querying the structure of a database or creating database objects such as tables), or to change the data in a database without using a DataSet by executing UPDATE, INSERT, or DELETE statements.
What you may need, instead of ExecuteNonQuery, is ExecuteScalar, with COUNT in your SELECT query.
i.e.
SqlCommand slct = new SqlCommand("SELECT COUNT(*) FROM ReservationDetails WHERE rID=@rID AND RNumber=@RNumber", conn);
Also, try to make use of the using statement in C#, so you don't need to worry about closing the connection manually, even if things fail.
i.e.
using (SqlConnection conn = new SqlConnection(connString))
{
SqlCommand cmd = new SqlCommand(sql, conn);
try
{
conn.Open();
var newProdID = (Int32)cmd.ExecuteScalar();
}
catch (Exception ex)
{
//Do stuff
}
}
see:
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlcommand.executescalar.aspx
@nickNatra
Whenever you use a SELECT command, it returns values, which can be consumed by either a DataSet or a SqlDataReader.
command.ExecuteNonQuery(), on the other hand, is meant for INSERT, UPDATE, and DELETE, where rows in your table are actually affected.
Yes, if you want to know how many records match your query, you can:
a) Change your query to "select count(*) from table", so that you get back only one value: the number of rows.
b) Execute that query with command.ExecuteScalar(), which returns the first column of the first row, i.e. the row count.
That satisfies your requirement.
Cheers!!