I'd like to sort on a column in the result of a stored procedure without having to add the Order By clause in the stored procedure. I don't want the data to be sorted after I have executed the query; sorting should be part of the query if possible. I have the following code:
public static DataTable RunReport(ReportQuery query)
{
OffertaDataContext db = new OffertaDataContext();
Report report = (from r in db.Reports where r.Id == (int)query.ReportId select r).Single();
//???: check security clearance.
DataSet dataSet = new DataSet();
/*
doesn't work, I guess the "Result" table hasn't been created yet;
if(!string.IsNullOrEmpty(query.SortField))
{
dataSet.DefaultViewManager.DataViewSettings["Result"].Sort = query.SortField + " " + (query.SortAscending ? "ASC" : "DESC");
}
*/
using (SqlConnection conn = new SqlConnection(Config.ConnectionString))
{
conn.Open();
using (SqlCommand exec = conn.CreateCommand())
{
using (SqlDataAdapter adapter = new SqlDataAdapter())
{
exec.Connection = conn;
exec.CommandType = CommandType.StoredProcedure;
exec.CommandText = report.ReportProc;
adapter.SelectCommand = exec;
try
{
adapter.Fill(dataSet, query.Skip, query.Take, "Result");
}
catch (Exception)
{
    throw; // rethrow without resetting the stack trace
}
finally
{
conn.Close();
}
return dataSet.Tables["Result"];
}
}
}
}
How do I add sorting?
Get the DataTable you are populating in the dataSet ("Result").
Now, there's no way to sort the DataTable itself, except via the query, view, or stored procedure that populates it.
Since you don't want to do it in the stored procedure, you can sort the DefaultView of the DataTable, or any DataView that is associated with the DataTable.
You can achieve it using the Sort property of the DataView. This is a string which specifies the column (or columns) to sort on, and the order (ASC or DESC).
Example:
myTable.DefaultView.Sort = "myColumn DESC";
You can now use the DefaultView to do whatever you want with it (bind it to a control, for example).
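Applied to the code in the question, a minimal sketch might look like the following. It assumes query.SortField has already been validated against the known column names of the "Result" table (sorting a view on an arbitrary, unvalidated string is asking for trouble):
DataTable result = dataSet.Tables["Result"];
if (!string.IsNullOrEmpty(query.SortField))
{
    // Sort the view, not the table; the underlying row order is unchanged.
    result.DefaultView.Sort = query.SortField + (query.SortAscending ? " ASC" : " DESC");
}
// Bind the sorted view directly, or materialize it back into a DataTable.
DataTable sorted = result.DefaultView.ToTable();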
To be honest, since you are using DataTable, you might as well just sort at the client.
Dynamic sorting (at the server) via SPs etc is always a pain; to do it in pure TSQL, you either need some horribly inefficient CASE block at the end of the SELECT, or you need to use dynamic SQL (for example via sp_ExecuteSQL), manipulating the ORDER BY in the final query. The only other option (in raw TSQL) would be to EXEC/INTO to get the data into a table variable (or temp table), then SELECT from this with an ORDER BY.
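As a rough sketch of that last EXEC/INTO option, driven from the question's own code (not a definitive implementation): the temp table's column list here is an assumption and must match what the procedure actually returns, and query.SortField must be validated against a whitelist of column names before being concatenated into the SQL:
// Capture the procedure's output in a temp table, then let the server apply the ORDER BY.
string direction = query.SortAscending ? "ASC" : "DESC";
exec.CommandType = CommandType.Text;
exec.CommandText =
    "SET NOCOUNT ON; " +
    "CREATE TABLE #Result (Id int, Name nvarchar(100)); " + // hypothetical schema - match the proc's output
    "INSERT INTO #Result EXEC " + report.ReportProc + "; " +
    "SELECT * FROM #Result ORDER BY " + query.SortField + " " + direction + ";";
adapter.SelectCommand = exec;
adapter.Fill(dataSet, query.Skip, query.Take, "Result");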
If it is an option, LINQ-to-SQL actually does OK at this; it supports querying (and composing against) UDFs - so rather than an SP, code the query in a UDF (the SP can always just SELECT from the UDF if you need to support legacy callers). Then you can use "order by" etc in a LINQ query:
var qry = from row in ctx.SomeMethod(args)
          orderby row.Name, row.Key
          select row;
(or there are various methods for adding a dynamic sort to a LINQ query - the above is just a simple example)
the final TSQL will be something like:
SELECT blah FROM theudf(args) ORDER BY blah
i.e. it will get it right, and do the "ORDER BY" at the server. This is particularly useful when used with Skip() and Take() to get paged data.
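For example, a paged and sorted call against such a UDF might look like this (GetReportRows is a hypothetical table-valued function mapped on the data context, and the property names are placeholders):
// LINQ-to-SQL composes the orderby, Skip and Take into the generated TSQL.
var page = (from row in ctx.GetReportRows(args)
            orderby row.Name, row.Key
            select row)
           .Skip(pageIndex * pageSize)
           .Take(pageSize)
           .ToList();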
Related
I am showing some SQL table results in a DataGridView in a WinForms app. I normally use DbEntities, but I had to use a join in my query to get results from multiple tables, so instead I used this code.
I want to add a query and a textbox to search the results while typing. How can I do that from what I already have?
SqlConnection con = new SqlConnection("server=.; Initial Catalog=winforms; Integrated Security=SSPI");
DataTable dt = new DataTable();
string sql = "SELECT Personel.ad, Personel.soyad, Personel.tc, Personel.dogum, Personel.isgiris, Birim.birimad AS [Birim], Sube.subead AS [Şube] FROM Personel JOIN Birim ON Birim.birimid = Personel.birimid JOIN Sube ON Sube.subeid = Personel.subeid";
con.Open();
SqlDataAdapter da = new SqlDataAdapter(sql, con);
da.Fill(dt);
dataGridView1.DataSource = dt;
Found this answer for searching in a DataTable.
So for your solution you would need to implement:
public static DataTable SearchInAllColums(this DataTable table, string keyword, StringComparison comparison)
{
if (string.IsNullOrEmpty(keyword))
{
return table;
}
DataRow[] filteredRows = table.Rows
.Cast<DataRow>()
.Where(r => r.ItemArray.Any(
c => c.ToString().IndexOf(keyword, comparison) >= 0))
.ToArray();
if (filteredRows.Length == 0)
{
DataTable dtProcessesTemp = table.Clone();
dtProcessesTemp.Clear();
return dtProcessesTemp;
}
else
{
return filteredRows.CopyToDataTable();
}
}
And then you could use it in your changeevent:
void textBox1_TextChanged(object sender, EventArgs e)
{
    SqlConnection con = new SqlConnection("server=.; Initial Catalog=winforms; Integrated Security=SSPI");
    DataTable dt = new DataTable();
    string sql = "SELECT Personel.ad, Personel.soyad, Personel.tc, Personel.dogum, Personel.isgiris, Birim.birimad AS [Birim], Sube.subead AS [Şube] FROM Personel JOIN Birim ON Birim.birimid = Personel.birimid JOIN Sube ON Sube.subeid = Personel.subeid";
    con.Open();
    SqlDataAdapter da = new SqlDataAdapter(sql, con);
    da.Fill(dt);
    // The extension method returns a new, filtered table; keep the result and bind it.
    DataTable filtered = dt.SearchInAllColums(textBox1.Text, StringComparison.OrdinalIgnoreCase);
    dataGridView1.DataSource = filtered;
}
HOWEVER: doing it like this will cause a lot of traffic to your SQL Server. I would strongly suggest you also implement some form of cache for the searchable data, if that's an option.
You can use the TextChanged event of the textbox and send the text to the function as a parameter, then add the text to the WHERE clause of the SQL query.
Be careful about user input in SQL queries. This is just an example and is not safe against SQL injection.
void textBox1_TextChanged(object sender, EventArgs e)
{
CallSQL(textBox1.Text);
}
void CallSQL(string filterText)
{
...
...
string sql = string.Format("SELECT ... WHERE Personel.Ad = {0}", filterText);
...
...
}
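A safer variant of the same idea, as a sketch: pass the user's text as a parameter instead of concatenating it into the SQL (the table and column names are taken from the question, and the usual System.Data/System.Data.SqlClient usings are assumed):
void CallSQL(string filterText)
{
    using (var con = new SqlConnection("server=.; Initial Catalog=winforms; Integrated Security=SSPI"))
    using (var cmd = new SqlCommand("SELECT * FROM Personel WHERE Personel.ad LIKE @filter", con))
    {
        // The parameter value travels separately from the SQL text, so it cannot inject.
        cmd.Parameters.Add("@filter", SqlDbType.NVarChar, 100).Value = filterText + "%";
        var dt = new DataTable();
        new SqlDataAdapter(cmd).Fill(dt);   // Fill opens and closes the connection itself
        dataGridView1.DataSource = dt;
    }
}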
Just to add to the question and to future similar problems:
Do not use SQL as an "insta-search" on TextChanged events. It is bad for many reasons, as people mentioned above. First of all, you can lock up SQL tables when users search and generate a lot of traffic as a consequence; even for a single user it is unsafe and wasteful of resources. Besides, it widens the surface for SQL injection. There are some good programming practices to make it safer; I'll try to elucidate a little more.
Okay, you already know this. But what is the correct way to do it?
Use objects and a DAL, and cache the data until it is old enough that you have to refresh it.
How:
First things first, create a DAL/DAO class that is used ONLY for SQL queries and operations. Do some searching on the subject and on why it is a bad idea to have your business rules and SQL code scrambled together. Here is a short text about it: About DAL/DAO
Then create a class for your objects, for example PersonelInfo, which contains every attribute that you need: name, documents, etc.
Here is some info about object handling: Using Objects with C#
Once your object class is done, use lists to store the objects and pass them to a DataTable, or directly to the DataGridView if you wish.
List<Objectname> listname = new List<Objectname>();
Then, use some loop to iterate through SQL data and fill your list.
Example:
while (dataReader.Read())
{
    Objectname objectname = new Objectname();
    objectname.attribute1 = dataReader["columnname"].ToString();
    objectname.attribute2 = dataReader["columnname"].ToString();
    objectname.attribute3 = dataReader["columnname"].ToString();
    listname.Add(objectname);
}
At the end, return your object list:
return listname;
That way you have a fully organized object list that you can:
Use as the base for a cache. That list contains all the information the user needs to search for; search it with a foreach loop (or LINQ) and return the matching values to the user (see the sketch after this list). Just remember to refresh it when it gets old.
Use as the source for a DataGridView or DataTable; it's cheap.
Use with JSON; it can be sent through an API or a simple UDP/TCP connection.
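A minimal sketch of that cache-and-search idea, assuming a hypothetical PersonelInfo class with Ad and Soyad properties (names borrowed from the earlier question) and System.Linq in scope:
List<PersonelInfo> cachedPersonel;   // filled once from the DAL, refreshed when it gets stale

void textBox1_TextChanged(object sender, EventArgs e)
{
    string keyword = textBox1.Text;
    // Search the in-memory cache instead of hitting SQL Server on every keystroke.
    var matches = cachedPersonel
        .Where(p => p.Ad.IndexOf(keyword, StringComparison.OrdinalIgnoreCase) >= 0
                 || p.Soyad.IndexOf(keyword, StringComparison.OrdinalIgnoreCase) >= 0)
        .ToList();
    dataGridView1.DataSource = matches;   // a List<T> can be bound directly to the grid
}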
Hope this short comment brings you some light on data handling.
Note that not all data is safe to keep that way. Passwords and other protected data should not be stored in memory or in other ways that can be exploited.
I have been asked to look at finding the most efficient way to take a DataTable input and write it to a SQL Server table using C#. The snag is that the solution must use ODBC Connections throughout, this rules out sqlBulkCopy. The solution must also work on all SQL Server versions back to SQL Server 2008 R2.
I am thinking that the best approach would be to use batch inserts of 1000 rows at a time using the following SQL syntax:
INSERT INTO dbo.Table1(Field1, Field2)
SELECT Value1, Value2
UNION
SELECT Value1, Value2
I have already written the code to check if a table corresponding to the DataTable input already exists on the SQL Server and to create one if it doesn't.
I have also written the code to create the INSERT statement itself. What I am struggling with is how to dynamically build the SELECT statements from the rows in the data table. How can I access the values in the rows to build my SELECT statement? I think I will also need to check the data type of each column in order to determine whether the values need to be enclosed in single quotes (') or not.
Here is my current code:
public bool CopyDataTable(DataTable sourceTable, OdbcConnection targetConn, string targetTable)
{
OdbcTransaction tran = null;
string[] selectStatement = new string[sourceTable.Rows.Count];
// Check if targetTable exists, create it if it doesn't
if (!TableExists(targetConn, targetTable))
{
bool created = CreateTableFromDataTable(targetConn, sourceTable);
if (!created)
return false;
}
try
{
// Prepare insert statement based on sourceTable
string insertStatement = string.Format("INSERT INTO [dbo].[{0}] (", targetTable);
foreach (DataColumn dataColumn in sourceTable.Columns)
{
insertStatement += dataColumn + ",";
}
insertStatement = insertStatement.TrimEnd(',') + ") ";
// Open connection to target db
using (targetConn)
{
if (targetConn.State != ConnectionState.Open)
targetConn.Open();
tran = targetConn.BeginTransaction();
for (int i = 0; i < sourceTable.Rows.Count; i++)
{
DataRow row = sourceTable.Rows[i];
// Need to iterate through columns in row, getting values and data types and building a SELECT statement
selectStatement[i] = "SELECT ";
}
insertStatement += string.Join(" UNION ", selectStatement);
using (OdbcCommand cmd = new OdbcCommand(insertStatement, targetConn, tran))
{
cmd.ExecuteNonQuery();
}
tran.Commit();
return true;
}
}
catch
{
tran.Rollback();
return false;
}
}
Any advice would be much appreciated. Also if there is a simpler approach than the one I am suggesting then any details of that would be great.
OK, since we cannot use stored procedures or bulk copy: when I modelled the various approaches a couple of years ago, the key determinant of performance was the number of calls to the server. So batching a set of MERGE or INSERT statements into a single call, separated by semicolons, was found to be the fastest method. I ended up batching my SQL statements. I think the max size of a SQL statement was 32k, so I chopped my batch into units of that size.
(Note - use StringBuilder instead of concatenating strings manually - it has a beneficial effect on performance)
Pseudo-code:
string sqlStatement = "INSERT INTO Tab1 VALUES ({0},{1},{2})";
StringBuilder sqlBatch = new StringBuilder();
foreach (DataRow row in myDataTable.Rows)
{
    sqlBatch.Append(string.Format(sqlStatement, row["Field1"], row["Field2"], row["Field3"]));
    sqlBatch.AppendLine(";");
}
using (OdbcCommand cmd = new OdbcCommand(sqlBatch.ToString(), myOdbcConnection))
{
    cmd.ExecuteNonQuery();
}
You need to deal with batch size complications, and with formatting the field values correctly for their data types in the string-replace step, but otherwise this will give the best performance.
The marked solution from PhillipH is open to several mistakes and to SQL injection.
Normally you should build a DbCommand with parameters and execute that, instead of executing a self-built SQL statement.
The CommandText must be "INSERT INTO Tab1 VALUES (?,?,?)" for ODBC and OLEDB; SqlClient needs named parameters ("@<Name>").
Parameters should be added with the dimensions of the underlying column.
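For example, one possible shape for a per-row parameterized insert over ODBC (a sketch only, reusing sourceTable, targetConn, tran and targetTable from the question's code; it trades the batching speed-up for injection safety and correct type handling):
// Build "INSERT INTO [dbo].[Table] ([Col1],[Col2],...) VALUES (?,?,...)" once.
string columns = string.Join(",", sourceTable.Columns.Cast<DataColumn>()
                                      .Select(c => "[" + c.ColumnName + "]").ToArray());
string markers = string.Join(",", Enumerable.Repeat("?", sourceTable.Columns.Count).ToArray());
string sql = string.Format("INSERT INTO [dbo].[{0}] ({1}) VALUES ({2})", targetTable, columns, markers);

using (OdbcCommand cmd = new OdbcCommand(sql, targetConn, tran))
{
    foreach (DataRow row in sourceTable.Rows)
    {
        cmd.Parameters.Clear();
        for (int i = 0; i < sourceTable.Columns.Count; i++)
            cmd.Parameters.AddWithValue("@p" + i, row[i]);   // ODBC binds by position, not by name
        cmd.ExecuteNonQuery();
    }
}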
I don't know how to do this query in C#.
There are two databases and each one has a table required for this query. I need to take the data from one database table and update the other database table with the corresponding PayrollID.
I have two tables in separate databases: Employee, which is in the techData database, and strStaff in the QLS database. In the Employee table I have StaffID but need to pull the PayrollID from strStaff.
Insert payrollID into Employee where staffID from strStaff = staffID from Employee
However I need to get the staffID and PayrollID from strStaff before I can do the insert query.
This is what I have got so far, but it won't work.
cn.ConnectionString = ConfigurationManager.ConnectionStrings["PayrollPlusConnectionString"].ConnectionString;
cmd.Connection = cn;
cmd.CommandText = "Select StaffId, PayrollID From [strStaff] Where (StaffID = #StaffID)";
cmd.Parameters.AddWithValue("#StaffID", staffID);
//Open the connection to the database
cn.Open();
// Execute the sql.
dr = cmd.ExecuteReader();
// Read all of the rows generated by the command (in this case only one row).
while (dr.Read()) {
    cmd.CommandText = "Insert into Employee, where StaffID = @StaffID";
}
// Close your connection to the DB.
dr.Close();
cn.Close();
Assuming you want to add data to the existing table, you have to use an UPDATE + SELECT statement (as I mentioned in a comment to the question). It might look like:
UPDATE emp SET emp.payrollID = sta.payrollID
FROM Employee AS emp INNER JOIN strStaff AS sta ON emp.staffID = sta.staffID
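Run from C#, it could look roughly like this. It assumes both databases live on the same SQL Server instance, so the tables can be addressed with three-part names (the database names are taken from the question):
string sql =
    "UPDATE emp SET emp.payrollID = sta.payrollID " +
    "FROM techData.dbo.Employee AS emp " +
    "INNER JOIN QLS.dbo.strStaff AS sta ON emp.staffID = sta.staffID";

using (var cn = new SqlConnection(ConfigurationManager.ConnectionStrings["PayrollPlusConnectionString"].ConnectionString))
using (var cmd = new SqlCommand(sql, cn))
{
    cn.Open();
    int rowsUpdated = cmd.ExecuteNonQuery();   // number of Employee rows that received a PayrollID
}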
I have added some clarity to your question: the essential part is that you want to create a C# procedure to accomplish your task (not using SQL Server Management Studio, SSIS, bulk insert, etc). Pertinent to this, there will be 2 different connection objects, and 2 different SQL statements to execute on those connections.
The first task would be retrieving data from the first DB (for certainty let's call it the source DB/Table) using a SELECT SQL statement, and storing it in some temporary data structure, either per row (as in your code) or the entire table at once using a .NET DataTable object, which will give a substantial performance boost. For this purpose, you should use the first connection object pointing at the source DB/Table (btw, you can close that connection as soon as you get the data).
The second task would be inserting the data into second DB (target DB/Table), though from your business logic it's a bit unclear how to handle possible data conflicts if records with identical ID already exist in the target DB/Table (some clarity needed). To complete this operation you should use the second connection object and second SQL query.
The sample code snippet to perform the first task, which allows retrieving entire data into .NET/C# DataTable object in a single pass is shown below:
private static DataTable SqlReadDB(string connString, string sql)
{
    DataTable _dt;
    try
    {
        using (SqlConnection _connSql = new SqlConnection(connString))
        {
            using (SqlCommand _commandSql = new SqlCommand(sql, _connSql))
            {
                _commandSql.CommandType = CommandType.Text;
                _connSql.Open();
                using (SqlDataReader _dataReaderSql = _commandSql.ExecuteReader(CommandBehavior.CloseConnection))
                {
                    _dt = new DataTable();
                    _dt.Load(_dataReaderSql);
                }
            }
            return _dt;
        }
    }
    catch { return null; }
}
The second part (adding data to the target DB/Table) you should code based on the clarified business logic (i.e. data-conflict resolution: do you want to update existing records, skip them, etc.). Just iterate through the data rows in the DataTable object and perform either INSERT or UPDATE SQL operations.
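As a rough sketch of that second step, reduced to a plain per-row UPDATE (the column names and int types are assumptions; swap in an INSERT or MERGE depending on your conflict rules):
private static void SqlWriteDB(string connString, DataTable dt)
{
    using (var conn = new SqlConnection(connString))
    using (var cmd = new SqlCommand(
        "UPDATE Employee SET PayrollID = @PayrollID WHERE StaffID = @StaffID", conn))
    {
        cmd.Parameters.Add("@StaffID", SqlDbType.Int);
        cmd.Parameters.Add("@PayrollID", SqlDbType.Int);
        conn.Open();
        foreach (DataRow row in dt.Rows)
        {
            cmd.Parameters["@StaffID"].Value = row["StaffID"];
            cmd.Parameters["@PayrollID"].Value = row["PayrollID"];
            cmd.ExecuteNonQuery();
        }
    }
}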
Hope this may help. Kind regards,
I have a requirement where I need to read queries from an Access DB in C# and check whether a query contains a keyword like "KEY"; if it does, I need to enclose that keyword in square brackets ("[]"), just like it is done in SQL Server.
Could someone suggest how to do that?
You can retrieve the query text like this:
string connString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\...\myDB.mdb";
using (var conn = new OleDbConnection(connString)) {
conn.Open();
string[] restrictions = new string[] { null, null, "myQuery" };
DataTable schema = conn.GetSchema("Views", restrictions);
if (schema.Rows.Count > 0) {
DataRow row = schema.Rows[0];
string queryText = (string)row["VIEW_DEFINITION"];
Console.WriteLine(queryText);
}
}
If you drop the restrictions argument with the query name, conn.GetSchema("Views") returns one row for each query. If you query conn.GetSchema("Procedures"), other types of queries (insert, update and DDL statements, which are not considered views) are returned in row["PROCEDURE_DEFINITION"].
View (query) names are returned in row["TABLE_NAME"] and procedure names in row["PROCEDURE_NAME"].
And you can update the query like this:
using (var conn = new OleDbConnection(connString)) {
conn.Open();
var cmd = new OleDbCommand("DROP PROCEDURE myQuery", conn);
cmd.ExecuteNonQuery();
cmd = new OleDbCommand("CREATE PROCEDURE myQuery AS SELECT * FROM myTable", conn);
cmd.ExecuteNonQuery();
}
Strangely enough the OleDb CREATE DDL (Data Definition Language) designates the queries as 'procedures' but the schema table returns a 'VIEW_DEFINITION' and the query name is returned in the column 'TABLE_NAME'. SELECT queries must be retrieved as "Views", other types of queries as "Procedures"; however, both types are created as PROCEDUREs.
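Once you have queryText, wrapping a bare keyword such as KEY in square brackets before re-creating the query can be done with a small regex (a sketch; the keyword list is just an example):
// Wrap bare occurrences of reserved words in [brackets]; already-bracketed ones are left alone.
string[] reservedWords = { "KEY" };
foreach (string word in reservedWords)
{
    queryText = System.Text.RegularExpressions.Regex.Replace(
        queryText,
        @"(?<!\[)\b" + word + @"\b(?!\])",
        "[" + word + "]",
        System.Text.RegularExpressions.RegexOptions.IgnoreCase);
}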
While I was testing the answer that @Olivier Jacot-Descombes provided, I was not able to retrieve the text representation of all the queries. Therefore I applied another method, where you open the existing MS Access database instance and read the queries that are stored in it.
Here is the class I used:
public class MsAccess
{
private Microsoft.Office.Interop.Access._Application _oAccess;
public MsAccess(string path)
{
_oAccess = (Microsoft.Office.Interop.Access._Application)System.Runtime.InteropServices.Marshal.BindToMoniker(path);
}
public string ReturnSqlQueryText(string queryName)
{
string queryDef = null;
var qdefs = _oAccess.CurrentDb().QueryDefs;
foreach (QueryDef qdef in qdefs)
{
if(qdef.Name.Equals(queryName))
queryDef = qdef.SQL;
}
return queryDef;
}
}
Using this code might require adding references to both Microsoft.Office.Interop.Access.Dao and Microsoft.Office.Interop.Access (15.0.0.0), which you can find under Extensions in the Reference Manager.
I'm using the ODP.NET DLL in a project that is accessing Oracle.
Users can type any SQL into a text box, which is then executed against the DB. I've been trying to use the OracleDataAdapter to populate a DataTable with the result set, but I want to be able to return the result set in stages (for large select queries).
An example of my problem is...
If a select query returns 13 rows of data, the code snippet below will execute without issue until the fourth time oda.Fill (start row is 15 which doesn't exist) is called, I presume because it is calling into a reader that has closed or something similar.
It then will throw a System.InvalidOperationException with the message - Operation is not valid due to the current state of the object.
How can I find out how many rows in total the command will eventually contain (so that I don't encounter the exception)?
OracleDataAdapter oda = new OracleDataAdapter(oracleCommand);
oda.Requery = false;
DataTable dt = new DataTable();
var dts = new DataTable[] { dt };
oda.Fill(0, 5, dts);
var a = dts[0].Rows.Count;
oda.Fill(a, 5, dts);
var b = dts[0].Rows.Count;
oda.Fill(b, 5, dts);
var c = dts[0].Rows.Count;
oda.Fill(c, 5, dts);
var d = dts[0].Rows.Count;
Note: I've omitted the connection and oracle command objects for brevity.
EDIT 1:
I've just thought I could just wrap the SQL entered by the user in another query and execute it...
SELECT COUNT(*) FROM (...intial query in here...)
but this isn't exactly a clean solution, and surely there is a method somewhere that I haven't seen?
Thanks in advance.
For paging in Oracle, see: http://www.oracle.com/technology/oramag/oracle/06-sep/o56asktom.html
There is no way to know the record set count without running a separate count(*) query. This is by design. The DataReader and DataAdapter are forward-only and read-only.
If efficiency is a concern (i.e., large record sets), one should let the database do the paging and not ask the OracleDataAdapter to run the full query. Imagine if Google filled a DataTable with all 1M+ results for each user search! The following article addresses this concern, although the examples use SQL Server:
http://www.asp.net/data-access/tutorials/efficiently-paging-through-large-amounts-of-data-cs
I've revised my example below to allow paging on any sql query. The calling procedure is responsible for keeping track of the user's current page and page size. If the result set is less than the requested page size, there are no more pages.
Of course, running custom SQL from user input is a huge security risk. But that wasn't the question at hand.
Good luck! --Brett
DataTable GetReport(string sql, int pageIndex, int pageSize)
{
DataTable table = new DataTable();
int rowStart = pageIndex * pageSize + 1;
int rowEnd = (pageIndex + 1) * pageSize;
string qry = string.Format(
#"select *
from (select rownum ""ROWNUM"", a.*
from ({0}) a
where rownum <= :rowEnd)
where ""ROWNUM"" >= :rowStart
", sql);
try
{
using (OracleConnection conn = new OracleConnection(_connStr))
{
OracleCommand cmd = new OracleCommand(qry, conn);
cmd.Parameters.Add(":rowEnd", OracleDbType.Int32).Value = rowEnd;
cmd.Parameters.Add(":rowStart", OracleDbType.Int32).Value = rowStart;
cmd.CommandType = CommandType.Text;
conn.Open();
OracleDataAdapter oda = new OracleDataAdapter(cmd);
oda.Fill(table);
}
}
catch (Exception)
{
throw;
}
return table;
}
You could add an Analytic COUNT to your query:
SELECT foo, bar, COUNT(*) OVER () AS TheCount FROM your_table WHERE ...;
That way the count of the entire query is returned with each row in TheCount, and you could set your loop to terminate accordingly.
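In C#, the total can then be read from any row of the first page (using the dts array from the question's snippet; the DataTable column lookup is case-insensitive, so the alias casing doesn't matter):
// Every returned row carries the same total, so one read is enough.
int totalRows = 0;
if (dts[0].Rows.Count > 0)
    totalRows = Convert.ToInt32(dts[0].Rows[0]["TheCount"]);
// Stop calling Fill once dts[0].Rows.Count >= totalRows.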
To gain control over the DataTable fill loop, you need to own the loop.
Then build your own function to fill the DataTable using an OracleDataReader.
To get column information, you can use dataReader.GetSchemaTable.
To fill the table:
MyTable.BeginLoadData()
Dim Values(MySchema.Rows.Count - 1) As Object
Do While MyReader.Read()
    MyReader.GetValues(Values)
    MyTable.Rows.Add(Values)
    'Include here your control over the loaded row count
Loop
MyTable.EndLoadData()
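The same idea in C#, for reference (a sketch that reuses the question's oracleCommand; the five-row limit mirrors the page size used above):
// Fill a DataTable manually from a data reader, stopping after maxRows rows.
DataTable table = new DataTable();
using (var reader = oracleCommand.ExecuteReader())
{
    // Build the columns from the reader's metadata.
    for (int i = 0; i < reader.FieldCount; i++)
        table.Columns.Add(reader.GetName(i), reader.GetFieldType(i));

    object[] values = new object[reader.FieldCount];
    int rowCount = 0, maxRows = 5;
    table.BeginLoadData();
    while (rowCount < maxRows && reader.Read())
    {
        reader.GetValues(values);
        table.Rows.Add(values);
        rowCount++;
    }
    table.EndLoadData();
}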