I came here from this question, but I have a different case. I need my result in a DataTable, and I have two potential methods:
public static DataTable SelectDataTable(string query, string ConnectionString)
{
    using (SqlConnection myConnection = new SqlConnection(ConnectionString))
    {
        using (SqlDataAdapter myDataAdapter = new SqlDataAdapter(query, myConnection))
        {
            DataTable dt = new DataTable();
            myDataAdapter.Fill(dt);
            return dt;
        }
    }
}
and
public static DataTable SelectDataTable(string query, string ConnectionString)
{
    using (SqlConnection myConnection = new SqlConnection(ConnectionString))
    {
        using (SqlCommand cmd = new SqlCommand(query, myConnection))
        {
            myConnection.Open();
            DataTable dt = new DataTable();
            dt.Load(cmd.ExecuteReader(CommandBehavior.CloseConnection));
            return dt;
        }
    }
}
So my question: is there a difference between
SqlDataAdapter + Fill()
and
SqlDataReader + DataTable + Load()
and which of these methods is to be preferred?
Joel's answer is pretty detailed, which is what makes this question not a duplicate.
In fact, I don't use any of the mentioned advantages of the SqlDataReader; I only use it to fill a DataTable, which made me expect the answer to be: it's the same?! Unfortunately, it's hard to guess what's happening under the hood.
Unless you are working with big data, I wouldn't expect huge performance gains from using a dataReader as opposed to a dataAdapter.
That being said, the link Pawel posted has a pretty decent write-up explaining the differences and advantages of both.
The main takeaway is that readers are for reading data; they really do nothing else.
Because they don't do much else, they carry relatively little performance overhead.
DataAdapters are going to allow you to do more than the Readers, but in your case, it sounds like you don't need to do anything other than read in the records.
To reiterate, unless you are working with big data (like hundreds of thousands/millions of rows) I wouldn't expect the performance savings by using the dataReader to be very noticeable.
That is something only you will be able to determine when benchmarking with your own data.
Let us know if that clears up any confusion you may have had about the differences between DataAdapter and DataReader.
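For reference, here is a minimal benchmarking sketch, not anything from the question itself: the connection string and query are whatever you want to measure, and it simply times both approaches with Stopwatch so you can compare them against your own data.

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

static class FillVsLoadBenchmark
{
    public static void Compare(string connectionString, string query)
    {
        var sw = Stopwatch.StartNew();
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(query, connection))
        {
            var dt = new DataTable();
            adapter.Fill(dt); // Fill opens and closes the connection itself
        }
        Console.WriteLine($"SqlDataAdapter.Fill: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            var dt = new DataTable();
            using (var reader = command.ExecuteReader())
            {
                dt.Load(reader); // the reader is streamed straight into the DataTable
            }
        }
        Console.WriteLine($"DataTable.Load(reader): {sw.ElapsedMilliseconds} ms");
    }
}

Run it a few times and discard the first pass, since connection pooling and plan caching skew the initial measurement.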
Related
I'm currently trying to use C# to read through an SQL DB. To do so, I use OleDb with a select statement that goes into a data adapter, which then populates a dataset. I then iterate through each row and calculate stuff.
First of all, I feel like there's a better/more efficient way of doing this because I NEVER actually write back to the SQL DB. I just calculate based on what I'm selecting.
Anyways, past a certain point I get out of memory errors and/or an error from Ssms.exe saying "a new guard page for the stack cannot be created."
From the other questions I've seen, I need to use DataReader but I can't seem to get it to work the same way as the data adapter (which I suppose isn't that surprising).
The code I have now:
OleDbConnection myConn = new OleDbConnection(@"SQLDB connection string here");
OleDbCommand cmd = new OleDbCommand();
cmd.CommandText = "<select statement here>";
cmd.Connection = myConn;
cmd.CommandTimeout = 0;
OleDbDataAdapter da = new OleDbDataAdapter(cmd);
DataSet ds = new DataSet();
da.Fill(ds);
myConn.Close();
foreach (DataTable table in ds.Tables)
{
    foreach (DataRow dr in table.Rows)
    {
        // do stuff
    }
}
I guess my question is twofold, like I said above. One, would a DataReader solve my problem and allow me to iterate through the data, and two, how do I adapt the code snippet above to support that?
Also, since I've seen it elsewhere, I'm using x64 on the application.
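Not an authoritative answer, but a minimal sketch of the same code rewritten around an OleDbDataReader (the connection string and select statement are still placeholders, and the column access inside the loop is an assumption) could look like this:

using (OleDbConnection myConn = new OleDbConnection(@"SQLDB connection string here"))
using (OleDbCommand cmd = new OleDbCommand("<select statement here>", myConn))
{
    cmd.CommandTimeout = 0;
    myConn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Only the current row is held in memory here, so the calculations
            // can run without buffering the whole result set into a DataSet.
            // e.g. double value = reader.IsDBNull(0) ? 0.0 : reader.GetDouble(0);
            // do stuff
        }
    }
}

Because the reader streams rows one at a time instead of materialising everything into a DataSet first, this tends to avoid the out-of-memory behaviour described above.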
I have been writing database software for a few years now and know that there are multiple ways to access data. Personally, I do everything manually when pulling data, using a data reader object. This has just started to get on my nerves when working with tables that have a large number of columns. It becomes very inefficient to have to write 30 lines of this code at a time:
if (reader[count] != DBNull.Value)
    someObject = reader.GetString(count++);
else
    count++;
It is bad enough that the queries themselves take a long time to type out and get ready.
I was thinking of using a DataTable to retrieve my records, since you can do that in a few lines, and then writing a few helper methods that iterate through the rows and return an observable collection of objects. Is there a simpler or more concise way to go about doing this?
I just use the datatable
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open(); // optional: Fill opens and closes the connection itself if it is closed
    DataTable dt = new DataTable();
    using (SqlDataAdapter adapter = new SqlDataAdapter("select * from mytable", conn))
    {
        adapter.Fill(dt);
    }
}
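To get from the DataTable to the observable collection mentioned in the question, a small mapping helper is usually enough. This is only a sketch; the Person type, its properties, and the column names are assumptions, not anything from the original code.

using System.Collections.ObjectModel;
using System.Data; // Field<T> comes from System.Data.DataSetExtensions

public class Person // hypothetical target type; substitute your own
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static ObservableCollection<Person> ToPeople(DataTable dt)
{
    var result = new ObservableCollection<Person>();
    foreach (DataRow row in dt.Rows)
    {
        result.Add(new Person
        {
            // Field<T> maps DBNull to null for nullable/reference types,
            // which replaces the manual DBNull.Value checks shown earlier.
            Id = row.Field<int?>("Id") ?? 0,
            Name = row.Field<string>("Name")
        });
    }
    return result;
}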
I'm using a datatable as the datasource of some dropdowns on a page, but have noticed that the page is very slow during the postbacks.
I've tracked it through to here:
DataTable dt = new DataTable();
dt.Load(sqlCmd.ExecuteReader()); // this takes ages
The sql command is a parametrised query, not a stored procedure (the return values and where are quite 'dynamic' so this wouldn't be practicable), but nevertheless a simple select union query.
It usually returns between 5 and 20 options per dropdown, depending on what's been selected in the other dropdowns.
When I run the query in Management Studio, it's done in under a second. Here it can take up to 7 seconds per dropdown; with 6 dropdowns on the page, it soon adds up.
I have also tried with a SqlDataAdapter:
SqlDataAdapter sqlDa = new SqlDataAdapter(sqlCmd);
sqlDa.Fill(dt); // this takes ages
but this was just as slow.
I have this on 2 different systems and on both have the same performance issues.
If anyone knows a better (faster) method, or knows why this is so slow, that would be great.
Not the best thread I've seen on the issue, but there are good links inside, and it's in my post history:
SQL Query that runs fine in SSMS runs very slow in ASP.NET
The SQL optimizer sometimes likes to decide what's best, and you'll have to dig into your query with some tracing and logging of execution plans. It may very well be something as buried as a bad index, or your query code might need optimization. Since we don't have the query code (and having it may or may not be helpful), I'd recommend you follow the guides linked in the post above and close your question.
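One thing worth checking while you trace: a parameterised query sent from ADO.NET can get a different plan than the same text typed into SSMS, often because AddWithValue infers a different parameter type (for example nvarchar against a varchar column) or because of parameter sniffing. A hedged sketch of declaring the parameter type explicitly (the parameter name, type and length here are assumptions):

// Instead of letting the type be inferred:
// sqlCmd.Parameters.AddWithValue("@selectedValue", selectedValue);
// declare it to match the column exactly:
sqlCmd.Parameters.Add("@selectedValue", SqlDbType.VarChar, 50).Value = selectedValue;

If the types already match, capturing the actual execution plans from both sides is the next step, as the linked post describes.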
Here is an example of how you can load a DataTable very quickly; notice how I list only the specific columns that I want returned:
private DataTable GetTableData()
{
    string sql = "SELECT Id, FirstName, LastName, [Desc] FROM MySqlTable";
    using (SqlConnection myConnection = new SqlConnection(connectionString))
    {
        using (SqlCommand myCommand = new SqlCommand(sql, myConnection))
        {
            myConnection.Open();
            using (SqlDataReader myReader = myCommand.ExecuteReader())
            {
                DataTable myTable = new DataTable();
                myTable.Load(myReader);
                return myTable;
            }
        }
    }
}
If you want to use a DataAdapter to fill the DataTable, here is a simple example:
private void FillAdapter()
{
    using (SqlConnection conn = new SqlConnection("Your ConnectionString"))
    {
        conn.Open();
        using (SqlDataAdapter dataAdapt = new SqlDataAdapter("SELECT * FROM EmployeeIDs", conn))
        {
            DataTable dt = new DataTable();
            dataAdapt.Fill(dt);
            // dataGridView1.DataSource = dt; // if you want to display the data in a DataGridView
        }
    }
}
This question already has answers here:
Is datareader quicker than dataset when populating a datatable?
I am using the following code (Variant DataReader):
public DataTable dtFromDataReader(List<string> lstStrings)
{
    OleDBConn_.Open();
    using (OleDbCommand cmd = new OleDbCommand())
    {
        DataTable dt = new DataTable();
        cmd.Connection = OleDBConn_;
        cmd.CommandText = "SELECT * FROM TableX WHERE SUID=?";
        foreach (string aString in lstStrings)
        {
            cmd.Parameters.AddWithValue("?", aString);
            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                dt.Load(reader);
            }
            cmd.Parameters.Clear();
        }
        return dt;
    }
}
and compare it to (Variant DataAdapter):
public DataTable dtFromDataAdapter(List<string> lstStrings)
{
    DataTable dt = new DataTable();
    foreach (string aString in lstStrings)
    {
        string sOledb_statement = String.Concat("SELECT * FROM TableX WHERE SUID='", aString, "'");
        using (OleDbDataAdapter oleDbAdapter = new OleDbDataAdapter(sOledb_statement, OleDBConn_))
        {
            GetOleDbRows = oleDbAdapter.Fill(dt);
        }
    }
    return dt;
}
When I connect to an offline database (Microsoft Access), my reading times are (~1.5k retrieved items):
DataReader 420 ms
DataAdapter 5613 ms
When reading from an Oracle server (~30k retrieved items):
DataReader 323845 ms
DataAdapter 204153 ms
(several tests; the times do not change much)
Even changing the order of the calls (DataAdapter before DataReader) didn't change much (I thought there might have been some precaching).
I thought DataTable.Load should be somewhat faster than DataAdapter.Fill?
And I still believe, even though I see the results, that it should be faster. Where am I losing my time? (There are no unhandled exceptions.)
Your comparison isn't really Adapter vs. DataReader with the way you have the code set up. You are really comparing the Adapter.Fill and DataTable.Load methods.
The DataReader would normally be faster on a per-record basis because you would be traversing the records one at a time and can react accordingly when you read each record.
Since you are returning a DataTable in both instances, the Adapter.Fill method would probably be the optimal choice to use. It was designed to do just that.
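If you stay with the adapter, a hedged variant of the question's code that reuses a single parameterised select command instead of concatenating a new statement per string might look like this (TableX, SUID and OleDBConn_ come from the question; everything else is an assumption):

public DataTable dtFromDataAdapterParameterised(List<string> lstStrings)
{
    DataTable dt = new DataTable();
    using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM TableX WHERE SUID=?", OleDBConn_))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(cmd))
    {
        OleDbParameter suid = cmd.Parameters.Add("@suid", OleDbType.VarChar);
        foreach (string aString in lstStrings)
        {
            suid.Value = aString; // reuse the same command (and plan) for every key
            adapter.Fill(dt);     // appends this key's rows to the same table
        }
    }
    return dt;
}

Whether that actually closes the gap you measured is something only profiling against your own Access and Oracle sources will tell.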
Which one of these techniques is faster?
1)
DbDataAdapter dataAdapter = _factory.CreateDataAdapter();
dataAdapter.SelectCommand = _command;
DataSet dataSet = new DataSet();
dataAdapter.Fill(dataSet);
2)
DataTable dt = new DataTable();
IDataReader iDataReader = _command.ExecuteReader();
dt.Load(iDataReader);
iDataReader.Close();
Have a look at these links
DataReaders, DataSets, and performance
and
DataAdapter.Fill preferable to DataReader?
As mentioned in the comments on your question, it would be best to test for the given situation at hand; there is never one rule that applies to all.