I'm having some performance issues when I return a record set with more than 1,000 records.
The record count sometimes runs upwards of 2,100 but can be as low as 10.
I have some bulk actions that I take on all the records by selecting them.
However, when the number is low, the GridView is fine; when the record count is greater than 500, I see performance issues on the page.
What I want to happen is: if there are more than 500 records, DO NOT DISPLAY THE GRID; instead, show a download button that exports to CSV, or perform other control actions on the page.
My issue:
Even if I tell it not to display the grid and instead display a message and a button, the page is still slow.
Below is my C# code for populating the GridView. Some unimportant parts have been removed for readability.
How can I adjust my C# code for better performance?
SqlConnection conn = new SqlConnection(ConfigurationManager.AppSettings["ConnectString"].ToString());
SqlCommand cmd = conn.CreateCommand();
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = "SomeProcedure";
cmd.Parameters.Add(SearchParam);
try {
    DataTable GridData = new DataTable();
    conn.Open();
    using (SqlDataAdapter Sqlda = new SqlDataAdapter(cmd)) {
        Sqlda.Fill(GridData);
    }
    if (GridData.Rows.Count == 0) {
        lblSearchMsg.Text = "No Fee Records are in the Queue at this time.";
    } else {
        if (GridData.Rows.Count > 500) {
            lblSearchMsg.Text = "More than " + gridLimit.ToString() + " records returned.";
            //Show the download button
        } else {
            //Persist the table in the Session object. (for sorting)
            Session["GridData"] = GridData;
            lblRowCount.Text = "Count: " + GridData.Rows.Count.ToString();
            myGridView.DataSource = GridData;
            myGridView.DataBind();
            myGridView.Visible = true;
        }
    }
} catch (Exception ex) {
    //Do the error stuff
} finally {
    if (conn != null) {
        conn.Close();
    }
}
Create a separate procedure that returns only the row count.
Check that value instead of the row count of a fully retrieved data set, and retrieve the full data set only when needed.
Keep in mind you can use the same connection for both calls; there is no need to close the connection between them.
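A minimal sketch of that count-first flow, assuming a hypothetical SomeProcedureCount procedure that applies the same filter but returns only SELECT COUNT(*) (the @Search parameter and the connectString/searchValue names are illustrative, not from the original code):

using (SqlConnection conn = new SqlConnection(connectString))
{
    conn.Open();

    // 1) Cheap count-only call.
    int rowCount;
    using (SqlCommand countCmd = new SqlCommand("SomeProcedureCount", conn))
    {
        countCmd.CommandType = CommandType.StoredProcedure;
        countCmd.Parameters.AddWithValue("@Search", searchValue);
        rowCount = (int)countCmd.ExecuteScalar();
    }

    if (rowCount > 500)
    {
        lblSearchMsg.Text = "More than 500 records returned.";
        // Show the download/export button instead of binding the grid.
    }
    else
    {
        // 2) Full retrieval on the same open connection, only when needed.
        using (SqlCommand cmd = new SqlCommand("SomeProcedure", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Search", searchValue);
            DataTable gridData = new DataTable();
            gridData.Load(cmd.ExecuteReader());
            Session["GridData"] = gridData;
            myGridView.DataSource = gridData;
            myGridView.DataBind();
            myGridView.Visible = true;
        }
    }
}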
If you determine you need to fill a GridView and there is no need to edit the data, you can read into the DataTable without using an adapter. Here is the basic idea; modify it with using statements or try/catch as you prefer:
SqlConnection conn = new SqlConnection(connString);
string query = "SELECT * FROM ....";
SqlCommand cmd = new SqlCommand(query, conn);
conn.Open();
SqlDataReader dr = cmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr); // pulls the whole result set into the table and closes the reader
GridView1.DataSource = dt;
GridView1.DataBind();
I am working on a C# Windows application that populates records from SQL Server into a DataGridView, with a dynamic checkbox in each row. I want to select particular rows via their checkboxes. So far I have successfully achieved my goal, but I'm facing a minor issue with saving the checked status.
For example, I want to check only those records whose Name = Max. I have a textbox, and in its TextChanged event I run a LIKE query:
try
{
    SqlCommand cmd = null;
    SqlConnection con = null;
    Ranks rank = new Ranks();
    con = new SqlConnection(cs.DBcon);
    con.Open();
    cmd = con.CreateCommand();
    cmd.CommandText = "Select * from Records where Name like @Name order by Pno";
    cmd.Parameters.AddWithValue("@Name", "%" + FilterByNameTextbox.Text.Trim() + "%");
    SqlDataAdapter adapter1 = new SqlDataAdapter(cmd);
    DataTable dt = new DataTable();
    adapter1.Fill(dt);
    dataGridView1.DataSource = dt;
    Make_fields_Colorful();
}
catch (Exception exception)
{
    MessageBox.Show(exception.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Hand);
}
If I write Max in the filter-by-name textbox, the LIKE query above returns 3 records whose names start with Max. I then check 2 of those 3 records using the dynamic checkboxes; up to this point my code runs perfectly. Next I want to check records whose names start with Ali, so I write ali in the textbox and it returns the matching rows. The problem is that this removes my previously checked records. How can I keep the checked records for both the Max and Ali rows?
Code for adding the dynamic checkboxes in each row:
DataGridViewCheckBoxColumn checkBoxColumn = new DataGridViewCheckBoxColumn();
checkBoxColumn.Name = "checkBoxColumn";
checkBoxColumn.DataPropertyName = "Report";
checkBoxColumn.HeaderText = "Report";
dataGridView1.Columns.Insert(10, checkBoxColumn);
dataGridView1.RowTemplate.Height = 100;
dataGridView1.Columns[10].Width = 50;
I suggest you achieve this by caching the selected rows. First, you should have a list of cached rows:
List<DataGridViewRow> CachedRows = new List<DataGridViewRow>();
Then add an event handler for cell value changes, like the following:
dataGridView1.CellValueChanged += view_CellValueChanged;
The handler should check whether the changed column is the checkbox column and whether it is now checked; it should look something like the following:
try
{
    if (e.ColumnIndex == indexOfCheckBoxColumn)
    {
        if ((bool)dataGridView1.Rows[e.RowIndex].Cells[e.ColumnIndex].Value == true)
        {
            CachedRows.Add((DataGridViewRow)dataGridView1.Rows[e.RowIndex].Clone());
        }
        else if (CachedRows.Contains(dataGridView1.Rows[e.RowIndex])) // Validate if this works; if not, associate each row with a unique key (for example an id) using a dictionary
        {
            CachedRows.Remove(dataGridView1.Rows[e.RowIndex]);
        }
    }
}
catch (Exception ex)
{
    // swallowed; consider logging
}
Then, after the filter changes, re-add the cached rows, so the code becomes:
try
{
    SqlCommand cmd = null;
    SqlConnection con = null;
    Ranks rank = new Ranks();
    con = new SqlConnection(cs.DBcon);
    con.Open();
    cmd = con.CreateCommand();
    cmd.CommandText = "Select * from Records where Name like @Name order by Pno";
    cmd.Parameters.AddWithValue("@Name", "%" + FilterByNameTextbox.Text.Trim() + "%");
    SqlDataAdapter adapter1 = new SqlDataAdapter(cmd);
    DataTable dt = new DataTable();
    adapter1.Fill(dt);
    dataGridView1.DataSource = dt;
    // add the following (note: a DataGridView rejects programmatically added
    // rows while it is data-bound, so you may need to merge these into dt instead)
    if (CachedRows.Any())
    {
        dataGridView1.Rows.AddRange(CachedRows.ToArray());
        CachedRows.Clear();
    }
    Make_fields_Colorful();
}
catch (Exception exception)
{
    MessageBox.Show(exception.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Hand);
}
I am trying to get data from a MySQL table and store it in a dictionary. I know I can do that with a loop, but the table contains more than a million tuples and looping will be slow. Is there any way to do this without looping over all the entries?
The table has two columns: the first (the key) is float, the second (the value) is varchar.
My current approach:
public static Dictionary<Single, string> GetData()
{
    Dictionary<Single, string> dic = new Dictionary<Single, string>();
    string query = "select * from table;";
    if (OpenConnection() == true)
    {
        try
        {
            MySqlCommand cmd = new MySqlCommand(query, connection);
            MySqlDataReader reader = cmd.ExecuteReader();
            if (reader.Read())
            {
                //Something to store DB columns into dictionary without having to loop.
            }
            reader.Close();
        }
        catch (Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }
    return dic;
}
The following is what you want:
try
{
    MySqlCommand cmd = new MySqlCommand(query, connection);
    MySqlDataAdapter msda = new MySqlDataAdapter(cmd);
    DataSet ds = new DataSet();
    msda.Fill(ds);
    your_control.DataSource = ds;
    your_control.DataBind();
    // "your control" means a repeater, grid view, or anything else
    // that can show all your elements together.
}
catch (Exception ex)
{
    Console.WriteLine(ex.Message);
}
But you should also consider that with millions of records it is better to select them partially, for example 100 records per page, using ROW_NUMBER (or LIMIT in MySQL) or similar methods.
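A paging sketch along those lines, reading one page into the dictionary (the column names k and v and the table name are illustrative; `connection` is an open MySqlConnection):

using System;
using System.Collections.Generic;
using MySql.Data.MySqlClient;

public static Dictionary<float, string> GetPage(MySqlConnection connection, int pageIndex, int pageSize)
{
    var page = new Dictionary<float, string>();
    string query = "select k, v from `table` order by k limit @offset, @count;";
    using (var cmd = new MySqlCommand(query, connection))
    {
        cmd.Parameters.AddWithValue("@offset", pageIndex * pageSize);
        cmd.Parameters.AddWithValue("@count", pageSize);
        using (var reader = cmd.ExecuteReader())
        {
            // The reader loop is unavoidable, but each call touches
            // only pageSize rows instead of a million.
            while (reader.Read())
                page[reader.GetFloat(0)] = reader.GetString(1);
        }
    }
    return page;
}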
I have done this before, but for the life of me I can't remember how this worked.
I have a database with a bunch of rows containing data such as names and ID numbers. I need to populate a TreeView from the names in the database. I am running up against an issue just getting the reader to read multiple rows; it only seems to read the first row and not subsequent ones. The actual task would be similar to the following:
For each row in the database, add a parent node to the TreeView whose name equals reader[4].ToString(). That's about it. At the moment all I am trying to do is pop a MessageBox showing that it's reading the multiple rows.
What am I missing to get this working?
SqlCeConnection conn = null;
try
{
    using (conn = new SqlCeConnection("Data Source =" + ConfigurationFile + "; Password =*********"))
    {
        conn.Open();
        SqlCeCommand cmd = conn.CreateCommand();
        cmd.CommandText = "select * from t_mainprofiles";
        cmd.ExecuteNonQuery();
        var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            ID = (Convert.ToInt32(reader[1]));
            profileID = (Convert.ToInt32(reader[2]));
            profileNAME = (reader[4].ToString().Trim());
            profileLOC = (reader[5].ToString().Trim());
            profileCHILD = (reader[6].ToString().Trim());
        }
        MessageBox.Show(profileNAME);
        reader.Close();
    }
}
catch (Exception error)
{
    MessageBox.Show("" + error);
    System.Diagnostics.Process.GetCurrentProcess().Kill();
}
finally
{
    conn.Close();
}
Try removing the line cmd.ExecuteNonQuery(); — the command only needs to be executed once, via ExecuteReader. Also note that MessageBox.Show(profileNAME) sits outside the while loop, so it fires once with the last row's value; the loop itself is reading every row.
Here is an example from MSDN
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqldatareader.read(v=vs.110).aspx
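Applied to the question's code, a minimal sketch of the intended loop (treeView1 is a hypothetical control name; the MessageBox now sits inside the loop so it fires for every row):

cmd.CommandText = "select * from t_mainprofiles";
using (SqlCeDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        string profileName = reader[4].ToString().Trim();
        MessageBox.Show(profileName);                   // one message per row
        treeView1.Nodes.Add(new TreeNode(profileName)); // one parent node per row
    }
}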
I have a DataTable with a few records. I want to insert all those records into a remote database. What would be the easiest way to do it? I read that most people iterate over the rows of the DataTable and insert record by record. I would like to make just 1 connection to the remote server and do a bulk insert. Is it possible? I am using C# and MySQL.
Although Kemal Taskin's answer is an elegant solution, its performance is horrible with a large DataTable.
I tried it with a 37,500-record insert and it took over 15 minutes.
It seems to insert one record at a time.
I found that if I generate a MySQL INSERT statement string containing 1,000 records and loop over the data until it's complete, I can reduce my insert time to 6 seconds. It's not BULK LOADING, it's CHUNK LOADING. If anyone can come up with a better solution, please let me know.
public void writeToDBTable(DataTable dt)
{
    MySqlConnection conn = new MySqlConnection(globalClass.connString);
    conn.Open();
    String sql = null;
    String sqlStart = "insert into MyTable (run_id, model_id, start_frame, water_year, state_id, obligateCover, DTWoodyCover, perennialGrowth, clonalCover) values ";
    Console.WriteLine("Write to DB - Start. Records to insert = {0}", dt.Rows.Count);
    int x = 0;
    foreach (DataRow row in dt.Rows)
    {
        x += 1;
        if (x == 1)
        {
            sql = String.Format(@"({0},{1},{2},{3},{4},{5},{6},{7},{8})",
                row["runId"],
                row["modelId"],
                row["startFrame"],
                row["waterYear"],
                row["currentFrame"],
                row["obligateCover"],
                row["DTWoodyCover"],
                row["perennialGrowth"],
                row["clonalCover"]);
        }
        else
        {
            sql = String.Format(sql + @",({0},{1},{2},{3},{4},{5},{6},{7},{8})",
                row["runId"],
                row["modelId"],
                row["startFrame"],
                row["waterYear"],
                row["currentFrame"],
                row["obligateCover"],
                row["DTWoodyCover"],
                row["perennialGrowth"],
                row["clonalCover"]);
        }
        if (x == 1000)
        {
            try
            {
                sql = sqlStart + sql;
                MySqlCommand cmd = new MySqlCommand(sql, conn);
                cmd.ExecuteNonQuery();
                Console.WriteLine("Write {0}", x);
                x = 0;
            }
            catch (Exception ex)
            {
                Console.WriteLine(sql);
                Console.WriteLine(ex.ToString());
            }
        }
    }
    // get any stragglers
    if (x > 0)
    {
        try
        {
            sql = sqlStart + sql;
            MySqlCommand cmd = new MySqlCommand(sql, conn);
            cmd.ExecuteNonQuery();
            Console.WriteLine("Write {0}", x);
            x = 0;
        }
        catch (Exception ex)
        {
            Console.WriteLine(sql);
            Console.WriteLine(ex.ToString());
        }
    }
    conn.Close();
    Console.WriteLine("Write to DB - End.");
}
I don't know whether this answer is too late or not :)
You can do something like this:
// assume you have a table with one column
string commandText = "insert into t_test1 (myid) values (@tempid)";
using (MySqlConnection cn = new MySqlConnection(myConnectionString))
{
    cn.Open();
    using (MySqlCommand cmd = new MySqlCommand(commandText, cn))
    {
        cmd.UpdatedRowSource = UpdateRowSource.None;
        cmd.Parameters.Add("@tempid", MySqlDbType.UInt32).SourceColumn = "tempid";
        MySqlDataAdapter da = new MySqlDataAdapter();
        da.InsertCommand = cmd;
        // assume DataTable dt contains one column with name "tempid"
        int records = da.Update(dt);
    }
    cn.Close();
}
For Kemal Taşkın's solution to work, the RowState of each row must be DataRowState.Added.
If that is not the case, do this:
foreach (DataRow row in dt.Rows)
    row.SetAdded();
For Mr.Black's answer: it is recommended to use SQL parameters rather than concatenating data values directly.
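For illustration, a parameterized version of a single-row insert from the chunk-loading answer above might look like this (a sketch; the column list is shortened, and `conn` is an open MySqlConnection with `row` a DataRow):

using (MySqlCommand cmd = new MySqlCommand(
    "insert into MyTable (run_id, model_id) values (@runId, @modelId)", conn))
{
    cmd.Parameters.AddWithValue("@runId", row["runId"]);
    cmd.Parameters.AddWithValue("@modelId", row["modelId"]);
    cmd.ExecuteNonQuery(); // values are escaped by the driver, not concatenated
}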
When importing data into InnoDB, turn off autocommit mode, because it performs a log flush to disk for every insert. To disable autocommit during your import operation, surround it with SET autocommit and COMMIT statements:
SET autocommit=0;
... SQL import statements ...
COMMIT;
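The same effect can be had from Connector/NET by wrapping the inserts in a single transaction; a minimal sketch (table and column names are illustrative):

using (MySqlConnection conn = new MySqlConnection(connString))
{
    conn.Open();
    using (MySqlTransaction tx = conn.BeginTransaction())
    using (MySqlCommand cmd = new MySqlCommand(
        "insert into MyTable (run_id) values (@runId)", conn, tx))
    {
        cmd.Parameters.Add("@runId", MySqlDbType.Int32);
        foreach (DataRow row in dt.Rows)
        {
            cmd.Parameters["@runId"].Value = row["runId"];
            cmd.ExecuteNonQuery();
        }
        tx.Commit(); // one log flush instead of one per row
    }
}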
Performance test: insertion of 5,400 rows into 2 tables.

Insertion from a CSV file: 3 seconds
LOAD DATA INFILE 'data.csv' INTO TABLE myTable FIELDS TERMINATED BY '\t';

Insertion using Kemal Taşkın's solution: 32 seconds
MySqlDataAdapter.Update(DataTable)

Insertion row by row: 41 seconds
INSERT INTO table (columns) VALUES (values);
INSERT INTO table (columns) VALUES (values);
...

Insertion of all rows in one query: 143 seconds
INSERT INTO table (columns) VALUES (values), (values), ...;

=> LOAD DATA is by far the most performant!
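From C#, that LOAD DATA path is wrapped by MySqlBulkLoader in Connector/NET; a sketch, assuming the DataTable has first been written out to a tab-separated file and `conn` is an open MySqlConnection:

// Bulk-load a tab-separated file (data.csv here) into myTable.
MySqlBulkLoader loader = new MySqlBulkLoader(conn)
{
    TableName = "myTable",
    FileName = "data.csv",
    FieldTerminator = "\t"
};
int inserted = loader.Load(); // returns the number of rows loaded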
You can also check this article:
https://dev.mysql.com/doc/refman/8.0/en/insert-optimization.html
I have filled a DataSet with a table created from another database file. That table does NOT exist in the database file I want to copy it to.
Now I want to save all those records (the DataTable) to a newly created SQLite database file.
How can I do that?
Also, I really want to avoid loops if possible.
The best answer is by me :) so I'll share it. This is a loop, but it writes 100k entries in 2-3 seconds.
int n = 0;
using (DbTransaction dbTrans = kaupykliuduomConn.BeginTransaction())
{
    downloadas.Visible = true; // my progressbar
    downloadas.Maximum = dataSet1.Tables["duomenys"].Rows.Count;
    using (DbCommand cmd = kaupykliuduomConn.CreateCommand())
    {
        cmd.CommandText = "INSERT INTO duomenys(Barkodas, Preke, kiekis) VALUES(?,?,?)";
        DbParameter Field1 = cmd.CreateParameter();
        DbParameter Field2 = cmd.CreateParameter();
        DbParameter Field3 = cmd.CreateParameter();
        cmd.Parameters.Add(Field1);
        cmd.Parameters.Add(Field2);
        cmd.Parameters.Add(Field3);
        while (n != dataSet1.Tables["duomenys"].Rows.Count)
        {
            Field1.Value = dataSet1.Tables["duomenys"].Rows[n]["Barkodas"].ToString();
            Field2.Value = dataSet1.Tables["duomenys"].Rows[n]["Preke"].ToString();
            Field3.Value = dataSet1.Tables["duomenys"].Rows[n]["kiekis"].ToString();
            downloadas.Value = n;
            n++;
            cmd.ExecuteNonQuery();
        }
    }
    dbTrans.Commit();
}
In this case dataSet1.Tables["duomenys"] is already filled with all the data I need to transfer to the other database. I used a loop to fill the dataset too.
When you load the DataTable from the source database, set the AcceptChangesDuringFill property of the data adapter to false, so that loaded records are kept in the Added state (assuming the source database is SQL Server):
var sqlAdapter = new SqlDataAdapter("SELECT * FROM the_table", sqlConnection);
DataTable table = new DataTable();
sqlAdapter.AcceptChangesDuringFill = false;
sqlAdapter.Fill(table);
Create the table in the SQLite database by executing the CREATE TABLE statement directly with SQLiteCommand.ExecuteNonQuery.
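For example (hypothetical schema; match the columns and types to the source table):

using (var createCmd = new SQLiteCommand(
    "CREATE TABLE IF NOT EXISTS the_table (id INTEGER PRIMARY KEY, name TEXT)",
    sqliteConnection))
{
    createCmd.ExecuteNonQuery();
}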
Create a new DataAdapter for the SQLite database connection, and use it to Update the db:
var sqliteAdapter = new SQLiteDataAdapter("SELECT * FROM the_table", sqliteConnection);
var cmdBuilder = new SQLiteCommandBuilder(sqliteAdapter);
sqliteAdapter.Update(table);
If the source and target tables have the same column names and compatible types, it should work fine...
Importing SQL data to SQLite row by row can take a long time, especially with millions of records. The shortest and easiest way is to fetch the data from the SQL database into a DataTable and insert all of its rows into the SQLite database inside a single transaction.
public bool ImportDataToSQLiteDatabase(string Proc, string SQLiteDatabase, params object[] obj)
{
    DataTable result = null;
    SqlConnection conn = null;
    SqlCommand cmd = null;
    try
    {
        result = new DataTable();
        using (conn = new SqlConnection(ConStr))
        {
            using (cmd = CreateCommand(Proc, CommandType.StoredProcedure, obj))
            {
                cmd.Connection = conn;
                conn.Open();
                result.Load(cmd.ExecuteReader());
            }
        }
        using (SQLiteConnection con = new SQLiteConnection(string.Format("Data Source={0};Version=3;New=False;Compress=True;Max Pool Size=100;", SQLiteDatabase)))
        {
            con.Open();
            // A single transaction around all inserts keeps this fast.
            using (SQLiteTransaction transaction = con.BeginTransaction())
            {
                foreach (DataRow row in result.Rows)
                {
                    // Note: parameterized commands would be safer than string
                    // concatenation here, as recommended earlier.
                    using (SQLiteCommand sqlitecommand = new SQLiteCommand("insert into table(fh,ch,mt,pn) values ('" + Convert.ToString(row[0]) + "','" + Convert.ToString(row[1]) + "','"
                        + Convert.ToString(row[2]) + "','" + Convert.ToString(row[3]) + "')", con))
                    {
                        sqlitecommand.ExecuteNonQuery();
                    }
                }
                transaction.Commit();
                new General().WriteApplicationLog("Data successfully imported.");
                return true;
            }
        }
    }
    catch (Exception ex)
    {
        result = null;
        return false;
    }
    finally
    {
        if (conn.State == ConnectionState.Open)
            conn.Close();
    }
}
It takes very little time compared to the answers given above.