I am caching data to decrease SQL Server activity.
The first time the data is requested, I fetch it from SQL Server and then use WriteXml to store it on disk.
The second time it is requested (i.e. a file called cacheName exists), I fetch the data with ReadXml.
The data in SQL Server has only one row with a utcDT value of '2012-03-25 02:01'. When I write the DataTable at code point 1, Test1() shows there is only one row, and when I manually inspect the XML file on disk I also see only one row.
However, after reading with ReadXml at code point 2, Test1() shows 2 such rows!
How is this happening?
public static DataTable FetchCache()
{
    DataTable table;
    var cacheName = @"C:\MyCache.xml";
    if (File.Exists(cacheName))
    {
        var ds = new DataSet();
        var fs = new FileStream(cacheName, FileMode.Open, FileAccess.Read, FileShare.Read);
        using (var sr = new StreamReader(fs))
        {
            ds.ReadXml(sr, XmlReadMode.ReadSchema);
        }
        table = ds.Tables[0];
        Test1(table); // Code point 2
    }
    else
    {
        // connectionString, sqlCommand and nullOnError are defined elsewhere in the class
        table = FetchDataTable(connectionString, sqlCommand, nullOnError: nullOnError);
        Test1(table); // Code point 1
        table.WriteXml(cacheName, XmlWriteMode.WriteSchema);
    }
    return table;
}
public static void Test1(DataTable table)
{
    // Collect the rows matching the single expected timestamp so they can be
    // inspected (e.g. in the debugger).
    var rows = table.AsEnumerable()
        .Where(x => x.Field<DateTime>("utcDT").Equals(new DateTime(2012, 03, 25, 02, 01, 00)))
        .ToArray();
}
public static DataTable FetchDataTable(string connectionString, string sqlCommand, bool nullOnError = false)
{
    DataSet ds = new DataSet();
    DataTable dt = new DataTable();
    try
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlDataAdapter da = new SqlDataAdapter(sqlCommand, conn);
            da.SelectCommand.CommandTimeout = 600;
            ds.Reset();
            da.Fill(ds);
            dt = ds.Tables[0];
        }
    }
    catch (Exception err)
    {
        if (nullOnError)
            return dt;
        throw new Exception("[Utils.MsSQLS.FetchDataTable] Could not get data: " + err.Message);
    }
    return dt;
}
Just in case anyone else has the same issue: I found the answer.
The DateTime columns in the DataTable fetched from SQL Server had their DateTimeMode set to UnspecifiedLocal (the default), but the data they contain is actually UTC. '2012-03-25 02:01' falls inside the spring DST changeover, so treating it as local time during the XML round trip evidently shifts values and produces the duplicate match.
In order to save the cache as XML, I have to create a new DataTable clone from the table I fetch from SQL, change the DateTimeMode of the DateTime columns to Utc, and then copy from the old table to the new one (row by row, using an ItemArray copy).
This is cumbersome, but it gets rid of the issue.
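For anyone wanting the concrete shape of that workaround, here is a minimal sketch (the helper name ToUtcTable is mine, for illustration, not from the original code):

// Minimal sketch of the clone-and-copy workaround described above.
public static DataTable ToUtcTable(DataTable source)
{
    DataTable clone = source.Clone(); // same schema, no rows yet
    foreach (DataColumn col in clone.Columns)
    {
        if (col.DataType == typeof(DateTime))
            col.DateTimeMode = DataSetDateTime.Utc; // must be changed while the table is empty
    }
    foreach (DataRow row in source.Rows)
        clone.Rows.Add(row.ItemArray); // row-by-row ItemArray copy
    return clone;
}

The clone is then what gets passed to WriteXml in place of the original table.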
In my database I have a table gambar with an id column and a pic column. The pic column holds images that were converted to strings. I want to put the id and pic in a DataGridView, but since pic contains a string I have to convert it back from Base64 first, and I don't know how to display the result in the DataGridView.
Code:
using (MySqlConnection mysqlCon = new MySqlConnection(ConfigurationManager.ConnectionStrings["konekKuy"].ConnectionString))
{
mysqlCon.Open();
string insertQuery = "SELECT * FROM gambar";
MySqlCommand command = new MySqlCommand(insertQuery, mysqlCon);
MySqlDataAdapter sqlDa = new MySqlDataAdapter(command);
DataTable dtblLaris = new DataTable();
sqlDa.Fill(dtblLaris);
if (dtblLaris.Rows.Count > 0)
{
dataGridView1.DataSource = dtblLaris;
dataGridView1.Columns["id"].HeaderText = "ID";
dataGridView1.Columns["pic"].HeaderText = "PHOTO";
byte[] imageBytes = Convert.FromBase64String(dataGridView1.Columns[1].ToString());
using (var ms = new MemoryStream(imageBytes, 0, imageBytes.Length))
{
Image image = Image.FromStream(ms, true);
}
DataGridViewImageColumn imageColumn = new DataGridViewImageColumn();
imageColumn = (DataGridViewImageColumn)dataGridView1.Columns[1];
imageColumn.ImageLayout = DataGridViewImageCellLayout.Stretch;
}
sqlDa.Dispose();
mysqlCon.Close();
}
Do you know any better way to display it in a DataGridView?
I think the code as it is should work fine... but it is pretty unstructured and hard to read / understand.
First of all, it is always good practice to separate your code into dedicated methods. This follows a common principle in software engineering called the Single Responsibility Principle.
If we apply the principle to your code, it would look something like this:
private void Load()
{
// 1) Execute Sql query
DataTable originDataTable = QueryData("konekKuy", "SELECT * FROM gambar");
// 2) If query returns results
if (originDataTable.Rows.Count > 0)
{
// 3) Convert queried DataTable to DataTable with Image
DataTable convertedDataTable = ConvertDataTable(originDataTable);
// 4) Set the converted DataTable as DataGridView DataSource
LoadDataIntoDataGrid(convertedDataTable);
}
else
{
MessageBox.Show("No data to display!");
}
}
public DataTable QueryData(string connectionString, string queryCommand)
{
DataTable dataTable = new DataTable();
using (var sqlConnection = new MySqlConnection(ConfigurationManager.ConnectionStrings[connectionString].ConnectionString))
{
MySqlCommand command = new MySqlCommand(queryCommand, sqlConnection);
MySqlDataAdapter adapter = new MySqlDataAdapter(command);
sqlConnection.Open();
adapter.Fill(dataTable);
adapter.Dispose();
sqlConnection.Close();
}
return dataTable;
}
public DataTable ConvertDataTable(DataTable originDataTable)
{
// Create the new DataTable with their uppercase titles and their new types.
var convertedDataTable = new DataTable();
convertedDataTable.Columns.Add("ID", typeof(string));
convertedDataTable.Columns.Add("PHOTO", typeof(Image));
// Loop through original DataTable rows and convert base64 string to Image with 'ConvertBase64ToImage' method
foreach (DataRow dataRow in originDataTable.Rows)
{
var id = dataRow.Field<string>("id");
var image = ConvertBase64ToImage(dataRow.Field<string>("pic"));
convertedDataTable.Rows.Add(id, image);
}
return convertedDataTable;
}
public Image ConvertBase64ToImage(string base64)
{
    var bytes = Convert.FromBase64String(base64);
    using (var memory = new MemoryStream(bytes))
    using (var image = Image.FromStream(memory, true))
    {
        // Image.FromStream requires the stream to stay open for the image's
        // lifetime, so copy into a Bitmap before the stream is disposed.
        return new Bitmap(image);
    }
}
public void LoadDataIntoDataGrid(DataTable convertedDataTable)
{
dataGridView1.DataSource = convertedDataTable;
}
QueryData contains all the code responsible for getting your data from the database. After the SQL query has been executed, the method returns a DataTable containing the data from your database.
The ConvertDataTable method takes the DataTable from the database query and creates a new DataTable which holds the same data as the original, except that the Base64 string is converted to an Image.
ConvertBase64ToImage contains the code that converts the Base64 string to an Image and returns it.
The LoadDataIntoDataGrid method accepts a DataTable as a parameter and handles the insertion of the data into the UI's DataGridView.
Creating dedicated methods for each action / responsibility makes your code much easier to understand and maintain. It also makes your code testable through unit tests, which is a big deal!
I hope this helped you clean up your code.
I have a SQL Server stored procedure that runs two SELECT statements. I can easily return one SELECT statement and store it in a DataTable, but how do I use two?
How can I set my variable countfromfirstselectstatement equal to the count returned from my first SELECT statement, and my variable countfromsecondselectstatement equal to the count returned from the second?
private void ReturnTwoSelectStatements()
{
DataSet dt112 = new DataSet();
using (var con = new SqlConnection(connectionString))
{
using (var cmd = new SqlCommand("Return2SelectStatements", con))
{
using (var da = new SqlDataAdapter(cmd))
{
cmd.CommandType = CommandType.StoredProcedure;
da.Fill(dt112);
}
}
}
DataTable table1 = dt112.Tables[0];
DataTable table2 = dt112.Tables[1];
string countfromFirstSelectStatement = table1.Rows.Count().ToString();
string countfromSecondSelectStatement = table2.Rows.Count().ToString();
//I only want to iterate the data from the 1st select statement
foreach (DataRow row in dt112.Rows)
{
}
}
You could also directly use a DbDataReader (cmd.ExecuteReader()), which gives you NextResult to advance to the next result set. DataTable.Load allows you to load rows from the data reader into the table.
DbDataAdapter is really a bit of overkill for just reading data into a DataTable. It's designed to allow the whole CRUD breadth of operations for controls that are abstracted away from the real source of data, which isn't really your case.
Sample:
using (var reader = cmd.ExecuteReader())
{
dataTable1.Load(reader);
if (!reader.NextResult()) throw new InvalidOperationException("Expected a second result set.");
dataTable2.Load(reader);
}
Fill a DataSet with both SELECT statements and access its tables:
....
DataSet dataSet = new DataSet();
da.Fill(dataSet);
....
DataTable table1 = dataSet.Tables[0];
DataTable table2 = dataSet.Tables[1];
string countfromFirstSelectStatement = table1.Rows.Count.ToString();
string countfromSecondSelectStatement = table2.Rows.Count.ToString();
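Putting that together with the stored procedure call from the question, a complete version might look like this (a sketch; Return2SelectStatements and connectionString are taken from the question):

var dataSet = new DataSet();
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("Return2SelectStatements", con))
using (var da = new SqlDataAdapter(cmd))
{
    cmd.CommandType = CommandType.StoredProcedure;
    da.Fill(dataSet); // Fill produces one DataTable per SELECT in the procedure
}
string countfromFirstSelectStatement = dataSet.Tables[0].Rows.Count.ToString();
string countfromSecondSelectStatement = dataSet.Tables[1].Rows.Count.ToString();
// Iterate only the rows returned by the first SELECT:
foreach (DataRow row in dataSet.Tables[0].Rows)
{
    // ...
}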
I am trying to read from an SQLite database into a DataTable using SQLiteDataAdapter. The data in the database contains fractions with forward slashes (such as '1/2') stored as VARCHAR. After filling the DataTable, the forward slashes have been removed along with all trailing characters. I have confirmed that the database does contain the forward slashes.
This is my code:
DataTable dt = new DataTable();
using (SQLiteConnection m_dbConnection = new SQLiteConnection(connection))
{
string sql = "Select Quantity From Recipes WHERE ID=#id;";
using (SQLiteDataAdapter da = new SQLiteDataAdapter(sql, m_dbConnection))
{
da.AcceptChangesDuringUpdate = true;
da.AcceptChangesDuringFill = true;
da.SelectCommand.Parameters.Add(new SQLiteParameter("@id") { Value = ID });
da.Fill(dt);
}
}
The result in Quantity should be 1/2, but it returns 1 instead. This is different from the problem most posts refer to, where people are trying to insert forward slashes; I couldn't find any information about reading them back.
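One way to narrow this down (a diagnostic sketch, assuming the fill above succeeds) is to check which .NET type the adapter inferred for the column, since a non-string type would mean the '1/2' text is being coerced during Fill:

// Diagnostic sketch: a type other than System.String would mean the value
// is being coerced (e.g. parsed as a number or date) rather than read as text.
Console.WriteLine(dt.Columns["Quantity"].DataType);
if (dt.Rows.Count > 0)
    Console.WriteLine(dt.Rows[0]["Quantity"]);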
I am migrating my program from Microsoft SQL Server to MySQL. Everything works well except one issue with bulk copy.
In the solution with MS SQL the code looks like this:
connection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
bulkCopy.DestinationTableName = "testTable";
bulkCopy.WriteToServer(rawData);
Now I am trying to do something similar for MySQL. Because I suspect the performance would be bad, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Any help would be highly appreciated.
Because I suspect the performance would be bad, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It consistently took about 30 seconds to run:
using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
using (MySqlCommand cmd = new MySqlCommand())
{
cmd.Connection = conn;
cmd.Transaction = tran;
cmd.CommandText = "SELECT * FROM testtable";
using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
{
da.UpdateBatchSize = 1000;
using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
{
da.Update(rawData);
tran.Commit();
}
}
}
}
(I tried a couple of different values for UpdateBatchSize but they didn't seem to have a significant impact on the elapsed time.)
By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
Rfc4180Writer.WriteDataTable(rawData, writer, false);
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);
... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this), bulk-loading from that file, and deleting the file afterwards.
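The Rfc4180Writer helper above came from a linked example; as a rough, hypothetical stand-in (not the original code), a minimal RFC 4180-style writer could look like this:

// Hypothetical minimal stand-in for the Rfc4180Writer helper referenced above:
// writes each DataTable row as a CSV line, quoting every field and doubling
// embedded quotes, which matches the FieldQuotationCharacter setting used above.
// (Requires System.Data, System.IO and System.Linq.)
public static class Rfc4180Writer
{
    public static void WriteDataTable(DataTable table, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
            writer.WriteLine(string.Join(",", table.Columns.Cast<DataColumn>()
                .Select(c => Quote(c.ColumnName))));
        foreach (DataRow row in table.Rows)
            writer.WriteLine(string.Join(",", row.ItemArray
                .Select(v => Quote(Convert.ToString(v)))));
    }

    private static string Quote(string value)
    {
        return "\"" + (value ?? string.Empty).Replace("\"", "\"\"") + "\"";
    }
}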
Similar to SqlBulkCopy, there is MySqlBulkCopy for MySQL.
Here is an example of how to use it:
public async Task<bool> MySqlBulkCopyAsync(DataTable dataTable)
{
try
{
bool result = true;
using (var connection = new MySqlConnector.MySqlConnection(_connString + ";AllowLoadLocalInfile=True"))
{
await connection.OpenAsync();
var bulkCopy = new MySqlBulkCopy(connection);
bulkCopy.DestinationTableName = "yourtable";
// the column mapping is required if you have a identity column in the table
bulkCopy.ColumnMappings.AddRange(GetMySqlColumnMapping(dataTable));
await bulkCopy.WriteToServerAsync(dataTable);
return result;
}
}
catch (Exception)
{
throw;
}
}
private List<MySqlBulkCopyColumnMapping> GetMySqlColumnMapping(DataTable dataTable)
{
List<MySqlBulkCopyColumnMapping> colMappings = new List<MySqlBulkCopyColumnMapping>();
int i = 0;
foreach (DataColumn col in dataTable.Columns)
{
colMappings.Add(new MySqlBulkCopyColumnMapping(i, col.ColumnName));
i++;
}
return colMappings;
}
You can ignore the column mapping if you don't have an identity column in your table.
If you do have an identity column, then you have to use the column mapping; otherwise it won't insert any records into the table.
It will just give a message like "x rows were copied but only 0 rows were inserted".
This class is available in the following library:
Assembly MySqlConnector, Version=1.0.0.0
Using any of the BulkOperation NuGet packages, you can easily have this done.
Here is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/:
MySqlConnection conn = DbConnection.OpenConnection();
DataTable dt = new DataTable("testtable");
MySqlDataAdapter da = new MySqlDataAdapter("SELECT * FROM testtable", conn);
MySqlCommandBuilder cb = new MySqlCommandBuilder(da);
da.Fill(dt);
Instead of using
......
da.UpdateBatchSize = 1000;
......
da.Update(dt)
just use the following two lines:
var bulk = new BulkOperation(conn);
bulk.BulkInsert(dt);
will take only 5 seconds to copy the whole DataTable into MySQL without first dumping the 100,000 rows from the DataTable to a temporary CSV file.
I am using a SqlBulkCopy object to write a DataTable into a SQL Server table. However, every time I recheck my database, it remains intact with no changes.
I have tried searching Google to determine my problem, but I am unable to resolve it.
The DataTable came from an .xls file.
public static DataTable dt = new DataTable();
private void ExportToGrid(String path, String filen)
{
int idx = filen.IndexOf(".");
string tf = filen.Remove(idx, 4);
OleDbConnection MyConnection = null;
DataSet DtSet = null;
OleDbDataAdapter MyCommand = null;
MyConnection = new OleDbConnection("provider=Microsoft.Jet.OLEDB.4.0; Data Source='" + path + "';Extended Properties=Excel 8.0;");
ArrayList TblName = new ArrayList();
MyConnection.Open();
DataTable schemaTable = MyConnection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });
foreach (DataRow row in schemaTable.Rows)
{
TblName.Add(row["TABLE_NAME"]);
}
MyCommand = new System.Data.OleDb.OleDbDataAdapter("select * from [" + TblName[0].ToString() + "]", MyConnection);
DtSet = new System.Data.DataSet();
MyCommand.Fill(DtSet);
MyCommand.FillSchema(DtSet, SchemaType.Source);
DataTable dt = new DataTable();
dt = DtSet.Tables[0];
Session["dt"] = dt;
int x = dt.Rows.Count;
MyConnection.Close();
if (dt.Rows.Count > 0)
{
theGridView.DataSource = dt;
theGridView.DataBind();
}
if (System.IO.File.Exists(path))
{
System.IO.File.Delete(path);
}
}
This is my writer function
private void StartImport()
{
string servername = server;
string database = database;
string tbl = "dbo.LinkDb";
Stopwatch sw = new Stopwatch();
sw.Start();
SqlBulkCopy bulkCopy = new SqlBulkCopy("Data Source=" + servername + ";Initial Catalog=" + database + ";Integrated Security=SSPI", SqlBulkCopyOptions.TableLock);
bulkCopy.DestinationTableName = tbl;
bulkCopy.WriteToServer(dt);
sw.Stop();
lblResult.Visible = true;
lblResult.Text = (sw.ElapsedMilliseconds / 1000.00).ToString();
}
Below are screenshots of the tables stored in my SQL Server. I assure you that I have been complying with case-sensitivity rules.
There was no exception thrown, and the average time elapsed is 0.018 - 0.020 secs.
I'd appreciate any help.
Thank you.
Based on the code you have posted, you are writing an empty DataTable to the database. Your ExportToGrid method fills dt, a DataTable declared locally, which goes out of scope outside of the method. Your writer function is using the static DataTable dt, which is still a new, empty DataTable.
Does dt need to be static? It seems as though it could be declared as
private DataTable dt;
then inside ExportToGrid, instead of declaring another DataTable, just instantiate the already-declared dt rather than declaring a new one:
dt = new DataTable();
Alternatively you could extract the DataTable straight from the GridView during the write Method:
DataTable dt = (DataTable)theGridView.DataSource;
bulkCopy.WriteToServer(dt);
This removes the need for variables outside of the scope of the method.
Lastly, since you are storing your DataTable within the session (I am not generally an advocate of storing large amounts of data in session variables, but without knowing the specifics of your site I cannot really pass judgement), you could use the following:
DataTable dt = (DataTable)Session["dt"];
bulkCopy.WriteToServer(dt);
I don't see anything obvious compared to my usage, except for the fact that I explicitly map columns from the DataTable to the database table.
Using cn As New SqlConnection(DataAccessResource.CONNECTIONSTRING)
cn.Open()
Using copy As New SqlBulkCopy(cn)
copy.BulkCopyTimeout = 300
copy.ColumnMappings.Add(0, 0)
copy.ColumnMappings.Add(1, 1)
copy.ColumnMappings.Add(2, 2)
copy.ColumnMappings.Add(3, 3)
copy.DestinationTableName = "Tablename"
copy.WriteToServer(dataset.datatable)
End Using
End Using
The connection string (SQL Server 2000!) looks like
"data source=DBSERVERNAME;initial catalog=DBNAME;persist security info=True;user id=USERNAME;password=PASSWORD;packet size=4096"
I doubt the connection string is the problem, assuming you've used it elsewhere.
Finally, have you checked that the data types of the columns in the DataSet's DataTable match the ones in the database? In my experience, an OleDb load from Excel does not always produce the output you might expect; date fields and columns with mixed text and numbers are particular problems.
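As a quick check on that last point, here is a small sketch (in C#; it assumes dt is the DataTable filled from Excel in the ExportToGrid code above) that lists the column types the OleDb load inferred, for comparison against the destination table's schema:

// Diagnostic sketch: print each column's inferred .NET type so it can be
// compared with the SQL Server table definition.
foreach (DataColumn col in dt.Columns)
{
    Console.WriteLine("{0}: {1}", col.ColumnName, col.DataType);
}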