I am trying to read from an SQLite database into a DataTable using SQLiteDataAdapter. The data in the database contains forward slashes in fractions stored as VARCHAR. After filling the DataTable, the forward slashes have been removed, along with all trailing characters. I have confirmed that the database does contain the forward slashes.
This is my code:
DataTable dt = new DataTable();
using (SQLiteConnection m_dbConnection = new SQLiteConnection(connection))
{
    string sql = "Select Quantity From Recipes WHERE ID = @id;";
    using (SQLiteDataAdapter da = new SQLiteDataAdapter(sql, m_dbConnection))
    {
        da.AcceptChangesDuringUpdate = true;
        da.AcceptChangesDuringFill = true;
        da.SelectCommand.Parameters.Add(new SQLiteParameter("@id") { Value = ID });
        da.Fill(dt);
    }
}
The result in Quantity should be 1/2, but 1 is returned instead. This is different from the problem most posts refer to, where people are trying to insert forward slashes; I couldn't find any information about reading them.
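One way to narrow this down (a diagnostic sketch, not a confirmed fix; the table, column, and parameter are taken from the question) is to bypass the adapter and check what the provider actually hands back for that column:

// Read the raw value and its CLR type straight from a SQLiteDataReader.
// If GetFieldType reports a numeric type here, the truncation happens in the
// provider's type conversion, before DataTable.Fill is ever involved.
using (SQLiteConnection conn = new SQLiteConnection(connection))
using (SQLiteCommand cmd = new SQLiteCommand("Select Quantity From Recipes WHERE ID = @id;", conn))
{
    cmd.Parameters.Add(new SQLiteParameter("@id") { Value = ID });
    conn.Open();
    using (SQLiteDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            Console.WriteLine("{0} ({1})", reader.GetValue(0), reader.GetFieldType(0));
        }
    }
}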
Is there a way I can use a SQL adapter and convert the SQL query results into DataColumns? I'm kind of new to DataTables. I need to build this dynamically, since my column names are stored in a SQL table. I keep ending up with DataRows, not DataColumns. What I have built so far:
string feedcolumns = @"select FeedColumnName from myTable where FeedProcessID = @feedprocessid";
SqlCommand columnscommand = new SqlCommand(feedcolumns, connUpd);
DataTable dt = new DataTable("datafeed");
foreach (DataColumn dc in dt.Columns)
{
    dc = new
    dt.Columns.Add(dc);
}
You can fill a DataTable directly from a SqlDataReader; there is no need to go column by column. One thing I did notice was that your SQL statement had a parameter in it that was never assigned to the command object, so I added it in:
string feedcolumns = "select FeedColumnName from myTable where FeedProcessID = @feedprocessid";
DataTable dt = new DataTable("datafeed");
using (SqlConnection connUpd = new SqlConnection("MyConnection"))
{
    using (SqlCommand columnscommand = new SqlCommand(feedcolumns, connUpd))
    {
        columnscommand.Parameters.AddWithValue("@feedprocessid", feedprocessid);
        connUpd.Open();
        using (var dataReader = columnscommand.ExecuteReader())
        {
            // DataTable.Load consumes the reader and creates the columns for you.
            dt.Load(dataReader);
        }
    }
}
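If the goal is then to turn the returned names into actual columns of a new table, here is a short follow-up sketch (assuming the query returns one FeedColumnName string per row, and that string columns are acceptable for the feed; feedTable is an illustrative name):

// Build the feed table dynamically: one DataColumn per name the query returned.
DataTable feedTable = new DataTable("feed");
foreach (DataRow row in dt.Rows)
{
    feedTable.Columns.Add(row.Field<string>("FeedColumnName"), typeof(string));
}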
With my program I have a DataTable that gets populated with records fetched from a database. This is displayed in a DataGridView, and when a cell is clicked it loads all the values into textboxes. When a save button is clicked, the textbox values are saved back into the DataTable. However, how can I send this DataTable back to the database and have it update the records?
Here is my code to load the records:
indexRow = e.RowIndex;
DataGridViewRow row = dgv_ReturnSearch.Rows[indexRow];
tb_editFirstName.Text = row.Cells[1].Value.ToString();
tb_editLastName.Text = row.Cells[2].Value.ToString();
tb_editAge.Text = row.Cells[3].Value.ToString();
tb_editPostCode.Text = row.Cells[4].Value.ToString();
tb_editMobNum.Text = row.Cells[5].Value.ToString();
tb_editEmail.Text = row.Cells[6].Value.ToString();
tb_editAllergies.Text = row.Cells[7].Value.ToString();
tb_editDOB.Text = row.Cells[8].Value.ToString();
tb_editGender.Text = row.Cells[9].Value.ToString();
Here is my code to save them:
DataGridViewRow newDataRow = dgv_ReturnSearch.Rows[indexRow];
newDataRow.Cells[1].Value = tb_editFirstName.Text;
newDataRow.Cells[2].Value = tb_editLastName.Text;
newDataRow.Cells[3].Value = tb_editAge.Text;
newDataRow.Cells[4].Value = tb_editPostCode.Text;
Logic.SQLQueriesUtility.Adapter.Update(dt);
However, this doesn't actually update the database, only the local DataTable. When it is loaded again, all the changes revert.
Thanks
To load a grid view with data from the database you'll need to use a DataTable and a DataAdapter, then bind the grid. It should look something like this:
private void CustomersBindGrid()
{
    using (SqlConnection con = new SqlConnection(mycon))
    {
        using (SqlCommand cmd = new SqlCommand())
        {
            cmd.CommandText = "SELECT * FROM Customers";
            cmd.Connection = con;
            DataTable dt = new DataTable();
            using (SqlDataAdapter sda = new SqlDataAdapter(cmd))
            {
                // Fill opens and closes the connection itself if it isn't already open,
                // and the using block disposes it, so no explicit Close is needed.
                sda.Fill(dt);
                Gridview1.DataSource = dt;
                Gridview1.DataBind();
            }
        }
    }
}
Try updating the database directly:
private void SaveEdits_Click(object sender, EventArgs e)
{
    using (SqlConnection con = new SqlConnection(mycon))
    {
        con.Open();
        using (SqlCommand cmd = new SqlCommand())
        {
            cmd.CommandText = "UPDATE Customers SET firstname = @FN, lastname = @LN, age = @AG, postcode = @PC WHERE CustomerID = @CustID";
            cmd.Connection = con;
            cmd.Parameters.AddWithValue("@CustID", cid);
            cmd.Parameters.AddWithValue("@FN", tb_editFirstName.Text);
            cmd.Parameters.AddWithValue("@LN", tb_editLastName.Text);
            cmd.Parameters.AddWithValue("@AG", tb_editAge.Text);
            cmd.Parameters.AddWithValue("@PC", tb_editPostCode.Text);
            cmd.ExecuteNonQuery();
        }
    }
    CustomersBindGrid();
    MessageBox.Show("Information Updated!");
}
You need to adapt this UPDATE command to your columns in the database, and find a way to read the customer ID so that the WHERE condition actually works and the right row gets updated.
In your post you don't say how your client is developed. Normally this would be done with a SQL UPDATE command; for that (in the Microsoft universe) SqlClient (the System.Data.SqlClient namespace) can be used. As soon as the user presses the save button, you copy the relevant TextBox.Text values into variables, convert the data types if necessary, and generate a SQL UPDATE statement to update the data on your SQL Server.
Example SQL UPDATE statement:
Update TableName set ColumnName1Text = 'some text', ColumnName2Integer = 12, ColumnName... = ... where ColumnNameKey = KeyToDatatable
Then you submit the SQL command (the UPDATE string) to the SQL server via SqlClient (SqlCommand). For that you need a unique key for the WHERE clause (KeyToDatatable in the example) so that only the row with that key is updated. The key is normally queried from the DataTable along with the other fields used to show the data (in your grid) and then carried over to the UPDATE command (it can be hidden from the user, who doesn't need to know it).
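A minimal sketch of that key hand-over, reusing the question's controls (the hidden CustomerID grid column is an assumption for illustration, not from the original post):

// Read the key from a (possibly hidden) grid column, then update only that row.
int cid = (int)dgv_ReturnSearch.Rows[indexRow].Cells["CustomerID"].Value;

using (SqlConnection con = new SqlConnection(mycon))
using (SqlCommand cmd = new SqlCommand(
    "UPDATE Customers SET firstname = @FN WHERE CustomerID = @CustID", con))
{
    cmd.Parameters.AddWithValue("@FN", tb_editFirstName.Text);
    cmd.Parameters.AddWithValue("@CustID", cid);
    con.Open();
    cmd.ExecuteNonQuery();
}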
Well, I managed to fix it by doing:
DataGridViewRow newDataRow = dgv_ReturnSearch.Rows[indexRow];
dt.Rows[indexRow]["first_name"] = tb_editFirstName.Text;
dt.Rows[indexRow]["last_name"] = tb_editLastName.Text;
dt.Rows[indexRow]["age"] = tb_editAge.Text;
dt.Rows[indexRow]["postcode"] = tb_editPostCode.Text;
dt.Rows[indexRow]["mobile_num"] = tb_editMobNum.Text;
dt.Rows[indexRow]["email"] = tb_editEmail.Text;
dt.Rows[indexRow]["allergies"] = tb_editAllergies.Text;
dt.Rows[indexRow]["DOB"] = tb_editDOB.Text;
dt.Rows[indexRow]["gender"] = tb_editGender.Text;
Logic.SQLQueriesUtility.Adapter.Update(dt);
This works perfectly instead of what I was doing before, and any changes are saved back to the database. Editing the DataTable rows directly (rather than the grid cells) marks those rows as Modified, which is what Adapter.Update looks for when generating its UPDATE statements.
I have a SQL Server stored procedure that runs two SELECT statements. I can easily return one SELECT statement and store it in a DataTable, but how do I use two?
How can I set my variable countfromfirstselectstatement equal to the count returned from my first SELECT statement, and countfromsecondselectstatement equal to the count returned from the second?
private void ReturnTwoSelectStatements()
{
    DataSet dt112 = new DataSet();
    using (var con = new SqlConnection(connectionString))
    {
        using (var cmd = new SqlCommand("Return2SelectStatements", con))
        {
            using (var da = new SqlDataAdapter(cmd))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                da.Fill(dt112);
            }
        }
    }
    DataTable table1 = dt112.Tables[0];
    DataTable table2 = dt112.Tables[1];
    string countfromFirstSelectStatement = table1.Rows.Count.ToString();
    string countfromSecondSelectStatement = table2.Rows.Count.ToString();

    //I only want to iterate the data from the 1st select statement
    foreach (DataRow row in table1.Rows)
    {
    }
}
You could also directly use the DbDataReader (cmd.ExecuteReader()), which gives you NextResult to advance to the next result set; DataTable.Load lets you load rows from the data reader into the table.
DbDataAdapter is really a bit of overkill for just reading data into a DataTable: it's designed to allow the whole CRUD breadth of operations for controls that are abstracted away from the real source of data, which isn't really your case.
Sample:
using (var reader = cmd.ExecuteReader())
{
    dataTable1.Load(reader);
    if (!reader.NextResult()) throw SomethingWhenTheresNoSecondResultSet();
    dataTable2.Load(reader);
}
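With both tables loaded, the counts the question asks for fall straight out (the same pattern the DataSet-based answer below uses):

// Row counts for each result set:
string countfromFirstSelectStatement = dataTable1.Rows.Count.ToString();
string countfromSecondSelectStatement = dataTable2.Rows.Count.ToString();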
Fill a DataSet with both SELECT statements and access it:
....
DataSet dataSet = new DataSet();
da.Fill(dataSet);
....
DataTable table1 = dataSet.Tables[0];
DataTable table2 = dataSet.Tables[1];
string countfromFirstSelectStatement = table1.Rows.Count.ToString();
string countfromSecondSelectStatement = table2.Rows.Count.ToString();
I have a CSV file with 7 columns, which a user has to upload so it can be added to the database.
I found some help in reading the CSV and putting all the info into a single table; however, the data has to be spread over three tables.
My code for inserting all the data into one table:
protected void Upload(object sender, EventArgs e)
{
    //Upload and save the file
    string csvPath = Server.MapPath("~/Temp/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
    FileUpload1.SaveAs(csvPath);
    DataTable dt = new DataTable();
    dt.Columns.AddRange(new DataColumn[7] {
        new DataColumn("Title", typeof(string)),
        new DataColumn("Artist", typeof(string)),
        new DataColumn("Years", typeof(string)),
        new DataColumn("Position", typeof(string)),
        new DataColumn("Senddate", typeof(string)),
        new DataColumn("Sendfrom", typeof(string)),
        new DataColumn("Sendtill", typeof(string))});
    string csvData = File.ReadAllText(csvPath);
    foreach (string row in csvData.Split('\n'))
    {
        if (!string.IsNullOrEmpty(row))
        {
            dt.Rows.Add();
            int i = 0;
            foreach (string cell in row.Split(';'))
            {
                dt.Rows[dt.Rows.Count - 1][i] = cell;
                i++;
            }
        }
    }
    string consString = ConfigurationManager.ConnectionStrings["connection"].ConnectionString;
    using (SqlConnection con = new SqlConnection(consString))
    {
        using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
        {
            //Set the database table name
            sqlBulkCopy.DestinationTableName = "dbo.ingevoerd";
            con.Open();
            sqlBulkCopy.WriteToServer(dt);
            con.Close();
        }
    }
}
As you can see, it takes 7 columns and puts them into the table [dbo].[ingevoerd].
How can I split the data to put the columns 'Title' and 'Years' in a table called Song, 'Artist' in a table called Artiest, and 'Position', 'Senddate', 'Sendfrom' and 'Sendtill' in a table called Lijst?
For more information, leave a comment.
IMHO this is not the best way to handle this upload, because the content is not flat data you can bulk upload in a breeze; there are several entities (at least 3) that should be linked.
I would go with the 'old style' approach of calling an insert for each row, with proper parameters.
You are already looping through the whole recordset when reading the CSV, so I would do something like:
protected void Upload(object sender, EventArgs e)
{
    //Upload and save the file
    string csvPath = Server.MapPath("~/Temp/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
    FileUpload1.SaveAs(csvPath);
    string consString = ConfigurationManager.ConnectionStrings["connection"].ConnectionString;
    using (SqlConnection con = new SqlConnection(consString))
    {
        con.Open();
        using (SqlTransaction tran = con.BeginTransaction())
        using (SqlCommand cmd = new SqlCommand())
        {
            cmd.Connection = con;
            cmd.Transaction = tran;
            cmd.CommandType = System.Data.CommandType.StoredProcedure;
            cmd.CommandText = "your_sp_name_here";
            cmd.Parameters.Add(new SqlParameter("@title", System.Data.SqlDbType.NVarChar));
            cmd.Parameters.Add(new SqlParameter("@artist", System.Data.SqlDbType.NVarChar));
            // other parameters follow
            // ...
            string csvData = File.ReadAllText(csvPath);
            foreach (string row in csvData.Split('\n'))
            {
                if (!string.IsNullOrEmpty(row))
                {
                    // for every row, split the line and call the command with proper parameter values
                    string[] cells = row.Split(';');
                    cmd.Parameters["@title"].Value = cells[0];
                    cmd.Parameters["@artist"].Value = cells[1];
                    // ...
                    cmd.ExecuteNonQuery();
                }
            }
            // when done commit the transaction
            tran.Commit();
        }
    }
}
Inside your stored procedure, handle the 'split' of the data into the relevant tables, taking all the steps required to avoid duplicates and possibly linking the data among the tables:
create procedure your_sp_name_here(@title nvarchar(50), @artist nvarchar(50), @year int)
as
begin
    -- add logic & checks here if needed
    -- ...
    -- ...
    -- if everything is ok insert the rows
    insert into songs (title, year) values (@title, @year)
    insert into Artiest (Artist) values (@artist)
end
Have you looked into column mappings?
Check out stackoverflow.com/questions/17469349/mapping-columns-in-a-datatable-to-a-sql-table-with-sqlbulkcopy
I am migrating my program from Microsoft SQL Server to MySQL. Everything works well except for one issue with bulk copy.
In the solution with MS SQL, the code looks like this:
connection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
bulkCopy.DestinationTableName = "testTable";
bulkCopy.WriteToServer(rawData);
Now I try to do something similar for MySQL. Because I think there would be bad performance I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Any help would be highly appreciated.
Because I think there would be bad performance I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter.Update() inside a transaction. It consistently took about 30 seconds to run:
using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
                da.Update(rawData);
                tran.Commit();
            }
        }
    }
}
(I tried a couple of different values for UpdateBatchSize but they didn't seem to have a significant impact on the elapsed time.)
By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
    Rfc4180Writer.WriteDataTable(rawData, writer, false);
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);
... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this), bulk-loading from that file, and deleting the file afterwards.
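The Rfc4180Writer referenced above came from an external link that isn't reproduced here. As a rough stand-in (an assumption about its shape, not the original linked code), a minimal version matching the call above could look like this:

// Minimal CSV dump of a DataTable: every field quoted, embedded quotes doubled
// (the RFC 4180 quoting rules). The third argument controls the header row,
// matching the 'false' passed in the snippet above.
static void WriteDataTable(DataTable table, TextWriter writer, bool includeHeaders)
{
    if (includeHeaders)
    {
        writer.WriteLine(string.Join(",",
            table.Columns.Cast<DataColumn>().Select(c => Quote(c.ColumnName))));
    }
    foreach (DataRow row in table.Rows)
    {
        writer.WriteLine(string.Join(",",
            row.ItemArray.Select(v => Quote(Convert.ToString(v)))));
    }
}

static string Quote(string s)
{
    return "\"" + s.Replace("\"", "\"\"") + "\"";
}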
Similar to SqlBulkCopy, we have MySqlBulkCopy for MySQL.
Here is an example of how to use it:
public async Task<bool> MySqlBulkCopyAsync(DataTable dataTable)
{
    using (var connection = new MySqlConnector.MySqlConnection(_connString + ";AllowLoadLocalInfile=True"))
    {
        await connection.OpenAsync();
        var bulkCopy = new MySqlBulkCopy(connection);
        bulkCopy.DestinationTableName = "yourtable";
        // the column mapping is required if you have an identity column in the table
        bulkCopy.ColumnMappings.AddRange(GetMySqlColumnMapping(dataTable));
        await bulkCopy.WriteToServerAsync(dataTable);
        return true;
    }
}
private List<MySqlBulkCopyColumnMapping> GetMySqlColumnMapping(DataTable dataTable)
{
    List<MySqlBulkCopyColumnMapping> colMappings = new List<MySqlBulkCopyColumnMapping>();
    int i = 0;
    foreach (DataColumn col in dataTable.Columns)
    {
        colMappings.Add(new MySqlBulkCopyColumnMapping(i, col.ColumnName));
        i++;
    }
    return colMappings;
}
You can skip the column mapping if you don't have an identity column in your table. If you do have one, then you have to use the column mapping; otherwise it won't insert any records into the table and will just give a message like "x rows were copied but only 0 rows were inserted".
This class is available in the MySqlConnector assembly (Version 1.0.0.0).
Using the Z.BulkOperations NuGet package, you can easily get this done.
Here is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/:
MySqlConnection conn = DbConnection.OpenConnection();
DataTable dt = new DataTable("testtable");
MySqlDataAdapter da = new MySqlDataAdapter("SELECT * FROM testtable", conn);
MySqlCommandBuilder cb = new MySqlCommandBuilder(da);
da.Fill(dt);
instead of using
......
da.UpdateBatchSize = 1000;
......
da.Update(dt)
just the following two lines:
var bulk = new BulkOperation(conn);
bulk.BulkInsert(dt);
will take only 5 seconds to copy the whole DataTable into MySQL, without first dumping the 100,000 rows from the DataTable to a temporary CSV file.