I am working on an application in which I want to create a database table from the data in a DataTable.
The scenario is:
The user selects a table and its data is loaded into a DataTable.
Based on the DataTable schema, I create a table in a temporary database (this database can be SQL Server or Oracle, user-defined).
Once the table is created, I want to copy the data from the DataTable into the temp table.
My code:
using (OleDbConnection con = new OleDbConnection(localConnString))
{
    try
    {
        con.Open();
        List<string> col = new List<string>();
        StringBuilder sb = new StringBuilder();
        StringBuilder insertQuery = new StringBuilder();
        insertQuery.Append("Insert INTO " + tableName + " Values(");
        foreach (DataRow row in dtSchema.Rows)
        {
            string type = getColumnType(row.Field<object>("DataType").ToString());
            string column = "[" + row.Field<string>("ColumnName").ToString() + "] " + type;
            col.Add(column);
        }
        sb.Append(string.Join(",", col));
        string createTableQuery = "CREATE TABLE [" + tableName + "] (" + sb.ToString().Trim() + ")";
        OleDbCommand command = new OleDbCommand(createTableQuery, con);
        command.ExecuteNonQuery();
        con.Close();
        // Code to copy data from DataTable to temp table
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
I want to use something like SqlBulkCopy, but I am using OLE DB.
Any help would be appreciated.
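For the missing copy step, here is a rough sketch of how it could be done with plain OLE DB, since there is no OLE DB equivalent of SqlBulkCopy: one parameterized INSERT built from the DataTable's columns, reused for every row inside a transaction. The variable dataTable is an assumed name for the DataTable that holds the data (the question does not name it); tableName and localConnString come from the code above.
// Sketch only. Requires: using System; using System.Data; using System.Data.OleDb; using System.Linq;
using (OleDbConnection con = new OleDbConnection(localConnString))
{
    con.Open();
    using (OleDbTransaction tran = con.BeginTransaction())
    {
        // OLE DB uses positional '?' placeholders; parameter names are ignored.
        string columnList = string.Join(",", dataTable.Columns.Cast<DataColumn>()
                                                     .Select(c => "[" + c.ColumnName + "]"));
        string paramList = string.Join(",", Enumerable.Repeat("?", dataTable.Columns.Count));
        string insertSql = "INSERT INTO [" + tableName + "] (" + columnList + ") VALUES (" + paramList + ")";

        using (OleDbCommand cmd = new OleDbCommand(insertSql, con, tran))
        {
            foreach (DataRow row in dataTable.Rows)
            {
                cmd.Parameters.Clear();
                for (int i = 0; i < dataTable.Columns.Count; i++)
                    cmd.Parameters.AddWithValue("@p" + i, row[i]);   // bound by position, not name
                cmd.ExecuteNonQuery();
            }
            tran.Commit();
        }
    }
}
A single transaction around the loop makes a big difference in speed; if the target happens to be SQL Server, switching that one code path to SqlBulkCopy would be faster still.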
Can you help me with the following?
I have two databases: one is SQL Server, which I connect to with ADO.NET; the other is a DBF database, for which I have a connection class. The two tables have the same structure, and I want to compare their data to see which rows are missing from the SQL Server side and which are missing from the DBF side.
namespace log
{
    static class Program
    {
        private static ECEntities1 dc = new ECEntities1();

        public static void Main()
        {
            // Query the SQL Server database
            var query =
                from HInvoice in dc.HInvoice
                where HInvoice.DOB == "2020-03-01"   // assumes DOB is stored as a string; use a DateTime comparison otherwise
                select HInvoice.DOB;

            // Print the data
            foreach (var item in query)
            {
                Console.WriteLine(item);
            }

            // Query the DBF database
            string path = AppDomain.CurrentDomain.BaseDirectory + @"db";
            DataTable dt = DBF.ObtenerDatos(path, "GNDITEM.Dbf");

            // Print the data
            foreach (DataRow dtRow in dt.Rows)
            {
                // over all the table's columns
                foreach (DataColumn column in dt.Columns)
                {
                    var field1 = dtRow[column].ToString();
                    Console.WriteLine(field1);
                }
            }
        }
    }
}
I will answer my own question. I am new here, and the way I framed the question was not the right approach, so I changed my focus. What I did was the following:
I created a new project where I send the DBF table to my SQL Server database through a stored procedure, and do the comparison there.
Create the connection to the DBF database.
OleDbConnection cnn = new OleDbConnection();
OleDbDataAdapter da = new OleDbDataAdapter();
OleDbCommand cmd;
string PathArchivo = @"\\ruta_archivo";
string NombreArchivo = "GNDITEM.Dbf";
cnn.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + PathArchivo + ";Extended Properties=dBASE IV;User ID=;Password=";
cmd = new OleDbCommand("select * from " + NombreArchivo, cnn);
da.SelectCommand = cmd;
// the adapter opens and closes the connection itself
var dt = new DataTable();
da.Fill(dt);
Pass the parameters to a DataTable
var dtParametro = new DataTable();
dtParametro.Columns.Add("parametro1", typeof(string));
dtParametro.Columns.Add("parametro2", typeof(string));
dtParametro.Columns.Add("parametro3", typeof(string));
dtParametro.Columns.Add("parametro4", typeof(string));
Then a loop fills the DataTable with the information from the DBF table:
foreach (DataRow dr in dt.Rows)
{
    // map the DBF columns onto the four parameter columns
    DataRow drAdd = dtParametro.NewRow();
    drAdd["parametro1"] = dr["TYPE"].ToString();
    drAdd["parametro2"] = dr["EMPLOYEE"].ToString();
    drAdd["parametro3"] = dr["CHECK"].ToString();
    drAdd["parametro4"] = dr["ITEM"].ToString();
    dtParametro.Rows.Add(drAdd);
}
Finally, I connect to the SQL Server database and call the stored procedure:
string cadena = @"conexionabdd...";
var cmda = new SqlCommand("Insertar_tbl_GNDITEM1", new SqlConnection(cadena));
cmda.CommandType = CommandType.StoredProcedure;
cmda.Parameters.Add("@param", SqlDbType.Structured).Value = dtParametro;
cmda.Connection.Open();   // open the connection
cmda.ExecuteNonQuery();   // execute the command
cmda.Connection.Close();  // close the connection
Check the database and you will see that the data is now there.
PS: In addition to this, you must create the database in SQL Server, a table-valued type for that table, and the stored procedure. Here is the video that guided me; I hope it will be helpful for someone else: Video
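For reference, a rough sketch of that one-time setup done from C#. Only Insertar_tbl_GNDITEM1, @param and the parametro1-4 columns come from the code above; the type name GNDITEM_Type, the target table tbl_GNDITEM and the column types are assumptions.
// Requires: using System.Data.SqlClient;
// CREATE TYPE and CREATE PROCEDURE must run in separate batches, hence two commands.
using (var cn = new SqlConnection(cadena))
{
    cn.Open();
    new SqlCommand(
        @"CREATE TYPE dbo.GNDITEM_Type AS TABLE
          (parametro1 nvarchar(50), parametro2 nvarchar(50),
           parametro3 nvarchar(50), parametro4 nvarchar(50));", cn).ExecuteNonQuery();

    new SqlCommand(
        @"CREATE PROCEDURE dbo.Insertar_tbl_GNDITEM1
              @param dbo.GNDITEM_Type READONLY
          AS
          INSERT INTO dbo.tbl_GNDITEM ([TYPE], EMPLOYEE, [CHECK], ITEM)
          SELECT parametro1, parametro2, parametro3, parametro4
          FROM @param;", cn).ExecuteNonQuery();
}
With the type in place, the SqlDbType.Structured parameter shown above maps dtParametro's four columns straight onto the table type.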
I have to update multiple records in a SQL Server table from C#. Below are the steps I follow and the code.
The code works, but the process is taking much longer than expected.
I need a quick way to update 10,000 records, and I am not sure whether bulk copy works for updates.
I have seen other answers that bulk insert into a temp table and then update, but that update is a single statement, whereas here I need to update the records in the DB based on the Excel data, which means looping over each Excel record. So how can I achieve a faster update?
1) Read the Excel data and copy it into a DataTable
string strDirectory = string.Empty;
strDirectory = System.IO.Directory.GetCurrentDirectory() + "\\" + "Filename.xlsx";
string connectionString = "Provider=Microsoft.ACE.OLEDB.12.0; Data Source = " + strDirectory + "; Extended Properties = \"Excel 12.0;HDR=YES;IMEX=1\"";
DataTable dt = new DataTable();
using (OleDbConnection conn = new OleDbConnection(connectionString))
{
    conn.Open();
    DataTable schemaTable = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });
    DataRow schemaRow = schemaTable.Rows[0];
    string sheet = schemaRow["TABLE_NAME"].ToString();
    string query = "SELECT * FROM [" + sheet + "]";
    OleDbDataAdapter daexcel = new OleDbDataAdapter(query, conn);
    daexcel.Fill(dt);
    conn.Close();
}
2) Do some manipulation of the DataTable data, then update the table.
string strsqlst = string.Empty;
using (SqlConnection sqlConn = new SqlConnection(Connectionstring))
{
    sqlConn.Open();
    SqlCommand cmd;
    StringBuilder sb = new StringBuilder();
    sb.AppendLine("DataTable content:");
    foreach (DataRow row in dt.Rows)
    {
        if (row.ItemArray[0].ToString() == "")
            break;
        strsqlst = "Update table Set col1= " + row.ItemArray[4].ToString() + " ,col2= " + row.ItemArray[5].ToString() + " where <Condition>";
        cmd = new SqlCommand(strsqlst, sqlConn);
        cmd.CommandType = CommandType.Text;
        cmd.ExecuteNonQuery();
    }
    sqlConn.Close();
}
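For what it is worth, the "bulk insert to temp and then update" pattern mentioned in the question does not require one UPDATE per Excel row: you can bulk copy the whole DataTable into a staging table and then run a single set-based UPDATE. A rough sketch, reusing dt and Connectionstring from the code above; #Staging, TargetTable, KeyCol, Col1 and Col2 are assumed names, not the real schema.
// Requires: using System.Data; using System.Data.SqlClient;
using (var sqlConn = new SqlConnection(Connectionstring))
{
    sqlConn.Open();

    // 1) Create a temp staging table shaped like the columns we need.
    new SqlCommand(
        "CREATE TABLE #Staging (KeyCol nvarchar(50), Col1 nvarchar(50), Col2 nvarchar(50));",
        sqlConn).ExecuteNonQuery();

    // 2) Bulk copy the Excel DataTable into the staging table (mapping by ordinal).
    using (var bulk = new SqlBulkCopy(sqlConn))
    {
        bulk.DestinationTableName = "#Staging";
        bulk.ColumnMappings.Add(0, "KeyCol");
        bulk.ColumnMappings.Add(4, "Col1");
        bulk.ColumnMappings.Add(5, "Col2");
        bulk.WriteToServer(dt);
    }

    // 3) One set-based UPDATE instead of 10,000 round trips.
    new SqlCommand(
        @"UPDATE t SET t.Col1 = s.Col1, t.Col2 = s.Col2
          FROM TargetTable t
          JOIN #Staging s ON s.KeyCol = t.KeyCol;",
        sqlConn).ExecuteNonQuery();
}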
The SqlCommand can be a whole SQL batch and is not limited to a single statement. So you can create a single large batch with 10,000 UPDATE statements, or divide it into for example 20 batches of 500 each.
In other words, you can create a single command with CommandText like this:
UPDATE [T] SET Col1='Value1', Col2='Value2' WHERE [Id] = 1;
...
UPDATE [T] SET Col1='Value999', Col2='Value1000' WHERE [Id] = 500;
That said, you should use parameters for all data values (to ensure SQL injection is not possible).
If you want to handle any errors (updates failing due to invalid data) you will need something a bit more sophisticated.
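A rough sketch of what such a parameterized batch could look like, reusing sqlConn and dt from the question; [T], Col1, Col2 and [Id] are the placeholder names from the example above, and the real WHERE condition goes where [Id] is used.
// Sketch only: 500 updates per round trip, all values passed as parameters.
// SQL Server allows at most 2100 parameters per command, so 500 rows x 3 parameters is safe.
const int batchSize = 500;
for (int start = 0; start < dt.Rows.Count; start += batchSize)
{
    int end = Math.Min(start + batchSize, dt.Rows.Count);
    var sql = new StringBuilder();
    using (var cmd = new SqlCommand { Connection = sqlConn })
    {
        for (int i = start; i < end; i++)
        {
            sql.AppendLine("UPDATE [T] SET Col1 = @c1_" + i + ", Col2 = @c2_" + i + " WHERE [Id] = @id_" + i + ";");
            cmd.Parameters.AddWithValue("@c1_" + i, dt.Rows[i][4]);
            cmd.Parameters.AddWithValue("@c2_" + i, dt.Rows[i][5]);
            cmd.Parameters.AddWithValue("@id_" + i, dt.Rows[i][0]);
        }
        cmd.CommandText = sql.ToString();
        cmd.ExecuteNonQuery();
    }
}
Wrapping each batch in a transaction also helps, and keeps a failed batch from leaving partial updates behind.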
Let's say I want to copy all tables with their complete data from one database to another without knowing their details in advance (column count, data types, ...). The user would input a connection string to his database, and all data from it would be copied to an internal DB.
I tried to achieve this using SqlConnection and direct T-SQL queries, and managed to write a script that creates empty tables in the internal database with the correct columns:
string createDestinationTableQuery = "create table " + schemaName + ".[" + tableName + "](";
DataTable ColumnsDT = new DataTable();
string getTableColumnDataQuery = "SELECT * FROM "+originalDBName+".INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = N'" + tableName +"'";
SqlCommand getTableColumnDataCommand = new SqlCommand(getTableColumnDataQuery, originalCon);
SqlDataAdapter TableDA = new SqlDataAdapter(getTableColumnDataCommand);
TableDA.Fill(ColumnsDT);
for (int x = 0; x < ColumnsDT.Rows.Count; x++)
{
createDestinationTableQuery += "[" + ColumnsDT.Rows[x].ItemArray[3].ToString() + "] " + "[" + ColumnsDT.Rows[x].ItemArray[7].ToString() + "], ";
}
createDestinationTableQuery = createDestinationTableQuery.Remove(createDestinationTableQuery.Length - 2);
createDestinationTableQuery += " )";
SqlCommand createDestinationTableCommand = new SqlCommand(createDestinationTableQuery, destinationCon);
createDestinationTableCommand.ExecuteNonQuery();
Console.WriteLine("Table " + schemaName + "." + tableName + " created succesfully!");
However, I am struggling with data insertion as the following code simply doesn't work:
DataTable dataTable = new DataTable();
string getTableDataquery = "select * from " + originalTableWithSchema;
SqlCommand getTableDataCommand = new SqlCommand(getTableDataquery, originalCon);
SqlDataAdapter da = new SqlDataAdapter(getTableDataCommand);
da.Fill(dataTable);
for (int x = 0; x < dataTable.Rows.Count; x++)
{
string insertQuery = "insert into " + schemaName + ".["+tableName+"](" ;
string values = "VALUES(";
for (int y = 0; y < dataTable.Columns.Count; y++)
{
insertQuery += dataTable.Columns[y].ColumnName + ", ";
values += dataTable.Rows[x].ItemArray[y].ToString() + ", ";
}
insertQuery = insertQuery.Remove(insertQuery.Length - 2);
insertQuery += " )";
values = values.Remove(values.Length - 2);
values += " )";
insertQuery += " " + values;
SqlCommand insertCommand = new SqlCommand(insertQuery, destinationCon);
insertCommand.ExecuteNonQuery();
}
da.Dispose();
How can I correctly achieve this functionality? I was thinking of maybe scrapping all the code and using SMO instead?
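On the SMO idea: the Transfer class in Microsoft.SqlServer.Management.Smo can copy schema and data for every table without any per-column code. A rough sketch from memory (double-check the property names against the SMO documentation); it reuses originalCon from above, and the server/database names are placeholders.
// Requires references to Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo.
// using Microsoft.SqlServer.Management.Smo; using Microsoft.SqlServer.Management.Common;
var sourceServer = new Server(new ServerConnection(originalCon));  // wrap the open SqlConnection
var sourceDb = sourceServer.Databases["SourceDatabaseName"];       // placeholder name

var transfer = new Transfer(sourceDb)
{
    CopyAllTables = true,            // every table, no per-column knowledge needed
    CopySchema = true,               // create the tables on the destination
    CopyData = true,                 // then move the rows
    DestinationServer = "DestinationServer",   // placeholder
    DestinationDatabase = "InternalDb",        // placeholder
    DestinationLoginSecure = true    // integrated security; set login/password otherwise
};
transfer.TransferData();             // performs the copy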
If you are only looking to copy the data (because you have structure creation already working), then you could use DataTable to hold the data in a non-dbms specific structure, and a DataAdapter to generate the dbms specific insert statements. Here is an excerpt from code I wrote a while ago to copy data from Access to MySQL:
List<string> tableNames = new List<string>();
try
{
// Open connect to access db
sourceConn.Open();
// Build table names list from schema
foreach (DataRow row in sourceConn.GetSchema("Tables").Select("table_type = 'TABLE'"))
tableNames.Add(row["table_name"].ToString());
}
catch (Exception ex)
{
throw;
}
finally
{
if(sourceConn.State != ConnectionState.Closed)
sourceConn.Close();
}
foreach (string table in tableNames)
{
//Get all table data from Access
string query = string.Format("SELECT * FROM {0}", table);
DataTable accessTable = new DataTable(table);
try
{
sourceConn.Open();
System.Data.OleDb.OleDbCommand accessSqlCommand = new System.Data.OleDb.OleDbCommand(query, sourceConn);
System.Data.OleDb.OleDbDataReader reader = accessSqlCommand.ExecuteReader();
// Load all table data into accessTable
accessTable.Load(reader);
}
catch(Exception ex)
{
throw;
}
finally
{
if(sourceConn.State != ConnectionState.Closed)
sourceConn.Close();
}
// Import data into MySQL
accessTable.AcceptChanges();
// The table should be empty, so set everything as new rows (will be inserted)
foreach (DataRow row in accessTable.Rows)
row.SetAdded();
try
{
destConn.Open();
MySql.Data.MySqlClient.MySqlDataAdapter da = new MySql.Data.MySqlClient.MySqlDataAdapter(query, destConn);
MySql.Data.MySqlClient.MySqlCommandBuilder cb = new MySql.Data.MySqlClient.MySqlCommandBuilder(da);
da.InsertCommand = cb.GetInsertCommand();
// Update the destination table 128 rows at a time
da.UpdateBatchSize = 128;
// Perform inserts (and capture row counts for output)
int insertCount = da.Update(accessTable);
}
catch (Exception ex)
{
throw;
}
finally
{
if(destConn.State != ConnectionState.Closed)
destConn.Close();
}
}
This could certainly be more efficient, but I wrote it for a quick conversion. Also, since this is copied and pasted you may need to tweak it. Hope it helps.
It might be worth thinking about using a linked server. Once a linked server is defined in the destination server, a table can be created and automatically filled with data using a SELECT…INTO statement.
Query executed in destination server database:
SELECT * INTO NewTableName FROM
SourceServername.SourceDatabasename.dbo.SourceTableName
I am trying to bulk copy from one table to another by mapping the column names, since the source and destination may not always have the same columns.
The source can have 8 columns and the destination 10. I need to map the columns and bulk copy.
I tried the code below; it didn't work, and I get the error: The given ColumnName 'moduleid' does not match up with any column in data source.
Source: existingtablecolumnsPresent has [collection time],[logtime],[moduleid],[node],[reason],[time],[timestamp],[usecaseid]
Destination: dataTable.Columns has [Node],[Time],[Reason],[Moduleid],[Usecaseid]
Please advise
public static void BatchBulkCopy(DataTable dataTable, string DestinationTbl, List<string> columnMapping,string filename)
{
var program = new Program();
// Get the DataTable
DataTable dtInsertRows = dataTable;
using (SqlBulkCopy sbc = new SqlBulkCopy(program.connectionStr.ToString()))
{
try {
sbc.DestinationTableName = DestinationTbl.ToLower();
string sourceTableQuery = "Select top 1 * from " + "[" + dataTable.TableName + "]";
DataTable dtSource = SqlHelper.ExecuteDataset(program.connectionStr.ToString(), CommandType.Text, sourceTableQuery).Tables[0];
for (int i = 0; i < dataTable.Columns.Count; i++)
{ //check if destination Column Exists in Source table
if (dtSource.Columns.Cast<DataColumn>().Select(a => "[" + a.ColumnName.ToLower() + "]").Contains(dataTable.Columns[i].ToString().ToLower()))//contain method is not case sensitive
{
List<string> existingtablecolumnsPresent = dtSource.Columns.Cast<DataColumn>().Select(a => "[" + a.ColumnName.ToLower() + "]").Distinct().OrderBy(t => t).ToList();
int sourceColumnIndex = existingtablecolumnsPresent.IndexOf(dataTable.Columns[i].ToString().ToLower());//Once column matched get its index
sbc.ColumnMappings.Add(dtSource.Columns[sourceColumnIndex].ToString(), dtSource.Columns[sourceColumnIndex].ToString());//give coluns name of source table rather then destination table so that it would avoid case sensitivity
}
}
sbc.WriteToServer(dtInsertRows);
sbc.Close();
}
catch (Exception ex)
{
Log.WriteLog("BatchBulkCopy" + " - " + filename, dataTable.TableName, ex.Message.ToString());
// To move a file or folder to a new location:
//if (File.Exists(program.sourceFile + filename))
// System.IO.File.Move(program.sourceFile + filename, program.failedfiles + filename);
}
}
}
As requested: create a DataTable with only the columns you want to insert and leave the others out. Make sure any columns you leave out are nullable or have a DEFAULT VALUE constraint (I can't show you how to do that unless you show me your table).
//This first method is pseudocode to explain how to create your DataTable. You need to do it in the way that makes sense for you.
public DataTable createDataTable(){
    List<string> excludedColumns = new List<string>();
    excludedColumns.Add("FieldToExclude");
    //...
    DataTable dt = new DataTable();
    foreach(string col in getColumns(myTable)){
        if(!excludedColumns.Contains(col)){
            // pick the real column type here; getColumns below only returns names
            dt.Columns.Add(new DataColumn(col, typeof(string)));
        }
    }
    return dt;
}
public List<string> getColumns(string tableName)
{
    List<string> ret = new List<string>();
    using (SqlConnection conn = getConn())
    {
        conn.Open();
        using (SqlCommand com = conn.CreateCommand())
        {
            com.CommandText = "select column_Name from information_schema.COLUMNS where table_name = @tab";
            com.Parameters.AddWithValue("@tab", tableName);
            SqlDataReader read = com.ExecuteReader();
            while (read.Read())
            {
                ret.Add(Convert.ToString(read[0]));
            }
        }
        conn.Close();
    }
    return ret;
}
//Now, you have a DataTable that has all the columns you want to insert. Map them yourself in code by adding to the appropriate column in your datatable.
public bool isCopyInProgess = false;//not necessary - just part of my code
public void saveDataTable(string tableName, DataTable table)
{
using (SqlConnection conn = getConn())
{
conn.Open();
using (var bulkCopy = new SqlBulkCopy(conn))//, SqlBulkCopyOptions.KeepIdentity))//un-comment if you want to use your own identity column
{
// my DataTable column names match my SQL Column names, so I simply made this loop. However if your column names don't match, just pass in which datatable name matches the SQL column name in Column Mappings
foreach (DataColumn col in table.Columns)
{
//Console.WriteLine("mapping " + col.ColumnName+" ("+does_Column_Exist(col.ColumnName,"Item").ToString()+")");
bulkCopy.ColumnMappings.Add(col.ColumnName, "["+col.ColumnName+"]");
// Console.WriteLine("ok\n");
}
bulkCopy.BulkCopyTimeout = 8000;
bulkCopy.DestinationTableName = tableName;
bulkCopy.BatchSize = 10000;
bulkCopy.EnableStreaming = true;
//bulkCopy.SqlRowsCopied += BulkCopy_SqlRowsCopied;
//bulkCopy.NotifyAfter = 10000;
isCopyInProgess = true;
bulkCopy.WriteToServer(table);
}
conn.Close();
}
}
Also, use this as your column checker:
public bool does_Column_Exist(string colName,string tableName)
{
bool ret = false;
using (SqlConnection conn = getConn())
{
conn.Open();
using (SqlCommand com = conn.CreateCommand())
{
com.CommandText = "select count(*) from information_schema.COLUMNS where column_name = #col and table_name = #tab";
com.Parameters.AddWithValue("#tab", tableName);
com.Parameters.AddWithValue("#col", colName);
ret = Convert.ToInt32(com.ExecuteScalar()) == 0 ? false : true;
}
conn.Close();
}
return ret;
}
Is there a specific reason you need C# for this? It seems like the path of least resistance would be to use SQL to do the job.
INSERT INTO table2
(column_name(s))
SELECT column_name(s)
FROM table1;
I have filled a DataSet with a table that was created from another database file. The table does NOT exist in the database file that I want to copy it to.
Now I want to save all those records (DataTable) to a newly created SQLite database file...
How can i do that?
Also I really want to avoid loops if this is possible.
The best answer is by me :) so I'll share it. It does use a loop, but it writes 100k entries in 2-3 seconds.
using (DbTransaction dbTrans = kaupykliuduomConn.BeginTransaction())
{
downloadas.Visible = true; //my progressbar
downloadas.Maximum = dataSet1.Tables["duomenys"].Rows.Count;
using (DbCommand cmd = kaupykliuduomConn.CreateCommand())
{
cmd.CommandText = "INSERT INTO duomenys(Barkodas, Preke, kiekis) VALUES(?,?,?)";
DbParameter Field1 = cmd.CreateParameter();
DbParameter Field2 = cmd.CreateParameter();
DbParameter Field3 = cmd.CreateParameter();
cmd.Parameters.Add(Field1);
cmd.Parameters.Add(Field2);
cmd.Parameters.Add(Field3);
int n = 0;
while (n != dataSet1.Tables["duomenys"].Rows.Count)
{
Field1.Value = dataSet1.Tables["duomenys"].Rows[n]["Barkodas"].ToString();
Field2.Value = dataSet1.Tables["duomenys"].Rows[n]["Preke"].ToString();
Field3.Value = dataSet1.Tables["duomenys"].Rows[n]["kiekis"].ToString();
downloadas.Value = n;
n++;
cmd.ExecuteNonQuery();
}
}
dbTrans.Commit();
}
In this case dataSet1.Tables["duomenys"] is already filled with all the data I need to transfer to the other database. I used a loop to fill the DataSet, too.
When you load the DataTable from the source database, set the AcceptChangesDuringFill property of the data adapter to false, so that loaded records are kept in the Added state (assuming that the source database is SQL Server)
var sqlAdapter = new SqlDataAdapter("SELECT * FROM the_table", sqlConnection);
DataTable table = new DataTable();
sqlAdapter.AcceptChangesDuringFill = false;
sqlAdapter.Fill(table);
Create the table in the SQLite database by executing the CREATE TABLE statement directly with SQLiteCommand.ExecuteNonQuery.
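For example, a minimal sketch of that step, assuming sqliteConnection is already open; the table and column definitions are placeholders and should match the source table:
// Requires: using System.Data.SQLite;
using (var createCmd = new SQLiteCommand(
    "CREATE TABLE IF NOT EXISTS the_table (Id INTEGER PRIMARY KEY, Name TEXT, Amount REAL)",
    sqliteConnection))
{
    createCmd.ExecuteNonQuery();
}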
Create a new DataAdapter for the SQLite database connection, and use it to Update the db:
var sqliteAdapter = new SQLiteDataAdapter("SELECT * FROM the_table", sqliteConnection);
var cmdBuilder = new SQLiteCommandBuilder(sqliteAdapter);
sqliteAdapter.Update(table);
If the source and target tables have the same column names and compatible types, it should work fine...
Importing SQL Server data into SQLite can take a long time, especially when you are importing millions of rows. The shortest and easiest way is to fetch the data from the SQL database into a DataTable and insert all of its rows into the SQLite database.
public bool ImportDataToSQLiteDatabase(string Proc, string SQLiteDatabase, params object[] obj)
{
DataTable result = null;
SqlConnection conn = null;
SqlCommand cmd = null;
try
{
result = new DataTable();
using (conn = new SqlConnection(ConStr))
{
using (cmd = CreateCommand(Proc, CommandType.StoredProcedure, obj))
{
cmd.Connection = conn;
conn.Open();
result.Load(cmd.ExecuteReader());
}
}
using (SQLiteConnection con = new SQLiteConnection(string.Format("Data Source={0};Version=3;New=False;Compress=True;Max Pool Size=100;", SQLiteDatabase)))
{
con.Open();
using (SQLiteTransaction transaction = con.BeginTransaction())
{
foreach (DataRow row in result.Rows)
{
    // parameterized insert, so quotes in the data cannot break the statement
    using (SQLiteCommand sqlitecommand = new SQLiteCommand(
        "insert into [table](fh,ch,mt,pn) values (@fh,@ch,@mt,@pn)", con, transaction))
    {
        sqlitecommand.Parameters.AddWithValue("@fh", Convert.ToString(row[0]));
        sqlitecommand.Parameters.AddWithValue("@ch", Convert.ToString(row[1]));
        sqlitecommand.Parameters.AddWithValue("@mt", Convert.ToString(row[2]));
        sqlitecommand.Parameters.AddWithValue("@pn", Convert.ToString(row[3]));
        sqlitecommand.ExecuteNonQuery();
    }
}
transaction.Commit();
new General().WriteApplicationLog("Data successfully imported.");
return true;
}
}
}
catch (Exception ex)
{
result = null;
return false;
}
finally
{
if (conn.State == ConnectionState.Open)
conn.Close();
}
}
It takes very little time compared to the answers given above.