SqlBulkCopy ColumnMapping Error - c#

My goal is to copy generic tables from one database to another. I would like it to copy the data as-is, and it would be fine either to delete whatever is in the destination table or to append to it, adding new columns if there are any. The only thing I may want to change is to add something for versioning, which can be done in a separate part of the query.
Opening the data is no problem, but when I try a bulk copy it fails. I have gone through several posts, and the closest is this one:
SqlBulkCopy Insert with Identity Column
I removed SqlBulkCopyOptions.KeepIdentity from my code, but it still throws the error:
"The given ColumnMapping does not match up with any column in the source or destination"
I have tried playing with the SqlBulkCopyOptions, but so far no luck.
Ideas?
public void BatchBulkCopy(string connectionString, DataTable dataTable, string DestinationTbl, int batchSize)
{
    // Get the DataTable
    DataTable dtInsertRows = dataTable;
    using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString))
    {
        sbc.DestinationTableName = DestinationTbl;
        // Number of records to be processed in one go
        sbc.BatchSize = batchSize;
        // Finally write to server
        sbc.WriteToServer(dtInsertRows);
    }
}

If I could suggest another approach, I would have a look at the SMO (SQL Server Management Objects) library to perform such tasks.
You can find an interesting article here.
Using SMO, you can perform tasks in SQL Server, such as bulk copies, treating tables, columns, and databases as objects.
Some time ago, I used SMO in a small open-source application I developed, named SQLServerDatabaseCopy.
To copy the data from table to table, I wrote this code (the complete code is here):
foreach (Table table in Tables)
{
    string columnsTable = GetListOfColumnsOfTable(table);
    string bulkCopyStatement = "SELECT {3} FROM [{0}].[{1}].[{2}]";
    bulkCopyStatement = String.Format(bulkCopyStatement, SourceDatabase.Name, table.Schema, table.Name, columnsTable);
    using (SqlCommand selectCommand = new SqlCommand(bulkCopyStatement, connection))
    {
        LogFileManager.WriteToLogFile(bulkCopyStatement);
        SqlDataReader dataReader = selectCommand.ExecuteReader();
        using (SqlConnection destinationDatabaseConnection = new SqlConnection(destDatabaseConnString))
        {
            if (destinationDatabaseConnection.State == System.Data.ConnectionState.Closed)
            {
                destinationDatabaseConnection.Open();
            }
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationDatabaseConnection))
            {
                bulkCopy.DestinationTableName = String.Format("[{0}].[{1}]", table.Schema, table.Name);
                foreach (Column column in table.Columns)
                {
                    // no mapping is needed for computed columns!
                    if (!column.Computed)
                    {
                        bulkCopy.ColumnMappings.Add(column.Name, column.Name);
                    }
                }
                try
                {
                    bulkCopy.WriteToServer(dataReader);
                    LogFileManager.WriteToLogFile(String.Format("Bulk copy successful for table [{0}].[{1}]", table.Schema, table.Name));
                }
                catch (Exception ex)
                {
                    Console.WriteLine(ex.Message);
                    Console.WriteLine(ex.StackTrace);
                }
                finally
                {
                    // close the reader
                    dataReader.Close();
                }
            }
        }
    }
}
As you can see, you have to add a ColumnMapping to the SqlBulkCopy object for each column, to define which column of the source table must be mapped to which column of the destination table. That is the reason for your error: The given ColumnMapping does not match up with any column in the source or destination.
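Outside SMO, the same fix applies to the original BatchBulkCopy method: without any mappings, SqlBulkCopy pairs columns by ordinal position, so any difference in column order or an extra column triggers this error. A minimal sketch under those assumptions (the table name is a placeholder, and note that the name match is case-sensitive):

```csharp
// Minimal sketch: map every source DataTable column to the destination
// column of the same name. Without explicit mappings SqlBulkCopy maps by
// ordinal position, which misaligns data when the column orders differ.
// "[dbo].[MyTable]" is a hypothetical destination name.
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "[dbo].[MyTable]";
    foreach (DataColumn col in dataTable.Columns)
    {
        // The match between source and destination names is case-sensitive.
        bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
    }
    bulkCopy.WriteToServer(dataTable);
}
```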

I would add some validation to check which columns your source and destination tables have in common.
This essentially queries the system views (I have assumed SQL Server, but this is easily adaptable to other DBMSs) to get the column names in the destination table (excluding identity columns), iterates over them, and adds a column mapping whenever there is a match in the source table.
public void BatchBulkCopy(string connectionString, DataTable dataTable, string DestinationTbl, int batchSize)
{
    using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString))
    {
        sbc.DestinationTableName = DestinationTbl;
        string sql = "SELECT name FROM sys.columns WHERE is_identity = 0 AND object_id = OBJECT_ID(@table)";
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@table", DestinationTbl);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var column = reader.GetString(0);
                    if (dataTable.Columns.Contains(column))
                    {
                        sbc.ColumnMappings.Add(column, column);
                    }
                }
            }
        }
        // Number of records to be processed in one go
        sbc.BatchSize = batchSize;
        // Finally write to server
        sbc.WriteToServer(dataTable);
    }
}
This could still produce invalid cast errors, as there is no data-type check, but it should get you started on a generic method.

You can add
sbc.ColumnMappings.Add(0, 0);
sbc.ColumnMappings.Add(1, 1);
sbc.ColumnMappings.Add(2, 2);
sbc.ColumnMappings.Add(3, 3);
sbc.ColumnMappings.Add(4, 4);
before executing
sbc.WriteToServer(dataTable);

Related

SqlBulkcopy .net 4.0 cannot access destination table

I have written a program in .NET that should copy table data from one server to another. However, I am getting the error:
cannot access destination table "mytable"
Despite googling and looking everywhere, I cannot find a solution to this error.
Some posts mention permissions, and I have done the following:
GRANT SELECT, UPDATE, DELETE, INSERT TO bulkadmin
but still no success.
Am I missing the obvious?
Help is greatly appreciated.
EDIT
I bulk copy 3 databases with 1000 tables to one "target" database.
I have simplified the code that I use and also tested it, with no luck. The intention is to do this in parallel, but I want to get it working with a simple table first.
private void TestBulkCopy(string sourceServer, string sourceDatabase, List<string> sourceTables)
{
    string connectionStringSource = ConfigurationManager.ConnectionStrings["TestDB"].ConnectionString;
    string connectionStringTarget = ConfigurationManager.ConnectionStrings["TestDB"].ConnectionString;
    string sqlGetDataFromSource = string.Format("SELECT * FROM {0}", "testTable");
    using (var sourceConnection = new SqlConnection(connectionStringSource))
    {
        sourceConnection.Open();
        using (var cmdSource = new SqlCommand(sqlGetDataFromSource, sourceConnection))
        using (SqlDataReader readerSource = cmdSource.ExecuteReader())
        {
            using (var sqlTargetConnection = new SqlConnection(connectionStringTarget))
            {
                sqlTargetConnection.Open();
                using (var bulkCopy = new SqlBulkCopy(sqlTargetConnection, SqlBulkCopyOptions.TableLock, null))
                {
                    bulkCopy.DestinationTableName = "testTable";
                    bulkCopy.SqlRowsCopied += OnSqlRowsCopied;
                    bulkCopy.BatchSize = 2600;
                    bulkCopy.NotifyAfter = 50;
                    bulkCopy.BulkCopyTimeout = 60;
                    bulkCopy.WriteToServer(readerSource);
                }
            }
        }
    }
}
Write the Schema before the table Name
Change
bulkCopy.DestinationTableName = "testTable";
to
bulkCopy.DestinationTableName = "dbo.testTable";
I think your destination table has a column defined as an identity (auto-number) column. If so, SqlBulkCopy cannot copy values into that column by default. You can temporarily allow explicit inserts into the identity column on the destination table using this code:
BEGIN
    SET IDENTITY_INSERT [building] ON;
    INSERT INTO [Table2](.....)
    VALUES(@id, @id_project,....)
    SET IDENTITY_INSERT [building] OFF;
END
or edit the definition of the destination table and remove the identity on that column.
The table name assigned to DestinationTableName on the SqlBulkCopy must be surrounded with [ ] brackets.
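As an alternative to toggling IDENTITY_INSERT by hand, SqlBulkCopy itself can be told to keep the source identity values via SqlBulkCopyOptions.KeepIdentity. A hedged sketch, reusing the names from the question's code (connection string and table are placeholders):

```csharp
// Sketch: KeepIdentity makes SqlBulkCopy insert the identity values coming
// from the source reader instead of letting the destination generate new ones.
using (var bulkCopy = new SqlBulkCopy(connectionStringTarget, SqlBulkCopyOptions.KeepIdentity))
{
    bulkCopy.DestinationTableName = "dbo.testTable";
    bulkCopy.WriteToServer(readerSource);
}
```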

Sql Bulk Copy/Insert in C#

I am new to JSON and SQLBulkCopy. I have a JSON formatted POST data that I want to Bulk Copy/Insert in Microsoft SQL using C#.
JSON Format:
{
    "URLs": [{
        "url_name": "Google",
        "url_address": "http://www.google.com/"
    },
    {
        "url_name": "Yahoo",
        "url_address": "http://www.yahoo.com/"
    },
    {
        "url_name": "FB",
        "url_address": "http://www.fb.com/"
    },
    {
        "url_name": "MegaSearches",
        "url_address": "http://www.megasearches.com/"
    }]
}
Classes:
public class UrlData
{
    public List<Url> URLs { get; set; }
}
public class Url
{
    public string url_address { get; set; }
    public string url_name { get; set; }
}
How can I do that efficiently?
TL;DR If you have your data already represented as a DataTable, you can insert it into the destination table on the server with SqlBulkCopy:
string csDestination = "put the connection string to the database here";
using (SqlConnection connection = new SqlConnection(csDestination))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    connection.Open();
    bulkCopy.DestinationTableName = "TUrls";
    bulkCopy.WriteToServer(dataTableOfUrls);
}
If you want to load just "from 10 to 50 urls", there's no need to use SqlBulkCopy; its general purpose is to eliminate thousands of separate inserts.
So, inserting without SqlBulkCopy [and without Entity Framework] can be done one by one:
string insertQuery = "insert into TUrls(address, name) values(@address, @name)";
foreach (Url url in listOfUrls)
{
    SqlCommand cmd = new SqlCommand(insertQuery);
    cmd.Parameters.AddWithValue("@name", url.url_name);
    cmd.Parameters.AddWithValue("@address", url.url_address);
    // Remember to take care of the connection! I omit that part for clarity.
    cmd.ExecuteNonQuery();
}
To insert data with SqlBulkCopy you need to convert your data (e.g. a list of custom class objects) to a DataTable. Below is a quote from Marc Gravell's answer as an example of a generic solution for such a conversion:
Here's a nice 2013 update using
FastMember from NuGet:
IEnumerable<SomeType> data = ...
DataTable table = new DataTable();
using (var reader = ObjectReader.Create(data))
{
    table.Load(reader);
}
Yes, this is pretty much the exact opposite of this one;
reflection would suffice - or if you need quicker,
HyperDescriptor in 2.0, or maybe Expression in 3.5. Actually,
HyperDescriptor should be more than adequate.
For example:
// remove "this" if not on C# 3.0 / .NET 3.5
public static DataTable ToDataTable<T>(this IList<T> data)
{
    PropertyDescriptorCollection props = TypeDescriptor.GetProperties(typeof(T));
    DataTable table = new DataTable();
    for (int i = 0; i < props.Count; i++)
    {
        PropertyDescriptor prop = props[i];
        table.Columns.Add(prop.Name, prop.PropertyType);
    }
    object[] values = new object[props.Count];
    foreach (T item in data)
    {
        for (int i = 0; i < values.Length; i++)
        {
            values[i] = props[i].GetValue(item);
        }
        table.Rows.Add(values);
    }
    return table;
}
Now, having your data represented as a DataTable, you're ready to write it to the destination table on the server:
string csDestination = "put the connection string to the database here";
using (SqlConnection connection = new SqlConnection(csDestination))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    connection.Open();
    bulkCopy.DestinationTableName = "TUrls";
    bulkCopy.WriteToServer(dataTableOfUrls);
}
Hope it helps.
**UPDATE:**
Answer to @pseudonym27's question: "Hello can I use BulkCopy class to append data to existing table in SQL database?"
Yes, you can: BulkCopy works like an insert command, in that it appends data.
Also, consider using an intermediate (staging) table when there is a high probability of the operation going wrong (e.g. long insert times, connection issues) and you want to lock the destination table for as little time as possible. Another use case for an intermediate table is, of course, the need to do some data transformations before the insert.
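The staging-table pattern just described can be sketched roughly as follows; the TUrls_Staging table name is an assumption (a table with the same columns as TUrls), and the bulk copy targets it so that the real table is only touched by one set-based statement:

```csharp
// Sketch: bulk copy into a staging table, then move the rows into the real
// table with a single INSERT...SELECT. "TUrls_Staging" is a hypothetical
// table with the same columns as the destination.
using (var connection = new SqlConnection(csDestination))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "TUrls_Staging";
        bulkCopy.WriteToServer(dataTableOfUrls);
    }
    // The destination table is only busy for the duration of this statement.
    string moveSql = "INSERT INTO TUrls (name, address) " +
                     "SELECT name, address FROM TUrls_Staging;";
    using (var cmd = new SqlCommand(moveSql, connection))
    {
        cmd.ExecuteNonQuery();
    }
}
```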
Using the code below, you can convert a List<YourClass> to a DataTable:
List<YourClass> objlist = alldata;
string json = Newtonsoft.Json.JsonConvert.SerializeObject(objlist);
DataTable dt = Newtonsoft.Json.JsonConvert.DeserializeObject<DataTable>(json);
SaveDataInTables(dt, "Table_Name_Of_SQL");
Here I am assuming that alldata contains a List<YourClass> (you can also build it with objlist.Add(objYourClass)). Then pass the SQL table name and the DataTable to the SaveDataInTables method, which inserts all the data into the SQL table.
public void SaveDataInTables(DataTable dataTable, string tablename)
{
    if (dataTable.Rows.Count > 0)
    {
        using (SqlConnection con = new SqlConnection("Your_ConnectionString"))
        {
            using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
            {
                sqlBulkCopy.DestinationTableName = tablename;
                con.Open();
                sqlBulkCopy.WriteToServer(dataTable);
                con.Close();
            }
        }
    }
}
Hope this code helps!
You should use table-valued parameters if you are using a version newer than SQL Server 2005. You can find an example here.
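For reference, a table-valued parameter call looks roughly like this; it assumes you have first created a user-defined table type on the server (dbo.UrlTableType below is a hypothetical name):

```csharp
// Sketch: pass a DataTable to SQL Server as a table-valued parameter.
// Assumes this type was created on the server beforehand:
//   CREATE TYPE dbo.UrlTableType AS TABLE
//       (url_name nvarchar(255), url_address nvarchar(255));
using (var connection = new SqlConnection(csDestination))
using (var cmd = new SqlCommand(
    "INSERT INTO TUrls (name, address) " +
    "SELECT url_name, url_address FROM @urls", connection))
{
    SqlParameter p = cmd.Parameters.AddWithValue("@urls", dataTableOfUrls);
    p.SqlDbType = SqlDbType.Structured;  // marks the parameter as a TVP
    p.TypeName = "dbo.UrlTableType";     // the server-side table type
    connection.Open();
    cmd.ExecuteNonQuery();               // one round trip for all rows
}
```

Unlike SqlBulkCopy, this lets the insert statement itself transform or filter the rows on the way in.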
If it's only 10-50 URLs, inserted infrequently, you can just fire off insert statements. It's simple, less hassle, and you can use something easy and quick like Dapper.
Otherwise, if you want the bulk copy, you would need to create and fill an ADO.NET DataTable from your JSON first, preferably matching the schema of your destination SQL table. It's your choice.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.LogData";
    try
    {
        // Write from the source to the destination.
        connection.Open();
        bulkCopy.WriteToServer(dataTable1);
        connection.Close();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}

Fetch rows from database using ExecuteStoreQuery, without knowing the number of columns in the table

I'm trying to do some manual SQL queries against my SQLite database using the ExecuteStoreQuery method on my ObjectContext.
The catch is that I don't always know how many columns are in the table that I'm querying. Ideally, I would like each fetched row to simply be a string[] object.
I've looked at Example 2 here: http://msdn.microsoft.com/en-us/library/vstudio/dd487208(v=vs.100).aspx
It's close to what I want to do, except that I don't know the structure of the TElement I'm fetching, so I can't define a struct as they do in the example.
Below is some of my code (not compiling due to the ???? TElement). The code below is trying to fetch the table info, so in this case I do know the structure of the rows, but in general I don't.
Is there a way to do this with ExecuteStoreQuery? Or is there a different way of doing it, while still using the existing connection of my ObjectContext (rather than opening a new SQL connection to the DB)?
public void PrintColumnHeaders(NWRevalDatabaseEntities entities, string tableName)
{
    string columnListQuery = string.Format("PRAGMA table_info({0})", tableName);
    var result = entities.ExecuteStoreQuery<????>(columnListQuery);
    foreach (string[] row in result)
    {
        string columnHeader = row[1]; // the column header is in the second column of the table
        Console.WriteLine("Column Header: {0}", columnHeader);
    }
}
I got this working based on Gert Arnold's comment. Also, it took me some effort to figure out that I need a SQLiteConnection, not the EntityConnection that I could get directly from the ObjectContext. The answer to this question helped me with that.
The working code is below:
public static void PrintColumnHeaders(NWRevalDatabaseEntities entities, string tableName)
{
    var sc = ((System.Data.EntityClient.EntityConnection)entities.Connection).StoreConnection;
    var sqliteConnection = (System.Data.SQLite.SQLiteConnection)sc;
    sqliteConnection.Open();
    System.Data.Common.DbCommand cmd = sc.CreateCommand();
    cmd.CommandType = System.Data.CommandType.Text;
    cmd.CommandText = string.Format("PRAGMA table_info('{0}');", tableName);
    System.Data.Common.DbDataReader reader = cmd.ExecuteReader();
    if (reader.HasRows)
    {
        object[] values = new object[reader.FieldCount];
        while (reader.Read())
        {
            int result = reader.GetValues(values);
            // table_info returns a row for each column, with the column header in the second column
            string columnHeader = (string)values[1];
            Console.WriteLine("Column Header: {0}", columnHeader);
        }
    }
    sqliteConnection.Close();
}

C# SQLite error Insufficient parameters supplied to the command

I get the error "SQLite error: Insufficient parameters supplied to the command" when running an insert statement in my program. I've narrowed it down: it only occurs if I try to insert data into two different tables in succession. That is, I insert data into one table, and everything in the method is disposed of, since I am using `using` statements.
The error does not occur if I insert data into one table, stop debugging, rebuild my solution, and then insert into the other table.
Anyone have any ideas? Like I said, everything is wrapped in using statements, so I don't know what is being held in memory.
Here is my method:
public static void SQLiteTableINSERT(string tableName)
{
    using (SQLiteConnection Conn = new SQLiteConnection(SQLiteConn.Conn))
    {
        using (SQLiteTransaction sqliteTrans = Conn.BeginTransaction())
        {
            using (SQLiteCommand cmd = Conn.CreateCommand())
            {
                cmd.CommandText = PrepareInsert(tableName);
                for (int i = 0; i < LocalDataSet.LocalDs.Tables[0].Rows.Count; ++i)
                {
                    foreach (DataColumn col in LocalDataSet.LocalDs.Tables[0].Columns)
                    {
                        string temp = LocalDataSet.LocalDs.Tables[0].Rows[i][col, DataRowVersion.Current].ToString();
                        cmd.Parameters.AddWithValue("@" + col.ColumnName, temp);
                    }
                    cmd.ExecuteNonQuery();
                }
            }
            sqliteTrans.Commit();
        }
    }
    SQLite.SQLiteConn.Conn.Close();
}
Any ideas would be great.
Thanks.
I think you need to use the tableName in the loop, assuming the tables don't have the same schema.
The command is prepared for the table named tableName, but the parameters you are adding always come from Tables[0]; try Tables[tableName].
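A hedged sketch of how the inner loop could look once it reads from the right table; it also clears the parameter list on every row, since calling AddWithValue repeatedly on the same command otherwise accumulates duplicate parameter names:

```csharp
// Sketch of the corrected loop: pull rows and columns from the table that is
// actually being inserted, and reset the parameters for each row.
DataTable sourceTable = LocalDataSet.LocalDs.Tables[tableName];
cmd.CommandText = PrepareInsert(tableName);
foreach (DataRow row in sourceTable.Rows)
{
    cmd.Parameters.Clear(); // avoid duplicate parameter names across rows
    foreach (DataColumn col in sourceTable.Columns)
    {
        cmd.Parameters.AddWithValue("@" + col.ColumnName,
            row[col, DataRowVersion.Current].ToString());
    }
    cmd.ExecuteNonQuery();
}
```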

How do I extract data from a DataTable?

I have a DataTable that is filled in from an SQL query to a local database, but I don't know how to extract data from it.
Main method (in test program):
static void Main(string[] args)
{
    const string connectionString = "server=localhost\\SQLExpress;database=master;integrated Security=SSPI;";
    DataTable table = new DataTable("allPrograms");
    using (var conn = new SqlConnection(connectionString))
    {
        Console.WriteLine("connection created successfully");
        string command = "SELECT * FROM Programs";
        using (var cmd = new SqlCommand(command, conn))
        {
            Console.WriteLine("command created successfully");
            SqlDataAdapter adapt = new SqlDataAdapter(cmd);
            conn.Open();
            Console.WriteLine("connection opened successfully");
            adapt.Fill(table);
            conn.Close();
            Console.WriteLine("connection closed successfully");
        }
    }
    Console.Read();
}
The command I used to create the tables in my database:
create table programs
(
    progid int primary key identity(1,1),
    name nvarchar(255),
    description nvarchar(500),
    iconFile nvarchar(255),
    installScript nvarchar(255)
)
How can I extract data from the DataTable into a form meaningful to use?
The DataTable has a collection .Rows of DataRow elements.
Each DataRow corresponds to one row in your database, and contains a collection of columns.
In order to access a single value, do something like this:
foreach (DataRow row in YourDataTable.Rows)
{
    string name = row["name"].ToString();
    string description = row["description"].ToString();
    string icoFileName = row["iconFile"].ToString();
    string installScript = row["installScript"].ToString();
}
You can set the DataTable as a data source for many elements, e.g.:
GridView
Repeater
DataList
etc.
If you need to extract data from each row, you can use
table.Rows[rowIndex][columnIndex]
or, if you know the column name,
table.Rows[rowIndex][columnName]
If you need to iterate the table, you can use either a for loop or a foreach loop:
for (int i = 0; i < table.Rows.Count; i++)
{
    string name = table.Rows[i]["columnname"].ToString();
}
foreach (DataRow dr in table.Rows)
{
    string name = dr["columnname"].ToString();
}
The simplest way to extract data from a DataTable when you have multiple data types (not just strings) is to use the Field<T> extension method available in the System.Data.DataSetExtensions assembly.
var id = row.Field<int>("ID"); // extract and parse int
var name = row.Field<string>("Name"); // extract string
From MSDN, the Field<T> method:
Provides strongly-typed access to each of the column values in the
DataRow.
This means that when you specify the type it will validate and unbox the object.
For example:
// iterate over the rows of the DataTable
foreach (var row in table.AsEnumerable()) // AsEnumerable() returns IEnumerable<DataRow>
{
    var id = row.Field<int>("ID");                     // int
    var name = row.Field<string>("Name");              // string
    var orderValue = row.Field<decimal>("OrderValue"); // decimal
    var interestRate = row.Field<double>("InterestRate"); // double
    var isActive = row.Field<bool>("Active");          // bool
    var orderDate = row.Field<DateTime>("OrderDate");  // DateTime
}
It also supports nullable types:
DateTime? date = row.Field<DateTime?>("DateColumn");
This can simplify extracting data from DataTable as it removes the need to explicitly convert or parse the object into the correct types.
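As a self-contained illustration of Field<T>, including the nullable case, here is a sketch that builds the table in memory rather than loading it from a database (the column names are made up for the example):

```csharp
// Self-contained example of Field<T> on an in-memory DataTable.
// Requires a reference to System.Data.DataSetExtensions.
using System;
using System.Data;

class FieldDemo
{
    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("ID", typeof(int));
        table.Columns.Add("OrderDate", typeof(DateTime));
        table.Rows.Add(1, new DateTime(2024, 1, 15));
        table.Rows.Add(2, DBNull.Value); // columns allow DBNull by default

        foreach (DataRow row in table.Rows)
        {
            int id = row.Field<int>("ID");                      // typed, no cast needed
            DateTime? date = row.Field<DateTime?>("OrderDate"); // DBNull becomes null
            Console.WriteLine("{0}: {1}",
                id, date.HasValue ? date.Value.ToShortDateString() : "no date");
        }
    }
}
```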
Please consider using code like this:
SqlDataReader reader = command.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(reader);
int numRows = dt.Rows.Count;
string attended_type = "";
for (int index = 0; index < numRows; index++)
{
    attended_type = dt.Rows[index]["columnname"].ToString();
}
reader.Close();
Unless you have a specific reason to do raw ADO.NET, I would have a look at using an ORM (object-relational mapper) like NHibernate or LINQ to SQL. That way you can query the database and retrieve strongly-typed objects, which are easier to work with, IMHO.
var table = Tables[0]; // get the first table from the DataSet
foreach (DataRow row in table.Rows)
{
    foreach (var item in row.ItemArray)
    {
        Console.Write("Value: " + item);
    }
}
Please note that opening and closing the connection is not necessary when using a DataAdapter.
So I suggest updating the code to remove the explicit Open and Close calls:
SqlDataAdapter adapt = new SqlDataAdapter(cmd);
conn.Open(); // this line is unnecessary
Console.WriteLine("connection opened successfully");
adapt.Fill(table);
conn.Close(); // this line is unnecessary
Console.WriteLine("connection closed successfully");
Reference Documentation
The code shown in this example does not explicitly open and close the
Connection. The Fill method implicitly opens the Connection that the
DataAdapter is using if it finds that the connection is not already
open. If Fill opened the connection, it also closes the connection
when Fill is finished. This can simplify your code when you deal with
a single operation such as a Fill or an Update. However, if you are
performing multiple operations that require an open connection, you
can improve the performance of your application by explicitly calling
the Open method of the Connection, performing the operations against
the data source, and then calling the Close method of the Connection.
You should try to keep connections to the data source open as briefly
as possible to free resources for use by other client applications.
