I'm trying to use SqlBulkCopy to copy data into a SQL Server database table; however, it is (wrongly) saying that the columns don't match. They do match. If I use a breakpoint to see the names of the columns being mapped, they're correct. The error message shows the name of the column, and it is correct.
This is my method. I have an identical method that does work, and the only difference is where it gets the column names from. The strings containing the column names, however, are EXACTLY identical.
public static bool ManualMapImport(DataTable dataTable, string table)
{
    if (dataTable != null)
    {
        SqlConnection connection = new SqlConnection(connectionString);
        SqlBulkCopy import = new SqlBulkCopy(connection);
        import.DestinationTableName = "[" + table + "]";

        foreach (string s in Global.SelectedColumns)
        {
            /* The s string variable here is the EXACT same as
               the c.ToString() in the other method below */
            if (ColumnExists(table, s))
                import.ColumnMappings.Add(s, s);
            else
                return false;
        }

        connection.Open();
        import.WriteToServer(dataTable); // Error happens on this line
        connection.Close();
        return true;
    }
    else
    {
        return false;
    }
}
This is the almost identical, working method:
public static bool AutoMapImport(DataTable dataTable, string table)
{
    if (dataTable != null)
    {
        SqlConnection connection = new SqlConnection(connectionString);
        SqlBulkCopy import = new SqlBulkCopy(connection);
        import.DestinationTableName = "[" + table + "]";

        foreach (DataColumn c in dataTable.Columns)
        {
            if (ColumnExists(table, c.ToString()))
                import.ColumnMappings.Add(c.ToString(), c.ToString());
            else
                return false;
        }

        connection.Open();
        import.WriteToServer(dataTable);
        connection.Close();
        return true;
    }
    else
    {
        return false;
    }
}
If it helps, the column names are: ACT_Code, ACT_Paid, ACT_Name, ACT_Terminal_Code, ACT_TCustom1, ACT_TCustom2. These are exactly the same in the database itself. I'm aware that SqlBulkCopy mappings are case-sensitive, and the column names are indeed correct.
This is the error message:
An unhandled exception of type 'System.InvalidOperationException'
occurred in System.Data.dll
Additional information: The given ColumnName 'ACT_Code' does not match
up with any column in data source.
Hopefully I'm just missing something obvious here, but I am well and truly lost.
Many thanks.
EDIT: For anyone happening to have the same problem as me, here's how I fixed it.
Instead of having the ManualMapImport() method be a near-clone of AutoMapImport(), I had it loop through the columns of the DataTable and change their names, then called AutoMapImport() with the amended DataTable, eliminating the need to map with plain strings at all.
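Roughly, the amended method ended up looking like this (a simplified sketch; it assumes Global.SelectedColumns is a List<string> that lines up positionally with the DataTable's columns):

public static bool ManualMapImport(DataTable dataTable, string table)
{
    if (dataTable == null || dataTable.Columns.Count != Global.SelectedColumns.Count)
        return false;

    // Rename the DataTable's columns to the selected names, then let
    // AutoMapImport() do the mapping from the (now correct) column names.
    for (int i = 0; i < dataTable.Columns.Count; i++)
        dataTable.Columns[i].ColumnName = Global.SelectedColumns[i];

    return AutoMapImport(dataTable, table);
}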
According to MSDN (here), the DataColumn.ToString() method returns "The Expression value, if the property is set; otherwise, the ColumnName property.".
I've always found the ToString() method to be wonky anyway (can change based on current state/conditions), so I'd recommend using the ColumnName property instead, as that's what you are actually trying to get out of ToString().
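For example (a quick sketch; the exact ToString() output when an Expression is set is an implementation detail, the point is that it stops being just the name):

var table = new DataTable();
table.Columns.Add("ACT_TCustom1", typeof(decimal));
DataColumn col = table.Columns.Add("ACT_Paid", typeof(decimal));
Console.WriteLine(col.ToString());   // "ACT_Paid"
col.Expression = "ACT_TCustom1 * 2"; // turn it into a computed column
Console.WriteLine(col.ToString());   // no longer just the plain column name
Console.WriteLine(col.ColumnName);   // still "ACT_Paid"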
OK, failing that, then I'd have to guess that this is a problem with case sensitivity in the names of the columns in the source DataTable, as SqlBulkCopy is very case-sensitive even if the SQL DB is not. To address this, when you check whether the column exists, return/use the actual string from the DataTable's column list itself, rather than whatever string was passed in. This should fix up any case or accent differences that your ColumnExists routine might be ignoring.
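For example, something along these lines (ResolveColumnName is a hypothetical helper; DataColumnCollection's string indexer falls back to a case-insensitive match when no exact-case match exists):

// Return the column name exactly as the DataTable spells it, or null if absent.
static string ResolveColumnName(DataTable dataTable, string name)
{
    DataColumn match = dataTable.Columns[name]; // case-insensitive fallback lookup
    return match?.ColumnName;
}

// Usage: map with the table's own spelling rather than the passed-in string.
string actual = ResolveColumnName(dataTable, s);
if (actual != null)
    import.ColumnMappings.Add(actual, actual);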
I had the same problem... The message might seem a bit misleading, as it suggests you didn't perform the correct mapping.
To find the root of the problem, I decided to go step by step, adding table columns and calling the WriteToServer method.
Assuming you have a valid column mapping, you will have to ensure the following between the source DataTable and the destination table:
The column types and lengths (!) do match
You have provided a valid value for each non-empty (NOT NULL) destination column
If you don't control your identity column values and would like SQL Server to do this job for you, make sure not to specify the SqlBulkCopyOptions.KeepIdentity option. In that case, don't add the identity column to your source either (see the sketch after this list).
This should be all for your bulk insert to work. Hope it helps.
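To illustrate the identity point, a hedged sketch (the table and column names and the sourceTable variable are made up): leave out SqlBulkCopyOptions.KeepIdentity and don't map the identity column, and the server assigns the values.

using (var bulk = new SqlBulkCopy(connectionString)) // no KeepIdentity option
{
    bulk.DestinationTableName = "[MyTable]";
    // Map every column EXCEPT the identity column; SQL Server fills it in.
    bulk.ColumnMappings.Add("Name", "Name");
    bulk.ColumnMappings.Add("Amount", "Amount");
    bulk.WriteToServer(sourceTable);
}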
I'm getting this exception when trying to do an SqlBulkCopy from a DataTable.
Error Message: The given value of type String from the data source cannot be converted to type money of the specified target column.
Target Site: System.Object ConvertValue(System.Object, System.Data.SqlClient._SqlMetaData, Boolean, Boolean ByRef, Boolean ByRef)
I understand what the error is saying, but how can I get more information, such as the row/field this is happening on? The DataTable is populated by a 3rd party and can contain up to 200 columns and up to 10k rows. The columns that are returned depend on the request sent to the 3rd party. All of the DataTable columns are of string type. The columns in my database are not all varchar; therefore, prior to executing the insert, I format the DataTable values using the following code (non-important code removed):
//--- create lists to hold the special data type columns
List<DataColumn> IntColumns = new List<DataColumn>();
List<DataColumn> DecimalColumns = new List<DataColumn>();
List<DataColumn> BoolColumns = new List<DataColumn>();
List<DataColumn> DateColumns = new List<DataColumn>();

foreach (DataColumn Column in dtData.Columns)
{
    //--- find the field map that tells the system where to put this piece of data from the 3rd party
    FieldMap ColumnMap = AllFieldMaps.Find(a => a.SourceFieldID.ToLower() == Column.ColumnName.ToLower());

    //--- get the data type for this field in our system
    Type FieldDataType = Nullable.GetUnderlyingType(DestinationType.Property(ColumnMap.DestinationFieldName).PropertyType);

    //--- find the field data type and add to respective list
    switch (Type.GetTypeCode(FieldDataType))
    {
        case TypeCode.Int16:
        case TypeCode.Int32:
        case TypeCode.Int64: { IntColumns.Add(Column); break; }
        case TypeCode.Boolean: { BoolColumns.Add(Column); break; }
        case TypeCode.Double:
        case TypeCode.Decimal: { DecimalColumns.Add(Column); break; }
        case TypeCode.DateTime: { DateColumns.Add(Column); break; }
    }

    //--- add the mapping for the column on the BulkCopy object
    BulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(Column.ColumnName, ColumnMap.DestinationFieldName));
}
//--- loop through all rows and convert the values to data types that match our database's data type for that field
foreach (DataRow dr in dtData.Rows)
{
    //--- convert int values
    foreach (DataColumn IntCol in IntColumns)
        dr[IntCol] = Helpers.CleanNum(dr[IntCol].ToString());

    //--- convert decimal values
    foreach (DataColumn DecCol in DecimalColumns)
        dr[DecCol] = Helpers.CleanDecimal(dr[DecCol].ToString());

    //--- convert bool values
    foreach (DataColumn BoolCol in BoolColumns)
        dr[BoolCol] = Helpers.ConvertStringToBool(dr[BoolCol].ToString());

    //--- convert date values
    foreach (DataColumn DateCol in DateColumns)
        dr[DateCol] = dr[DateCol].ToString().Replace("T", " ");
}
try
{
    //--- do bulk insert
    BulkCopy.WriteToServer(dtData);
    transaction.Commit();
}
catch (Exception ex)
{
    transaction.Rollback();
    //--- handles error
    //--- this is where I need to find the row & column having an issue
}
This code should format all values for their destination fields. In the case of this error (the decimal), the cleanup function removes any character that is not 0-9 or . (decimal point). The field throwing the error is nullable in the database.
The level 2 exception has this error:
Error Message: Failed to convert parameter value from a String to a Decimal.
Target Site: System.Object CoerceValue(System.Object, System.Data.SqlClient.MetaType, Boolean ByRef, Boolean ByRef, Boolean)
and the level 3 exception has this error:
Error Message: Input string was not in a correct format
Target Site: Void StringToNumber(System.String, System.Globalization.NumberStyles, NumberBuffer ByRef, System.Globalization.NumberFormatInfo, Boolean)
Does anyone have any ideas on how to fix this, or how to get more info?
For the people stumbling across this question and getting a similar error message in regards to an nvarchar instead of money:
The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.
This could be caused by a too-short column.
For example, if your column is defined as nvarchar(20) and you have a 40 character string, you may get this error.
Source
Please use SqlBulkCopyColumnMapping.
Example:
private void SaveFileToDatabase(string filePath)
{
    string strConnection = System.Configuration.ConfigurationManager.ConnectionStrings["MHMRA_TexMedEvsConnectionString"].ConnectionString;
    String excelConnString = String.Format("Provider=Microsoft.ACE.OLEDB.12.0;Data Source={0};Extended Properties=\"Excel 12.0\"", filePath);

    //Create Connection to Excel work book
    using (OleDbConnection excelConnection = new OleDbConnection(excelConnString))
    {
        //Create OleDbCommand to fetch data from Excel
        using (OleDbCommand cmd = new OleDbCommand("Select * from [Crosswalk$]", excelConnection))
        {
            excelConnection.Open();
            using (OleDbDataReader dReader = cmd.ExecuteReader())
            {
                using (SqlBulkCopy sqlBulk = new SqlBulkCopy(strConnection))
                {
                    //Give your Destination table name
                    sqlBulk.DestinationTableName = "PaySrcCrosswalk";

                    // Map each source column to the destination column of the same
                    // name; this assumes names and data types match on both sides.
                    for (int i = 0; i < dReader.FieldCount; i++)
                        sqlBulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping(dReader.GetName(i), dReader.GetName(i)));

                    sqlBulk.WriteToServer(dReader);
                }
            }
        }
    }
}
I don't believe "Please use..." plus some random code unrelated to the question makes a good answer, but I do believe the spirit was correct, so I decided to answer this properly.
When you are using Sql Bulk Copy, it attempts to align your input data directly with the data on the server. So, it takes the Server Table and performs a SQL statement similar to this:
INSERT INTO [schema].[table] (col1, col2, col3) VALUES
Therefore, if you give it columns 1, 3, and 2, EVEN THOUGH your names may match (e.g. col1, col3, col2), it will insert like so:
INSERT INTO [schema].[table] (col1, col2, col3) VALUES
('col1', 'col3', 'col2')
It would be extra work and overhead for the SQL bulk insert to determine a column mapping on its own, so instead it lets you choose: either ensure your code and your SQL table columns are in the same order, or explicitly state that they should be aligned by column name.
Therefore, if your issue is mis-alignment of the columns, which is probably the majority of the cause of this error, this answer is for you.
TLDR
using System.Data;
using System.Linq; // needed for Cast<T>() and ToList()
//...
myDataTable.Columns.Cast<DataColumn>().ToList().ForEach(x =>
    bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(x.ColumnName, x.ColumnName)));
This will take your existing DataTable, which you are attempting to insert into your created BulkCopy object, and it will just explicitly map name to name. Of course if, for some reason, you decided to name your DataTable columns differently than your SQL Server columns... that's on you.
@Corey - It just simply strips out all invalid characters. However, your comment made me think of the answer.
The problem was that many of the fields in my database are nullable. When using SqlBulkCopy, an empty string is not inserted as a null value. So for my fields that are not varchar (bit, int, decimal, datetime, etc.), it was trying to insert an empty string, which obviously is not valid for those data types.
The solution was to modify my validation loop as follows (repeated for each data type that is not string):
//--- convert decimal values
foreach (DataColumn DecCol in DecimalColumns)
{
    if (string.IsNullOrEmpty(dr[DecCol].ToString()))
        dr[DecCol] = DBNull.Value; //--- must be a database null, not an empty string
    else
        dr[DecCol] = Helpers.CleanDecimal(dr[DecCol].ToString());
}
After making the adjustments above, everything inserts without issues.
Make sure that the properties in your entity class have get/set accessors and are declared in the same order as the columns in the target table.
Not going to be everyone's fix, but it was for me:
So, I ran across this exact issue. The problem I seemed to have was that my DataTable didn't have an ID column, but the target destination had one with a primary key.
When I adapted my DataTable to have an id, the copy worked perfectly.
In my scenario, the Id column isn't very important to have as the primary key, so I deleted this column from the target destination table and the SqlBulkCopy is working without issue.
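If you'd rather keep the identity column in the destination, a hedged sketch of the first approach (the "Id" name and int type are illustrative):

// Add a placeholder Id column at position 0 so the DataTable's layout matches
// the destination table. Without SqlBulkCopyOptions.KeepIdentity the server
// ignores these values and generates its own identity values.
DataColumn id = dataTable.Columns.Add("Id", typeof(int));
id.SetOrdinal(0);
foreach (DataRow row in dataTable.Rows)
    row["Id"] = 0; // placeholder only; the real value is assigned by SQL Server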
There is another issue to take care of when you map columns, which is string length. For example, if the destination field is TK_NO nvarchar(50), the value you send will have to fit within the length of the destination field.
I got the same error "occasionally". Note: the column mapping was all correct, which is why the code worked most of the time.
And I found the root cause to be a string length issue. The target table column had data type nvarchar(255), whereas the value being sent was more than 255 characters long.
Fixed it by increasing the column length in the DB.
P.S.: Sadly, the error message won't tell you which column of the table is causing this error. You have to guess/figure it out manually (or see the sketch below).
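To take some of the guesswork out, a rough sketch that scans the source DataTable for over-length string values before the bulk copy (table is your source DataTable; maxLengths is a hypothetical dictionary you would fill from the destination schema, e.g. INFORMATION_SCHEMA.COLUMNS):

var maxLengths = new Dictionary<string, int> { { "SomeColumn", 255 } }; // hypothetical limits

for (int r = 0; r < table.Rows.Count; r++)
{
    foreach (DataColumn col in table.Columns)
    {
        string value = table.Rows[r][col] as string;
        if (value != null
            && maxLengths.TryGetValue(col.ColumnName, out int max)
            && value.Length > max)
            Console.WriteLine($"Row {r}, column '{col.ColumnName}': {value.Length} chars exceeds {max}");
    }
}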
My issue was with the column mapping rather than the values. I was doing an extract from a dev system, creating the destination table, and bulk copying the content; then extracting from a prod system, adjusting the destination table, and bulk copying the content. As a result, the column order of the two bulk copies wasn't matching.
// explicitly set the column mapping even though the source & destination column
// names are the same, as differing column orders would otherwise make the bulk
// copy throw data conversion errors
foreach (DataColumn column in p_dataTable.Columns)
{
    bulkCopy.ColumnMappings.Add(
        new()
        {
            SourceColumn = column.ColumnName,
            DestinationColumn = column.ColumnName
        }
    );
}

bulkCopy.WriteToServer(p_dataTable);
Short answer: change the type from nvarchar(size) to nvarchar(MAX).
It is an issue of string length.
Note: all the above suggestions made me write this short answer.
Check the data you are writing to the server. The data may contain a delimiter that shouldn't be there, like:
045|2272575|0.000|0.000|2013-10-07
045|2272585|0.000|0.000;2013-10-07
Your delimiter is '|', but the data contains a ';'. That is why you are getting the error.
Hi, I am currently working with a report in Visual Studio 2008. I use the query below to create a dataset. This works correctly in SQL / SSMS and in the dataset when I test the query.
SELECT
CASE WHEN Make LIKE 'FO%' THEN 'Ford'
WHEN Make LIKE 'HON%' THEN 'Honda'
END Make,
CASE WHEN model LIKE 'CIV%' THEN 'Civic'
WHEN model LIKE '%AC%' THEN 'Accord'
ELSE model
END model,
year, AVG(Fuel.MPG) as AVGMPG
From cars, Fuel
Where Fuel.ID=cars.ID
AND year > 2003
AND Make is not NULL
AND model is not NULL
AND year is not NULL
Group by Make, model, year
When I have a report reference the dataset it generates the following error;
An error has occurred during report processing. Exception has been
thrown by the target of an invocation. Failed to enable constraints.
One or more rows contain values violating non-null, unique, or
foreign-key constraints.
Since the actual SQL statement is larger and involves several CASE statements, all of which work, I have narrowed it down to the else portion of the statement.
For background, I am trying to pull all the data from model but group certain values that are similar, but still pull the rest of the data as well.
My best guess here is that for some rows, the model column contains null, while the data set in your report does not allow null for the column value.
SELECT
CASE WHEN Make LIKE 'FO%' THEN 'Ford'
WHEN Make LIKE 'HON%' THEN 'Honda'
END Make
With no default case, rows matching neither condition end up with a null Make, and you'll end up with the constraint error. Change it to
SELECT
CASE WHEN Make LIKE 'FO%' THEN 'Ford'
WHEN Make LIKE 'HON%' THEN 'Honda'
ELSE ''
END Make
Here's a quick way to find the column in error and the actual error:
var ds = new DataSet();
ds.EnforceConstraints = false;

/* Fill dataset */

try
{
    ds.EnforceConstraints = true;
}
catch
{
    foreach (DataTable tbl in ds.Tables)
    {
        if (tbl.HasErrors)
        {
            foreach (DataRow row in tbl.GetErrors())
            {
                foreach (DataColumn col in row.GetColumnsInError())
                {
                    Console.WriteLine(row.GetColumnError(col));
                }
            }
        }
    }
}
I have come across a problem in using the DataAdapter, which I hope someone can help with. Basically I am creating a system, which is as follows:
Data is read in from a data source (MS-Access, SQL Server or Excel), converted to data tables and inserted into a local SQL Server database, using DataAdapters. This bit works fine. The SQL server table has a PK, which is an identity field with auto increment set to on.
Subsequent data loads read in the data from the source and compare it to what we already have. If the record is missing then it is added (this works fine). If the record is different then it needs to be updated (this doesn't work).
When doing the differential data load I create a data table which reads in the schema from the destination table (SQL server) and ensures it has the same columns etc.
The PK in the destination table is column 0, so when a record is inserted, all of the values from column 1 onwards are set (as mentioned, this works perfectly). I don't change the row status for items I am adding. The PK in the data table is set correctly and I can confirm this.
When updating data I set column 0 (the PK column) to be the value of the record I am updating and set all of the columns to be the same as the source data.
For updated records I call AcceptChanges and SetModified on the row to ensure (I thought) that the application calls the correct method.
The DataAdapter is set with SelectCommand and UpdateCommand using the command builder.
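For reference, the adapter wiring looks roughly like this (names are illustrative, not my exact code):

var updateAdapter = new SqlDataAdapter("SELECT * FROM DestTable", connection);
var builder = new SqlCommandBuilder(updateAdapter);
// The builder derives the INSERT/UPDATE commands from the SelectCommand.
updateAdapter.InsertCommand = builder.GetInsertCommand();
updateAdapter.UpdateCommand = builder.GetUpdateCommand();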
When I run it, I have traced it using SQL Profiler and can see that the insert command is being run correctly, but the update command isn't being run at all, which is the crux of the problem. For reference, an insert table will look something like the following:
PK   Value1   Value2   RowState
==   ======   ======   ========
124  Test1    Test2    Added
123  Test3    Test4    Updated
Couple of things to be aware of....
I have tested this by loading the row to be changed into the DataTable, changing some column fields and running Update, and this works. However, this is impractical for my solution because the data is HUGE (>1 GB), so I can't simply load it into a DataTable without taking a huge performance hit. What I am doing is creating the data table with a max of 500 rows and then running the Update. Testing during the initial data load showed this to be the most efficient in terms of memory usage and performance. The data table is cleared after each batch is run.
Anyone any ideas on where I am going wrong here?
Thanks in advance
Andrew
==========Update==============
Following is the code to create the insert/update rows
private static void AddNewRecordToDataTable(DbDataReader pReader, ref DataTable pUpdateDataTable)
{
    // create a new row in the table
    DataRow pUpdateRow = pUpdateDataTable.NewRow();

    // loop through each item in the data reader - setting all the columns apart from the PK
    for (int addCount = 0; addCount < pReader.FieldCount; addCount++)
    {
        pUpdateRow[addCount + 1] = pReader[addCount];
    }

    // add the row to the update table
    pUpdateDataTable.Rows.Add(pUpdateRow);
}
private static void AddUpdateRecordToDataTable(DbDataReader pReader, int pKeyValue,
    ref DataTable pUpdateDataTable)
{
    DataRow pUpdateRow = pUpdateDataTable.NewRow();

    // set the first column (PK) to the value passed in
    pUpdateRow[0] = pKeyValue;

    // loop for each column apart from the PK column
    for (int addCount = 0; addCount < pReader.FieldCount; addCount++)
    {
        pUpdateRow[addCount + 1] = pReader[addCount];
    }

    // add the row to the table and then mark it as an update
    pUpdateDataTable.Rows.Add(pUpdateRow);
    pUpdateRow.AcceptChanges();
    pUpdateRow.SetModified();
}
The following code is used to actually do the update:
updateAdapter.Fill(UpdateTable);
updateAdapter.Update(UpdateTable);
UpdateTable.AcceptChanges();
The following is used to create the data table to ensure it has the same fields/data types as the source data
private static DataTable CreateDataTable(DbDataReader pReader)
{
    DataTable schemaTable = pReader.GetSchemaTable();
    DataTable resultTable = new DataTable(<tableName>); // edited out personal info

    // loop for each row in the schema table
    try
    {
        foreach (DataRow dataRow in schemaTable.Rows)
        {
            // create a new DataColumn object and set values depending
            // on the current DataRow's values
            DataColumn dataColumn = new DataColumn();
            dataColumn.ColumnName = dataRow["ColumnName"].ToString();
            dataColumn.DataType = Type.GetType(dataRow["DataType"].ToString());
            dataColumn.ReadOnly = (bool)dataRow["IsReadOnly"];
            dataColumn.AutoIncrement = (bool)dataRow["IsAutoIncrement"];
            dataColumn.Unique = (bool)dataRow["IsUnique"];
            resultTable.Columns.Add(dataColumn);
        }
    }
    catch (Exception ex)
    {
        string message = "Unable to create data table " + ex.Message;
        throw new Exception(message, ex);
    }

    return resultTable;
}
In case anyone is interested, I did manage to get around the problem, but never managed to get the data adapter to work. Basically what I did was as follows:
Create a list of objects with an index and a list of field values as members.
Read in the rows that have changed and store the values from the source data (i.e. the values that will overwrite the current ones in the object). In addition, I create a comma-separated list of the indexes.
When I am finished, I use the comma-separated list in a SQL IN statement to return the rows and load them into my data adapter.
For each one I run a LINQ query against the index and extract the new values, updating the dataset. This sets the row status to modified.
I then run the update and the rows are updated correctly.
This isn't the quickest or neatest solution, but it does work and allows me to run the changes in batches; a rough sketch follows.
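Roughly, each batch looks like this (a simplified sketch; ChangedRecord and the table/column names are illustrative, and the indexes are integer PK values, so the IN list is safe to build as a string):

// Each changed source row: its PK value plus the new field values.
class ChangedRecord
{
    public int Index;
    public List<object> Values;
}

void ApplyBatch(List<ChangedRecord> changed, SqlConnection connection)
{
    // Load only the rows in this batch via an IN list of PK values.
    string idList = string.Join(",", changed.Select(c => c.Index));
    var adapter = new SqlDataAdapter(
        "SELECT * FROM DestTable WHERE PK IN (" + idList + ")", connection);
    var builder = new SqlCommandBuilder(adapter); // supplies the UPDATE command

    var batch = new DataTable();
    adapter.Fill(batch);

    foreach (var change in changed)
    {
        DataRow row = batch.Rows.Cast<DataRow>()
            .First(r => (int)r["PK"] == change.Index);
        for (int i = 0; i < change.Values.Count; i++)
            row[i + 1] = change.Values[i]; // row state becomes Modified
    }

    adapter.Update(batch); // issues the UPDATE statements
}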
Thanks
Andrew
I'm doing an operation on a dataset containing data from a SQL table named Test_1 and then getting the updated records using the DataSet.GetChanges(DataRowState.Modified) function. Then I try to save the dataset containing the updated records to a different table than the source one (the table is named Test and has the same structure as Test_1) using the following statement:
sqlDataAdapter.Update(changesDataSet, "Test");
I'm getting the following error: Update unable to find TableMapping['Test'] or DataTable 'Test'.
I'm new to ADO.NET and don't even know if it's something that's possible. Any advice is welcome.
Just to provide a bit of context: ETL jobs are importing data into temp tables with the same structure as the original but with a _jobid suffix. Right after, a rule engine does validation before updating the original table.
Why don't you create an update trigger on the table instead that will insert into the other table?
If test and test_1 are totally equal, this works:
DataSet1 ds = new DataSet1();
var da = new DataSet1TableAdapters.DepositTableAdapter();
var da1 = new DataSet1TableAdapters.Deposit_1TableAdapter();
da.Fill(ds.Deposit);

foreach (DataSet1.DepositRow row in ds.Deposit.Rows)
{
    if (row.ID == 3)
    {
        row.Amount++;
    }

    foreach (var c in row.ItemArray)
    {
        Console.Write(c);
    }
    Console.WriteLine("");
}

Console.WriteLine(ds.Deposit.GetChanges(System.Data.DataRowState.Modified).Rows);

var updateTable = new DataSet1.Deposit_1DataTable();
foreach (DataSet1.DepositRow row in ds.Deposit.GetChanges(System.Data.DataRowState.Modified).Rows)
{
    updateTable.ImportRow((System.Data.DataRow)row);
}
da1.Update(updateTable);
Is test_1 always empty after the rule engine has worked with it?
Do test_1 and test contain exactly the same rows?
I hope I can provide further help if you answer my questions.