I have a simple table on SQL Server 2016:
CREATE TABLE AgentDownload
(
Download VARCHAR(max)
)
I'm using the below code to populate this table using the SqlBulkCopy class in C#.
var agentDataTable = new DataTable();
agentDataTable.Clear();
agentDataTable.Columns.Add("Download");
var agentDownloadRow = agentDataTable.NewRow();
var veryLongString = new string('a', 200000); // 200,000 a's repeated
agentDownloadRow["Download"] = veryLongString;
agentDataTable.Rows.Add(agentDownloadRow);
using (var connection = new SqlConnection("Data Source=Server;Initial Catalog=Database;Integrated Security=True;"))
{
connection.Open();
var transaction = connection.BeginTransaction();
using (var sbCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.CheckConstraints, transaction))
{
sbCopy.BulkCopyTimeout = 0;
sbCopy.BatchSize = 10000;
sbCopy.DestinationTableName = "AgentDownload";
sbCopy.WriteToServer(agentDataTable);
}
transaction.Commit();
}
When I retrieve this entry from the database, the cell value is truncated at 65,535 characters. This looks like a limit of some sort, but I'm not sure where it comes from or whether there is any way around it. The column can hold far more characters, and a C# string can too. Is there any way to do this operation?
I thought this might be a limitation of the SqlBulkCopy class, but alternative code such as the below produces the same result:
using (var connection = new SqlConnection("Data Source=Server;Initial Catalog=Database;Integrated Security=True;"))
{
connection.Open();
var transaction = connection.BeginTransaction();
var cmd = connection.CreateCommand();
cmd.Transaction = transaction;
cmd.CommandText = "insert into AgentDownload (Download) values (CAST('' AS VARCHAR(MAX)) + @testVariable)";
var parameter = new SqlParameter
{
ParameterName = "@testVariable",
DbType = DbType.String,
Size = -1,
Value = veryLongString
};
cmd.Parameters.Add(parameter);
cmd.ExecuteNonQuery();
transaction.Commit();
}
It seems you are using SSMS to view the bulk-inserted data. By default, SSMS truncates large values in the results grid at 65,535 characters.
To see the entire value in the results grid for the current query window, change the Max Characters Retrieved value from the SSMS menu under Query-->Query Options-->Results-->Grid.
One can also specify the value for all new query windows under Tools-->Options-->Query Results-->SQL Server-->Results to Grid. However, consider the implications for client memory when many rows contain large values; I suggest changing the global option judiciously.
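To confirm that the stored value itself is intact and only the display is truncated, one can check the length on the server side. A minimal sketch, reusing the connection string and table from the question:
using (var connection = new SqlConnection("Data Source=Server;Initial Catalog=Database;Integrated Security=True;"))
using (var cmd = new SqlCommand("SELECT MAX(LEN(Download)) FROM AgentDownload", connection))
{
    connection.Open();
    // LEN returns bigint for varchar(max) columns, hence the Int64 conversion
    var storedLength = Convert.ToInt64(cmd.ExecuteScalar());
    Console.WriteLine(storedLength); // expect 200000 here, not 65535
}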
Related
I'm trying to import data into a SQL Server database from a .csv file. I have just one problem: the money column throws a FormatException because of the format of the money value.
I tried converting to double, using a period instead of a comma, and splitting on a semicolon instead of a comma in the Split() method, but the exception didn't go away. Does anyone know what to do about this?
It is just an experiment.
My .csv file looks like this:
Database table's columns are:
name, second_Name, nickname, money
Code:
public void Import()
{
SqlCommand command = null;
var lineNumber = 0;
using (SqlConnection conn = DatabaseSingleton.GetInstance())
{
// conn.Open();
using (StreamReader reader = new StreamReader(@"C:\Users\petrb\Downloads\E-Shop\E-Shop\dataImport.csv"))
{
while (!reader.EndOfStream)
{
var line = reader.ReadLine();
if (lineNumber != 0)
{
var values = line.Split(',');
using (command = new SqlCommand("INSERT INTO User_Shop VALUES (@name, @second_Name, @nickname, @money)", conn))
{
command.Parameters.Add(new SqlParameter("#name", values[0].ToString()));
command.Parameters.Add(new SqlParameter("#second_Name", values[1].ToString()));
command.Parameters.Add(new SqlParameter("#nickname", values[2].ToString()));
command.Parameters.Add(new SqlParameter("#money", Convert.ToDecimal(values[3].ToString())));
command.Connection = conn;
command.ExecuteNonQuery();
}
}
lineNumber++;
}
}
conn.Close();
}
Console.WriteLine("Products import completed");
Console.ReadLine();
}
I maintain a package Sylvan.Data.Csv that makes it very easy to bulk import CSV data into SQL Server, assuming the shape of your CSV file matches the target table.
Here is some code that demonstrates how to do it:
SqlConnection conn = ...;
// Get the schema for the target table
var cmd = conn.CreateCommand();
cmd.CommandText = "select top 0 * from User_Shop";
var reader = cmd.ExecuteReader();
var tableSchema = reader.GetColumnSchema();
// apply the schema of the target SQL table to the CSV data.
var options =
new CsvDataReaderOptions {
Schema = new CsvSchema(tableSchema)
};
using var csv = CsvDataReader.Create("dataImport.csv", options);
// use sql bulk copy to bulk insert the data
var bcp = new SqlBulkCopy(conn);
bcp.BulkCopyTimeout = 0;
bcp.DestinationTableName = "User_Shop";
bcp.WriteToServer(csv);
On certain .NET Framework versions, GetColumnSchema might not exist or might throw NotSupportedException. The Sylvan.Data v0.2.0 library can be used to work around this: call the older GetSchemaTable API, then use the Sylvan.Data.Schema type to convert it to the new-style IReadOnlyCollection<DbColumn> schema:
DataTable schemaDT = reader.GetSchemaTable();
var tableSchema = Schema.FromSchemaTable(schemaDT);
Try this:
SqlParameter moneyParam = new SqlParameter("@money", SqlDbType.Money);
moneyParam.Value = new SqlMoney(Convert.ToDecimal(values[3]));
command.Parameters.Add(moneyParam);
Not sure if it'll work, but it seems to make sense to me.
The problem, I think, is that when you use the SqlParameter constructor with just a name and a value, it infers the database type from the .NET type of the value. In this case you're passing a decimal (or whatever), which maps to the DB type 'decimal', while your DB schema uses the DB type 'money', so explicitly setting the DB type in the parameter constructor should help.
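Separately, the FormatException itself comes from Convert.ToDecimal using the machine's current culture. Assuming the file writes money with a period as the decimal separator (the sample isn't shown, so this is a guess), parsing with an explicit culture avoids the problem. A minimal sketch:
using System.Globalization;
// Assumption: the CSV stores money like "12.50" (period as decimal separator).
// Parsing with the invariant culture avoids depending on the machine's
// regional settings, which may expect a comma instead.
decimal money = decimal.Parse(values[3], NumberStyles.Number, CultureInfo.InvariantCulture);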
I have three columns in an Access database table (DATA), as shown below.
I just want to delete some rows based on two conditions in the WHERE clause of the SQL query; for example, delete the row where Name = "A" and Date = "1/1/2017".
I used DELETE from DATA Where Name='A' and Date='1/1/2017'
This gives a "type mismatch" error!
Here is the code in C#:
using (OleDbConnection thisConnection = new OleDbConnection(connectionname))
{
string deletequery = "DELETE FROM DATA WHERE [Name] = 'A' And [Date] = '1/1/2017'";
OleDbCommand myAccessCommandDelete = new OleDbCommand(deletequery, thisConnection);
thisConnection.Open();
myAccessCommandDelete.ExecuteNonQuery();
thisConnection.Close();
}
The best way to pass values that a query will use to a database engine is through the parameters collection, specifying exactly the type of each parameter:
using (OleDbConnection thisConnection = new OleDbConnection(connectionname))
{
string deletequery = #"DELETE FROM DATA WHERE [Name] = #name And
[Date] = #date";
OleDbCommand myAccessCommandDelete = new OleDbCommand(deletequery, thisConnection);
thisConnection.Open();
myAccessCommandDelete.Parameters.Add("@name", OleDbType.VarWChar).Value = "A";
myAccessCommandDelete.Parameters.Add("@date", OleDbType.Date).Value = new DateTime(2017,1,1);
myAccessCommandDelete.ExecuteNonQuery();
// not needed -> thisConnection.Close();
}
In this way you leave no room for interpretation (conversion from string to date) of your values; you tell the db engine exactly what each value is. Note that OLE DB matches parameters by position rather than by name, so add them in the order they appear in the statement. And of course, if you specify the correct type, you can't get a type mismatch error.
I would like to insert a large list of ids into a SQL table. The following way works, but it takes very long. What is a better way to do this to increase the speed?
using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString))
{
string query = "";
foreach (var id in ids) // count = 60000
{
query += "INSERT INTO [table] (id) VALUES (" + id + ");";
}
SqlCommand command = new SqlCommand(query, connection);
connection.Open();
using (SqlDataReader reader = command.ExecuteReader())
{
reader.Close();
}
connection.Close();
}
You can use the SqlBulkCopy class to insert large amounts of data, something like this:
// define a DataTable with the columns of your target table
DataTable tblToInsert = new DataTable();
tblToInsert.Columns.Add(new DataColumn("SomeValue", typeof (int)));
// insert your data into that DataTable
for (int index = 0; index < 60000; index++)
{
DataRow row = tblToInsert.NewRow();
row["SomeValue"] = index;
tblToInsert.Rows.Add(row);
}
// set up your SQL connection
using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString))
{
// define your SqlBulkCopy
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
// give it the name of the destination table WHICH MUST EXIST!
bulkCopy.DestinationTableName = "BulkTestTable";
// measure time needed
Stopwatch sw = new Stopwatch();
sw.Start();
// open connection, bulk insert, close connection
connection.Open();
bulkCopy.WriteToServer(tblToInsert);
connection.Close();
// stop time measurement
sw.Stop();
long milliseconds = sw.ElapsedMilliseconds;
}
On my system (PC, 32GB RAM, SQL Server 2014) I get those 60'000 rows inserted in 135 - 185 milliseconds.
Consider Table-Valued Parameters. They are an easy way to send a batch of data into a stored procedure that then handles it on the SQL side, and they aren't restricted in the ways most of the other approaches you will see are (insert limits, etc.).
In the database create a custom Type that has the schema of your table.
CREATE TYPE dbo.TableType AS TABLE
( ID int )
Create a DataTable that matches your table schema (including column name and order).
DataTable newTableRecords = new DataTable();
newTableRecords.Columns.Add("ID", typeof(int));
// Insert your records, e.g. newTableRecords.Rows.Add(42);
Create a stored procedure that receives a table parameter, and inserts the records from that parameter into your real table.
CREATE PROCEDURE usp_InsertTableRecords
(@tvpNewTableRecords dbo.TableType READONLY)
AS
BEGIN
INSERT INTO dbo.[Table](ID)
SELECT tvp.ID FROM @tvpNewTableRecords AS tvp;
END
Call the procedure from your application code, passing in your data table as a parameter.
using (connection)
{
// Configure the SqlCommand and SqlParameter.
SqlCommand insertCommand = new SqlCommand(
"usp_InsertTableRecords", connection);
insertCommand.CommandType = CommandType.StoredProcedure;
SqlParameter tvpParam = insertCommand.Parameters.AddWithValue(
"@tvpNewTableRecords", newTableRecords);
tvpParam.SqlDbType = SqlDbType.Structured;
// Execute the command.
insertCommand.ExecuteNonQuery();
}
I've had really great performance at very large volumes with this approach, and it is nice because it keeps everything set-based, without arbitrary limits such as the 1,000-row cap on the INSERT INTO (Table) VALUES (1),(2),(3)... approach.
using (var connection = new SqlConnection(...))
{
string sql = "SELECT * FROM tableA";
using (var command = new SqlCommand(sql,connection))
{
using (var reader = command.ExecuteReader(...))
{
//***************Sample Start
string sql2 = "INSERT into tableB(column1) VALUES('"+reader["column1"]+"')";
using (var command2 = new SqlCommand(sql2,connection))
{
...
}
//***************Sample End
}
}
}
I believe the above code snippet is the best practice for dealing with SQL in C#. Now, after I retrieve a list of records from tableA, for each of the rows I would like to insert into tableB.
However, it's throwing an exception
There is already an open DataReader associated with this Command which must be closed first
I know this problem can be solved by creating another method and inserting into the table from there; I'm wondering if there is any other way. Thanks for any input.
You need to use a different sql connection for the insert than for the select...
...but we can do even better. You can re-write this to be one sql statement, like so:
INSERT into tableB(column1)
SELECT column1 FROM tableA
And then run it all at once like this:
string sql = "INSERT into tableB(column1, column2) SELECT column1, #othervalue As column2 FROM tableA;";
using (var connection = new SqlConnection(...))
using (var command = new SqlCommand(sql,connection))
{
command.Parameters.Add("@othervalue", SqlDbType.NVarChar, 50).Value = "something";
connection.Open();
command.ExecuteNonQuery();
}
The single sql statement is typically much faster, and you end up with less code, too. I understand that this is likely a simplified example of your real query, but I promise you: you can re-write it all as one statement.
Additionally, sometimes you still want to do some client-side processing or display with the new records after the insert or update. In that case, you still only need to send one call to the database, but there will be two separate sql statements in that single call. The final code would look more like this:
string sql = "INSERT into tableB(column1, column2) SELECT column1, #othervalue As column2 FROM tableA;"
sql += "SELECT columnn1, #othervalue As column2 FROM tableA;";
using (var connection = new SqlConnection(...))
using (var command = new SqlCommand(sql,connection))
{
command.Parameters.Add("@othervalue", SqlDbType.NVarChar, 50).Value = "something";
connection.Open();
using (var reader = command.ExecuteReader() )
{
while (reader.Read() )
{
//...
}
}
}
And because someone else brought up MARS (multiple active result sets), I'll add that while this can work, I've had mixed results using it for inserts/updates. It seems to work best when everything that shares a connection is only doing reads.
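For reference, MARS is enabled through the connection string. A minimal sketch (server and database names are placeholders):
var connectionString = "Data Source=Server;Initial Catalog=Database;" +
                       "Integrated Security=True;MultipleActiveResultSets=True;";
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // With MARS on, a second command may execute on this connection
    // while a SqlDataReader from another command is still open.
}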
As has been mentioned in comments, you need a separate database connection for the insert. Each connection can handle one active statement at a time, and you have two here - one for the SELECT, one (at a time) for the INSERT.
Try this for instance:
string srcqry = "SELECT * FROM tableA";
using (SqlConnection srccon = new SqlConnection(ConnectionString))
using (SqlCommand srccmd = new SqlCommand(srcqry, srccon))
{
srccon.Open();
using (SqlDataReader src = srccmd.ExecuteReader())
{
string insqry = "INSERT INTO tableB(column1) VALUES(@v1)";
// create new connection and command for insert:
using (SqlConnection inscon = new SqlConnection(ConnectionString))
using (SqlCommand inscmd = new SqlCommand(insqry, inscon))
{
inscmd.Parameters.Add("@v1", System.Data.SqlDbType.NVarChar, 80);
inscon.Open();
while (src.Read())
{
inscmd.Parameters["#v1"].Value = src["column1"];
inscmd.ExecuteNonQuery();
}
}
}
}
Using parameters solves the SQL Injection vulnerability. You should always do this rather than building the query string from raw user input, or from data that you're pulling from a database, or... well, always. Write some helper methods to make it easier if you like, just make sure you do it.
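As a sketch of what such a helper might look like (the name and shape here are an illustration, not part of the original answer):
// Hypothetical helper: runs a parameterized non-query on its own connection.
static int ExecuteNonQuery(string connectionString, string sql,
                           params SqlParameter[] parameters)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddRange(parameters);
        connection.Open();
        return command.ExecuteNonQuery();
    }
}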
Aside from being a bad example, why not just simplify the query to:
insert into TableB (column1) select column1 from TableA
Why does my query return no results?
I'm using C#.
It returns column headers but no rows. Is there a problem with my select statement?
Here's my code:
conn = new SQLiteConnection("Data Source=local.db;Version=3;New=False;Compress=True;");
DataTable data = new DataTable();
SQLiteDataReader reader;
using (SQLiteTransaction trans = conn.BeginTransaction())
{
using (SQLiteCommand mycommand = new SQLiteCommand(conn))
{
mycommand.CommandText = "SELECT * FROM TAGTABLE WHERE TAG = '"+tag+"' ;";
reader = mycommand.ExecuteReader();
}
trans.Commit();
data.Load(reader);
reader.Close();
reader.Dispose();
trans.Dispose();
}
return data;
The TAGTABLE has following fields:
TID int,
Tag varchar(500),
FilePath varchar(1000)
You don't need the transaction, try the following:
DataTable data = new DataTable();
using (SQLiteConnection conn = new SQLiteConnection("Data Source=local.db;Version=3;New=False;Compress=True;"))
using (SQLiteCommand mycommand = new SQLiteCommand(conn))
{
mycommand.CommandText = "SELECT * FROM TAGTABLE WHERE TAG = @tag;";
mycommand.Parameters.AddWithValue("@tag", tag);
conn.Open();
using (SQLiteDataReader reader = mycommand.ExecuteReader())
{
data.Load(reader);
}
}
return data;
The most likely reason this won't return anything is if the SELECT doesn't yield any results.
Also note that anything implementing the IDisposable interface can be used in conjunction with the using statement, so manual closing / disposal of objects afterwards is not required.
Notice that the SQL has changed to use a parameterized query, this will help reduce the likelihood of SQL injection attacks, and is generally cleaner.
Since you don't show sample data and what should be returned, some general pointers only:
The way you build the SQL is wide open to SQL injection, a serious security issue (see the sketch at the end of this answer)
Depending on the value of tag (for example, if it contains '), the above SQL statement would do something you don't expect
Since everything is wrapped in using (a good thing!), the question is whether some exception is thrown inside the using block (check with the debugger)
Why are you using a transaction? I can't see any reason that makes it necessary...
Please show some sample data with the param value and expected result...
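To illustrate the injection point from the first pointer above (the tag value here is made up):
var tag = "x' OR '1'='1";
// Concatenation produces:
//   SELECT * FROM TAGTABLE WHERE TAG = 'x' OR '1'='1' ;
// which matches every row instead of the intended one, and a lone
// stray quote in tag would break the statement entirely.
var sql = "SELECT * FROM TAGTABLE WHERE TAG = '" + tag + "' ;";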