I want to insert a GDAL OSGeo.OGR feature into my SQL Server database. I connect to SQL Server and retrieve some features from a table with GDAL OGR, and I want to insert these rows (geometry) along with some column values into another table:
string sqlConn = "MSSQL:server=[MyServer];database=[MyDB];driver={ODBC Driver 13 for SQL Server};UId=[MyUI];PWD=[MyPWD];";
Ogr.RegisterAll();
DataSource ds = Ogr.Open(sqlConn, 0);
string request = "SELECT * FROM [MyTable] WHERE [MyCondition]";
Layer serverLayer = ds.ExecuteSQL(request, null, "");
Feature memFeature;
while ((memFeature = serverLayer.GetNextFeature()) != null)
{
Geometry memGeom = memFeature.GetGeometryRef();
}
How can I insert this geometry (OSGeo.OGR.Geometry), along with some column values, into a SQL Server table?
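One possible approach (a sketch, not from the original thread): export each geometry to well-known binary and insert it with a parameterized System.Data.SqlClient command, converting on the server with geometry::STGeomFromWKB. The target table, its Name/Shape columns, sqlClientConnString, and SRID 0 are all assumptions here.
// Sketch: insert each feature's geometry (as WKB) plus a column value.
// [TargetTable], Name, Shape, sqlClientConnString and SRID 0 are assumed names/values.
using (SqlConnection conn = new SqlConnection(sqlClientConnString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO [TargetTable] (Name, Shape) " +
    "VALUES (@name, geometry::STGeomFromWKB(@wkb, 0));", conn))
{
    cmd.Parameters.Add("@name", SqlDbType.NVarChar, 255);
    cmd.Parameters.Add("@wkb", SqlDbType.VarBinary, -1); // varbinary(max)
    conn.Open();

    Feature memFeature;
    while ((memFeature = serverLayer.GetNextFeature()) != null)
    {
        Geometry memGeom = memFeature.GetGeometryRef();
        byte[] wkb = new byte[memGeom.WkbSize()];
        memGeom.ExportToWkb(wkb); // serialize to well-known binary

        cmd.Parameters["@name"].Value = memFeature.GetFieldAsString("Name");
        cmd.Parameters["@wkb"].Value = wkb;
        cmd.ExecuteNonQuery();
    }
}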
I am using this method to upload data to SQL.
private void button5_Click(object sender, EventArgs e)
{
    string filepath = textBox2.Text;
    string connectionString_i = string.Format(@"Provider=Microsoft.Jet.OleDb.4.0; Data Source={0};Extended Properties=""Text;HDR=YES;FMT=Delimited""", Path.GetDirectoryName(filepath));
    using (OleDbConnection connection_i = new OleDbConnection(connectionString_i))
    {
        connection_i.Open();
        OleDbCommand command = new OleDbCommand("Select * FROM [" + Path.GetFileName(filepath) + "]", connection_i);
        command.CommandTimeout = 180;
        using (OleDbDataReader dr = command.ExecuteReader())
        {
            string sqlConnectionString = MyConString;
            // one SqlBulkCopy is enough; the original created a second, undisposed instance
            using (SqlBulkCopy bulkInsert = new SqlBulkCopy(sqlConnectionString))
            {
                bulkInsert.BulkCopyTimeout = 180;
                bulkInsert.DestinationTableName = "Table_Name";
                bulkInsert.WriteToServer(dr);
                MessageBox.Show("Upload Successful!");
            }
        }
    }
}
I have an Excel sheet in .CSV format with about 1,048,313 entries. That bulk copy method only works for about 36,000 to 60,000 entries. I want to ask if there is any way to select the first 30,000 entries from Excel and upload them to a SQL Server table, then select the next chunk of 30,000 rows and upload those, and so on until the last entry has been stored.
Create a DataTable to store the values from your CSV file that need to be inserted into your target table. Each column in the DataTable corresponds to a data column in the CSV file.
Create a custom table-valued data type on SQL Server to match your DataTable, including data types and lengths. (This assumes the target really is SQL Server, since the post was tagged sql-server, even though the sample connection string suggests Access.)
Using a text reader and a counter variable, populate your DataTable with 30,000 records at a time.
Pass the DataTable to your insert query or stored procedure; the parameter type is SqlDbType.Structured.
In the event that the job fails and you need to restart, the first step could be to determine the last inserted value of a predefined key field. You could also use a left outer join as part of your insert query to insert only records that do not already exist in the table. These are just a few of the more common techniques for restarting a failed ETL job.
This technique has some tactical advantages over bulk copy: it adds flexibility and is less tightly coupled to the target table, so changes to the table tend to be less disruptive, depending on the nature of the change.
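A minimal sketch of those steps, reusing MyConString from the question and assuming a two-column CSV, a table type dbo.CsvRowType, and a stored procedure dbo.InsertCsvRows (all hypothetical names):
// Sketch: read the CSV in 30,000-row chunks and pass each chunk as a TVP.
private void UploadCsvInChunks(string filepath)
{
    using (var reader = new StreamReader(filepath))
    using (var conn = new SqlConnection(MyConString))
    {
        conn.Open();
        reader.ReadLine(); // skip the header row (HDR=YES)
        var chunk = new DataTable();
        chunk.Columns.Add("Col1", typeof(string));
        chunk.Columns.Add("Col2", typeof(string));
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            string[] fields = line.Split(',');
            chunk.Rows.Add(fields[0], fields[1]);
            if (chunk.Rows.Count == 30000) // flush every 30,000 records
            {
                SendChunk(conn, chunk);
                chunk.Clear();
            }
        }
        if (chunk.Rows.Count > 0) // final partial chunk
            SendChunk(conn, chunk);
    }
}
private void SendChunk(SqlConnection conn, DataTable chunk)
{
    using (var cmd = new SqlCommand("dbo.InsertCsvRows", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        SqlParameter p = cmd.Parameters.AddWithValue("@rows", chunk);
        p.SqlDbType = SqlDbType.Structured; // table-valued parameter
        p.TypeName = "dbo.CsvRowType";      // the custom table type from the step above
        cmd.ExecuteNonQuery();
    }
}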
I want the user to transfer data from Excel files into SQL Server using a C# WinForms application, one file at a time. The Excel files consist of similar columns; there might be some new columns or some columns absent. Row data will vary.
For example:
Excel file 1: Name, City, State
Excel file 2: Name, City, Zip
Excel file 3: Name, City, County
In my existing SQL table I have columns: Name, City, Population, Schools
How do I insert the new Excel files with similar column names into an existing SQL database?
My thought so far is to copy the new Excel file data into a temporary table and then insert that into the existing SQL table. The problem is, I don't know how to write the C# code (or a SQL query) that would insert new Excel data with more or fewer columns than the existing SQL table.
You need NoSQL for this purpose. If you need to insert columns that are not already part of the table, then SQL is not a good option, and if you use C# to alter the table, be careful about the consequences. If you are sure about all possible column names up front, then try altering your table before you start inserting.
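If you do take the ALTER TABLE route, a rough sketch (dbo.MyTable and the nvarchar(255) catch-all column type are assumptions, not from the question):
// Sketch: add any Excel columns that the target table lacks, then insert as usual.
private void AddMissingColumns(SqlConnection conn, DataTable excelData)
{
    const string sql = @"
        IF COL_LENGTH('dbo.MyTable', @col) IS NULL
        BEGIN
            DECLARE @ddl nvarchar(max) =
                'ALTER TABLE dbo.MyTable ADD ' + QUOTENAME(@col) + ' nvarchar(255) NULL';
            EXEC (@ddl);
        END";
    foreach (DataColumn col in excelData.Columns)
    {
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@col", col.ColumnName);
            cmd.ExecuteNonQuery();
        }
    }
}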
This shouldn't be too hard. Export your data from Excel to a staging table in SQL Server. The C# code may look something like this.
public void ImportDataFromExcel(string excelFilePath)
{
    // declare variables - edit these based on your particular situation
    string sSqlTable = "tdatamigrationtable";
    // make sure your sheet name is correct; here the sheet name is sheet1, so change it if yours differs
    string myExcelDataQuery = "select student,rollno,course from [sheet1$]";
    try
    {
        // create our connection strings
        string sExcelConnectionString = @"provider=microsoft.jet.oledb.4.0;data source=" + excelFilePath +
            ";extended properties=" + "\"excel 8.0;hdr=yes;\"";
        string sSqlConnectionString = "server=mydatabaseservername;user id=dbuserid;password=dbuserpassword;database=databasename;connection reset=false";
        // execute a query to erase any previous data from our destination table
        string sClearSql = "delete from " + sSqlTable;
        using (SqlConnection sqlConn = new SqlConnection(sSqlConnectionString))
        using (SqlCommand sqlCmd = new SqlCommand(sClearSql, sqlConn))
        {
            sqlConn.Open();
            sqlCmd.ExecuteNonQuery();
        }
        // series of commands to bulk copy data from the excel file into our sql table
        using (OleDbConnection oledbConn = new OleDbConnection(sExcelConnectionString))
        using (OleDbCommand oledbCmd = new OleDbCommand(myExcelDataQuery, oledbConn))
        {
            oledbConn.Open();
            using (OleDbDataReader dr = oledbCmd.ExecuteReader())
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sSqlConnectionString))
            {
                bulkCopy.DestinationTableName = sSqlTable;
                bulkCopy.WriteToServer(dr); // WriteToServer consumes the whole reader; no read loop needed
            }
        }
    }
    catch (Exception ex)
    {
        // handle exception
    }
}
Then, in SQL Server, move the data from the staging table to your final production table. Your SQL may look something like this.
insert into production
select ...
from staging
where not exists
(
    select 1 from production
    where production.key = staging.key
)
That's my .02. I think you will have a lot more control over the whole process that way.
If you are reading this question then my answer to this other question may be helpful for you (just create a class to hold each row's information, process the Excel file, and perform a bulk insert as explained there):
Bulk insert is not working properly in Azure SQL Server
I hope it helps.
I have been asked to look at finding the most efficient way to take a DataTable input and write it to a SQL Server table using C#. The snag is that the solution must use ODBC connections throughout, which rules out SqlBulkCopy. The solution must also work on all SQL Server versions back to SQL Server 2008 R2.
I am thinking that the best approach would be to use batch inserts of 1000 rows at a time using the following SQL syntax:
INSERT INTO dbo.Table1(Field1, Field2)
SELECT Value1, Value2
UNION
SELECT Value1, Value2
I have already written the code to check if a table corresponding to the DataTable input already exists on the SQL Server, and to create one if it doesn't.
I have also written the code to create the INSERT statement itself. What I am struggling with is how to dynamically build the SELECT statements from the rows in the data table. How can I access the values in the rows to build my SELECT statement? I think I will also need to check the data type of each column in order to determine whether the values need to be enclosed in single quotes (') or not.
Here is my current code:
public bool CopyDataTable(DataTable sourceTable, OdbcConnection targetConn, string targetTable)
{
OdbcTransaction tran = null;
string[] selectStatement = new string[sourceTable.Rows.Count];
// Check if targetTable exists, create it if it doesn't
if (!TableExists(targetConn, targetTable))
{
bool created = CreateTableFromDataTable(targetConn, sourceTable);
if (!created)
return false;
}
try
{
// Prepare insert statement based on sourceTable
string insertStatement = string.Format("INSERT INTO [dbo].[{0}] (", targetTable);
foreach (DataColumn dataColumn in sourceTable.Columns)
{
insertStatement += dataColumn + ",";
}
insertStatement = insertStatement.TrimEnd(',') + ") "; // assign, don't append, or the column list is duplicated
// Open connection to target db
using (targetConn)
{
if (targetConn.State != ConnectionState.Open)
targetConn.Open();
tran = targetConn.BeginTransaction();
for (int i = 0; i < sourceTable.Rows.Count; i++)
{
DataRow row = sourceTable.Rows[i];
// Need to iterate through columns in row, getting values and data types and building a SELECT statement
selectStatement[i] = "SELECT ";
}
insertStatement += string.Join(" UNION ", selectStatement);
using (OdbcCommand cmd = new OdbcCommand(insertStatement, targetConn, tran))
{
cmd.ExecuteNonQuery();
}
tran.Commit();
return true;
}
}
catch
{
    if (tran != null) tran.Rollback(); // tran is null if the failure happened before BeginTransaction
    return false;
}
}
Any advice would be much appreciated. Also if there is a simpler approach than the one I am suggesting then any details of that would be great.
OK, since we cannot use stored procedures or bulk copy: when I modelled the various approaches a couple of years ago, the key determinant of performance was the number of calls to the server. Batching a set of MERGE or INSERT statements into a single call, separated by semicolons, was found to be the fastest method, so I ended up batching my SQL statements. I think the maximum size of a SQL statement was 32k, so I chopped my batch into units of that size.
(Note - use StringBuilder instead of concatenating strings manually - it has a beneficial effect on performance)
Pseudo-code:
string sqlStatement = "INSERT INTO Tab1 VALUES ({0},{1},{2})";
StringBuilder sqlBatch = new StringBuilder();
foreach (DataRow row in myDataTable.Rows)
{
    sqlBatch.Append(string.Format(sqlStatement, row["Field1"], row["Field2"], row["Field3"]));
    sqlBatch.AppendLine(";");
}
myOdbcConnection.ExecuteSql(sqlBatch.ToString()); // pseudo-method: send the whole batch in one call
You need to deal with batch-size complications, and with formatting the field values correctly for their data types in the string-replace step, but otherwise this gives the best performance.
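For that formatting step, a hypothetical helper like this could decide when to quote (FormatSqlLiteral is my name, not part of the answer); note the next answer argues that parameterization is safer than any literal formatting:
// Sketch of type-aware literal formatting for the string-replace step.
private static string FormatSqlLiteral(object value)
{
    if (value == null || value == DBNull.Value)
        return "NULL";
    if (value is string)
        return "'" + ((string)value).Replace("'", "''") + "'"; // quote and escape strings
    if (value is DateTime)
        return "'" + ((DateTime)value).ToString("yyyy-MM-dd HH:mm:ss") + "'"; // quote dates
    if (value is bool)
        return (bool)value ? "1" : "0"; // SQL Server bit
    return Convert.ToString(value, System.Globalization.CultureInfo.InvariantCulture); // numbers pass through unquoted
}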
The marked solution from PhillipH is open to several mistakes and to SQL injection.
Normally you should build a DbCommand with parameters and execute this instead of executing a self-built SQL statement.
The CommandText must be "INSERT INTO Tab1 VALUES (?,?,?)" for ODBC and OLEDB; SqlClient needs named parameters ("@<Name>").
Parameters should be added with the dimensions of the underlying column.
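A sketch of that parameterized approach over ODBC (the table name, Field1/Field2, and the OdbcType choices are placeholders):
// Sketch: positional ODBC markers, one prepared command reused for every row.
using (OdbcCommand cmd = targetConn.CreateCommand())
{
    cmd.CommandText = "INSERT INTO [dbo].[Table1] (Field1, Field2) VALUES (?, ?)";
    OdbcParameter p1 = cmd.Parameters.Add("@p1", OdbcType.VarChar, 50);
    OdbcParameter p2 = cmd.Parameters.Add("@p2", OdbcType.Int);
    cmd.Prepare();
    foreach (DataRow row in sourceTable.Rows)
    {
        p1.Value = row["Field1"];
        p2.Value = row["Field2"];
        cmd.ExecuteNonQuery(); // one round trip per row; wrap the loop in a transaction for speed
    }
}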
I am using jQuery to post back values, and it is successfully posting back to the server (I checked with Mozilla Firebug). I use these values in the insert query in the .cs file to insert data into a table. The same query runs successfully in SQL Server Management Studio, but when I use this query in the .cs file it does not run.
Here is my code:
public static bool SaveCell(string row, string column)
{
var con = new SqlConnection("Data Source=local;Initial Catalog=Test;Integrated Security=True");
using (con)
using (var command = new SqlCommand("Insert into Match_Subcategory_BusinessSector5(SubCategoryID, BusinessSector5ID)"+
"Values("+
"(Select [SubCategory].ID from SubCategory where Kategorie = '#SubCategory')," +
"(SELECT [BusinessSector5].ID FROM BusinessSector5 where Description_DE = '#BusinessSector5'));",con))
{
command.Parameters.AddWithValue("#BusinessSector5", row);
command.Parameters.AddWithValue("#SubCategory", column);
con.Open();
command.ExecuteNonQuery();
}
return true;
}
I am getting this error:
Cannot insert the value NULL into column 'SubCategoryID', table 'Test.dbo.Match_Subcategory_BusinessSector5'; column does not allow nulls.
Change
'@SubCategory'
to
@SubCategory
and
'@BusinessSector5'
to
@BusinessSector5
When using a parameterized query you don't need to add anything around the parameter name; it is not combined into the SQL in your code, but sent to the server separately (the SQL is sent exactly as you wrote it, along with a list of parameter values). Because of that, you are protected against SQL injection and related problems.
"(Select [SubCategory].ID from SubCategory where Kategorie = #SubCategory)," +
"(SELECT [BusinessSector5].ID FROM BusinessSector5 where Description_DE = #BusinessSector5));",con))
Remove the quotes from your query.
I'm working with two SQL Server 2008 instances on different machines. The server names are source.ex.com and destination.ex.com.
destination.ex.com is linked to source.ex.com, and the appropriate permissions are in place for source.ex.com to write to a database called bacon-wrench on destination.ex.com.
I've logged into source.ex.com via SSMS and tested this query (successfully):
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES (4,6);
In a C# .NET 4.0 WebPage I connect to source.ex.com and perform a similar query (successfully):
using(SqlConnection c = new SqlConnection(ConfigurationManager.ConnectionStrings["SOURCE"].ConnectionString))
{
c.Open();
String sql = #"
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES (34,56);";
using(SqlCommand cmd = new SqlCommand(sql, c))
{
cmd.ExecuteNonQuery();
}
}
For small sets of insert statements (say 20 or less) doing something like this performs fine:
using(SqlConnection c = new SqlConnection(ConfigurationManager.ConnectionStrings["SOURCE"].ConnectionString))
{
c.Open();
String sql = #"
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES (34,56);
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES (22,11);
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES (33,55);
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES (1,2);";
using(SqlCommand cmd = new SqlCommand(sql, c))
{
cmd.ExecuteNonQuery();
}
}
I'm trying to do something like this with around 20000 records. The above method takes 11 minutes to complete, which I assume is the server screaming at me to make it some kind of bulk operation. From other Stack Overflow threads the SqlBulkCopy class was recommended, and it takes a DataTable as a parameter, perfect!
So I build a DataTable and attempt to write it to the server (fail):
DataTable dt = new DataTable();
dt.Columns.Add("PunchID", typeof(int));
dt.Columns.Add("BaconID", typeof(int));
for(int i = 0; i < 20000; i++)
{
//I realize this would make 20000 duplicate
//rows but its not important
dt.Rows.Add(new object[] {
11, 33
});
}
using(SqlConnection c = new SqlConnection(ConfigurationManager.ConnectionStrings["SOURCE"].ConnectionString))
{
c.Open();
using(SqlBulkCopy bulk = new SqlBulkCopy(c))
{
bulk.DestinationTableName = "[destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]";
bulk.ColumnMappings.Add("PunchID", "PunchID");
bulk.ColumnMappings.Add("BaconID", "BaconID");
bulk.WriteToServer(dt);
}
}
EDIT2: The below message is what I'm attempting to fix:
The web page crashes at bulk.WriteToServer(dt); with the error message "Database bacon-wrench does not exist. Please ensure it is typed correctly." What am I doing wrong? How do I change this to get it to work?
EDIT1:
I was able to speed up the query significantly using the below syntax. But it is still very slow for such a small record set.
using(SqlConnection c = new SqlConnection(ConfigurationManager.ConnectionStrings["SOURCE"].ConnectionString))
{
c.Open();
String sql = #"
INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch]
(PunchID, BaconID) VALUES
(34,56),
(22,11),
(33,55),
(1,2);";
using(SqlCommand cmd = new SqlCommand(sql, c))
{
cmd.ExecuteNonQuery();
}
}
If you are using SQL Server 2008+, you can introduce a table-valued user data type. Prepare the type, the receiving table, and a stored procedure something like below. The data type and stored procedure are on the local system. I generally have an if statement in the code detecting whether the table is remote or local; for remote tables I do this, for local ones I use SqlBulkCopy.
if(TYPE_ID(N'[Owner].[TempTableType]') is null)
begin
CREATE TYPE [Owner].[TempTableType] AS TABLE ( [PendingID] uniqueidentifier, [Reject] bit)
end
IF NOT EXISTS (SELECT * FROM [LinkedServer].[DatabaseOnLS].sys.tables where name = 'TableToReceive')
EXEC('
CREATE TABLE [DatabaseOnLS].[Owner].[TableToReceive] ( [PendingID] uniqueidentifier, [Reject] bit)
') AT [LinkedServer]
else
EXEC('
TRUNCATE TABLE [DatabaseOnLS].[Owner].[TableToReceive]
') AT [LinkedServer]
CREATE PROCEDURE [Owner].[TempInsertTable]
@newTableType TempTableType readonly
AS
BEGIN
insert into [LinkedServer].[DatabaseOnLS].[Owner].[TableToReceive] select * from @newTableType
END
In the C# code you can then do something like this to insert the DataTable into the table on the linked server (I'm using an existing UnitOfWork, which already has a connection and transaction):
using (var command = new SqlCommand("TempInsertTable",
oUoW.Database.Connection as SqlConnection) { CommandType = CommandType.StoredProcedure }
)
{
command.Transaction = oUoW.Database.CurrentTransaction as SqlTransaction;
command.Parameters.Add(new SqlParameter("#newTableType", oTempTable));
drResults = command.ExecuteReader();
drResults.Close();
}
After trying a number of things including linked server settings, collations, synonyms, etc., I eventually got to this error message:
Inserting into remote tables or views is not allowed by using the BCP utility or by using BULK INSERT.
Perhaps you can bulk insert to a staging table on your local server (your code works fine for this) and then insert from that staging table to your linked server from there, followed by a local delete of the staging table. You'll have to test for performance.
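A sketch of that workaround (dbo.FruitPunchStaging is an assumed local staging table with the same two columns):
// Sketch: bulk copy into a local staging table, then push across the linked
// server with one set-based INSERT, and clear the staging table.
using (SqlConnection c = new SqlConnection(ConfigurationManager.ConnectionStrings["SOURCE"].ConnectionString))
{
    c.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(c))
    {
        bulk.DestinationTableName = "dbo.FruitPunchStaging"; // local, so SqlBulkCopy works
        bulk.WriteToServer(dt);
    }
    String sql = @"
        INSERT INTO [destination.ex.com].[bacon-wrench].[dbo].[tblFruitPunch] (PunchID, BaconID)
        SELECT PunchID, BaconID FROM dbo.FruitPunchStaging;
        TRUNCATE TABLE dbo.FruitPunchStaging;";
    using (SqlCommand cmd = new SqlCommand(sql, c))
    {
        cmd.ExecuteNonQuery();
    }
}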