I have a problem with an INSERT statement. I get the data from a DataGridView and that part works fine, but I can't insert it into the table:
private void button3_Click(object sender, EventArgs e)
{
int amorplanid = 0;
int idn = 0;
DateTime datum;
double interest = 0;
double principal = 0;
double payment = 0;
double newprincipal = 0;
string nizz = "";
string[] niz= new string[7];
for (int x = 0; x < dataGridView1.Rows.Count-1; x++)
{
for (int j = 0; j < dataGridView1.Rows[x].Cells.Count; j++)
{
nizz += dataGridView1.Rows[x].Cells[j].Value.ToString()+".";
}
niz = nizz.Split('.');
amorplanid = System.Convert.ToInt32(niz[0]);
idn = System.Convert.ToInt32(niz[1]);
// datum = System.Convert.ToDateTime(niz[2]);
datum = DateTime.Now;
interest = System.Convert.ToDouble(niz[3]);
principal = System.Convert.ToDouble(niz[4]);
payment = System.Convert.ToDouble(niz[5]);
newprincipal = System.Convert.ToDouble(niz[6]);
String insert = @"INSERT INTO AmortPlanCoupT(ID, AmortPlanID, CoupDate, Interest, Principal, Payxment, NewPrincipal) VALUES (" + idn + "," + amorplanid + "," + datum + "," + (float)interest + "," + (float)principal + "," + (float)payment + "," + (float)newprincipal + ")";
SqlConnection myconn = new SqlConnection(conn);
// String MyString = @"INSERT INTO Employee(ID, FirstName, LastName) VALUES(2, 'G', 'M')";
try
{
myconn.Open();
SqlCommand cmd = new SqlCommand(insert, myconn);
cmd.ExecuteNonQuery();
myconn.Close();
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
label14.Text = niz[0];
}
I have created a Windows console app to test:
I have a table test with two columns: id (int), leto (float).
SqlConnection MyConnection = new SqlConnection(@"Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\App_Data\Database1.mdf;Integrated Security=True;User Instance=True");
try
{
MyConnection.Open();
String MyString = @"INSERT INTO test(id, leto) VALUES(2, 2)";
SqlCommand MyCmd = new SqlCommand(MyString, MyConnection);
MyCmd.ExecuteScalar();
MyConnection.Close();
}
catch (Exception e)
{
Console.WriteLine(e.ToString());
}
I've been trying different things to write data to the table, and just can't get it there.
As I've said before on this site - the whole User Instance and AttachDbFileName= approach is flawed - at best! Visual Studio will be copying around the .mdf file and most likely, your INSERT works just fine - but you're just looking at the wrong .mdf file in the end!
If you want to stick with this approach, then try putting a breakpoint on the myConnection.Close() call - and then inspect the .mdf file with SQL Server Mgmt Studio Express - I'm almost certain your data is there.
The real solution in my opinion would be to
install SQL Server Express (and you've already done that anyway)
install SQL Server Management Studio Express
create your database in SSMS Express, give it a logical name (e.g. Database1)
connect to it using its logical database name (given when you create it on the server) - and don't mess around with physical database files and user instances. In that case, your connection string would be something like:
Data Source=.\SQLEXPRESS;Database=Database1;Integrated Security=True
and everything else is exactly the same as before...
Before you go any further, you should rewrite your code to use parameters, instead of string concatenation.
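For example, the INSERT from the button click handler could be written roughly like this (a sketch reusing the variables from the question; the column list, including Payxment, is copied verbatim from the question's SQL, and AddWithValue is used for brevity):
string insert = @"INSERT INTO AmortPlanCoupT (ID, AmortPlanID, CoupDate, Interest, Principal, Payxment, NewPrincipal)
                  VALUES (@id, @amortPlanId, @coupDate, @interest, @principal, @payment, @newPrincipal)";
using (SqlConnection myconn = new SqlConnection(conn))
using (SqlCommand cmd = new SqlCommand(insert, myconn))
{
    cmd.Parameters.AddWithValue("@id", idn);
    cmd.Parameters.AddWithValue("@amortPlanId", amorplanid);
    cmd.Parameters.AddWithValue("@coupDate", datum);           // the DateTime goes in as-is, no string formatting
    cmd.Parameters.AddWithValue("@interest", interest);
    cmd.Parameters.AddWithValue("@principal", principal);
    cmd.Parameters.AddWithValue("@payment", payment);
    cmd.Parameters.AddWithValue("@newPrincipal", newprincipal);
    myconn.Open();
    cmd.ExecuteNonQuery();
}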
Related
[BEGINNER]
Sorry for my English.
I've been working for a new company for a short time and I consider myself a novice with databases; apart from school, I haven't really practiced since.
[ENVIRONMENT]
I am working in C# on existing software that I need to optimize, with one big constraint: I must not modify the databases, since many customers already use them and will not want to change their server. In general, if a customer uses Oracle it is because they also use it for other software.
The software used System.Data.OracleClient, but since that is obsolete I started implementing Oracle.ManagedDataAccess.Client. I can read and write data in the database, but I can't send a large amount of data quickly.
On site we have a server with a remote database, which runs either Oracle or MySQL (what interests me for the moment is Oracle).
There are also local workstations that use a SQLite database and must be able to operate without the remote database.
By default everything is saved locally, and I have synchronizations to do whenever the remote database is accessible. It is therefore possible to have a large amount of data to transfer (these are measurement records).
[FOR MY TESTS]
I recovered one of the customer databases in order to run tests with real data; one of the tables has more than 650,000 rows.
Transferring this row by row takes a lot of time, even though of course I don't open and close the connection for each row.
I wanted to try to send parameter blocks, but I can't do it at the moment.
[MY RESEARCH]
I keep looking, but what I have found so far either relies on paid DLLs or I didn't understand how to use it.
[MY CODE]
For the moment this is what I have; it is really just a test:
public void testOracle()
{
try
{
if (Connection == null)
{
Connection = OracleConnection();
}
string commandString = "INSERT INTO GRAPHE (ID, MESUREE_ID, DATE_MESURE, POINT_GRAPHE, D0, D1) VALUES (:IDp, :MESUREE_IDp, to_timestamp( :DATE_MESUREp ,'DD/MM/RR HH24:MI:SSXFF'), :POINT_GRAPHEp, :D0p, :D1p)";
// Test values; in reality these would be arrays with one entry per row to insert
int[] _ID = { 1 };
int[] _MESUREE_ID = { 9624 };
string[] _DATE_MESURE = { "16/12/08 00:00:00,000000000" };
int[] _POINT_GRAPHE = { 1229421394 };
int[] _D0 = { 0 };
int[] _D1 = { 0 };
using (OracleCommand command = new OracleCommand(commandString, Connection))
{
using (var transaction = Connection.BeginTransaction())
{
for (int i = 0; i < _ID.Length; i++)
{
command.Parameters.Add("IDp", OracleDbType.Decimal).Value = _ID;
command.Parameters.Add("MESUREE_IDp", OracleDbType.Decimal).Value = _MESUREE_ID];
command.Parameters.Add("DATE_MESUREp", OracleDbType.Varchar2).Value = _DATE_MESURE[i];
command.Parameters.Add("POINT_GRAPHEp", OracleDbType.Decimal).Value = _POINT_GRAPHE[i];
command.Parameters.Add("DOS10p", OracleDbType.Decimal).Value = _D0[i];
command.Parameters.Add("DOS07p", OracleDbType.Decimal).Value = _D1[i];
command.ExecuteNonQuery();
}
transaction.Commit();
}
}
}
catch (Exception ex)
{
}
Connection.Close();
}
public OracleConnection OracleConnection()
{
string serveur_name = "192.168.0.1";
string database_name = "oracle.dev";
string user_name = "name";
string password = "pass";
string oraclePort = "1521";
OracleConnection _con = null;
try
{
string connectionString = "Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=" + serveur_name + ")(PORT=" + oraclePort + ")) (CONNECT_DATA=(SERVICE_NAME=" + database_name + "))); User Id=" + user_name + ";Password=" + password + ";";
_con = new OracleConnection(connectionString);
try
{
_con.Open();
}
catch (Exception e)
{
}
}
catch (Exception e)
{
}
return _con;
}
}
Then you can create a DataTable and do a bulk insert with OracleBulkCopy:
using (var bulkCopy = new OracleBulkCopy(yourConnectionString, OracleBulkCopyOptions.UseInternalTransaction))
{
bulkCopy.DestinationTableName = "GRAPHE";
bulkCopy.WriteToServer(dataTable);
}
You can also use the BatchSize property as per your requirement.
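Building the DataTable that gets passed to WriteToServer could look roughly like the sketch below. The column names come from the INSERT in the question; the .NET types and the source array names (ids, mesureeIds, dates, points, d0, d1) are assumptions.
// Sketch: build an in-memory table shaped like GRAPHE and fill it from the measurement arrays
DataTable dataTable = new DataTable("GRAPHE");
dataTable.Columns.Add("ID", typeof(decimal));
dataTable.Columns.Add("MESUREE_ID", typeof(decimal));
dataTable.Columns.Add("DATE_MESURE", typeof(DateTime));
dataTable.Columns.Add("POINT_GRAPHE", typeof(decimal));
dataTable.Columns.Add("D0", typeof(decimal));
dataTable.Columns.Add("D1", typeof(decimal));
for (int i = 0; i < ids.Length; i++)
{
    dataTable.Rows.Add(ids[i], mesureeIds[i], dates[i], points[i], d0[i], d1[i]);
}
// Column order is assumed to match the GRAPHE table; otherwise configure bulkCopy.ColumnMappings.
Note that OracleBulkCopy has historically shipped with the unmanaged ODP.NET provider (Oracle.DataAccess), so check whether the version of Oracle.ManagedDataAccess you target actually exposes it.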
Let's say I want to copy all tables with their complete data from one database to another without specifically knowing detailed information about them (column count, data types, ...). The user would input a connection string to his database, and all data from it would be copied to an internal DB.
I tried to achieve it by using SqlConnection and writing direct T-SQL queries, and managed to write a script that creates empty tables in the internal database with the correct columns:
string createDestinationTableQuery = "create table " + schemaName + ".[" + tableName + "](";
DataTable ColumnsDT = new DataTable();
string getTableColumnDataQuery = "SELECT * FROM "+originalDBName+".INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = N'" + tableName +"'";
SqlCommand getTableColumnDataCommand = new SqlCommand(getTableColumnDataQuery, originalCon);
SqlDataAdapter TableDA = new SqlDataAdapter(getTableColumnDataCommand);
TableDA.Fill(ColumnsDT);
for (int x = 0; x < ColumnsDT.Rows.Count; x++)
{
createDestinationTableQuery += "[" + ColumnsDT.Rows[x].ItemArray[3].ToString() + "] " + "[" + ColumnsDT.Rows[x].ItemArray[7].ToString() + "], ";
}
createDestinationTableQuery = createDestinationTableQuery.Remove(createDestinationTableQuery.Length - 2);
createDestinationTableQuery += " )";
SqlCommand createDestinationTableCommand = new SqlCommand(createDestinationTableQuery, destinationCon);
createDestinationTableCommand.ExecuteNonQuery();
Console.WriteLine("Table " + schemaName + "." + tableName + " created succesfully!");
However, I am struggling with data insertion as the following code simply doesn't work:
DataTable dataTable = new DataTable();
string getTableDataquery = "select * from " + originalTableWithSchema;
SqlCommand getTableDataCommand = new SqlCommand(getTableDataquery, originalCon);
SqlDataAdapter da = new SqlDataAdapter(getTableDataCommand);
da.Fill(dataTable);
for (int x = 0; x < dataTable.Rows.Count; x++)
{
string insertQuery = "insert into " + schemaName + ".["+tableName+"](" ;
string values = "VALUES(";
for (int y = 0; y < dataTable.Columns.Count; y++)
{
insertQuery += dataTable.Columns[y].ColumnName + ", ";
values += dataTable.Rows[x].ItemArray[y].ToString() + ", ";
}
insertQuery = insertQuery.Remove(insertQuery.Length - 2);
insertQuery += " )";
values = values.Remove(values.Length - 2);
values += " )";
insertQuery += " " + values;
SqlCommand insertCommand = new SqlCommand(insertQuery, destinationCon);
insertCommand.ExecuteNonQuery();
}
da.Dispose();
How can I correctly achieve this functionality? I was thinking of maybe scrapping all the code and using SMO instead?
If you are only looking to copy the data (because you have structure creation already working), then you could use a DataTable to hold the data in a non-DBMS-specific structure, and a DataAdapter to generate the DBMS-specific insert statements. Here is an excerpt from code I wrote a while ago to copy data from Access to MySQL:
List<string> tableNames = new List<string>();
try
{
// Open connect to access db
sourceConn.Open();
// Build table names list from schema
foreach (DataRow row in sourceConn.GetSchema("Tables").Select("table_type = 'TABLE'"))
tableNames.Add(row["table_name"].ToString());
}
catch (Exception ex)
{
throw;
}
finally
{
if(sourceConn.State != ConnectionState.Closed)
sourceConn.Close();
}
foreach (string table in tableNames)
{
//Get all table data from Access
string query = string.Format("SELECT * FROM {0}", table);
DataTable accessTable = new DataTable(table);
try
{
sourceConn.Open();
System.Data.OleDb.OleDbCommand accessSqlCommand = new System.Data.OleDb.OleDbCommand(query, sourceConn);
System.Data.OleDb.OleDbDataReader reader = (System.Data.OleDb.OleDbDataReader)accessSqlCommand.ExecuteReader();
// Load all table data into accessTable
accessTable.Load(reader);
}
catch(Exception ex)
{
throw;
}
finally
{
if(sourceConn.State != ConnectionState.Closed)
sourceConn.Close();
}
// Import data into MySQL
accessTable.AcceptChanges();
// The table should be empty, so set everything as new rows (will be inserted)
foreach (DataRow row in accessTable.Rows)
row.SetAdded();
try
{
destConn.Open();
MySql.Data.MySqlClient.MySqlDataAdapter da = new MySql.Data.MySqlClient.MySqlDataAdapter(query, destConn);
MySql.Data.MySqlClient.MySqlCommandBuilder cb = new MySql.Data.MySqlClient.MySqlCommandBuilder(da);
da.InsertCommand = cb.GetInsertCommand();
// Update the destination table 128 rows at a time
da.UpdateBatchSize = 128;
// Perform inserts (and capture row counts for output)
int insertCount = da.Update(accessTable);
}
catch (Exception ex)
{
throw;
}
finally
{
if(destConn.State != ConnectionState.Closed)
destConn.Close();
}
}
This could certainly be more efficient, but I wrote it for a quick conversion. Also, since this is copied and pasted you may need to tweak it. Hope it helps.
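For the all-SQL-Server scenario in the question above, SqlBulkCopy would likely be faster than generated row-by-row inserts. A sketch, reusing the variable names from the question's code (destinationCon, schemaName, tableName, dataTable):
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationCon))
{
    bulkCopy.DestinationTableName = schemaName + ".[" + tableName + "]";
    bulkCopy.BatchSize = 1000;            // tune as needed
    bulkCopy.WriteToServer(dataTable);    // dataTable already filled from the source SELECT
}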
It might be worth thinking about using a linked server. Once a linked server is defined in the destination server, a table can be created and automatically filled with data using a SELECT…INTO statement.
Query executed in destination server database:
SELECT * INTO NewTableName FROM
SourceServername.SourceDatabasename.dbo.SourceTableName
This program is supposed to take a csv file and load it into a SQLite database that the user specifies a path to. Rather than loading the values into the selected database, it creates a copy of the database in the debug folder and then throws an error because the table doesn't exist in the new db. I can't find where it's deciding to make a new file rather than using the existing one.
using (System.Data.SQLite.SQLiteConnection conn = new System.Data.SQLite.SQLiteConnection("data source=" + db3FilePath + "; Synchronous=Off"))
{
using (System.Data.SQLite.SQLiteCommand cmd = new System.Data.SQLite.SQLiteCommand(conn))
{
conn.Open();
foreach (KeyValuePair<string, CSVRecord> kvp in csvDictionary.Skip(1))
{
//checks for duplicate records
cmd.CommandText = "SELECT COUNT(*) FROM Accounts WHERE SDM_ACCT='" + kvp.Value.sdmacct + "'";
int count = Convert.ToInt32(cmd.ExecuteScalar());
if (count < 1)
{
WritetoDatabase.WritetoAccountsTable(kvp.Value.description, kvp.Value.priority, db3FilePath);
WritetoDatabase.WritetoDirectoryTable(kvp.Value.number, kvp.Value.active, db3FilePath);
recordCount++;
//updates the progress bar and text field showing %
int progress = y++ * 100 / (csvDictionary.Keys.Count -1);
progressBar1.Invoke((MethodInvoker)(() => progressBar1.Value = progress));
progressBar1.Invoke((MethodInvoker)(() => progressBar1.Update()));
lblStatus.Invoke((MethodInvoker)(() => lblStatus.Text = "Writing records to database: " + progress.ToString() + "% Complete"));
lblStatus.Invoke((MethodInvoker)(() => lblStatus.Update()));
}
else
{
WritetoDatabase.WritetoDirectoryTable(kvp.Value.number, kvp.Value.active, db3FilePath);
y++;
}
}
conn.Close();
}
}
Here is a copy of the method that writes to the database.
public static int WritetoAccountsTable(string Comment, int PriorityInt, string filePath)
{
using (System.Data.SQLite.SQLiteConnection conn = new System.Data.SQLite.SQLiteConnection("data source=" + filePath + "; Synchronous=Off"))
{
using (System.Data.SQLite.SQLiteCommand cmd = new System.Data.SQLite.SQLiteCommand(conn))
{
conn.Open();
cmd.CommandText = @"INSERT INTO Accounts(Description,Priority)
values(@Comment,@PriorityInt)";
cmd.Parameters.AddWithValue("@Comment", Comment);
cmd.Parameters.AddWithValue("@PriorityInt", PriorityInt);
return cmd.ExecuteNonQuery();
}
}
}
I've changed everything I can think of trying to find what's causing the problem. Maybe you will see something I can't.
One of the all-time dumbest things I have seen break a program: the connection string passed in using (System.Data.SQLite.SQLiteConnection conn = new System.Data.SQLite.SQLiteConnection("data source=" + filePath + "; Synchronous=Off")) uses "data source". If it is not capitalized as "Data Source", it is not recognized and the provider falls back to a default setting.
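In other words, keeping everything else the same, the connection would be created like this:
using (System.Data.SQLite.SQLiteConnection conn = new System.Data.SQLite.SQLiteConnection("Data Source=" + filePath + "; Synchronous=Off"))
{
    // unchanged from the original code
}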
I have tried nearly every solution on this website, but I can't solve this problem. I have data that has been retrieved from a database through an ODBC connection. The data is there. It will go into a Data Grid View just fine, but I can't get this data to go into my local SQL database. Please tell me what I'm doing wrong.
public partial class frmNorth : Form
{
// variables for the connections
private OdbcConnection epnConnection = new OdbcConnection();
private SqlConnection tempDbConnection = new SqlConnection();
public frmNorth()
{
InitializeComponent();
// This is for the ePN DB
epnConnection.ConnectionString = @"Dsn=ePN; uid=username; pwd=myPa$$Word";
// This is for the local DB
tempDbConnection.ConnectionString = @"Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|\TempDB.mdf;Integrated Security=True";
}
private void btnLoadData_Click(object sender, EventArgs e)
{
try
{
//===This part works just fine===============================================================
epnConnection.Open();
string epnQuery = "SELECT FNCL_SPLIT_REC_ID, PROJ_ID, SALES_SRC_PRC " +
"FROM PROJ_FNCL_SPLIT " +
"WHERE PROJ_ID=" + textBox1.Text + "";
OdbcCommand epnCommand = new OdbcCommand(epnQuery, epnConnection);
epnCommand.CommandTimeout = 0;
//This connects the data to the data table
OdbcDataAdapter da = new OdbcDataAdapter(epnCommand);
DataTable dt = new DataTable();
da.Fill(dt);
dataGridView1.DataSource = dt;
//===========================================================================================
//======The part below is the part that wont work. The data wont go into the SQL database====
tempDbConnection.Open();
string tempSql = "";
for (int i = 0; i < dt.Rows.Count; i++)
{
tempSql = "INSERT INTO tblTemp (FNCL_SPLIT_REC_ID, PROJ_ID, SALES_SRC_PRC) VALUES ('"
+ dt.Rows[i]["FNCL_SPLIT_REC_ID"].ToString().Trim() + "','"
+ dt.Rows[i]["PROJ_ID"].ToString().Trim() + "','"
+ dt.Rows[i]["SALES_SRC_PRC"].ToString().Trim() + "');";
SqlCommand tempCommand = new SqlCommand(tempSql, tempDbConnection);
tempCommand.ExecuteNonQuery();
}
// There are no errors. The data just doesn't save to the database.
//===========================================================================================
epnConnection.Close();
tempDbConnection.Close();
}
catch (Exception ex)
{
epnConnection.Close();
tempDbConnection.Close();
MessageBox.Show("Error " + ex);
}
}
}
}
//+++++++++++++++++++This is what the table looks like+++++++++++++++++++++++++++++++++++++++++++++++
CREATE TABLE [dbo].[tblTemp] (
[FNCL_SPLIT_REC_ID] INT NOT NULL,
[PROJ_ID] NCHAR (10) NULL,
[SALES_SRC_PRC] MONEY NULL,
PRIMARY KEY CLUSTERED ([FNCL_SPLIT_REC_ID] ASC)
);
Like I said, no errors come up. The data just doesn't save to the database.
"INSERT INTO tblTemp (FNCL_SPLIT_REC_ID, PROJ_ID, SALES_SRC_PRC) VALUES ("
+ dt.Rows[i]["FNCL_SPLIT_REC_ID"].ToString().Trim() + ",'"
+ dt.Rows[i]["PROJ_ID"].ToString().Trim() + "',"
+ dt.Rows[i]["SALES_SRC_PRC"].ToString().Trim() + ");";
I removed the quotes around FNCL_SPLIT_REC_ID, since it is an int, and around SALES_SRC_PRC, since it is money.
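A parameterized command avoids the quoting problem entirely. A sketch, assuming the column types from the CREATE TABLE in the question (SqlDbType needs using System.Data):
string tempSql = "INSERT INTO tblTemp (FNCL_SPLIT_REC_ID, PROJ_ID, SALES_SRC_PRC) VALUES (@recId, @projId, @price)";
using (SqlCommand tempCommand = new SqlCommand(tempSql, tempDbConnection))
{
    tempCommand.Parameters.Add("@recId", SqlDbType.Int).Value = Convert.ToInt32(dt.Rows[i]["FNCL_SPLIT_REC_ID"]);
    tempCommand.Parameters.Add("@projId", SqlDbType.NChar, 10).Value = dt.Rows[i]["PROJ_ID"].ToString().Trim();
    tempCommand.Parameters.Add("@price", SqlDbType.Money).Value = Convert.ToDecimal(dt.Rows[i]["SALES_SRC_PRC"]);
    tempCommand.ExecuteNonQuery();
}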
I found no error in the code you implemented. I found that the connection definition for the .mdf file was wrong.
|DataDirectory| points to the folder the application runs from. If you run in Debug mode, a separate copy of the application exe is created in the bin\Debug folder together with application resources such as .mdf files; in Release mode the corresponding folders are created under Release. So you either need to change the database file that the connection points to, or give the full directory path in the connection string. Example:
tempDbConnection.ConnectionString = @"Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|\TempDB.mdf;Integrated Security=True";
Replace it with the full path, for example:
tempDbConnection.ConnectionString = @"Data Source=(LocalDB)\v11.0;AttachDbFilename=C:\Users\Promod\Documents\Visual Studio 2012\Projects\Contribution1\Contribution1\bin\Debug\TempDB.mdf;Integrated Security=True";
I am writing a C# application
It is an offline test application
It imports a large data file in the form of a .csv file
This file is chosen by the user using a form
I then want to store the information contained in this .csv file in the form of a local database such that I can perform sql queries
I am using Visual Studio 2012
I have never setup an sql database before and only have limited experience using sql on existing databases
My attempt so far is:
Solution explorer > Add new file > Local Database (.sdf file)
Database Explorer > Tables > Create Table
I have then added column names for all the fields setting one as my primary key
I have attempted to add a single dataset to my data table with no luck
string dbfile = new System.IO.FileInfo(System.Reflection.Assembly.GetExecutingAssembly().Location).DirectoryName + "\\MyDatabase.sdf";
SqlCeConnection sqlConnection = new SqlCeConnection("datasource=" + dbfile);
SqlCeDataAdapter sqlAdapter = new SqlCeDataAdapter("select * from MyTable", sqlConnection);
AMCCoreSignalsDBDataSet sqlData = new AMCCoreSignalsDBDataSet();
sqlAdapter.Fill(sqlData);
string strCSVDataLine = "1,2,3,four";
sqlData.Tables[0].Rows.Add(new object[] { strCSVDataLine });
sqlAdapter.Update(sqlData);
sqlConnection.Close();
This code fails to work
How can I use C# to populate my database with the .csv data?
Is my method incorrect/incomplete?
Is there a better way to do this?
The reason I would like to use SQL is that there is a lot of data. I could create a class structure to contain the data, but that would also mean writing many different filter functions, which SQL already provides...
The problems were due to blank values occurring in the .csv file.
This was my fix:
public void Import(string csvfname)
{
    string password;
    string cacheDatabase;
    string connectionString;
    System.IO.StreamReader objFile;
    string strCommand;
    string lineHeader;
    string line;
    string[] arrLineData;
    cacheDatabase = new System.IO.FileInfo(System.Reflection.Assembly.GetExecutingAssembly().Location).DirectoryName + "\\MyDatabase.sdf";
    password = "";
    connectionString = string.Format("DataSource=\"{0}\"; Password='{1}'", cacheDatabase, password);
    objFile = new System.IO.StreamReader(csvfname);
    // Read and discard the header row
    lineHeader = objFile.ReadLine();
    while (!objFile.EndOfStream)
    {
        line = objFile.ReadLine();
        arrLineData = line.Split(',');
        try
        {
            using (SqlCeConnection sqlConnection = new SqlCeConnection(connectionString))
            {
                sqlConnection.Open();
                strCommand = "INSERT INTO MyTable VALUES ('" + arrLineData[0] + "', '" + arrLineData[1] + "', '" + arrLineData[2] + "')";
                SqlCeCommand sqlCommand = new SqlCeCommand(strCommand, sqlConnection);
                sqlCommand.ExecuteNonQuery();
                sqlConnection.Close();
            }
        }
        catch (Exception exc)
        {
            MessageBox.Show("Error in Import(): " + exc.Message);
        }
    }
    objFile.Close();
}
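The fix above still concatenates the raw field text into the SQL. Since blank values were the culprit, one way to guard against them (a sketch; SqlLiteralOrNull is a hypothetical helper, and it assumes the MyTable columns accept NULL) would be:
// Hypothetical helper: return a quoted SQL literal, or NULL when the CSV field is blank
static string SqlLiteralOrNull(string field)
{
    return string.IsNullOrWhiteSpace(field) ? "NULL" : "'" + field.Replace("'", "''") + "'";
}
// Then build the command like this inside the loop:
strCommand = "INSERT INTO MyTable VALUES (" + SqlLiteralOrNull(arrLineData[0]) + ", "
    + SqlLiteralOrNull(arrLineData[1]) + ", " + SqlLiteralOrNull(arrLineData[2]) + ")";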