Inserting huge data into SQL Server using C#

I am using SQL Server 2012 and have a huge file of approximately 20 GB. I want to insert every record in the file into the database. I am using the SqlBulkCopy class for this purpose, but since the data is very large I have to insert it part by part. Here is the code:
String line;
SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["conStrtingName"].ConnectionString);
conn.Open();
StreamReader readFile = new StreamReader(filePath);
SqlTransaction transaction = conn.BeginTransaction();
try
{
SqlBulkCopy copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, transaction);
copy.BulkCopyTimeout = 600;
copy.DestinationTableName = "Txn";
int counter = 0;
while ((line = readFile.ReadLine()) != null)
{
string[] fields = line.Split('\t');
if (fields.Length == 3)
{
DateTime date = Convert.ToDateTime(fields[0]);
decimal txnCount = Convert.ToDecimal(fields[1]);
string merchantName = fields[2];
if (!string.IsNullOrEmpty(merchantName))
{
long MerchantId = Array.IndexOf(Program.merchantArray, merchantName) + 1;
tables[workerId].Rows.Add(MerchantId, date, txnCount);
counter++;
if (counter % 100000 == 0)
Console.WriteLine("Worker: " + workerId + " - Transaction Records Read: " + counter);
if (counter % 1000000 == 0)
{
copy.WriteToServer(tables[workerId]);
transaction.Commit();
tables[workerId].Rows.Clear();
//transaction = conn.BeginTransaction();
Console.WriteLine("Worker: " + workerId + " - Transaction Records Inserted: " + counter);
}
}
}
}
Console.WriteLine("Total Transaction Records Read: " + counter);
if (tables[workerId].Rows.Count > 0)
{
copy.WriteToServer(tables[workerId]);
transaction.Commit();
tables[workerId].Rows.Clear();
Console.WriteLine("Worker: " + workerId + " - Transaction Records Inserted: " + counter);
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
transaction.Rollback();
}
finally
{
conn.Close();
}
It works for the first 100000 records. However, for the next set of records I get an exception: "The transaction is either not associated with the current connection or has been completed."
This happens when control reaches transaction.Commit(); for the next set of records.
Is there a workaround?

The problem is the commented-out line after the transaction is committed. You need to uncomment it, and also reinitialize your SqlBulkCopy copy variable. You'd be better off refactoring your code: the only place where you need the transaction and copy objects is when you flush the data table that you are filling, like this (you can further factor out the repetitive part into a separate method):
String line;
SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["conStrtingName"].ConnectionString);
conn.Open();
StreamReader readFile = new StreamReader(filePath);
SqlTransaction transaction = null;
try
{
int counter = 0;
while ((line = readFile.ReadLine()) != null)
{
string[] fields = line.Split('\t');
if (fields.Length == 3)
{
DateTime date = Convert.ToDateTime(fields[0]);
decimal txnCount = Convert.ToDecimal(fields[1]);
string merchantName = fields[2];
if (!string.IsNullOrEmpty(merchantName))
{
long MerchantId = Array.IndexOf(Program.merchantArray, merchantName) + 1;
tables[workerId].Rows.Add(MerchantId, date, txnCount);
counter++;
if (counter % 100000 == 0)
Console.WriteLine("Worker: " + workerId + " - Transaction Records Read: " + counter);
if (counter % 1000000 == 0)
{
transaction = conn.BeginTransaction();
SqlBulkCopy copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, transaction);
copy.BulkCopyTimeout = 600;
copy.DestinationTableName = "Txn";
copy.WriteToServer(tables[workerId]);
transaction.Commit();
transaction = null;
tables[workerId].Rows.Clear();
Console.WriteLine("Worker: " + workerId + " - Transaction Records Inserted: " + counter);
}
}
}
}
Console.WriteLine("Total Transaction Records Read: " + counter);
if (tables[workerId].Rows.Count > 0)
{
transaction = conn.BeginTransaction();
SqlBulkCopy copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, transaction);
copy.BulkCopyTimeout = 600;
copy.DestinationTableName = "Txn";
copy.WriteToServer(tables[workerId]);
transaction.Commit();
transaction = null;
tables[workerId].Rows.Clear();
Console.WriteLine("Worker: " + workerId + " - Transaction Records Inserted: " + counter);
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
if (transaction != null) transaction.Rollback();
}
finally
{
conn.Close();
}
The problem, though, is that now you cannot roll back ALL the changes in case something goes wrong. Probably the better solution would be to not split your bulk inserts manually, but to use some sort of IDataReader implementation to avoid populating a huge DataTable in memory (for instance using Marc Gravell's ObjectReader).
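A minimal sketch of that streaming approach, assuming the FastMember NuGet package (which provides Marc Gravell's ObjectReader) and a hypothetical ReadRecords iterator that parses one line at a time; the member names passed to ObjectReader.Create are assumed to match the column order of the Txn table:
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;
using FastMember;   // Marc Gravell's FastMember NuGet package

public class TxnRecord
{
    public long MerchantId { get; set; }
    public DateTime Date { get; set; }
    public decimal TxnCount { get; set; }
}

public static class StreamingBulkLoad
{
    // Hypothetical iterator: parses one record per line, skipping bad rows,
    // so only one line is held in memory at a time.
    static IEnumerable<TxnRecord> ReadRecords(string filePath, string[] merchantArray)
    {
        foreach (var line in File.ReadLines(filePath))
        {
            var fields = line.Split('\t');
            if (fields.Length != 3 || string.IsNullOrEmpty(fields[2])) continue;
            yield return new TxnRecord
            {
                MerchantId = Array.IndexOf(merchantArray, fields[2]) + 1,   // Program.merchantArray in the question
                Date = Convert.ToDateTime(fields[0]),
                TxnCount = Convert.ToDecimal(fields[1])
            };
        }
    }

    public static void Load(string connectionString, string filePath, string[] merchantArray)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, null)
                              { DestinationTableName = "Txn", BulkCopyTimeout = 600 })
            using (var reader = ObjectReader.Create(
                       ReadRecords(filePath, merchantArray), "MerchantId", "Date", "TxnCount"))
            {
                // Streams the whole file to SQL Server without building a DataTable.
                copy.WriteToServer(reader);
            }
        }
    }
}
Because the IDataReader is consumed lazily, only one parsed line (plus SqlBulkCopy's internal batch) lives in memory at a time, and you no longer have to manage the commit cycle yourself.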

Your transaction is committed after the first batch. So it is "gone"; you have to start another one with transaction = conn.BeginTransaction().
It would be good to rework the code to better reflect the lifespan of the transaction. You also might want to make sure that "copy" is recreated with the new transaction.

You can increase the timeout for your transaction like this (use values appropriate for the expected length of your transaction). The code below sets it to 15 minutes:
using (TransactionScope scope =
    new TransactionScope(TransactionScopeOption.Required,
                         new System.TimeSpan(0, 15, 0)))
{
    // working code here
}

Related

Update SQL database from DataGridView in C#

I'm new here but I need some help. I need to update a SQL Server database from C# with Windows Forms, but I'm having problems. I looked it up but still can't find the right answer. I need to insert and update by pressing a button, changing or filling the database from the DataGridView. I've created a separate function for both, and I am using this code:
private void InsertPositionen()
{
string qry = "";
SqlCommand insert = new SqlCommand(qry, con);
try
{
for (int i = 0; i < dataGridView1.Rows.Count - 1; i++)
{
qry = "INSERT INTO BelegePositionen (BelID, BelPosId, Artikelnummer, Menge, Preis) VALUES( " + dataGridView1.Rows[i].Cells["BelID"] + ", "
+ dataGridView1.Rows[i].Cells["BelPosId"] + ", "
+ dataGridView1.Rows[i].Cells["Artikelnummer"] + ", "
+ dataGridView1.Rows[i].Cells["Menge"] + ", "
+ dataGridView1.Rows[i].Cells["Preis"];
}
insert.ExecuteNonQuery();
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
private void UpdatePositionen()
{
string updt = "";
SqlCommand update = new SqlCommand(updt, con);
try
{
for (int i = 0; i < dataGridView1.Rows.Count -1; i++)
{
updt = "UPDATE BelegePositionen SET BelID = "
+ dataGridView1.Rows[i].Cells["BelID"] +
", BelPosID = "
+ dataGridView1.Rows[i].Cells["BelPosID"] +
", Atrikelnummer = "
+ dataGridView1.Rows[i].Cells["Artikelnummer"] +
", Menge = "
+ dataGridView1.Rows[i].Cells["Menge"] +
", Preis = "
+ dataGridView1.Rows[i].Cells["Preis"];
}
update.ExecuteNonQuery();
con.Close();
MessageBox.Show("Done!");
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
You should really NOT do your SQL stuff like this!! This leaves your code wide open for SQL injection vulnerabilities! Stop that - right now!
Instead - use parametrized queries - like this:
private void InsertPositionen()
{
string qry = "INSERT INTO BelegePositionen (BelID, BelPosId, Artikelnummer, Menge, Preis) " +
"VALUES(#BelId, #BelPosId, #ArtNr, #Menge, #Preis);";
SqlCommand insert = new SqlCommand(qry, con);
// define the parameters
insert.Parameters.Add("#BelId", SqlDbType.Int);
insert.Parameters.Add("#BelPosId", SqlDbType.Int);
insert.Parameters.Add("#ArtNr", SqlDbType.Int); // maybe this is a string?
insert.Parameters.Add("#Menge", SqlDbType.Int);
insert.Parameters.Add("#Preis", SqlDbType.Decimal, 20, 4);
try
{
// in the loop, only *set* the parameter's values
for (int i = 0; i < dataGridView1.Rows.Count - 1; i++)
{
insert.Parameters["#BelId"].Value = 1;
insert.Parameters["#BelPosId"].Value = 2;
insert.Parameters["#ArtNr"].Value = 3;
insert.Parameters["#Menge"].Value = 4;
insert.Parameters["#Preis"].Value = 99.95;
insert.ExecuteNonQuery();
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
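The same pattern applies to the UpdatePositionen method from the question. A minimal sketch of the parameterized UPDATE, assuming BelID and BelPosId together identify the row to update (the original query has no WHERE clause, so that part is an assumption):
private void UpdatePositionen()
{
    string updt = "UPDATE BelegePositionen SET Artikelnummer = @ArtNr, Menge = @Menge, Preis = @Preis " +
                  "WHERE BelID = @BelId AND BelPosID = @BelPosId;";
    SqlCommand update = new SqlCommand(updt, con);

    update.Parameters.Add("@BelId", SqlDbType.Int);
    update.Parameters.Add("@BelPosId", SqlDbType.Int);
    update.Parameters.Add("@ArtNr", SqlDbType.Int);      // maybe a string in your schema
    update.Parameters.Add("@Menge", SqlDbType.Int);
    update.Parameters.Add("@Preis", SqlDbType.Decimal);

    try
    {
        for (int i = 0; i < dataGridView1.Rows.Count - 1; i++)
        {
            // only set the parameter values inside the loop
            update.Parameters["@BelId"].Value = dataGridView1.Rows[i].Cells["BelID"].Value;
            update.Parameters["@BelPosId"].Value = dataGridView1.Rows[i].Cells["BelPosId"].Value;
            update.Parameters["@ArtNr"].Value = dataGridView1.Rows[i].Cells["Artikelnummer"].Value;
            update.Parameters["@Menge"].Value = dataGridView1.Rows[i].Cells["Menge"].Value;
            update.Parameters["@Preis"].Value = dataGridView1.Rows[i].Cells["Preis"].Value;
            update.ExecuteNonQuery();
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}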
Your question is quite vague: you state you are having problems, but it's not clear what those problems are. It would help if you described what problems you are having.
In addition to what @marc_c said about SQL injection, I can't see how you manage your connection to the database.
From the code it looks like you could run into a situation where you are leaving connections open, or not opening them at all.
Using a using(...) { } block will close the connection when you are done with it.
private void InsertPositionen()
{
//using the using statement you will insure that the connection is closed and resources released
using (SqlConnection connection = new SqlConnection(Properties.Settings.Default.db))
{
string cmd = "INSERT INTO BelegePositionen (BelID, BelPosId, Artikelnummer, Menge, Preis) " +
"VALUES(#BelId, #BelPosId, #ArtNr, #Menge, #Preis);";
//using the using statement will ensure any reasources are released when exiting the code block
using (SqlCommand insert = new SqlCommand(cmd, connection))
{
// define the parameters
insert.Parameters.Add("#BelId", SqlDbType.Int);
insert.Parameters.Add("#BelPosId", SqlDbType.Int);
insert.Parameters.Add("#ArtNr", SqlDbType.Int); // maybe this is a string?
insert.Parameters.Add("#Menge", SqlDbType.Int);
insert.Parameters.Add("#Preis", SqlDbType.Decimal, 20, "4");
try
{
//open the connection
insert.Connection.Open();
// in the loop, only *set* the parameter's values
for (int i = 0; i < dataGridView1.Rows.Count - 1; i++)
{
insert.Parameters["#BelId"].Value = dataGridView1.Rows[i].Cells["BelID"];
insert.Parameters["#BelPosId"].Value = dataGridView1.Rows[i].Cells["BelPosId"];
insert.Parameters["#ArtNr"].Value = dataGridView1.Rows[i].Cells["Artikelnummer"];
insert.Parameters["#Menge"].Value = dataGridView1.Rows[i].Cells["Menge"];
insert.Parameters["#Preis"].Value = dataGridView1.Rows[i].Cells["Preis"];
insert.ExecuteNonQuery();
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
finally
{
MessageBox.Show("Done!");
}
}
}
}

How to use counter with odbc transaction in C#

How do I implement a counter so that it commits multiple queries, for example every 1000 "queries"? The problem is that, with or without a transaction, each query is executed by ExecuteNonQuery() one at a time, not in batches of 1000 as I want.
odbc.dbsqlite.Open();
odbc.dbkopito.Open();
OdbcCommand comlite = odbc.dbsqlite.CreateCommand();
OdbcCommand comkopit = odbc.dbkopito.CreateCommand();
OdbcTransaction transaction = null;
comkopit.CommandText =
"SELECT DISTINCT ... "
#region TRY
try
{
OdbcDataReader dbKopitReader = comkopit.ExecuteReader();
var ordinal = new
{
cenik = dbKopitReader.GetOrdinal("sifra"),
ident = dbKopitReader.GetOrdinal("ident"),
klasifikacija = dbKopitReader.GetOrdinal("klasifikacija"),
cena = dbKopitReader.GetOrdinal("cena"),
eankoda = dbKopitReader.GetOrdinal("eankoda"),
};
int stevec = 0;
while (dbKopitReader.Read())
{
var cena = Convert.ToDouble(dbKopitReader.GetDouble(ordinal.cena));
var ident = Convert.ToString(dbKopitReader.GetString(ordinal.ident));
var cenik = Convert.ToString(dbKopitReader.GetString(ordinal.cenik));
var klasi = Convert.ToString(dbKopitReader.GetString(ordinal.klasifikacija));
var eanko = Convert.ToString(dbKopitReader.GetString(ordinal.eankoda));
using (var cmd = odbc.dbsqlite.CreateCommand() )
{
try
{
cmd.CommandText = "INSERT OR REPLACE INTO ARTIKLI (KLASIFIKACIJA, CENA, BARKODA, CENIK, IDENT) " +
"VALUES (?,?,?,?,?);";
cmd.Parameters.AddWithValue("#KLASIFIKACIJA", klasi);
cmd.Parameters.AddWithValue("#CENA", cena);
cmd.Parameters.AddWithValue("#BARKODA", eanko);
cmd.Parameters.AddWithValue("#CENIK", cenik);
cmd.Parameters.AddWithValue("#IDENT", ident);
cmd.ExecuteNonQuery();
if (stevec % 1000 == 0)
{
transaction.Commit();
transaction = odbc.dbsqlite.BeginTransaction();
cmd.Transaction = transaction;
}
stevec++;
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
try
{
transaction.Rollback();
}
catch
{
Console.WriteLine("Transakcija ni aktivna");
}
}
}
}
comlite.Dispose();
odbc.dbsqlite.Close();
dbKopitReader.Close();
comkopit.Dispose();
odbc.dbkopito.Close();
The transaction is not initiated before the first iteration (you should get a null reference exception).
The transaction is not assigned to the command.
You create a transaction and assign it to the command only on every 1000th iteration, but the next 999 commands are created without a transaction.
You commit the transaction on the very first iteration (0 % 1000 == 0).
Here is an example of how your code should look:
var transaction = odbc.dbsqlite.BeginTransaction();
int count = 0;
while(read)
{
count++;
if(count % 1000 == 0)
{
transaction.Commit();
transaction = odbc.dbsqlite.BeginTransaction();
}
//I imagine this CreateCommand() is assigning connection...
using(var cmd = odbc.dbsqlite.CreateCommand())
{
cmd.CommandText = "INSERT...";
//Params...
cmd.Transaction = transaction;
cmd.ExecuteNonQuery();
}
}
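One addition to the example above (not part of the original answer): rows inserted after the last 1000-row boundary are never committed inside the loop, so a final commit is needed once the reader is exhausted:
// after the while (read) loop: commit the remaining partial batch
transaction.Commit();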

SELECT a row from SQL Server, perform a web request using the information, and then UPDATE the same row

I am writing a program to iterate through a database table (1.4 million rows), select information, make a web request to Google's geocoding service, and then store the longitude and latitude in the same row I got the address from.
I think there might be a better way than what I have come up with, as I have two connections open to the database, and opening and closing a connection each time most probably adds overhead.
I am aware that I could request JSON instead of XML and I might change to that, but I would like to figure out the database connection issue first, and then a way to make this program wait for the enforced 24-hour period to elapse before re-using the API keys.
So here is what I have so far; I would be very grateful for any insights any of you have.
string ApiKey = apikey[a]; //I have a couple of API keys in an array
while (a < 7)
{
using (SqlConnection conn = new SqlConnection(connection))
{
for (int j = 1; j <= 1463758; j++)
{
if (count % 2498 == 0) //2500 requests per API per day
{
a = a + 1;
}
SqlCommand cmd = new SqlCommand("SELECT AddressToGeoCode FROM testDB WHERE id = #id AND Status = 'Not Done'", conn);
try
{
conn.Open();
cmd.Parameters.AddWithValue("#id", j);
SqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
string address = reader[0].ToString();
var requestUri = string.Format("https://maps.googleapis.com/maps/api/geocode/xml?address={0}&key={1}", Uri.EscapeDataString(address), ApiKey);
var request = WebRequest.Create(requestUri);
var response = request.GetResponse();
var xdoc = XDocument.Load(response.GetResponseStream());
string status = xdoc.Element("GeocodeResponse").Element("status").Value;
var result = xdoc.Element("GeocodeResponse").Element("result");
var locationType = result.Element("geometry").Element("location_type").Value;
var locationElement = result.Element("geometry").Element("location");
float lat = float.Parse(locationElement.Element("lat").Value.ToString());
float lng = float.Parse(locationElement.Element("lng").Value.ToString());
Console.WriteLine("Api key number: " + a + ", " + apikey[a]);
Console.WriteLine("status: " + status);
Console.WriteLine("Coordinates: " + lat + ", " + lng);
using (SqlConnection conn2 = new SqlConnection(connection))
{
SqlCommand cmd2 = new SqlCommand("UPDATE testDB SET Lat = #lat, Long = #lng, LocationType = #lType WHERE id = #id", conn2);
try
{
conn2.Open();
cmd2.Parameters.AddWithValue("#id", j);
cmd2.Parameters.AddWithValue("#lat", lat);
cmd2.Parameters.AddWithValue("#lng", lng);
cmd2.Parameters.AddWithValue("#lType", locationType);
cmd2.ExecuteNonQuery();
conn2.Close();
}
catch (Exception ex)
{
Library.WriteErrorLog("Update exception " + ex.ToString());
}
}
count = count + 1;
}
conn.Close();
}
catch (Exception e)
{
Library.WriteErrorLog("Update exception " + .ToString());
};
}
Console.Read();
}
}
It's much more efficient to update all rows with one SQL query rather than update each row in the loop.
You can insert all the items that you want to update into a temp table and then run a single UPDATE ... JOIN statement outside of the loop.
bulk-record-update-with-sql is an example
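A sketch of that temp-table approach; the table and column names follow the question, while the #GeoResults temp table and the use of SqlBulkCopy to fill it are assumptions:
using (var conn = new SqlConnection(connection))
{
    conn.Open();

    // 1. Collect the geocoding results locally instead of updating row by row.
    var results = new DataTable();
    results.Columns.Add("id", typeof(int));
    results.Columns.Add("Lat", typeof(double));
    results.Columns.Add("Long", typeof(double));
    results.Columns.Add("LocationType", typeof(string));
    // ... fill 'results' inside the geocoding loop ...

    // 2. Bulk copy them into a temp table on the same connection.
    using (var create = new SqlCommand(
        "CREATE TABLE #GeoResults (id int PRIMARY KEY, Lat float, [Long] float, LocationType nvarchar(50))", conn))
    {
        create.ExecuteNonQuery();
    }
    using (var copy = new SqlBulkCopy(conn) { DestinationTableName = "#GeoResults" })
    {
        copy.WriteToServer(results);
    }

    // 3. One set-based UPDATE ... JOIN instead of 1.4 million single-row updates.
    using (var update = new SqlCommand(
        "UPDATE t SET t.Lat = g.Lat, t.[Long] = g.[Long], t.LocationType = g.LocationType " +
        "FROM testDB t JOIN #GeoResults g ON t.id = g.id", conn))
    {
        update.ExecuteNonQuery();
    }
}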

Inserting into SQLite takes too long

I insert data like this but it takes too long.
For 34,000 records it took 20 minutes! (For comparison, inserting into SQL Server CE took only 3 minutes.)
conn_sql = new SQLiteConnection(conn_str2);
conn_sql.Open();
cmd_sql = conn_sql.CreateCommand();
for (int i = 0; i < iTotalRows; i++)
{
try
{
Makat = dsView.Tables["Items"].Rows[i]["Makat"].ToString().Trim();
}
catch { Makat = ""; }
try
{
Barcode = dsView.Tables["Items"].Rows[i]["Barcode"].ToString().Trim();
}
catch { Barcode = ""; }
try
{
Des = dsView.Tables["Items"].Rows[i]["Des"].ToString().Trim();
}
catch { Des = ""; }
try
{
Price = dsView.Tables["Items"].Rows[i]["Price"].ToString().Trim();
}
catch { Price = ""; }
SQL = "INSERT INTO Catalog(Makat,Barcode,Des,Price)VALUES('" + Makat + "','" + Barcode + "','" + Des + "','" + Price + "')";
cmd_sql.CommandText = SQL;
cmd_sql.CommandType = CommandType.Text;
cmd_sql.ExecuteNonQuery();
//cmd_sql.Dispose();
}
How can I insert faster?
SQLite implicitly wraps each statement in its own transaction when none is active, so every INSERT here is committed (and synced to disk) individually, which slows things down considerably. You should get a significant speed boost if you start a transaction before the loop and commit it once the loop has completed:
conn_sql.Open();
using(var tran = conn_sql.BeginTransaction()) // <--- create a transaction
{
cmd_sql = conn_sql.CreateCommand();
cmd_sql.Transaction = tran;   // <--- assign the transaction to the command
              
for (int i = 0; i < iTotalRows; i++)
{
// ...
cmd_sql.CommandText = SQL;
cmd_sql.CommandType = CommandType.Text;
cmd_sql.ExecuteNonQuery();
//cmd_sql.Dispose();
}
tran.Commit(); // <--- commit the transaction
} // <--- transaction will rollback if not committed already
If you do it in a single transaction it should be faster:
SQLiteTransaction transaction = null;
try
{
    conn_sql = new SQLiteConnection(conn_str2);
    conn_sql.Open();
    cmd_sql = conn_sql.CreateCommand();
    transaction = conn_sql.BeginTransaction();
    cmd_sql.Transaction = transaction;
    for (int i = 0; i < iTotalRows; i++)
    {
        // create SQL string
        cmd_sql.CommandText = SQL;
        cmd_sql.CommandType = CommandType.Text;
        cmd_sql.ExecuteNonQuery();
    }
    transaction.Commit();
}
catch
{
    if (transaction != null) transaction.Rollback();
}
First, try to use a StringBuilder for the string concatenation. Secondly, you are making 34k round trips, and that is slow; try to make fewer requests. For example, use a StringBuilder to concatenate a few thousand INSERT statements into one command and execute it inside a transaction, and repeat until all your data is saved.
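A rough sketch of that batching idea, reusing the dsView, conn_sql and iTotalRows variables from the question; values are concatenated directly for brevity, so in real code they would still need escaping (or, better, parameters):
var sb = new System.Text.StringBuilder();
using (var tran = conn_sql.BeginTransaction())
using (var cmd = conn_sql.CreateCommand())
{
    cmd.Transaction = tran;
    for (int i = 0; i < iTotalRows; i++)
    {
        DataRow row = dsView.Tables["Items"].Rows[i];
        sb.AppendFormat("INSERT INTO Catalog(Makat,Barcode,Des,Price) VALUES('{0}','{1}','{2}','{3}');",
            row["Makat"], row["Barcode"], row["Des"], row["Price"]);

        // every ~5000 rows (and at the very end) send one command containing all the INSERTs
        if ((i + 1) % 5000 == 0 || i == iTotalRows - 1)
        {
            cmd.CommandText = sb.ToString();
            cmd.ExecuteNonQuery();
            sb.Clear();
        }
    }
    tran.Commit();
}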

SQL Server timeout in recursive code

I have a base class which implements my database connection, and a second class that inherits this base database class. The second class has some recursion inside it: when evaluating its value, it may instantiate another instance of the second class. The recursion is only a few levels deep. I am running everything single-threaded.
My code will run correctly for about 1 or 2 minutes, then I begin getting consistent errors: "Timeout Expired. The timeout period elapsed prior to obtaining a connection from the pool".
My base class has a destructor which calls the .Dispose() method on the database objects. My second class has a destructor which closes the connection object in the base class.
My connection string to the database specifies a connection timeout = 0.
Any ideas as to why the code will work correctly for a few minutes and then begin timing out trying to connect to the database? I'm baffled.
namespace BaseLib2
{
public class TSBase
{
protected StreamWriter logFile;
protected OleDbCommand queryCmd;
protected OleDbCommand exeCmd;
protected OleDbConnection connection;
protected OleDbDataReader reader;
public SqlConnection sqlconn;//used for BCP
public TSBase()
{
}
~TSBase()
{
try
{
queryCmd.Dispose();
exeCmd.Dispose();
reader.Dispose();
connection.Dispose();
sqlconn.Dispose();
}
catch (Exception ex)
{
Console.WriteLine("BaseLib2 destrutor:" + ex.Message);
}
}
public void ConnectToDB()
{
string connString = "Provider=SQLNCLI11;Server=myserver;Database=mydb;Uid=myid;pwd=password;connection timeout=0";
queryCmd = new OleDbCommand();
exeCmd = new OleDbCommand();
connection = new OleDbConnection(connString);
queryCmd.CommandTimeout = 60000;
exeCmd.CommandTimeout = 60000;
connection.Open();
queryCmd.Connection = connection;
exeCmd.Connection = connection;
string sqlConnString = "server=dc2k8housql;database=mydb;Uid=myid;pwd=password;connection timeout=0";
sqlconn = new SqlConnection(sqlConnString);
sqlconn.Open();
}
public class Expression : BaseLib2.TSBase
{
private string ExpName;
private string ExpressionTxt;
private string sql;
private DateTime Contract_dt;
private DateTime Quote_dt;
private bool SaveToDB;
private string BaseSymbol;
public Expression(string expNameIn, DateTime contract_dtIn, DateTime quote_dtIn)
{
ExpName = expNameIn;
Contract_dt = contract_dtIn;
Quote_dt = quote_dtIn;
try
{
try
{
ConnectToDB();
}
catch (Exception ex)
{
Console.WriteLine("Error in EXP constructor connecting to database." + ex.Message );
throw new Exception("Error in EXP constructor connecting to database.");
}
//get expression text from database
sql = "select expression, save_to_db, coalesce(base_symbol, '') as base_symbol from expressions where exp_name = " + DBI(ExpName);
reader = ReadData(sql);
if (reader.Read())//should only return 1 row
{
ExpressionTxt = reader[0].ToString();
SaveToDB = bool.Parse(reader[1].ToString());
BaseSymbol = reader[2].ToString();
}
reader.Close();
}
catch (Exception ex)
{
Console.WriteLine("Exception in Expression constructor:" + ex.Message);
}
}
~Expression()
{
try
{
connection.Close();
sqlconn.Close();
connection.Dispose();
sqlconn.Dispose();
}
catch (Exception ex)
{
Console.WriteLine("Error in destructor:" + ex.Message);
}
}
public double Eval()
{
try
{
//check to see if there are any $RV in the expression
if (ExpressionTxt.Contains("$RV("))
{
//parse and evaluate the $RV's
String[] split = ExpressionTxt.Split(("$".ToCharArray()));
foreach (string s in split){
Console.WriteLine("s=" + s);
if (s.Length > 3)//make sure we have a string with a symbol in it
{
//for each rv we find, create a new expression and evaluate it
if (s.Substring(0, 3).Contains("RV"))
{
int pStart = s.IndexOf("(");
int pEnd = s.IndexOf(")");
string rvSymb = s.Substring(pStart + 1, pEnd - pStart - 1);
System.Console.WriteLine(rvSymb);
Expression oExp = new Expression(rvSymb, Contract_dt, Quote_dt);
double rVal = oExp.Eval();//recursive call
oExp = null;
ExpressionTxt = ExpressionTxt.Replace("$RV(" + rvSymb + ")", rVal.ToString());
}
}
}
}
//replace SV values in formula
if (ExpressionTxt.Contains("$SV("))
{
//find symbols in $SV brackets and collect contract dates
String[] split = ExpressionTxt.Split (("$".ToCharArray()));
foreach (string s in split)
{
if (s.Length > 3)
{//make sure we have a symbol
if (s.Substring(0, 3).Contains("SV"))
{
int pStart = s.IndexOf("(");
int pEnd = s.IndexOf(")");
string svSymb = s.Substring(pStart + 1, pEnd - pStart - 1);
System.Console.WriteLine("sv=" + svSymb);
//replace $SV with numerical values
double sVal = GetQuoteValue(svSymb);
ExpressionTxt = ExpressionTxt.Replace("$SV(" + svSymb + ")", sVal.ToString());
}
}
}
}
//evaluate
double ret = Evaluate(ExpressionTxt);
Console.WriteLine(ExpName + "=" + ret.ToString());
if (SaveToDB)
{
Console.WriteLine(ExpName + " cd:" + Contract_dt.ToShortDateString() + " qd:" + Quote_dt.ToShortDateString() + ": saving to db...");
sql = "delete from exp_quotes where exp_name = " + DBI(ExpName ) ;
sql = sql + " and contract_dt = " + DBI(Contract_dt.ToShortDateString());
sql = sql + " and quote_dt = " + DBI(Quote_dt.ToShortDateString());
WriteData(sql);
sql = "insert into exp_quotes(exp_name, contract_dt, quote_dt, calculated_dt, price) values(";
sql = sql + DBI(ExpName ) + "," + DBI(Contract_dt.ToShortDateString()) + "," + DBI(Quote_dt.ToShortDateString());
sql = sql + ", getdate(), " + ret + ")";
WriteData(sql);
}
connection.Close();//after we evaluate, close down the connection
connection.Dispose();
return ret;
//return value
}
catch (Exception ex)
{
Console.WriteLine("exp:" + ExpName + " cd:" + Contract_dt.ToShortDateString() + " qd:" + Quote_dt.ToShortDateString() + " = " + ex.Message);
}
return 0;
}
private double GetQuoteValue(string symbIn)
{
double ret = 0;
sql = "select close_val from prices_union_all_vw where symbol = " + DBI(symbIn) + " and contract_dt = " + DBI(Contract_dt.ToShortDateString()) + " and quote_dt = " + DBI(Quote_dt.ToShortDateString());
reader = ReadData(sql);
if (reader.Read())
{
ret = Double.Parse(reader[0].ToString());
reader.Close();
}
else
{//we didn't get a record for the specific quote date, try again using the mostrecent view
sql = "select close_val from prices_union_all_mostrecent_vw where symbol = " + DBI(symbIn) + " and contract_dt = " + DBI(Contract_dt.ToShortDateString());
reader = ReadData(sql);
if (reader.Read())
{
ret = Double.Parse(reader[0].ToString());
}
reader.Close();
}
return ret;
}
private static double Evaluate(string expression)
{
var loDataTable = new DataTable();
var loDataColumn = new DataColumn("Eval", typeof(double), expression);
loDataTable.Columns.Add(loDataColumn);
loDataTable.Rows.Add(0);
return (double)(loDataTable.Rows[0]["Eval"]);
}
You are exhausting your available pool of connections because you are creating a connection to the database for every Expression and sub-Expression that you parse, and they are not being cleaned up in time to be re-used.
Solution: Do not make connections recursively, or iteratively, or whatever. Make one for one purpose and just use it. And if you need to release a connection in time to re-use it, do NOT rely on class destructors, because they do not run when you want them to.
In general, classes that allocate limited external resources (like connections) implicitly in their constructors should be pretty static objects, and you certainly do not normally want to inherit from them in a class that is intended to create objects as dynamically as a parser.
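A minimal sketch of that advice, assuming a hypothetical refactor in which Expression receives an already-open connection through its constructor instead of calling ConnectToDB itself (the extra constructor parameter is an assumption, not the original signature):
// Hypothetical refactor: open one connection up front and share it with every
// Expression and sub-Expression, instead of opening a new one per instance and
// relying on destructors to release it.
string connString = "Provider=SQLNCLI11;Server=myserver;Database=mydb;Uid=myid;pwd=password";
using (OleDbConnection connection = new OleDbConnection(connString))
{
    connection.Open();

    // Assumed constructor overload that takes the shared, open connection;
    // contractDt / quoteDt are the dates you already pass in today.
    Expression exp = new Expression("MyExpression", contractDt, quoteDt, connection);
    double value = exp.Eval();   // sub-expressions reuse the same connection

    Console.WriteLine(value);
}   // the single connection is closed deterministically here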
Have you tried extending the timeout period?
Add a big timeout to the connection string like "Connect Timeout=1800". This usually helps me when I get such messages.
The other thing you can look at is whether you can improve the query further.
You might check your Max connection setting for the database. Also check how many active connections are open when new connection attempts start to time out.
How to determine total number of open/active connections in ms sql server 2005
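For reference, a quick way to check the active connection count from C# (a sketch using the legacy sys.sysprocesses compatibility view; the connection string is assumed):
// Counts open connections per database via sys.sysprocesses
// (available on SQL Server 2005 and later as a compatibility view).
using (SqlConnection conn = new SqlConnection(connString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT DB_NAME(dbid) AS DatabaseName, COUNT(*) AS Connections " +
    "FROM sys.sysprocesses WHERE dbid > 0 GROUP BY dbid", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader["DatabaseName"] + ": " + reader["Connections"]);
    }
}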
