How to prevent resources from being consumed by long-running queries? - C#

I have an Access DB, and one of my C#/.NET functions needs to execute a SQL statement more than 4000 times within a transaction.
It seems that after execution the DB file stays open exclusively: a *.ldb lock file appears and remains there for a long time.
Is this caused by disposing resources incorrectly?
private int AmendUniqueData(Trans trn)
{
    int result = 0;
    foreach (DataRow dr in _dt.Rows)
    {
        OleDbParameter[] _params = {
            new OleDbParameter("@templateId", dr["Id"].ToString()),
            new OleDbParameter("@templateNumber", dr["templateNumber"].ToString())
        };
        string sqlUpdateUnique = "UPDATE " + dr["proformaNo"].ToString().Substring(0, 2)
            + "_unique SET templateId = @templateId WHERE templateNumber = @templateNumber";
        result = OleDBHelper.ExecSqlWithTran(sqlUpdateUnique, trn, _params);
        if (result < 0)
        {
            throw new Exception(dr["Id"].ToString());
        }
    }
    return result;
}
The transaction:
using (Trans trn = new Trans())
{
    try
    {
        int result = AmendUniqueData(trn);
        trn.Commit();
        return result;
    }
    catch
    {
        trn.RollBack();
        throw;
    }
    finally
    {
        trn.Close();
    }
}

Don't forget to close the database connection.
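Beyond that, here is a minimal sketch of the disposal pattern, assuming plain OleDb against an Access file (the connection string, table name, and values are placeholders): wrap the connection, transaction, and command in using blocks so everything is released even when an exception is thrown, which lets Access remove the *.ldb lock file.

// Sketch only: OleDb disposal pattern for Access; all names/values are placeholders.
using System.Data.OleDb;

string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\mydb.accdb";
using (OleDbConnection conn = new OleDbConnection(connStr))
{
    conn.Open();
    using (OleDbTransaction tran = conn.BeginTransaction())
    {
        try
        {
            using (OleDbCommand cmd = conn.CreateCommand())
            {
                cmd.Transaction = tran;
                // OleDb binds parameters by position, so '?' placeholders are safest.
                cmd.CommandText = "UPDATE AB_unique SET templateId = ? WHERE templateNumber = ?";
                cmd.Parameters.AddWithValue("@templateId", "1");
                cmd.Parameters.AddWithValue("@templateNumber", "42");
                cmd.ExecuteNonQuery();
            }
            tran.Commit();
        }
        catch
        {
            tran.Rollback();
            throw;
        }
    }
} // connection disposed here; the *.ldb file should disappear shortly afterwards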

Related

MySqlConnector issue with async/await and multiple threads

I have an issue in production with MySQL connections where an error that the connection is either already in use or not open comes up frequently. I think the issue happens when multiple requests hit the method concurrently; stats from production show 80 different instances hitting the GraphQL method. I am using SqlKata to make DB calls. When there are no parallel calls, it works fine.
ConnectionManager.cs
public IDbConnection GetConnection
{
    get
    {
        if (_iDbConnection != null)
        {
            if (_iDbConnection.State != ConnectionState.Open)
            {
                _iDbConnection.Open();
                return _iDbConnection;
            }
        }
        if (_iDbConnection != null && _iDbConnection.State == ConnectionState.Open)
        {
            // close the previous connection
            _iDbConnection.Close();
            // open new connection
            _iDbConnection.Open();
            return _iDbConnection;
        }
        try
        {
            string connectionString = Configuration.GetConnectionString("seasonal");
            // Register the factory
            DbProviderFactories.RegisterFactory("MySqlConnector", MySqlConnectorFactory.Instance);
            // Get the provider invariant names
            IEnumerable<string> invariants = DbProviderFactories.GetProviderInvariantNames();
            var factory = DbProviderFactories.GetFactory(invariants.FirstOrDefault());
            var conn = factory.CreateConnection();
            if (conn != null)
            {
                conn.ConnectionString = connectionString;
                conn.Open();
                _iDbConnection = conn;
            }
        }
        catch (Exception e)
        {
            throw;
        }
        return _iDbConnection;
    }
}
Service (constructor)
public OutboundRecordService(IConnectionManager connectionManager) : base(connectionManager)
{
    _logger = LogManager.GetLogger(typeof(OutboundRecordService<T>)); // assign before first use
    IDbConnection connection = connectionManager.GetConnection;
    if (connection.State != ConnectionState.Open)
    {
        _logger.Error($"Connection is not open, current state: {connection.State}");
    }
    _queryFactory = new QueryFactory(connection, new MySqlCompiler());
}
public async Task<CareMetx.Service.Models.PaginationResult<OutboundRecord>> GetAllByCriteria(OutboundRecordSearchCriteria searchCriteria)
{
    try
    {
        // START - Test code
        var arrayTask = new List<Task>();
        int i = 10;
        for (int c = 0; c < i; c++)
        {
            arrayTask.Add(Task.Factory.StartNew(() => GetDataFromDB(searchCriteria)));
        }
        Task.WaitAll(arrayTask.ToArray());
        Console.WriteLine("All threads complete");
        return await Task.FromResult<CareMetx.Service.Models.PaginationResult<OutboundRecord>>(null);
        // END - Test code
    }
    catch (Exception ex)
    {
        var message = ex.Message;
    }
    return null;
}
private async Task<List<OutboundRecord>> GetDataFromDB(OutboundRecordSearchCriteria searchCriteria)
{
    Query dbQuery = _queryFactory.Query("SELECT * FROM OutboundRecords Where BatchId = 1");
    // Clone the query and get the total count
    var totalCountQuery = dbQuery.Clone();
    totalCountQuery = totalCountQuery.AsCount();
    // Log Record Count SQL
    LogSqlQuery(totalCountQuery);
    // Get the total record count
    var totalRecords = await totalCountQuery.FirstOrDefaultAsync<int>();
    // ... (remainder of the method elided in the original post)
}
The exception is the one described above: the connection is either already in use or not open.
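The usual remedy is to stop sharing a single IDbConnection across threads: MySqlConnector pools physical connections internally, so a short-lived connection per operation is cheap. A minimal sketch under that assumption (the class and method names are hypothetical):

// Sketch: one connection per unit of work; MySqlConnector pools physical
// connections under the hood, so Open()/Dispose() per call is cheap.
using System.Threading.Tasks;
using MySqlConnector;
using SqlKata.Compilers;
using SqlKata.Execution;

public class OutboundRecordRepository
{
    private readonly string _connectionString;

    public OutboundRecordRepository(string connectionString)
    {
        _connectionString = connectionString; // pooling is on by default in MySqlConnector
    }

    public async Task<int> CountByBatchAsync(int batchId)
    {
        // A new connection per call: no state is shared between threads.
        using (var connection = new MySqlConnection(_connectionString))
        {
            var db = new QueryFactory(connection, new MySqlCompiler());
            return await db.Query("OutboundRecords")
                           .Where("BatchId", batchId)
                           .CountAsync<int>();
        }
    }
}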

ODBC ERROR [HY000] [TOD][ODBC][GENESIS]VISION: Cannot insert into 'STOCKTRN',No locks available

I send SQL commands over ODBC to insert into another application's database.
Looping over the selected forms and their rows, I create the command.
"OnTransactionCommand" simply returns the SQL string for the related row, to be sent over ODBC to the table 'STOCKTRN'.
But the problem is that my integration function sometimes returns the error below:
ERROR [HY000] [TOD][ODBC][GENESIS]VISION: Cannot insert into 'STOCKTRN',No locks available
I am not sure, but it is probably related to the number of executions. With similar data it works nicely when I test with 15-20 records, but with 50+ records it returns the error.
Is there some kind of limit, or am I missing something?
My function:
private void MTransaction(MyGridView gvList)
{
    try
    {
        var m_Integration = ctx.Integration.Take(1).FirstOrDefault();
        if (m_Integration != null)
        {
            DbProviderFactory mFactory = DbProviderFactories.GetFactory(m_Integration.ServerType);
            using (DbConnection dbcon = mFactory.CreateConnection())
            {
                dbcon.ConnectionString = m_Integration.ConnString;
                for (int i = 0; i < gvList.RowCount; i++)
                {
                    if (Convert.ToBoolean(gvList.GetRowCellValue(i, "Select")))
                    {
                        dbcon.Open();
                        int m_TransactionId = Convert.ToInt32(gvList.GetRowCellValue(i, "Id"));
                        MTransaction m_Transaction = ctx.Transaction.Where(c => c.Id == m_TransactionId).FirstOrDefault();
                        using (DbTransaction dbtr = dbcon.BeginTransaction())
                        {
                            try
                            {
                                using (DbCommand dbcmd = dbcon.CreateCommand())
                                {
                                    dbcmd.Transaction = dbtr;
                                    if (m_Transaction != null)
                                    {
                                        int TransactionRowCounter = 1;
                                        foreach (var item in m_Transaction.TransactionRow)
                                        {
                                            dbcmd.CommandText = OnTransactionCommand(m_Transaction, item, TransactionRowCounter);
                                            dbcmd.ExecuteNonQuery();
                                            TransactionRowCounter++;
                                        }
                                    }
                                    dbtr.Commit();
                                }
                            }
                            catch (Exception err)
                            {
                                dbtr.Rollback();
                                dbcon.Close();
                                XtraMessageBox.Show(err.Message);
                                continue;
                            }
                        }
                        dbcon.Close();
                    }
                }
            }
        }
    }
    catch (Exception err)
    {
        SuccessMessage = "Error:\n" + err.Message;
    }
}
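"No locks available" typically means the driver's lock table filled up, because every inserted row holds a lock until the transaction commits; that matches 15-20 rows succeeding and 50+ failing. A hedged sketch of one workaround: open the connection once and commit every few statements so locks are released periodically. The batch size of 20 is an assumption to tune, and note the trade-off that already-committed batches cannot be rolled back later.

// Sketch: commit in small batches so the driver can release row locks.
const int BatchSize = 20; // assumed value; tune to the driver's lock limit
int pending = 0;

dbcon.Open(); // open once, outside the row loop
DbTransaction dbtr = dbcon.BeginTransaction();
try
{
    foreach (var item in m_Transaction.TransactionRow)
    {
        using (DbCommand dbcmd = dbcon.CreateCommand())
        {
            dbcmd.Transaction = dbtr;
            dbcmd.CommandText = OnTransactionCommand(m_Transaction, item, ++pending);
            dbcmd.ExecuteNonQuery();
        }
        if (pending % BatchSize == 0)
        {
            dbtr.Commit();                   // releases the locks held so far
            dbtr.Dispose();
            dbtr = dbcon.BeginTransaction(); // start the next batch
        }
    }
    dbtr.Commit(); // commit the remainder
}
catch
{
    dbtr.Rollback(); // rolls back only the current batch
    throw;
}
finally
{
    dbtr.Dispose();
    dbcon.Close();
}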

MySQL OdbcCommand sometimes hangs?

I am running a loop in C# that reads a file and makes updates to a MySQL database via the MySQL ODBC 5.1 driver in a Windows 8 64-bit environment.
The operation is simple:

Count +1
See if the file exists
Load the XML file (XDocument)
Fetch data from the XDocument
Open the ODBCConnection
Run a couple of stored procedures against the MySQL database to store the data
Close the ODBCConnection

The problem is that after a while it will hang, for example on an OdbcCommand.ExecuteNonQuery. It is not always the same stored procedure it hangs on.
This is a real problem: I need to loop over 60,000 files, but it only lasts around 1,000 at a time.
Edit 1:
The problem seems to occur here every time:
public bool addPublisherToGame(int inPublisherId, int inGameId)
{
    string sqlStr;
    OdbcCommand commandObj;
    try
    {
        // ODBC binds parameters by position ('?'); the names below are labels only.
        sqlStr = "INSERT INTO games_publisher_binder (gameId, publisherId) VALUE(?,?)";
        commandObj = new OdbcCommand(sqlStr, mainConnection);
        commandObj.Parameters.Add("@gameId", OdbcType.Int).Value = inGameId;
        commandObj.Parameters.Add("@publisherId", OdbcType.Int).Value = inPublisherId;
        return Convert.ToInt32(executeNonQuery(commandObj)) > 0;
    }
    catch (Exception ex)
    {
        throw (loggErrorMessage(this.ToString(), "addPublisherToGame", ex, -1, "", ""));
    }
}
protected object executeNonQuery(OdbcCommand inCommandObj)
{
    try
    {
        //FileStream file = new FileStream(@"d:\test.txt", FileMode.Append, FileAccess.Write);
        //System.IO.StreamWriter stream = new System.IO.StreamWriter(file);
        //stream.WriteLine(DateTime.Now.ToString() + " - " + inCommandObj.CommandText);
        //stream.Close();
        //file.Close();
        //mainConnection.Open();
        return inCommandObj.ExecuteNonQuery();
    }
    catch (Exception ex)
    {
        throw; // rethrow without resetting the stack trace
    }
}
I can see that the in-parameters are correct.
The connection is opened and closed in a top-level method for every loop iteration (with a finally block).
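One diagnostic step, offered as a suggestion rather than a known fix: give the command an explicit CommandTimeout so a stuck ExecuteNonQuery throws instead of blocking forever (whether the timeout is honored depends on the ODBC driver).

// Sketch: fail fast instead of hanging indefinitely.
commandObj = new OdbcCommand(sqlStr, mainConnection);
commandObj.CommandTimeout = 60; // seconds; 0 would mean wait forever
commandObj.ExecuteNonQuery();   // throws an OdbcException on timeout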
Edit 2:
This is the method that extracts the information and saves it to the database:
public Boolean addBoardgameToDatabase(XElement boardgame, GameFactory gameFactory)
{
    int incomingGameId = -1;
    XElement tmpElement;
    string primaryName = string.Empty;
    List<string> names = new List<string>();
    GameStorage externalGameStorage;
    int retry = 3;
    try
    {
        if (boardgame.FirstAttribute != null &&
            boardgame.FirstAttribute.Value != null)
        {
            while (retry > -1)
            {
                try
                {
                    incomingGameId = int.Parse(boardgame.FirstAttribute.Value);
                    #region Find primary name
                    tmpElement = boardgame.Elements("name").Where(c => c.Attribute("primary") != null).FirstOrDefault(a => a.Attribute("primary").Value.Equals("true"));
                    if (tmpElement != null)
                        primaryName = tmpElement.Value;
                    else
                        return false;
                    #endregion
                    externalGameStorage = new GameStorage(incomingGameId,
                        primaryName,
                        string.Empty,
                        getDateTime("1/1/" + boardgame.Element("yearpublished").Value),
                        getInteger(boardgame.Element("minplayers").Value),
                        getInteger(boardgame.Element("maxplayers").Value),
                        boardgame.Element("playingtime").Value,
                        0, 0, false);
                    gameFactory.updateGame(externalGameStorage);
                    gameFactory.updateGameGrade(incomingGameId);
                    gameFactory.removeDesignersFromGame(externalGameStorage.id);
                    foreach (XElement designer in boardgame.Elements("boardgamedesigner"))
                    {
                        gameFactory.updateDesigner(int.Parse(designer.FirstAttribute.Value), designer.Value);
                        gameFactory.addDesignerToGame(int.Parse(designer.FirstAttribute.Value), externalGameStorage.id);
                    }
                    gameFactory.removePublishersFromGame(externalGameStorage.id);
                    foreach (XElement publisher in boardgame.Elements("boardgamepublisher"))
                    {
                        gameFactory.updatePublisher(int.Parse(publisher.FirstAttribute.Value), publisher.Value, string.Empty);
                        gameFactory.addPublisherToGame(int.Parse(publisher.FirstAttribute.Value), externalGameStorage.id);
                    }
                    foreach (XElement element in boardgame.Elements("name").Where(c => c.Attribute("primary") == null))
                        names.Add(element.Value);
                    gameFactory.removeGameNames(incomingGameId);
                    foreach (string name in names)
                        if (name != null && name.Length > 0)
                            gameFactory.addGameName(incomingGameId, name);
                    return true;
                }
                catch (Exception)
                {
                    retry--;
                    if (retry < 0)
                        return false;
                }
            }
        }
        return false;
    }
    catch (Exception ex)
    {
        throw (new Exception(this.ToString() + ".addBoardgameToDatabase : " + ex.Message, ex));
    }
}
And then one step higher, the method that triggers addBoardgameToDatabase:
private void StartThreadToHandleXmlFile(int gameId)
{
    FileInfo fileInfo;
    XDocument xmlDoc;
    Boolean gameAdded = false;
    GameFactory gameFactory = new GameFactory();
    try
    {
        fileInfo = new FileInfo(_directory + "\\" + gameId.ToString() + ".xml");
        if (fileInfo.Exists)
        {
            xmlDoc = XDocument.Load(fileInfo.FullName);
            if (addBoardgameToDatabase(xmlDoc.Element("boardgames").Element("boardgame"), gameFactory))
            {
                gameAdded = true;
                fileInfo.Delete();
            }
            else
                return;
        }
        if (!gameAdded)
        {
            gameFactory.InactivateGame(gameId);
            fileInfo.Delete();
        }
    }
    catch (Exception)
    { throw; }
    finally
    {
        if (gameFactory != null)
            gameFactory.CloseConnection();
    }
}
And then finally, the top level:
public void UpdateGames(string directory)
{
    DirectoryInfo dirInfo;
    FileInfo fileInfo;
    Thread thread;
    int gameIdToStartOn = 1;
    dirInfo = new DirectoryInfo(directory);
    if (dirInfo.Exists)
    {
        _directory = directory;
        fileInfo = dirInfo.GetFiles("*.xml").OrderBy(c => int.Parse(c.Name.Replace(".xml", ""))).FirstOrDefault();
        gameIdToStartOn = int.Parse(fileInfo.Name.Replace(".xml", ""));
        for (int gameId = gameIdToStartOn; gameId < 500000; gameId++)
        {
            try
            { StartThreadToHandleXmlFile(gameId); }
            catch (Exception) { }
        }
    }
}
Use SQL connection pooling by adding "Pooling=true" to your connection string.
Make sure you properly close the connection AND the file.
You could also create one large query and execute it only once; I think it would be a lot faster than 60,000 separate queries!
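A sketch of the first two suggestions combined. The DSN, credentials, and stored-procedure name are placeholders, and note that for ODBC, pooling is often configured in the ODBC driver manager, so treat "Pooling=true" as the suggestion above rather than guaranteed behavior.

// Sketch: short-lived connection per file; using blocks guarantee cleanup.
string connStr = "DSN=myMySqlDsn;Uid=user;Pwd=secret;Pooling=true;";
foreach (string file in Directory.EnumerateFiles(directory, "*.xml"))
{
    string xml = File.ReadAllText(file); // read and release the file first
    using (var conn = new OdbcConnection(connStr))
    using (var cmd = new OdbcCommand("{CALL storeGameData(?)}", conn)) // hypothetical SP
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@data", OdbcType.Text).Value = xml;
        conn.Open();          // cheap when a pooled connection is available
        cmd.ExecuteNonQuery();
    } // Dispose() closes (or returns) the connection every iteration
}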
Can you show a bit of your code?

C# BulkCopy, DBF errors (Timeout, & "Provider could not determine...")

I've written a small console app that I point at a folder containing DBF/FoxPro files.
It then creates a table in SQL Server based on each DBF table, then does a bulk copy to insert the data into SQL Server. It works quite well for the most part, except for a few snags:
1) Some of the FoxPro tables contain 5,000,000+ records, and the connection expires before the insert completes.
Here is my connection string:
<add name="SQL" connectionString="data source=source_source;persist security info=True;user id=DBFToSQL;password=DBFToSQL;Connection Timeout=20000;Max Pool Size=200" providerName="System.Data.SqlClient" />
Error message:
"Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
CODE:
using (SqlConnection SQLConn = new SqlConnection(SQLString))
using (OleDbConnection FPConn = new OleDbConnection(FoxString))
{
    ServerConnection srvConn = new Microsoft.SqlServer.Management.Common.ServerConnection(SQLConn);
    try
    {
        FPConn.Open();
        string dataString = String.Format("Select * from {0}", tableName);
        using (OleDbCommand Command = new OleDbCommand(dataString, FPConn))
        using (OleDbDataReader Reader = Command.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            tbl = new Table(database, tableName, "schema");
            for (int i = 0; i < Reader.FieldCount; i++)
            {
                col = new Column(tbl, Reader.GetName(i), ConvertTypeToDataType(Reader.GetFieldType(i)));
                col.Nullable = true;
                tbl.Columns.Add(col);
            }
            tbl.Create();
            BulkCopy(Reader, tableName);
        }
    }
    catch (Exception ex)
    {
        // LogText(ex, @"C:\LoadTable_Errors.txt", tableName);
        throw; // rethrow without resetting the stack trace
    }
    finally
    {
        SQLConn.Close();
        srvConn.Disconnect();
    }
}
private DataType ConvertTypeToDataType(Type type)
{
    switch (type.ToString())
    {
        case "System.Decimal":
            return DataType.Decimal(18, 38);
        case "System.String":
            return DataType.NVarCharMax;
        case "System.Int32":
            return DataType.Int;
        case "System.DateTime":
            return DataType.DateTime;
        case "System.Boolean":
            return DataType.Bit;
        default:
            throw new NotImplementedException("ConvertTypeToDataType Not implemented for type : " + type.ToString());
    }
}
private void BulkCopy(OleDbDataReader reader, string tableName)
{
    using (SqlConnection SQLConn = new SqlConnection(SQLString))
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(SQLConn)) // dispose releases bulk-copy resources
    {
        SQLConn.Open();
        bulkCopy.DestinationTableName = "schema." + tableName;
        try
        {
            bulkCopy.WriteToServer(reader);
        }
        catch (Exception ex)
        {
            //LogText(ex, @"C:\BulkCopy_Errors.txt", tableName);
        }
        finally
        {
            SQLConn.Close();
            reader.Close();
        }
    }
}
My 2nd & 3rd errors are the following. I understand what the issues are, but I'm not sure how to rectify them:
2) "The provider could not determine the Decimal value. For example, the row was just created, the default for the Decimal column was not available, and the consumer had not yet set a new Decimal value."
3) SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM.
I found a result on Google that indicates what the issue is [A] and a possible workaround [B], but I'd like to keep my decimal values as decimal and my dates as dates, as I'll be doing further calculations against the data.
What I want to do as a solution:
1) Either increase the connection timeout (though I don't think I can increase it any more than I have), or alternatively, is it possible to split the OleDbDataReader's results and do an incremental bulk insert?
2) Is it possible to have bulk copy ignore rows with errors, or log the records that do error out to a CSV file or something to that extent?
So where you do the "for" statement, I would probably break it up to take so many at a time:
int i = 0;
int MaxCount = 1000;
while (i < Reader.FieldCount)
{
    var tbl = new Table(database, tableName, "schema");
    int end = Math.Min(i + MaxCount, Reader.FieldCount); // upper bound for this batch
    for (int j = i; j < end; j++)
    {
        col = new Column(tbl, Reader.GetName(j), ConvertTypeToDataType(Reader.GetFieldType(j)));
        col.Nullable = true;
        tbl.Columns.Add(col);
        i++;
    }
    tbl.Create();
    BulkCopy(Reader, tableName);
}
So, "i" keeps track of the overall count, "j" keeps track of the incremental count (ie your max at one time count) and when you have created your 'batch', you create the table and Bulk Copy it.
Does that look like what you would expect?
Cheers,
Chris.
This is my current attempt at the bulk copy method. It works for about 90% of the tables, but I get an OutOfMemory exception with the bigger tables... I'd like to split the reader's data into smaller sections, without having to pass it into a DataTable and store it in memory first (which is the cause of the OutOfMemory exception on the bigger result sets).
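One way to get the incremental behavior without the DataTable buffer, sketched with assumed batch and timeout values: SqlBulkCopy can consume the OleDbDataReader directly, and BatchSize makes it send rows in chunks while memory use stays flat; BulkCopyTimeout = 0 also removes the timeout from the first error.

// Sketch: stream the reader straight into SqlBulkCopy; no DataTable in memory.
private void BulkCopyStreaming(OleDbDataReader reader, string tableName)
{
    using (SqlConnection sqlConn = new SqlConnection(SQLString))
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConn))
    {
        sqlConn.Open();
        bulkCopy.DestinationTableName = "schema." + tableName;
        bulkCopy.BulkCopyTimeout = 0;   // no timeout; assumes long loads are acceptable
        bulkCopy.BatchSize = 5000;      // rows per round trip (assumed value, tune as needed)
        bulkCopy.NotifyAfter = 50000;   // progress report interval (assumed)
        bulkCopy.SqlRowsCopied += (sender, e) =>
            Console.WriteLine(tableName + " :: " + e.RowsCopied + " rows copied");
        bulkCopy.WriteToServer(reader); // streams rows as it reads them
    }
}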
UPDATE
I modified the code below to match how it looks in my solution. It ain't pretty, but it works. I'll definitely do some refactoring, and update my answer again.
private void BulkCopy(OleDbDataReader reader, string tableName, Table table)
{
    Console.WriteLine(tableName + " BulkCopy Started.");
    try
    {
        DataTable tbl = new DataTable();
        List<Type> typeList = new List<Type>();
        foreach (Column col in table.Columns)
        {
            tbl.Columns.Add(col.Name, ConvertDataTypeToType(col.DataType));
            typeList.Add(ConvertDataTypeToType(col.DataType));
        }
        int batch = 1;
        int counter = 0;
        DataRow tblRow = tbl.NewRow();
        while (reader.Read())
        {
            counter++;
            int colcounter = 0;
            foreach (Column col in table.Columns)
            {
                try
                {
                    tblRow[colcounter] = reader[colcounter];
                }
                catch (Exception)
                {
                    // fall back to the default for this column's type
                    tblRow[colcounter] = GetDefault(typeList[colcounter]);
                }
                colcounter++;
            }
            tbl.LoadDataRow(tblRow.ItemArray, true);
            if (counter == BulkInsertIncrement)
            {
                Console.WriteLine(tableName + " :: Batch >> " + batch);
                counter = PerformInsert(tableName, tbl, batch);
                batch++;
            }
        }
        if (counter > 0)
        {
            Console.WriteLine(tableName + " :: Batch >> " + batch);
            PerformInsert(tableName, tbl, counter);
        }
        tbl = null;
        Console.WriteLine("BulkCopy Success!");
    }
    catch (Exception ex)
    {
        Console.WriteLine("BulkCopy Fail!");
        SharedLogger.Write(ex, @"C:\BulkCopy_Errors.txt", tableName);
        Console.WriteLine(ex.Message);
    }
    finally
    {
        reader.Close();
        reader.Dispose();
    }
    Console.WriteLine(tableName + " BulkCopy Ended.");
    Console.WriteLine("*****");
    Console.WriteLine("");
}
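The update calls a GetDefault helper that isn't shown in the post; a plausible implementation, offered purely as an assumption, would hand back a safe placeholder value for the column:

// Hypothetical helper (not part of the original post): a fallback value
// for a column whose source value could not be copied.
private object GetDefault(Type type)
{
    // DBNull works for nullable DataTable columns; otherwise use the CLR default.
    return type.IsValueType ? Activator.CreateInstance(type) : (object)DBNull.Value;
}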

How to move to the next row in a DataTable if one row catches an error? C# 2.0

I am getting data from a MySQL database and inserting it into another system. If a column has data in an incorrect format, I log this and go to the next row in the DataTable. That works as expected. But now I have a search function in my method that fetches some additional data, and if this function fails, I want to immediately log it and go to the next row. As it is now, I just log it, but the row still gets inserted (without the value that didn't meet the search criteria).
My code:
private void updateCustomer()
{
    MySqlConnection connection = new MySqlConnection("server=myServer;database=myDatabase;uid=myID;password=myPass");
    MySqlCommand command = new MySqlCommand(@"mySelectCommand", connection);
    DataTable customerTbl = new DataTable();
    MySqlDataReader reader;
    try
    {
        connection.Open();
        reader = command.ExecuteReader();
        if (reader.HasRows)
        {
            customerTbl.Load(reader);
        }
        reader.Close();
    }
    catch (Exception ex)
    {
        _out.error("Could not connect to mySql database");
    }
    finally
    {
        connection.Close();
    }
    foreach (DataRow row in customerTbl.Rows)
    {
        // Declare the customer variables
        string customerID = Encoding.ASCII.GetString((byte[])row["Customer ID"]);
        string ChildOf = row["Child of"].ToString();
        // Create the customer object
        Customer customer = new Customer();
        customer.entityId = customerID;
        if (ChildOf != "")
        {
            RecordRef parentRef = new RecordRef();
            try
            {
                parentRef.internalId = searchCustomer(ChildOf);
            }
            catch
            {
                // If it fails here I want to log the customerID and then go to the next row
                // in the DataTable (could not find the internalId for ChildOf).
                _out.error(customerID + " was not updated. Error: invalid format Parent string");
            }
            finally
            {
                parentRef.typeSpecified = false;
                customer.parent = parentRef;
            }
        }
        // Invoke update() operation
        WriteResponse response = _service.update(customer);
        // Process the response
        if (response.status.isSuccess)
        {
        }
        else
        {
            _out.error(customerID + " was not updated. Error: " + getStatusDetails(response.status));
        }
    }
}
You need to remove the row in the catch block, and change the foreach loop to a backwards for loop to handle the removals.
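If removal isn't required, a simpler sketch that matches the stated goal ("log and go to the next row"): continue out of the catch block, since the try/catch sits directly inside the foreach.

// Sketch: skip the current DataRow on failure instead of falling through.
try
{
    parentRef.internalId = searchCustomer(ChildOf);
}
catch
{
    _out.error(customerID + " was not updated. Error: invalid format Parent string");
    continue; // jumps to the next DataRow; the update below never runs
}
parentRef.typeSpecified = false;
customer.parent = parentRef;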
I realized that I want to log the other failed fields as well. Maybe it's an inefficient way, but I did something like this:
bool findParent = true;
if (ChildOf != "")
{
    try
    {
        RecordRef parentRef = new RecordRef();
        parentRef.internalId = searchCustomer(ChildOf);
        parentRef.typeSpecified = false;
        customer.parent = parentRef;
    }
    catch
    {
        findParent = false;
        _out.error(customerID + " was not inserted. Error: invalid format Parent string");
    }
}
And then an if statement before trying to insert:
// findPartner is set by a similar check for the other field (not shown)
if (findPartner == true && findParent == true)
{
    response = _service.add(customer);
    // Process the response
    if (response.status.isSuccess)
    {
    }
    else
    {
        _out.error(customerID + " was not inserted. Error: " + getStatusDetails(response.status));
    }
}
else
{
    //_out.error(customerID + " was not updated. Error: " + getStatusDetails(response.status));
}
Use the DataRow.HasErrors property.
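For example, rows can be flagged with RowError as they fail and filtered out before the insert; a brief sketch:

// Sketch: mark failing rows, then skip them later via DataRow.HasErrors.
foreach (DataRow row in customerTbl.Rows)
{
    if (row.HasErrors)
        continue; // skip rows previously flagged with row.RowError = "..."
    // ... build and insert the customer as before
}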
