I am creating a mechanism to bulk-insert (import) a large number of new records into an Oracle database. I am using multiple threads and dependent transactions:
Creation of threads:
const int ThreadCount = 4;

using (TransactionScope transaction = new TransactionScope())
{
    List<Thread> threads = new List<Thread>(ThreadCount);
    for (int i = 0; i < ThreadCount; i++)
    {
        Thread thread = new Thread(WorkerThread);
        // Each worker gets a dependent clone that blocks the root
        // transaction's commit until the clone completes.
        thread.Start(Transaction.Current.DependentClone(DependentCloneOption.BlockCommitUntilComplete));
        threads.Add(thread);
    }
    threads.ForEach(thread => thread.Join());
    transaction.Complete();
}
The method that does the actual work:
private void WorkerThread(object transaction)
{
    using (DependentTransaction dTx = (DependentTransaction)transaction)
    using (TransactionScope ts = new TransactionScope(dTx))
    {
        // The actual work, the inserts on the database, are executed here.
    }
}
During this operation, I get an exception of type System.Data.OracleClient.OracleException with the message ORA-24757: duplicate transaction identifier.
What am I doing wrong? Am I implementing the dependent transaction incorrectly? Is it incompatible with Oracle? If so, is there a workaround?
I don't know why you get this exception, but a workaround might be:
const int ThreadCount = 4;

using (var connection = new OracleConnection(MyConnectionstring))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())
    {
        List<Thread> threads = new List<Thread>(ThreadCount);
        for (int i = 0; i < ThreadCount; i++)
        {
            Thread thread = new Thread(WorkerThread);
            thread.Start(transaction);
            threads.Add(thread);
        }
        threads.ForEach(thread => thread.Join());
        transaction.Commit();
    }
}
and the working class would look something like:
private void WorkerThread(object transaction)
{
    OracleTransaction trans = (OracleTransaction)transaction;
    using (OracleCommand command = trans.Connection.CreateCommand())
    {
        // The command must be associated with the pending local transaction,
        // otherwise ExecuteNonQuery throws.
        command.Transaction = trans;
        command.CommandText = "INSERT INTO mytable (x) values (1) ";
        command.ExecuteNonQuery();
    }
}
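One caveat with this workaround: a single OracleConnection and its commands are not thread-safe, so if all the workers share one connection, command execution has to be serialized. A minimal sketch of that, assuming the same types as above (the `_syncRoot` field is an illustrative addition, not part of the original code):

```csharp
// Sketch only: serializes access to the shared connection/transaction.
private static readonly object _syncRoot = new object();

private void WorkerThread(object transaction)
{
    OracleTransaction trans = (OracleTransaction)transaction;
    // A single OracleConnection is not safe for concurrent use, so only one
    // worker may execute a command on it at a time.
    lock (_syncRoot)
    {
        using (OracleCommand command = trans.Connection.CreateCommand())
        {
            command.Transaction = trans; // enlist in the shared transaction
            command.CommandText = "INSERT INTO mytable (x) VALUES (1)";
            command.ExecuteNonQuery();
        }
    }
}
```

Of course, serializing every insert removes most of the benefit of running multiple threads; the alternative is one connection (and one independent transaction) per thread.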
I am trying to capture, via a SQL Server trace created from C#, the Entity Framework queries that are executed during my automation test run. When I run the code below, I get an exception.
Issue
The queries that are executed in the background are not being captured. Is there anything else I can try in order to capture the SELECT/INSERT/UPDATE queries that run in the background?
Code
class Program
{
    static void Main(string[] args)
    {
        using (SqlConnection conn = new SqlConnection(@"Data Source = XXXXXX\XXXXX; Initial Catalog = Test; Integrated Security = SSPI"))
        {
            XEStore store = new XEStore(new SqlStoreConnection(conn));
            string sessionName = "abc";
            if (store.Sessions[sessionName] != null)
            {
                Console.WriteLine("dropping existing session");
                store.Sessions[sessionName].Drop();
            }
            Session s = store.CreateSession(sessionName);
            s.MaxMemory = 4096;
            s.MaxDispatchLatency = 30;
            s.EventRetentionMode = Session.EventRetentionModeEnum.AllowMultipleEventLoss;
            Event rpc = s.AddEvent("rpc_completed");
            rpc.AddAction("username");
            rpc.AddAction("database_name");
            rpc.AddAction("sql_text");
            rpc.PredicateExpression = @"sqlserver.username NOT LIKE '%testuser'";
            s.Create();
            s.Start();
            int i = 0;
            while (i < 15000)
            {
                s.Refresh();
                foreach (var prop in rpc.Actions)
                {
                    Console.WriteLine(prop.Description);
                    Console.WriteLine(prop.KeyChain);
                    Console.WriteLine(prop.IdentityKey);
                    Console.WriteLine(prop.Metadata);
                    Console.WriteLine(prop.ModuleID);
                    Console.WriteLine(prop.Name);
                    Console.WriteLine(prop.PackageName);
                    Console.WriteLine(prop.Parent);
                    Console.WriteLine(prop.Properties);
                    Console.WriteLine(prop.State);
                    Console.WriteLine(prop.Urn);
                }
                Thread.Sleep(1000);
            }
        }
    }
}
I'm doing a bulk insert of a million records through a temporary table, to compare the performance of two approaches: with a transaction and without. Before each test I drop the tables if they exist and create them anew.
//Filling up the data to insert; called from the constructor
private void FillData()
{
    _insertData = new List<TransactionDto>();
    for (var i = 1; i <= 1000000; i++)
    {
        _insertData.Add(new TransactionDto(i, $"Insert{i}"));
    }
}

private void PrepareDbTables(NpgsqlConnection connection)
{
    var query = @"DROP TABLE IF EXISTS TransactionTest;
                  CREATE TABLE TransactionTest(id integer, text varchar(24))";
    connection.Query(query);
}

private void DropAndCreateTempTable(NpgsqlConnection connection)
{
    var query = @"DROP TABLE IF EXISTS TmpTransactions";
    connection.Execute(query);
    query = @"CREATE TEMP TABLE TmpTransactions (id integer, text varchar(24));";
    connection.Execute(query);
}
Here are 2 tests:
[Fact]
public void CheckInsertBulkSpeedWithOutTransaction()
{
    var sw = new Stopwatch();
    using (var con = new NpgsqlConnection(ConnectionString))
    {
        con.Open();
        //Delete and create new table TransactionTest
        PrepareDbTables(con);
        DropAndCreateTempTable(con);
        sw.Start();
        InsertBulkWithTempTable(null, _insertData, con);
        sw.Stop();
    }
    _output.WriteLine(
        $"Test completed. InsertBulk without transaction {_insertData.Count} elements for: {sw.ElapsedMilliseconds} ms");
}
The same test with transaction:
[Fact]
public void CheckInsertBulkSpeedWithTransaction()
{
    var sw = new Stopwatch();
    using (var con = new NpgsqlConnection(ConnectionString))
    {
        con.Open();
        //Delete and create new table TransactionTest
        PrepareDbTables(con);
        DropAndCreateTempTable(con);
        sw.Start();
        using (var transaction = con.BeginTransaction(IsolationLevel.ReadUncommitted))
        {
            InsertBulkWithTempTable(transaction, _insertData, con);
            transaction.Commit();
            // No explicit Dispose needed; the using block disposes the transaction.
        }
        sw.Stop();
    }
    _output.WriteLine(
        $"Test completed. InsertBulk with transaction {_insertData.Count} elements for: {sw.ElapsedMilliseconds} ms");
}
The main method that inserts the records:
private void InsertBulkWithTempTable(NpgsqlTransaction transaction, List<TransactionDto> data, NpgsqlConnection connection)
{
    using (var writer =
        connection.BeginBinaryImport(
            "COPY TmpTransactions(id,text) FROM STDIN(Format BINARY)"))
    {
        foreach (var dto in data)
        {
            writer.WriteRow(dto.Id, dto.Text);
        }
        writer.Complete();
    }
    var query =
        "INSERT INTO TransactionTest select * from TmpTransactions";
    //connection.Query(query, transaction);
    connection.Execute(query);
}
The results of these tests differ every time I run them, and it doesn't matter whether I use Execute() or Query().
Test completed. InsertBulk without transaction 1000000 elements for: 7451 ms
Test completed. InsertBulk with transaction 1000000 elements for: 4676 ms
Test completed. InsertBulk without transaction 1000000 elements for: 6336 ms
Test completed. InsertBulk with transaction 1000000 elements for: 8776 ms
I'm trying to figure out what this depends on. Any ideas? Any help is appreciated. Thanks.
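One detail worth double-checking in the code above: Dapper's `Execute` signature is `Execute(sql, param, transaction)`, so the commented-out call `connection.Query(query, transaction)` would have bound the transaction object to the `param` argument instead of enlisting the command in it. A hedged sketch of the final insert step with the transaction passed explicitly (the method name is illustrative, not from the question):

```csharp
using Dapper;
using Npgsql;

public static class InsertStep
{
    public static void InsertFromTempTable(NpgsqlTransaction transaction, NpgsqlConnection connection)
    {
        var query = "INSERT INTO TransactionTest SELECT * FROM TmpTransactions";
        // Pass the transaction by name so Dapper enlists the command in it.
        // Passing it positionally as the second argument would make Dapper
        // treat it as query parameters and run the insert outside the
        // transaction, which could skew a with/without-transaction comparison.
        connection.Execute(query, transaction: transaction);
    }
}
```

This does not by itself explain the run-to-run variance, but it does mean the two tests may not have been comparing what they intended to compare.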
The data is inserted using LINQ to SQL; the id is generated, but the database table is empty.
Using a stored procedure there is no problem, but when inserting with LINQ the id is generated every time while the table stays empty.
The code is below:
Int32 t = 2;
using (EduDataClassesDataContext db = new EduDataClassesDataContext())
{
    using (var scope = new TransactionScope())
    {
        db.Connection.ConnectionString = Common.EdukatingConnectionString;

        UserLogin userlog = new UserLogin();
        userlog.Username = userinfo.Username;
        userlog.Password = userinfo.Password;
        userlog.UserTypeId = t;
        userlog.FullName = userinfo.FullName;
        db.UserLogins.InsertOnSubmit(userlog);
        db.SubmitChanges();

        Int64 n = userlog.Id;

        UserInformation userinfor = new UserInformation();
        userinfor.FirstName = userinfo.FirstName;
        userinfor.LastName = userinfo.LastName;
        userinfor.MobileNum = userinfo.MobileNum;
        userinfor.Email = userinfo.Email;
        userinfor.Gender = userinfo.Gender;
        userinfor.Address = userinfo.Address;
        userinfor.UserLoginId = n;
        userinfor.CreatedBy = n;
        userinfor.OrganizationName = userinfo.OrganizationName;
        userinfor.DateOfBirth = userinfo.DateOfBirth;
        userinfor.CreatedDate = DateTime.Now;
        db.UserInformations.InsertOnSubmit(userinfor);
        db.SubmitChanges();
    }
}
When you are using a TransactionScope, you need to call the Complete method in order to commit the transaction in the database.
using (var db = new EduDataClassesDataContext())
using (var scope = new TransactionScope())
{
    ...
    db.UserInformations.InsertOnSubmit(userinfor);
    db.SubmitChanges();

    // The Complete method commits the transaction. If an exception has been thrown,
    // Complete is not called and the transaction is rolled back.
    scope.Complete();
}
Failing to call this method aborts the transaction, because the
transaction manager interprets this as a system failure, or exceptions
thrown within the scope of transaction.
I have two parts to my application which do a massive amount of inserts and updates respectively, and because of poor management there are deadlocks.
I am using Entity Framework to do my inserts and updates.
The following is the code for my TestSpool program. Its purpose is to insert a given number of records at a fixed interval.
using System;
using System.Linq;
using System.Threading;
using System.Transactions;

namespace TestSpool
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var db = new TestEntities())
            {
                decimal start = 700001;
                while (true)
                {
                    using (TransactionScope scope = new TransactionScope())
                    {
                        //Random ir = new Random();
                        //int i = ir.Next(1, 50);
                        var objs = db.BidItems.Where(m => m.BidItem_Close == false);
                        foreach (BidItem bi in objs)
                        {
                            for (int j = 0; j <= 10; j++)
                            {
                                Transaction t = new Transaction();
                                t.Item_Id = bi.BidItemId;
                                t.User_Id = "Ghost";
                                t.Trans_Price = start;
                                t.Trans_TimeStamp = DateTime.Now;
                                start += 10;
                                db.Transactions.AddObject(t);
                            }
                            Console.WriteLine("Test Spooled for item " + bi.BidItemId.ToString() + " of " + 11 + " bids");
                            db.SaveChanges();
                        }
                        scope.Complete();
                        Thread.Sleep(5000);
                    }
                }
            }
        }
    }
}
The second part of the program is the TestServerClass. The server class is supposed to process a huge number of transactions from TestSpool, determine the highest transaction amount, and write the update to another table.
using System;
using System.Linq;
using System.Transactions;

public class TestServerClass
{
    public void Start()
    {
        try
        {
            using (var db = new TestServer.TestEntities())
            {
                while (true)
                {
                    using (TransactionScope scope = new TransactionScope())
                    {
                        var objsItems = db.BidItems.Where(m => m.BidItem_Close == false);
                        foreach (TestServer.BidItem bi in objsItems)
                        {
                            var trans = db.Transactions.Where(m => m.Trans_Proceesed == null && m.Item_Id == bi.BidItemId).OrderBy(m => m.Trans_TimeStamp).Take(100);
                            if (trans.Count() > 0)
                            {
                                var tran = trans.OrderByDescending(m => m.Trans_Price).FirstOrDefault();
                                // TestServer.BidItem bi = db.BidItems.FirstOrDefault(m => m.BidItemId == itemid);
                                if (bi != null)
                                {
                                    bi.BidMinBid_LastBid_TimeStamp = tran.Trans_TimeStamp;
                                    bi.BidMinBid_LastBidAmount = tran.Trans_Price;
                                    bi.BidMinBid_LastBidBy = tran.User_Id;
                                }
                                foreach (var t in trans)
                                {
                                    t.Trans_Proceesed = "1";
                                    db.Transactions.ApplyCurrentValues(t);
                                }
                                db.BidItems.ApplyCurrentValues(bi);
                                Console.WriteLine("Processed " + trans.Count() + " bids for Item " + bi.BidItemId);
                                db.SaveChanges();
                            }
                        }
                        scope.Complete();
                    }
                }
            }
        }
        catch (Exception e)
        {
            Start();
        }
    }
}
However, as both applications run concurrently, they deadlock fairly quickly, at random, in either the test or the server application. How do I optimise my code on both sides to prevent deadlocks? I am expecting a huge number of inserts from the TestSpool application.
Since they work on the same data and get in each other's way, I believe the cleanest solution is to avoid executing the two at the same time.
Define a global static variable, a mutex, or a flag of some kind, perhaps in the database. Whoever starts executing raises the flag; the other one waits for the flag to come down. When the flag comes down, the other one raises it and starts executing.
To avoid long wait times in each class, you can alter both classes to process only a limited number of records per turn. You should also introduce a maximum wait time for the flag. The record limit should be chosen carefully so that each class finishes its batch in less than the maximum wait time.
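The flag idea above can be sketched with a named Mutex, which works across the two processes. This is a minimal, self-contained illustration only; the mutex name, batch count, and timeout are assumptions, and the Console line stands in for the real per-batch work:

```csharp
using System;
using System.Threading;

// Sketch: TestSpool and TestServerClass would each run this pattern, sharing
// the same mutex name, so only one of them processes a batch at a time.
public class BatchCoordinator
{
    public static void Main()
    {
        using (var mutex = new Mutex(false, "BidProcessingFlag"))
        {
            for (int batch = 0; batch < 3; batch++)
            {
                // Bounded wait: give up after 30 seconds instead of blocking
                // forever if the other process holds the flag too long.
                if (!mutex.WaitOne(TimeSpan.FromSeconds(30)))
                {
                    Console.WriteLine("Timed out waiting for the flag; retrying later.");
                    continue;
                }
                try
                {
                    // Process a limited number of records here, then release
                    // the flag so the other application gets its turn.
                    Console.WriteLine($"Processing batch {batch}");
                }
                finally
                {
                    mutex.ReleaseMutex();
                }
            }
        }
    }
}
```

Keeping each batch small relative to the timeout is what prevents the waiting side from starving.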
I have an Access DB, and one of my functions (C#/.NET) needs to execute a SQL statement more than 4,000 times within a transaction.
It seems that after execution the DB file stays open exclusively: there is a *.ldb file, and it remains there for a long time.
Is that caused by disposing resources incorrectly?
private int AmendUniqueData(Trans trn)
{
    int reslt = 0;
    foreach (DataRow dr in _dt.Rows)
    {
        OleDbParameter[] _params = {
            new OleDbParameter("@templateId", dr["Id"].ToString()),
            new OleDbParameter("@templateNumber", dr["templateNumber"].ToString())
        };
        string sqlUpdateUnique = "UPDATE " + dr["proformaNo"].ToString().Substring(0, 2) + "_unique SET templateId = @templateId WHERE templateNumber = @templateNumber";
        reslt = OleDBHelper.ExecSqlWithTran(sqlUpdateUnique, trn, _params);
        if (reslt < 0)
        {
            throw new Exception(dr["id"].ToString());
        }
    }
    return reslt;
}
the transaction:
using (Trans trn = new Trans())
{
    try
    {
        int reslt = AmendUniqueData(trn);
        trn.Commit();
        return reslt;
    }
    catch
    {
        trn.RollBack();
        throw;
    }
    finally
    {
        trn.Close();
    }
}
You are forgetting to close the database connection.
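In practice that means wrapping the connection itself in a using block so it is closed and disposed even on failure; when the connection closes, Access releases the *.ldb lock file. A minimal sketch with plain OleDb types (the connection string, table, and values are placeholders, since the question's OleDBHelper and Trans classes are not shown):

```csharp
using System.Data.OleDb;

// Sketch: connection, transaction, and command all disposed deterministically.
using (var connection = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=mydb.mdb"))
{
    connection.Open();
    using (OleDbTransaction transaction = connection.BeginTransaction())
    {
        try
        {
            using (OleDbCommand command = connection.CreateCommand())
            {
                command.Transaction = transaction;
                // OleDb uses positional '?' markers; parameter names are only
                // labels, and the order of Parameters must match the markers.
                command.CommandText =
                    "UPDATE AB_unique SET templateId = ? WHERE templateNumber = ?";
                command.Parameters.AddWithValue("@templateId", "42");
                command.Parameters.AddWithValue("@templateNumber", "1001");
                command.ExecuteNonQuery();
            }
            transaction.Commit();
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
    }
} // connection closed here; the .ldb file should disappear shortly after
```

If the helper class keeps the connection in a field and never closes it, the lock file will linger no matter how the transaction itself is handled.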