I have a process that periodically retrieves records from a database and runs 3 operations on each. For each record, the 3 operations must either all succeed or all fail. If one of the operations fails, I want the work already done for the previous records to stay committed, so that the next time the process runs it picks up at the record for which one of the 3 operations failed.
I thought of wrapping the 3 operations in a transaction per record and looping over the records, but I want to be sure that using a database transaction this way is efficient. The following is what I have in mind. Is it correct?
public async Task OrderCollectionProcessorWorker()
{
using (var context = new DbContext())
{
try
{
IList<Order> ordersToCollect =
await context.Orders.Where(
x => x.OrderStatusId == OrderStatusCodes.DeliveredId)
.ToListAsync(_cancellationTokenSource.Token);
await ProcessCollectionsAsync(context, ordersToCollect);
}
catch (Exception ex)
{
Log.Error("Exception in OrderCollectionProcessorWorker", ex);
}
}
}
/// <summary>
/// For each order to collect, perform 3 operations
/// </summary>
/// <param name="context">db context</param>
/// <param name="ordersToCollect">List of Orders for collection</param>
private async Task ProcessCollectionsAsync(DbContext context, IList<Order> ordersToCollect)
{
if (ordersToCollect.Count == 0) return;
Log.Debug($"ProcessCollections: processing {ordersToCollect.Count} orders");
foreach (var order in ordersToCollect)
{
// group the 3 operations in one transaction for each order
// so that if one operation fails, the operations performed on the previous orders
// are committed
using (var transaction = context.Database.BeginTransaction())
{
try
{
// *************************
// run the 3 operations here
// operations consist of updating the order itself, and other database updates
Operation1(order);
Operation2(order);
Operation3(order);
// *************************
await context.SaveChangesAsync();
transaction.Commit();
}
catch (Exception ex)
{
transaction?.Rollback();
Log.Error("General exception when executing ProcessCollectionsAsync on Order " + order.Id, ex);
throw new Exception("ProcessCollections failed on Order " + order.Id, ex);
}
}
}
}
It seems like a correct way of doing it, apart perhaps from the fact that in the catch block you should rethrow the exception or do something else to stop progressing through the loop (if I understood your requirements correctly). It is not even necessary to use
var transaction = context.Database.BeginTransaction()
because
await context.SaveChangesAsync();
creates its own transaction. Every change you make is stored in the context, and when you call SaveChanges a transaction is created and all the changes are written as one batch. If anything fails, all of those changes are rolled back. Another call to SaveChanges will create another transaction for the new changes. Please bear in mind, however, that if a transaction fails you should no longer use the same context but create a new one. To summarize, I would write your method as follows:
private async Task ProcessCollectionsAsync(DbContext context, IList<Order> ordersToCollect)
{
if (ordersToCollect.Count == 0) return;
Log.Debug($"ProcessCollections: processing {ordersToCollect.Count} orders");
foreach (var order in ordersToCollect)
{
// group the 3 operations in one transaction for each order
// so that if one operation fails, the operations performed on the previous orders
// are committed
try
{
// *************************
// run the 3 operations here
// operations consist of updating the order itself, and other database updates
Operation1(order);
Operation2(order);
Operation3(order);
// *************************
await context.SaveChangesAsync();
}
catch (Exception ex)
{
Log.Error("General exception when executing ProcessCollectionsAsync on Order " + order.Id, ex);
throw;
}
}
}
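On the point about not reusing a context after a failure: a minimal sketch of how that could look, assuming you are free to pass order IDs instead of tracked entities and that the 3 operations can work against the order loaded by the per-order context (adapt the construction to however your DbContext is normally created):

private async Task ProcessCollectionsAsync(IList<int> orderIdsToCollect)
{
    foreach (var orderId in orderIdsToCollect)
    {
        // A fresh context per order: work committed for earlier orders stays committed,
        // and a context that has seen a failure is disposed instead of being reused.
        using (var context = new DbContext())
        {
            try
            {
                var order = await context.Orders.FirstAsync(x => x.Id == orderId);
                Operation1(order);
                Operation2(order);
                Operation3(order);
                await context.SaveChangesAsync(); // one implicit transaction per order
            }
            catch (Exception ex)
            {
                Log.Error("ProcessCollections failed on Order " + orderId, ex);
                throw; // stop here; the next run starts again at this order
            }
        }
    }
}

Whether the extra context per order is worth it depends on how often a failure is expected; the version above is only meant to illustrate the "create a new context after a failure" point.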
Related
I am using the transaction scope from System.Transactions.
I have this method with two insertions into the database. The first one, the Localization, is inserted but then rolled back because the second insertion fails.
The error is not with the data I send; the data is good. When I remove the transaction scope, it works.
I get this error:
System.InvalidOperationException: A root ambient transaction was completed before the nested transaction. The nested transactions should be completed first.
It also enters the second catch and disposes the scope. What could be the problem?
This is my code:
public async Task InsertCategory(InsertCategoryRequest request)
{
using var scope = new TransactionScope();
int localizationId;
try
{
localizationId = await _localizationRepository.InsertLocalization(new Localization
{
English = request.NameEN,
Albanian = request.NameAL,
Macedonian = request.NameMK
});
}
catch (Exception e)
{
scope.Dispose();
Log.Error("Unable to insert localization {#Exception}", e);
throw ExceptionHandler.ThrowException(ErrorCode.Localization_UnableToInsert);
}
try
{
await _categoryRepository.InsertCategory(new Category
{
Name = request.NameEN,
LocalizationId = localizationId
});
}
catch (Exception e)
{
scope.Dispose();
Log.Error("Unable to insert category {#Exception}", e);
throw ExceptionHandler.ThrowException(ErrorCode.Category_UnableToInsert);
}
scope.Complete();
scope.Dispose();
}
I found the answer. I looked for such a long time, but soon after posting I found it.
I just added TransactionScopeAsyncFlowOption.Enabled when constructing the TransactionScope.
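For reference, the only change is the constructor call; a minimal sketch (the rest of the method stays exactly as posted):

// TransactionScopeAsyncFlowOption.Enabled lets the ambient transaction flow across await points,
// so the awaited repository calls enlist in this scope instead of completing a nested transaction
// after the root scope has already finished.
using var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled);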
I have an issue: I want to continue inserting data after SQL Server raises an exception.
I have a unique index on 3 different columns in the table to detect duplicates.
For example, I am trying to insert 2 rows; the first one is a duplicate, the second one is not.
When the duplicate is detected it goes into the catch block and I do nothing, but when it comes to the second row, which is not a duplicate, an exception is raised again for the previous row.
This is my code:
public async Task<IEnumerable<Result>> Handle(NewResultCommandDTO requests, CancellationToken cancellationToken)
{
var results = new List<Result>();
...
for (var j = 0; j < resultDetails.Count(); j++)
{
var rd = resultDetails.ElementAt(j);
var newResult1 = new Result
{
AthleteFEIID = rd.AthleteFEIID,
CompetitionCode = competition.CompetitionCode,
HorseId = horse.Id,
};
results.Add(newResult1);
try
{
await _resultsService.AddResultAsync(newResult1);
await _resultsService.CompleteAsync();
}
catch (Exception ex)
{
var x = ex;
}
}
}
public async Task AddResultAsync(Result result)
{
Context.Results.AddAsync(result);
}
public async Task CompleteAsync()
{
await Context.SaveChangesAsync().ConfigureAwait(false);
}
Thank you for your help!
await _resultsService.CompleteAsync(); is the statement that throws the SQL exception.
The await _resultsService.AddResultAsync(newResult1); statement has already added the entity to the db context. Even if the next statement throws and execution goes to the catch block, the duplicate entity is still tracked by the context. So when you add the next entity to the context and try to save, the exception is thrown again because of the previous duplicate entity, which was never removed from the context.
One solution is to remove the duplicate entity from the context in the catch block:
try
{
await _resultsService.AddResultAsync(newResult1);
await _resultsService.CompleteAsync();
}
catch (Exception ex) {
var x = ex;
_resultsService.RemoveResult(newResult1);
}
public void RemoveResult(Result result)
{
Context.Results.Remove(result);
}
Another solution is to check whether the row already exists in the table before adding it. For that you will have to write a get method that queries by the unique indexed columns.
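A minimal sketch of such an exists check, assuming the unique index covers AthleteFEIID, CompetitionCode and HorseId (adjust to the columns your index actually uses; AnyAsync is EF's async LINQ extension):

public async Task<bool> ResultExistsAsync(Result result)
{
    // Query by the unique-index columns; true means a matching row is already in the table.
    return await Context.Results.AnyAsync(r =>
        r.AthleteFEIID == result.AthleteFEIID &&
        r.CompetitionCode == result.CompetitionCode &&
        r.HorseId == result.HorseId);
}

In the loop you would call this before AddResultAsync and simply skip rows that already exist, so nothing duplicated ever reaches the change tracker.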
I want to use System.Transactions to update multiple rows. My database is accessed using Entity Framework.
Below is the code I tried, but it throws an error:
public void Update(List<PortfolioCompanyLinkModel> record)
{
var transaction = _context.Database.BeginTransaction();
try
{
foreach (var item in record)
{
var portfolioCompanyLink = _context.PortfolioCompanyLink.FirstOrDefault(p => p.Id == item.Id);
portfolioCompanyLink.ModifiedBy = _loggedInUser;
portfolioCompanyLink.ModifiedOn = DateTime.UtcNow;
portfolioCompanyLink.URL = item.URL;
_context.SaveChanges();
//_context.PortfolioCompanyLink.Update(portfolioCompanyLink);
}
transaction.Commit();
}
catch(Exception ex)
{
transaction.Rollback();
}
}
Error:
The configured execution strategy 'SqlServerRetryingExecutionStrategy' does not support user initiated transactions. Use the execution strategy returned by 'DbContext.Database.CreateExecutionStrategy()' to execute all the operations in the transaction as a retriable unit.
Can someone help me on how to proceed with this?
Your problem is the SqlServerRetryingExecutionStrategy, as described in the Microsoft documentation:
When not using a retrying execution strategy you can wrap multiple operations in a single transaction. For example, the following code wraps two SaveChanges calls in a single transaction. If any part of either operation fails then none of the changes are applied.
MS docs on resiliency
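The documentation example referred to above is along these lines (a sketch with a hypothetical BloggingContext/Blog entity rather than the verbatim docs sample; it only works when no retrying strategy is configured):

using (var context = new BloggingContext())
using (var transaction = context.Database.BeginTransaction())
{
    try
    {
        // Two separate SaveChanges calls share one explicit transaction.
        context.Blogs.Add(new Blog { Url = "http://example.com/one" });
        context.SaveChanges();

        context.Blogs.Add(new Blog { Url = "http://example.com/two" });
        context.SaveChanges();

        // Either both batches are applied...
        transaction.Commit();
    }
    catch (Exception)
    {
        // ...or neither is: disposing the transaction without Commit rolls everything back.
    }
}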
System.InvalidOperationException: The configured execution strategy 'SqlServerRetryingExecutionStrategy' does not support user initiated transactions. Use the execution strategy returned by 'DbContext.Database.CreateExecutionStrategy()' to execute all the operations in the transaction as a retriable unit.
Solution: Manually Call Execution Strategy
var executionStrategy = _context.Database.CreateExecutionStrategy();
executionStrategy.Execute(
() =>
{
// execute your logic here
using(var transaction = _context.Database.BeginTransaction())
{
try
{
foreach (var item in record)
{
var portfolioCompanyLink = _context.PortfolioCompanyLink.FirstOrDefault(p => p.Id == item.Id);
portfolioCompanyLink.ModifiedBy = _loggedInUser;
portfolioCompanyLink.ModifiedOn = DateTime.UtcNow;
portfolioCompanyLink.URL = item.URL;
_context.SaveChanges();
//_context.PortfolioCompanyLink.Update(portfolioCompanyLink);
}
transaction.Commit();
}
catch(Exception ex) {
transaction.Rollback();
}
}
});
You can set the strategy globally too, but that depends on what you are trying to achieve.
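For context, the SqlServerRetryingExecutionStrategy in the error message is the one you opt into where the SQL Server provider is configured, and that registration is the global knob. A sketch, assuming EF Core's UseSqlServer and a hypothetical connection string:

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    // EnableRetryOnFailure is what installs SqlServerRetryingExecutionStrategy;
    // remove or adjust it here if you do not want every operation wrapped in retries.
    optionsBuilder.UseSqlServer(
        "Server=.;Database=MyDb;Trusted_Connection=True;", // hypothetical connection string
        sqlOptions => sqlOptions.EnableRetryOnFailure(maxRetryCount: 3));
}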
I have an issue where two records with the same PK can run through my program within milliseconds of each other, causing a PK violation because Entity Framework's check for whether the row exists runs for the second record before the first one has been written. I have a class that handles the AddOrUpdate and SaveChanges, shown here:
public async Task Process(NewEvent newEvent)
{
try
{
_unitOfWork.UnitRepository.AddOrUpdate(newEvent);
await _unitOfWork.SaveAsync();
}
catch (Exception ex)
{
var innerException = ex.InnerException.InnerException as SqlException;
if (ShouldRetry(innerException))
{
numRetries++;
Logger.Warning(innerException.Message);
Logger.Warning($"Now attempting retry #{numRetries}");
await Process(newEvent);
}
else
{
Logger.Error($"An error occured attempting to process {newEvent.eventType}");
throw;
}
}
}
If the PK violation occurs it should catch the exception and retry the method, so that the check runs again and by then the first record should have gone through. However, no matter how many retries I do (I tried 20), it always fails the check. Am I not waiting long enough?
Side note: my interfaces are registered as Scoped and should dispose the UnitOfWork every time.
I'm running the below code to update some records based on a bank transaction history file that is sent to us each morning. It's pretty basic stuff but, for some reason, when I hit the end, dbContext.GetChangeSet() reports "0" for all actions.
public void ProcessBatchFile(string fileName)
{
List<string[]> failed = new List<string[]>();
int recCount = 0;
DateTime dtStart = DateTime.Now;
using (ePermitsDataContext dbContext = new ePermitsDataContext())
{
try
{
// A transaction must be begun before any data is read.
dbContext.BeginTransaction();
dbContext.ObjectTrackingEnabled = true;
// Load all the records for this batch file.
var batchRecords = (from b in dbContext.AmegyDailyFiles
where b.FileName == fileName
&& b.BatchProcessed == false
&& (b.FailReason == null || b.FailReason.Trim().Length < 1)
select b);
// Loop through the loaded records
int paymentID;
foreach (var r in batchRecords)
{
paymentID = 0;
try
{
// We have to 'parse' the primary key, since it's stored as a string value with leading zero's.
if (!int.TryParse(r.TransAct.TrimStart('0'), out paymentID))
throw new Exception("TransAct value is not a valid integer: " + r.TransAct);
// Store the parsed, Int32 value in the original record and read the "real" record from the database.
r.OrderPaymentID = paymentID;
var orderPayment = this.GetOrderPayment(dbContext, paymentID);
if (string.IsNullOrWhiteSpace(orderPayment.AuthorizationCode))
// If we haven't processed this payment "Payment Received" do it now.
this.PaymentReceived(orderPayment, r.AuthorizationNumber);
// Update the PaymentTypeDetailID (type of Credit Card--all other types will return NULL).
var paymentTypeDetail = dbContext.PaymentTypes.FirstOrDefault(w => w.PaymentType1 == r.PayType);
orderPayment.PaymentTypeDetailID = (paymentTypeDetail != null ? (int?)paymentTypeDetail.PaymentTypeID : null);
// Match the batch record as processed.
r.BatchProcessed = true;
r.BatchProcessedDateTime = DateTime.Now;
dbContext.SubmitChanges();
}
catch (Exception ex)
{
// If there's a problem, just record the error message and add it to the "failed" list for logging and notification.
if (paymentID > 0)
r.OrderPaymentID = paymentID;
r.BatchProcessed = false;
r.BatchProcessedDateTime = null;
r.FailReason = ex.Message;
failed.Add(new string[] { r.TransAct, ex.Message });
dbContext.SubmitChanges();
}
recCount++;
}
dbContext.CommitTransaction();
}
// Any transaction will already be commited, if the process completed successfully. I just want to make
// absolutely certain that there's no chance of leaving a transaction open.
finally { dbContext.RollbackTransaction(); }
}
TimeSpan procTime = DateTime.Now.Subtract(dtStart);
// Send an email notification that the processor completed.
System.Text.StringBuilder sb = new System.Text.StringBuilder();
sb.AppendFormat("<p>Processed {0} batch records from batch file '{1}'.</p>", recCount, fileName);
if (failed.Count > 0)
{
sb.AppendFormat("<p>The following {0} records failed:</p>", failed.Count);
sb.Append("<ul>");
for (int i = 0; i < failed.Count; i++)
sb.AppendFormat("<li>{0}: {1}</li>", failed[i][0], failed[i][1]);
sb.Append("<ul>");
}
sb.AppendFormat("<p>Time taken: {0}:{1}:{2}.{3}</p>", procTime.Hours, procTime.Minutes, procTime.Seconds, procTime.Milliseconds);
EMailHelper.SendAdminEmailNotification("Batch Processing Complete", sb.ToString(), true);
}
The dbContext.BeginTransaction() method is something I added to the DataContext just to make it easy to use explicit transactions. I'm fairly confident that this isn't the problem, since it's used extensively elsewhere in the application. Our database design makes it necessary to use explicit transactions for a few, specific operations, and the call to "PaymentReceived" happens to be one of them.
I have stepped through the code and confirmed that the Rollback() method on the transaction itself is not being called, and I have also checked dbContext.GetChangeSet() before the call to CommitTransaction() happens, with the same result.
I have included the BeginTransaction(), CommitTransaction() and RollbackTransaction() method bodies below, just for clarity.
/// <summary>
/// Begins a new explicit transaction on this context. This is useful if you need to perform a call to SubmitChanges multiple times due to "circular" foreign key linkage, but still want to maintain an atomic write.
/// </summary>
public void BeginTransaction()
{
if (this.HasOpenTransaction)
return;
if (this.Connection.State != System.Data.ConnectionState.Open)
this.Connection.Open();
System.Data.Common.DbTransaction trans = this.Connection.BeginTransaction();
this.Transaction = trans;
this._openTrans = true;
}
/// <summary>
/// Commits the current transaction (if active) and submits all changes on this context.
/// </summary>
public void CommitTransaction()
{
this.SubmitChanges();
if (this.Transaction != null)
this.Transaction.Commit();
this._openTrans = false;
this.RollbackTransaction(); // Since the transaction has already been committed, this just disposes and decouples the transaction object itself.
}
/// <summary>
/// Disposes and removes an existing transaction on the this context. This is useful if you want to use the context again after an explicit transaction has been used.
/// </summary>
public void RollbackTransaction()
{
// Kill/Rollback the transaction, as necessary.
try
{
if (this.Transaction != null)
{
if (this._openTrans)
this.Transaction.Rollback();
this.Transaction.Dispose();
this.Transaction = null;
}
this._openTrans = false;
}
catch (ObjectDisposedException) { } // If this gets called after the object is disposed, we don't want to let it throw exceptions.
catch { throw; }
}
I just found the problem: my DBA didn't put a primary key on the table when he created it for me, so LINQ to SQL did not generate any of the "PropertyChanged" event/handler code in the entity class, which is why the DataContext was not aware that changes were being made. Apparently, if your table has no primary key, LINQ to SQL won't track any changes to that table, which makes sense, but it would be nice if there were some kind of notification to that effect. I'm sure my DBA didn't think about it, because this table is just a way of tracking which line items from the text file have been processed and doesn't directly relate to any other tables.
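For anyone hitting the same thing: adding the primary key in the database and regenerating the designer class fixes it; with a hand-written LINQ to SQL mapping the equivalent is marking the key column explicitly. A sketch, assuming TransAct is the natural key of the AmegyDailyFiles table (the real column list is longer):

using System.Data.Linq.Mapping;

[Table(Name = "AmegyDailyFiles")]
public class AmegyDailyFile
{
    // Without at least one IsPrimaryKey column, LINQ to SQL treats the table as read-only:
    // GetChangeSet() stays empty and SubmitChanges() silently ignores modifications.
    [Column(IsPrimaryKey = true)]
    public string TransAct { get; set; }

    [Column]
    public bool BatchProcessed { get; set; }

    // ... remaining columns mapped the same way
}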