Transaction Scope with Entity - c#

I have a Windows Forms application on .NET 4 that uses Entity Framework for the data layer.
I need one method to run inside a transaction, but in some simple tests I couldn't make it work.
In the BLL:
public int Insert(List<Estrutura> lista)
{
    using (TransactionScope scope = new TransactionScope())
    {
        id = this._dal.Insert(lista);
    }
}
In the DAL:
public int Insert(List<Estrutura> lista)
{
    using (Entities ctx = new Entities(ConnectionType.Custom))
    {
        ctx.AddToEstrutura(lista);
        ctx.SaveChanges(); // <-- exception is thrown here
    }
}
"The underlying provider failed on Open."
Anyone have any ideas?
PROBLEM RESOLVED - MY SOLUTION
I solved my problem by making some changes.
One of my DAL classes uses a bulk insert, while the others use Entity Framework.
The transaction problem occurred because the bulk copy's SQL transaction does not participate in the TransactionScope.
So I took Entity Framework out of that DAL and used a SqlTransaction instead, running the trivial statements with ExecuteScalar().
I don't believe this is the most elegant way to do it, but it solved my transaction problem.
Here is the code of my DAL
using (SqlConnection sourceConnection = new SqlConnection(Utils.ConnectionString()))
{
    sourceConnection.Open();
    using (SqlTransaction transaction = sourceConnection.BeginTransaction())
    {
        StringBuilder query = new StringBuilder();
        query.Append("INSERT INTO...");
        SqlCommand command = new SqlCommand(query.ToString(), sourceConnection, transaction);

        using (SqlBulkCopy bulk = new SqlBulkCopy(sourceConnection, SqlBulkCopyOptions.KeepNulls, transaction))
        {
            bulk.BulkCopyTimeout = int.MaxValue;
            bulk.DestinationTableName = "TABLE_NAME";
            bulk.WriteToServer(myDataTable);

            StringBuilder updateQuery = new StringBuilder();
            // another simple insert or update can be performed here
            updateQuery.Append("UPDATE... ");
            command.CommandText = updateQuery.ToString();
            command.Parameters.Clear();
            command.Parameters.AddWithValue("@SOME_PARAM", DateTime.Now);
            command.ExecuteNonQuery();

            transaction.Commit();
        }
    }
}
thanks for the help

According to the all-mighty Google, it seems that EF opens and closes the connection with each call to the database. Because of that, the transaction ends up spanning multiple connections and gets escalated to a distributed transaction. The way to get around this is to open and close the connection manually while using it.
Here's the information on the distributed transactions issue.
Here's how to manually open and close the connection.
A small code sample:
public int Insert(List<Estrutura> lista)
{
    using (TransactionScope scope = new TransactionScope())
    {
        using (Entities ctx = new Entities(ConnectionType.Custom))
        {
            ctx.Connection.Open();
            id = this._dal.Insert(ctx, lista);
        }
        scope.Complete();
    }
    return id;
}

public int Insert(Entities ctx, List<Estrutura> lista)
{
    ctx.AddToEstrutura(lista);
    return ctx.SaveChanges();
}

Instead of employing TransactionScope, it is better to employ the Unit of Work pattern when working with Entity Framework. Please refer to:
unit of work pattern
and also:
unit of work and persistence ignorance
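To make the suggestion concrete, here is a minimal sketch of the Unit of Work pattern. Everything in it is illustrative: the IUnitOfWork and InMemoryUnitOfWork names are not from the original code, and the in-memory list merely stands in for an Entity Framework context whose SaveChanges() would flush all pending changes in one transaction.

```csharp
using System;
using System.Collections.Generic;

// Sketch of the Unit of Work pattern with an in-memory stand-in for
// the EF context; all names here are illustrative.
public interface IUnitOfWork : IDisposable
{
    void RegisterNew(object entity);
    int Commit(); // returns the number of entities persisted
}

public class InMemoryUnitOfWork : IUnitOfWork
{
    private readonly List<object> _newEntities = new List<object>();
    public IList<object> Persisted { get; } = new List<object>();

    public void RegisterNew(object entity) => _newEntities.Add(entity);

    // All pending changes are flushed as a single unit, mirroring
    // DbContext.SaveChanges(), which wraps its work in one transaction.
    public int Commit()
    {
        foreach (var e in _newEntities)
            Persisted.Add(e);
        int count = _newEntities.Count;
        _newEntities.Clear();
        return count;
    }

    public void Dispose() { /* release the underlying context here */ }
}
```

In a real implementation, Commit() would delegate to ctx.SaveChanges() and the repositories would share the unit of work's context, so all inserts succeed or fail together.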

Related

How to use Microsoft SQL Server extended events to see the set options?

I am using Extended Events to log the query plan. I would also like to see the SET options. How can I do this using Extended Events? (I know SQL Server Profiler shows the SET options, but can this also be done using Extended Events?)
Background
I have a .NET Core application that uses Entity Framework Core.
I connect to one database using a context and execute a LINQ query.
Then, still inside the first context, I use SqlConnection + SqlCommand.ExecuteReader to execute a query directly against a second database. SqlCommand.ExecuteReader is slow (compared to executing the same query directly in SQL Server Management Studio) and I would like to know which SET options are enabled.
Here is the pseudo code
using (var context = new MyDBContext(options))
{
    var sensor = await context.Set<Sensor>()
        .AsNoTracking()
        .Where(s => s.Id == 4)
        .SingleAsync(cancellationToken);

    using (SqlConnection connection = new SqlConnection(connectionString2))
    {
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            connection.Open();
            SqlDataReader reader = command.ExecuteReader();
            try
            {
                reader.Read();
                value = Convert.ToInt32(reader["CountColumn"]);
            }
            finally
            {
                reader.Close();
            }
        }
    }

    var newResult = new Result(sensor.Id, value);
    context.Set<Result>().Add(newResult);
    sensor.LatestResult = newResult;
    context.Set<Sensor>().Update(sensor);
    await context.SaveChangesAsync(cancellationToken);
}
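While looking for the SET options, one low-tech check (not Extended Events, but often enough to explain "slow from SqlClient, fast in SSMS", which is commonly an ARITHABORT difference) is to ask the session itself: DBCC USEROPTIONS, run on the same connection as the slow command, returns the options in effect for that session. A minimal sketch; the helper names are mine, and the actual SqlCommand call is shown in a comment since it needs a live connection:

```csharp
using System;
using System.Collections.Generic;

// Sketch: inspect the SET options in effect on the slow connection.
// DBCC USEROPTIONS returns one (option, value) row per option in effect.
public static class SetOptionsHelper
{
    // Run this over the existing connection (connectionString2 above), e.g.:
    //   using (var cmd = new SqlCommand(UserOptionsSql, connection))
    //   using (var reader = cmd.ExecuteReader())
    //       while (reader.Read())
    //           rows.Add(new KeyValuePair<string, string>(
    //               reader.GetString(0), reader.GetValue(1).ToString()));
    public const string UserOptionsSql = "DBCC USEROPTIONS";

    // Pure helper: turn the (option, value) rows into a lookup, so the
    // session's options can be diffed against an SSMS session's output.
    public static Dictionary<string, string> ToOptionMap(
        IEnumerable<KeyValuePair<string, string>> rows)
    {
        var map = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        foreach (var row in rows)
            map[row.Key] = row.Value;
        return map;
    }
}
```

Comparing the resulting map against what SSMS reports for the same query is usually enough to spot the plan-changing option.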

System.ObjectDisposedException: 'Cannot access a disposed object. Object name: 'OracleConnection'.'

The following code uses Entity Framework 6 and Managed Oracle Providers to call an Oracle stored procedure that returns multiple cursors.
The using statement is throwing the following exception:
System.ObjectDisposedException: 'Cannot access a disposed object. Object name: 'OracleConnection'.'
If I remove the using statement and instead use the code in the following post, I get no errors.
Using Entity Framework to Call an Oracle Stored Procedure with Multiple Cursors
Why is the using statement causing an exception? It has been suggested to me that there is a bug with the Oracle Managed Provider. But, my colleagues are using the same provider and their using statements are working fine.
Example Code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Data;
using Oracle.ManagedDataAccess.Client;
using System.Data.Entity.Infrastructure;
namespace MyCompany
{
    public class MyClass
    {
        private MyDbContext _dbContext = new MyDbContext();

        public MyItems GetMyItems(string id)
        {
            var oneEntityList = new List<OneEntity>();
            var twoEntityList = new List<TwoEntity>();
            var threeEntityList = new List<ThreeEntity>();

            var sqlQuery = @"
                BEGIN
                    MY_PACKAGE.GetMyItems(:p_id, :p_cursor1, :p_cursor2, :p_cursor3);
                END;
            ";

            var oracleParameters = new List<OracleParameter>
            {
                new OracleParameter("p_id", id),
                new OracleParameter("p_cursor1", OracleDbType.RefCursor, ParameterDirection.Output),
                new OracleParameter("p_cursor2", OracleDbType.RefCursor, ParameterDirection.Output),
                new OracleParameter("p_cursor3", OracleDbType.RefCursor, ParameterDirection.Output)
            };

            using (var connection = _dbContext.Database.Connection)
            {
                connection.Open();
                var command = connection.CreateCommand();
                command.CommandText = sqlQuery;
                command.Parameters.AddRange(oracleParameters.ToArray());

                using (var reader = command.ExecuteReader())
                {
                    oneEntityList = ((IObjectContextAdapter)_dbContext).ObjectContext
                        .Translate<OneEntity>(reader)
                        .ToList();
                    reader.NextResult();
                    twoEntityList = ((IObjectContextAdapter)_dbContext).ObjectContext
                        .Translate<TwoEntity>(reader)
                        .ToList();
                    reader.NextResult();
                    threeEntityList = ((IObjectContextAdapter)_dbContext).ObjectContext
                        .Translate<ThreeEntity>(reader)
                        .ToList();
                }

                return new MyItems { OneEntity = oneEntityList, TwoEntity = twoEntityList, ThreeEntity = threeEntityList };
            }
        }
    }
}
It is correct and proper to use using statements around disposable objects when you own the lifetime; however, in this case: you don't! The connection here belongs to the data-context, and presumably the data-context itself is IDisposable, and it will dispose the connection when the data-context is disposed.
So: while you might be allowed to borrow the connection from the data-context for the purposes of executing queries - you shouldn't be trying to dispose it here. That would end up closing/disposing a connection at unexpected times, with unpredictable results.
Conversely: if you had a var conn = new OracleConnection(...), then clearly you do own that connection (unless you hand it to something that will manage the lifetime), and you should dispose it.
Just to complicate things further... currently, your MyClass seems to own the db-context, via:
private MyDbContext _dbContext = new MyDbContext();
So ideally, your MyClass should be disposable (: IDisposable), and disposing MyClass should cascade to dispose _dbContext.
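A minimal sketch of that ownership cascade; StubDbContext here is a hypothetical stand-in for MyDbContext, so the shape of the pattern is visible without Entity Framework:

```csharp
using System;

// Hypothetical stand-in for MyDbContext; in real code this would be
// the EF DbContext, which disposes its connection when disposed.
public class StubDbContext : IDisposable
{
    public bool IsDisposed { get; private set; }
    public void Dispose() => IsDisposed = true;
}

// MyClass owns the context it created, so it implements IDisposable
// and cascades the dispose to that context.
public class MyClassSketch : IDisposable
{
    private readonly StubDbContext _dbContext = new StubDbContext();
    private bool _disposed;

    public StubDbContext Context => _dbContext;

    public void Dispose()
    {
        if (_disposed) return;
        _disposed = true;
        _dbContext.Dispose(); // cascade: the owner disposes what it owns
    }
}
```

With this in place, callers wrap MyClass itself in a using block, and the context (and its connection) are released exactly once, at a predictable time.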

SqlBulkCopy Wcf Rest Moving Data between 2 sql server tables

I am trying to move data from some tables in one SQL Server database to another SQL Server database. I am planning to write a WCF REST service to do that, using SqlBulkCopy. On a button click I want to copy Table1's data from the source SQL Server to Table1 in the destination SQL Server, and likewise for Table2 and Table3. I am blocked on a couple of things. First, is SqlBulkCopy a good option for transferring data with a WCF REST service? (I was asked not to use SSIS in this task.) Second, if there is any exception while moving data from source to destination, the destination data should be reverted, like a transaction. How do I implement this transaction functionality? Any pointers would help.
Based on the info you gave, your solution would be something like the sample code below. Pointers for SqlBulkCopy: BCopyTutorial1, BCopyTutorial2. SqlBulkCopy with transaction scope examples: TrasactionBulkCopy.
My version is a simplified version of these, based on your question.
Interface:
[ServiceContract]
public interface IYourBulkCopyService
{
    [OperationContract]
    void PerformBulkCopy();
}
Implementation:
public class YourBulkCopyService : IYourBulkCopyService
{
    public void PerformBulkCopy()
    {
        string sourceCs = ConfigurationManager.AppSettings["SourceCs"];
        string destinationCs = ConfigurationManager.AppSettings["DestinationCs"];

        // Open a connection to the source database.
        using (SqlConnection sourceConnection = new SqlConnection(sourceCs))
        {
            sourceConnection.Open();

            // Get data from the source table as a SqlDataReader.
            SqlCommand commandSourceData = new SqlCommand(
                "SELECT * FROM YourSourceTable", sourceConnection);
            SqlDataReader reader = commandSourceData.ExecuteReader();

            // Set up the bulk copy object inside the transaction.
            using (SqlConnection destinationConnection = new SqlConnection(destinationCs))
            {
                destinationConnection.Open();
                using (SqlTransaction transaction = destinationConnection.BeginTransaction())
                {
                    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
                        destinationConnection, SqlBulkCopyOptions.KeepIdentity, transaction))
                    {
                        bulkCopy.BatchSize = 10;
                        bulkCopy.DestinationTableName = "YourDestinationTable";

                        // Write from the source to the destination.
                        try
                        {
                            bulkCopy.WriteToServer(reader);
                            transaction.Commit();
                        }
                        catch (Exception ex)
                        {
                            // If anything fails, roll back the destination changes.
                            Console.WriteLine(ex.Message);
                            transaction.Rollback();
                        }
                        finally
                        {
                            reader.Close();
                        }
                    }
                }
            }
        }
    }
}

TransactionScope timing out connection on dev virtualbox VM

I have some code thus:
private static void Delete(int PaxID)
{
    using (SqlConnection conn = DataHelper.GetDBConnection())
    {
        using (SqlCommand cmd = DataHelper.GetSPCommand("spDeletePax", conn))
        {
            cmd.Parameters.AddWithValue("@PaxID", PaxID);
            cmd.ExecuteNonQuery();
        }
    }
}

public static void DeletePaxes(List<int> ids, string bookingRef, string user)
{
    using (TransactionScope ts = DataHelper.CreateTransactionScope())
    {
        foreach (var i in ids)
        {
            Delete(i);
        }
        ts.Complete();
    }
}

public static SqlConnection GetDBConnection()
{
    SqlConnection conn = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["DB"].ConnectionString);
    conn.Open();
    return conn;
}

public static TransactionScope CreateTransactionScope()
{
    var transactionOptions = new TransactionOptions();
    transactionOptions.IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted;
    transactionOptions.Timeout = TransactionManager.MaximumTimeout;
    return new TransactionScope(TransactionScopeOption.Required, transactionOptions);
}
which until recently was working fine.
I have changed no code; I simply moved my source control from VSS to SVN, and now open the project in VS2012 (instead of 2008).
If I call DeletePaxes(...), the first delete works but the second times out when connecting to the DB.
Am I just doing this wrong, or do VS2012/.NET 4/4.5 handle transactions differently? I have done some googling and turned up nothing (hence posting here).
Can anyone enlighten me as to what might be going on?
DTC is set up, so I don't think it's that, and like I say it was working fine until I changed the source control.
Also, if I change the transaction to just the default, not using the static method, it still fails.
Removing the transaction works fine.
I am using the transaction because I need ALL or NONE of the deletes to succeed.
Thanks.
OK, don't beat me, but it turns out it was a problem with the VM setup:
the network adapter was set to NAT, not Bridged Adapter, and all is well now.
I guess the transaction was not able to find its way back to the VM and so just sat there.
It may just be me who is foolish enough to forget that, but thanks for all the replies anyway, and I will do some reading up on table-valued parameters.
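Since table-valued parameters came up: they would let all the ids travel in a single command, so the whole delete becomes one round-trip instead of one connection per id inside the transaction. A sketch under stated assumptions: the dbo.IntList table type, the @PaxIDs parameter, and the Pax table are hypothetical names, and only the DataTable packing is shown as runnable code.

```csharp
using System;
using System.Collections.Generic;
using System.Data;

public static class TvpHelper
{
    // Packs a list of ids into the DataTable shape expected by a
    // single-column table-valued parameter (one INT column named Id).
    public static DataTable ToIdTable(IEnumerable<int> ids)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        foreach (var id in ids)
            table.Rows.Add(id);
        return table;
    }

    // Usage against SQL Server (not run here; needs a live connection,
    // and all the SQL-side names are hypothetical):
    //   var p = cmd.Parameters.AddWithValue("@PaxIDs", ToIdTable(ids));
    //   p.SqlDbType = SqlDbType.Structured;
    //   p.TypeName = "dbo.IntList";
    //   cmd.CommandText = "DELETE FROM Pax WHERE PaxID IN (SELECT Id FROM @PaxIDs)";
}
```

Because the deletes then execute as one statement, the all-or-none requirement is met without a TransactionScope spanning multiple connections.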

How to use SqlBulkCopy with SMO and transactions

I'm trying to create a table using SMO, and then use the SqlBulkCopy object to inject a bunch of data into that table. I can do this without a transaction, like this:
Server server = new Server(new ServerConnection(new SqlConnection(connectionString)));
var database = server.Databases["MyDatabase"];
using (SqlConnection connection = server.ConnectionContext.SqlConnectionObject)
{
    try
    {
        connection.Open();
        Table table = new Table(database, "MyNewTable");
        // --- Create the table and its columns --- //
        SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(connection);
        sqlBulkCopy.DestinationTableName = "MyNewTable";
        sqlBulkCopy.WriteToServer(dataTable);
    }
    catch (Exception)
    {
        throw;
    }
}
Basically I want to perform the above using a SqlTransaction object and committing it when the operation has been completed (Or rolling it back if it fails).
Can anyone help?
Two things:
A - The SqlBulkCopy operation is already transaction-based by default. That means the copy itself is encapsulated in a transaction and succeeds or fails as a unit.
B - The ServerConnection object has methods for StartTransaction, CommitTransaction, and RollbackTransaction.
You should be able to use those methods in your code above, and if there is an issue with the table creation, I suspect your try/catch will handle it appropriately.
