I have searched a lot on the Internet without finding a similar case. I have one TransactionScope with several DbContext instances.
I want to commit the changes to the database only if all of the contexts have saved their changes successfully.
The problem I'm facing is that I had to call generalContext.SaveChanges() in the middle of the code, because the changes apply to data that was retrieved through generalContext earlier. But I noticed that those changes are committed right away after calling generalContext.SaveChanges().
What am I doing wrong?
I have tried both TransactionScopeOption.Required and TransactionScopeOption.RequiresNew, without being able to solve the problem.
var transactionOptions = new TransactionOptions();
transactionOptions.Timeout = TimeSpan.FromMinutes(30);
using (var scope = new TransactionScope(TransactionScopeOption.Required, transactionOptions))
using (var generalContext = new CoreEntities())
{
    try
    {
        using (var subContext = new CoreEntities())
        {
            // the problem is here: the changes are committed!!!
            generalContext.SaveChanges();
            subContext.SaveChanges();
        }
        scope.Complete();
    }
    catch
    {
        scope.Dispose();
    }
}
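One common cause (an assumption, not confirmed by the question) is that a context's database connection was opened before the TransactionScope existed, so it never enlists in the ambient transaction and each SaveChanges commits on its own connection-local transaction. A minimal sketch of the intended pattern, assuming EF6 and the CoreEntities context from the question:

```csharp
// Hypothetical sketch: create both contexts (and let them open their
// connections) only AFTER the TransactionScope exists, so both
// SaveChanges calls enlist in the same ambient transaction.
using (var scope = new TransactionScope(TransactionScopeOption.Required, transactionOptions))
{
    using (var generalContext = new CoreEntities())
    using (var subContext = new CoreEntities())
    {
        // ... make changes on both contexts ...
        generalContext.SaveChanges(); // enlisted: not durable until Complete()
        subContext.SaveChanges();     // enlisted in the same ambient transaction
        scope.Complete();             // both commits become durable here
    }
} // if Complete() was not called, everything rolls back on Dispose
```

Note that two enlisted connections may escalate the transaction to MSDTC, which must be enabled on the machine for this to work.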
Related
I have several tables in the DB, and I want to remove all the data, repopulate the tables, and only then save the changes (so that if the save fails I can return to the old data).
When I remove the data from the DB and then try to add the new data, it fails with "Adding a relationship with an entity which is in the Deleted state is not allowed." But when I remove the data, save, then add the new data and save again, everything works fine.
Here is my code, if it helps in understanding the problem:
// create the new data
SomeDataHolder data = ... ;
// save some data to re-enter back after changes
List<User> usersSave = ctx.Users.ToList();
List<UserPreferences> userPrefsSave = ctx.UserPreferences.ToList();
//clear DB
ctx.UserCourses.RemoveRange(ctx.UserCourses);
ctx.Users.RemoveRange(ctx.Users);
ctx.Specializtions.RemoveRange(ctx.Specializtions);
ctx.Course_Predecessor.RemoveRange(ctx.Course_Predecessor);
ctx.Courses.RemoveRange(ctx.Courses);
ctx.Departments.RemoveRange(ctx.Departments);
ctx.GroupsDetails.RemoveRange(ctx.GroupsDetails);
ctx.LinkTable.RemoveRange(ctx.LinkTable);
// this next line makes everything work; without it the code fails on the next save
// ctx.SaveChanges();
updateDepartmentsCoursesSpecialization(ctx, data.Specializations);
updateCoursePredecessorsAndParallel(ctx, data.Predecessors);
updateGroupDetails(ctx, data.GroupDetails);
updateLectureToPractice(ctx, data.LinkLectureWithPractice);
ctx.Users.AddRange(usersSave);
ctx.UserPreferences.AddRange(userPrefsSave);
ctx.SaveChanges();
You have to use a transaction here, because you're doing more than one atomic operation in your code. A transaction lets you combine several operations into one unit of work within the same context; if anything inside the transaction fails, everything is rolled back.
A transaction code snippet looks like this:
using (var ctx = new MyContext())
{
    using (var dbContextTransaction = ctx.Database.BeginTransaction())
    {
        try
        {
            // 1st operations here
            ctx.GroupsDetails.RemoveRange(ctx.GroupsDetails);
            ctx.LinkTable.RemoveRange(ctx.LinkTable);
            ctx.SaveChanges();

            // 2nd operations here
            ctx.Users.AddRange(usersSave);
            ctx.UserPreferences.AddRange(userPrefsSave);
            ctx.SaveChanges();

            dbContextTransaction.Commit();
        }
        catch (Exception)
        {
            dbContextTransaction.Rollback();
        }
    }
}
You can refer to Working with Transactions for more info.
I am building an application using Entity Framework 6, but I am running into problems with memory usage. No matter what I try, I sooner or later hit an out-of-memory error. So far I have tried the following:
Using using for the context.
Saving changes in batches and disposing of the context.
Manually calling GC.Collect().
But none of these prevent Entity Framework from using more memory with every SaveChanges I do, eventually hitting the 2 GB limit and crashing my program.
Is there any way I am unaware of to make Entity Framework release all its memory?
Edit
using (var sqlite = new myEntities())
{
    sqlite.Configuration.AutoDetectChangesEnabled = false;
    sqlite.Configuration.ValidateOnSaveEnabled = false;
    foreach (var someItem in someList)
    {
        var newItem = new Item
        {
            ...
        };
        sqlite.tableName.Add(newItem);
        if (++countRecords % 1000 == 0)
        {
            sqlite.SaveChanges();
        }
    }
    sqlite.SaveChanges();
}
As described above, I have also tried creating the context without the using statement and disposing of it after SaveChanges:
if (++countRecords % 1000 == 0)
{
    sqlite.SaveChanges();
    sqlite.Dispose();
    sqlite = new myEntities();
}
If it is indeed a batch issue, try something like this:
int batchSize = 10;
for (int i = 0; i <= someList.Count / batchSize; i++)
{
    var batch = someList.Skip(batchSize * i).Take(batchSize);
    using (var sqllite = new myEntities())
    {
        foreach (var item in batch)
        {
            var newItem = new Item() { ... };
            sqllite.tableName.Add(newItem);
        }
        sqllite.SaveChanges();
    }
}
This inverts the using statement to dispose of the context after each batch, clearing it out and starting fresh for each batch.
This code was written in Notepad++, so be careful to clean it up if you try it.
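If recreating the context per batch is awkward in your setup, another option (a sketch, assuming EF6 — the entity names are the question's) is to keep one context but detach everything it is tracking after each SaveChanges, so saved entities become eligible for garbage collection:

```csharp
// Hypothetical sketch (EF6): detach all tracked entries after each batch
// so the change tracker does not grow without bound across batches.
if (++countRecords % 1000 == 0)
{
    sqlite.SaveChanges();
    foreach (var entry in sqlite.ChangeTracker.Entries().ToList())
    {
        entry.State = EntityState.Detached;
    }
}
```

The ToList() copy matters: detaching while iterating the live Entries() enumeration would modify the collection mid-iteration.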
I have the following code that opens a session with RavenDB, gets the relevant IDs, uses those IDs to load the entities, changes them, and finally saves them.
List<EventDescriptor> events;
using (var session = raven.OpenSession())
{
    session.Store(aggregate);
    session.SaveChanges();
    events = (from descriptor in session.Query<EventDescriptor>()
              where descriptor.AggregateId == aggregate.Id
              select descriptor).ToList();
}
using (var session = raven.OpenSession())
{
    foreach (var @event in events)
    {
        var e = session.Load<EventDescriptor>("EventDescriptors/" + @event.Id.ToString());
        e.Saved = true;
    }
    session.SaveChanges();
}
The problem, however, is that the changes to the entities don't seem to be tracked, and I can't delete the entities either (I get an "unknown entity" error), even though the object is loaded. I have already tried calling SaveChanges inside the loop, but that didn't help. I have looked at the Raven documentation, but I don't see what I'm doing wrong here.
Yes, we can't track changes on structs, because every time you change one, you create a new copy.
The problem was that EventDescriptor was a struct, not a class. Changing this solved the problem. I assume it's because a struct is a value type and not a reference type.
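The copy semantics behind this answer are easy to demonstrate outside RavenDB; a minimal sketch (types are illustrative, not from the question):

```csharp
using System.Collections.Generic;

struct PointStruct { public int X; }
class PointClass   { public int X; }

// Structs are copied on access: mutating the copy leaves the original alone.
var structs = new List<PointStruct> { new PointStruct { X = 1 } };
var copy = structs[0];   // a fresh copy of the element
copy.X = 42;             // mutates only the copy
// structs[0].X is still 1 — a session holding the "original" never sees the edit

// Classes are accessed by reference: mutation is visible through every reference.
var classes = new List<PointClass> { new PointClass { X = 1 } };
var same = classes[0];   // a reference to the same instance
same.X = 42;             // mutates the shared instance
// classes[0].X is now 42 — this is why the session can track the change
```

A document session tracks the instance it handed you; with a struct, the instance you mutate is never the one the session holds.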
I have developed a WCF API which uses NHibernate. I am new to this. I have used session.Update to take care of the transaction. I have a for loop in which, based on a select condition, I update a record: if A is present in table1, I update the table; otherwise I insert a new entry.
I am getting "could not execute query." when trying to execute a select query on a table to which a new entry had previously been added.
What I think is happening: because I am using session.Save(table1) and then trying to select entries from that table, I get an error. Since session.Save temporarily locks the table, I am not able to execute a select query on it.
What could the solution to this be?
Update:
This is the for loop I am using to check the database for some field:
using (ITransaction tranx = session.BeginTransaction())
{
    savefunction();
    tranx.Commit();
}
Save function:
public void savefunction()
{
    for (int i = 0; i < dictionary.Count; i++)
    {
        ICandidateAttachmentManager candidateAttach = new ManagerFactory().GetCandidateAttachmentManager();
        CandidateAttachment attach = new CandidateAttachment();
        attach = checkCV();
        if (attach == null)
        {
            // insert new entry into table attach
            session.Save(attach);
        }
    }
}
checkCV function:
public void checkCV()
{
    using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
    {
        IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
        if (lstCandidateAttachment.Count > 0)
        {
            CandidateAttachment attach = lstCandidateAttachment.Where(x => x.CandidateAttachementType.Id.Equals(FileType)).FirstOrDefault();
            if (attach != null)
            {
                return null;
            }
            else
            {
                return "some string";
            }
        }
    }
}
What is happening here: in the for loop, say for i=2 the attach value comes back null, so I enter a new entry into the attach table. Then for i=3, when it enters the checkCV function, I get an error at this line:
IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
I think it is because, since I am using session.Save and then trying to read the table contents, I am unable to execute the query; the table is locked until I commit my session. Between BeginTransaction and Commit, the table associated with the object is locked. How can I achieve this? Any ideas?
Update:
I have read up on some of the posts. It looks like I need to set the isolation level for the transaction, but even after adding it, it doesn't seem to work. Here is how I tried to implement it:
using (ITransaction tranx = session.BeginTransaction(IsolationLevel.ReadUncommitted))
{
    saveDocument();
}
Something I don't understand in your code is where you get your NHibernate session.
Indeed, you use
new ManagerFactory().GetCandidateAttachmentManager();
and
using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
so your ManagerFactory class provides you the ISession?
Then you do:
CandidateAttachment attach = new CandidateAttachment();
attach = checkCV();
but checkCV() returns either null or a string?
Finally, you should never call
Save()
but instead
SaveOrUpdate()
Hope that helps you resolve your issue.
Feel free to give more details.
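The SaveOrUpdate suggestion might look like this (a sketch only, assuming the same session and entity from the question):

```csharp
// Hypothetical sketch: SaveOrUpdate inserts attach when it is transient
// and updates it when it already has an identifier, so the insert-vs-update
// branching in the loop collapses into a single call.
using (ITransaction tranx = session.BeginTransaction())
{
    session.SaveOrUpdate(attach);
    session.Flush();   // push pending statements now, so later queries in
                       // the same transaction can see this row
    tranx.Commit();
}
```

Flushing inside the transaction is what lets a subsequent select in the same loop iteration observe the just-saved entry without waiting for the commit.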
I'm trying to update a field in one table just after I have added a row to a different table (the value just shows that the row has been imported). I thought I was using the right code here, but the bool Imported field isn't updated. Here is the relevant code:
using (DbContext db = new DbContext())
{
    db.Details.Add(details);
    db.SaveChanges();
    newID = details.DetailsID;

    AccessRepository rep = new AccessRepository();
    AccessDetails detailUpdate = rep.GetByID(item.AccessDetailsTableID);
    detailUpdate.Imported = true;
    db.SaveChanges();
}
The first SaveChanges call works, since I'm adding a new row, but the second one doesn't. It successfully retrieves the data back from the repository but just doesn't update the value.
Any ideas why it might not be working?
Thanks
I think this is because your AccessRepository is using a different data context to the one in scope (in your posted code).
You could add a SaveChanges method to your AccessRepository that does the same thing, but on the correct data context.
However, the issue with calling two saves is that you lose the single-transaction benefits. So if those two updates are related, you really should call SaveChanges only once.
I would create an Add method and a Save method in your AccessRepository and then use something like this...
AccessRepository rep = new AccessRepository();
rep.Add(details);
AccessDetails detailUpdate = rep.GetByID(item.AccessDetailsTableID);
detailUpdate.Imported = true;
rep.Save(); // this calls SaveChanges on a single data context
hope that helps
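A repository shaped that way might look like this (a sketch only; the entity and method names mirror the question, and the Set-based bodies are assumptions about how the repository reaches its context):

```csharp
// Hypothetical sketch: the repository wraps ONE DbContext, so GetByID,
// Add and Save all share the same change tracker and commit together.
public class AccessRepository
{
    private readonly DbContext db;

    public AccessRepository(DbContext db)
    {
        this.db = db;
    }

    public void Add(Details details)
    {
        db.Set<Details>().Add(details);
    }

    public AccessDetails GetByID(int id)
    {
        return db.Set<AccessDetails>().Find(id); // tracked by this context
    }

    public void Save()
    {
        db.SaveChanges(); // one commit covering both the Add and the edit
    }
}
```

Because GetByID returns a tracked entity from the same context that Save flushes, setting detailUpdate.Imported = true before rep.Save() is persisted in the same unit of work as the Add.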
Shouldn't you use newID in the GetByID() call?
using (DbContext db = new DbContext())
{
    db.Details.Add(details);
    db.SaveChanges();
    newID = details.DetailsID;

    AccessRepository rep = new AccessRepository();
    AccessDetails detailUpdate = rep.GetByID(newID);
    detailUpdate.Imported = true;
    db.SaveChanges();
}