Entity Framework not Saving Changes into Database - c#

I'm puzzled as to why this code is not working. It should save the changes to the database after the loops, but while placing the SaveChanges call inside the loop saves the records, outside the loop it doesn't save anything. It's only about 300 to 1,000 records.
static bool lisReady = false;
static bool sacclReady = false;

static void Main(string[] args)
{
    Logger("Starting services");
    ConnectDBLis().Wait();
    ConnectDBSaccl().Wait();
    Thread.Sleep(1000);
    if (lisReady & sacclReady)
    {
        //start
        Logger("Services ready");
        StartExport().Wait();
    }
}

static async Task<bool> StartExport()
{
    lis lisdb = new lis();
    nrlsaccl saccldb = new nrlsaccl();
    var getTestOrders = await lisdb.test_orders.ToListAsync();
    Logger("Services starting");
    foreach (var tO in getTestOrders.Where(x => x.entry_datetime.Value.Year == 2016))
    {
        foreach (var tr in tO.test_results)
        {
            foreach (var tL in tr.test_result_logs)
            {
                results_availability postResults = new results_availability
                {
                    first_name = tO.patient_orders.patient.first_name,
                    middle_name = tO.patient_orders.patient.middle_name,
                    last_name = tO.patient_orders.patient.last_name,
                    birthdate = tO.patient_orders.patient.birthdate,
                };
                if (postResults.id == 0)
                {
                    saccldb.results_availability.Add(postResults);
                }
                else
                {
                    saccldb.Entry(postResults).State = EntityState.Modified;
                }
            }
        }
    }
    await saccldb.SaveChangesAsync();
    return true;
}
Edit:
I limited it to 100 records and SaveChanges works; 3,000 records at once does not. Any solutions?

This code doesn't completely resolve your issue; these are some considerations for your problem.
Note: this works for me when adding 1,200 records and making 300 modifications.
static async Task<bool> StartExport()
{
    using (var db = new Entities())
    {
        var appraisals = await db.Appraisals.ToListAsync();
        db.Database.CommandTimeout = 300;
        //Disabling automatic change detection brings some performance gains
        db.Configuration.AutoDetectChangesEnabled = false;
        foreach (var appraisal in appraisals.Where(g => g.Id > 1))
        {
            if (appraisal.Id == 10)
            {
                appraisal.AppraisalName = "New name";
                db.Entry(appraisal).State = EntityState.Added;
            }
            else
            {
                appraisal.AppraisalName = "Modified name";
                db.Entry(appraisal).State = EntityState.Modified;
            }
        }
        db.Configuration.AutoDetectChangesEnabled = true;
        if (await db.SaveChangesAsync() > 1)
            return true;
        else
            return false;
    }
}
You could use db.Database.CommandTimeout = 300; to increase the command timeout.
Entity Framework 6 provides AddRange(), which adds the items in one shot; internally it disables AutoDetectChangesEnabled while adding the entities.
In your case you don't want to mark the entities as Modified; EF already tracks them well. See Entity Framework - Why explicitly set entity state to modified?
The purpose of change tracking is to detect that you have changed a value on an attached entity and put it into the Modified state. Setting the state manually matters for detached entities (entities loaded without change tracking or created outside of the current context).
Here all the entities are attached to the context itself.
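If a single SaveChangesAsync call over thousands of rows still times out, one option is to save in batches and recreate the context periodically so the change tracker stays small. This is only a sketch, not part of the original answer; the batch size of 500 is an arbitrary assumption, and it reuses the nrlsaccl context and results_availability set from the question:

```csharp
// Hypothetical sketch: insert a large list in batches of 500,
// recreating the DbContext so the change tracker never grows large.
static async Task BulkInsertAsync(List<results_availability> rows)
{
    const int batchSize = 500; // assumed; tune for your workload
    for (int i = 0; i < rows.Count; i += batchSize)
    {
        using (var db = new nrlsaccl())
        {
            db.Configuration.AutoDetectChangesEnabled = false;
            db.results_availability.AddRange(rows.Skip(i).Take(batchSize));
            await db.SaveChangesAsync(); // each batch commits independently
        }
    }
}
```

Note that each batch commits on its own, so a failure partway through leaves earlier batches saved; wrap the loop in a transaction if all-or-nothing behavior matters.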

Use
saccldb.SaveChanges()
simply because the async nature of await saccldb.SaveChangesAsync() causes your thread to continue and exit the function before saving to the database completes. In your case it returns true.
I would suggest not using async operations in a console application unless it has a user interface that you want to keep responsive.

Related

How to bulk insert/update lots of data in the correct way?

The following boggles my mind:
I have to bulk insert a lot of changes; some are inserts, some are updates. I am not sure of the best way to do it.
Logic looks something like this:
public class Worker
{
    public void Run()
    {
        var mailer = new Mailer();
        HashSet<DbElement> dbElementsLookUp = new HashSet<DbElement>(dbContext.DbElements);
        List<Element> elements = GetSomeChangesFromSomewhere();
        var dbElementsToSave = new List<DbElement>();
        foreach (var element in elements)
        {
            CreateOrUpdateDbElement(element, dbElementsLookUp, dbElementsToSave);
            // Sends some data based on the element - due to legacy implementation it uses its own context
            mailer.SendSomeLogging(element);
        }
        try
        {
            dbContext.ChangeTracker.DetectChanges();
            dbContext.Set<DbElement>().AddRange(dbElementsToSave);
            dbContext.SaveChanges();
        }
        catch (Exception e)
        {
            LogErrors(e);
        }
    }

    private void CreateOrUpdateDbElement(Element element, HashSet<DbElement> lookUp, List<DbElement> dbElementsToSave)
    {
        var entity = lookUp.FirstOrDefault(e => e.Id == element.Id);
        if (entity is not null)
        {
            entity.SomeProperty = element.SomeProperty;
            dbContext.Configuration.AutoDetectChangesEnabled = false;
            dbContext.Entry(entity).State = EntityState.Modified;
            dbContext.Configuration.AutoDetectChangesEnabled = true;
        }
        else
        {
            dbElementsToSave.Add(new DbElement
            {
                SomeProperty = element.SomeProperty,
                CreationDate = DateTime.Now
            });
        }
    }
}
I'm not sure of the best way to do this, especially regarding DetectChanges. Is it safe to disable AutoDetectChangesEnabled and call DetectChanges once outside of the foreach? I am working with a lot of data, and due to the legacy implementation it is pretty slow, because for each mail there is a write operation on the database. The mailer actually works on another instance of the context, so it does not interfere with saving the DbElement objects.
Is it better to add the entities to update to another list and handle them the same way as the added entities?
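One common pattern for the question above is to disable automatic change detection once around the whole loop and set entity states explicitly, rather than toggling the flag per element. This is a sketch under assumptions: the identifiers come from the question's code, and it presumes an EF6-style Configuration API:

```csharp
// Sketch: one AutoDetectChangesEnabled toggle for the whole batch.
// Explicitly setting EntityState.Modified avoids the full tracker scan
// that DetectChanges would otherwise perform per operation.
dbContext.Configuration.AutoDetectChangesEnabled = false;
try
{
    foreach (var element in elements)
    {
        var entity = dbElementsLookUp.FirstOrDefault(e => e.Id == element.Id);
        if (entity is not null)
        {
            entity.SomeProperty = element.SomeProperty;
            dbContext.Entry(entity).State = EntityState.Modified; // explicit state
        }
        else
        {
            dbElementsToSave.Add(new DbElement
            {
                SomeProperty = element.SomeProperty,
                CreationDate = DateTime.Now
            });
        }
    }
    dbContext.Set<DbElement>().AddRange(dbElementsToSave); // detects changes once internally
    dbContext.SaveChanges();
}
finally
{
    dbContext.Configuration.AutoDetectChangesEnabled = true; // always restore
}
```

The try/finally matters: if an exception escapes the loop, the context would otherwise be left with change detection permanently off.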

Problem trying to clone a Project Server database using OData and Entity Framework [duplicate]

This question already has answers here:
How can I use Fast Member to Bulk Copy data into a table with inconsistent column names?
(2 answers)
Closed 2 years ago.
I am having trouble updating my entities with Parallel.ForEach. The program works fine using foreach to update the entities, but with Parallel.ForEach it gives me an error like: "ArgumentException: An item with the same key has already been added". I have no idea why this happens; shouldn't it be thread safe? Why am I getting this error, and how can I resolve it?
The program gets some data from one database and copies it to another. If a datarow with the same GUID exists (see below) and the status is unchanged, the matching datarow in the second database must be updated. If there is a match but the status has changed, the modifications must be ignored. Finally, if there is no match in the second database, the datarow is inserted into it (synchronizing the two databases). I want to speed up the process somehow, which is why I first thought of parallel processing.
(I am using Autofac as an IoC container and dependency injection if that matters)
Here is the code snippet which tries to update:
/* #param reports: data from the first database */
public string SynchronizeData(List<Reports> reports, int statusid)
{
    // reportdataindatabase - the second database data, List() actually selects all, see next code snippet
    List<Reports> reportdataindatabase = unitOfWorkTAFeedBack.ReportsRepository.List().ToList();
    int allcount = reports.Count;
    int insertedcount = 0;
    int updatedcount = 0;
    int ignoredcount = 0;
    // DOES NOT WORK, GIVES THE ERROR
    Parallel.ForEach(reports, r =>
    {
        var guid = reportdataindatabase.FirstOrDefault(x => x.AssignmentGUID == r.AssignmentGUID);
        if (guid == null)
        {
            unitOfWorkTAFeedBack.ReportsRepository.Add(r); // an insert on the repository
            insertedcount++;
        }
        else
        {
            if (guid.StatusId == statusid)
            {
                r.ReportsID = guid.ReportsID;
                unitOfWorkTAFeedBack.ReportsRepository.Update(r); // update on the repo
                updatedcount++;
            }
            else
            {
                ignoredcount++;
            }
        }
    });
    /* WORKS PERFECTLY BUT RELATIVELY SLOW - takes 80 seconds to update 1287 records
    foreach (Reports r in reports)
    {
        var guid = reportdataindatabase.FirstOrDefault(x => x.AssignmentGUID == r.AssignmentGUID); // find match between the two databases
        if (guid == null)
        {
            unitOfWorkTAFeedBack.ReportsRepository.Add(r); // no match, insert
            insertedcount++;
        }
        else
        {
            if (guid.StatusId == statusid)
            {
                r.ReportsID = guid.ReportsID;
                unitOfWorkTAFeedBack.ReportsRepository.Update(r);
                updatedcount++;
            }
            else
            {
                ignoredcount++;
            }
        }
    } */
    unitOfWorkTAFeedBack.Commit(); // this only calls SaveChanges() on the DbContext object
    int allprocessed = insertedcount + updatedcount + ignoredcount;
    string result = "Synchronization finished. " + allprocessed + " reports processed out of " + allcount + ", "
        + insertedcount + " has been inserted, " + updatedcount + " has been updated and "
        + ignoredcount + " has been ignored. \n Press a button to dismiss this window.";
    return result;
}
The program breaks in this repository class, in the Update method (with Parallel.ForEach; no problem with the standard foreach):
public class EntityFrameworkReportsRepository : IReportsRepository
{
    private readonly TAFeedBackContext tAFeedBackContext;

    public EntityFrameworkReportsRepository(TAFeedBackContext tAFeedBackContext)
    {
        this.tAFeedBackContext = tAFeedBackContext;
    }

    public void Add(Reports r)
    {
        tAFeedBackContext.Reports.Add(r);
    }

    public void Delete(int Id)
    {
        var obj = tAFeedBackContext.Reports.Find(Id);
        tAFeedBackContext.Reports.Remove(obj);
    }

    public Reports Get(int Id)
    {
        var obj = tAFeedBackContext.Reports.Find(Id);
        return obj;
    }

    public IQueryable<Reports> List()
    {
        return tAFeedBackContext.Reports.AsNoTracking();
    }

    public void Update(Reports r)
    {
        var entry = tAFeedBackContext.Entry(r); // The Program Breaks At This Point!
        if (entry.State == EntityState.Detached)
        {
            tAFeedBackContext.Reports.Attach(r);
            tAFeedBackContext.Entry(r).State = EntityState.Modified;
        }
        else
        {
            tAFeedBackContext.Entry(r).CurrentValues.SetValues(r);
        }
    }
}
Please bear in mind it is hard to give a complete answer, as there are things I need clarity on, but the comments should help build a picture.
Parallel.ForEach(reports, r => // Parallel.ForEach is not the answer..
{
    // reportdataindatabase is built beforehand, so it's ok here
    // do you really want FirstOrDefault vs SingleOrDefault?
    var guid = reportdataindatabase.FirstOrDefault(x => x.AssignmentGUID == r.AssignmentGUID);
    if (guid == null)
    {
        // this is done on the context, not the DB; unresolved (not yet executed)
        unitOfWorkTAFeedBack.ReportsRepository.Add(r); // an insert on the repository
        //insertedcount++; you would need a lock
    }
    else
    {
        if (guid.StatusId == statusid)
        {
            r.ReportsID = guid.ReportsID;
            // this is done on the context, not the DB; unresolved (not yet executed)
            unitOfWorkTAFeedBack.ReportsRepository.Update(r); // update on the repo
            //updatedcount++; you would need a lock
        }
        else
        {
            //ignoredcount++; you would need a lock
        }
    }
});
The issue here is that reportdataindatabase can contain the same key twice, and the context is only brought up to date after the fact, i.e. when it gets here:
unitOfWorkTAFeedBack.Commit();
by which point Add/Update may have been called twice for the same entity.
Commit is where the real work happens; doing the Add/Update above in parallel won't save you any real time, as that part is quick. That said, 80 seconds to update 1287 records does seem long.
P.S. Add how reports is retrieved. You want something like:
TAFeedBackContext db = new TAFeedBackContext();
var remoteReports = DatafromAnotherPLace; // include how this was retrieved
var localReports = db.Reports.ToList(); // these are tracked (by default)
foreach (var item in remoteReports)
{
    // I assume more than one match is invalid
    var localEntity = localReports.SingleOrDefault(x => x.AssignmentGUID == item.AssignmentGUID);
    if (localEntity == null)
    {
        // add, as it doesn't exist
        db.Reports.Add(new Report() { /* set fields */ });
    }
    else
    {
        if (localEntity.StatusId == statusid) // only update if the status is the passed-in status
        {
            // why are you modifying the remote entity?
            item.ReportsID = localEntity.ReportsID;
            // update the remote entity? I get the impression it's from a different context;
            // if not then cool, but you need to show how reports is retrieved
        }
        else
        {
        }
    }
}
db.SaveChanges();
TAFeedBackContext.SaveChanges();
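As an aside on the commented-out counters: if the loop ever does run in parallel, the ++ operations on shared ints are not atomic, and Interlocked is a lighter fix than a lock. A sketch (illustrative, not from the original answer; it reuses the question's variable names and fixes only the counting, not the DbContext, which is still not thread safe):

```csharp
// Sketch: thread-safe counters via Interlocked instead of ++ under a lock.
int insertedcount = 0, updatedcount = 0, ignoredcount = 0;
Parallel.ForEach(reports, r =>
{
    var match = reportdataindatabase.FirstOrDefault(x => x.AssignmentGUID == r.AssignmentGUID);
    if (match == null)
        Interlocked.Increment(ref insertedcount);   // atomic increment
    else if (match.StatusId == statusid)
        Interlocked.Increment(ref updatedcount);
    else
        Interlocked.Increment(ref ignoredcount);
});
```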

Real time workflow on record creation runs before the record is committed to database

I have this code that runs in a real-time workflow on record creation. It basically determines which entity it's running on, tries to get the "related to" field, which can change from one entity to another, then checks whether the related record is an account; if so, it updates some of the account's attributes.
public override void ExecuteWorkflowLogic(XrmObjects xrmObjects)
{
    using (XrmServiceContext ctx = new XrmServiceContext(xrmObjects.Service))
    {
        var target = xrmObjects.WorkflowContext.InputParameters["Target"] as Entity;
        Xrm.Account account = null;
        var regardingFieldName = string.Empty;
        var logicalName = string.Empty;
        var primaryKeyName = string.Empty;
        switch (target.LogicalName)
        {
            case Xrm.Task.EntityLogicalName:
            case Xrm.Email.EntityLogicalName:
            case Xrm.PhoneCall.EntityLogicalName:
            case Xrm.Letter.EntityLogicalName:
            case Xrm.Appointment.EntityLogicalName:
            case Xrm.Fax.EntityLogicalName:
                regardingFieldName = "regardingobjectid";
                logicalName = Xrm.ActivityPointer.EntityLogicalName;
                primaryKeyName = "activityid";
                break;
            case Xrm.Annotation.EntityLogicalName:
                logicalName = Xrm.Annotation.EntityLogicalName;
                regardingFieldName = "objectid";
                primaryKeyName = "annotationid";
                break;
            case Xrm.Opportunity.EntityLogicalName:
                regardingFieldName = "customerid";
                logicalName = Xrm.Opportunity.EntityLogicalName;
                primaryKeyName = "opportunityid";
                break;
            case Xrm.Post.EntityLogicalName:
                regardingFieldName = "regardingobjectid";
                logicalName = Xrm.Post.EntityLogicalName;
                primaryKeyName = "postid";
                break;
        }
        var activity = (from a in ctx.CreateQuery(logicalName)
                        where a[regardingFieldName] != null && (Guid)a[primaryKeyName] == target.Id
                        select a).FirstOrDefault();
        var ec = (EntityReference)activity[regardingFieldName];
        if (ec == null)
            return;
        //if regarding isn't an account...
        if (ec.LogicalName != Xrm.Account.EntityLogicalName)
            return;
        account = ctx.AccountSet.SingleOrDefault(x => x.Id == ec.Id);
        if (account == null)
            return;
        account.new_last_activity_created_by = (EntityReference)target["createdby"];
        account.new_last_activity_date = (DateTime)target["createdon"];
        account.new_last_activity_type = GetActivityType(target); //returns an optionset based on entity logical name....
        account.new_last_activity_entity_logicalname = target.LogicalName;
        if (!ctx.IsAttached(account))
            ctx.Attach(account);
        ctx.UpdateObject(account);
        ctx.SaveChanges();
    }
}
I want it to run in a real-time workflow because the changes will be instant, and also because the user who created the entity above may not have the rights to update the account; a real-time workflow lets me do the update as another user (the workflow owner) who has those rights.
The problem is that for an activity entity (the first big block in the switch statement), the record looks like it has not been committed to the database, and there's no way for me to fetch it and look into it further (activity ends up null).
If I look for the target's GUID in ActivityPointerBase or TaskBase, EmailBase, etc., I don't see it anywhere during the execution of this workflow (which I assume happens after the record is created).
If I run this in an async workflow, it works for all entities.
In real-time mode it only works for opportunities, posts and annotations. I am creating the activities from the social feed menu in my tests.
The code above runs within a method in a class that extends BaseWorkflowStep:
public abstract class BaseWorkflowStep : CodeActivity
{
    protected override void Execute(CodeActivityContext context)
    {
        var xrmObjects = new XrmObjects();
        var serviceFactory = context.GetExtension<IOrganizationServiceFactory>();
        xrmObjects.CodeActivityContext = context;
        xrmObjects.TracingService = context.GetExtension<ITracingService>();
        xrmObjects.WorkflowContext = context.GetExtension<IWorkflowContext>();
        xrmObjects.Service = serviceFactory.CreateOrganizationService(xrmObjects.WorkflowContext.UserId);
        ExecuteWorkflowLogic(xrmObjects);
    }

    public virtual void ExecuteWorkflowLogic(XrmObjects xrmObjects)
    {
        throw new NotImplementedException();
    }
}
What's happening here?
This is probably too old now, but if you're using a real-time workflow (RTWF), then at the time your workflow runs the record will not have been committed yet. This is partly because, if your RTWF fails, the failure cancels the creation of the primary record; the primary record and your code's results are part of the same DB transaction. That's also why async works: those operations occur in separate DB transactions.
Instead of querying for the data, use the entity images that are passed in the context. For example, since this occurs after the creation of a record, you might use the "PostBusinessEntity" image within the PostEntityImages collection.
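A minimal sketch of reading the post image instead of querying, fitted to the BaseWorkflowStep above; the image key "PostBusinessEntity" follows the answer's wording, but treat the exact key as an assumption and verify it against your workflow registration:

```csharp
// Sketch: read the created record from the workflow context's post image
// rather than querying the database mid-transaction.
var workflowContext = context.GetExtension<IWorkflowContext>();
if (workflowContext.PostEntityImages.Contains("PostBusinessEntity"))
{
    Entity created = workflowContext.PostEntityImages["PostBusinessEntity"];
    // 'created' already carries the attributes of the record being created,
    // so no round trip to the (not yet committed) database is needed.
}
```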

Why Does Await Not Appear to Prevent Second Operation on EF Context

Within an ASP.NET MVC application I'm receiving the following error message for one of my controller methods that uses my Entity Framework context.
A second operation started on this context before a previous asynchronous operation completed. Use 'await' to ensure that any asynchronous operations have completed before calling another method on this context. Any instance members are not guaranteed to be thread safe.
I'm aware that you cannot run queries in parallel, and everything appears to be awaited properly. If I debug the program, step through, and inspect some of the data returned from EF, then it works, probably because this forces the queries to complete.
EDIT: If I place a breakpoint at the null check in the controller method and inspect the data of shipmentDetail, the exception is NOT thrown.
Here's a snippit of the code:
Controller Method:
[Route("{id:int}/Deliveries")]
public async Task<ActionResult> DeliveryInfo(int id)
{
    var shipmentDetail = await db.ShipmentDetails.SingleOrDefaultAsync(s => s.Id == id);
    if (shipmentDetail == null)
        return HttpNotFound(string.Format("No shipment detail found with id {0}", id));
    var model = await DeliveryInfoModel.CreateModel(db, shipmentDetail);
    return View("DeliveryInfo", model);
}
CreateModel Method:
public static async Task<DeliveryInfoModel> CreateModel(Context db, ShipmentDetail shipment)
{
    DeliveryInfoModel model = new DeliveryInfoModel()
    {
        ShipmentInfo = shipment
    };
    //initialize processing dictionary
    Dictionary<int, bool> boxesProcessed = new Dictionary<int, bool>();
    List<DeliveryBoxStatus> statuses = new List<DeliveryBoxStatus>();
    for (int i = 1; i <= shipment.BoxCount; i++)
    {
        boxesProcessed.Add(i, false);
    }
    //work backwards through process
    //check for dispositions from this shipment
    if (shipment.Dispositions.Count > 0)
    {
        foreach (var d in shipment.Dispositions)
        {
            DeliveryBoxStatus status = new DeliveryBoxStatus()
            {
                BoxNumber = d.BoxNumber,
                LastUpdated = d.Date,
                Status = d.Type.GetDescription().ToUpper()
            };
            statuses.Add(status);
            boxesProcessed[d.BoxNumber] = true;
        }
    }
    //return if all boxes have been accounted for
    if (boxesProcessed.Count(kv => kv.Value) == shipment.BoxCount)
    {
        model.BoxStatuses = statuses;
        return model;
    }
    //check for deliveries
    if (shipment.Job_Detail.Count > 0)
    {
        foreach (var j in shipment.Job_Detail.SelectMany(d => d.DeliveryInfos))
        {
            DeliveryBoxStatus status = new DeliveryBoxStatus()
            {
                BoxNumber = j.BoxNumber,
                LastUpdated = j.Job_Detail.To_Client.GetValueOrDefault(),
                Status = "DELIVERED"
            };
            statuses.Add(status);
            boxesProcessed[j.BoxNumber] = true;
        }
    }
    //check for items still in processing & where
    foreach (int boxNum in boxesProcessed.Where(kv => !kv.Value).Select(kv => kv.Key))
    {
        //THIS LINE THROWS THE EXCEPTION
        var processInfo = await db.Processes.Where(p => p.Jobs__.Equals(shipment.Job.Job__, StringComparison.InvariantCultureIgnoreCase) && p.Shipment == shipment.ShipmentNum && p.Box == boxNum)
            .OrderByDescending(p => p.date)
            .FirstOrDefaultAsync();
        //process returned data
        //...
    }
    model.BoxStatuses = statuses;
    return model;
}
I'm not completely sure whether it's the query made in the controller or the queries made in the loop that aren't completing and causing this behavior. Is there something I'm not understanding about when the queries are actually made/returned due to EF's laziness, or about how async/await works in this situation? I have a lot of other methods and controllers that make async EF calls and haven't run into this before.
EDIT
My context is injected into my controller using Ninject as my IoC container. Here's its config inside of NinjectWebCommon's RegisterServices method:
kernel.Bind<Context>().ToSelf().InRequestScope();
Avoid lazy loading when using async with Entity Framework. Instead, either load the data you need first, or use Include()s to ensure the data you need is loaded by the query itself.
https://msdn.microsoft.com/en-gb/magazine/dn802603.aspx
Current State of Async Support
... Async
support was added to Entity Framework (in the EntityFramework NuGet
package) in version 6. You do have to be careful to avoid lazy
loading when working asynchronously, though, because lazy loading is
always performed synchronously. ...
(Emphasis mine)
Also:
https://entityframework.codeplex.com/wikipage?title=Task-based%20Asynchronous%20Pattern%20support%20in%20EF.#ThreadSafety
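Applied to the controller above, eager loading might look like the following. This is a sketch: the navigation property names are taken from the question's code and may not match the real model exactly, and the nested-Include syntax assumes EF6:

```csharp
// Sketch: eagerly load every navigation property CreateModel touches,
// so no synchronous lazy load fires while an async operation is pending.
var shipmentDetail = await db.ShipmentDetails
    .Include(s => s.Dispositions)
    .Include(s => s.Job_Detail.Select(j => j.DeliveryInfos)) // EF6 nested Include
    .SingleOrDefaultAsync(s => s.Id == id);
```

With the graph loaded up front, the Processes query inside the loop remains the only awaited database call, which is safe because it is awaited sequentially.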

.Net mvc EF codefirst how to handle concurrent update requests to database

I've got a table in database:
USERID MONEY
______________
1 500
The money value could be changed only by logged in user that owns account. I've got a function like:
bool buy(int moneyToSpend)
{
    var moneyRow = db.UserMoney.Find(loggedinUserID);
    if (moneyRow.MONEY < moneyToSpend)
        return false;
    //code for placing order
    moneyRow.MONEY -= moneyToSpend;
    return true;
}
I know that MVC sessions are always synchronous, so there will never be two simultaneous calls to this function in one user session. But what if the user logs in to the site twice from different browsers? Will it still be a single-threaded session, or can I get two concurrent requests to this function?
And if there can be concurrency, how should I handle it with EF? Normally in ADO I would use MSSQL's "BEGIN WORK" for this type of situation, but I have no idea how to do it with EF.
Thank you for your time!
I would suggest you use a RowVersion to handle concurrent requests.
Good reference here: http://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/handling-concurrency-with-the-entity-framework-in-an-asp-net-mvc-application
// in UserMoney.cs
[Timestamp]
public byte[] RowVersion { get; set; }

// in the model builder
modelBuilder.Entity<UserMoney>().Property(p => p.RowVersion).IsConcurrencyToken();

// The update logic
public bool Buy(int moneyToSpend, byte[] rowVersion)
{
    try
    {
        var moneyRow = db.UserMoney.Find(loggedinUserID);
        if (moneyRow.MONEY < moneyToSpend)
        {
            return false;
        }
        //code for placing order
        moneyRow.MONEY -= moneyToSpend;
        return true;
    }
    catch (DbUpdateConcurrencyException ex)
    {
        var entry = ex.Entries.Single();
        var submittedUserMoney = (UserMoney)entry.Entity;
        var databaseValue = entry.GetDatabaseValues();
        if (databaseValue == null)
        {
            // this entry no longer exists in the db
        }
        else
        {
            // this entry exists and has a newer version in the db
            var userMoneyInDb = (UserMoney)databaseValue.ToObject();
        }
    }
    catch (RetryLimitExceededException)
    {
        // probably put some logs here
    }
    return false;
}
I do not think it would be a major problem for you, since as far as I know MSSQL will not allow asynchronous commits to the same user's data from the same thread; it has to finish one process before moving to the next. But you can try something like this:
using (var db = new YourContext())
{
    var moneyRow = db.UserMoney.Find(loggedinUserID);
    moneyRow.MONEY -= moneyToSpend;
    bool saveFailed;
    do
    {
        saveFailed = false;
        try
        {
            db.SaveChanges();
        }
        catch (DbUpdateConcurrencyException ex)
        {
            saveFailed = true;
            // Update original values from the database
            var entry = ex.Entries.Single();
            entry.OriginalValues.SetValues(entry.GetDatabaseValues());
        }
    } while (saveFailed);
}
More can be found here Optimistic Concurrency Patterns
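As for the ADO-style "BEGIN WORK" from the question: EF6 also exposes explicit transactions if you want the read and the debit in one unit. A sketch (the YourContext name follows the snippet above; note that a transaction alone does not prevent lost updates between two concurrent requests, so combine it with the RowVersion approach):

```csharp
// Sketch: wrap the read-check-debit sequence in an explicit EF6 transaction.
using (var db = new YourContext())
using (var tx = db.Database.BeginTransaction())
{
    var moneyRow = db.UserMoney.Find(loggedinUserID);
    if (moneyRow.MONEY < moneyToSpend)
    {
        tx.Rollback(); // not enough funds; abandon the unit of work
        return false;
    }
    moneyRow.MONEY -= moneyToSpend;
    db.SaveChanges();
    tx.Commit(); // both the order placement and the debit commit together
    return true;
}
```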
