Inconsistent context state - C#

I am currently running an app with a very long LINQ transaction. Everything has to happen in this one transaction, and I'm not sure if or where objects are interfering with each other.
When I try to save the changes, this exception comes up:
System.InvalidOperationException: The changes to the database were
committed successfully, but an error occurred while updating the
object context. The ObjectContext might be in an inconsistent state.
Inner exception message: AcceptChanges cannot continue because the
object's key values conflict with another object in the
ObjectStateManager. Make sure that the key values are unique before
calling AcceptChanges.
From what I've read while googling around, people have found a lot of one-off solutions (rarely having anything to do with conflicting keys), and they don't usually post what the tip-offs were (better than nothing, of course).
What I'm not clear on is how to hunt down the cause of this problem.
I update a lot of records in different places and let them go out of scope. I'm guessing the context keeps track of these objects without letting them go through the GC, so it can commit everything at the end; all the changes do seem to end up in the database afterwards.
Example:
// create new A, SA for new incoming tasks
SF_SUB_AREA sa = null;
SF_AREA a = null;
if (isNewSA) // new SA
{
    areaID = MakeSalesForceGUID();
    a = new SF_AREA
    {
        ID = areaID,
        DESCRIPTION = t.DESCRIPTION,
        CU_NUMBER = Convert.ToString(t.CU_NUMBER),
        FC = t.FC,
        PROJECT = cp.ID,
        DELETE_FLAG = "I"
    };
    ctx.SF_AREA.AddObject(a);
    SAID = MakeSalesForceGUID();
    sa = new SF_SUB_AREA
    {
        ID = SAID,
        PROJECT_REGION = t.CR,
        AREA = areaID,
        DELETE_FLAG = "I"
    };
    ctx.SF_SUB_AREA.AddObject(sa);
}
else // old SA
{
    List<SF_AREA> lia = (from a2 in ctx.SF_AREA
                         join sa2 in ctx.SF_SUB_AREA on a2.ID equals sa2.AREA
                         where sa2.ID == t.SUB_AREA
                         select a2).ToList();
    if ((lia != null) && (lia.Count > 0))
    {
        a = lia[0];
        a.DELETE_FLAG = "U";
        a.CU_NUMBER = Convert.ToString(t.CU_NUMBER);
        a.DESCRIPTION = t.DESCRIPTION;
        a.FC = t.FC;
        a.PROJECT = cp.ID;
    } // TODO: throw an error here for else block
    List<SF_SUB_AREA> lisa = (from sa2 in ctx.SF_SUB_AREA
                              where sa2.ID == t.SUB_AREA
                              select sa2).ToList();
    if ((lisa != null) && (lisa.Count > 0))
    {
        sa = lisa[0];
        sa.PROJECT_REGION = t.AREA;
        sa.AREA = lisa[0].AREA;
        sa.DELETE_FLAG = "U";
    }
}
...
ctx.SaveChanges(); // left out the try/catch
Currently I'm just creating a new context every time I commit something, but I don't know if this is advisable.
foreach (SF_MOVE_ORDER mo in liMO)
{
    // new context for every MO since it goes into an unknown state after every commit
    using (SFEntitiesRM ctx2 = new SFEntitiesRM())
    {
        List<SF_CLIENT_PROJECT> liCP = (from cp in ctx2.SF_CLIENT_PROJECT
                                        where cp.ID == mo.CLIENT_PROJECT
                                        select cp).ToList();
        if ((liCP != null) && (liCP.Count > 0))
        {
            PerformMoveOrder(mo, liCP[0], ctx2);
        }
    }
}

Usually with errors like this, it is best to start by saving one object and adding the complexities back in one at a time. That way you can narrow down where the problem lies; it can end up being somewhere down the object graph that you weren't even expecting. I would not just keep throwing the whole LINQ update at the context. Break it up into smaller saves, rebuild it to the largest graph, and you'll find your error.
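For example, a rough sketch of that bisection approach (ProcessAreas, ProcessSubAreas, and ProcessMoveOrders are hypothetical stand-ins for the chunks of work currently batched into the single SaveChanges):

// Run each chunk of work in its own context and save immediately,
// so the first failing save identifies the offending group.
RunStep("areas",       c => ProcessAreas(c));
RunStep("sub-areas",   c => ProcessSubAreas(c));
RunStep("move orders", c => ProcessMoveOrders(c));

static void RunStep(string name, Action<SFEntitiesRM> work)
{
    using (var ctx = new SFEntitiesRM())
    {
        work(ctx);
        try
        {
            ctx.SaveChanges(); // save per chunk instead of once at the end
        }
        catch (InvalidOperationException ex)
        {
            // The first chunk that throws here is where the key conflict lives.
            Console.WriteLine("Step '" + name + "' failed: " + ex.Message);
            throw;
        }
    }
}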

Related

Insert into multiple tables using WCF Transactions

I am trying to add an entry into a table and use the primary key of that added entry to create an additional entry into another table.
The error I am getting is
The transaction manager has disabled its support for remote/network
transactions. (Exception from HRESULT: 0x8004D024)
I believe this is caused by creating multiple connections within a single TransactionScope, but I am doing everything within one context / using statement, so I do not believe that I should be receiving this error.
Service
[OperationBehavior(TransactionScopeRequired = true)]
public void CreateGroup(NewGroupData data)
{
    var groupRepo = _GroupRepo ?? new InvestigatorGroupRepository();
    groupRepo.CreateGroup(data.UserId, data.InvestigatorGroupName, data.HasGameAssignment, data.InstitutionId);
}
Repository
public void CreateGroup(string userId, string investigatorGroupName, bool hasGameAssignment, int institutionId)
{
    using (var context = new GameDbContext())
    {
        var newGroup = new InvestigatorGroup()
        {
            InvestigatorGroupName = investigatorGroupName,
            HasGameAssignment = hasGameAssignment,
            InstitutionId = institutionId,
            IsTrashed = false
        };
        int institutionUserId =
            context.InstitutionUsers.Where(
                iu => !iu.IsTrashed && iu.APUser.UserId == userId && iu.InstitutionId == institutionId)
                .Select(iu => iu.InstitutionUserId).Single();
        var newGroupUser = new InvestigatorGroupUser()
        {
            InstitutionUserId = institutionUserId,
            InvestigatorGroup = newGroup,
            CreationDate = DateTime.Now
        };
        context.InvestigatorGroupUsers.Add(newGroupUser);
        context.SaveChanges();
    }
}
You start with a wrong assumption.
The line...
int newGroupId = context.InvestigatorGroups.Add(newGroup).InvestigatorGroupId;
...will always assign 0 to newGroupId. The Add method only marks the entity for insert, but doesn't actually insert it. Only SaveChanges writes data to the database, not any other method in Entity Framework.
So the assignment...
InvestigatorGroupId = newGroupId,
...is faulty as well. You have to assign the new InvestigatorGroup to a navigation property in InvestigatorGroupUser:
InvestigatorGroup = newGroup,
Add this navigation property to InvestigatorGroupUser if you haven't got it yet.
If you have that, it's enough to execute these lines:
context.InvestigatorGroupUsers.Add(newGroupUser);
context.SaveChanges();
No need to Add the newGroup object too; it will be added by adding newGroupUser.
If you do that, the only transaction you need is the one that SaveChanges uses internally by default. For the code you show, you don't need a TransactionScope. If this is part of a larger WCF transaction the story may be different, but at least some misconceptions needed straightening out first.
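To illustrate the point about navigation properties, here is a minimal sketch using the question's entity names (property values are made up):

using (var context = new GameDbContext())
{
    var newGroup = new InvestigatorGroup { InvestigatorGroupName = "Demo group" };
    var newGroupUser = new InvestigatorGroupUser
    {
        InvestigatorGroup = newGroup, // navigation property instead of a hand-copied key
        CreationDate = DateTime.Now
    };

    // Adding the dependent entity also marks the reachable newGroup for insert.
    context.InvestigatorGroupUsers.Add(newGroupUser);

    // Both rows are inserted here, inside SaveChanges' own internal transaction;
    // EF then fixes up the store-generated key on both entities.
    context.SaveChanges();

    int generatedId = newGroup.InvestigatorGroupId; // populated only after SaveChanges
}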

Linq to SQL fetching cached results

When I select objects using LINQ, I seem to get cached results at first.
I use the following code to fetch an applicant from the DB on GET requests; assume this record is in STATE_1:
using (AdmissionsAppEntities db = new AdmissionsAppEntities())
{
    // Fetch the user
    ApplicationData applicant = (from a in db.ApplicationDatas
                                 where a.userGUID == userGUID
                                 select a).SingleOrDefault();
}
and the following code to save changes against this record on POST requests; after SaveChanges is called, this record should be in STATE_2:
using (AdmissionsAppEntities db = new AdmissionsAppEntities())
{
    // Fetch the user
    var applicant = (from a in db.ApplicationDatas
                     where a.userGUID == userGUID
                     select a).SingleOrDefault();
    if (applicant != null)
    {
        // Save page 1 data
        ...
        applicant.lastUpdate = DateTime.Now;
        db.Entry(applicant).State = EntityState.Modified;
        DataBag.result = db.Entry(applicant).GetValidationResult();
        if (DataBag.result.ValidationErrors.Count == 0)
        {
            db.SaveChanges();
        }
    }
}
if (DataBag.result.ValidationErrors.Count == 0)
{
    return RedirectToAction("PageTwo");
}
The database properly saves STATE_2 (I can see it in the DB if I use a SQL inspection tool), but on subsequent page loads, STATE_1 is retrieved.
I see tons of results where people are having this issue, but no ideas on how to fix it.
UPDATE 1
I moved the RedirectToAction calls (all my returns) outside the using block to make sure each DbContext gets disposed. That did not solve the problem.
People are actually having this issue, so I'm going to leave this up. My issue ended up being Orchard CMS caching my module's pages.
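For anyone landing here with similar symptoms but no outer cache to blame: since every request above already uses a fresh context, EF's first-level cache was an unlikely culprit here, but a no-tracking read is a quick way to rule it out. A sketch, assuming EF 4.1+ and the entities above:

using (AdmissionsAppEntities db = new AdmissionsAppEntities())
{
    // AsNoTracking bypasses the context's cache of tracked entities,
    // so the row is materialized fresh from the database every time.
    var applicant = db.ApplicationDatas
                      .AsNoTracking()
                      .SingleOrDefault(a => a.userGUID == userGUID);
}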

Why does this EF upsert logic result in deletions instead of updates?

The following code results in deletions instead of updates.
My question is: is this a bug in the way I'm coding against Entity Framework or should I suspect something else?
Update: I got this working, but I'm leaving the question up with both the original and the working versions, in hopes that I can learn something I didn't understand about EF.
In the original, non-working code, all the additions of SearchDailySummary objects succeed when the database is fresh; but the second time through, when the code was supposedly going to perform the updates, the net result is a once-again empty table. This logic manages to be the equivalent of removing each entity.
//Logger.Info("Upserting SearchDailySummaries..");
using (var db = new ClientPortalContext())
{
    foreach (var item in items)
    {
        var campaignName = item["campaign"];
        var pk1 = db.SearchCampaigns.Single(c => c.SearchCampaignName == campaignName).SearchCampaignId;
        var pk2 = DateTime.Parse(item["day"].Replace('-', '/'));
        var source = new SearchDailySummary
        {
            SearchCampaignId = pk1,
            Date = pk2,
            Revenue = decimal.Parse(item["totalConvValue"]),
            Cost = decimal.Parse(item["cost"]),
            Orders = int.Parse(item["conv1PerClick"]),
            Clicks = int.Parse(item["clicks"]),
            Impressions = int.Parse(item["impressions"]),
            CurrencyId = item["currency"] == "USD" ? 1 : -1 // NOTE: non USD (if exists) -1 for now
        };
        var target = db.Set<SearchDailySummary>().Find(pk1, pk2) ?? new SearchDailySummary();
        if (db.Entry(target).State == EntityState.Detached)
        {
            db.SearchDailySummaries.Add(target);
            addedCount++;
        }
        else
        {
            // TODO?: compare source and target and change the entity state to unchanged if no diff
            updatedCount++;
        }
        AutoMapper.Mapper.Map(source, target);
        itemCount++;
    }
    Logger.Info("Saving {0} SearchDailySummaries ({1} updates, {2} additions)", itemCount, updatedCount, addedCount);
    db.SaveChanges();
}
Here is the working version. I'm not 100% sure it's optimized, but it works reliably and performs fine as long as I batch it in groups of 500 or fewer items in a shot (after that it slows down exponentially, but I think that may be a different question/subject)...
//Logger.Info("Upserting SearchDailySummaries..");
using (var db = new ClientPortalContext())
{
    foreach (var item in items)
    {
        var campaignName = item["campaign"];
        var pk1 = db.SearchCampaigns.Single(c => c.SearchCampaignName == campaignName).SearchCampaignId;
        var pk2 = DateTime.Parse(item["day"].Replace('-', '/'));
        var source = new SearchDailySummary
        {
            SearchCampaignId = pk1,
            Date = pk2,
            Revenue = decimal.Parse(item["totalConvValue"]),
            Cost = decimal.Parse(item["cost"]),
            Orders = int.Parse(item["conv1PerClick"]),
            Clicks = int.Parse(item["clicks"]),
            Impressions = int.Parse(item["impressions"]),
            CurrencyId = item["currency"] == "USD" ? 1 : -1 // NOTE: non USD (if exists) -1 for now
        };
        var target = db.Set<SearchDailySummary>().Find(pk1, pk2);
        if (target == null)
        {
            db.SearchDailySummaries.Add(source);
            addedCount++;
        }
        else
        {
            AutoMapper.Mapper.Map(source, target);
            db.Entry(target).State = EntityState.Modified;
            updatedCount++;
        }
        itemCount++;
    }
    Logger.Info("Saving {0} SearchDailySummaries ({1} updates, {2} additions)", itemCount, updatedCount, addedCount);
    db.SaveChanges();
}
The thing that keeps popping up in my mind is that maybe the Entry(entity) or Find(pk) method has some side effects. I should probably be consulting the documentation, but any advice is appreciated.
It's a slight assumption on my part (without looking into your models/entities), but have a look at what's going on within this block (see if the objects being attached here are related to the deletions):
if (db.Entry(target).State == EntityState.Detached)
{
    db.SearchDailySummaries.Add(target);
    addedCount++;
}
Your detached object won't be able to use its navigation properties to locate its related objects; you'll be re-attaching an object in a potentially conflicting state (without the correct relationships).
You haven't mentioned what is being deleted above, so I may be way off. I'm just heading out, so this is a little rushed; I hope there's something useful in there.
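One way to check is to dump what the context intends to do just before saving. A sketch, using the DbContext ChangeTracker API your code already relies on:

// Log pending adds/deletes before SaveChanges to spot unexpected deletions.
foreach (var entry in db.ChangeTracker.Entries<SearchDailySummary>())
{
    if (entry.State == EntityState.Added || entry.State == EntityState.Deleted)
    {
        Console.WriteLine("{0}: campaign {1}, date {2}",
            entry.State, entry.Entity.SearchCampaignId, entry.Entity.Date);
    }
}
db.SaveChanges();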

Bulk inserting and updating with Entity Framework (Probably a better alternative?)

I have a data set of devices, addresses, and companies that I need to import into our database, with the catch that the database may already include a specific device/address/company that is in the new data set. If that is the case, I need to update that entry with the new information in the data set, excluding addresses: for those we check whether an exact copy already exists, and otherwise make a new entry.
My issue is that it is very slow to grab a device/company in EF and, if it exists, update it, otherwise insert it. To fix this I tried fetching all the companies, devices, and addresses into respective hash maps and checking whether each identifier in the new data already exists, but that hasn't led to any performance increase. I've included my code below. Typically I would do a batch insert; I'm just not sure what to do for a batch update. Can someone advise a different route?
var context = ObjectContextHelper.CurrentObjectContext;
var oldDevices = context.Devices;
var companies = context.Companies;
var addresses = context.Addresses;
Dictionary<string, Company> companyMap = new Dictionary<string, Company>(StringComparer.OrdinalIgnoreCase);
Dictionary<string, Device> deviceMap = new Dictionary<string, Device>(StringComparer.OrdinalIgnoreCase);
Dictionary<string, Address> addressMap = new Dictionary<string, Address>(StringComparer.OrdinalIgnoreCase);
foreach (Company c in companies)
{
    if (c.CompanyAccountID != null && !companyMap.ContainsKey(c.CompanyAccountID))
        companyMap.Add(c.CompanyAccountID, c);
}
foreach (Device d in oldDevices)
{
    if (d.SerialNumber != null && !deviceMap.ContainsKey(d.SerialNumber))
        deviceMap.Add(d.SerialNumber, d);
}
foreach (Address a in addresses)
{
    string identifier = GetAddressIdentifier(a);
    if (!addressMap.ContainsKey(identifier))
        addressMap.Add(identifier, a);
}
foreach (DeviceData.TabsDevice device in devices)
{
    Company tempCompany;
    Address tempAddress;
    Device currentDevice;
    if (deviceMap.ContainsKey(device.SerialNumber)) // update a device
        deviceMap.TryGetValue(device.SerialNumber, out currentDevice);
    else // insert a new device
        currentDevice = new Device();
    currentDevice.SerialNumber = device.SerialNumber;
    currentDevice.SerialNumberTABS = device.SerialNumberTabs;
    currentDevice.Model = device.Model;
    if (device.CustomerAccountID != null && device.CustomerAccountID != "")
    {
        companyMap.TryGetValue(device.CustomerAccountID, out tempCompany);
        currentDevice.CustomerID = tempCompany.CompanyID;
        currentDevice.CustomerName = tempCompany.CompanyName;
    }
    if (companyMap.TryGetValue(device.ServicingDealerAccountID, out tempCompany))
        currentDevice.CompanyID = tempCompany.CompanyID;
    currentDevice.StatusID = 1;
    currentDevice.Retries = 0;
    currentDevice.ControllerFamilyID = 1;
    if (currentDevice.EWBFrontPanelMsgOption == null) // set the Panel option to the default if it isn't set already
        currentDevice.EWBFrontPanelMsgOption = context.EWBFrontPanelMsgOptions.Where(i => i.OptionDescription.Contains("default")).Single();
    // link the device to the existing address as long as it is actually an address
    if (addressMap.TryGetValue(GetAddressIdentifier(device.address), out tempAddress))
    {
        if (GetAddressIdentifier(device.address) != "")
            currentDevice.Address = tempAddress;
        else
            currentDevice.Address = null;
    }
    else // insert a new Address and link the device to it (if not null)
    {
        if (GetAddressIdentifier(device.address) == "")
            currentDevice.Address = null;
        else
        {
            tempAddress = new Address();
            tempAddress.Address1 = device.address.Address1;
            tempAddress.Address2 = device.address.Address2;
            tempAddress.Address3 = device.address.Address3;
            tempAddress.Address4 = device.address.Address4;
            tempAddress.City = device.address.City;
            tempAddress.Country = device.address.Country;
            tempAddress.PostalCode = device.address.PostalCode;
            tempAddress.State = device.address.State;
            addresses.AddObject(tempAddress);
            addressMap.Add(GetAddressIdentifier(tempAddress), tempAddress);
            currentDevice.Address = tempAddress;
        }
    }
    if (!deviceMap.ContainsKey(device.SerialNumber)) // if inserting, add to context
    {
        oldDevices.AddObject(currentDevice);
        deviceMap.Add(device.SerialNumber, currentDevice);
    }
}
context.SaveChanges();
Although it doesn't cover your exact problem, this thread helped me improve the performance of my database import immensely, and I recommend you read it.
If you have a lot of hashing, using the Task Parallel Library could help. We also use hash maps to map IDs, and it helped a lot. But I recommend you lock around SaveChanges() so you don't run into concurrency issues. (Because of that lock, the TPL only helps when you hash and convert a lot; in our case it helped quite a bit, because we had a fair amount of parsing to do.)
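A rough sketch of that split (rawRecords and ParseDevice are hypothetical; the point is to parallelize only the CPU-bound parsing and keep all context access on a single thread, since the ObjectContext is not thread-safe):

// Parse/hash in parallel into a thread-safe collection...
var parsed = new ConcurrentBag<Device>();
Parallel.ForEach(rawRecords, record =>
{
    parsed.Add(ParseDevice(record)); // CPU-bound work only; no context access here
});

// ...then do all Entity Framework work sequentially.
foreach (var device in parsed)
{
    context.Devices.AddObject(device);
}
context.SaveChanges();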

bulk insert and update with ADO.NET Entity Framework

I am writing a small application that does a lot of feed processing. I want to use LINQ/EF for this, as speed is not an issue; it is a single-user app and, in the end, will only be used once a month.
My question revolves around the best way to do bulk inserts using LINQ/EF.
After parsing the incoming data stream I end up with a List of values. Since the end user may try to import some duplicate data, I would like to "clean" the data during the insert rather than reading all the records, doing a for loop, rejecting records, and then finally importing the remainder.
This is what I am currently doing:
DateTime minDate = dataTransferObject.Min(c => c.DoorOpen);
DateTime maxDate = dataTransferObject.Max(c => c.DoorOpen);
using (LabUseEntities myEntities = new LabUseEntities())
{
    var recCheck = myEntities.ImportDoorAccess.Where(a => a.DoorOpen >= minDate && a.DoorOpen <= maxDate).ToList();
    if (recCheck.Count > 0)
    {
        foreach (ImportDoorAccess ida in recCheck)
        {
            // FirstOrDefault rather than First, since the null check below implies a match may not exist
            DoorAudit da = dataTransferObject.Where(a => a.DoorOpen == ida.DoorOpen && a.CardNumber == ida.CardNumber).FirstOrDefault();
            if (da != null)
                da.DoInsert = false;
        }
    }
    ImportDoorAccess newIDA;
    foreach (DoorAudit newDoorAudit in dataTransferObject)
    {
        if (newDoorAudit.DoInsert)
        {
            newIDA = new ImportDoorAccess
            {
                CardNumber = newDoorAudit.CardNumber,
                Door = newDoorAudit.Door,
                DoorOpen = newDoorAudit.DoorOpen,
                Imported = newDoorAudit.Imported,
                RawData = newDoorAudit.RawData,
                UserName = newDoorAudit.UserName
            };
            myEntities.AddToImportDoorAccess(newIDA);
        }
    }
    myEntities.SaveChanges();
}
I am also getting this error:
System.Data.UpdateException was unhandled
Message="Unable to update the EntitySet 'ImportDoorAccess' because it has a DefiningQuery and no element exists in the element to support the current operation."
Source="System.Data.SqlServerCe.Entity"
What am I doing wrong?
Any pointers are welcome.
You can do multiple inserts this way.
I've seen the exception you're getting in cases where the model (EDMX) is not set up correctly. You either don't have a primary key (EntityKey in EF terms) on that table, or the designer has tried to guess what the EntityKey should be. In the latter case, you'll see two or more properties in the EDM Designer with keys next to them.
Make sure the ImportDoorAccess table has a single primary key and refresh the model.
