EF Change Tracker: Collection was modified; enumeration operation may not execute - c#

I'm getting an error when I loop through a collection. I'm not changing the list at any point, but it still throws: "Collection was modified; enumeration operation may not execute." The following code shows the method that contains the Parallel.ForEach.
public void DownloadImages()
{
    IList<Vehicle> vehicles = _repository.Retrieve().ToList();
    Parallel.ForEach(vehicles, vehicle =>
    {
        IList<VehiclePhoto> vehiclePhotos =
            vehicle.VehiclePhotos.Where(x => x.ImageStatus == ImageStatus.Inactive).ToList();
        if (vehiclePhotos.Count > 0)
        {
            Parallel.ForEach(vehiclePhotos, photo =>
            {
                using (var client = new WebClient())
                {
                    try
                    {
                        string fileName = Guid.NewGuid() + ".png";
                        string filePath = _imagePath + "\\" + fileName;
                        client.DownloadFile(photo.ImageUrl, filePath);
                        photo.FileName = fileName;
                        photo.ImageStatus = ImageStatus.Active;
                    }
                    catch (Exception)
                    {
                    }
                }
            });
            _repository.Save(vehicle);
        }
    });
}
This happens when _repository.Save(vehicle) is called. The following code shows the SaveChanges method; base.SaveChanges() is where the error is raised.
public override int SaveChanges()
{
    DateTime nowAuditDate = DateTime.Now;
    IEnumerable<System.Data.Entity.Infrastructure.DbEntityEntry<DomainEntity>> changeSet = ChangeTracker.Entries<DomainEntity>();
    if (changeSet != null)
    {
        foreach (System.Data.Entity.Infrastructure.DbEntityEntry<DomainEntity> entry in changeSet)
        {
            switch (entry.State)
            {
                case EntityState.Added:
                    entry.Entity.Created = nowAuditDate;
                    entry.Entity.Modified = nowAuditDate;
                    break;
                case EntityState.Modified:
                    entry.Entity.Modified = nowAuditDate;
                    break;
            }
        }
    }
    return base.SaveChanges();
}
Any ideas on this?
EDITED:
While trying to fix the error above, I changed a few lines in the DownloadImages method. Those changes are as follows:
Instead of
IList<Vehicle> vehicles = _repository.Retrieve().ToList();
I used var
var vehicles = _repository.Retrieve().AsParallel();
Instead of
IList<VehiclePhoto> vehiclePhotos =
vehicle.VehiclePhotos.Where(x => x.ImageStatus == ImageStatus.Inactive).ToList();
I used var
var vehiclePhotos =
vehicle.VehiclePhotos.Where(x => x.ImageStatus == ImageStatus.Inactive).AsParallel();
When I ran the code again, it gave me a different error:
The outer exception is
System.Data.Entity.Infrastructure.DbUpdateException
but the InnerException is
System.Data.Entity.Core.UpdateException
ExecuteNonQuery requires the command to have a transaction when the connection assigned to the command is in a pending local transaction. The Transaction property of the command has not been initialized.

This happens because you are iterating over vehicles while the code inside the loop modifies the elements of that enumeration, and likewise you are looping through vehiclePhotos and changing its elements.
I usually solve this by using an explicit for loop, i.e.:
for (var i = 0; i < vehicles.Count; i++)
{
    var vehicle = vehicles.ElementAt(i);
    // process vehicle
}
However, if you need to use Parallel.ForEach, you could try creating an array of integers with the length of vehicles.Count, iterating over that, and using each value as an index into vehicles.
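A minimal sketch of that index-based idea, assuming vehicles is still the materialized IList<Vehicle> from the original method:

// Sketch only: iterate over an index array so no thread enumerates 'vehicles' itself.
int[] indices = Enumerable.Range(0, vehicles.Count).ToArray();

Parallel.ForEach(indices, i =>
{
    var vehicle = vehicles[i]; // indexed read; the list itself is never enumerated here
    // download the photos and save the vehicle as in the original loop
});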

Related

Getting Value cannot be null. parameter name: source on release but not in debug

I'm creating a Revit plugin that reads and writes model information to a database. It all works fine in debug mode, but when I build the project for release and run Revit with the plugin outside Visual Studio, I get an error when the plugin tries to read data from the database.
The code runs on the DocumentOpened event and looks like this:
public void application_DocumentOpenedEvent(object sender, DocumentOpenedEventArgs e)
{
    UIApplication uiapp = new UIApplication(sender as Autodesk.Revit.ApplicationServices.Application);
    Document doc = uiapp.ActiveUIDocument.Document;
    //ModelGUID COMMAND
    var command = new ModelCheckerCommandExec();
    command.Execute(uiapp);
}
It then fails on the following line:
ModelsList = (DatabaseHelper.ReadNonAsync<RevitModel>())
.Where(m => m.ModelGUID == DataStores.ModelData.ModelGUID).ToList();
In this code block that gets executed:
public class ModelCheckerCommandExec : IExternalCommand
{
    public Result Execute(ExternalCommandData commandData, ref string message, ElementSet elements)
    {
        return Execute(commandData.Application);
    }

    public Result Execute(UIApplication uiapp)
    {
        Document doc = uiapp.ActiveUIDocument.Document;
        Transaction trans = new Transaction(doc);
        try
        {
            trans.Start("ModelGUID");
            ModelGUIDCommand.GetAndSetGUID(doc);
            trans.Commit();
            var ModelsList = new List<RevitModel>();
            ModelsList = (DatabaseHelper.ReadNonAsync<RevitModel>()).ToList();//.Where(m => m.ModelGUID == DataStores.ModelData.ModelGUID).ToList(); // Read method only finds models that are similar to the DataStore.ModelDate.DBId;
            if (ModelsList.Count == 1)
            {
                trans.Start("DataFromDB");
                doc.ProjectInformation.Name = ModelsList[0].ProjectName;
                doc.ProjectInformation.Number = ModelsList[0].ModelNumber;
                doc.ProjectInformation.Status = ModelsList[0].ModelStatus;
                doc.ProjectInformation.IssueDate = ModelsList[0].ProjectIssueDate;
                doc.ProjectInformation.ClientName = ModelsList[0].ClientName;
                doc.ProjectInformation.Address = ModelsList[0].ProjectAddress;
                doc.ProjectInformation.LookupParameter("Cadastral Data").Set(ModelsList[0].ProjectIssueDate);
                doc.ProjectInformation.LookupParameter("Current Version").Set(ModelsList[0].CurrentVersion);
                doc.ProjectInformation.BuildingName = ModelsList[0].BuildingName;
                DataStores.ModelData.ModelManager1 = ModelsList[0].ModelManagerOne;
                DataStores.ModelData.ModelManager1Id = ModelsList[0].ModelManagerOneId;
                DataStores.ModelData.ModelManager2 = ModelsList[0].ModelManagerTwo;
                DataStores.ModelData.ModelManager2Id = ModelsList[0].ModelManagerTwoId;
                trans.Commit();
            }
            return Result.Succeeded;
        }
        catch (Exception ex)
        {
            TaskDialog.Show("Error", ex.Message);
            return Result.Failed;
        }
    }
}
The "ReadNonAsync" method is as follows:
public static List<T> ReadNonAsync<T>() where T : IHasId
{
    using (var client = new HttpClient())
    {
        var result = client.GetAsync($"{dbPath}{Properties.Settings.Default.CompanyName}_{typeof(T).Name.ToLower()}.json?access_token={DataStores.IdToken.UserIdToken}").GetAwaiter().GetResult();
        var jsonResult = result.Content.ReadAsStringAsync().GetAwaiter().GetResult();
        if (result.IsSuccessStatusCode)
        {
            var objects = JsonConvert.DeserializeObject<Dictionary<string, T>>(jsonResult);
            List<T> list = new List<T>();
            if (objects != null)
            {
                foreach (var o in objects)
                {
                    o.Value.Id = o.Key;
                    list.Add(o.Value);
                }
            }
            return list;
        }
        else
        {
            return null;
        }
    }
}
In the rest of my code I use an async Read method which works, so I'm wondering whether or not that's the issue, but Revit won't let me use an async method inside an Execute method.
How do I debug this issue correctly, and why could there be code working in debug that doesn't work in "live" versions?
I found a solution!
The issue:
The reason for the error was that when I run the software in debug mode, a relative file path like "xxx.txt" resolves to the solution folder, but in the released build "xxx.txt" resolves to the folder of the host application rather than the folder of the .dll.
So in my case it pointed to "c:\Program Files\Autodesk\Revit\2021".
The fix:
Hard-code the path, or get the path of the executing .dll:
Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location)
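A minimal sketch of how that can be combined with the relative file name (here "xxx.txt" is just the placeholder file name from above):

using System.IO;
using System.Reflection;

// Folder that contains the plugin .dll, not the Revit executable.
string pluginFolder = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);

// Resolve the data file next to the plugin assembly.
string dataFile = Path.Combine(pluginFolder, "xxx.txt");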
Debugging/Troubleshooting:
I found the issue by inserting dialog boxes with error messages in all my try/catch statements.
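For reference, a sketch of the kind of catch block used for that, reusing the TaskDialog call already shown in the command; ex.ToString() is used instead of ex.Message so inner exceptions and the stack trace are visible too:

catch (Exception ex)
{
    // Show the full exception, including inner exceptions and stack trace,
    // so failures outside the debugger are still visible.
    TaskDialog.Show("Error", ex.ToString());
    return Result.Failed;
}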

How to efficiently edit data in a database?

The method is supposed to receive data from a server, check if new tokens have been added, and if there are, add them to the database. If the token already exists, update its status but don't add a new row in the table. This is the code I've written so far.
IEnumerable<Token> serverTokens = JsonConvert.DeserializeObject<IEnumerable<Token>>
    (server.GetTokens().Content);
IEnumerable<Token> dbTokens = _tokenService.GetAllTokens();
foreach (var token in serverTokens)
{
    var dbToken = dbTokens.Where(x => x.Symbol == token.Symbol).FirstOrDefault();
    if (dbToken != null)
    {
        Token editedToken = dbToken;
        editedToken.UpdatedOn = DateTime.Now;
        editedToken.Active = token.Active;
        _tokenService.AddToken(editedToken);
    }
    else
    {
        token.UpdatedOn = DateTime.Now;
        _tokenService.AddToken(token);
    }
}
dbContext.SaveChanges();
The AddToken method is just a simple AddOrUpdate operation.
public void AddToken(Token token)
{
    _dbContext.Tokens.AddOrUpdate(token);
    //_dbContext.SaveChanges();
}
Now, this code does what it's supposed to; however, it's extremely slow. How would I go about optimizing it?
dbTokens.Where(x => x.Symbol == token.Symbol) runs against an IEnumerable, so the underlying query is re-evaluated (and the tokens reloaded) on every iteration of the loop.
Store them in a list before the loop:
List<Token> dbTokens = _tokenService.GetAllTokens().ToList();
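A sketch of the loop with the tokens materialized up front; as a further assumption, they are also indexed in a Dictionary keyed by Symbol so each lookup is a hash lookup rather than a linear scan (this presumes Symbol is unique among the stored tokens):

// Materialize once, then index by Symbol.
List<Token> dbTokens = _tokenService.GetAllTokens().ToList();
Dictionary<string, Token> dbTokensBySymbol = dbTokens.ToDictionary(t => t.Symbol);

foreach (var token in serverTokens)
{
    if (dbTokensBySymbol.TryGetValue(token.Symbol, out Token dbToken))
    {
        dbToken.UpdatedOn = DateTime.Now;
        dbToken.Active = token.Active;
        _tokenService.AddToken(dbToken);
    }
    else
    {
        token.UpdatedOn = DateTime.Now;
        _tokenService.AddToken(token);
    }
}
dbContext.SaveChanges();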

Dynamics CRM SDK: Execute Multiple Requests for Bulk Update of around 5000 Records

I have written a function to update the Default Price List for all the Active Products in CRM 2013 Online.
//The method takes the IOrganizationService and the total number of records to be updated as input
private void UpdateMultipleProducts(IOrganizationService service, int batchSize, EntityCollection UpdateProductsCollection, Guid PriceListGuid)
{
    //To execute the request we have to reference Microsoft.Xrm.Sdk from the latest SDK
    ExecuteMultipleRequest req = new ExecuteMultipleRequest();
    req.Requests = new OrganizationRequestCollection();
    req.Settings = new ExecuteMultipleSettings();
    req.Settings.ContinueOnError = true;
    req.Settings.ReturnResponses = true;
    try
    {
        foreach (var entity in UpdateProductsCollection.Entities)
        {
            UpdateRequest updateRequest = new UpdateRequest { Target = entity };
            entity.Attributes["pricelevelid"] = new EntityReference("pricelevel", PriceListGuid);
            req.Requests.Add(updateRequest);
        }
        var res = service.Execute(req) as ExecuteMultipleResponse; //Execute the collection of requests
    }
    //If the batch size exceeds 1000 a fault will be thrown. In the catch block, divide the records into batch-sized chunks and retry.
    catch (FaultException<OrganizationServiceFault> fault)
    {
        if (fault.Detail.ErrorDetails.Contains("MaxBatchSize"))
        {
            var allowedBatchSize = Convert.ToInt32(fault.Detail.ErrorDetails["MaxBatchSize"]);
            int remainingCreates = batchSize;
            while (remainingCreates > 0)
            {
                var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);
                UpdateMultipleProducts(service, recordsToCreate, UpdateProductsCollection, PriceListGuid);
                remainingCreates -= recordsToCreate;
            }
        }
    }
}
Code description: there are around 5000 active product records in the system, so I am updating the Default Price List for all of them using the code above.
But I am missing something here, because it has updated only 438 records. It loops through the while statement correctly, but it is not updating all of the records.
What should the batch size be when we run this function for the first time?
Can anyone help me here?
Thank you,
Mittal.
You pass remainingCreates as the batchSize parameter, but the method body never uses batchSize to limit how many requests it adds, so each recursive call resubmits the whole collection and you just re-enter that while loop every time.
Also, I'm not sure how you are doing the rest of your error handling, but you need to update your catch block so that it doesn't just swallow FaultExceptions that don't contain a MaxBatchSize value. Right now, if you get a FaultException about something other than batch size, it is silently ignored.
{
    if (fault.Detail.ErrorDetails.Contains("MaxBatchSize"))
    {
        var allowedBatchSize = Convert.ToInt32(fault.Detail.ErrorDetails["MaxBatchSize"]);
        int remainingCreates = batchSize;
        while (remainingCreates > 0)
        {
            var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);
            UpdateMultipleProducts(service, recordsToCreate, UpdateProductsCollection, PriceListGuid);
            remainingCreates -= recordsToCreate;
        }
    }
    else throw;
}
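To make batchSize actually take effect, each recursive call would also have to receive only a slice of the collection. A hedged sketch of that idea for the while loop above (the slicing is an assumption, not part of the original code, and relies on System.Linq plus the EntityCollection(IList<Entity>) constructor):

int skipped = 0;
while (remainingCreates > 0)
{
    var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);

    // Build a chunk containing only the next 'recordsToCreate' entities.
    var chunk = new EntityCollection(
        UpdateProductsCollection.Entities
            .Skip(skipped)
            .Take(recordsToCreate)
            .ToList());

    UpdateMultipleProducts(service, recordsToCreate, chunk, PriceListGuid);

    skipped += recordsToCreate;
    remainingCreates -= recordsToCreate;
}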
Instead of reactive handling, I prefer proactive handling of MaxBatchSize; this works when you already know what the MaxBatchSize is.
The following is sample code: while adding each OrganizationRequest to the collection I keep count of the batch, and when it reaches the limit I call Execute and reset the collection to take a fresh batch.
foreach (DataRow dr in statusTable.Rows)
{
    Entity updEntity = new Entity("ABZ_NBA");
    updEntity["ABZ_NBAid"] = query.ToList().Where(a => a.NotificationNumber == dr["QNMUM"].ToString()).FirstOrDefault().TroubleTicketId;
    //updEntity["ABZ_makerfccall"] = false;
    updEntity["ABZ_rfccall"] = null;
    updEntity[cNBAttribute.Key] = dr["test"];
    req.Requests.Add(new UpdateRequest() { Target = updEntity });
    if (req.Requests.Count == 1000)
    {
        responseWithResults = (ExecuteMultipleResponse)_orgSvc.Execute(req);
        req.Requests = new OrganizationRequestCollection();
    }
}
if (req.Requests.Count > 0)
{
    responseWithResults = (ExecuteMultipleResponse)_orgSvc.Execute(req);
}

Out of Memory at line XXXX

Can anyone help me resolve the out-of-memory error on my ASP page? I'm using LINQ to SQL. After adding several rows of data to the grid (more than 10), an out-of-memory error occurs. Attached is my add function.
public ServiceDetail checkservicedetailid()
{
    string ServiceName = ViewState["Tab"].ToString();
    ServiceDetail checkservicedetailid = ServiceDetails_worker.get(a => a.ServiceName == ServiceName && a.MarginAnalysisID == checkmarginanalysisid().MarginAnalysisID).SingleOrDefault();
    return checkservicedetailid;
}
public IEnumerable<ServiceDetail> get(Expression<Func<ServiceDetail, Boolean>> express)
{
    return ServiceDetailsDB.ServiceDetails.Where(express);
}
protected void btnSaveEmptyOC_Click(object sender, EventArgs e)
{
    try
    {
        if (checkservicedetailid() != null)
        {
            CashExpense tblCashExpenses = new CashExpense();
            Guid CashExpensesID = Guid.NewGuid();
            tblCashExpenses.CashExpensesID = CashExpensesID;
            tblCashExpenses.ServiceDetailsID = checkservicedetailid().ServiceDetailsID;
            tblCashExpenses.Description = txtDescriptionEmptyOC.Text;
            tblCashExpenses.Quantity = Decimal.Parse(txtQTYEmptyOC.Text);
            tblCashExpenses.UnitCost = Decimal.Parse(txtUnitCostEmptyOC.Text);
            tblCashExpenses.CreatedBy = User.Identity.Name;
            tblCashExpenses.DateCreated = DateTime.Now;
            tblCashExpenses.CashExpensesTypeID = "OTHER";
            CashExpenses_worker.insert(tblCashExpenses);
            CashExpenses_worker.submit();
            //Clear items after saving
            txtDescriptionEmptyOC.Text = "";
            txtQTYEmptyOC.Text = "";
            txtUnitCostEmptyOC.Text = "";
            ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.InsertOC2, "SaveEmptyOC", this.Page);
            MyAuditProvider.Insert(this.GetType().ToString(), ViewState["MarginAnalysisID"].ToString(), MessageCenter.Mode.ADD, MessageCenter.CashExpenseMaintenace.InsertOC2, Page.Request, User);
            divOtherCost.Visible = false;
            grd_othercost.Visible = true;
            btnaddothercost.Visible = true;
        }
        else
        {
            //Displays a Message on the Validation Summary (Service Id does not exist)
            ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.SaveServiceDetailOC, "SaveEmptyOC", this.Page);
        }
    }
    catch
    {
        //Displays a Message on the Validation Summary (Error on Saving)
        ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.InsertOCError, "SaveEmptyOC", this.Page);
    }
    finally
    {
        //Rebinds the Grid
        populategrd_othercost();
    }
}
I'm guessing from your code here:
ServiceDetail checkservicedetailid = ServiceDetails_worker.get(
a => a.ServiceName == ServiceName &&
a.MarginAnalysisID == checkmarginanalysisid().MarginAnalysisID
).SingleOrDefault();
that .get() is taking a Func<SomeType, bool>, and you are doing something like:
var row = dbCtx.SomeTable.Where(predicate);
(please correct me here if I'm incorrect)
This, however, is using LINQ-to-Objects, meaning it loads every row from the table to the client and tests the predicate locally. That will hurt memory, especially if a different db-context is created for each row. Additionally, the checkmarginanalysisid() call is being executed per row, when presumably it doesn't change between rows.
You should be testing this with an Expression<Func<SomeType, bool>>, which would be translated to TSQL and executed at the server. You may also need to hoist out untranslatable methods, for example:
var marginAnalysisId = checkmarginanalysisid().MarginAnalysisID;
ServiceDetail checkservicedetailid = ServiceDetails_worker.get(
a => a.ServiceName == ServiceName &&
a.MarginAnalysisID == marginAnalysisId
).SingleOrDefault();
where that is get(Expression<Func<SomeType, bool>>).
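For clarity, a minimal sketch of the two shapes of get placed side by side (only one would exist in the real repository; the names come from the question, and the IQueryable-returning version is the assumption that makes the predicate translate to TSQL):

// LINQ-to-Objects: every ServiceDetail row is pulled to the client and the
// predicate runs in memory.
public IEnumerable<ServiceDetail> get(Func<ServiceDetail, bool> predicate)
{
    return ServiceDetailsDB.ServiceDetails.Where(predicate);
}

// LINQ-to-SQL: the predicate is an expression tree, so Where() is translated
// into a WHERE clause and only matching rows leave the database.
public IQueryable<ServiceDetail> get(Expression<Func<ServiceDetail, bool>> predicate)
{
    return ServiceDetailsDB.ServiceDetails.Where(predicate);
}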
I tried all of the solutions given to me by my peers as well as the solution provided here, from GC.Collect() to disposing the LINQ DataContext after use, but the error kept occurring. I then removed the UpdatePanel, after reading a site that showed how badly an UpdatePanel behaves when handling data, especially when a function runs repeatedly. And poof, the memory problem was gone!

SPListItem Save conflict while Update

(1) var list1 = web.GetList("/lists/list1");
(2) var item1 = list1.GetItemById(10001);
(3) ...
Set a breakpoint here, open the item with ID = 10001 for editing in the UI, change the 'Title' field and save it. Then run the following code:
(4)item1[SPBuiltInFieldId.Title] = "some text";
(5)item1.Update();
Line (5) throws a save conflict exception.
How can I lock the item for editing at line (3)? Or is there any other approach to avoid the conflict?
You have to check the SPListItem manually
try
{
    var item = list.GetItemById(3);
    item["MyField"] = "FooBar";
    item.Update();
}
catch (SPException conflictEx)
{
    // handle conflict by re-evaluating SPListItem
    var item = list.GetItemById(3);
    // ..
}
I don't know any other mechanism atm.
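A hedged sketch of what "re-evaluating" can look like in practice: re-fetch the item, reapply the change, and retry a bounded number of times (the retry count and the field name are illustrative assumptions):

const int maxAttempts = 3;
for (int attempt = 1; attempt <= maxAttempts; attempt++)
{
    try
    {
        // Re-fetch a fresh version of the item on every attempt.
        var item = list.GetItemById(3);
        item["MyField"] = "FooBar";
        item.Update();
        break; // success
    }
    catch (SPException) when (attempt < maxAttempts)
    {
        // Someone saved the item in the meantime; loop and try again
        // against the latest version.
    }
}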
Create a new SPWeb object for each list modification, otherwise you'll get a save conflict (from the following URL: http://platinumdogs.me/2010/01/21/sharepoint-calling-splist-update-causes-save-conflict-spexception/):
using (var thisWeb = featSite.OpenWeb(featWeb.ID))
{
    try
    {
        var listUpdate = false;
        var theList = thisWeb.Lists[att.Value];
        // change list configuration
        // .....
        // commit List modifications
        if (listUpdate)
            theList.Update();
    }
    catch
    {
        // log the event and rethrow
        throw;
    }
}
Another approach is to use LINQ to SharePoint, which offers a conflict-resolution mechanism.
SharePoint's LINQ provider checks for concurrent changes when you try to save the changes you've made using the SubmitChanges method.
When a conflict is found, a ChangeConflictException is thrown.
foreach (var notebook in spSite.Notebooks)
{
    notebook.IsTopNotebook = true;
}
try
{
    spSite.SubmitChanges(ConflictMode.ContinueOnConflict);
}
catch (ChangeConflictException ex)
{
    foreach (ObjectChangeConflict occ in spSite.ChangeConflicts)
    {
        if (((Notebook)occ.Object).Memory > 16)
        {
            foreach (MemberChangeConflict field in occ.MemberConflicts)
            {
                if (field.Member.Name == "IsTopNotebook")
                {
                    field.Resolve(RefreshMode.KeepCurrentValues);
                }
                else
                {
                    field.Resolve(RefreshMode.OverwriteCurrentValues);
                }
            }
        }
        else
        {
            occ.Resolve(RefreshMode.KeepCurrentValues);
        }
    }
    spSite.SubmitChanges();
}
