I came across this question and liked how the generic update for one-to-many is implemented.
I tried to mimic it to implement a one-to-one version for myself, but could not be totally successful. The following is the result of my struggle:
public async Task<int> UpdateAsync<T>(T entity, params Expression<Func<T, object>>[] navigations) where T : EntityBase
{
    var dbEntity = await _DbCtx.FindAsync<T>(entity.Id);
    var entry = _DbCtx.Entry(dbEntity);
    entry.CurrentValues.SetValues(entity);

    foreach (var nav in navigations)
    {
        string propertyName = nav.GetPropertyAccess().Name;

        // Problem #01 //
        // if possible, I'd like to avoid this reflection in favor of EF Core metadata
        var child = (EntityBase)entity.GetType().GetProperty(propertyName).GetValue(entity);
        if (child == null)
            continue; // if the client-sent model doesn't have the child, skip it

        var referenceEntry = entry.Reference(propertyName);
        await referenceEntry.LoadAsync();
        var dbChild = (EntityBase)referenceEntry.CurrentValue;
        if (dbChild == null)
        {
            // Problem #02 //
            // if the existing entity doesn't have the child, the client-sent child should be assigned,
            // but I could not figure out how to do this
        }
        else
        {
            _DbCtx.Entry(dbChild).CurrentValues.SetValues(child);
        }
    }

    return await _DbCtx.SaveChangesAsync();
}
I have marked the problems as Problem #01 and Problem #02 in the code comments above. Any suggestion or solution will be appreciated. Thanks.
Edit:
Alternatively, if you think there is a better, more efficient way of doing the same thing I'm trying to do above, please share your knowledge.
For Problem #01, I couldn't find an equivalent of GetCollectionAccessor, but one way I can think of to solve it using EF Core metadata would be to call the Entry method on the disconnected entity:
var entry = _context.Entry(dbEntity);
entry.CurrentValues.SetValues(entity);
var disconnectedEntry = _context.Entry(entity); // new

foreach (var nav in navigations)
{
    string propertyName = nav.GetPropertyAccess().Name;
    var navigationChild = disconnectedEntry.Navigation(propertyName).CurrentValue; // new
}
Bear in mind that every Entry call makes EF Core run DetectChanges. As that scan is not necessary for this entity, you can save some time by disabling AutoDetectChanges, as suggested in this issue.
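For illustration, here is a minimal sketch of that optimization, wrapped in a try/finally so the flag is always restored (AutoDetectChangesEnabled and Entry are real EF Core APIs; the surrounding shape just mirrors the question's method):
_context.ChangeTracker.AutoDetectChangesEnabled = false;
try
{
    var entry = _context.Entry(dbEntity);           // no DetectChanges scan now
    entry.CurrentValues.SetValues(entity);
    var disconnectedEntry = _context.Entry(entity); // no scan here either
    // ... handle the navigations as shown above ...
}
finally
{
    _context.ChangeTracker.AutoDetectChangesEnabled = true; // restore the default
}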
Now for Problem #02, you can just assign the child Entity to the ReferenceEntry CurrentValue, like this:
var referenceEntry = entry.Reference(propertyName);
await referenceEntry.LoadAsync();
var dbChild = (EntityBase)referenceEntry.CurrentValue;
if (dbChild == null)
{
    referenceEntry.CurrentValue = navigationChild; // new
}
else
{
    _DbCtx.Entry(dbChild).CurrentValues.SetValues(navigationChild);
}
I hope it can help you. If some problem arises let me know!
Edit
I also suggest you read about the TrackGraph method. Depending on how your entities work, everything could perhaps be done with that method in a couple of lines.
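For reference, the usual TrackGraph pattern looks something like this. It's only a sketch that assumes an entity's state can be derived from whether its key is set, so adapt the rule to your own entities:
_context.ChangeTracker.TrackGraph(entity, node =>
{
    // visit the root and every reachable navigation and decide its state
    node.Entry.State = node.Entry.IsKeySet
        ? EntityState.Modified
        : EntityState.Added;
});
await _context.SaveChangesAsync();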
I've never attempted to use WaitAll() or WhenAll() when running async functionality. After looking through a lot of documentation, SO posts, and tutorials, I haven't found enough information on this, so here I am.
I'm trying to figure out the best/proper way(s) to do the following:
Using EF6, get data as List<Entity>.
Iterate through each Entity and call an external API to perform some action.
External API returns data per Entity which I need to store on the same Entity.
Currently I have built (not tested) the following (without the error handling code):
public async Task<IEnumerable<Entity>> Process() {
    bool hasChanges = false;
    var data = _db.Entity.Where(x => !x.IsRegistered);

    foreach (var entity in data) {
        var result = await CallExternalApi(entity.Id, entity.Name);
        entity.RegistrationId = result.RegistrationId;
        entity.IsRegistered = true;
        _db.Entry(entity).State = EntityState.Modified;
        hasChanges = true;
    }

    if (hasChanges) {
        uow.Commit();
    }

    return data;
}
I feel like I may be able to take advantage of some other async functionality or feature, but if so, I'm not sure how to implement it here.
Any guidance is really appreciated.
Update
The API I'm calling is the Zoom API to add Registrants. While they do have a route to batch-add Registrants, it does not return the RegistrantId and the Join Url I need.
First, figure out if your external API might have a way to get all the items you want in a batch. If it does, use that instead of sending a whole bunch of requests.
If you need to send a separate request for each item, but want to do it concurrently, you could do this:
public async Task<IReadOnlyCollection<Entity>> Process() {
    var data = _db.Entity.Where(x => !x.IsRegistered).ToList();
    if (!data.Any()) { return data; }

    var entityResultTasks = data
        .Select(async entity => new { entity, result = await CallExternalApi(entity.Id, entity.Name) })
        .ToList();
    var entityResults = await Task.WhenAll(entityResultTasks);

    foreach (var entityResult in entityResults) {
        var entity = entityResult.entity;
        var result = entityResult.result;
        entity.RegistrationId = result.RegistrationId;
        entity.IsRegistered = true;
        _db.Entry(entity).State = EntityState.Modified;
    }

    uow.Commit();
    return data;
}
You will want to watch out for possible concurrency limits on the target API. Consider using Chunk to break your work into batches of an acceptable size, or leveraging a semaphore or similar to throttle the number of calls you're making.
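If you do need to throttle, a sketch with SemaphoreSlim could replace the Select above (the limit of 5 concurrent calls is arbitrary; pick one that matches the API's documented limits):
var throttle = new SemaphoreSlim(5);
var entityResultTasks = data
    .Select(async entity =>
    {
        await throttle.WaitAsync();
        try
        {
            // only up to 5 calls are in flight at any moment
            return new { entity, result = await CallExternalApi(entity.Id, entity.Name) };
        }
        finally
        {
            throttle.Release();
        }
    })
    .ToList();
var entityResults = await Task.WhenAll(entityResultTasks);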
The following boggles my mind:
I have to bulk insert a lot of changes; some are inserts, some are updates. I am not sure of the best way to do it.
The logic looks something like this:
public class Worker
{
    public void Run()
    {
        var mailer = new Mailer();
        HashSet<DbElement> dbElementsLookUp = new HashSet<DbElement>(dbContext.DbElements);
        List<ElementDto> elements = GetSomeChangesFromSomewhere();
        var dbElementsToSave = new List<DbElement>();

        foreach (var element in elements)
        {
            CreateOrUpdateDbElement(element, dbElementsLookUp, dbElementsToSave);
            // Sends some data based on the element - due to legacy implementation it uses its own context
            mailer.SendSomeLogging(element);
        }

        try
        {
            dbContext.ChangeTracker.DetectChanges();
            dbContext.Set<DbElement>().AddRange(dbElementsToSave);
            dbContext.SaveChanges();
        }
        catch (Exception e)
        {
            LogErrors(e);
        }
    }

    private void CreateOrUpdateDbElement(ElementDto element, HashSet<DbElement> lookUp, List<DbElement> dbElementsToSave)
    {
        var entity = lookUp.FirstOrDefault(e => e.Id == element.Id);
        if (entity is not null)
        {
            entity.SomeProperty = element.SomeProperty;
            dbContext.Configuration.AutoDetectChangesEnabled = false;
            dbContext.Entry(entity).State = EntityState.Modified;
            dbContext.Configuration.AutoDetectChangesEnabled = true;
        }
        else
        {
            dbElementsToSave.Add(new DbElement
            {
                SomeProperty = element.SomeProperty,
                CreationDate = DateTime.Now
            });
        }
    }
}
I'm not sure what the best way to do this is, especially regarding DetectChanges. Is it safe to disable AutoDetectChanges and call DetectChanges once outside of the foreach? I am working with a lot of data, and due to the legacy implementation it is pretty slow, because for each mail there is a write operation on the database. The mailer actually works on another instance of the context, so it does not interfere with saving the DbElement objects.
Is it better to add the entities to update to another list and handle them the same way as the added entities?
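For clarity, this is the variant I am considering (an untested sketch using the same names as the code above; the per-element flag toggling inside CreateOrUpdateDbElement would then go away):
dbContext.Configuration.AutoDetectChangesEnabled = false;
try
{
    foreach (var element in elements)
    {
        CreateOrUpdateDbElement(element, dbElementsLookUp, dbElementsToSave);
        mailer.SendSomeLogging(element);
    }
    dbContext.Set<DbElement>().AddRange(dbElementsToSave);
    dbContext.ChangeTracker.DetectChanges(); // one scan instead of one per entity
    dbContext.SaveChanges();
}
finally
{
    dbContext.Configuration.AutoDetectChangesEnabled = true;
}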
I'm working with Marten as my data layer and it's been great so far, but I've run into an issue that just doesn't make sense. I have a simple method that saves a transaction (a purchase) then updates a listing, adding the ID of the transaction to a collection. My problem is, it appears that Marten is not storing my updated listing, although it is storing the transaction.
When I look in the database, the TransactionIds property is null, but if I step through the code, everything seems to execute correctly. Am I doing something wrong here?
public async Task CreateListingTransactionAsync(ListingTransaction transaction)
{
    if (transaction == null)
        throw new ValidationException("Transaction is required to create a transaction");

    bool isNew = transaction.Id == Guid.Empty;

    await _listingTransactionValidator.ValidateAndThrowAsync(transaction);

    using (var session = _store.LightweightSession())
    {
        session.Store(transaction);

        if (isNew)
        {
            var listing = await session.LoadAsync<Listing>(transaction.ListingId);
            if (listing == null)
                throw new EntityNotFoundException($"Listing with Id: {transaction.ListingId} not found");

            if (listing.TransactionIds == null)
                listing.TransactionIds = new List<Guid>();

            listing.TransactionIds.Add(transaction.Id);
            session.Store(listing);
        }

        await session.SaveChangesAsync();
    }
}
There could be a problem with the serialization of the TransactionIds collection.
If that's not the case, here are a few more things to try (and to work out later why they helped):
Try session.Update(listing); instead of session.Store(listing);.
Try a different type of document session: http://jasperfx.github.io/marten/documentation/troubleshoot/
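On the serialization point, the first thing I would check is that TransactionIds is a plain, publicly settable property the serializer can round-trip; a hypothetical Listing shape:
public class Listing
{
    public Guid Id { get; set; }

    // keep the collection a simple public get/set property so the
    // configured serializer can both write it and read it back
    public List<Guid> TransactionIds { get; set; } = new List<Guid>();
}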
I'm new to using Entity Framework as a data layer between MVC and SQL Server, so I apologize up front if what I'm doing is bad practice.
Let me start by sharing the code that is handling the update.
Update Delivery:
public bool One(Delivery toUpdate)
{
    using (var dbContext = new FDb())
    {
        try
        {
            var deliveryInDb = this.dbTable(dbContext).Single(x => x.DeliveryId == toUpdate.DeliveryId);
            dbContext.Entry(deliveryInDb).CurrentValues.SetValues(toUpdate);

            //removal first
            List<DeliveryDay> currentDays = FEngineCore.DeliveryDay.Get.ForValue((x => x.DeliveryId), toUpdate.DeliveryId);
            List<DeliveryTime> currentTimes = FEngineCore.DeliveryTime.Get.ForValue((x => x.DeliveryId), toUpdate.DeliveryId);

            //remove delivery days that are not needed
            foreach (var curDay in currentDays)
            {
                if (!toUpdate.DeliveryDays.Select(x => x.DeliveryDayId).Contains(curDay.DeliveryDayId))
                {
                    FEngineCore.DeliveryDay.Delete.One((x => x.DeliveryDayId), curDay.DeliveryDayId);
                    deliveryInDb.DeliveryDays.Remove(curDay);
                }
            }

            //remove delivery times that are not needed
            foreach (var curTime in currentTimes)
            {
                if (!toUpdate.DeliveryTimes.Select(x => x.DeliveryTimeId).Contains(curTime.DeliveryTimeId))
                {
                    FEngineCore.DeliveryTime.Delete.One((x => x.DeliveryTimeId), curTime.DeliveryTimeId);
                    deliveryInDb.DeliveryTimes.Remove(curTime);
                }
            }

            foreach (var day in toUpdate.DeliveryDays)
            {
                if (day.DeliveryDayId == 0)
                {
                    dbContext.DeliveryDays.Add(day);
                }
                else
                {
                    if (dbContext.DeliveryDays.Local.Any(e => e.DeliveryDayId == day.DeliveryDayId))
                    {
                        dbContext.Entry(dbContext.DeliveryDays.Local.First(e => e.DeliveryDayId == day.DeliveryDayId)).CurrentValues.SetValues(day);
                        dbContext.Entry(dbContext.DeliveryDays.Local.First(e => e.DeliveryDayId == day.DeliveryDayId)).State = EntityState.Modified;
                    }
                    else
                    {
                        DeliveryDay modDay = new DeliveryDay
                        {
                            DayOfWeek = day.DayOfWeek,
                            DeliveryDayId = day.DeliveryDayId,
                            DeliveryId = day.DeliveryId,
                            Interval = day.Interval
                        };
                        dbContext.DeliveryDays.Attach(modDay);
                        dbContext.Entry(modDay).State = EntityState.Modified;
                    }
                    deliveryInDb.DeliveryDays.Add(day);
                }
            }

            foreach (var time in toUpdate.DeliveryTimes)
            {
                if (time.DeliveryTimeId == 0)
                {
                    dbContext.DeliveryTimes.Add(time);
                }
                else
                {
                    if (dbContext.DeliveryTimes.Local.Any(e => e.DeliveryTimeId == time.DeliveryTimeId))
                    {
                        dbContext.Entry(dbContext.DeliveryTimes.Local.First(e => e.DeliveryTimeId == time.DeliveryTimeId)).CurrentValues.SetValues(time);
                        dbContext.Entry(dbContext.DeliveryTimes.Local.First(e => e.DeliveryTimeId == time.DeliveryTimeId)).State = EntityState.Modified;
                    }
                    else
                    {
                        DeliveryTime modTime = new DeliveryTime
                        {
                            DeliveryId = time.DeliveryId,
                            DeliveryLocationId = time.DeliveryLocationId,
                            DeliveryTimeId = time.DeliveryTimeId,
                            DropoffTime = time.DropoffTime
                        };
                        dbContext.DeliveryTimes.Attach(modTime);
                        dbContext.Entry(modTime).State = EntityState.Modified;
                    }
                    deliveryInDb.DeliveryTimes.Add(time);
                }
            }

            dbContext.SaveChanges();
            dbContext.Entry(deliveryInDb).State = EntityState.Detached;
            return true;
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.InnerException);
            return false;
        }
    }
}
Let me continue by explaining that the Delivery object has two children: DeliveryTime and DeliveryDay. The issue arises when I try to remove one deliveryTime and modify nothing else. The end result of running the code normally (not in debug) is that the deliveryTime is in fact not removed. Here's the interesting thing: when I debug it and step through the break points, everything works as expected!
Let me continue by posting the code that runs behind the removal method for the deliveryTime (and actually for all entity objects in my system).
public bool One<V>(Expression<Func<T, V>> property, V value) where V : IComparable
{
    using (var dbContext = new FoodsbyDb())
    {
        try
        {
            T toDelete;

            //get the body as a property that represents the property of the entity object
            MemberExpression entityPropertyExpression = property.Body as MemberExpression;

            //get the parameter that is representing the entity object
            ParameterExpression entityObjectExpression = (ParameterExpression)entityPropertyExpression.Expression;

            //represent the value being checked against as an expression constant
            Expression valueAsExpression = Expression.Constant(value);

            //check the equality of the property and the value
            Expression equalsExpression = Expression.Equal(entityPropertyExpression, valueAsExpression);

            //create an expression that takes the entity object as a parameter, and checks the equality using the equalsExpression variable
            Expression<Func<T, bool>> filterLambda = Expression.Lambda<Func<T, bool>>(equalsExpression, entityObjectExpression);

            toDelete = this.dbTable(dbContext)
                .SingleOrDefault(filterLambda);

            if (toDelete != null)
            {
                this.dbTable(dbContext)
                    .Remove(toDelete);
                dbContext.SaveChanges();
                return true;
            }

            return false;
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.InnerException);
            return false;
        }
    }
}
The code above is obviously generic, and it handles all my entity objects. I have tested it inside and out and know for sure the problem does not lie there. I thought it would be helpful to post it so you all have a full understanding of what's going on.
Here's my best guess as to what's going on:
The reference to the removed deliveryTime still exists when the database context is saved, but when I debug, the system has enough time to complete the removal.
Here was one of my attempted solutions:
Remove all references to the child objects immediately after setting currentDays and currentTimes, and then add them back to deliveryInDb while enumerating through them.
Because I am new to all of this, if you see some bad practice along with the solution, I wouldn't mind constructive criticism to improve my programming method.
I actually encountered this issue in a project at work. The project is an older MVC4 project using EF 6.1.
In our situation, a simple update attempting to set a related entity property to null was failing to actually set it to null while running the web app normally (in debug mode, but without break points). When a break point was set on the line of code that sets the property to null, though, the database would be updated as expected. So the update worked with a break point in place, but not when the code was allowed to run normally.
Using an EF interceptor, we could see that, with the break point in place, the update query was going through as expected.
Now, in our situation the related entity was using the virtual keyword to allow for lazy loading. I think this is the root of the issue. When a break point is present, EF has enough time to lazily load that related entity, evaluate whatever it needs to evaluate, and finally set it to null. When running without a break point, I think EF gets caught up trying to lazily load that entity and therefore fails to realize it needs to be updated. To be clear, I was both accessing the related entity property for the first time and setting it to null in a single line of code.
foo.Bar = null;
I resolved this issue, in our scenario, by accessing that property at least once prior to setting it to null, so that EF is forced to load it. With it loaded, setting it to null works as intended. So again, to be clear, I think the issue is a combination of lazy loading and a single line of code that both accesses the property for the first time and assigns it null.
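In other words, the change amounted to something like this (foo and Bar being the placeholder names from above):
var current = foo.Bar; // first access forces EF to lazily load the related entity
foo.Bar = null;        // the null assignment is now tracked and persisted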
It appears that you're using multiple instances of your DbContext, which are not synchronized.
The solution would be to use a single instance, and pass that instance between your methods.
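As a sketch of what that could look like, the delete helper would take the caller's context instead of opening its own (hypothetical helper; FDb, DeliveryTimes and the IDs come from the question's code, and the int key type is an assumption):
private bool RemoveDeliveryTime(FDb dbContext, int deliveryTimeId)
{
    var toDelete = dbContext.DeliveryTimes.SingleOrDefault(t => t.DeliveryTimeId == deliveryTimeId);
    if (toDelete == null)
        return false;

    dbContext.DeliveryTimes.Remove(toDelete);
    return true; // the caller decides when to call dbContext.SaveChanges()
}
That way the removal and the parent update share one change tracker, and a single SaveChanges commits both.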
I recently upgraded my solution from EF5 to EF6.1.2, and changed my data access layer to use DbContext instead of ObjectContext.
Some of my unit tests are failing, and I don't understand why. Example of old data access code:
public virtual T Insert(T item)
{
    if (item == null)
    {
        throw new ArgumentNullException("item", @"TaskDal.Insert");
    }

    using (var ctx = ObjectContextManager<StoreDataContext>.GetManager("StoreDataContext"))
    {
        var task = new Task();
        WriteNonKeyData(task, item);
        ctx.ObjectContext.Tasks.AddObject(task); // task.TaskType null
        ctx.ObjectContext.SaveChanges();         // task.TaskType set
        return ReadData(task);
    }
}
The Task entity has a navigation property TaskType. As the comments above show, it is null after AddObject and gets populated once SaveChanges is called.
My new code looks like so:
public virtual T Insert(T item)
{
    if (item == null)
    {
        throw new ArgumentNullException("item", @"TaskDal.Insert");
    }

    using (var ctx = DbContextManager<StoreDataContext>.GetManager())
    {
        var task = new Task();
        WriteNonKeyData(task, item);
        ctx.DbContext.Tasks.Add(task); // task.TaskType null
        ctx.DbContext.SaveChanges();   // task.TaskType still null
        return ReadData(task);
    }
}
Unlike with the old code, task.TaskType is never set, which causes an exception in ReadData. LazyLoading is enabled in both examples.
I can workaround this by manually reloading the TaskType:
if (task.TaskType == null)
    ctx.DbContext.Entry(task).Reference(p => p.TaskType).Load();
but I would prefer a better solution, as I am sure there are hundreds of other places in my code where this will need to be changed and it will be difficult for me to find them all.
Task will not load its navigation properties, as they are not implemented to be lazily loaded. Take a look at your class definition: do you see any code in the getter? No.
Now take a look at the model classes created automatically for your legacy code: is there a non-empty getter that supports lazy loading? Yes, there is.
The difference is that with code-first, your model classes have no code that supports lazy loading. Lazy loading works only on proxy objects, which are created by the context when you retrieve data from the database.
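For comparison, a code-first POCO only gets lazy loading when the navigation property is virtual and the instance is a proxy created by EF; a sketch with the property names taken from the question (the int key type is an assumption):
public class Task
{
    public int ID { get; set; }

    // virtual lets the runtime-generated proxy override the property
    // and load the related TaskType on first access
    public virtual TaskType TaskType { get; set; }
}
An object created with new Task() is a plain instance rather than a proxy, so nothing ever triggers that load; a proxy created by EF (for example via ctx.DbContext.Tasks.Create()) would.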
One of the simplest workarounds would be to force EF to create a proxy for you:
using (var ctx = DbContextManager<StoreDataContext>.GetManager())
{
    var task = new Task();
    WriteNonKeyData(task, item);
    ctx.DbContext.Tasks.Add(task); // task.TaskType null
    ctx.DbContext.SaveChanges();   // task.TaskType still null

    // let EF create a proxy for the very same database object
    var ptask = ctx.DbContext.Tasks.First(p => p.ID == task.ID);

    // ptask.TaskType is now available, as the actual type of ptask
    // is not Task but rather a TaskProxy that inherits from Task
    // and is created automatically by EF
    return ReadData(ptask);
}