I can add items to the collection, but I cannot delete them. I have found a few partial solutions, but nothing that guides me to a working one. Adding values to the collection is easy; any help is appreciated.
I have the following:
[HttpPut("updateSOJ4")]
public IActionResult UpdateSOJ4([FromBody] Routing_Tool_SOJ4 Routing_Tool_SOJ4)
{
Routing_Tool_SOJ4 request = new Routing_Tool_SOJ4();
request.Id = Routing_Tool_SOJ4.Id;
request.Routing_Tool_Services = Routing_Tool_SOJ4.Routing_Tool_Services;
request.Routing_ToolId = Routing_Tool_SOJ4.Routing_ToolId;
_repository.UpdateSOJ4(request);
return Ok(request);
}
Here is where I tried the different solutions, but I am still stuck:
public void UpdateSOJ4(object routing_Tool_SOJ4)
{
// var missingItem = _context.Routing_Tool_Service.Where(i => i.Routing_Tool_SOJ4Id == _context.Routing_Tool_SOJ4.Id).First(); -- DOES NOT WORK
_context.Update(routing_Tool_SOJ4).State = EntityState.Modified;
_context.SaveChanges();
}
Here is the database structure:
public class Routing_Tool_SOJ4
{
[Key]
[Required]
public int Id { get; set; }
public int Routing_ToolId { get; set; }
[ForeignKey("Routing_ToolId")]
public virtual Routing_Tool Routing_Tool { get; set; }
public virtual ICollection<Routing_Tool_Service> Routing_Tool_Services { get; set; }
}
Collection:
public class Routing_Tool_Service
{
[Key]
[Required]
public int Id { get; set; }
public string ServiceName { get; set; }
[Required]
[ForeignKey("Routing_Tool_SOJ4Id")]
public int Routing_Tool_SOJ4Id { get; set; }
}
What I deduce from your question is that you have a method that accepts an updated Routing Tool object containing an updated collection of Tool Services. You want to update that tool and its associated services so that any service that is new gets added, any existing one gets updated, and any service in the DB that is no longer in the passed-in collection gets deleted.
If this is the case, you need to compare the provided version of the data with the database version. For this example I am not using your repository instance, because I have no idea how it is implemented; generally that pattern should be avoided unless there is a really good reason to have it.
[HttpPut("updateSOJ4")]
public IActionResult UpdateSOJ4([FromBody] Routing_Tool_SOJ4 updatedRoutingTool)
{
using (var context = new AppDbContext())
{
// Get tool and services from DB.
var existingRoutingTool = context.Routing_Tool_SOJ4s
.Include(x => x.Routing_Tool_Services)
.Single(x => x.Id == updatedRoutingTool.Id);
// Copy values that can be updated from the updatedRoutingTool to existingRoutingTool.
// ...
var updatedServiceIds = updatedRoutingTool.Routing_Tool_Services
.Select(x => x.Id)
.ToList();
var existingServiceIds = existingRoutingTool.Routing_Tool_Services
.Select(x => x.Id)
.ToList();
var serviceIdsToRemove = existingServiceIds
.Except(updatedServiceIds)
.ToList();
foreach (var service in updatedRoutingTool.Routing_Tool_Services)
{
var existingService = existingRoutingTool.Routing_ToolServices
.SingleOrDefault(x => x.Id == service.Id);
if (existingService == null)
existingRoutingTool.Routing_Tool_Services.Add(service);
else
{
// Copy allowed values from service to existingService
}
}
if(serviceIdsToRemove.Any())
{
var servicesToRemove = existingRoutingTool.Routing_Tool_Services
.Where(x => serviceIdsToRemove.Contains(x.Id))
.ToList();
foreach(var serviceToRemove in servicesToRemove)
existingRoutingTool.Routing_Tool_Services.Remove(serviceToRemove);
}
context.SaveChanges();
}
return Ok(request);
}
Normally the DbContext or Unit of Work would be injected into your controller, or the logic would be handed off to a service. This example uses a using block with a DbContext just to outline the minimum viable process flow for the operation.
Essentially load the current data state, compare that with the provided state to determine what needs to be added, updated, or removed.
Generally speaking, when it comes to RESTful web services my recommendation is to avoid large update operations like this and instead structure the application to perform more atomic operations, such as adding and removing services for a given tool as distinct operations, working with a persisted copy (i.e. a cached instance) of the data if you want the whole related operation to be committed to the data state or abandoned at a higher level. This helps keep message sizes small and server code compact and focused on a single responsibility. The risk of performing these large operations is that the passed-in data must represent a complete picture of the data state, or you could end up deleting/clearing data you don't intend to. For example, if you later optimize your code so that only added and updated services are sent over the wire, not unchanged ones (to reduce message size), the above code will not work, as it would delete anything not sent. An atomic variant is sketched below.
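For illustration, a more atomic endpoint that removes a single service might look like this sketch (the route shape, DbSet name, and context usage are assumptions, not code from the question):
[HttpDelete("soj4/{toolId}/services/{serviceId}")]
public IActionResult RemoveService(int toolId, int serviceId)
{
    using (var context = new AppDbContext())
    {
        // Load just the one service row, scoped to its parent tool.
        var service = context.Routing_Tool_Services
            .SingleOrDefault(s => s.Id == serviceId && s.Routing_Tool_SOJ4Id == toolId);
        if (service == null)
            return NotFound();

        context.Routing_Tool_Services.Remove(service);
        context.SaveChanges();
        return NoContent();
    }
}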
Related
I'm using EF Core 3.1.10. I have the following entities:
public class Request {
public int Id { get; set; }
public string Title { get; set; }
public string ProjectId { get; set; }
public List<RequestAttachment> Attachments { get; set; } = new List<RequestAttachment> ();
}
public class RequestAttachment {
public int Id { get; set; }
public int RequestId { get; set; }
public Request Request { get; set; }
public byte[] FileStream { get; set; }
public string Filename { get; set; }
public RequestAttachmentType RequestAttachmentType { get; set; }
public int RequestAttachmentTypeId { get; set; }
}
public class RequestAttachmentType {
public int Id { get; set; }
public string Name { get; set; }
}
In my repository, I have a simple Update method:
public async Task UpdateRequest(Request aRequest) {
    // I'm attaching aRequest.Attachments because they already exist in the database
    // and I don't want to update them here.
    // Option 1 - not working:
    // aRequest.Attachments.ForEach(a => theContext.RequestAttachments.Attach(a));
    // Option 2 - not working:
    // theContext.RequestAttachments.AttachRange(aRequest.Attachments);
    // Option 3 - working:
    aRequest.Attachments.ForEach(a => theContext.Entry(a).State = EntityState.Unchanged);
    theContext.Requests.Update(aRequest);
    await theContext.SaveChangesAsync();
}
Note that I'm attaching aRequest.Attachments because I don't want to update the attachments; I only want to update aRequest. The attachments already exist in the database, which is why I'm using Attach, so they don't get re-added. But Attach and AttachRange do not work when a request has more than one attachment. They throw the following error:
The instance of entity type 'RequestAttachmentType' cannot be tracked
because another instance with the key value '{Id: 1}' is already being
tracked. When attaching existing entities, ensure that only one entity
instance with a given key value is attached.
I don't understand this error, because I did not explicitly attach RequestAttachmentType. The only thing I attached was its parent, aRequest.Attachments. When I set the state manually, as in Option 3, no error is thrown. I thought Attach was equivalent to theContext.Entry(a).State = EntityState.Unchanged. Why does Option 3 work while Options 1 and 2 do not?
Working with detached entity graphs is going to continue to cause all kinds of headaches like this. Not only do you need to handle the scenario that you don't want to update/duplicate related entities, but you have to also handle cases where the DbContext is already tracking the entity you want to update. Sergey was on the right track there.
The problem is that you have a complete graph:
Request
Attachment
AttachmentType
Attachment
AttachmentType
where you want to update details in Request and the Attachments...
One issue with "Update" is that it will dive the graph to look for entities that might need to be added/updated. On its own with a detached graph this will usually result in duplicate items being created. Hence "attaching" them first. The trouble here is where the DbContext is already tracking one or more entities in the graph. One key detail to remember about EF is that References are everything. Deserializing entity graphs is a painful exercise.
For example, let's say we deserialize Request ID 1 with two attachments, #1 and #2, where both have an AttachmentType of "Document" (AttachmentType ID = 14).
What you will end up with is something that looks like:
Request
{
    ID: 1
    ...
    Attachments
    {
        Attachment
        {
            ID: 1
            ...
            AttachmentType
            {
                ID: 14
            }
        }
        Attachment
        {
            ID: 2
            ...
            AttachmentType
            {
                ID: 14
            }
        }
    }
}
Without considering what the DbContext may or may not already be tracking prior to looking at these entities, there is already a problem. Attachment ID 1 and 2 are distinct objects, however they both reference an AttachmentType ID 14. When de-serialized, these will be 2 completely distinct references to objects that have an ID of 14.
A common surprise is that test code appears to work fine because the two attachments had different attachment types, but then fails unexpectedly when they happen to have the same type. Attaching the first attachment leaves the DbContext tracking that attachment's Type. If the second attachment's Type has a different ID, attaching that second Type succeeds so long as the context isn't tracking it. However, when it has the same ID, the "already tracking an entity with the same key" error pops up.
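In miniature, the failure looks something like this (a sketch with hypothetical variables for the two deserialized attachments):
// Both detached Attachment objects carry their own AttachmentType instance
// with the same key value (ID 14).
theContext.AttachmentTypes.Attach(attachment1.AttachmentType); // now tracking ID 14
theContext.AttachmentTypes.Attach(attachment2.AttachmentType); // distinct instance, same key
// -> InvalidOperationException: the instance of entity type 'RequestAttachmentType'
//    cannot be tracked because another instance with the same key is already tracked.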
When dealing with disconnected entities you need to be very deliberate about references and explicitly handle whenever the DbContext is tracking a reference. This means consulting the DbSet Local caches:
public async Task UpdateRequest(Request aRequest)
{
    var existingRequest = theContext.Requests.Local.SingleOrDefault(x => x.Id == aRequest.Id);
    if (existingRequest != null)
    {
        // Copy values from aRequest -> existingRequest, or leverage something like mapper.Map(aRequest, existingRequest).
    }
    else
    {
        theContext.Requests.Attach(aRequest);
        theContext.Entry(aRequest).State = EntityState.Modified; // Danger, Will Robinson: make 100% sure the entity from the client is validated! This overwrites everything.
    }

    foreach (var attachment in aRequest.Attachments)
    {
        var existingAttachment = theContext.Attachments.Local.SingleOrDefault(x => x.Id == attachment.Id);

        // Look for a reference to the attachment type. If found, use it; if not, attach it and use that...
        var existingAttachmentType = theContext.AttachmentTypes.Local.SingleOrDefault(x => x.Id == attachment.AttachmentType.Id);
        if (existingAttachmentType == null)
        {
            theContext.AttachmentTypes.Attach(attachment.AttachmentType);
            existingAttachmentType = attachment.AttachmentType;
        }

        if (existingAttachment != null)
        {
            // Copy values across.
            existingAttachment.AttachmentType = existingAttachmentType; // in case we changed the attachment type for this attachment.
        }
        else
        {
            theContext.Attachments.Attach(attachment);
            theContext.Entry(attachment).State = EntityState.Modified;
            attachment.AttachmentType = existingAttachmentType;
        }
    }
    await theContext.SaveChangesAsync();
}
Needless to say, this is a lot of messing around to check and replace references, either to get the DbContext to track the detached entities or to replace the references with tracked ones.
A simpler option is to leverage Automapper to establish a configuration for what fields can be updated from a source (ideally a ViewModel, but you can use an entity graph as a source) to a destination (entities tracked by the DbContext).
Step 1: Configure Automapper with the rules about what to update for a Request -> Attachments graph (sketched below).
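A minimal sketch of what such a configuration could look like, assuming only scalar fields should be copied while keys, navigation properties, and the Attachments collection are handled explicitly (which members to ignore is illustrative, not from the original answer):
// Sketch only: the ignore list depends on your own update rules.
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Request, Request>()
        .ForMember(dest => dest.Id, opt => opt.Ignore())           // never overwrite the key
        .ForMember(dest => dest.Attachments, opt => opt.Ignore()); // handled explicitly

    cfg.CreateMap<RequestAttachment, RequestAttachment>()
        .ForMember(dest => dest.Id, opt => opt.Ignore())
        .ForMember(dest => dest.Request, opt => opt.Ignore())
        .ForMember(dest => dest.RequestAttachmentType, opt => opt.Ignore());
});
var mapper = config.CreateMapper();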
Step 2: Load tracked entity graph, and the applicable AttachmentTypes:
var existingRequest = theContext.Requests
.Include(x => x.Attachments)
.ThenInclude(x => x.AttachmentType)
.Single(x => x.Id == aRequest.Id);
var referencedAttachmentTypeIds = aRequest.Attachments.Select(x => x.AttachmentTypeId)
.Distinct().ToList();
var referencedAttachmentTypes = theContext.AttachmentTypes
.Where(x => referencedAttachmentTypeIds.Contains(x.Id))
.ToList();
Getting the list of attachment types only applies if we can change an attachment's type, or are adding attachments.
Step 3: Leverage Automapper to copy across values
mapper.Map(aRequest, existingRequest);
If Attachments can be updated, added, and/or removed you will need to handle those scenarios against the existingRequest. Here we reference the loaded set of AttachmentTypes.
Step 4: Save Changes.
The primary benefit of this approach is that you do away with the constant checking for existing references and the consequences of missing a check. You also configure the rules about which values can legally be overwritten in the Automapper Map call, so only values you expect are copied from the source to the existing data record. This also results in faster update queries, as EF will only build statements for the values that actually changed, whereas using Update or EntityState.Modified results in SQL UPDATE statements that update every column.
Try this:
var itemExist = await theContext.Requests.FirstOrDefaultAsync(i => i.Id == aRequest.Id);
if (itemExist != null)
{
    var attachments = aRequest.Attachments;
    aRequest.Attachments = null;
    theContext.Entry(itemExist).CurrentValues.SetValues(aRequest);
    await theContext.SaveChangesAsync();
    aRequest.Attachments = attachments;
}
Useful Context
I currently have two entities that look as below.
MovieSerie
public class MovieSerie
{
[Key]
public Guid MovieSerieId { get; set; }
[Required]
[MaxLength(128)]
public string Title { get; set; }
[Required]
[MaxLength(256)]
public string Description { get; set; }
public virtual ICollection<Movie> Movies { get; set; }
}
Movie
public class Movie
{
[Key]
public Guid MovieId { get; set; }
[Required]
[MaxLength(128)]
public string Title { get; set; }
public virtual MovieSerie MovieSerie { get; set; }
}
I have removed some properties that were unused so far so the example is a bit more readable.
These entities have a one-to-many relationship because a MovieSerie contains multiple movies but a movie can only belong to one MovieSerie.
The problem
When I am trying to make a new movie from Postman by providing an EXISTING MovieSerie, I am getting an exception. The exception looks as below.
Duplicate entry '\xA9\xCE\x0E\x1E\x9A\xAE\xA2G\x91<\xE6\xE3-\x88C\xE9' for key 'movieseries.PRIMARY'
So I figured out that it is trying to make a new MovieSerie when I am providing a MovieSerie object. The raw JSON from the request that I am trying to send from Postman looks like below.
{
"MovieId" : "6aa8c134-689c-45e2-bf60-cd0eb5473cc2",
"Title" : "TestMovie",
"MovieSerie" : {
"movieSerieId": "1e0ecea9-ae9a-47a2-913c-e6e32d8843e9",
"title": "Harry Potter",
"description": "This contains the Harry Potter serie"
}
}
The POST method to save the movie is shown below.
[HttpPost]
public async Task<ActionResult<Movie>> PostMovie(Movie movie)
{
if (movie == null)
{
return BadRequest("No movie object provided");
}
else if (movie.MovieSerie != null)
{
if (!_validator.MovieSerieExists(movie.MovieSerie.MovieSerieId))
{
return BadRequest("The movie serie does not exists in the database");
}
}
_context.Movies.Add(movie);
await _context.SaveChangesAsync();
return CreatedAtAction("GetMovie", new { id = movie.MovieId }, movie);
}
Could someone give me any insight into what I am doing wrong? Why is it trying to make a new entity while it already exists? What should I change to get the wished behavior?
I tried to provide all information required, however, let me know if I missed something.
EDIT: ADDED DBCONTEXT
modelBuilder.Entity<MovieSerie>(entity =>
{
entity.HasKey(movieSerie => movieSerie.MovieSerieId);
entity.Property(movieSerie => movieSerie.Title).IsRequired();
entity.Property(movieSerie => movieSerie.Description).IsRequired();
entity.HasMany(ms => ms.Movies)
.WithOne(m => m.MovieSerie);
});
modelBuilder.Entity<Movie>(entity =>
{
entity.HasKey(movie => movie.MovieId);
entity.Property(movie => movie.Title).IsRequired();
entity.HasOne(m => m.MovieSerie)
.WithMany(s => s.Movies);
});
This is what happens when passing entities between server and client in ASP.NET. When your DbContext is lifetime-scoped to a request, the entities are loaded by a DbContext and passed to the view, but what you pass back on the Post call is a JSON object that is deserialized into an entity class definition. On this request, neither the Movie nor its associated related entities are tracked by the DbContext.
When you tell the Post's DbContext to Add the movie, any child entities on that movie will be treated as new entities as well, resulting in duplicate records.
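To see the effect in miniature (a sketch; the state check is only for illustration):
// Add marks every entity in the detached graph as Added, so the existing
// MovieSerie is queued for an INSERT and trips the duplicate key error.
_context.Movies.Add(movie);
var serieState = _context.Entry(movie.MovieSerie).State; // EntityState.Added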
How to avoid this:
Option 1: Use ViewModels to avoid confusing data coming from views with entities (data state). This is always my recommended option. It avoids confusion about which objects you are dealing with, and it also means you can reduce the amount of data being sent over the wire. As entities get larger, sending entities back and forth means larger payloads for fields your view doesn't need. ViewModels can be populated to serve just the fields the view will interact with. Automapper can help greatly with turning entity graphs into ViewModels with its ProjectTo method.
So if we had a view for creating a Movie (Movie/Create) and that view listed the movie series to choose from, it might search/fetch the series:
[Serializable]
public class MovieSeriesSummaryViewModel
{
public Guid MovieSeriesId { get; set; }
public string Name { get; set; }
}
Then when the controller goes to search/retrieve those series to choose from:
var series = _context.MovieSeries
// .Where(x => [search criteria...])
.ProjectTo<MovieSeriesSummaryViewModel>(config)
.ToList();
or
var series = _context.MovieSeries
    // .Where(x => [search criteria...])
    .Select(x => new MovieSeriesSummaryViewModel
    {
        MovieSeriesId = x.MovieSerieId,
        Name = x.Title
    }).ToList();
A PostMovie action then accepts a PostMovieViewModel:
[Serializable]
public class PostMovieViewModel
{
public string MovieName { get; set; }
public Guid? MovieSeriesId { get; set; }
// ...
}
The create movie view model only needs to pass the series ID (if applicable) and the required fields to create a new movie. From there we associate the series from the DbContext when creating our new Movie:
[HttpPost]
public async Task<ActionResult<PostMovieViewModel>> PostMovie(PostMovieViewModel movieVM)
{
    var movieSerie = movieVM.MovieSeriesId.HasValue
        ? _context.MovieSeries.Single(x => x.MovieSerieId == movieVM.MovieSeriesId.Value)
        : null;
    var movie = new Movie
    {
        Title = movieVM.MovieName,
        MovieSerie = movieSerie
    };
    _context.Movies.Add(movie);
    await _context.SaveChangesAsync();
    return CreatedAtAction("GetMovie", new { id = movie.MovieId }, movieVM);
}
The key point here is that we fetch the existing series from the Context to associate to the new movie. Fetching entities by ID is quite fast and serves as a meaningful validation that the data we passed in is complete.
Option 2: Re-associate all references. The underlying problem with passing deserialized objects and treating them as entities is that the DbContext isn't tracking them. There are two ways to fix this: either tell the DbContext to track them, or replace the references with tracked objects.
2a - Replacing references
[HttpPost]
public async Task<ActionResult<Movie>> PostMovie(Movie movie)
{
    if (movie.MovieSerie != null)
    {
        var existingMovieSerie = _context.MovieSeries
            .Single(x => x.MovieSerieId == movie.MovieSerie.MovieSerieId);
        movie.MovieSerie = existingMovieSerie; // Replace the reference.
    }
    _context.Movies.Add(movie);
    await _context.SaveChangesAsync();
    return CreatedAtAction("GetMovie", new { id = movie.MovieId }, movie);
}
This still potentially means going to the DB for all references, and forgetting to do so will result in silent duplication issues.
2b - Track related entities. This one I saved for last as it can seem simple, but can trip you up...
[HttpPost]
public async Task<ActionResult<Movie>> PostMovie(Movie movie)
{
    if (movie.MovieSerie != null)
        _context.Attach(movie.MovieSerie);
    _context.Movies.Add(movie);
    await _context.SaveChangesAsync();
    return CreatedAtAction("GetMovie", new { id = movie.MovieId }, movie);
}
That looks simple and would work most of the time, but if the DbContext is already tracking that movie series for any reason, the Attach method will fail. This error can appear intermittently at runtime depending on the particular actions/data combinations (i.e. updating two movies with the same series, or conditionally calling a method that loads that series). The proper check would be:
[HttpPost]
public async Task<ActionResult<Movie>> PostMovie(Movie movie)
{
    if (movie.MovieSerie != null)
    {
        var existingMovieSerie = _context.MovieSeries.Local
            .SingleOrDefault(x => x.MovieSerieId == movie.MovieSerie.MovieSerieId);
        if (existingMovieSerie == null)
            _context.Attach(movie.MovieSerie);
        else
            movie.MovieSerie = existingMovieSerie;
    }
    _context.Movies.Add(movie);
    await _context.SaveChangesAsync();
    return CreatedAtAction("GetMovie", new { id = movie.MovieId }, movie);
}
Checking MovieSeries.Local checks whether the DbContext is tracking the series (without hitting the DB). If not, we can attach it; if it is, we need to replace the reference. This can be a lot of boilerplate to put in for every reference on a new object. When attaching entities coming from a view, it is also important never to set the entity state to Modified without first verifying the data is valid (which would require loading the entity first anyway). Doing so could allow users to alter data in ways you don't intend, since setting an entity to Modified will update all fields on that entity, whereas loading the entity and copying values across means only the values you change will be updated.
Your problem is that you are passing the whole MovieSerie object. This is not something you should do. The idea of relational databases is, as the name suggests, to relate tables. These relationships are made using keys (foreign keys).
In your particular case, you need to define a foreign key column in your Movie table, to relate it to MovieSeries, as follows:
public class Movie
{
    [Key]
    public Guid MovieId { get; set; }
    public Guid MovieSerieId { get; set; }
    [Required]
    [MaxLength(128)]
    public string Title { get; set; }
    [ForeignKey("MovieSerieId")]
    public virtual MovieSerie MovieSerie { get; set; }
}
As you can see, I'm specifying that the MovieSerieId property is a foreign key (note it is a Guid, matching the key of MovieSerie). The virtual MovieSerie property is used by EF to load the details behind that foreign key.
Now you can create your movie by passing only the MovieSerieId, as follows:
{
"MovieId" : "6aa8c134-689c-45e2-bf60-cd0eb5473cc2",
"Title" : "TestMovie",
"MovieSerieId": "1e0ecea9-ae9a-47a2-913c-e6e32d8843e9"
}
I am using Entity Framework Core with the Npgsql PostgreSQL provider, and I'm working with .NET Core 3.
My question is: when I try to update a MyTableRelated element from MyTableClass and save the context to the database, no changes are detected.
For example, let's suppose we have the following classes:
public class MyTableClass
{
public int Id { get; set; }
[Column(TypeName = "jsonb")]
public virtual List<MyTableRelated> Data { get; set; }
}
public class MyTableRelated
{
public int Id { get; set; }
public string prop1 { get; set; }
public string prop2 { get; set; }
}
and some code like this (this is not actual code, it's just to convey the idea):
var context = dbContext;
var newMyTableClass = new MyTableClass
{
    Id = 1,
    Data = new List<MyTableRelated>()
};
var newMyTableRelated = new MyTableRelated
{
    Id = 1,
    prop1 = "",
    prop2 = ""
};
newMyTableClass.Data.Add(newMyTableRelated);
context.Add(newMyTableClass);
context.SaveChanges();
This works, and the entry is saved in the database.
Now somewhere in the application, I want to access that entry and change values in Data:
var context = dbContext;
var updateMyTableClass = context.MyTableClass.FirstOrDefault(x => x.Id == 1);
var tableRelated = updateMyTableClass.Data.FirstOrDefault(y => y.Id == 1);
tableRelated.prop1 = "prop1";
tableRelated.prop2 = "prop2";
context.SaveChanges();
I would assume this changes the values in the database, like it does for other types of properties, but nothing happens.
A solution I found was this:
var entry = context.Entry(updateMyTableClass);
if (entry.State == EntityState.Unchanged)
{
entry.State = EntityState.Modified;
}
This is more of a temporary workaround for that case.
How can we make EF automatically detect changes on jsonb properties?
Someone pointed out to me that I should look at coarse-grained locking:
https://www.martinfowler.com/eaaCatalog/coarseGrainedLock.html
How can something like that be implemented?
Automatic change detection would mean that EF Core would take a snapshot of the JSON document when it loads the property (duplicating the entire tree), and then do a complete structural comparison of the original and current tree whenever SaveChanges is called. As this can be very heavy perf-wise, it is not done by default.
However, if you wish to do so, you can create a value comparer to implement precisely this - see the EF docs on how to do that. I've opened an issue on the Npgsql provider repo in case someone wishes to contribute this.
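For illustration, a serialization-based comparer is one simple strategy; a minimal sketch (assuming System.Text.Json and configuration inside OnModelCreating; this is not the provider's implementation):
using System.Text.Json;
using Microsoft.EntityFrameworkCore.ChangeTracking;

// Sketch: snapshot and compare the jsonb list by its serialized form so that
// in-place mutations are detected. This trades performance for convenience.
var comparer = new ValueComparer<List<MyTableRelated>>(
    (a, b) => JsonSerializer.Serialize(a, (JsonSerializerOptions)null)
           == JsonSerializer.Serialize(b, (JsonSerializerOptions)null),
    v => v == null ? 0 : JsonSerializer.Serialize(v, (JsonSerializerOptions)null).GetHashCode(),
    v => JsonSerializer.Deserialize<List<MyTableRelated>>(
             JsonSerializer.Serialize(v, (JsonSerializerOptions)null),
             (JsonSerializerOptions)null));

modelBuilder.Entity<MyTableClass>()
    .Property(e => e.Data)
    .Metadata.SetValueComparer(comparer);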
For perf reasons, I'd recommend manually flagging properties when they change, similar to what you have done. Note that you're marking the entire entity instance as changed - so all properties will be saved. You can use the following to only mark the JSON property:
ctx.Entry(entry).Property(e => e.SomeJsonProperty).IsModified = true;
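Applied to the example above (using the question's own entities), that looks like:
var updateMyTableClass = context.MyTableClass.FirstOrDefault(x => x.Id == 1);
var tableRelated = updateMyTableClass.Data.FirstOrDefault(y => y.Id == 1);
tableRelated.prop1 = "prop1";

// Flag only the jsonb column as changed rather than the whole entity:
context.Entry(updateMyTableClass).Property(e => e.Data).IsModified = true;
context.SaveChanges();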
I have been struggling for a while now to understand how EF loads/updates entities.
First of all, I want to explain what my app (WPF) is about. I am developing an application where users can store todo items in categories; these categories are predefined by the application. Each user can read all items but can only delete/update his own items. It's a multi-user system, meaning the application runs multiple times on the network, accessing the same SQL Server database.
When a user adds/deletes/updates items, the UI of all the other running apps has to update.
My model looks like this:
public class Category
{
public int Id { get; set; }
public string Name { get; set; }
public List<Todo> Todos { get; set; }
}
public class Todo
{
public int Id { get; set; }
public string Content { get; set; }
public DateTime LastUpdate { get; set; }
public string Owner { get; set; }
public Category Category { get; set; }
public List<Info> Infos { get; set; }
}
public class Info
{
public int Id { get; set; }
public string Value { get; set; }
public Todo Todo { get; set; }
}
I do the initial load like this, which works fine:
Context.dbsCategories.Where(c => c.Id == id).Include(c => c.Todos.Select(t => t.Infos)).FirstOrDefault();
Now I was trying to load only the Todos which belong to the current user, so I tried this:
Context.dbsCategories.Where(c => c.Id == id).Include(c => c.Todos.Where(t => t.Owner == Settings.User).Select(t => t.Infos)).FirstOrDefault();
This does not work because it's not possible to filter within Include, so I tried this:
var cat = Context.dbsCategories.Where(c => c.Id == id).FirstOrDefault();
Context.dbsTodos.Where(t => t.Category.Id == cat.Id && t.Owner == Settings.User).Include(t=>t.Infos);
After executing the second line, where I query the Todo items, those items were automatically added to cat's Todos collection. Why? I would have expected to have to add them to cat's Todos collection manually.
Just for my understanding, what exactly is EF doing here?
Now to my main problem: the synchronization of the data between database and client. I am using a long-running context, which lives as long as the application runs, to save changes to the database that are made on owned items. The user has no possibility to manipulate/delete data from other users; this is guaranteed by the user interface.
To synchronize the data I built this sync method, which will run every 10 seconds (right now it's triggered manually).
This is my synchronization code, which only synchronizes items to the client that do not belong to it:
private async Task Synchronize()
{
using (var ctx = new Context())
{
var database = ctx.dbsTodos().Where(x => x.Owner != Settings.User).Select(t => t.Infos).AsNoTracking();
var loaded = Context.dbsTodos.Local.Where(x => x.Owner != Settings.User);
//In local context but not in database anymore -> Detachen
foreach (var detach in loaded.Except(database, new TodoIdComparer()).ToList())
{
Context.ObjectContext.Detach(detach);
Log.Debug(this, $"Item {detach} detached");
}
//In database and local context -> Check Timestamp -> Update
foreach (var update in loaded.Intersect(database, new TodoIdTimeStampComparer()))
{
await Context.Entry(update).ReloadAsync();
Log.Debug(this, $"Item {update} updated");
}
//In database but not in local context -> Attach
foreach (var attach in database.ToList().Except(loaded, new TodoIdComparer()))
{
Context.dbsTodos().Attach(attach);
Log.Debug(this, $"Item {attach} attached");
}
}
}
I am having the following problems/issues of unknown origin with it:
Detaching deleted items seems to work; right now I am not sure whether only the Todo items are detached or also the Infos.
Updating items works only for the Todo item itself; it does not reload the Infos within. How can I reload the whole entity with all its relations?
Attaching new items and Infos does not work so far. What am I doing wrong here?
Is this the right approach to synchronize data between client and database? Is there any "How to Sync" tutorial? I have not found anything helpful so far.
I am thankful for every help on this, even if you are saying that everything I am doing here is wrong. Thanks!
My, you do like to deviate from Entity Framework code-first conventions, don't you?
(1) Incorrect class definitions
The relations between your tables are Lists instead of ICollections, they are not declared virtual, and you forgot to declare the foreign keys.
There is a one-to-many relation between Todo and Category: every Todo belongs to exactly one Category (using a foreign key), every Category has zero or more Todos.
You choose to give Category a property:
List<Todo> Todos {get; set;}
Are you sure that category.Todos[4] has a defined meaning?
What would category.Todos.Insert(4, new Todo()) mean?
Better stick to an interface where you can't use functions that have no proper meaning in your database: use ICollection<Todo> Todos { get; set; }. This way you'll only have access to functions that Entity Framework can translate to SQL.
Besides, a query will probably be faster: you give entity framework the possibility to query the data in its most efficient way, instead of forcing it to put the result into a List.
In entity framework the columns of a table are represented by non-virtual properties; the virtual properties represent the relations between the tables (one-to-many, many-to-many)
public class Category
{
public int Id { get; set; }
public string Name { get; set; }
... // other properties
// every Category has zero or more Todos (one-to-many)
public virtual ICollection<Todo> Todos { get; set; }
}
public class Todo
{
public int Id { get; set; }
public string Content { get; set; }
... // other properties
// every Todo belongs to exactly one Category, using foreign key
public int CategoryId { get; set; }
public virtual Category Category { get; set; }
// every Todo has zero or more Infos:
public virtual ICollection<Info> Infos { get; set; }
}
You can probably guess Info by now:
public class Info
{
public int Id { get; set; }
public string Value { get; set; }
... // other properties
// every info belongs to exactly one Todo, using foreign key
public int TodoId {get; set;}
public virtual Todo Todo { get; set; }
}
Three major improvements:
ICollections instead of Lists;
the collections are declared virtual, because they are not real columns in your tables;
the foreign key definitions are non-virtual, because they are real columns in your tables.
(2) Use Select instead of Include
One of the slower parts of a database query is the transport of the selected data from the Database Management System to your local process. Hence it is wise to limit the amount of transported data.
Suppose Category with Id [4] has a thousand Todos. Every Todo of this Category will have a foreign key with a value 4. So this same value 4 will be transported 1001 times. What a waste of processing power!
In Entity Framework, use Select instead of Include to query data, and select only the properties you actually plan to use. Only use Include if you plan to update the selected data.
Give me all Categories that ... with their Todos that ...
var results = dbContext.Categories
.Where(category => ...)
.Select(category => new
{
// only select properties that you plan to use
Id = category.Id,
Name = category.Name,
...
Todos = category.Todos
.Where(todo => ...) // only if you don't want all Todos
.Select(todo => new
{
// again, select only the properties you'll plan to use
Id = todo.Id,
...
// not needed, you know the value:
// CategoryId = todo.CategoryId,
// only if you also want some infos:
Infos = todo.Infos
.Select(info => ....) // you know the drill by now
.ToList(),
})
.ToList(),
});
(3) Don't keep DbContext alive for such a long time!
Another problem is that you keep your DbContext open for quite some time. This is not how a DbContext is meant to be used. If your database changes between your query and your update, you'll run into trouble. I can hardly imagine that you query so much data that you need to optimize by keeping your DbContext alive. Even if you query a lot of data, displaying this huge amount of data would be the bottleneck, not the database query.
Better fetch the data once, dispose the DbContext, and when updating fetch the data again, update the changed properties and SaveChanges.
fetch data:
RepositoryCategory FetchCategory(int categoryId)
{
using (var dbContext = new MyDbContext())
{
return dbContext.Categories.Where(category => category.Id == categoryId)
.Select(category => new RepositoryCategory
{
... // see above
})
.FirstOrDefault();
}
}
Yes, you'll need an extra class RepositoryCategory for this. The advantage is, that you hide that you fetched your data from a database. Your code would hardly change if you'd fetch your data from a CSV-file, or from the internet. This is way better testable, and also way better maintainable: if the Category table in your database changes, users of your RepositoryCategory won't notice it.
Consider creating a special namespace for the data you fetch from your database. This way you can name the fetched Category still Category, instead of RepositoryCategory. You even hide better where you fetched your data from.
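For the update side of that pattern, a short-lived context sketch could look like this (the method and property names are illustrative):
void UpdateCategoryName(int categoryId, string newName)
{
    using (var dbContext = new MyDbContext())
    {
        // Fetch the current state, change only what the user edited, then save.
        var category = dbContext.Categories.Find(categoryId);
        if (category == null) return;

        category.Name = newName;
        dbContext.SaveChanges();
    }
}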
Back to your question
You wrote:
Now I was trying to load only the Todos which belong to the current user
After the previous improvements, this will be easy:
string owner = Settings.User; // or something similar
var result = dbContext.Todos
    .Where(todo => todo.Owner == owner)
    .Select(todo => new
    {
        // properties you need
    })
    .ToList();
This is my first time using Entity Framework 6.1 (code first). I keep running into a problem where my navigation properties are null when I don't expect them to be. I've enabled lazy loading.
My entity looks like this:
public class Ask
{
public Ask()
{
this.quantity = -1;
this.price = -1;
}
public int id { get; set; }
public int quantity { get; set; }
public float price { get; set; }
public int sellerId { get; set; }
public virtual User seller { get; set; }
public int itemId { get; set; }
public virtual Item item { get; set; }
}
It has the following mapper:
class AskMapper : EntityTypeConfiguration<Ask>
{
public AskMapper()
{
this.ToTable("Asks");
this.HasKey(a => a.id);
this.Property(a => a.id).HasDatabaseGeneratedOption(DatabaseGeneratedOption.Identity);
this.Property(a => a.id).IsRequired();
this.Property(a => a.quantity).IsRequired();
this.Property(a => a.price).IsRequired();
this.Property(a => a.sellerId).IsRequired();
this.HasRequired(a => a.seller).WithMany(u => u.asks).HasForeignKey(a => a.sellerId).WillCascadeOnDelete(true);
this.Property(a => a.itemId).IsRequired();
this.HasRequired(a => a.item).WithMany(i => i.asks).HasForeignKey(a => a.itemId).WillCascadeOnDelete(true);
}
}
Specifically, the problem is that I have an Ask object with a correctly set itemId (which does correspond to an Item in the database), but the navigation property item is null, and as a result I end up getting a NullReferenceException. The exception is thrown in the code below, when I try to access a.item.name:
List<Ask> asks = repo.GetAsksBySeller(userId).ToList();
List<ReducedAsk> reducedAsks = new List<ReducedAsk>();
foreach (Ask a in asks)
{
    ReducedAsk r = new ReducedAsk()
    {
        id = a.id,
        sellerName = a.seller.username,
        itemId = a.itemId,
        itemName = a.item.name,
        price = a.price,
        quantity = a.quantity
    };
    reducedAsks.Add(r);
}
Confusingly, the seller navigation property is working fine there, and I can't find anything I've done differently in the 'User' entity, nor in its mapper.
I have a test which recreates this, but it passes without any problems:
public void canGetAsk()
{
int quantity = 2;
int price = 10;
//add a seller
User seller = new User() { username = "ted" };
Assert.IsNotNull(seller);
int sellerId = repo.InsertUser(seller);
Assert.AreNotEqual(-1, sellerId);
//add an item
Item item = new Item() { name = "fanta" };
Assert.IsNotNull(item);
int itemId = repo.InsertItem(item);
Assert.AreNotEqual(-1, itemId);
bool success = repo.AddInventory(sellerId, itemId, quantity);
Assert.AreNotEqual(-1, success);
//add an ask
int askId = repo.InsertAsk(new Ask() { sellerId = sellerId, itemId = itemId, quantity = quantity, price = price });
Assert.AreNotEqual(-1, askId);
//retrieve the ask
Ask ask = repo.GetAsk(askId);
Assert.IsNotNull(ask);
//check the ask info
Assert.AreEqual(quantity, ask.quantity);
Assert.AreEqual(price, ask.price);
Assert.AreEqual(sellerId, ask.sellerId);
Assert.AreEqual(sellerId, ask.seller.id);
Assert.AreEqual(itemId, ask.itemId);
Assert.AreEqual(itemId, ask.item.id);
Assert.AreEqual("fanta", ask.item.name);
}
Any help would be extremely appreciated; this has been driving me crazy for days.
EDIT:
The database is SQL Server 2014.
At the moment, I have one shared context, instantiated at the level above this (my repository layer for the DB). Should I be instantiating a new context for each method, or instantiating one at the lowest possible level (i.e. for every DB access)? For example:
public IQueryable<Ask> GetAsksBySeller(int sellerId)
{
using (MarketContext _ctx = new MarketContext())
{
return _ctx.Asks.Where(s => s.seller.id == sellerId).AsQueryable();
}
}
Some of my methods invoke others in the repo layer. Would it be better for each method to take a context, which it can then pass to any methods it calls?
public IQueryable<Transaction> GetTransactionsByUser(MarketContext _ctx, int userId)
{
IQueryable<Transaction> buyTransactions = GetTransactionsByBuyer(_ctx, userId);
IQueryable<Transaction> sellTransactions = GetTransactionsBySeller(_ctx, userId);
return buyTransactions.Concat(sellTransactions);
}
Then I could just instantiate a new context whenever I call anything from the repo layer: repo.GetTransactionsByUser(new MarketContext(), userId);
Again, thanks for the help. I'm new to this, and don't know which approach would be best.
Try adding Include calls to your repository query:
public IQueryable<Ask> GetAsksBySeller(int sellerId)
{
using (MarketContext _ctx = new MarketContext())
{
return _ctx.Asks
.Include("seller")
.Include("item")
.Where(s => s.seller.id == sellerId).AsQueryable();
}
}
Also, there is an extension method Include which accepts a lambda expression as a parameter and gives you type checking at compile time:
http://msdn.microsoft.com/en-us/data/jj574232.aspx
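With that overload, the repository query might look like the sketch below. One caveat I'm adding here: because the question's repository disposes the context inside a using block, the query should be materialized before returning, or the deferred query will fail when enumerated later.
using System.Data.Entity; // brings the lambda-based Include overload into scope (EF6)

public IQueryable<Ask> GetAsksBySeller(int sellerId)
{
    using (MarketContext _ctx = new MarketContext())
    {
        return _ctx.Asks
            .Include(a => a.seller)
            .Include(a => a.item)
            .Where(a => a.seller.id == sellerId)
            .ToList()       // materialize before the context is disposed
            .AsQueryable();
    }
}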
As for the context lifespan, your repositories should share one context per request if this is a web application. Else it's a bit more arbitrary, but it should be something like a context per use case or service call.
So the pattern would be: create a context, pass it to the repositories involved in the call, do the task, and dispose the context. The context can be seen as your unit of work, so no matter how many repositories are involved, in the end one SaveChanges() should normally be enough to commit all changes.
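A sketch of that flow, assuming repository classes that accept the context (the class names are illustrative, not from the original code):
using (var ctx = new MarketContext())
{
    var askRepo = new AskRepository(ctx);
    var transactionRepo = new TransactionRepository(ctx);

    var asks = askRepo.GetAsksBySeller(userId).ToList();
    var transactions = transactionRepo.GetTransactionsByUser(userId).ToList();

    // ... apply changes to the loaded entities ...

    ctx.SaveChanges(); // one unit of work commits all changes together
}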
I can't tell whether this will solve the lazy loading issue, because from what I see I can't explain why it doesn't occur. And although I'd want to get to the bottom of it if I were in your shoes, lazy loading is something that should not be relied on too much. Take a look at your (abridged) code:
foreach (Ask a in asks)
{
    ReducedAsk r = new ReducedAsk()
    {
        sellerName = a.seller.username,
        itemName = a.item.name
    };
}
If lazy loading worked as expected, this would execute two queries against the database for each iteration of the loop. Of course, that's highly inefficient. That's why using Include (as in Anton's answer) is better anyhow, not only to circumvent your issue.
A further optimization is to do the projection (i.e. the new {) in the query itself:
var reducedAsks = repo.GetAsksBySeller(userId)
.Select(a => new ReducedAsk() { ... })
.ToList();
(Assuming, and requiring, that repo.GetAsksBySeller returns IQueryable.)
Now only the data necessary to create ReducedAsk will be fetched from the database, and it prevents materialization of entities you're not using anyway, as well as relatively expensive processes such as change tracking and relationship fixup.
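Spelled out with the ReducedAsk fields used earlier in the question, that projection could look like this:
// EF translates this into a single SELECT of only these columns; no Ask,
// User, or Item entities are materialized or change-tracked.
var reducedAsks = repo.GetAsksBySeller(userId)
    .Select(a => new ReducedAsk
    {
        id = a.id,
        sellerName = a.seller.username,
        itemId = a.itemId,
        itemName = a.item.name,
        price = a.price,
        quantity = a.quantity
    })
    .ToList();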