Self referencing loop detected when serializing objects in ASP.NET Web API - C#

I am using MVC 5 Web API with Entity Framework 6 (database first). While serializing my objects to JSON I ran into problems with self-referencing loops. I googled the problem and found several solutions, so I'm wondering which approach is best.
I found:
use [JsonIgnore], but I would need to re-add it every time I update the model from the DB
remove virtual from the collection
create a new layer of Data Transfer Objects (DTOs)
use JsonSerializerSettings (that didn't work for me since it generates "$id" and "$ref"; see the note below)
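For reference, a minimal sketch of the serializer-settings route, assuming ASP.NET Web API with Json.NET: it is PreserveReferencesHandling that emits "$id"/"$ref", while ReferenceLoopHandling.Ignore simply drops the back-reference that causes the loop. The class and method placement below follow the standard Web API template.

```csharp
using System.Web.Http;
using Newtonsoft.Json;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Drop the property that closes the cycle instead of throwing,
        // and without emitting "$id"/"$ref" markers in the JSON.
        config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
            ReferenceLoopHandling.Ignore;
    }
}
```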

I guess you know that the answer is "it depends", right? In my experience, however, more often than not we end up with a DTO layer. It is useful for solving a myriad of problems of this kind, while the other solutions only address this particular instance. On other occasions we have had to flatten the object (Employee.CompanyName instead of Employee.Company.Name) and deal with similar issues. The downside is that you cannot directly expose IQueryable from your API, although we did write something to translate expression trees. So basically it comes down to whether you care about exposing IQueryable directly from the service.
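To illustrate the flattening mentioned above, here is a minimal sketch with hypothetical Employee/Company/EmployeeDto types; the mapping is written by hand, but a tool such as AutoMapper can do the same by convention.

```csharp
using System.Collections.Generic;

public class Company
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Employee> Employees { get; set; }  // back-reference that causes the loop
}

public class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual Company Company { get; set; }                  // navigation property
}

public class EmployeeDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string CompanyName { get; set; }                       // flattened: Employee.Company.Name
}

public static class EmployeeMapper
{
    // The DTO has no cycles, so it serializes without any special settings.
    public static EmployeeDto ToDto(Employee e)
    {
        return new EmployeeDto
        {
            Id = e.Id,
            Name = e.Name,
            CompanyName = e.Company == null ? null : e.Company.Name
        };
    }
}
```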

Related

How to save changes with Breeze without EF?

I'm working with repositories, and one thing I'm working hard on is keeping things as decoupled as possible. That way, if tomorrow we change from relational databases to something else, like NoSQL, we are good to go: we just have to change our DAL.
I've been trying to find out how to implement the SaveChanges method in my Web API controller without using the EFContextProvider. I then found the Breeze NoDb sample, but that sample uses the Breeze ContextProvider in the repository. This troubles me because Breeze is a JavaScript library, so it belongs to the presentation side of my application. Having the repository use a component from Breeze would couple the DAL to the presentation, which is something I don't want to do.
Searching again for how to implement SaveChanges without EF, I found this question, which has a very good answer explaining how to convert the SaveBundle to a SaveMap and then use that to implement the saving logic. However, I'm stuck with this approach because each SaveMap entry gives me only a Type and the EntityInfo, so I don't see how to use it with my repositories.
So, how do I implement SaveChanges without referring to the EFContextProvider and without coupling the repositories to the ContextProvider?
Are you planning to switch from SQL Server to a NoSQL database? If so, why not do it now? How often are you going to switch backing storage? Probably not often, if ever.
I have found that a database switch, especially from SQL to NoSQL, is a big shift in paradigm. In one of my applications I went through a conversion from SQL to RavenDB. Despite having everything decoupled and using repositories everywhere, I still had to rewrite most of the application's storage logic.
What you are trying to do - you are not going to need it. So stop making life hard for yourself and get on with implementing features.
The ContextProvider does the work of converting the JObject (which Json.NET gives you in the SaveChanges method) into real, typed .NET objects. The EntityInfo object that the ContextProvider creates for each entity contains the entity itself, as well as the entityAspect properties it received from the client: the EntityState (Added, Modified, or Deleted), the original values of any changed properties, and the temporary values of any auto-generated keys. This is the information you would need to save the entities yourself. The "SaveMap" just organizes them by Type for convenience, but you can manipulate them however you like.
As described in the post you referenced, you could proceed by using a ContextProvider just to convert the JObject to entities, then pass those entities to the appropriate repositories. Your repositories don't need to know anything about the ContextProvider.
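As a rough sketch of that split, the dispatcher below routes the SaveMap produced by a ContextProvider to plain repositories. It assumes the Breeze.ContextProvider types (EntityInfo with Entity and EntityState, the SaveMap as Dictionary<Type, List<EntityInfo>>) and a hypothetical IRepository<T> abstraction; exact names may differ across Breeze versions.

```csharp
using System;
using System.Collections.Generic;
using Breeze.ContextProvider;   // EntityInfo, EntityState (assumed package/namespace)

public interface IRepository<T>
{
    void Add(T entity);
    void Update(T entity);
    void Delete(T entity);
}

public class SaveMapDispatcher
{
    private readonly Dictionary<Type, Action<EntityInfo>> _handlers =
        new Dictionary<Type, Action<EntityInfo>>();

    // Register a repository per entity type, e.g. dispatcher.Register(orderRepository);
    public void Register<T>(IRepository<T> repository)
    {
        _handlers[typeof(T)] = info =>
        {
            var entity = (T)info.Entity;
            switch (info.EntityState)
            {
                case EntityState.Added:    repository.Add(entity);    break;
                case EntityState.Modified: repository.Update(entity); break;
                case EntityState.Deleted:  repository.Delete(entity); break;
            }
        };
    }

    // The saveMap comes from the ContextProvider, which has already turned the
    // incoming JObject into typed entities; the repositories only see plain entities.
    public void Dispatch(Dictionary<Type, List<EntityInfo>> saveMap)
    {
        foreach (var pair in saveMap)
        {
            Action<EntityInfo> handler;
            if (!_handlers.TryGetValue(pair.Key, out handler))
                throw new InvalidOperationException("No repository registered for " + pair.Key.Name);

            foreach (var entityInfo in pair.Value)
                handler(entityInfo);
        }
    }
}
```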
Breeze offers an NHibernate provider you can look at that shows how to talk to a non-EF back end that is still a .NET server. The ContextProvider is a convenience that makes implementing any .NET provider substantially easier, but it is by no means a requirement.
As for NoSQL, take a look at the Breeze Node provider and MongoDB sample, which is hosted in Node.js (and which shows that the ContextProvider is clearly not a requirement).
We also expect to have a Breeze server implementation written in Java in the near future, which again has no "ContextProvider" requirement.

WCF, SOAP, EF, POCO

We are developing an application that should use Entity Framework, and it is a simple WCF SOAP service (not a WCF Data Service). I am quite confused; I have read the following posts but I still don't know which way to go. This question is almost the same as mine, but I have a constraint to use POCOs and to try to avoid DTOs. It's not that big a service. However, the answer to the question I linked says that if I try to send POCO classes over the wire, there will be serialization problems.
This post implements a solution related to my problem, but it doesn't mention anything about the serialization problem; it just sets ProxyCreationEnabled = false, which I have found in many other articles as well.
These posts are also a little old, so what is the recommendation today? I also have to post and get a lot of Word/Excel/PDF/text files, so will it be OK to send POCO classes, or will there be serialization problems?
Thanks!
I definitely do not agree with that answer. It suggests reinventing the wheel, and it does not even explain why POCOs should be avoided.
You can definitely go with POCOs; I see no reason to expect serialization issues. If you do hit any, you can write DTOs for those specific problematic parts and map them to POCOs in the business layer.
It is good practice to use POCOs, as the name itself suggests: Plain Old CLR Objects. Writing the same classes again instead of generating them has no advantage. You can simply test it.
UPDATE:
Lazy loading: lazy loading means fetching related objects from the database whenever they are accessed. If you have already serialized and deserialized an entity (e.g. you have sent it to the client over the wire), lazy loading will not work, because there is no proxy on the client side.
Proxy: a proxy class simply enables the entity to communicate with the DB (a very loose definition, by the way). Using a proxy instance on the client side is not possible and does not make sense. Just separate the proxy classes and the POCO entities into two different DLLs, share only the POCO objects with the client, and use the proxies on the service side.
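A minimal sketch of that setup, assuming an EF6 DbContext and illustrative Order/Customer types: with proxy creation and lazy loading turned off, the context materializes plain POCOs that serialize cleanly over WCF, and related data is loaded explicitly with Include.

```csharp
using System.Data.Entity;   // DbContext, Include
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public virtual Customer Customer { get; set; }
}

public class OrdersContext : DbContext
{
    public OrdersContext()
    {
        // No dynamic proxies: what the service returns is the POCO itself.
        Configuration.ProxyCreationEnabled = false;
        Configuration.LazyLoadingEnabled = false;
    }

    public DbSet<Order> Orders { get; set; }
    public DbSet<Customer> Customers { get; set; }
}

public class OrderService
{
    public Order GetOrder(int id)
    {
        using (var db = new OrdersContext())
        {
            // Lazy loading is off, so eagerly load what the client needs.
            return db.Orders
                     .Include(o => o.Customer)
                     .SingleOrDefault(o => o.Id == id);
        }
    }
}
```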

NHibernate DTO with deep object graph

I am writing a smart client WPF application using MVVM that communicates with a WCF service layer containing the business logic and domain objects that uses NHibernate to manage persistence. We are in control of both sides of the wire.
Currently I am working on a screen to edit product details. It has a tab control, with each tab representing some aspect of the Product such as Main Details, Product Class, Container Type and so on. In the end there will probably be at least five of these tabs.
Up to now I have been transforming simple domain objects into DTOs using SetResultTransformer, and this has been working quite nicely.
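For context, this is roughly what that looks like; the Product/ProductDto types and the HQL are illustrative. With Transformers.AliasToBean, the query aliases have to match the DTO's property names.

```csharp
using System.Collections.Generic;
using NHibernate;
using NHibernate.Transform;

public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ProductQueries
{
    public static IList<ProductDto> GetProductSummaries(ISession session)
    {
        // Project straight into the DTO; aliases must match ProductDto's properties.
        return session
            .CreateQuery("select p.Id as Id, p.Name as Name from Product p")
            .SetResultTransformer(Transformers.AliasToBean<ProductDto>())
            .List<ProductDto>();
    }
}
```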
Now that I am getting to a more complicated object I am a bit stuck. I would like to return a DTO for display that contains the main product details, the categories and the classes. As far as categories and classes are concerned, I would not want to return every single property of the domain object.
Questions:
1) How do people go about creating a DTO where there are several one-to-many collections to return, as in this example?
2) Are there any concerns about the DTO becoming too large?
3) When sending the DTO back to the back end, is it better to send the same type of DTO with the updated values, or some other, more command-oriented DTO?
Thanks for any help
Alex
We are currently using pretty big DTOs and it is working quite well. NHibernate does a lot of lazy loading, which helps with big objects.
We use bags for one-to-many relations; they are lazy loaded and work pretty well.
Depending on the type of application, lazy loading can be a bit of a problem. We had some issues with big DTOs in our rich-client application, but with some planning and a sound architecture it works pretty well.
I don't know whether large DTOs are really a problem with NHibernate, but so far we have not had any problems.
We send the whole object back and forth and it works well. NHibernate updates just the changed fields, which is really nice.
I wouldn't serialize the NHibernate objects over web services or anything like that (I don't know the WCF service layer or how it communicates with your application). If I am transferring data through web services, I generate new data objects, fill them accordingly, transfer them back and forth, and update the NHibernate objects from those.
Have you tried AutoMapper? I do all my DTO mappings with AutoMapper and it works like a charm.
Have a look at AutoMapper. I'm sure you'll like it.
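A minimal sketch of what that looks like, assuming AutoMapper 4.2 or later (MapperConfiguration/IMapper; older versions use the static Mapper.CreateMap API) and hypothetical Product/ProductDto types:

```csharp
using System.Collections.Generic;
using System.Linq;
using AutoMapper;

public class Category { public string Name { get; set; } }

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public IList<Category> Categories { get; set; }
}

public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public IList<string> CategoryNames { get; set; }
}

public static class MappingConfig
{
    public static IMapper BuildMapper()
    {
        var config = new MapperConfiguration(cfg =>
        {
            // Id and Name map by convention; the collection is flattened explicitly.
            cfg.CreateMap<Product, ProductDto>()
               .ForMember(d => d.CategoryNames,
                          opt => opt.MapFrom(src => src.Categories.Select(c => c.Name)));
        });
        return config.CreateMapper();
    }
}

// Usage: var dto = MappingConfig.BuildMapper().Map<ProductDto>(product);
```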

Intercepting NHibernate Lazy-Load behaviour to return null if not connected to a session?

This seems like it should be an obvious thing but I've been searching for the answer for hours now with no success.
I'm using NHibernate to persist a domain model, with a service layer that serves an ASP.NET MVC front end (the 'service layer' is currently just a standard class library, but it may be converted to WCF in the future). The web app asks for the data it wants and specifies which collections on the domain object it needs; the service layer takes the request, loads the object and the required collections (using lazy loading), and passes the object back, where AutoMapper transforms it into a viewmodel-friendly representation.
What I want to be able to do is load the required collections, detach the object from the session and pass it to the front end. However, when AutoMapper tries to map the object, this causes an exception because it is accessing collections that haven't been initialized and the session is no longer available. I can leave the object connected, but in that case the AutoMapper transformation ends up lazy-loading all the properties on the object anyway, and this won't be an option if we go down the WCF route.
What I want to do is alter this behaviour so that, instead of throwing an exception, the collection returns null (or, better yet, empty) when it is not connected to a session. This was the default behaviour in Entity Framework v1 (which admittedly didn't do automatic lazy loading), which I worked with previously, but I can't find a way to do it in NHibernate.
Any ideas? Am I on the wrong track here?
EDIT: to be a bit clearer about what I'm trying to achieve, when accessing a collection property I want this behaviour:
Connected to a session: lazy-load the collection as normal.
No session: the property is null (rather than throwing an exception).
UPDATE: following this post by Billy McCafferty, I've managed to implement a solution using IUserCollectionType that seems to work so far. Rather than use the provided PersistentGenericBag as he does, I had to create new types that change the behaviour when not connected to a session. It's not perfect and requires some very ugly mappings, but at least I don't need to touch my domain objects or client mappings to get it working.
The most appropriate solution in this case is probably to have AutoMapper check lazy-loadable members with NHibernateUtil.IsInitialized() and map them only if they were actually loaded. I'm not sure how (or whether) you can make AutoMapper apply this check to all implicit property mappings, though.
Old question, but this is what we did to solve the same issue; hopefully it helps set you on the correct path if you stumble upon this problem.
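For what it's worth, here is a rough sketch of the "map only what was loaded" idea, using AutoMapper's PreCondition hook together with NHibernateUtil.IsInitialized; the Order/OrderDto types are hypothetical. Collections that were never initialized are simply left null on the DTO instead of triggering a lazy load after the session is gone.

```csharp
using System.Collections.Generic;
using AutoMapper;
using NHibernate;   // NHibernateUtil

public class OrderLine { public string ProductName { get; set; } }

public class Order
{
    public int Id { get; set; }
    public virtual IList<OrderLine> Lines { get; set; }
}

public class OrderLineDto { public string ProductName { get; set; } }

public class OrderDto
{
    public int Id { get; set; }
    public IList<OrderLineDto> Lines { get; set; }
}

public static class DetachedMappingConfig
{
    public static IMapper BuildMapper()
    {
        var config = new MapperConfiguration(cfg =>
        {
            cfg.CreateMap<OrderLine, OrderLineDto>();
            cfg.CreateMap<Order, OrderDto>()
               // Skip the member entirely when the collection was never initialized,
               // so no lazy load is attempted on a detached object.
               .ForMember(d => d.Lines,
                          opt => opt.PreCondition(src => NHibernateUtil.IsInitialized(src.Lines)));
        });
        return config.CreateMapper();
    }
}
```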

DTOs vs Serializing Persisted Entities

I'm curious to know how the community feels on this subject. I recently ran into this question in an NHibernate/WCF scenario (entities persisted at the service layer) and realized I may be going in the wrong direction.
My question is, plainly: when using a persistent object graph (NHibernate, LINQ to SQL, etc.) behind a web service (WCF in this scenario), do you prefer to send those entities over the wire, or would you create a set of lighter DTOs (sans cyclic references) to send across?
DTOs. Use AutoMapper for object-to-object mapping
I've been in this scenario multiple times and can speak from experience on both sides. Originally I just serialized my entities and sent them as-is. This worked fine from a functional standpoint, but the more I looked into it, the more I realized that I was sending more data than I needed to and was losing the ability to vary the implementation on either side. In subsequent service applications I've taken to creating DTOs whose only purpose is to get data to and from the web service.
Outside of any interop concerns, having to think about all the fields being sent over the wire is very helpful (to me) for making sure I'm not sending data that isn't needed or, worse, should not get down to the client.
As others have mentioned, AutoMapper is a great tool for entity to DTO mapping.
I've almost always created DTOs to transfer over the wire and used richer entities on my server and client. On the client they carry some common presentation logic, while on the server they carry business logic. Mapping between the DTOs and the entities can be dumb, but it needs to happen; tools like AutoMapper help you.
If you're asking "do I send serialized entities from a web service to the outside world?", then the answer is definitely no; you'll get minimal interoperability if you do that. DTOs help solve this problem by defining a set of 'objects' that can be instantiated in any language, whether you're using C#, Java, JavaScript or anything else.
I've always had problems sending NHibernate objects over the wire, particularly if you're using an ActiveRecord model and/or if your object has ties to the session (yuck). Another nasty result is that NHibernate may try to load the object at the entry of the method (before you can get to it), which can also cause problems.
So... getting the message here? Problems, problems, problems... DTOs all the way.
