We have an old application developed with ASP.NET Web Forms and ASMX services. The original developers created a class library with data objects and referenced it from both the service layer and the client layer, and to avoid data conversion when passing data between service and client they modified the generated service proxies (Reference.cs) by hand. As a result, whenever we update a service reference we have to go into the proxies and remove the data object classes generated from the WSDL.
We are now converting those ASMX services to WCF, and we don't want to hand-edit the service proxies as before, so I have to build some kind of data conversion methods to transfer data from data contract objects to UI models and vice versa.
Can anyone advise what the best approach would be? We have many complex classes (approximately 3 to 4 levels deep).
I have tried AutoMapper, but it tries to map values to the *Specified properties generated by the service proxies, and with approximately 20-30 properties per class the mapping logic is becoming very large.
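For reference, the kind of per-member configuration I am ending up with looks roughly like this (class and property names are only examples, using AutoMapper's classic static API), repeated for every generated *Specified member on every class:

    // Illustrative names only - these are not the real contracts.
    Mapper.CreateMap<CustomerModel, CustomerDataContract>()
        .ForMember(d => d.NameSpecified, opt => opt.Ignore())
        .ForMember(d => d.AgeSpecified, opt => opt.Ignore())
        .ForMember(d => d.JoinedDateSpecified, opt => opt.Ignore());
    // ...and so on for 20-30 *Specified properties, on every class.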
I'm just trying to wrap my head around this concept. I have written a couple of different Web APIs, but they have always been consumed by a website and interacted with via JSON. I have a question about how to structure the implementation when the Web API will be consumed by a Windows service.
In this case there is already an existing database, so I want to use Entity Framework's Database First approach.
I create a Class Library project for the models and use Entity Framework to look at the existing database and generate all of the required classes.
Then I create a Web API Project and add my Class Library with all of the models to it. Up to this point I am good.
My question is: when I go to build the Windows service that will interact with the Web API, how do I access the classes from my Class Library model project? I know I could add that project to my Windows service, but that doesn't seem like the correct approach because it would pretty much bypass the Web API.
I guess my question is: if I want to create and pass an Employee object to my Web API (so it can insert it into the database) from my Windows service, how does the Windows service get the Employee object without adding the Class Library to the Windows service project?
In an n-tier solution you don't pass domain objects across physical boundaries; instead you implement data-transfer objects (DTOs) that hold only the information required by the consumer/caller.
Usually you create a shared library that contains all the data-transfer objects, and this is referenced by both the server and the client.
After that, it's just a matter of using a JSON serializer to serialize and/or deserialize your data-transfer objects.
Domain objects are always mapped to data-transfer objects because these are lighter than the full object. Ask yourself: if the consumer only requires someone's first and last name, why send more data over the wire?
In addition, it's important to avoid server dependencies in client applications and services.
Some useful tips:
Learn what a DTO is: http://en.wikipedia.org/wiki/Data_transfer_object and http://martinfowler.com/eaaCatalog/dataTransferObject.html
Check out AutoMapper and how it can save you time mapping domain objects to data-transfer objects: http://automapper.org/ (a small sketch follows).
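For example, a trimmed-down DTO and its AutoMapper setup might look something like this (the Employee/EmployeeDto names are illustrative, and the classic static Mapper API is assumed):

    // Hypothetical domain entity and the trimmed-down DTO the client actually needs.
    public class Employee
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public decimal Salary { get; set; }    // stays server-side, never sent over the wire
    }

    public class EmployeeDto
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    // One-time AutoMapper configuration; matching property names map by convention.
    Mapper.CreateMap<Employee, EmployeeDto>();

    // In the service: map the domain object to the DTO before serializing it to JSON.
    EmployeeDto dto = Mapper.Map<EmployeeDto>(employee);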
Normally you create extra model classes that are used only for the Web API. Those model classes often contain just a subset of the data of the entities. Furthermore, this distinction allows you to create truly RESTful APIs.
The mapping between model and entity happens inside your Web API controller classes.
The Windows service references only the project with the model classes, not the one with the entity classes.
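A minimal sketch of what that controller-level mapping might look like (the EmployeeModel, Employee and CompanyContext names are assumptions for the example, with Employee and CompanyContext coming from the Database First class library):

    using System.Web.Http;

    // Hypothetical model class shared by the Web API and the Windows service.
    public class EmployeeModel
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    public class EmployeesController : ApiController
    {
        // The Windows service POSTs an EmployeeModel; the controller maps it onto
        // the EF entity and saves it, so the entity classes never leave the server.
        public IHttpActionResult Post(EmployeeModel model)
        {
            using (var db = new CompanyContext())
            {
                var entity = new Employee
                {
                    FirstName = model.FirstName,
                    LastName = model.LastName
                };
                db.Employees.Add(entity);
                db.SaveChanges();
                return Ok(entity.Id);
            }
        }
    }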
I'm curious about the correct way to architect an application (which needs refactoring) that consists of the following:
Excel Addin
COM-visible client library that includes WinForms and methods exposed to Excel (calculation calls and form activation methods)
This then uses functionality in the client library to connect to the WCF services.
The WCF services currently contain calculation logic, validation logic, and database access via an ORM tool.
i.e. Addin -> Winform/Direct call in client DLL -> WCF -> DB or calculation
Currently this exists in just two projects. My first thought would be to re-architect as follows:
Client Side Projects
Excel "View" (Project.Client.Excel), this limits the level of COM visibility to one project.
WinForm "view" (Project.Client.UI)
Presentation for both sets of "views" (Project.Client.Presenter)
Server Side Projects
WCF "view" including data transfer objects? (Project.Server.WCF or Service)
Server side presenter (Project.Server.Presenter)?
Business Logic (Project.Business)
Data Access Layer (Project.DAL)
My questions are:
Where should the DTOs sit, in the WCF project or as their own library/project?
Where do the entity conversion routines belong (Data entity <> Business Entity <> DTO)? All in the business logic layer or some there and some in a server presenter?
What should the correct architecture be for this type of scheme?
Anything else I've probably missed?
Part of the idea for the refactoring is to correct the architecture, separate concerns etc, and enable the inclusion of unit tests into the design.
This is how I would structure it, but there is no 100% correct answer to this question. Many variations will make sense as long as they make your work comfortable.
Excel "View" (Project.Client.Excel), this limits the level of COM visibility to one project.
WinForm "view" (Project.Client.UI)
Presentation for both sets of "views" (Project.Presenter)
WCF Host (Project.Service) - a web site with *.svc files if you host in IIS (no contracts here). Not much business code here; it is only for hosting the methods implemented in the BLL.
Business Logic (Project.Business)
Data Access Layer (Project.DAL)
Contracts (Project.Contract) - Operation and Data Contracts. This library is used by the WCF client, the server, and the BLL (see the sketch after this list).
Shared (Project.Shared) - common helpers to better structure dependencies.
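A minimal sketch of what might live in Project.Contract (the ICalculationService name and the DTOs are made-up examples):

    using System.ServiceModel;
    using System.Runtime.Serialization;

    // Operation contract: implemented by the BLL, hosted by Project.Service,
    // and consumed by the client through a proxy.
    [ServiceContract]
    public interface ICalculationService
    {
        [OperationContract]
        CalculationResultDto Calculate(CalculationRequestDto request);
    }

    // Data contracts (DTOs) shared by client and server.
    [DataContract]
    public class CalculationRequestDto
    {
        [DataMember]
        public decimal Input { get; set; }
    }

    [DataContract]
    public class CalculationResultDto
    {
        [DataMember]
        public decimal Result { get; set; }
    }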
Where should the DTOs sit, in the WCF project or as their own library/project?
Contracts
Where do the entity conversion routines belong (Data entity <> Business Entity <> DTO)? All in the business logic layer or some there and some in a server presenter?
In the business logic layer, for a small or mid-size project.
What should the correct architecture be for this type of scheme?
Yours seems to be fine.
Server-side presenter (Project.Server.Presenter) - this makes no sense to me because there is no GUI that consumes it.
Where should the DTOs sit, in the WCF project or as their own library/project?
You don't want them in the WCF project as this means the client would have to reference that server-side project. Better to keep DTOs, WCF service contracts (interfaces), etc., in a separate "common" project that both the server and client projects can reference.
Where do the entity conversion routines belong?
Data entity <-> Business entity in the data access layer; Business entity <-> DTO in the business logic. Of course, it's also perfectly acceptable to use your data entities across all layers, avoiding the need for all these different entities and mapping code that you will need to keep updated. I guess it depends on the complexity of your system, but take a look at EF4 POCO.
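As a rough illustration of where those conversions could live, a hand-written business-entity-to-DTO mapping in the business logic layer might look like this (the Order types are invented for the example):

    // Hypothetical types, shown only to make the example self-contained.
    public class OrderEntity { public int Id { get; set; } public decimal Total { get; set; } }
    public class OrderDto    { public int Id { get; set; } public decimal Total { get; set; } }

    // Business entity <-> DTO conversions kept in the business logic layer.
    public static class OrderMappings
    {
        public static OrderDto ToDto(this OrderEntity entity)
        {
            return new OrderDto { Id = entity.Id, Total = entity.Total };
        }

        public static OrderEntity ToEntity(this OrderDto dto)
        {
            return new OrderEntity { Id = dto.Id, Total = dto.Total };
        }
    }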
As for your other questions, without knowing a lot more about your requirements and design, you look to be on the right lines with your list of projects.
I don't know very much about WCF...
I want to do a clean job of serving entities to the client side using DataContracts. Imagine two DataContracts, "System" and "Building": a "System" may have many "Buildings" and a "Building" may have many "Systems". So we have a many-to-many relationship between them.
In the service contract model, "System" has a "Buildings" property that is a collection. "Building" also has a collection of "Systems".
The WCF service uses DataSets for the underlying data access (with stored procedures for CRUD), and I have a join table between SYSTEM and BUILDING representing the relationship.
So, how can I implement this scenario cleanly? I want the clients to be able to get a simple representation of the "Buildings" in a "System"; for example, I could use:
    var system = GetSystem(id);
    foreach (Building building in system.Buildings)
    {
        // do whatever with each building...
    }
Thank you!
I think this question is too broad to cover in full detail, but I can give you a few pointers to get you started.
Forget about WCF and build the Data Access Layer (DAL). This should be a library which contains code to query the database and return strongly typed objects. This library might contain a method called GetBuildings() which returns a list of Building objects. The library might work with DataSets (and other database specific types), but should not expose DataSets to external callers.
Now that you have a library which can be used to get data from the database, write the WCF service. Code in the service component should call into the DAL and turn that information into DataContract objects to be sent over the web service boundary. Don't try to represent all your data in the DataContract objects - you want your data packets to be relatively small, so don't include information that isn't required. Balance this with trying to make as few web service calls as possible. In designing your DataContract classes, consider what the client application will be doing with the data.
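For the many-to-many case in the question, one way to keep the DataContracts small and avoid a circular System <-> Building reference is to let only one side carry the collection. A sketch (the class names follow the question, but the exact members are assumptions):

    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [DataContract]
    public class SystemDto
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public string Name { get; set; }

        // Buildings travel with the System, but BuildingDto has no Systems
        // collection, so serialization never recurses.
        [DataMember]
        public List<BuildingDto> Buildings { get; set; }
    }

    [DataContract]
    public class BuildingDto
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public string Name { get; set; }
    }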
Write the Service Client component. This is code which makes calls to the WCF Service, and turns that information into Entity objects.
The final (and most rewarding) step is to write the client application logic. Now you have another set of issues to confront about how you will structure the client code (I recommend using MVVM). The client application should call into the Service Client component and use the data to meet the requirements of your application.
By following the above 4 steps, you should end up with:
A Data Access Layer that talks to the database.
A Service Layer, which knows nothing about the database but is able to fetch data from the Data Access Layer.
A Service Client layer, which knows nothing about databases but knows how to fetch data from the Service Layer.
Application code, which knows nothing about databases or web services, but calls into the Service Client layer to get data and presents the data to a User Interface.
Everyone will do this differently, but the main thing is to separate concerns by using a layered architecture.
I am building an application. I am creating a Silverlight 4 client with the help of MVVM Light. I am acquiring data from a WCF Service. At least, this is the plan.
In the WCF Service I have defined the "entities" that I need to use in my application. When in the Silverlight Client I add a reference to my WCF Service, Visual Studio recreates on the client-side all classes that were marked with the attribute [DataContract] in the service.
What I would like to know is whether this is bad practice and whether it would be better to create the Models inside the client. As far as I understand, in the first case I should only create the ViewModels and the Views in the Silverlight client, whereas in the second case I should create the Views, ViewModels and Models inside the Silverlight client, and populate the Model instances with the values coming from the WCF Service.
Thank you for your help.
Cheers,
G.
UPDATE
OK, I don't think my question was clear enough, as I haven't received much feedback.
However, I'd like to provide an update on this. The answer I was looking for is "No! Data Transfer Objects!".
I was thinking of using my entity classes (the ones mapped to the DB tables) as DataContracts in a WCF service. Adding a reference to this WCF service in a client would then have recreated all the classes decorated with [DataContract] on the client too.
The big problem in my case is that the data layer is based on NHibernate, which sometimes makes extensive use at runtime of "data proxy" classes (see Castle Proxy). Well, it turned out that there is a serialization issue with these data proxies, and as far as I understood it, the best approach is to adopt the Data Transfer Object pattern in order to map the "complex" entities to similar but "lighter" classes (the DTOs).
I hope this can help someone else.
Have a nice day!
Gianluca.
Have you looked at WCF RIA Services with NHibernate? To try to answer the question though: I wouldn't try to return entities from the WCF service directly; I personally would create DTOs, and then I would probably map those DTOs to some sort of client-side model. That's what I would try to do if I weren't able to take advantage of RIA Services.
I've been wading through all the new EF and WCF stuff in .NET 4 for a major project in its early stages, and I think my brain's now officially turned to sludge. It's the first large-scale development work I've done in .NET since the 1.1 days. As usual everything needs to be done yesterday, so I'm playing catch-up.
This is what I need to hammer together - any sanity checks or guidance would be greatly appreciated. The project itself can be thought of as essentially a posh e-commerce system, with multiple clients, both web and Windows-based, connecting to central servers with live data.
On the server side:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Underlying classes used for EF and WCF must be extensible both at a property and class (i.e. field and record) level, for validation, security, high-level auditing and other custom logic
On the client side:
WCF client
Underlying classes the same as the server side, but with some of the customisations not present
When an object is updated on the client, preferably only the modified properties should be sent to the server
The client-side WCF API details will probably end up being published publicly, so sensitive server-side implementation hints should not be leaked through the API unless absolutely unavoidable - this includes EF attributes in properties and classes
General requirements:
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it. Somehow they need to be made suitable for both EF and WCF on both client and server side, and have various layers of customisation, but appear as if custom-written for each scenario
Sorry this is so open-ended, but as I said, my brain's completely turned to sludge and I've confused myself to the point where I'm frozen.
Could anyone point me in the general direction of how to build the classes to do all this? Honestly, thanks very, very much.
A few hints in no particular order:
POCO would be the way to go to avoid dependencies on EF classes in your data objects.
Consider adding an intermediate layer based on data transfer objects to cope with your "only modified properties are passed" requirement (this will be the tricky part). These DTOs are passed between the service and clients to exchange modifications; see the sketch after the project list below.
Use a stateless communication model (no WCF session) to be able to implement load-balancing and fail-over very easily.
Share the POCOs between clients and services, and use subclassing on the server to add the internal customised information.
On the server side you would end up with at least:
A project for the service contracts and the DTOs (shared)
A project for the POCOs (shared)
A project for the WCF service layer
A project for the business logic (called by the WCF layer)
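As a rough sketch of the "only modified properties" idea mentioned above, a DTO could record which members the client actually changed, so the server applies only those. This is just one possible approach, and the names are invented:

    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [DataContract]
    public class CustomerUpdateDto
    {
        [DataMember]
        public int Id { get; set; }

        [DataMember]
        public string Name { get; set; }

        [DataMember]
        public string Email { get; set; }

        // Names of the properties the client actually changed; the server
        // copies only these onto the domain object.
        [DataMember]
        public List<string> ModifiedProperties { get; set; }
    }

    // Server side (sketch):
    // if (dto.ModifiedProperties.Contains("Name"))  customer.Name  = dto.Name;
    // if (dto.ModifiedProperties.Contains("Email")) customer.Email = dto.Email;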
I have a few notes on your requirements:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Are you going to build only a data access layer exposed as a set of WCF services, or heavy business logic exposed as WCF services? This strongly affects the rest of your requirements. In the former case, check WCF Data Services. In the latter case, check my other notes.
Underlying classes used for EF and WCF must be extensible both at a property and class (i.e. field and record) level, for validation, security, high-level auditing and other custom logic
Divide your data classes into two sets. Internally your services will use POCO classes implemented as domain objects. Domain objects will be materialized/persisted by EF (you need .NET 4.0) and will also contain custom logic. If you want to build a heavy business layer, you should also think about domain-driven design: repositories, aggregate roots, etc.
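For illustration, a POCO domain object of this kind might look roughly like this (the Order class, its members and the validation rule are invented for the example):

    using System;
    using System.Collections.Generic;

    // A POCO domain object: no EF base class or attributes, so EF 4 POCO support
    // can persist it while it still carries custom (validation) logic.
    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
        public virtual ICollection<OrderLine> Lines { get; set; }  // virtual enables lazy-loading proxies

        public void Validate()
        {
            if (Total < 0)
                throw new InvalidOperationException("Order total cannot be negative.");
        }
    }

    public class OrderLine
    {
        public int Id { get; set; }
        public decimal Price { get; set; }
    }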
Underlying classes the same as the server side, but with some of the customisations not present
The second set of data classes will be data transfer objects, which are exposed by the WCF services and shared between the server and clients. Your domain objects will be converted to DTOs when sending data to the client, and DTOs will be converted back to domain objects when data comes back from the client.
Your WCF services should be built on top of the business logic - domain objects / domain services. WCF services should expose chunky interfaces (instead of chatty CRUD interfaces), where one DTO carries the data of several domain operations. This will also help you improve performance by reducing the number of round trips between client and service.
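A sketch of the "chunky" idea (the operation and DTO names are made up): one operation returns a composite DTO with everything the client screen needs, instead of separate GetCustomer / GetOrders / GetAddress round trips.

    using System.ServiceModel;
    using System.Runtime.Serialization;

    [ServiceContract]
    public interface IOrderService
    {
        // One chunky call instead of several chatty CRUD calls.
        [OperationContract]
        CustomerSummaryDto GetCustomerSummary(int customerId);
    }

    [DataContract]
    public class CustomerSummaryDto
    {
        [DataMember] public string Name { get; set; }
        [DataMember] public string ShippingAddress { get; set; }
        [DataMember] public int OpenOrderCount { get; set; }
    }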
When an object is updated on the client, preferably only the modified properties should be sent to the server
I think this is achievable only by careful definition of the DTOs, or perhaps by some custom serialization.
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
As already mentioned, you have to design your services to be ready for load balancing, and you should also think about (distributed) caching - check AppFabric. It is a good idea to use stateless services.
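For example, stateless per-call behaviour can be made explicit on the service implementation (a minimal sketch; IOrderService and CustomerSummaryDto refer to the hypothetical contract sketched above):

    using System.ServiceModel;

    // PerCall creates a new service instance for every request and keeps no
    // session state, which keeps the service friendly to load balancing.
    [ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall)]
    public class OrderService : IOrderService
    {
        public CustomerSummaryDto GetCustomerSummary(int customerId)
        {
            // ...delegate to the business layer / domain services...
            return new CustomerSummaryDto();
        }
    }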
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it.
This seems like a simple requirement, but you can easily model a database that will be hard to use with Entity Framework.
The main advice:
Your project looks big and complex, so the first thing you should do is hire some developers with experience in WCF, EF, etc. Each of these technologies has its pitfalls, so it is a really big risk to use them at this scale without experienced people.