I don't know very much about WCF...
I want to do a clean job to serve entities on client side using DataContracts. Imagine two DataContracts "System" and "Building": "System" may have many "Buildings" and "Building" may have many "Systems". So, we have a many-to-many relationship between them.
In the service contract model, "System" has a "Buildings" property that is a collection; "Building" likewise has a collection of "Systems".
The WCF service uses DataSets for the underlying data access (with stored procedures for CRUD), and I have a join table between SYSTEM and BUILDING representing the relationship.
So, how can I implement this scenario cleanly? I want clients to be able to get a simple representation of the "Buildings" in a "System"; for example:
    system = GetSystem(id);
    foreach (Building building in system.Buildings) {
        // do whatever with each building...
    }
Thank you!
I think this question is too broad to cover in full detail, but I can give you a few pointers to get you started.
Forget about WCF and build the Data Access Layer (DAL). This should be a library which contains code to query the database and return strongly typed objects. This library might contain a method called GetBuildings() which returns a list of Building objects. The library might work with DataSets (and other database specific types), but should not expose DataSets to external callers.
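A minimal sketch of what such a DAL method might look like (the Building POCO, stored procedure name, and column names are all invented for illustration):

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    // Simple strongly typed object returned by the DAL (illustrative).
    public class Building
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Sketch of a DAL class: queries the database via a stored procedure and
    // returns strongly typed objects, never exposing DataSets to callers.
    public class BuildingRepository
    {
        private readonly string _connectionString;

        public BuildingRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        public List<Building> GetBuildings(int systemId)
        {
            var buildings = new List<Building>();
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand("usp_GetBuildingsForSystem", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue("@SystemId", systemId);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        buildings.Add(new Building
                        {
                            Id = reader.GetInt32(reader.GetOrdinal("Id")),
                            Name = reader.GetString(reader.GetOrdinal("Name"))
                        });
                    }
                }
            }
            return buildings;
        }
    }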
Now that you have a library which can be used to get data from the database, write the WCF service. Code in the service component should call into the DAL and turn that information into DataContract objects to be sent over the web service boundary. Don't try to represent all your data in the DataContract objects - you want your data packets to be relatively small, so don't include information that isn't required. Balance this with trying to make as few web service calls as possible. In designing your DataContract classes, consider what the client application will be doing with the data.
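For the many-to-many in the question, one hedged sketch of the DataContract shapes (names invented to avoid colliding with the System namespace) is to give the system contract a collection of lightweight building contracts and deliberately omit the back reference:

    using System.Collections.Generic;
    using System.Runtime.Serialization;

    // Sketch of the DataContract shapes. Deliberately no back reference from
    // Building to System: serializing cycles is painful, and for the usage in
    // the question the client only needs to navigate System -> Buildings.
    [DataContract]
    public class SystemData
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
        [DataMember] public List<BuildingData> Buildings { get; set; }
    }

    [DataContract]
    public class BuildingData
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
    }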
Write the Service Client component. This is code which makes calls to the WCF Service, and turns that information into Entity objects.
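A rough sketch of such a wrapper; SystemServiceClient stands in for the proxy class that svcutil / Add Service Reference would generate, and SystemEntity is an assumed client-side entity type:

    // Sketch of a Service Client wrapper around a generated WCF proxy.
    public class SystemServiceGateway
    {
        public SystemEntity GetSystem(int id)
        {
            var proxy = new SystemServiceClient();
            try
            {
                SystemData dto = proxy.GetSystem(id);
                proxy.Close();
                // Turn the wire-level contract into a client-side entity.
                return new SystemEntity { Id = dto.Id, Name = dto.Name };
            }
            catch
            {
                proxy.Abort(); // never Close() a faulted channel
                throw;
            }
        }
    }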
The final (and most rewarding) step is to write the client application logic. Now you have another set of issues to confront about how to structure the client code (I recommend using MVVM). The client application should call into the Service Client component and use the data to meet the requirements of your application.
By following the above 4 steps, you should end up with:
A Data Access Layer that talks to the database.
A Service Layer, which knows nothing about the database but is able to fetch data from the Data Access Layer.
A Service Client layer, which knows nothing about databases but knows how to fetch data from the Service Layer.
Application code, which knows nothing about databases or web services, but calls into the Service Client layer to get data and presents the data to a User Interface.
Everyone will do this differently, but the main thing is to separate concerns by using a layered architecture.
Related
I have a working WPF application that runs on a single PC. It uses a SQL Server database, Entity Framework to communicate with the database, and RDLC reporting. The requirement has now arrived to make this application work on the local company network, where multiple users (normally around 25 at most) will access it depending on their roles and the permissions set for them. I did some R&D on this, based primarily on the architecture described at http://www.codeproject.com/Articles/434282/A-N-Tier-Architecture-Sample-with-ASP-NET-MVC-WCF, and have produced a paper design/architecture for the application that looks like this:
A WCF service running on a high-end server within the company network:
GPC.Service itself - defines the protocol to connect to the service and all other necessary information
GPC.Algorithm - will be the main business logic layer that contains the logic and is the interface through which clients call the database layer methods
GPC.Persistance - will have the actual database interaction methods, like fetching/storing/updating/deleting records in the database
GPC.Data - will contain the edmx schema for the Entity Framework
GPC.Entites - will contain the entities of the database schema and additional partial classes
Clients:
The client will be a WPF application based on the MVVM pattern for now (maybe in the future we will need to move to a web application, but that is not required now). The main components of the application are:
Import from Excel: currently all data is in Excel files; all that data needs to be imported into the system.
Edit/Update/Delete: once data is imported, provide an interface for the user to edit/update/delete records.
Generate reports (using RDLC for this).
Users/roles management, etc.
Shared:
This is a library that contains different miscellaneous classes, like code to read Excel files, error handling, collections that will be bound to the UI, etc.
DB context: will be created in a using statement inside the Persistance layer for each method, to ensure no stale information is left around.
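For illustration, a persistence method under that convention might look like this (the context and entity names are invented):

    using System.Data.Entity;
    using System.Linq;

    // A fresh context per method call: nothing stale survives between calls.
    public Record GetRecord(int id)
    {
        using (var context = new GpcDbContext()) // assumed EF context name
        {
            return context.Records.AsNoTracking()
                          .FirstOrDefault(r => r.Id == id);
        }
    }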
Does this architecture follow n-tier architecture, and is it flexible? What improvements are required, and please guide me on how to fix whatever issues there are. I want to make sure this is a good architecture before I go ahead and change my existing application.
It seems like you are on the correct path; however, you may be over-engineering in some areas.
I think to a large degree Entity Framework deals with the Entities, Data and Persistence layers for you. Implementing them yourself may be overkill, unless you are looking to ultimately replace Entity Framework with some other ORM system.
You are alluding to SOA (Service Oriented Architecture) here with your GPC.Service library. You need to look at how you can break your service layer down into one or more atomic services which will serve the client application. There are a number of ways of going about this, and the choice depends largely on how you plan to use the service layer going forward. Take a look at RESTful services, which break the services layer down nicely and will guide you into building neat atomic services. Check out the ASP.NET Web API for this.
I think what you are looking for in your GPC.Algorithm library is really a domain model. A domain model encapsulates all your business logic and allows you to perform state changes on your objects via the public functions you expose. With this in mind, the layers of the system would appear as follows:
Persistence (EF) -> Domain Model -> Service Layer -> DTO (Data Transfer Objects) -> Client
The DTOs (Data Transfer Objects) mentioned above would be a set of POCOs (Plain Old C# Objects) which are responsible for carrying data to and from your client. You need these because serializing and deserializing your domain objects will become problematic due to back references and other encapsulation issues. Putting DTOs in place enforces a context boundary, which is one of the tenets of SOA - "boundaries are explicit"; see this for more info on SOA.
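As a rough illustration (all type names invented), a DTO is just a flat POCO mirroring the subset of state the client needs, with none of the domain object's behaviour or back references:

    using System;

    // Assumed domain entity (would live in the persistence/domain layer).
    public class Record
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public DateTime LastModified { get; set; }
    }

    // Flat, serialization-friendly shape that crosses the service boundary.
    public class RecordDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public DateTime LastModified { get; set; }
    }

    // In the service layer: map the domain object to the DTO explicitly,
    // so nothing leaks across the boundary by accident.
    public static RecordDto ToDto(Record domainObject)
    {
        return new RecordDto
        {
            Id = domainObject.Id,
            Name = domainObject.Name,
            LastModified = domainObject.LastModified
        };
    }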
With respect to the client side, it seems like you are on track. What you may want to do is refactor your current client application so that all data queries are consolidated into a single layer. When the time comes, you will then just replace that layer with the service implementation.
This makes perfect sense (try to build it TDD-style).
To make your life a bit easier with managing client versions, consider using a ClickOnce installer to enforce installation of the latest version on your users' computers (this headache will be gone once you move to a web app).
We have an old application developed using ASP.NET Web Forms and ASMX services. The developers at the time created a class library with data objects that is referenced both in the service layer and in the client layer. To avoid data conversion when passing data from service to client and vice versa, they modified the service proxies (Reference.cs) manually, so whenever we update a service reference we have to go into the proxies and remove the data objects generated from the WSDL.
We are now in the process of converting those ASMX services to WCF, and we don't want to do manual edits to the service proxies like before, so I have to build some kind of data conversion methods to transfer data from DataContract objects to UI models and vice versa.
Can anyone advise me on a better approach to achieve the above? We have many complex class objects (approximately 3 to 4 levels deep).
I have tried AutoMapper, but AutoMapper tries to map values to the *Specified properties generated by the service proxies, and with approximately 20-30 properties per class the mapping logic is becoming very large.
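If the offending properties are the XmlSerializer-style FooSpecified companions that the old proxy generator emits, one possible direction (assuming an AutoMapper version where ShouldMapProperty is available on the configuration) is to exclude them globally instead of ignoring them map by map:

    var config = new MapperConfiguration(cfg =>
    {
        // Skip the generated "...Specified" companion properties everywhere,
        // so they don't have to be ignored one by one in each map.
        cfg.ShouldMapProperty = p => !p.Name.EndsWith("Specified");
        cfg.CreateMap<CustomerContract, CustomerModel>(); // illustrative types
    });
    var mapper = config.CreateMapper();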
I'm just trying to wrap my head around this concept. I have written a couple of different Web APIs, but they have always been consumed by a website and interacted with via JSON. I have a question about how to structure the implementation when the Web API will be consumed by a Windows service.
In this case there is already an existing database, so I want to use Entity Framework's Database First approach.
I create a Class Library project for the models and use Entity Framework to look at the existing database and generate all of the required classes.
Then I create a Web API Project and add my Class Library with all of the models to it. Up to this point I am good.
My question is: when I go to build the Windows service that will interact with the Web API, how do I access the classes from my class library model project? I know I could add that project to my Windows service, but that doesn't seem like the correct approach, because it would pretty much bypass the Web API.
I guess my question is: if I want to create and pass an Employee object to my Web API (so it can insert it into the database) from my Windows service, how does the Windows service get the Employee object without adding the class library to the Windows service project?
In an n-tier solution you don't pass domain objects across physical boundaries; instead you implement data transfer objects (DTOs) that hold only the info required by the consumer/caller.
Usually you create a shared library that contains all the data transfer objects, and this is referenced by both the server and the client.
After that, it's all about using a JSON serializer to serialize and/or deserialize your data transfer objects.
Domain objects should always be mapped to data transfer objects, since these are lighter than the full object. Ask yourself a question: if the consumer only requires someone's first and last name, why would you send more data over the wire?
In addition, it's important to avoid server dependencies in client applications and services.
Some useful tips:
Learn what a DTO is: http://en.wikipedia.org/wiki/Data_transfer_object and http://martinfowler.com/eaaCatalog/dataTransferObject.html
Check out AutoMapper and how it can save you time mapping domain objects to data transfer objects: http://automapper.org/
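For instance (all names invented), the shared library holds only the DTO, and the Windows service posts it to the Web API as JSON:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json;

    // Lives in the shared contracts library, referenced by both the
    // Web API project and the Windows service.
    public class EmployeeDto
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    // In the Windows service: serialize the DTO and post it to the API.
    public static async Task CreateEmployeeAsync()
    {
        using (var client = new HttpClient())
        {
            client.BaseAddress = new Uri("http://server/"); // assumed endpoint
            var json = JsonConvert.SerializeObject(
                new EmployeeDto { FirstName = "Jane", LastName = "Doe" });
            var content = new StringContent(json, Encoding.UTF8, "application/json");
            var response = await client.PostAsync("api/employees", content);
            response.EnsureSuccessStatusCode();
        }
    }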
Normally you create extra model classes that are used only for the Web API. Those model classes often contain just a subset of the data of the entities. Furthermore, this distinction allows you to create truly RESTful APIs.
The mapping between model and entity happens inside your Web API's controller classes.
The Windows service then references only the project with the model classes, not the one with the entity classes.
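In sketch form (the context, entity and model types are invented), the controller is where that translation lives:

    using System.Net;
    using System.Web.Http;

    // Web API model: subset of the Employee entity (illustrative).
    public class EmployeeModel
    {
        public int Id { get; set; }
        public string FullName { get; set; }
    }

    public class EmployeesController : ApiController
    {
        // GET api/employees/5 - the EF entity never crosses the API boundary.
        public EmployeeModel Get(int id)
        {
            using (var db = new CompanyContext()) // assumed database-first context
            {
                var entity = db.Employees.Find(id);
                if (entity == null)
                    throw new HttpResponseException(HttpStatusCode.NotFound);

                return new EmployeeModel
                {
                    Id = entity.Id,
                    FullName = entity.FirstName + " " + entity.LastName
                };
            }
        }
    }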
I've been wading through all the new EF and WCF stuff in .NET 4 for a major project in its early stages, and I think my brain's now officially turned to sludge. It's the first large-scale development work I've done in .NET since the 1.1 days. As usual everything needs to be done yesterday, so I'm playing catch-up.
This is what I need to hammer together - any sanity checks or guidance would be greatly appreciated. The project itself can be thought of as essentially a posh e-commerce system, with multiple clients, both web and Windows-based, connecting to central servers with live data.
On the server side:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Underlying classes used for EF and WCF must be extensible both at a property and class (ie field and record) level, for validation, security, high-level auditing and other custom logic
On the client side:
WCF client
Underlying classes the same as the server side, but with some of the customisations not present
When an object is updated on the client, preferably only the modified properties should be sent to the server
The client-side WCF API details will probably end up being published publicly, so sensitive server-side implementation hints should not be leaked through the API unless absolutely unavoidable - this includes EF attributes in properties and classes
General requirements:
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it. Somehow they need to be made suitable for both EF and WCF on both client and server side, and have various layers of customisation, but appear as if custom-written for each scenario
Sorry this is so open-ended, but as I said, my brain's completely turned to sludge and I've confused myself to the point where I'm frozen.
Could anyone point me in the general direction of how to build the classes to do all this? Honestly, thanks very, very much.
A few hints in no particular order:
POCO would be the way to go to avoid dependencies on EF classes in your data objects.
Consider adding an intermediate layer based on data transfer objects to cope with your "only modified properties are sent" requirement (this will be the tricky part). These DTOs will be passed between service and clients to exchange modifications.
Use a stateless communication model (no WCF session) to be able to implement load-balancing and fail-over very easily.
Share the POCOs between client and services, and use subclassing on the server to add the internal customised information - see the sketch after the project list below.
You would end up on the server side with at least:
A project for the service contracts and the DTO (shared)
A project for the POCO (shared)
A project for the WCF service layer
A project for the business logic (called by the WCF layer)
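A rough sketch of the subclassing idea mentioned above (all names invented):

    using System;

    // Shared project: the POCO exactly as both client and service see it.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Server-side project: the same entity enriched with internal concerns
    // that must never travel to the client.
    public class ServerCustomer : Customer
    {
        public string LastAuditUser { get; set; }

        public void Validate()
        {
            if (string.IsNullOrWhiteSpace(Name))
                throw new InvalidOperationException("Customer name is required.");
        }
    }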
I have a few notes on your requirements:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Are you going to build only a data access layer exposed as a set of WCF services, or heavy business logic exposed as WCF services? This strongly affects the rest of your requirements. If you want the former, check WCF Data Services. In the latter case, check my other notes.
Underlying classes used for EF and WCF must be extensible both at a property and class (ie field and record) level, for validation, security, high-level auditing and other custom logic
Divide your data classes into two sets. Internally your services will use POCO classes implemented as domain objects. Domain objects will be materialized / persisted by EF (you need .NET 4.0), and they will also contain custom logic. If you want to build a heavy business layer you should also think about domain-driven design = repositories, aggregate roots, etc.
Underlying classes the same as the server side, but with some of the customisations not present
The second set of data classes will be data transfer objects, which will be exposed by WCF services and shared between server and clients. Your domain objects will be converted to DTOs when sending data to the client, and DTOs will be converted back to domain objects when data returns from the client.
Your WCF services should be built on top of the business logic - domain objects / domain services. WCF services should expose chunky interfaces (instead of chatty CRUD interfaces), where one DTO carries the data of several domain operations. This will also help you improve performance by reducing the number of round trips between client and service.
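For example (contract names invented; the inner DTO types are placeholders), a chunky operation returns one composite DTO instead of forcing the client into several CRUD round trips:

    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    [ServiceContract]
    public interface IOrderService
    {
        // One round trip returns everything the client screen needs, instead
        // of separate GetOrder / GetCustomer / GetOrderLines calls.
        [OperationContract]
        OrderScreenDto GetOrderScreen(int orderId);
    }

    [DataContract]
    public class OrderScreenDto
    {
        [DataMember] public OrderDto Order { get; set; }
        [DataMember] public CustomerDto Customer { get; set; }
        [DataMember] public List<OrderLineDto> Lines { get; set; }
    }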
When an object is updated on the client, preferably only the modified properties should be sent to the server
I think this is achievable only by careful definition of the DTOs, or perhaps by some custom serialization.
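One possible DTO shape for that (purely illustrative): carry the new values plus the names of the properties the client actually changed, and have the server apply only those.

    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [DataContract]
    public class CustomerUpdateDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
        [DataMember] public string Email { get; set; }

        // Names of the properties the client actually modified; the server
        // applies these and ignores everything else.
        [DataMember] public List<string> ChangedProperties { get; set; }
    }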
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
As already mentioned, you have to design your service to be ready for load balancing, and you should also think about (distributed) caching - check AppFabric. A good idea is to use stateless services.
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it.
This seems like a simple requirement, but you can easily model a database that will be hard to use with Entity Framework.
The main advice:
Your project looks big and complex, so the first thing you should do is hire some developers with experience in WCF, EF, etc. Each of these technologies has its pitfalls, so it is a really big risk to use them at such a scale without experienced people.
I created a new solution with 3 projects:
My "Client" is a ASP.Net Web Application. This should display the information.
My Businesslayer should have all logic in it, it's designed as a normal class libery.
My "Server" is a WebService. This connect via Linq to the database and get the Information.
Now only my Server knows Linq and knows the Database (how it should be).
But how can I give Linq Objects throug the WebService to my Business and my WebApp Layer to use it There?
For my understand there must be a way, because I have e.g. a complete user object with all needed Information with Linq, so I don't must create a own one, must I?
LINQ should be hidden just like the database is. Your business logic layer and server should instead share common core objects used in client-server calls; this also gives you an easy way to add information that is not stored in the DB (if needed in the future).
I would recommend that you create internal classes corresponding to the LINQ entities and expose those objects through the web service instead. Then you create mapping methods within the service application to map between those types.
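A minimal sketch of that approach (UserEntity stands in for the LINQ-generated class; UserDto is the internal class exposed instead):

    // Serializable type exposed by the web service in place of the LINQ entity.
    public class UserDto
    {
        public int Id { get; set; }
        public string UserName { get; set; }
        public string Email { get; set; }
    }

    public static class UserMapper
    {
        // Mapping method inside the service application.
        public static UserDto ToDto(UserEntity entity)
        {
            return new UserDto
            {
                Id = entity.Id,
                UserName = entity.UserName,
                Email = entity.Email
            };
        }
    }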