I'm curious about the correct way to architect an application (which needs refactoring) that consists of the following:
Excel Addin
A COM-visible client library that includes WinForms and methods exposed to Excel (calculation calls and form activation methods)
This then uses functionality in the client library to connect to the WCF services.
WCF services that currently contain calculation logic, validation logic, and database access via an ORM tool.
i.e. Addin -> Winform/Direct call in client DLL -> WCF -> DB or calculation
Currently this exists in just 2 projects. My first thought would be to re-architect as follows:
Client Side Projects
Excel "View" (Project.Client.Excel), this limits the level of COM visibility to one project.
WinForm "view" (Project.Client.UI)
Presentation for both sets of "views" (Project.Client.Presenter)
Server Side Projects
WCF "view" including data transfer objects? (Project.Server.WCF or Service)
Server side presenter (Project.Server.Presenter)?
Business Logic (Project.Business)
Data Access Layer (Project.DAL)
My questions are:
Where should the DTOs sit, in the WCF project or as their own library/project?
Where do the entity conversion routines belong (data entity <-> business entity <-> DTO)? All in the business logic layer, or some there and some in a server presenter?
What should the correct architecture be for this type of scheme?
Anything else I've probably missed?
Part of the idea for the refactoring is to correct the architecture, separate concerns etc, and enable the inclusion of unit tests into the design.
This is how I would structure it, but there is no 100% correct answer to this question. Many variations can make sense, as long as they make your work comfortable.
Excel "View" (Project.Client.Excel), this limits the level of COM visibility to one project.
WinForm "view" (Project.Client.UI)
Presentation for both sets of "views" (Project.Presenter)
WCF host (Project.Service) - a web site with *.svc files if you host in IIS (no contracts here). Not much business code here; it only hosts methods implemented in the BLL.
Business Logic (Project.Business)
Data Access Layer (Project.DAL)
Contracts (Project.Contract) - operation and data contracts. This library is used by the WCF client, the server and the BLL (a minimal sketch follows this list).
Shared (Project.Shared) - common helpers to better structure dependencies.
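To illustrate the Contracts project, here is a rough sketch only; the DTO shapes and the operation name are made up, not a prescribed design:

using System.Runtime.Serialization;
using System.ServiceModel;

namespace Project.Contract
{
    // Hypothetical DTO shapes - adjust to your real calculation inputs/outputs.
    [DataContract]
    public class CalculationRequestDto
    {
        [DataMember] public string ModelName { get; set; }
        [DataMember] public decimal[] Inputs { get; set; }
    }

    [DataContract]
    public class CalculationResultDto
    {
        [DataMember] public decimal Value { get; set; }
        [DataMember] public string[] ValidationMessages { get; set; }
    }

    // Operation contract: implemented in Project.Business, hosted by Project.Service,
    // consumed by the client presenter through a channel/proxy.
    [ServiceContract]
    public interface ICalculationService
    {
        [OperationContract]
        CalculationResultDto Calculate(CalculationRequestDto request);
    }
}

In this layout Project.Business would implement ICalculationService and Project.Service would merely host it, which matches the "no business code in the host" idea above.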
Where should the DTOs sit, in the WCF project or as their own library/project?
Contracts
Where do the entity conversion routines belong (data entity <-> business entity <-> DTO)? All in the business logic layer, or some there and some in a server presenter?
In the business layer, for a small-to-medium-sized project.
What should the correct architecture be for this type of scheme?
Yours seems fine.
Server-side presenter (Project.Server.Presenter) - this makes no sense to me, because there is no GUI that consumes it.
Where should the DTOs sit, in the WCF project or as their own library/project?
You don't want them in the WCF project as this means the client would have to reference that server-side project. Better to keep DTOs, WCF service contracts (interfaces), etc., in a separate "common" project that both the server and client projects can reference.
Where do the entity conversion routines belong?
Data entity <-> Business entity in the data access layer; Business entity <-> DTO in the business logic. Of course, it's also perfectly acceptable to use your data entities across all layers, avoiding the need for all these different entities and mapping code that you will need to keep updated. I guess it depends on the complexity of your system, but take a look at EF4 POCO.
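To make that concrete, here is a rough sketch of where such mapping code might live; all type names are invented and this is just one way to lay it out:

// Hypothetical types, for illustration only.
public class Customer       // business entity (business layer)
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Balance { get; set; }
}

public class CustomerDto    // data contract / DTO (common "contracts" project)
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Balance { get; set; }
}

// Business-layer mapping: business entity <-> DTO.
// The data entity <-> business entity mapping would sit in the DAL in the same style.
public static class CustomerMapper
{
    public static CustomerDto ToDto(Customer entity)
    {
        return new CustomerDto { Id = entity.Id, Name = entity.Name, Balance = entity.Balance };
    }

    public static Customer ToEntity(CustomerDto dto)
    {
        return new Customer { Id = dto.Id, Name = dto.Name, Balance = dto.Balance };
    }
}

Tools like AutoMapper can generate this kind of mapping for you if it becomes repetitive.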
As for your other questions, without knowing a lot more about your requirements and design, you look to be on the right lines with your list of projects.
I have a working WPF application that runs on a single PC. I have used a SQL Server database, Entity Framework to communicate with the database, and RDLC reporting in the application. Now the requirement has arrived to make this application work on the local company network, where multiple users (normally around 25 at most) will access the application depending on their roles and the permissions set. I did some R&D on this, primarily following the architecture mentioned here: http://www.codeproject.com/Articles/434282/A-N-Tier-Architecture-Sample-with-ASP-NET-MVC-WCF, and after doing so I have made a paper design/architecture of the application that will look like this:
A WCF service running on a high end server within the company network
GPC.Service itself - defines the protocol to connect to the service and all other necessary information
GPC.Algorithm - will be the main business logic layer that contains the logic and will be the interface for the clients to call the database layer methods
GPC.Persistance - will have the actual database interaction methods, like fetching/storing/updating/deleting records in the database
GPC.Data - This will contain the edmx schema for the Entity Framework
GPC.Entites - This will contain the entities of the database schema and additional partial classes
Clients:
The client will be a WPF application based on the MVVM pattern for now (maybe in the future we will need to move to a web application, but that is not required for now). The main components of the application are:
Import from Excel: Currently all data is in Excel files. All that data needs to be imported into the system.
Edit/Update/Delete: Once data is imported, provide an interface for the user to edit/update/delete records
Generate reports (using RDLC for this)
Users/roles management etc.
Shared:
This is a library that contains different miscellaneous classes, like code to read Excel files, error handling, collections that will be bound to the UI, etc.
DB context: Will be created in a using statement inside the Persistance layer for each method to ensure no stale information is left.
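To illustrate the pattern, a minimal sketch (the context and entity names here are just placeholders):

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Hypothetical context and entity - only the "new context per method" pattern matters here.
public class GpcRecord
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class GpcContext : DbContext
{
    public DbSet<GpcRecord> Records { get; set; }
}

public class RecordRepository
{
    public List<GpcRecord> GetAll()
    {
        using (var ctx = new GpcContext())   // disposed when the method ends
        {
            return ctx.Records.ToList();
        }
    }

    public void Update(GpcRecord changed)
    {
        using (var ctx = new GpcContext())
        {
            var existing = ctx.Records.Find(changed.Id);
            existing.Name = changed.Name;
            ctx.SaveChanges();
        }
    }
}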
Does this architecture follow n-tier architecture, and is it flexible? What improvements are required, and please guide me on how to fix whatever issues there are. I want to make sure this is a good architecture before I go ahead and change my existing application.
It seems like you are on the correct path; however, you may be over-engineering in some areas.
I think to a large degree Entity Framework deals with the Entities, Data and Persistence layers for you. Implementing them yourself may be overkill unless you are looking to ultimately replace Entity Framework with some other ORM system.
You are alluding to SOA (Service-Oriented Architecture) here with your GPC.Service library. Here you need to look at how you can break down your service layer into one or more atomic services which will serve the client application. There are a number of ways of going about this, and it would depend largely on how you plan to use the service layer going forward. Take a look at RESTful services, which break the service layer down nicely and will guide you into building neat atomic services. Check out the ASP.NET Web API for this.
I think what you are looking for in your GPC.Algorithm library is really a domain model. A domain model encapsulates all your business logic and allows you to perform state changes on your objects via public functions which you expose. With this in mind, the layers of the system would appear as follows:
Persistence (EF) -> Domain Model -> Service Layer -> DTO (Data Transfer Objects) -> Client
The DTO objects mentioned above would be a set of POCOs (Plain Old C# Objects) which are responsible for delivering data to and from your client. You need this since serializing and deserializing your domain objects will become problematic due to back references and other encapsulation issues. Putting DTOs in place will enforce a context boundary, which is one of the tenets of SOA - "Boundaries are explicit"; see this for more info on SOA.
With respect to the client side, it seems like you are on track. What you may want to do is refactor your current client application so that all data queries are consolidated into a single layer. Then, when the time comes, you will just replace that layer with the service implementation.
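As a rough sketch of that single layer (all names are invented, and the bodies are stubs):

using System.Collections.Generic;

// Hypothetical DTO used by the view models.
public class RecordDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The single seam the WPF view models talk to.
public interface IGpcDataService
{
    IList<RecordDto> GetRecords();
    void SaveRecord(RecordDto record);
}

// Today: an implementation that talks to EF directly.
// Later: swap in an implementation that calls the service layer,
// without touching any view model.
public class LocalGpcDataService : IGpcDataService
{
    public IList<RecordDto> GetRecords()
    {
        // ...query the database via EF and map to RecordDto...
        return new List<RecordDto>();
    }

    public void SaveRecord(RecordDto record)
    {
        // ...persist via EF...
    }
}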
This makes perfect sense (try to build it TDD-style).
To make your life a bit easier with client version management, consider using a ClickOnce installer to enforce the latest version installations on your users' computers (this headache will be gone once you move it to a web app).
Premise:
I am exercising Domain-Driven Design and I separate my solution into 4 layers:
Presentation Layer
An ASP.NET Web API 2 project for a RESTful API web service
An ASP.NET MVC 5 project for documentation and admin screens
Application Layer
A class library project responsible for taking commands from the presentation layer and consuming any domain services
Domain Layer
A class library project that contains business models and logic
A class library project that contains the domain services
Infrastructure Layer
A class library project that contains all the concrete implementations, like data persistence using Entity Framework, logging using log4net, IoC using Simple Injector, etc.
The domain layer only has a set of repository interfaces defined for the aggregates, and it is up to the data access implementation in the infrastructure layer to hide the implementation details.
In this exercise, I decided to use the Entity Framework database-first approach. And of course, there is an app.config in the infrastructure project that contains a connection string.
Problems:
OK, I spent a great deal of time trying to separate all the concerns and to focus on the domain models. In the presentation layer (i.e., the API and MVC projects), there is no direct reference to the infrastructure project. An IoC container has been set up so that concrete implementations of the required interfaces are injected into controller constructors.
When I select, for example, the API project as the start-up project and run it, I get:
An exception of type 'System.InvalidOperationException' occurred in EntityFramework.dll but was not handled in user code.
Additional information: No connection string named 'xxxxxx' could be found in the application config file.
Question:
Now, I understand that if I install Entity Framework into the API project and copy and paste the connection string from the app.config of the infrastructure project into the web.config of the API project, things will work. But that defeats our original purpose of separating concerns, doesn't it? If we do that, then what's the point of using Domain-Driven Design and keeping the presentation layer ignorant of the data access technology?
The reason we don't directly reference the concrete data access implementation (i.e. classes that use DbContext and LINQ) is so that we could easily switch the underlying access technology to something else.
So what would be the proper way to do this?
I do not want to install Entity Framework in my presentation layer, nor copy the connection string everywhere. I want all the data access and the concrete repository implementations to exist in just one library.
The Entity Framework configuration must be in the project where it is being used. This doesn't mean it's going to break your layered structure or your separation of concerns.
Remove all Entity Framework elements from your app.config. Create your own connection string element and provide it to Entity Framework on app startup.
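For example, one way to wire that up - a sketch under the assumption that you can expose a DbContext-derived class and that Simple Injector is configured in the API host; all names here are hypothetical:

using System.Configuration;
using System.Data.Entity;
using SimpleInjector;

// Infrastructure project: the context takes the connection string from the caller
// instead of reading its own app.config.
public class AppDbContext : DbContext                 // hypothetical name
{
    public AppDbContext(string nameOrConnectionString)
        : base(nameOrConnectionString) { }
    // DbSets / database-first model details omitted. For a database-first model the
    // string passed in must be the full EF ("metadata=res://...") connection string.
}

// Composition root in the API host, where the IoC container is already configured:
// read the string once from the host's web.config and register the context with it.
public static class DataBootstrapper
{
    public static void Register(Container container)
    {
        // "AppConnection" is a hypothetical key in the host's <connectionStrings> section.
        string cs = ConfigurationManager.ConnectionStrings["AppConnection"].ConnectionString;

        container.Register<AppDbContext>(() => new AppDbContext(cs));
    }
}

Depending on the EF version you may still need provider registration (either the entityFramework config section in the host or code-based DbConfiguration), but the point stands: the presentation layer only holds a plain connection string and never references your repositories' internals.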
I'd appreciate any help with this.
We are currently working on remodelling a legacy project that is now heading towards being too much of a nightmare to maintain. So, we've decided to properly normalise the data structure and use Entity Framework 4 Code First development in an MVC 3 project using a repository pattern. The issue we face at the moment is that the legacy application is hosted on a server outside of our main infrastructure for security reasons; therefore all CRUD operations are done via web services, and there is no direct connection string to the MS SQL database.
My "proposed" solution is to define my repository contracts; during development there will be a direct connection to the database, but once deployed there won't be (there may be scope for getting that changed later). So, would it be reasonable for me to provide two concrete versions of the repository working to the same contract: one that uses LINQ to perform CRUD operations (for development, and possibly the infrastructure we can move to later), and another version that uses SOAP to pass objects (which would mean my POCOs would need to be marked Serializable) and performs the CRUD operations that way?
Does this sound feasible or is there a much better way of achieving this?
Thanks!
If you are responsible for developing both the client and the service, you can use a simple approach:
Use a shared interface for the repository and the service client
When working with the remote repository, inject the service client into the client - the service will use the repository implementation directly
When working with the local repository, inject the repository into the client directly
By using this approach you will have a single repository implementation for both scenarios, and your upper-level logic will not change; there will only be an additional web service layer for the remote repository. Your repository will have to encapsulate all data access logic and queries - no LINQ queries outside of the repository - which is not an issue in your architecture, because LINQ queries are not serializable without your own custom development or without using WCF Data Services (but that would have other impacts on your architecture).
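A rough sketch of that layout (all type names invented; bodies reduced to comments):

using System;
using System.Collections.Generic;

// Shared contract, referenced by the MVC project and by both implementations.
public interface IProductRepository
{
    Product GetById(int id);
    IList<Product> GetAll();
    void Save(Product product);
}

[Serializable]                    // needed once the objects travel over SOAP
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Development / direct-connection scenario: EF + LINQ inside the repository.
public class EfProductRepository : IProductRepository
{
    public Product GetById(int id) { /* ctx.Products.Find(id) */ return null; }
    public IList<Product> GetAll() { /* ctx.Products.ToList() */ return new List<Product>(); }
    public void Save(Product product) { /* attach/add + ctx.SaveChanges() */ }
}

// Deployed scenario: same contract, but every call is forwarded over SOAP.
// IProductServiceClient stands in for the generated service proxy.
public interface IProductServiceClient
{
    Product GetProduct(int id);
    IList<Product> GetProducts();
    void SaveProduct(Product product);
}

public class SoapProductRepository : IProductRepository
{
    private readonly IProductServiceClient _client;

    public SoapProductRepository(IProductServiceClient client) { _client = client; }

    public Product GetById(int id) { return _client.GetProduct(id); }
    public IList<Product> GetAll() { return _client.GetProducts(); }
    public void Save(Product product) { _client.SaveProduct(product); }
}

Your upper layers take IProductRepository in their constructors, so switching between the two implementations is purely a container registration change.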
By the way, it is not very common to place a repository behind a web service. It is more common to place the whole service layer / business logic behind the web service.
I don't know very much about WCF...
I want to do a clean job of serving entities on the client side using DataContracts. Imagine two DataContracts, "System" and "Building": a "System" may have many "Buildings" and a "Building" may have many "Systems". So, we have a many-to-many relationship between them.
In the service contract model, "System" has a "Buildings" property that is a collection. "Building" also has a collection of "Systems".
The WCF service uses DataSets for the underlying data access (with stored procedures for CRUD), and I have a join table between SYSTEM and BUILDING representing the relationship.
So, how can I implement this scenario cleanly? I want the clients to be able to get a simple representation of the "Buildings" in a "System"; for example, I could use:
var system = GetSystem(id);
foreach (Building building in system.Buildings)
{
    // do whatever with each building...
}
Thank you!
I think this question is too broad to cover in full detail, but I can give you a few pointers to get you started.
Forget about WCF and build the Data Access Layer (DAL). This should be a library which contains code to query the database and return strongly typed objects. This library might contain a method called GetBuildings() which returns a list of Building objects. The library might work with DataSets (and other database specific types), but should not expose DataSets to external callers.
Now that you have a library which can be used to get data from the database, write the WCF service. Code in the service component should call into the DAL and turn that information into DataContract objects to be sent over the web service boundary. Don't try to represent all your data in the DataContract objects - you want your data packets to be relatively small, so don't include information that isn't required. Balance this with trying to make as few web service calls as possible. In designing your DataContract classes, consider what the client application will be doing with the data.
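For the System/Building case from the question, one possible shape - a sketch, not the only option - keeps the navigation one-way so the graph serializes without circular references:

using System.Collections.Generic;
using System.Runtime.Serialization;

// The DataContracts are shaped for the client, not the database.
// The CLR class is named SystemDto to avoid clashing with the System namespace,
// but the contract name the client sees is still "System".
[DataContract(Name = "System")]
public class SystemDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }

    // One-way navigation: each BuildingDto does not carry its Systems back,
    // so the serialized graph stays small and has no circular references.
    [DataMember] public List<BuildingDto> Buildings { get; set; }
}

[DataContract(Name = "Building")]
public class BuildingDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

GetSystem(id) can then return a SystemDto whose Buildings the client iterates directly, as in the snippet in the question; if a client ever needs the reverse direction, a separate operation (say, a hypothetical GetSystemsForBuilding) keeps the graph flat.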
Write the Service Client component. This is code which makes calls to the WCF Service, and turns that information into Entity objects.
The final (and most rewarding) step is to write the client application logic. Now you have another set of issues to confront about how you will structure the client code (I recommend using MVVM). The client application should call into the Service Client component and use the data to meet the requirements of your application.
By following the above 4 steps, you should end up with:
A Data Access Layer that talks to the database.
A Service Layer, which knows nothing about the database but is able to fetch data from the Data Access Layer.
A Service Client layer, which knows nothing about databases but knows how to fetch data from the Service Layer.
Application code, which knows nothing about databases or web services, but calls into the Service Client layer to get data and presents the data to a User Interface.
Everyone will do this differently, but the main thing is to separate concerns by using a layered architecture.
I've been wading through all the new EF and WCF stuff in .NET 4 for a major project in its early stages, and I think my brain's now officially turned to sludge. It's the first large-scale development work I've done in .NET since the 1.1 days. As usual everything needs to be done yesterday, so I'm playing catch-up.
This is what I need to hammer together - any sanity checks or guidance would be greatly appreciated. The project itself can be thought of as essentially a posh e-commerce system, with multiple clients, both web and Windows-based, connecting to central servers with live data.
On the server side:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Underlying classes used for EF and WCF must be extensible both at a property and class (i.e. field and record) level, for validation, security, high-level auditing and other custom logic
On the client side:
WCF client
Underlying classes the same as the server side, but with some of the customisations not present
When an object is updated on the client, preferably only the modified properties should be sent to the server
The client-side WCF API details will probably end up being published publicly, so sensitive server-side implementation hints should not be leaked through the API unless absolutely unavoidable - this includes EF attributes in properties and classes
General requirements:
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it. Somehow they need to be made suitable for both EF and WCF on both client and server side, and have various layers of customisation, but appear as if custom-written for each scenario
Sorry this is so open-ended, but as I said, my brain's completely turned to sludge and I've confused myself to the point where I'm frozen.
Could anyone point me in the general direction of how to build the classes to do all this? Honestly, thanks very, very much.
A few hints in no particular order:
POCO would be the way to go to avoid dependencies on EF classes in your data objects.
Consider adding an intermediate layer based on data transfer objects to cope with your "only modified properties are passed" requirement (this will be the tricky part). These DTOs will be passed between the service and clients to exchange modifications; a rough sketch of one way to model this follows this list.
Use a stateless communication model (no WCF session) to be able to implement load-balancing and fail-over very easily.
Share the POCOs between the client and services, and use subclassing on the server to add the internal customised information.
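To make the DTO hint a bit more concrete, here is one invented way to model "only modified properties are sent"; there are other options (e.g. change-tracking proxies or custom serialization), and nothing here is prescriptive:

using System.Collections.Generic;
using System.Runtime.Serialization;

// Invented sketch of a partial-update DTO: the client fills in only the
// properties it changed and lists their names; the server applies just those.
[DataContract]
public class ProductUpdateDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
    [DataMember] public decimal? Price { get; set; }

    // Names of the properties above that the client actually modified.
    [DataMember] public List<string> ModifiedProperties { get; set; }
}

public class ProductPoco            // hypothetical shared POCO
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// Server side: apply only what the client says it changed.
public static class ProductUpdateApplier
{
    public static void Apply(ProductUpdateDto dto, ProductPoco target)
    {
        if (dto.ModifiedProperties == null) return;

        if (dto.ModifiedProperties.Contains("Name"))
            target.Name = dto.Name;

        if (dto.ModifiedProperties.Contains("Price") && dto.Price.HasValue)
            target.Price = dto.Price.Value;
    }
}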
You would end up on the server side with at least:
A project for the service contracts and the DTOs (shared)
A project for the POCOs (shared)
A project for the WCF service layer
A project for business logic (called by the WCF layer)
I have a few notes on your requirements:
A WCF service, the implementation using EF to connect to an SQL Server data store (that will probably end up having many hundreds of tables and all the other accoutrements of a complex DB system)
Are you going to build only a data access layer exposed as a set of WCF services, or heavy business logic exposed as WCF services? This strongly affects the rest of your requirements. In the former case, check WCF Data Services. In the latter case, check my other notes.
Underlying classes used for EF and WCF must be extensible both at a property and class (i.e. field and record) level, for validation, security, high-level auditing and other custom logic
Divide your data classes into two sets. Internally, your services will use POCO classes implemented as domain objects. Domain objects will be materialized/persisted by EF (you need .NET 4.0) and they will also contain custom logic. If you want to build a heavy business layer, you should also think about Domain-Driven Design - repositories, aggregate roots, etc.
Underlying classes the same as the server side, but with some of the customisations not present
The second set of data classes will be data transfer objects, which will be exposed by the WCF services and shared between the server and clients. Your domain objects will be converted to DTOs when sending data to the client, and DTOs will be converted to domain objects when data comes back from the client.
Your WCF services should be built on top of the business logic - domain objects / domain services. WCF services should expose chunky interfaces (instead of chatty CRUD interfaces), where a DTO transfers data from several domain operations. This will also help you improve performance by reducing the number of round trips between client and service.
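As a sketch of the chunky-versus-chatty point (all contract and DTO names are made up): one operation returns a DTO assembled from several domain operations, instead of the client calling three or four CRUD methods per screen.

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

// One coarse-grained operation: a single round trip returns everything one client screen needs.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderScreenDto GetOrderScreen(int customerId);
}

[DataContract]
public class OrderScreenDto
{
    [DataMember] public string CustomerName { get; set; }
    [DataMember] public List<OrderLineDto> OpenOrders { get; set; }
    [DataMember] public List<string> ValidationWarnings { get; set; }
}

[DataContract]
public class OrderLineDto
{
    [DataMember] public int OrderId { get; set; }
    [DataMember] public decimal Total { get; set; }
}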
When an object is updated on the client, preferably only the modified properties should be sent to the server
I think this is achievable only by correct definition of DTOs or perhaps by some custom serialization.
Network efficiency is important, insofar as we don't want to make it *in*efficient from day one - I can foresee data traffic and server workload increasing exponentially within a few years
As already mentioned, you have to design your services to be ready for load balancing, and you should also think about (distributed) caching - check AppFabric. A good idea is to use stateless services.
The database gets developed first, so the (POCO, C#) classes generated by EF will be based on it.
This seems like a simple requirement, but you can easily model a database that will be hard to use with Entity Framework.
The main advice:
Your project looks big and complex, so the first thing you should do is hire some developers with experience with WCF, EF, etc. Each of these technologies has some pitfalls, so it is a really big risk to use them at such a scale without having experienced people.