I have an application with the following DLLs:
Web project - MVC web application
WCF Services - services that the web application uses.
Model - Entity Framework code-first entities and the DbContext object.
Now, when I need to change something, say a function in the WCF services, I change the function, but the web project is also affected: I need to re-add the service reference and change the code that uses the WCF service, and sometimes the model changes as well...
As the Common Closure Principle (CCP) states, a package should not have more than one reason to change. If a change happens in an application that depends on a number of packages, ideally we want the change to affect only one package, rather than several.
This helps us identify the classes that are likely to change for the same reasons and package them together. If the classes are tightly coupled, put them in the same package.
So does my design violate this principle? If someone can suggest a better design, I would be glad.
A WCF service exposes a WSDL definition, which is the contract for using its operations and which also defines the entities it exchanges. One likely reason you need to re-add your service reference is that you are changing this contract.
For more information check here:
http://msdn.microsoft.com/en-us/library/aa738723(v=vs.110).aspx
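To illustrate: everything visible in a contract like the sketch below ends up in the WSDL, so changing any of it changes the contract and forces clients to update their service reference (the names are purely illustrative):

using System.Runtime.Serialization;
using System.ServiceModel;

// Operation signatures and data contract members are all part of the WSDL.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderDto GetOrder(int orderId);
}

[DataContract]
public class OrderDto
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string CustomerName { get; set; }
}

Renaming GetOrder, changing its parameters, or adding/removing a [DataMember] all change the WSDL, which is why the web project has to re-add the service reference.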
I think elements of this question have been answered elsewhere, but I couldn't find an answer to my specific circumstance.
I work with an enterprise application. This application interfaces with various 3rd-party APIs and services via what is currently a single class and a plethora of proxy DLLs. This means that each of these DLLs is referenced in the main project. In addition, over time, as we've added new service calls, a lot of code has been duplicated with only very small amendments. Most of the service calls do roughly the same thing and take largely the same objects as parameters.
As you can imagine, this presents us with a number of problems, not least of which is the time it takes to add a new service.
We have a task now to refactor and streamline this process to make it more manageable and resilient - I have an idea in my head and I've done a lot of research but I just wanted to see if anyone had any better ideas or similar experiences before I dive in.
What I want to do is add a facade layer so that the base code gets all the data commonly used and bundles it off to the facade along with a parameter stipulating which service it wants to call. The facade would then pass the data to either the proxies or a bridge, which would transform it into the right format for the target service, make the required calls in order, and return any responses back to the facade to pass on.
Although I have an idea of the architecture I want, I'm not 100% sure which way to go in terms of concrete C# code: whether to add the facade and bridge/adapter code in a new project which also references the proxy DLLs, or whether to go down the base/interface route and add the required transformation classes directly into the proxy DLLs.
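Roughly what I have in mind, sketched in C# (every name here is illustrative, not real code from our system):

using System.Collections.Generic;
using System.Linq;

// Placeholders standing in for the data the calling code already gathers.
public class CustomerData { }
public class ServiceResponse { }

public class ServiceRequest
{
    public string TargetService { get; set; }   // which 3rd-party service to call
    public CustomerData Customer { get; set; }  // the commonly used data
}

// Each adapter wraps one proxy DLL and owns the transformation to that service's format.
public interface IThirdPartyAdapter
{
    string ServiceKey { get; }
    ServiceResponse Execute(ServiceRequest request);
}

// The facade only routes; it never touches the proxy types directly.
public class IntegrationFacade
{
    private readonly IDictionary<string, IThirdPartyAdapter> _adapters;

    public IntegrationFacade(IEnumerable<IThirdPartyAdapter> adapters)
    {
        _adapters = adapters.ToDictionary(a => a.ServiceKey);
    }

    public ServiceResponse Call(ServiceRequest request)
    {
        return _adapters[request.TargetService].Execute(request);
    }
}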
Edited to add: I am unable to consider consolidating this functionality into a service or microservice due to wider infrastructure considerations that I am unable to discuss here.
Any suggestions appreciated!
I have been developing a WCF service for my project. I have multiple projects in my solution, explained below:
WCF Service - WCF Service Project
Business Logic - project contains domain logic
Data Access Layer - for accessing data
Core - contains only the business objects (many of them are sent as the response of a service call; these are the classes to be shared with the client)
Log - project to log errors and activities
The Service project calls into the Business Logic project for the respective operation, and the Business Logic project initializes objects of the classes defined in the Core project. These objects are sent as the response of the service call.
This works: I am able to share all public properties of the classes defined in the Core project. But I am unable to use the DataContract/DataMember attributes in the Core project, as it is a non-WCF project. I need a few things that require the DataMember attribute: for example, I don't want to share a property when its value is null, and I don't want to share certain properties of some objects with the client at all.
Please tell me if I am mistaken in this approach, and please help me achieve the above; I couldn't find any similar question on the forum.
Edit (let me try to explain it better):
All the projects are referenced in the WCF Service project, which consumes them. This was done to keep a logical separation.
For example, the Core project contains a class named User, and this class is the return type of a service API. When this API is called, the logic initializes a User object, and the object is returned as the result of that call.
In this case, I haven't used the [DataContract] attribute on the User class, and it works fine. Now I want to stop sharing a few properties of this class; for that I need to use the [DataContract]/[DataMember] attributes, which are not being resolved in the Core project.
OK, let me start by pointing out that data contracts and data members are not directly related to WCF. These attributes reside in a namespace that has nothing to do with services directly; they are related to serialization, and it's just a matter of adding a reference to the required assembly, System.Runtime.Serialization. I don't see why you can't add a reference to this assembly in your Core project.
The second question relates to "hiding" members. I don't think you have too many options here. If you decorate a property with the DataMember attribute, it will be serialized with the data contract, so there's no way to "hide" it. What you can do at best is not decorate a property with the DataMember attribute; in that case the property will not be serialized at all, in other words, it will be ignored during the serialization process.
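For example, once the Core project references System.Runtime.Serialization, the User class could look something like this (the members are made up for illustration):

using System.Runtime.Serialization;

// Lives in the Core project; needs nothing beyond a reference to System.Runtime.Serialization.
[DataContract]
public class User
{
    [DataMember]
    public int Id { get; set; }

    // EmitDefaultValue = false omits this member from the serialized payload
    // when its value is null (the default for a reference type).
    [DataMember(EmitDefaultValue = false)]
    public string Email { get; set; }

    // No [DataMember]: once the class is marked [DataContract],
    // this property is not serialized and never reaches the client.
    public string PasswordHash { get; set; }
}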
I'm migrating a system to version 2.0.
It's growing, so I want to build a WCF service. This is my first time working with WCF, so this may be kind of basic; still, any heads-up will be very much appreciated.
The existing system consists of a 3-layered process:
UI based on WebForms.
Business Layer.
DAL.
For this 2.0 version, what I'd like to achieve is to leave WebForms behind and move to a more MVC-oriented interface. And, as I've already said, to use some sort of web service as the layer I go through to connect to my DB source.
The question is as follows. I've been investigating and reading about WCF/RESTful services, and in IService.cs I can see the interface and the DataContract with its DataMembers. The head-scratching part is that I already have my classes defined in the other layers. So, what am I meant to do? Should I define my classes inside the WCF project one by one as well? Can't I just reference my DAL/object layer and use the resources available there?
Should I add another project to the existing VS2010 solution, or should I leave the WCF service on its own?
I'd also love to get some input on best practices, if you don't mind.
If REST is really what you are after, then there are other options for this than just WCF. WCF is generally overkill for most scenarios, so consider looking into:
ASP.NET Web API
ServiceStack
Both options work with ASP.NET MVC and ASP.NET WebForms, although most ASP.NET Web API examples pair it with ASP.NET MVC, which sounds like the scenario you want to use it in.
You can treat the ASP.NET Web API or ServiceStack as another layer in your architecture and just reference it like you would the business or data-access layers, as separate projects in your solution.
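For example, a minimal Web API controller could look something like this (Product and ProductRepository are placeholders for whatever your existing business/DAL projects already expose):

using System.Collections.Generic;
using System.Web.Http;

// Placeholders standing in for the existing classes; nothing is redefined for the service layer.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductRepository
{
    public IEnumerable<Product> GetAll() { return new List<Product>(); }
    public Product GetById(int id) { return null; }
}

public class ProductsController : ApiController
{
    private readonly ProductRepository _repository = new ProductRepository();

    // GET api/products
    public IEnumerable<Product> Get()
    {
        return _repository.GetAll();
    }

    // GET api/products/5
    public Product Get(int id)
    {
        return _repository.GetById(id);
    }
}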
A better way would be to add another, separate layer for the WCF service and give it a reference to the DAL, while also giving the BAL a reference to the DAL. And DON'T give any project a reference to the WCF project, because you want it to be RESTful (i.e. accessed only over protocols like HTTP).
Here, IService.cs is just the interface that exposes methods to the outside world; it describes what data is available, in what form, and where it can be found. Just implement that interface in a repository class inside the WCF project, which in turn gets the data from the DAL for you. The business layer is the only layer that talks to the service layer.
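For example, something along these lines (the entity and repository are made-up stand-ins for the real DAL types):

using System.Collections.Generic;
using System.ServiceModel;

// Stand-ins for the real DAL entity and repository.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerRepository
{
    public IList<Customer> GetAll() { return new List<Customer>(); }
}

// The contract exposed to the outside world.
[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    IList<Customer> GetCustomers();
}

// Implementation inside the WCF project; it only delegates to the DAL.
public class CustomerService : ICustomerService
{
    private readonly CustomerRepository _repository = new CustomerRepository();

    public IList<Customer> GetCustomers()
    {
        return _repository.GetAll();
    }
}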
Adding the DAL reference to the BAL is only for the metadata of the entities.
If I am wrong, kindly correct me.
Following the instructions for using the Reflection Provider (http://msdn.microsoft.com/en-us/library/dd728281.aspx), everything works well until I move the Order and Item classes to a class library and reference that class library from the web project containing the .svc file.
If I move the POCO classes into the WCF project, all goes well.
If I move the POCO classes out of the WCF project into a separate assembly, I get a 500 error with no explanation.
I want to be able to keep my POCO classes in a separate project and expose them with an OData endpoint. What am I doing wrong?
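For reference, my setup roughly follows the MSDN walkthrough, something like this (simplified; the real data source is omitted):

using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;

// These two classes are the ones I moved into the separate class library.
[DataServiceKey("OrderId")]
public class Order
{
    public int OrderId { get; set; }
    public List<Item> Items { get; set; }
}

[DataServiceKey("Product")]
public class Item
{
    public string Product { get; set; }
    public int Quantity { get; set; }
}

// The web project exposes them through the reflection provider via the .svc file.
public class OrderItemData
{
    public IQueryable<Order> Orders
    {
        get { return new List<Order>().AsQueryable(); }
    }

    public IQueryable<Item> Items
    {
        get { return new List<Item>().AsQueryable(); }
    }
}

public class OrderItems : DataService<OrderItemData>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}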
--UPDATE--
The scenario described above is meant to illustrate a problem I have found using the WCF OData Reflection Provider. It is not my real problem, but is easier to explain for illustrative purposes.
Try upgrading to the latest version of WCF Data Services (currently 5.3), if you aren't already on it. I reproduced your issue using the version of WCF Data Services that ships with .NET 4.5, but once I upgraded the references in both assemblies to the latest release of Microsoft.Data.Services using NuGet, the problem went away.
If you're already using the most up-to-date version of WCF Data Services, make sure that both assemblies are referencing the exact same version of WCF Data Services.
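If you're working from the Package Manager Console, the upgrade mentioned above is just (run it against each project that references WCF Data Services):

PM> Install-Package Microsoft.Data.Services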
If neither of these fixes your problem, add the following attribute to your DataService class to get a more detailed error message and stack trace:
[System.ServiceModel.ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class YourService : DataService<...>
And then please update your question with the results of that (if the solution doesn't immediately jump out from the stack trace).
(disclaimer: I usually don't like answers of the kind that don't help you with your problem but rather explain why your problem isn't the correct problem, but I think it's justified in this case :))
If you think about it, you don't really want to do that:
The Order and Item classes aren't really POCOs at all; they're not 'plain' C# objects; they have data attributes on them, which make them data transfer objects (DTOs).
They belong to the interface between your service and its clients;
The domain entities (or POCOs) Item and Order will, most likely, be a bit more complex, and contain other things besides data, such as operations and business logic.
I believe the correct way to go is to have a rich domain model, in which Order and Item contain a full set of attributes and operations, and on top of that, a DTO layer, which contains only those attributes that your service client needs.
Sending your POCOs over the wire was termed 'the stripper pattern', and I believe it's best avoided.
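To sketch the separation (the names and members are purely illustrative):

using System;
using System.Collections.Generic;
using System.Runtime.Serialization;

// Rich domain entity: data plus behaviour; never sent over the wire directly.
public class Order
{
    private readonly List<Item> _items = new List<Item>();

    public int Id { get; private set; }
    public IEnumerable<Item> Items { get { return _items; } }

    public void AddItem(Item item)
    {
        if (item == null) throw new ArgumentNullException("item");
        _items.Add(item); // business rules (stock checks, pricing) would live here
    }
}

public class Item
{
    public string Product { get; set; }
    public decimal Price { get; set; }
}

// DTO: only the data the service client needs, shaped for the wire.
[DataContract]
public class OrderDto
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public List<string> ProductNames { get; set; }
}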
I have descriptions of my application services in my own classes (a ServiceDescription class that contains a collection of ServiceMethod descriptions, to simplify).
Now, I want to expose each application service as one WCF service (one contract). The current solution is very lame: I have a console application that generates a *.svc file for each application service (ServiceDescription). One method (operation) is generated for each ServiceMethod.
This works well, but I would like to make it better. It could be improved with a T4 template, but I'm sure there is still a better way in WCF.
I would still like to have one *.svc file per application service, but I don't want to generate the methods (for the corresponding application service methods).
I'm sure there must be some interfaces that allow operations to be discovered dynamically, at runtime. Maybe IContractBehavior...
Thanks.
EDIT1:
I don't want to use a generic operation contract because I would like to keep the ability to generate a service proxy with all operations.
I'm sure that if I write the WCF service and operations by hand, WCF uses reflection to discover the operations in the service.
Now, I would like to customize this point so that reflection is not used and my own "operation discovery code" is used instead.
I think there is nothing wrong with static code generation in that case. In my opinion, it is a better solution than dynamic generation of contracts. Keep in mind that your contract is the only evidence you have/provide that a service is hosting a particular set of operations.
The main issue I see with the dynamic approach is versioning and compatibility. If everything is dynamically generated, you may end up transparently pushing breaking changes into the system and creating problems for existing clients.
If you have a code generation step, then when you plan to implement changes in the application services it will be easier to remember that the changes you make may have a huge impact.
But if you really want to dynamically handle messages, you could use a generic operation contract (with the Action property set to *), and manually route the messages to the application services.
Keep in mind that you would lose the ability to generate, from the service, a proxy containing the list of available operations.
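To sketch what such a catch-all contract looks like (the routing itself is only hinted at):

using System.ServiceModel;
using System.ServiceModel.Channels;

// A single untyped operation receives every incoming message, regardless of its Action.
[ServiceContract]
public interface IGenericService
{
    [OperationContract(Action = "*", ReplyAction = "*")]
    Message ProcessMessage(Message request);
}

public class GenericService : IGenericService
{
    public Message ProcessMessage(Message request)
    {
        // Inspect the Action header and route to the right application service
        // method here (e.g. by looking it up in the ServiceDescription metadata).
        string action = request.Headers.Action;

        // Placeholder reply; a real implementation would serialize the
        // application service's result into the response body.
        return Message.CreateMessage(request.Version, action + "Response", "handled: " + action);
    }
}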