Using SvcUtil I generated a proxy for a SOAP web service. This web service has many complex types and can change every year. Is there a tool I can use to generate a wrapper for all the classes? Using composition in the wrapper class I will call the proxy class.
Svcutil.exe generates POCO types on the client side according to the XSD parts of the WSDL. There shouldn't be a T4 template involved; that would be overly complex and inappropriate here. Svcutil.exe can create all the proxy classes you need.
If the complex types may change every year, you should consider versioning.
Once an interface is published, you should not change it. This applies to both operation contracts and data contracts.
You may refer to the article WCF for the Real World, and google "WCF versioning".
So basically you explicitly declare the XML target namespace in the contracts and map CLR namespaces to XML namespaces. When you need to change the complex types, you provide a new version of the WCF service. During the transition period, before all clients can upgrade to the latest version, you keep both versions running.
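As a minimal sketch of what that looks like (the namespace URIs and type names below are made up for illustration; the version lives in the XML namespace, and old and new contracts run side by side during the transition):

using System.Runtime.Serialization;
using System.ServiceModel;

// v1 data contract: the year/month in the namespace identifies the version.
[DataContract(Namespace = "http://schemas.example.com/billing/2014/01")]
public class Invoice
{
    [DataMember] public string Number { get; set; }
}

// v2 data contract in a new XML namespace (and, in practice, a separate CLR namespace).
[DataContract(Namespace = "http://schemas.example.com/billing/2015/01")]
public class InvoiceV2
{
    [DataMember] public string Number { get; set; }
    [DataMember] public decimal Tax { get; set; }   // new member added in this version
}

// Both service contracts stay exposed until all clients have moved to v2.
[ServiceContract(Namespace = "http://schemas.example.com/billing/2014/01")]
public interface IBillingService
{
    [OperationContract]
    void Submit(Invoice invoice);
}

[ServiceContract(Namespace = "http://schemas.example.com/billing/2015/01")]
public interface IBillingServiceV2
{
    [OperationContract]
    void Submit(InvoiceV2 invoice);
}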
Related
I am new to WCF services. I am aware of three ways to generate proxies:
Using Service reference
Using SvcUtil
Using ClientBase
But I am confused about which case calls for which approach. In my case I have to generate proxies for a third-party service for which I don't have the service code. I don't want to use Add Service Reference because it gives me the issue mentioned in this Stack Overflow question. So I want to use ClientBase, but I think I cannot use it without adding a service reference. I am pretty confused about when we should choose which kind of proxy.
In my case I have to generate proxies for third party service for which I don't have service code.
I will have multiple apps using this service.
In that case you are better off using SvcUtil, because it can generate a single library that all of your projects can use, even if they are .NET libraries. After it is generated you can always go in and tweak it.
Add Service Reference, on the other hand, is fine for a single .exe, but as you have discovered, it is annoying for multiple apps: you need to repeat the process and you end up with multiple definitions of the WCF types, which just increases maintenance.
Just be sure to keep the WCF client config in the app.config of your applications and not in the app.config of your class library (as the latter may not be read).
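As a quick illustration of that split (the client class and endpoint name here are hypothetical): the SvcUtil-generated client lives in the shared class library, and each application's own app.config supplies the matching endpoint, referenced by its configuration name:

// In any application that references the shared proxy library.
// "ThirdPartyEndpoint" must be defined in that application's app.config;
// a config file sitting in the class library project is not read at runtime.
var client = new ThirdPartyServiceClient("ThirdPartyEndpoint");
try
{
    var result = client.SomeOperation();
    client.Close();
}
catch
{
    client.Abort();
    throw;
}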
If your vendor had followed "WCF the Manual Way… the Right Way" it would have made your life easier.
SOAP purists would argue however that the only thing the vendor provides is a SOAP WSDL XML file from which you are required to generate your types anyway. (sadly, the default behaviour in .NET is back-to-front)
I'm designing a new solution that will consist of three projects:
"Server" - a WCF service
"Client" - a winforms app that will call the WCF service
"ServiceContract" - a class lib containing some base classes plus the WCF service contract (interface). This will obviously be referenced by the Server, and also by the Client (I'm using a ChannelFactory rather than have VS generate a service reference). The service contract looks something like this:-
[ServiceContract]
[ServiceKnownType("GetCommandTypes", typeof(CommandTypesProvider))]
public interface INuService
{
    [OperationContract]
    bool ExecuteCommand(CommandBase command);
}
It's a very basic operation - the client creates a "command" object and sends it to the server to be executed. There will be many different commands, all inheriting from CommandBase (this base class resides in the "ServiceContract" project). As I'm using the base class in the WCF operation signature, I have to specify the known types which I'm doing dynamically using the ServiceKnownType attribute. This references a helper class (CommandTypesProvider) that returns all types deriving from CommandBase.
I've created a simple proof of concept with a couple of derived command classes that reside in the "ServiceContract" project. The helper class therefore only has to reflect types in the executing assembly. This all works fine.
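For reference, the helper in the proof of concept looks roughly like this (a simplified sketch; ServiceKnownType calls a static method that takes an ICustomAttributeProvider and returns the types):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public static class CommandTypesProvider
{
    public static IEnumerable<Type> GetCommandTypes(ICustomAttributeProvider provider)
    {
        // Only works while the commands live in this assembly.
        return Assembly.GetExecutingAssembly()
            .GetTypes()
            .Where(t => !t.IsAbstract && typeof(CommandBase).IsAssignableFrom(t))
            .ToList();
    }
}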
Now in my "real" solution these command classes will be in different projects. These projects will reference the ServiceContract project, rather than vice-versa, which makes it difficult (or impossible?) for the helper to reflect the "command" assemblies. So my question is, how can I provide the known types?
Options I've thought about:-
The "Server" and "Client" projects will reference both the "ServiceContract" project and the various "command" projects. My helper could reflect through AppDomain.CurrentDomain.GetAssemblies(), but this fails because the "command" assemblies are not all loaded (I could force this by referencing a type in each, but that doesn't feel right - I want it to be a dynamic, pluggable architecture, and not have to modify code whenever I add a new command project).
Specify the known types in config. Again it would be nice if the app was dynamic, rather than have to update the config each time I add a command class.
Is there any way to access the underlying DataContractSerializer on both the client and server, and pass it the known types? I guess I'll still have the same issue of not being able to reflect the assemblies unless they've been loaded.
Refactor things to enable the ServiceContract project to reference the various command projects. I can then reflect them using 'Assembly.GetReferencedAssemblies()'. I guess the various command classes are part of the service contract, so perhaps this is the way to go? Edit: looks like this has the same problem of only finding loaded assemblies.
Any ideas greatly appreciated! Is it something that can be achieved or do I need to rethink my architecture?!
Thanks in advance.
One thing to consider is using a DataContractResolver.
A few resources:
WCF Extensibility – Data Contract Resolver by Carlos
Building Extensible WCF Service Interfaces With DataContractResolver by Kelly
Configuring Known Types Dynamically - Introducing the DataContractResolver by Youssef
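For a flavour of what that looks like, here is a minimal sketch (not taken from the articles above): it writes the assembly-qualified CLR type name on the wire for anything deriving from CommandBase, and resolves it back with Type.GetType, so no known-types list is needed as long as the command assemblies are loadable on both sides. The XML namespace below is a made-up placeholder.

using System;
using System.Runtime.Serialization;
using System.Xml;

public class CommandTypeResolver : DataContractResolver
{
    private const string CommandNamespace = "http://commands.example.com";

    // Serialization: emit the CLR type's assembly-qualified name as the xsi:type name.
    public override bool TryResolveType(Type type, Type declaredType,
        DataContractResolver knownTypeResolver,
        out XmlDictionaryString typeName, out XmlDictionaryString typeNamespace)
    {
        if (typeof(CommandBase).IsAssignableFrom(type))
        {
            var dictionary = new XmlDictionary();
            typeName = dictionary.Add(type.AssemblyQualifiedName);
            typeNamespace = dictionary.Add(CommandNamespace);
            return true;
        }
        return knownTypeResolver.TryResolveType(type, declaredType, null,
            out typeName, out typeNamespace);
    }

    // Deserialization: map the name back to a CLR type.
    public override Type ResolveName(string typeName, string typeNamespace,
        Type declaredType, DataContractResolver knownTypeResolver)
    {
        if (typeNamespace == CommandNamespace)
            return Type.GetType(typeName);
        return knownTypeResolver.ResolveName(typeName, typeNamespace, declaredType, null);
    }
}

You then attach the resolver to the DataContractSerializerOperationBehavior of each operation on both the client and the service (it has a DataContractResolver property); the linked articles cover the wiring in detail.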
Thanks for the replies regarding the DataContractResolver, guys. I probably would have gone down this route normally, but as I was using Windsor I was able to come up with a solution using that instead.
For anyone interested, I added a Windsor installer (IWindsorInstaller) to each of my "command" projects, which are run using Windsor's container.Install(FromAssembly.InDirectory.... These installers are responsible for registering any dependencies needed within that project, and they also register all the command classes, which my known-types helper can then resolve from the container.
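Roughly what that looks like (a trimmed-down sketch; registration details will differ per project, and the helper assumes commands can be resolved without extra dependencies):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using Castle.MicroKernel.Registration;
using Castle.MicroKernel.SubSystems.Configuration;
using Castle.Windsor;

// One of these lives in each "command" project.
public class CommandsInstaller : IWindsorInstaller
{
    public void Install(IWindsorContainer container, IConfigurationStore store)
    {
        // Register every concrete command in this assembly against CommandBase.
        container.Register(Classes.FromThisAssembly()
            .BasedOn<CommandBase>()
            .WithService.Base()
            .LifestyleTransient());
    }
}

// The known-types helper just asks the container for everything registered as CommandBase.
public static class CommandTypesProvider
{
    public static IWindsorContainer Container { get; set; }

    public static IEnumerable<Type> GetCommandTypes(ICustomAttributeProvider provider)
    {
        return Container.ResolveAll<CommandBase>()
            .Select(command => command.GetType())
            .ToList();
    }
}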
I am trying to write some code in C# that will call a WCF service on the fly by importing the WSDL, examining it and then making calls to it dynamically.
The service I am calling can change from time to time - so if it does I want my client to know about new methods and new input parameters and output parameters to the calls, without rebuilding my client.
One possible solution to this is to import and compile a service reference on the fly.
Outlined here: Creating an assembly on the fly from a WSDL
I would like to avoid the generation of an assembly and then reflecting over it if possible.
I looked into the code of the dynamic proxy in the link, and they use a framework class, WsdlImporter, to do the import. So I thought: great, I can use that to examine the WSDL schema and determine what calls are present and what inputs and outputs are available.
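Here's roughly what I'm doing (a trimmed-down sketch; the metadata URL is just a placeholder):

using System;
using System.Collections.ObjectModel;
using System.ServiceModel.Description;

public static class WsdlInspector
{
    public static void DescribeService(string wsdlUrl)
    {
        // Download the metadata (WSDL plus imported schemas) over HTTP GET.
        var mexClient = new MetadataExchangeClient(
            new Uri(wsdlUrl), MetadataExchangeClientMode.HttpGet);
        MetadataSet metadata = mexClient.GetMetadata();

        // Import the contracts and walk the operations.
        var importer = new WsdlImporter(metadata);
        Collection<ContractDescription> contracts = importer.ImportAllContracts();

        foreach (ContractDescription contract in contracts)
        {
            foreach (OperationDescription operation in contract.Operations)
            {
                foreach (MessageDescription message in operation.Messages)
                {
                    foreach (MessagePartDescription part in message.Body.Parts)
                    {
                        // part.Type comes back null - this is the missing type information.
                        Console.WriteLine("{0} / {1}: {2}", operation.Name, part.Name, part.Type);
                    }
                }
            }
        }
    }
}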
The problem is that the type information is missing in the MessagePartDescription objects that the WsdlImporter creates. Apparently this is missing because it cannot find the types yet - see the response to the question from Brian.
So any advice on how I should proceed? Am I completely on the wrong track here?
This is probably not an answer but I will post it as one to fully describe my opinion.
Dynamic proxy:
IMO this is an example of the wrong use of the technology. It is elementary WSDL behaviour: if the WSDL changes, you have to change the client, or you have to do proper WSDL versioning and create a new client.
You still have to somehow tell your client to get the WSDL - does that mean you will parse the WSDL before each call? That doesn't seem like a good idea.
Information about CLR types is really not part of the WSDL, because by default WSDL is generated to be interoperable, and CLR types are not needed for interoperability. When you create a service proxy with Add Service Reference or Svcutil, it generates code for the types defined in the WSDL. That code then needs to be compiled.
You can try NetDataContractSerializer instead of the default DataContractSerializer. NetDataContractSerializer adds CLR type information to the serialized messages, but I still expect that new types must be known to your clients - which means deploying a new assembly with the types and having the clients use it. That is almost the same as simply deploying an assembly with a new static client proxy.
Dynamic WF client
I also don't see much use for this architecture - you still need to change the client to reflect new WF steps, don't you?
Changing the WF
Are we talking about Windows Workflow Foundation? I can hardly imagine a scenario where you create a WF, expose it as a service and then change it. When you expose a WF as a service you are probably defining a long-running WF. Long-running WFs use persistence, which is based on serialization (at least in WF 3.5, but I believe it is the same in WF 4). When you change the WF definition, all persisted WFs are most probably doomed because they will never deserialize again. This situation is usually solved by parallel deployment of the new and old versions, where the old version is only used to finish incomplete WFs. Again, that means new clients.
If you look at the problem from a different angle: do you need to regenerate the proxy each time, or do you need a contract that continues to work when things change?
WCF has a mechanism for this, IExtensibleDataObject (forward-compatible data contracts); see: http://msdn.microsoft.com/en-us/library/ms731083%28v=VS.100%29.aspx
Best practices for versioning contracts can be found here.
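A minimal sketch of what that looks like on a data contract (the type and namespace below are made up; the ExtensionData bucket is where the serializer parks any elements this version of the contract doesn't recognise, so they survive a round trip instead of being lost):

using System.Runtime.Serialization;

[DataContract(Namespace = "http://schemas.example.com/orders")]
public class Order : IExtensibleDataObject
{
    [DataMember]
    public string Number { get; set; }

    // Unknown elements sent by newer versions of the contract end up here
    // and are re-emitted when this object is serialized again.
    public ExtensionDataObject ExtensionData { get; set; }
}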
I have a small WPF application based on MVVM principles. So far I have had dummy Model classes in my app. Now I plan to call a web service that uses XSD. Looking forward, I would like to use these XSD types as Models.
I can see at least two ways of doing this (there could be more), for example:
Add a reference to the web service. This means appropriate classes for the types defined in the XSD will be generated by VS, and I could then use these classes as Models. There is a potential namespace conflict (not a major one) when references are added, if two or more web services consumed in the app work with the same XSD types.
Write my own Model classes, which can be populated from the XML returned by the web service call. The Model can validate the XML against the XSD on initialization (see the sketch below). This way no references are added and the web service can be called using HTTP GET/POST. But this involves manually updating the Models every time the XSD changes.
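For the second option, the validation step I have in mind is something like this (a rough sketch; the schema path and XML string are placeholders):

using System.IO;
using System.Xml;

public static class ModelXmlValidator
{
    // Throws XmlSchemaValidationException if the document does not match the schema.
    public static void Validate(string xml, string xsdPath)
    {
        var settings = new XmlReaderSettings { ValidationType = ValidationType.Schema };
        settings.Schemas.Add(null, xsdPath);   // null = take the target namespace from the XSD itself

        using (var stringReader = new StringReader(xml))
        using (var reader = XmlReader.Create(stringReader, settings))
        {
            while (reader.Read()) { }          // reading the whole document triggers validation
        }
    }
}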
Can you please advise on an optimal approach to using XSD types as Models, based on your past experience with similar scenarios of using types defined in an XSD in an MVC/MVVM app?
It depends on the scenario of your application. If you plan to distribute your client all over the world and change the service interface and data objects often, you have to find a way to build your Models in the client from the XSD.
Otherwise the first approach is much easier.
I have a web service with a function that returns a type (Foo). If I consume this web service in .NET through the 2.0 generated proxies, it creates a class called Foo in the generated proxy. If I have the DLL that contains that class (Foo) - the same DLL the web service uses - is there any way to have the proxy use that class instead of creating a custom proxy class? I'm looking for something similar to what remoting does... but not remoting.
I've seen 3 ways of doing this:
1) Let Visual Studio generate the proxy and then change the classes in the proxy to the full class names from the DLL, by hand. Works, but you would have to do this again every time you update your proxy. Plus it's really dirty, isn't it?
2) Use a generic class/method that creates deep copies of your proxy objects into the "real" objects by reflection (see the sketch after this list). Works, but of course with a small performance trade-off.
3) Use WCF, where you can reference the DLL with the data contracts (your data classes) and use them instead of creating any proxy by code generation.
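A sketch of what option 2 can look like (simplified: it copies only public properties with matching names and assignable types, and doesn't handle nested objects or collections):

using System;

public static class ProxyMapper
{
    // Copies matching public properties from a generated proxy object
    // onto a new instance of the "real" type from the shared DLL.
    public static TTarget Map<TTarget>(object proxyObject) where TTarget : new()
    {
        var target = new TTarget();
        foreach (var targetProperty in typeof(TTarget).GetProperties())
        {
            var sourceProperty = proxyObject.GetType().GetProperty(targetProperty.Name);
            if (sourceProperty != null && targetProperty.CanWrite &&
                targetProperty.PropertyType.IsAssignableFrom(sourceProperty.PropertyType))
            {
                targetProperty.SetValue(target, sourceProperty.GetValue(proxyObject, null), null);
            }
        }
        return target;
    }
}

// Usage, assuming Foo is the class from the shared DLL and fooProxy came back from the web service:
// Foo realFoo = ProxyMapper.Map<Foo>(fooProxy);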
I think the key issue here is in generating the proxies. I've generally used two different approaches to web services:
1) Traditional services, where you expose methods and a client generates the proxy in Visual Studio to consume the methods.
2) Request/Response services, where the exposed "service" is more of a pass-through and the "actions" being performed are encapsulated in the objects being sent to and received from the service. These actions would be in that shared library that both the server and the client have.
In the former case I often run into this same problem, and I don't really think there's a solution, at least not one that Visual Studio is going to like at all. You could perhaps manually modify the generated proxies to use the other classes, but then you'd have to repeat that step any time you re-generate. Alternatively, you can generate outside of Visual Studio with something like CodeSmith (the older version is free, but depends on .NET 1.1), which will require some work to create a template for the proxies, plus stepping outside the IDE to re-generate whenever you need to update them.
I can recommend a good tool for the latter, however, and that would be the Agatha project. It takes the approach of separating the "service" from the "actions" that are being performed, and makes the approach of the shared library very easy. Such a re-architecture may very well be out of the question for the project you're working on depending on your schedule, but it's definitely something to explore for future projects.
You could write your own proxy class, or you could implement a constructor on your Foo class that takes an instance of the generated Foo class and copies over the data as appropriate.
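A minimal sketch of that constructor approach (the member names and the GeneratedProxy namespace are hypothetical; it assumes you can add a partial class, or edit the source, of the DLL's Foo):

// Lives alongside the Foo class in the shared DLL.
public partial class Foo
{
    public Foo() { }

    // Mapping constructor: copies data from the type the web reference generated.
    public Foo(GeneratedProxy.Foo proxyFoo)
    {
        this.Id = proxyFoo.Id;          // hypothetical members, shown for illustration
        this.Name = proxyFoo.Name;
        // ...copy the remaining members the same way.
    }
}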