Why is a WCF service that receives too many parameters not breaking? - c#

OK this is a bit of a weird question. I have come across a section of code in a project which programmatically creates a ChannelFactory based on an interface. A method in the interface takes three parameters. Now when I looked at the actual WCF service's code, that method only expects two parameters but is being sent three from the client.
I would expect the service to break when receiving the extra parameter but it doesn't. Does anyone have any idea why this is working?

The service call is encoded onto the wire and sent to the receiver. Depending on how strict the XML/binary/JSON parser is, the extra parameter is simply ignored.
When the stub code on the WCF server receives the wire call, it does not walk the serialized packet and say, "They are calling MethodX and I got param1, param2 and param3; let's try stuffing them into the method... Oh. It only takes param1 and param2. Boom."
Instead, it says something like:
"They called MethodX. Great. What parameters does it take? Param1 and Param2. Let's see if these are present in the packet. Oh! They are here. Sweet. I'll use them." It simply ignores the rest.
Some notes:
It would not be difficult to write a stub that fails when extra data is present. In fact, WCF does fail on some kinds of extra data while tolerating others.
There is a benefit to this: it gives you forward and potentially backward compatibility (i.e., a Client V2 talking to a Server V1 will still work, with the server simply ignoring the extra parameter [which you may or may not want]).
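As a sketch of the mismatch being described (all type and member names here are hypothetical, not from the original project):

```csharp
using System.ServiceModel;

// Client-side contract the ChannelFactory was built from -- three parameters.
[ServiceContract]
public interface ICalculatorClient
{
    [OperationContract]
    int Add(int a, int b, int roundingMode); // extra parameter
}

// Server-side contract -- only two parameters.
[ServiceContract]
public interface ICalculatorService
{
    [OperationContract]
    int Add(int a, int b);
}
```

The server's deserializer looks up only `a` and `b` in the incoming message; the serialized `roundingMode` element is simply never read.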

WCF has Forward Compatible Data Contracts.
If the types in the data contract implement IExtensibleDataObject, the service behavior is to store extra input data rather than throw an error when it is found.
When the WCF infrastructure encounters data that is not part of the original data contract, the data is stored in the property and preserved. It is not processed in any other way except for temporary storage. If the object is returned back to where it originated, the original (unknown) data is also returned. Therefore, the data has made a round trip to and from the originating endpoint without loss.
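A minimal round-trip sketch of this behavior (type and member names are made up): a "V2" contract with an extra member is deserialized into a "V1" type that implements IExtensibleDataObject, and the unknown data survives re-serialization.

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.Text;
using System.Xml;

// "V2" of the contract, with an extra member the "V1" side doesn't know about.
[DataContract(Name = "Contact", Namespace = "http://example.org")]
public class ContactV2
{
    [DataMember] public string Name { get; set; }
    [DataMember] public string Nickname { get; set; } // unknown to V1
}

// "V1" of the contract; unknown data is parked in ExtensionData.
[DataContract(Name = "Contact", Namespace = "http://example.org")]
public class ContactV1 : IExtensibleDataObject
{
    [DataMember] public string Name { get; set; }
    public ExtensionDataObject ExtensionData { get; set; }
}

public static class RoundTrip
{
    public static string Serialize<T>(T obj)
    {
        var serializer = new DataContractSerializer(typeof(T));
        using (var ms = new MemoryStream())
        {
            serializer.WriteObject(ms, obj);
            return Encoding.UTF8.GetString(ms.ToArray());
        }
    }

    public static T Deserialize<T>(string xml)
    {
        var serializer = new DataContractSerializer(typeof(T));
        using (var reader = XmlReader.Create(new StringReader(xml)))
        {
            return (T)serializer.ReadObject(reader);
        }
    }
}
```

Deserializing a ContactV2 payload as ContactV1 and serializing it again still emits the Nickname element, because it was preserved in ExtensionData.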

Related

Why is passing a dataset to a web service method not good?

Please explain in detail. I have been telling my mentor that everything I've researched on WCF, and many programmers all over the net, say it is NOT good to pass DataSets to the service. Why is that? I created a BUNCH of classes in the service and they work great with the application, but he says that I just wasted time doing all that work; he has a better way of doing it.
He keeps telling me to create a SINGLE OperationContract. There will be many functions in the service, but the OperationContract will take the string name of the function and the dataset providing the details for that function.
Is his way bad practice? Not safe? I'm just trying to understand why many people say don't use datasets.
The first reason is interoperability. If you expect consumers of your service to be implemented in any other technologies other than .NET, they may have lots of trouble extracting or generating the data in the DataSet, as they will have no equivalent data structure on their end.
Performance can be affected quite a bit, as well. In particular, the serialization format for untyped datasets can be huge because it will contain not just the data, but also the XSD schema for the data set, which can be quite large depending on the complexity of the DataSet. This can make your messages a lot larger, which will use more network bandwidth, take longer to transfer (particularly over high latency links), and will take more resources at the endpoint to parse.
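A quick way to see the size cost (a sketch with made-up sample data): serialize the same DataSet with and without its embedded XSD schema.

```csharp
using System.Data;
using System.IO;

public static class SchemaSizeDemo
{
    // Builds a tiny untyped DataSet with one table and one row.
    public static DataSet BuildSample()
    {
        var ds = new DataSet("Contacts");
        DataTable table = ds.Tables.Add("Contact");
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Email", typeof(string));
        table.Rows.Add("Rob", "rob@example.com");
        return ds;
    }

    public static string Serialize(DataSet ds, XmlWriteMode mode)
    {
        // WriteSchema embeds the full XSD; IgnoreSchema writes the data only.
        var sw = new StringWriter();
        ds.WriteXml(sw, mode);
        return sw.ToString();
    }
}
```

Even for this trivial one-row table, the WriteSchema output carries a complete inline schema on top of the data, and the gap grows with the number of tables, columns, and constraints.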
Say the web service you have does something specific: for example, it sends a bunch of emails. Say this service has one method that sends an email. The method should accept an email address, a subject and a body.
Now if we send a DataSet with the required information, the service would have to know the shape of the data and parse it.
Alternatively, if the web service accepted an object with properties for email address, subject and body, it could be used in more than one place and would be less prone to going wrong due to a malformed DataSet.
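For the email example above, the explicit alternative is a small self-describing contract (a sketch; all names are hypothetical):

```csharp
[DataContract]
public class EmailRequest
{
    [DataMember] public string Address { get; set; }
    [DataMember] public string Subject { get; set; }
    [DataMember] public string Body { get; set; }
}

[ServiceContract]
public interface IEmailService
{
    [OperationContract]
    void SendEmail(EmailRequest request);
}
```

Any client, .NET or otherwise, can generate an equivalent type from the WSDL, which is exactly what a DataSet parameter does not give you.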
One more thing: you can get incorrect data using DataSet.
For example a value in the DataSet might look like the following before serialization:
<date_time>12:19:38</date_time>
In the client it would come with an offset specified:
<date_time>12:19:38.0000000-04:00</date_time>
The client code would adjust this to its local time (much like Outlook when you schedule an appointment with someone in a different timezone).
More details can be found here.
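One way to control this on the serializing side, as I understand the modes, is the DataColumn.DateTimeMode setting, which determines whether a timezone offset is written (a sketch; the column name follows the example above):

```csharp
using System;
using System.Data;
using System.IO;

public static class DateTimeModeDemo
{
    // Serializes a single date_time value under the given DateTimeMode.
    public static string SerializeDate(DataSetDateTime mode)
    {
        var ds = new DataSet("Sample");
        DataTable table = ds.Tables.Add("Row");
        DataColumn col = table.Columns.Add("date_time", typeof(DateTime));
        col.DateTimeMode = mode; // must be set before the table has data
        table.Rows.Add(new DateTime(2008, 10, 1, 12, 19, 38));

        var sw = new StringWriter();
        ds.WriteXml(sw);
        return sw.ToString();
    }
}
```

With DataSetDateTime.Unspecified the value is written without an offset, so the receiving client does not shift it to its local time; the default (UnspecifiedLocal) and Local modes append the serializing machine's offset, which produces the adjustment described above.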
Using WCF is not just an implementation decision; it is a design choice. When you choose to use WCF you have to leave many of your treasured OO principles behind and embrace a new set of patterns and principles associated with service orientation.
One such principle is that of explicit contracts: A service should have well defined public contracts (see this Wikipedia article). This is crucial for interoperability, but is also important so clients have an accurate picture of what functionality your service provides.
A DataSet is basically just a big bag of "stuff" - there is no limitation to what it could contain - or any well defined contract that explains how I can get data out. By using a DataSet you introduce inherent coupling between the client and the server - the client has to have "inside information" about how the DataSet was created in order to get the data out. By introducing this level of coupling between the client and service you have just negated one of the main motivations for using WCF (precisely that of decoupling the two areas of functionality to allow for independent deployment and/or development lifecycle).

C# - Can WebMethodAttribute adversely affect performance?

I just noticed my Excel service running much faster. I'm not sure if there is an environmental condition going on. I did make a change to the method. Where before it was
class WebServices
{
    [WebMethod( /*...*/ )]
    public string Method() { /*...*/ }
}
Now its attribute is removed and the method moved into another class
class NotWebService
{
    public string Method() { /*...*/ }
}
But, I did this because the Method is not called or used as a service. Instead it was called via
WebServices service = new WebServices();
service.Method();
and inside the same assembly. Now when I call the method
NotWebService notService = new NotWebService();
notService.Method();
The response time appears to have gone up. Does the WebMethodAttribute have the potential to slow local calls?
Indeed, the WebMethod attribute adds a lot of functionality in order to expose the method through an XML web service.
Part of the overhead comes from the following configurable features of a web method:
BufferResponse
CacheDuration
Session Handling
Transaction Handling
For more information, just check the WebMethod attribute documentation.
Regards,
I know this is an old question, but to avoid misinformation I feel the need to answer it anyway.
I disagree with wacdany's assessment.
A method marked as a webmethod should have no additional overhead if called directly as a method, rather than via HTTP. After all, it compiles to exactly the same intermediate language, other than the presence of a custom attribute.
Now, adding a custom attribute can impact performance if it is one of the ones that is special to the compiler or runtime. WebMethodAttribute is neither.
I would next consider whether there is any special overhead to constructing the web service object. If you have added a constructor there might be some, but by default there is no real overhead, since the constructors of the base classes are trivial.
Therefore, if you really were calling the method directly, there should be no real overhead, despite it also being accessible as a web service action. If you experienced a significant difference, it would be wise to verify that you were constructing the real WebServices class, and not somehow inadvertently using a web service proxy, perhaps due to adding a web service reference to your project.

Why does WCF sometimes add "Field" to end of generated proxy types?

Basically, I have a server-side type "Foo" with members X and Y. Whenever I use Visual Studio's "Add Service Reference", I see that the WSDL and the generated proxy both append the word "Field" to all the members and change the casing of the first letter, i.e., "X" and "Y" are renamed "xField" and "yField". Any idea why this is happening? I can't figure out the pattern.
Details -- I have a legacy ASMX web service which exposes a "Foo" type. I created a new WCF service that's a wrapper around that old web service -- the new service just wraps those methods and maybe updates the values of a few fields, but it exposes the exact same methods and returns the exact same types. I've tried re-creating the references several times, and every time, it always renames my fields: the variable "STUFF" is exposed in the WSDL and proxy as "sTUFFField". Variable "X" is exposed as "xField", etc.
Funny thing is I can't figure out the pattern -- I tried creating a new ASMX web service as a test and wrapping that -- variables are not renamed then. So I can't figure out the pattern of why/when WCF renames variables.
Anybody know?
I had the same issue, and sergiosp's answer got me headed in the right direction. Just adding some additional info to hopefully help someone else.
Adding [System.ServiceModel.XmlSerializerFormatAttribute()] to the interface, and re-generating the client code resolved the problem for me.
public interface IMyService
{
    [System.ServiceModel.XmlSerializerFormatAttribute()]
    [System.ServiceModel.OperationContract]
    recordResponse GetRecord(recordRequest request);
}
I had the same problem, but I was able to find a solution.
If you add the [DataContractFormat] attribute to the interface, you will end up with the "XFieldField" case.
But if you replace it with [XmlSerializerFormat] on the interface, the names in the generated proxy will not be changed.
Typically, the generated proxy will have "XField" and "YField" as internal/protected/private fields, and expose the values through properties called "X" and "Y". There are options you can set when creating the proxy client to tweak that to your liking, I think.
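A simplified sketch of what that generated proxy pattern looks like (real generated code in reference.cs also carries serialization attributes and change-notification plumbing):

```csharp
// Private backing fields get the "Field" suffix; the public surface
// exposes the original names as properties.
public partial class Foo
{
    private int xField;
    private int yField;

    public int X
    {
        get { return this.xField; }
        set { this.xField = value; }
    }

    public int Y
    {
        get { return this.yField; }
        set { this.yField = value; }
    }
}
```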
UPDATE: I don't seem to find any switches or options to control this behavior. It might depend on which serializer (DataContractSerializer vs. XmlSerializer) WCF uses for creating the client proxy.
In the end, it's really more or less just an issue of coding style - functionally, it shouldn't make a difference.
Marc
I had this problem too, but from the client I was still getting Field at the end of the class members even after making the mentioned change at the interface.
The problem was, I was using a DataContractSerializer to work with disk file serialized requests (during the test of our service, we were getting serialized requests from the provider, to be able to debug before going live).
After changing the DataContractSerializer to a XmlSerializer, specifying on its constructor the root element (by a typeof() call) and the rootnamespace (because by default, XmlSerializers write the standard namespace), I could deserialize the requests and work perfectly with the WCF Service.
Hope this helps somebody. I lost so much time with this "issue".
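A sketch of that XmlSerializer usage (the request type and namespace here are illustrative stand-ins, not the actual service types): the (Type, defaultNamespace) constructor makes the serializer read and write elements in the service's namespace.

```csharp
using System.IO;
using System.Xml.Serialization;

// Hypothetical request type standing in for the service's request class.
public class RecordRequest
{
    public string Id { get; set; }
}

public static class RequestIo
{
    // typeof() supplies the root type; the second argument is the
    // default XML namespace for all serialized elements.
    static readonly XmlSerializer Serializer =
        new XmlSerializer(typeof(RecordRequest), "http://somenamespace.com/contracts");

    public static string Save(RecordRequest request)
    {
        var sw = new StringWriter();
        Serializer.Serialize(sw, request);
        return sw.ToString();
    }

    public static RecordRequest Load(string xml)
    {
        return (RecordRequest)Serializer.Deserialize(new StringReader(xml));
    }
}
```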
Adding XmlSerializerFormat worked for me. Got solution from http://geekswithblogs.net/mipsen/archive/2010/02/06/field-postfix-in-wcf-reference.aspx
[ServiceContract(Namespace="http://somenamespace.com/contracts")]
public interface ISchemaService
{
    [OperationContract]
    [XmlSerializerFormat]
    void DoSomething(GeneratedType data);
}

WCF: Individual methods or a generic ProcessMessage method accepting xml

My company is developing an application that receives data from another company via TCP sockets and xml messages. This is delivered to a single gateway application which then broadcasts it to multiple copies of the same internal application on various machines in our organisation.
WCF was chosen as the technology to handle the internal communications (internally bi-directional). The developers considered two methods.
1. Individual methods exposed by the WCF service for each different message received by the gateway application. The gateway application would parse the incoming external message and call the appropriate WCF service method. The incoming XML would be translated into DataContract DTOs and supplied as arguments to the appropriate WCF method.
2. The internal application exposes a WCF service with one method, "ProcessMessage", which accepts an XML string message as an argument. The internal app would parse and then deserialize the received XML and process it accordingly.
The lead developer thought option two was the better option as it was "easier" to serialize/deserialize the XML. I thought the argument didn't make sense, because DataContracts are serialized and deserialized by WCF, and by using WCF we get better typing of our data. In option 2 someone could call the WCF service and pass in any string. I believe option 1 presents a neater interface and makes the application more maintainable and usable.
Both options would still require parsing and validation of the original xml string at some point, so it may also be a question where is the recommended place to perform this validation.
I was wondering what the current thoughts are for passing this kind of information and what people’s opinions on both alternatives are.
Option 1 is suited if you can ensure that the client always sends serialized representations of data contracts to the server.
However, if you need some flexibility in the serialization/deserialization logic and don't want to be tightly coupled to DataContracts, then option 2 looks good. It is particularly useful when you want to support alternate forms of XML (say, Atom representations, raw XML in a custom format, etc.).
Also in option 2 inside the ProcessMessage() method, you have the option of deciding whether or not to deserialize the incoming xml payload (based on request headers or something that is specific to your application).
In option 1, the WCF runtime will always deserialize the payload.
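The two designs can be sketched as service contracts like these (all type and member names are hypothetical):

```csharp
// Option 1: an explicit operation per message type, with typed DTOs.
[ServiceContract]
public interface IInternalService
{
    [OperationContract]
    void PriceUpdated(PriceUpdateDto update);

    [OperationContract]
    void OrderFilled(OrderFillDto fill);
}

// Option 2: a single generic entry point; the payload is opaque to WCF.
[ServiceContract]
public interface IInternalGateway
{
    [OperationContract]
    void ProcessMessage(string xml);
}
```

With option 1, WCF itself rejects a message that doesn't deserialize into the expected DTO; with option 2, any string reaches ProcessMessage and validation is entirely your code's job.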
I recently asked a couple of questions around this area: XML vs Objects and XML vs Objects #2. You'll find the answers to those questions interesting.
For our particular problem we've decided on a hybrid approach, with the interface looking something like this:
// Just using fields for simplicity and no attributes shown.
interface WCFDataContract
{
    // Header details
    public int id;
    public int version;
    public DateTime writeDateTime;
    public string xmlBlob;

    // Footer details
    public int anotherBitOfInformation;
    public string andSomeMoreInfo;
    public bool andABooleanJustInCase;
}
The reason we use an xmlBlob is because we own the header and footer schema but not the blob in the middle. Also, we don't really have to process that blob, rather we just pass it to another library (created by another department). The other library returns us more strongly typed data.
Good luck - I know from experience that your option 2 can be quite seductive and can sometimes be hard to argue against without being accused of being overly pure and not pragmatic enough ;)
I hope I understood this right. I think it might make sense to have your gateway app handle all the deserialization and have your internal app expose WCF services that take actual DataContract objects.
This way, your deserialization of the TCP-based XML is more centralized at the gateway, and your internal apps don't need to worry about it, they just need to expose whatever WCF services make sense, and can deal with actual objects.
If you force the internal apps to do the deserialization, you might end up with more maintenance if the format changes or whatever.
So I think I would say option 1 (unless I misunderstood).

ASP.NET Web Service Results, Proxy Classes and Type Conversion

I'm still new to the ASP.NET world, so I could be way off base here, but so far this is to the best of my (limited) knowledge!
Let's say I have a standard business object "Contact" in the Business namespace. I write a Web Service to retrieve a Contact's info from a database and return it. I then write a client application to request said details.
Now, I also create a utility method that takes a "Contact" and does some magic with it, say Utils.BuyContactNewHat(), which of course takes a Contact of type Business.Contact.
I then go back to my client application and want to utilise the BuyContactNewHat method, so I add a reference to my Utils namespace and there it is. However, a problem arises with:
Contact c = MyWebService.GetContact("Rob");
Utils.BuyContactNewHat(c); // << Error Here
The error occurs because the return type of GetContact is MyWebService.Contact and not Business.Contact as expected. I understand why this is: when accessing a web service, you are actually programming against the proxy class generated from the WSDL.
So, is there an "easier" way to deal with this type mismatch? I was considering trying to create a generic converter class that uses reflection to ensure two objects have the same structure and then simply transfers the values across from one to the other.
You are on the right track. To get the data from the proxy object back into one of your own objects, you have to do left-hand-right-hand code. i.e. copy property values. I'll bet you that there is already a generic method out there that uses reflection.
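A minimal reflection-based copier along those lines (a sketch: shallow copy only, matching same-named public properties with assignable types; the two contact types are illustrative):

```csharp
using System.Reflection;

// Example source/target pair: a proxy type and the business type it mirrors.
public class ProxyContact { public string Name { get; set; } public int Age { get; set; } }
public class BusinessContact { public string Name { get; set; } public int Age { get; set; } }

public static class PropertyCopier
{
    // Copies every readable public property of source into a new TTarget
    // that has a writable property with the same name and a compatible type.
    public static TTarget CopyTo<TTarget>(object source) where TTarget : new()
    {
        var target = new TTarget();
        foreach (PropertyInfo sourceProp in source.GetType().GetProperties())
        {
            PropertyInfo targetProp = typeof(TTarget).GetProperty(sourceProp.Name);
            if (targetProp != null && targetProp.CanWrite && sourceProp.CanRead &&
                targetProp.PropertyType.IsAssignableFrom(sourceProp.PropertyType))
            {
                targetProp.SetValue(target, sourceProp.GetValue(source, null), null);
            }
        }
        return target;
    }
}
```

Nested objects and collections would still be copied by reference, so for anything beyond flat DTOs you would need to recurse or map those members by hand.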
Some people will use something other than a web service (.net remoting) if they just want to get a business object across the wire. Or they'll use binary serialization. I'm guessing you are using the web service for a reason, so you'll have to do property copying.
You don't actually have to use the generated class that the WSDL gives you. If you take a look at the code that it generates, it's just making calls into some .NET framework classes to submit SOAP requests. In the past I have copied that code into a normal .cs file and edited it. Although I haven't tried this specifically, I see no reason why you couldn't drop the proxy class definition and use the original class to receive the results of the SOAP call. It must already be doing reflection under the hood, it seems a shame to do it twice.
I would recommend that you look at writing a Schema Importer Extension, which you can use to control proxy code generation. This approach can be used to (gracefully) resolve your problem without kludges (such as copying around objects from one namespace to another, or modifying the proxy generated reference.cs class only to have it replaced the next time you update the web reference).
Here's a (very) good tutorial on the subject:
http://www.microsoft.com/belux/msdn/nl/community/columns/jdruyts/wsproxy.mspx
