Passing XML as a string to webservice causes complete failure - C#

I am hosting a .NET webservice written in C#. In one of my web methods, I'm expecting a few inbound parameters, all of which are strings. The final string is actually a string of XML (I started out using a DataSet for this, but read that this was a big no-no, so went this route instead).
The XML (as a string) is incredibly simple.
<FriendsArray><Friend><FriendSocialID>123456789</FriendSocialID><FriendName>Test Test</FriendName><AvatarImage>https://www.imagelocation.com</AvatarImage><FriendGender></FriendGender></Friend></FriendsArray>
With <FriendsArray> being the root element, and the <Friend> node repeated n times. On the webservice side, I take this string, parse it as XML, and deal with the data.
When the webservice is run locally on the server in Visual Studio, it works perfectly, no issues. When the webservice is run locally and attached to a test calling program that simply emulates passing these parameters, including the FriendsArray, to the webservice, again, right as rain, no issues, works perfectly.
In the real-world scenario of calling the webservice externally (either via emulation, like SoapUI or XMLSpy, or directly from the app that is meant to call it), it bombs the service. No error or even an empty response object is returned; it completely fails and returns nothing.
The second I remove this FriendsArray (it's optional in the codebehind), it works perfectly.
Is there some inherent issue to doing this the way I am trying to? If so, is there an easy alternate way that requires not a huge amount of recoding? Thanks!

Apparently "<" and ">" are reserved in XML. Wasn't aware of that. All I had to do was escape them with &lt; and &gt; in the string, and it works perfectly now.
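The fix above can be illustrated in Python (the concept is the same in any language): when an XML fragment travels as a string inside another XML document (the SOAP envelope), its markup characters must be escaped so the outer parser treats it as text rather than as nested elements.

```python
# Sketch of the escaping fix described above: markup characters in an
# XML-as-string payload must be entity-escaped before embedding.
from xml.sax.saxutils import escape, unescape

friends_xml = ("<FriendsArray><Friend>"
               "<FriendSocialID>123456789</FriendSocialID>"
               "</Friend></FriendsArray>")

# What should actually go over the wire inside the SOAP body:
escaped = escape(friends_xml)
print(escaped)  # &lt;FriendsArray&gt;&lt;Friend&gt;...

# The service unescapes it back before parsing:
assert unescape(escaped) == friends_xml
```

Because the escaped payload contains no raw "<" or ">", the surrounding SOAP envelope stays well-formed no matter what the inner fragment contains.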

Related

Passing variable to .NET dll in ColdFusion

I built a very simple .dll in C# to call from a simple ColdFusion page. Everything works fine if I pass in literal values but as soon as I try and pass in a variable (#rollYear#) I get a message stating it can't find the method anymore.
The coldfusion page sets up my .dll like this:
<cfobject type="dotnet" name="getParcelData"
class="soapDLL.GetSecuredParcelByAPN"
assembly="{path}\soapdll.dll">
I then call it like this:
<cfset output = getParcelData.getData("46546504654","cy","#rollYear#")>
If I use the code above I get an error, "The getData method was not found." If I replace the #rollYear# variable with a literal value (2017, for instance) then it works fine. In my tests I've set the #rollYear# variable via a cfset tag before I call the .dll.
I've been banging my head on this all day. Has anyone had similar experience? The .dll is very simple. It just takes 3 variables and based on those sets up what SOAP service to call to pull back some data. For reasons that are too complicated to explain I can't do the SOAP call from within ColdFusion, it has to go through a .net dll.
Any help would be appreciated, I don't have much hair left. :)
Whenever you work with Java or .NET components, you need to pay extra attention when passing ColdFusion variables/values to those methods. If the data types do not match exactly, you will encounter an error message telling you that the method does not exist or does not match the method signature.
ColdFusion offers javaCast() to explicitly cast to the required data type. Cast your arguments accordingly and it should work out in most cases.
Basic example:
A method that expects an integer will throw an error when you call methodThatExpectsInt(123), because the 123 literal is internally stored as a string (or Double) by ColdFusion. By passing it as methodThatExpectsInt( javaCast("int", 123) ), the data type will be properly cast and will match up.
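A rough sketch of why the bridge reports "method not found" rather than a type error: the .NET side resolves the call by exact signature, so a string "2017" does not match a method whose third parameter is an int. The Python below simulates that exact-type dispatch; the method table and names are invented for illustration.

```python
# Simulated exact-signature method resolution, as the CF-to-.NET
# bridge does it: name AND argument types must both match.
def find_method(methods, name, args):
    """Return the first method whose name and exact parameter types
    match the supplied arguments; otherwise report 'not found'."""
    for m_name, param_types, func in methods:
        if m_name == name and [type(a) for a in args] == list(param_types):
            return func
    raise LookupError(f"The {name} method was not found.")

# Hypothetical registry mirroring getData(string, string, int):
methods = [("getData", (str, str, int),
            lambda apn, kind, year: f"{apn}/{kind}/{year}")]

# An int third argument matches the signature:
find_method(methods, "getData", ["46546504654", "cy", 2017])

# A value arriving as a string does not produce an exact match:
try:
    find_method(methods, "getData", ["46546504654", "cy", "2017"])
except LookupError as e:
    print(e)  # The getData method was not found.
```

This is why javaCast("int", ...) fixes it: the cast makes the runtime type match the declared parameter type before dispatch happens.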

Why is a WCF service that receives too many parameters not breaking?

OK this is a bit of a weird question. I have come across a section of code in a project which programmatically creates a ChannelFactory based on an interface. A method in the interface takes three parameters. Now when I looked at the actual WCF service's code, that method only expects two parameters but is being sent three from the client.
I would expect the service to break when receiving the extra parameter but it doesn't. Does anyone have any idea why this is working?
The service call gets encoded onto the wire and sent to the receiver. Depending on how strict the XML/Binary/Json parsers are, the extra parameter is simply ignored.
When the stub code on the WCF server receives the wire call, it does not go over the serialized packet and says "They are calling MethodX and I got param1, param2 and param3 - let's try stuffing them inside the method... Oh. It only takes param1 and param2. Boom."
Instead, it says something like:
"They called MethodX. Great. What parameters does it take? Param1 and Param2. Let's see if these are present in the packet. Oh! They are here. Sweet. I'll use them." It simply ignores the rest.
Some notes:
It is not difficult to write a stub that will fail if extra data is present. WCF, for example, will fail on some kinds of extra data and not on others.
There is a benefit to this where you can be forward and potentially backward compatible (i.e. Client V2 talking to Server V1 will still work, simply ignoring the parameter [which you may or may not want]).
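The "pull what I need, ignore the rest" deserialization style described above can be sketched in Python with ElementTree; the method and element names here are invented for illustration.

```python
# Sketch of tolerant parameter extraction: the stub asks only for the
# parameters the method declares, so extra elements are never touched.
import xml.etree.ElementTree as ET

body = """<MethodX>
    <param1>hello</param1>
    <param2>42</param2>
    <param3>extra, unexpected</param3>
</MethodX>"""

def method_x(param1, param2):
    # The real method only declares two parameters.
    return f"{param1}:{param2}"

root = ET.fromstring(body)
# Look up only the declared parameters; param3 is simply never asked
# for, so it is silently ignored rather than causing a failure.
args = {name: root.findtext(name) for name in ("param1", "param2")}
print(method_x(**args))  # hello:42
```

A stricter stub would instead iterate over every child of the request element and fail on any name it does not recognize, which is the "Boom" behavior the answer contrasts this with.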
WCF has Forward Compatible Data Contracts.
If a data contract type implements IExtensibleDataObject, then the behavior is to store the extra input data rather than throw an error when it is found.
When the WCF infrastructure encounters data that is not part of the original data contract, the data is stored in the property and preserved. It is not processed in any other way except for temporary storage. If the object is returned back to where it originated, the original (unknown) data is also returned. Therefore, the data has made a round trip to and from the originating endpoint without loss.
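The extension-data pattern quoted above can be sketched generically: fields the contract does not know about are stored on the side and written back out on serialization, so unknown data survives a round trip. This Python uses JSON for brevity and invented field names; WCF does the same thing with XML.

```python
# Sketch of the extension-data round-trip pattern: unknown fields are
# preserved, not processed, and re-emitted on serialization.
import json

KNOWN_FIELDS = {"name", "age"}  # the fields this "contract" declares

def deserialize(payload):
    data = json.loads(payload)
    known = {k: v for k, v in data.items() if k in KNOWN_FIELDS}
    # Everything else is stored untouched, like ExtensionData:
    extension = {k: v for k, v in data.items() if k not in KNOWN_FIELDS}
    return known, extension

def serialize(known, extension):
    # Unknown data is written back out alongside the known fields.
    return json.dumps({**known, **extension})

wire_in = '{"name": "Ann", "age": 30, "futureField": "v2 only"}'
known, ext = deserialize(wire_in)
wire_out = serialize(known, ext)
assert json.loads(wire_out) == json.loads(wire_in)  # lossless round trip
```

This is what makes a V1 server a safe intermediary for V2 clients: it never understands futureField, but it also never drops it.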

Why would C# not create code from a WSDL using Add Service Reference in Visual Studio just because of an xs:choice in the schema?

We created a SOAP server using PHP, and a few of the functions give a varying outcome in terms of the XML elements depending on the arguments passed in.
To explain further, a function takes argument a and, depending on the data received, can return either of two different arrays (complexType) with distinct numbers of child elements.
e.g.
if a = 9, the outcome is an array/struct:
a[delta]=20
a[sigma]=yellow
if a = 3:
a[aTotallyDifferentBallgame]=Omaha
a[t]=1
a[theNumberOfElementsCanVary]=yup
In order to express this possible variance we used choice in the schema, thereby indicating that the outcome can be any single element within the choice, be it simple or complexType.
Now theoretically it sounds logical, and it works fine with PHP's SOAP client, but when we tried to use the Add Service Reference feature of Visual Studio in a form application, it failed to create code for it, citing that the use of xs:choice is not allowed, for reasons unfathomable to us.
Now what I would really like to know is what changes I need to make to my WSDL or SOAP server to make this work. We thought one workaround was fixing the outcome to only one possible scenario and using a completely different function for the other outcome, thereby abstaining from the use of choice, but frankly this seems too redundant and weird.
Is there anything I have missed? Please let me know any ideas you have. Thanks!
The Add Service Reference machinery tries to map the schema to C# classes, and there is no structure in a C# class corresponding to a choice in the schema - a class cannot have a value for either one property or another but not both.
My suggestion would be to replace the choice with a sequence of optional elements; the corresponding C# class will have a property for each of the elements, and only one of them will have a value - the others will be null, because the PHP service returns a value for only one of them at a time.
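The suggested workaround can be sketched in Python with ElementTree: model the choice as a sequence of optional elements and expect exactly one of them to be populated. The element names resultA/resultB are invented for illustration.

```python
# Sketch of "sequence of optional elements" replacing xs:choice:
# both branches are declared, the server only ever fills in one.
import xml.etree.ElementTree as ET

def parse_outcome(xml_text):
    root = ET.fromstring(xml_text)
    # Both optional elements are looked up; a missing one is None,
    # just like a null property on the generated C# class.
    branch_a = root.find("resultA")
    branch_b = root.find("resultB")
    present = [b for b in (branch_a, branch_b) if b is not None]
    assert len(present) == 1, "server should populate exactly one branch"
    return present[0].tag

print(parse_outcome("<response><resultA>data</resultA></response>"))  # resultA
print(parse_outcome("<response><resultB>data</resultB></response>"))  # resultB
```

The schema loses the formal "exactly one" guarantee that xs:choice provides, so the client has to enforce it itself, as the assert above does.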

Web-service able to return IDictionary on Local Machine but not on server

I have an asmx web-service with a method returning a dictionary object. On my local machine, using the test server, it works fine. However, when I publish it to the production server I get an error that it isn't supported because it implements IDictionary. I looked around a bit and it seems that by default this is the case (that it isn't supported), so I guess that is the correct behavior. What I would like to know is why it works on my local machine, and whether there is some setting that I can apply to get it to work on the server.
If it makes a difference the dictionary is using a string as the key and a custom object array for the value.
For now I’m able to get around it by serializing my dictionary as a string (using JavaScriptSerializer.Serialize) and returning that, but ideally I would like to return the dictionary straight (or barring that at least understand what’s going on).
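The workaround described above, sketched in Python for illustration: instead of returning the dictionary type directly, serialize it to a string on the way out and parse it back on the client. The key names and payload here are invented.

```python
# Sketch of the "serialize the dictionary to a string" workaround:
# the web method returns a plain string, which any serializer supports.
import json

def get_data_as_string():
    # String key -> list of custom objects, as in the question:
    data = {"key1": [{"id": 1}, {"id": 2}], "key2": [{"id": 3}]}
    return json.dumps(data)

# Client side: rebuild the dictionary from the returned string.
received = json.loads(get_data_as_string())
assert received["key1"][0]["id"] == 1
```

The cost is that the WSDL now only advertises a string, so clients lose the typed contract and must know the payload format out of band.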

Web services fail with (400) Bad Request because of a special character

Using Visual Studio 2008, I setup a client that uses Web Services.
It has nothing to do with buffer sizes (as that is a typical response, all appropriate sizes were increased).
I was using a List as the parameter of the method.
After much experimentation, I used System.Net.WebClient to manually create a SOAP 1.1 and a SOAP 1.2 request to test the result.
using (var webCtx = new System.Net.WebClient())
{
    webCtx.Headers.Add(System.Net.HttpRequestHeader.ContentType, "text/xml");
    webCtx.Headers.Add("SOAPAction", "http://tempuri.org/HelloWorld");
    var uri = new Uri("http://localhost:12345/MyProject/WebService.asmx");
    MessageBox.Show(webCtx.UploadString(uri, xml));
}
Where xml is a string variable of the XML with actual data.
The problem is when one of the fields has a special character. Here is an example of the message body of the XML.
<PocoClass>
<UnicodeString>123</UnicodeString>
</PocoClass>
If the value of UnicodeString was something simple (i.e. 123), it worked; but once the special character appeared, I got 400 Bad Request.
Now, I have found a Microsoft Knowledge Base article describing the bug, KB911276, which basically states: install a hotfix. That really isn't a viable solution to the problem, so I was wondering if there are any ideas for how to solve it?
Are the only solutions to write some sort of custom encoding/decoding or custom bindings to be implemented by the server and client? Are there any easier ideas?
Thanks
Edited:
The use of a List is not an issue as it is handled by VS. Here is a more complete (but still partial) contents of the soap message.
<HelloWorld xmlns="http://tempuri.org/">
<pocoList>
<PocoClass>
<UnicodeString>123</UnicodeString>
</PocoClass>
</pocoList>
</HelloWorld>
I had a similar problem with a webservice that users of our app used to run online account searches. I found a VS Feedback item about this, but it appears MS considers this behaviour by design.
What we ended up doing was writing a method that replaced all characters that are not allowed when serializing to XML with question marks.
The allowed list was: 0x09, 0x0A, 0x0D, 0x20-0xD7FF, and 0xE000-0xFFFD. Everything else we turned into "?".
I guess if you can't do that, the data should be base64-encoded in transit to avoid this issue. Seems a bit odd to me, as it breaks the ability to losslessly route a call through a web service.
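The character-replacement method described in this answer can be sketched in Python; the allowed ranges are exactly the ones listed above (which are the XML 1.0 Char ranges within the Basic Multilingual Plane).

```python
# Sketch of the sanitizer described above: replace every character
# outside the allowed XML ranges with "?" before serializing.
def sanitize_for_xml(text):
    def allowed(cp):
        return (cp in (0x09, 0x0A, 0x0D)
                or 0x20 <= cp <= 0xD7FF
                or 0xE000 <= cp <= 0xFFFD)
    return "".join(ch if allowed(ord(ch)) else "?" for ch in text)

# 0x0E (Shift-Out) is a control character XML forbids:
print(sanitize_for_xml("abc\x0edef"))  # abc?def
# Tab, newline and carriage return are explicitly allowed:
print(sanitize_for_xml("ok\tline\n"))  # unchanged
```

Note this is lossy, which is exactly the trade-off the answer mentions; base64-encoding the field is the lossless alternative.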
My concern is where you state that you are using a List as the parameter to a method. Web services don't directly support lists; you need to use an array as the parameter. Then, internally, you can convert it to a list using .ToList(). This may not be your issue directly, though!
That KB article was referring to Unicode character sets within a request URL, not actual data being posted to the server.
That being said, I'd verify that your headers are well formed.
Also, if VS generated the client proxy for you, right-click on the service reference and then on "Update Service Reference".
Just noticed one more thing: you said that you are passing in a list as a parameter. From the snippet of XML shown, that doesn't look like something that would be serialized from a list (by any of the .NET XML serializers). If you want to debug using the WebClient approach, try verifying that the format of the XML is valid.
Where are you seeing &#x0E;? In the raw XML or the serialized data? XML is defined to contain only human-readable characters, and &#x0E; is the XML entity for the unreadable ASCII Shift-Out character. If you're looking at the raw XML and it contains the XML entity code, that's allowed, but if you are looking at serialized data and your XML document contains raw control characters, it is invalid and will be rejected.
