Returning an array with PHP SOAP service - c#

When I started this, I knew nothing about SOAP or how it worked. I learned a lot in the first few hours, and the result was an operational test of the SOAP service (which I'm writing in PHP). I was able to create the SOAP client in both PHP and Visual Studio (which is what I'm really aiming for) and return the information I wanted.
Since it was just a test and I wanted to return more (and better-formatted) information, I started messing around with it. I added some types and wrote some schemas that I imported. When I finished adding everything I needed (this was all for one result), I saved it all and tested it. It ran without error, but didn't return anything.
I'm trying to return an array with 8 elements, with differing types. Here are the elements and types of that array:
Title - string
LinkId - int
Date - int
Author - string
Content - string
Id - int
Icon - int
Edited - boolean
The function GetNews will return 25 of those arrays (per page, the input is an integer for which page of results to get).
I cannot figure out how to serialize the array so that SOAP will return it properly.
Here's the URL to my WSDL file:
http://api.infectionist.com/soap.wsdl
Please help me out, I am completely stuck. I can't see anything wrong with the code; it runs without error in both PHP and Visual Studio, it just returns an empty result.

I'm guessing you are trying to return the entire array at once from within PHP? If so, you need to make sure that you return an XML string. Like:
<?php
return '
<data>
<title>Actual title</title>
<linkId>154278</linkId>
...
</data>
';
?>
Have you considered using a JSON string as an alternative to send the data from the SOAP server? This would allow you to have all the elements you need in one element and retain flexibility to add/remove elements later (you would need to inform your clients of this).
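For illustration, here is a sketch (in JavaScript) of what one news item and a page of results could look like as JSON; the field names come from the question, and all values are made-up placeholders:

```javascript
// One news item with the 8 fields and types described in the question.
const newsItem = {
  Title: "Actual title",   // string
  LinkId: 154278,          // int
  Date: 1262304000,        // int (e.g. a Unix timestamp)
  Author: "Some author",   // string
  Content: "Article body", // string
  Id: 1,                   // int
  Icon: 3,                 // int
  Edited: false            // boolean
};

// A page of GetNews results would be an array of up to 25 such items.
const page = [newsItem];
const payload = JSON.stringify(page);

// The client parses it back and the original types survive the round trip.
const decoded = JSON.parse(payload);
console.log(decoded[0].LinkId);
```

The advantage over hand-built XML is that both PHP (`json_encode`/`json_decode`) and .NET have built-in JSON support, so adding or removing fields later doesn't require touching a schema.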

Related

C# asp.net passing an object with a long property to front end changes its value [duplicate]

This question already has answers here:
What is JavaScript's highest integer value that a number can go to without losing precision?
I'm building an application with a react.js front-end (although I'm fairly sure that's not relevant to the issue) and an asp.net back-end (targeting net5.0, not sure if that's relevant). When I call to the back-end API, I get back an object that includes, in part, an ID it generated based on the data passed in, that is of type long in C# (a 64 bit int). The behavior that I'm seeing is that the variable in C# and the variable read from the response on the front end are different. They appear to drift after about 16-17 digits.
Is this expected? Is there any way to get around it?
Code to reproduce / Pictures of what I see:
C#
[HttpPost]
[Route("test")]
public object TestPassingLongInObject()
{
    /* actual logic omitted for brevity */
    var rv = new
    {
        DataReadIn = new { /* omitted */ },
        ValidationResult = new ValidationResult(),
        GeneratedID = long.Parse($"922337203{new Random().Next(int.MaxValue):0000000000}") // close to long.MaxValue
    };
    Console.WriteLine($"ID in C# controller method: {rv.GeneratedID}");
    return rv;
}
Console output: ID in C# controller method: 9223372030653055062
Chrome Dev Tools:
When I try to access the ID on the front-end I get the incorrect ID ending in 000 instead of 062.
Edit 1: It's been suggested that this is because JavaScript's Number.MAX_SAFE_INTEGER is less than the value I'm passing. I don't believe that is the reason, but perhaps I'm wrong and someone can enlighten me. On the JS side I'm using BigInt, precisely because the number I'm passing is too large for Number. The issue appears before I even parse the result into a JS object, though (unless Chrome is doing that parsing automatically, which would explain the screenshot above).
Edit 2: Based on the answers below, it looks like perhaps JS is parsing the value to a number before I'm parsing it to a BigInt, so I'm still losing precision. Can someone more familiar with the web confirm this?
JavaScript is interpreting that value as a Number, which is a 64-bit floating-point value. If you want to keep the full value, you're better off passing it back as a string.
Example JS to demonstrate the problem:
var value = 9223372030653055062;
console.log(value);
console.log(typeof(value));
JavaScript engines cannot represent integers this big exactly. Try opening your console and entering the following:
var myLong = 9223372030653055062;
console.log(myLong);
You can also check that 9223372030653055 > Number.MAX_SAFE_INTEGER, as a pointer that you're going out of bounds.
The maximum safe integer for a Number in JavaScript is 2^53 - 1, which is 9007199254740991. A C# long is a signed 64-bit integer with a maximum of 2^63 - 1, which is far bigger. So "9223372030653055062" fits in a C# long but not in a JavaScript Number, because it is too big. If this ID is stored in a database column as a long, I'd suggest you just pass it to JavaScript as a string.
Although the reason is that the number is bigger than JS can accurately represent, you seem to have been distracted by Chrome's Dev Tools preview. This is only a "helpful" preview and not necessarily the truth.
The dev tools preview shows a helpful view of the response. I presume the AJAX response is transferred with content type of application/json, so Chrome helpfully parses the JSON to a JS object and lets you preview the response. Check the Response tab, this is what will actually be received by your AJAX code.
The chances are that the JSON is being parsed before you have a chance to use BigInt on that number, capping the precision. You haven't shown us your AJAX code, so this is my best guess.
The only solution would be to serialize it as a string then add a manual step to convert the string to BigInt.
jsObject.generatedId = BigInt(jsObject.generatedId);
If you're using it as an ID then you might as well keep it as a string on the client-side as you're not likely to be using it to do any calculations.
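To make the string-plus-BigInt approach concrete, here's a small sketch. The camel-cased property name generatedId is an assumption about how the default ASP.NET serializer would emit GeneratedID; adjust it to your actual payload.

```javascript
// If the server serializes the ID as a JSON string, no precision is lost
// and the client can convert it to BigInt manually.
const json = '{"generatedId":"9223372030653055062"}';

const obj = JSON.parse(json);            // generatedId is still a string here
obj.generatedId = BigInt(obj.generatedId);

console.log(obj.generatedId.toString()); // exact: "9223372030653055062"

// Compare with the lossy path, where JSON.parse turns the bare number
// into a 64-bit float before you ever see it — the last digits are lost.
const lossy = JSON.parse('{"generatedId":9223372030653055062}');
console.log(lossy.generatedId);
```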

WorkFront / AtTask API $$TODAYe+6m flaw?

The Workfront API isn't returning the same results as our web report:
On our web front-end on workfront one of the reports has a date range from $$TODAYbw to $$TODAYe+6m and it returned about ~500 rows.
I tried the same query on the API like so (formatted for easier reading)
/v7.0/RSALLO/search
?fields=DE:project:Probability,allocationDate,scheduledHours,project:name,project:status,roleID,project:status,role:name
&allocationDate_Mod=between
&allocationDate=$$TODAYbw
&allocationDate_Range=$$TODAYe+6m
&AND:0:project:status_Mod=notin
&AND:0:project:status=CPL
&AND:0:project:status=DED
&AND:0:project:status=REJ
&AND:0:project:status=UZF
&AND:0:project:status=IDA
&AND:0:roleID_Mod=in
&AND:0:roleID=55cb58b8001cc9bc1bd9767e080f6c10
&AND:0:roleID=55cb58b8001cc9bd9fc0f8b03a581493
&AND:0:roleID=55cb58b8001cc9bfaa01243cd6024b6d
&AND:0:roleID=55cb58b8001cc9c0afa399dece405efd
&$$LIMIT=1000
which returned barely any results. Notice the &allocationDate_Range=$$TODAYe+6m line. If I change it to read =$$TODAY+6m without the end of day modifier the API returns ~500 rows.
I went through every filter criteria and it's only the allocationDate range that is going wrong. I found this resource for the date modifiers and in it there is no e+6m example, yet it works on our web front-end report.
Is the API flawed or is the web report doing something extra in the background?
I don't have an exact solution for your problem, but I can confirm that the API does have some difficulty parsing wildcards like you're trying to use and they don't always come up the way we expect. Furthermore, the API doesn't parse things the same way as text mode reporting, so a query that looks great in the latter might return something different in the former.
If I may propose a different solution, since you're already coding this up outside of Workfront then I suggest you simply perform the date calculations on your own and pass explicit datetime objects to Workfront instead of allowing it to use its own logic. I know this doesn't answer the question of "what is a query that will return exactly what I want" but it should give you the correct end result.
For what it's worth, I spent about 15 minutes trying to get an example working on my end and I gave up after it kept returning values which should have been outside of my own date range.
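If you do compute the dates yourself, a sketch like the following shows the idea; the ISO output format and the exact end-of-day semantics are assumptions, so check what your Workfront endpoint actually accepts:

```javascript
// Hand-roll the "$$TODAYe+6m" logic: add N months, then snap to end of day.
// Note: Date.setMonth can overflow at month ends (e.g. Aug 31 + 6 months),
// which may or may not match Workfront's own wildcard arithmetic.
function endOfDayPlusMonths(months, from = new Date()) {
  const d = new Date(from);
  d.setMonth(d.getMonth() + months);
  d.setHours(23, 59, 59, 999); // the "e" (end-of-day) modifier, done by hand
  return d;
}

const rangeEnd = endOfDayPlusMonths(6);
console.log(rangeEnd.toISOString());
```

Passing an explicit datetime like this in place of the wildcard removes the API's parsing behavior from the equation entirely.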

The type string is expected but a type integer was received with value 0

I am working with the Marketing APIs. I want to hit this API to get an estimate of the target audience, basically how many users will be able to see my ad.
"https://graph.facebook.com/v2.2/{adaccountid}/reachestimate"
I get a response until I pass the "flexible_spec" parameter. Whenever I pass it, I get this response: "The type string is expected but a type integer was received with value 0". I searched for it, but no solution was there.
I will show you what I have tried so far:
1) https://graph.facebook.com/v2.2/{adaccountid}/reachestimate?targeting_spec={"geo_locations":{"countries":["IN"]},"flexible_spec":{"interests":["6006289279425"]}}
2) https://graph.facebook.com/v2.2/{adaccountid}/reachestimate?targeting_spec={"geo_locations":{"countries":["IN"]},"flexible_spec":{"interests":["Movies"]}}
3) https://graph.facebook.com/v2.2/{adaccountid}/reachestimate?targeting_spec={"geo_locations":{"countries":["IN"]},"flexible_spec":{"interests":["id":"6006289279425","name":"Movies"]}}
I have tried sending an array of ids in flexible_spec interests, an array of names in flexible_spec interests, and I have also tried sending a list<interests> in it. But no luck.
I searched a lot about this error and then realized the mistake I was making in this API call. The interests field can be specified in two ways:
1) targeting_spec={"geo_locations":{"countries":["IN"]},"flexible_spec":[{"interests":["6006289279425"]}]}
2) targeting_spec={"geo_locations":{"countries": ["IN"]},"interests":["6006289279425","46345343534"]}
The first passes the interests array inside a flexible_spec array; the second passes interests as a top-level array, outside flexible_spec.
This helped me, and I am posting this answer because it might help you in the same situation. Thanks.
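As a sketch, the two working shapes can be built as plain objects and URL-encoded before the request (the interest IDs are the placeholder values from above, not real ones):

```javascript
// Shape 1: interests nested inside a flexible_spec array.
const withFlexibleSpec = {
  geo_locations: { countries: ["IN"] },
  flexible_spec: [{ interests: ["6006289279425"] }]
};

// Shape 2: interests as a top-level array, no flexible_spec at all.
const withTopLevelInterests = {
  geo_locations: { countries: ["IN"] },
  interests: ["6006289279425", "46345343534"]
};

// Building the JSON with a real serializer and encodeURIComponent avoids
// the bracket mismatches that hand-typed query strings invite.
const qs = "targeting_spec=" +
  encodeURIComponent(JSON.stringify(withFlexibleSpec));
console.log(qs);
```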

Json Converter not recognising null attributes and throwing exceptions

Say I have a sample JSON-format string such as
string per1 = @"[{""Email"":""AAA"",""mj_campaign_id"":""22"",""mj_contact_id"":""PPP"",""customcampaign"":""AAA"",""blocked"":""22"",""hard_bounce"":""PPP"",""blocked"":""22"",""hard_bounce"":""PPP""},"
    + @"{""Email"":""BBB"",""mj_campaign_id"":""25"",""mj_contact_id"":""QQQ"",""customcampaign"":""AAA"",""blocked"":""22"",""hard_bounce"":""PPP"",""blocked"":""22""},"
    + @"{""Email"":""CCC"",""mj_campaign_id"":""38"",""mj_contact_id"":""RRR"",""customcampaign"":""AAA"",""blocked"":""22"",""hard_bounce"":""PPP""}]";
I am trying to deserialize it using
var result = JsonConvert.DeserializeObject(per1);
It's working fine as long as all the rows of the string have values for the following attributes: Email, mj_campaign_id, mj_contact_id, customcampaign, blocked, hard_bounce, error_related_to, error. But when I skip some attribute values in some rows, it throws an error saying
Can not add Newtonsoft.Json.Linq.JValue to Newtonsoft.Json.Linq.JObject.
Any help would be appreciated. Thanks
Your error occurs because you are not assigning a value to an attribute, which you need to do. If you remove the value, at least leave an empty string.
THAT SAID!
Herein lies the danger of manually building JSON strings: you should always avoid it if you can. If you are reading from a web page, that page should serialize the payload for you, and then you deserialize it with whatever you are using to pull in the payload (controller, RESTful service, etc.). The beauty of .NET is that it handles all of this plumbing for you, and you really are going to run into painful issues if you try to reinvent the .NET wheel.
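To illustrate the same principle in JavaScript (the field names are the placeholders from the question): properly serialized JSON can simply omit an optional key rather than leave a dangling value, and the consumer applies defaults for anything missing.

```javascript
// The second row omits mj_campaign_id entirely — still valid JSON,
// because a serializer never emits a key without a value.
const rows = JSON.parse(
  '[{"Email":"AAA","mj_campaign_id":"22"},' +
  ' {"Email":"BBB"}]'
);

// Apply a default for the missing attribute; present values win the spread.
const normalized = rows.map(r => ({ mj_campaign_id: "", ...r }));
console.log(normalized[1].mj_campaign_id); // ""
```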

Detecting The length of the string exceeds the value set on the maxJsonLength property at the server

First let me start by saying I know what this error means.
I know about <jsonSerialization maxJsonLength="50000000"/>
And I understand the (in my opinion, very good) reasons for having a limit.
So I am calling a web service from JavaScript, using the .NET ScriptManager to generate all the client code for me.
The web service gets a bunch of data from SQL, turns it into an array of objects, and then returns this array.
Under the hood the .NET engine turns this array into a JSON array for sending to the client.
Sometimes, however, the resulting JSON string is too long and it throws an exception.
I would like some generic way of detecting this error and handling it gracefully (e.g. by sending some of the data with an indicator that another subsequent request should be made to get the rest)
What I do not want to do is set a sky high limit on the length of the json string as this could result in it taking a very long time before the client gets all the data. I believe the upper limit on this setting to be 2147483644 i.e. 2GB.
Chances are the client has got bored and wandered off by the time all that data arrives, just in time to crash the JavaScript engine when it tries to parse it.
So I guess what I am really after is a way to "know" how long the resulting JSON string will be before returning it, so that it can be truncated as appropriate, or some way to handle the resulting error, shorten the array, and try again (and keep shortening and retrying until it works or the array is empty).
I have tried putting a try/catch around the return statement but this didn't work - I suspect because the webservice method is successfully returning the data to the framework but the framework then chokes on the data.
I would prefer a solution that doesn't involve returning a string, but if there is no other way I can put up with that.
[WebMethod]
[System.Web.Script.Services.ScriptMethod(ResponseFormat = System.Web.Script.Services.ResponseFormat.Json)]
public MyClass[] GetMyClasses()
{
    List<MyClass> lst;
    // code that populates lst with enough instances to break the JSON serializer
    return lst.ToArray();
}
Instead of trying to figure out the resulting JSON length and then applying some sort of paging, maybe you should always page your data?
You already know that the method can return huge amounts of data, so why not change it to GetMyClasses(int skip, int take) to enforce chunks of data of a reasonable size.
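On the client side, the paging idea might look like this sketch; fetchPage stands in for the ScriptManager-generated proxy call to GetMyClasses(skip, take) and is an assumption, as is the page size:

```javascript
// Pull all items in fixed-size chunks; a short page signals the end,
// so no request can ever exceed the serializer's length limit.
async function getAllMyClasses(fetchPage, pageSize = 100) {
  const all = [];
  let skip = 0;
  while (true) {
    const page = await fetchPage(skip, pageSize);
    all.push(...page);
    if (page.length < pageSize) break; // last (possibly empty) chunk
    skip += pageSize;
  }
  return all;
}
```

A fixed take also keeps each response small enough that the browser parses it quickly, instead of receiving one multi-megabyte string at the end.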
