C# Deserialize JSON Object [duplicate] - c#

This question already has answers here: How to auto-generate a C# class file from a JSON string [closed] (3 answers). Closed 3 years ago.
I am attempting to deserialize a JSON object returned from a web API; the raw JSON string is shown at the end of this question.
Here is the class structure the object is to be deserialized into:
public class CandidateJson
{
public string response { get; set; }
//public string result { get; set; }
public Result result { get; set; }
public string Candidates { get; set; }
public List<row> rows { get; set; }
}
public class Result
{
public string result { get; set; }
public string uri { get; set; }
}
public class row
{
public string no { get; set; }
public List<FL> FL { get; set; }
}
public class FL
{
public string val { get; set; }
public string content { get; set; }
}
I am using the following line of code to deserialize it, with no success:
var json = JsonConvert.DeserializeObject<CandidateJson>(JsonResult);
Upon executing this line of code, I get the following error:
Unexpected character encountered while parsing value: {. Path 'response', line 1, position 13.
I would appreciate any assistance with this issue.
Please let me know if any additional information is needed.
Here is the raw JSON string:
{"response":{"result":{"Candidates":{"row":[
{"no":"1","FL":[{"val":"CANDIDATEID","content":"508304000012555617"},{"val":"Candidate ID","content":"ZR_129661_CAND"},{"val":"First Name","content":"PRODUCTION"},{"val":"Last Name","content":"TEST"},{"val":"Email","content":"patricia.conley#ampcorporate.com"},{"val":"Phone","content":"815-543-2109"},{"val":"Mobile","content":"815-555-5555"},{"val":"Street","content":"555 Test Ave"},{"val":"City","content":"DeKalb"},{"val":"State","content":"IL"},{"val":"Zip Code","content":"60115"},{"val":"SMCREATORID","content":"508304000000490014"},{"val":"Created By","content":"AMP Support IT Team"},{"val":"MODIFIEDBY","content":"508304000000227003"},{"val":"Modified By","content":"Nikki Bowman"},{"val":"Created Time","content":"2019-12-17 08:38:25"},{"val":"Updated On","content":"2019-12-20 15:23:10"},{"val":"Last Activity Time","content":"2019-12-20 15:23:10"},{"val":"SMOWNERID","content":"508304000000490014"},{"val":"Candidate Owner","content":"AMP Support IT Team"},{"val":"Source","content":"Non-Employee Referral"},{"val":"Email Opt Out","content":"false"},{"val":"Is Locked","content":"false"},{"val":"Is Unqualified","content":"false"},{"val":"Is Attachment Present","content":"false"},{"val":"Candidate Status","content":"Sales Training Scheduled"},{"val":"Career Page Invite Status","content":"0"},{"val":"Extension","content":"5555"},{"val":"Sales Training Date_ID","content":"508304000011808848"},{"val":"Sales Training Date","content":"2019-12-11 Digital Sales Training"},{"val":"Start Date","content":"2019-12-17"},{"val":"Candidate Job Category","content":"Print + Digital Outside"},{"val":"District Sales Manager","content":"Luke Wasowski"},{"val":"College Graduate","content":"false"},{"val":"Recruiter Initials","content":"NKB"},{"val":"Unit/Apt/Ste","content":"Apt 5"},{"val":"Hourly Rate","content":"5.00"},{"val":"Work State","content":"Illinois"},{"val":"Full Time/Part Time","content":"FTR"},{"val":"Work Email Address","content":"Nikki.Bowman#ampcorporate.com"},{"val":"EEO Class","content":"1.1"}]},
{"no":"2","FL":[{"val":"CANDIDATEID","content":"508304000011834365"},{"val":"Candidate ID","content":"ZR_125018_CAND"},{"val":"First Name","content":"Jennifer"},{"val":"Last Name","content":"Pedersen"},{"val":"Email","content":"jennyped248_hwo#indeedemail.com"},{"val":"Mobile","content":"+18157517187"},{"val":"City","content":"Genoa"},{"val":"State","content":"IL"},{"val":"Zip Code","content":"60135"},{"val":"Country","content":"United States"},{"val":"Experience in Years","content":"8"},{"val":"Current Employer","content":"WALMART"},{"val":"Current Job Title","content":"MOD TEAM MEMBER"},{"val":"Skill Set","content":"quick and exceptional customer experience, Helping and Advising Customers, Basic Word Processing, Communication Skills, Customer Service, Data Entry, Hard-Working, Intermediate Word Processing, Organisational Skills, Teamwork, Time Management, outstanding communication skills, Microsoft Word, Microsoft Excel, Microsoft Excel 2000, Microsoft Office, Microsoft Outlook, Microsoft PowerPoint, basic scheduling"},{"val":"SMCREATORID","content":"508304000000562001"},{"val":"Created By","content":"Matt Chenoweth"},{"val":"MODIFIEDBY","content":"508304000008810064"},{"val":"Modified By","content":"HR Department"},{"val":"Created Time","content":"2019-12-02 12:25:53"},{"val":"Updated On","content":"2019-12-12 09:04:51"},{"val":"Last Activity Time","content":"2019-12-12 09:04:51"},{"val":"SMOWNERID","content":"508304000000562001"},{"val":"Candidate Owner","content":"Matt Chenoweth"},{"val":"Source","content":"Indeed Resume"},{"val":"Email Opt Out","content":"false"},{"val":"Is Locked","content":"false"},{"val":"Is Unqualified","content":"false"},{"val":"Is Attachment Present","content":"true"},{"val":"Candidate Status","content":"Hired - AMP Office"},{"val":"Career Page Invite Status","content":"0"},{"val":"Source By","content":"Applied by Candidate"},{"val":"EMPID","content":"JFP147"},{"val":"Candidate Job Category","content":"Office - Digital Verification"},{"val":"College Graduate","content":"false"}]
}]}}
,"uri":"/recruit/private/json/Candidates/searchRecords"}}

I haven't tested it, but by the looks of it, your code should look like:
public class CandidateJson
{
public Response response { get; set; }
}
public class Response
{
public Result result { get; set; }
public string uri { get; set; }
}
public class Result
{
public Candidate Candidates { get; set; }
}
public class Candidate
{
public List<Row> row { get; set; }
}
public class Row
{
public string no { get; set; }
public List<FL> FL { get; set; }
}
public class FL
{
public string val { get; set; }
public string content { get; set; }
}
Note: You might want to use int or decimal instead of string for val and no, but there is not enough information for me to assert that.
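For completeness, here is a minimal, untested sketch of consuming the corrected classes, assuming JsonResult holds the raw JSON string shown in the question (requires Newtonsoft.Json):
var candidate = JsonConvert.DeserializeObject<CandidateJson>(JsonResult);
// Walk the nested structure down to the FL name/value pairs.
foreach (var row in candidate.response.result.Candidates.row)
{
    Console.WriteLine($"Row {row.no}:");
    foreach (var field in row.FL)
    {
        Console.WriteLine($"  {field.val} = {field.content}");
    }
}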

Related

.NET Core API REST C# List into List is null

I'm developing an API in .NET Core.
I've written a POST action that receives an object containing several parameters and a list nested inside another list.
When I debug the code, the action is called correctly, but I find that the second (nested) list always arrives null.
The rest of the data arrives correctly. I have done different tests with other objects and everything works correctly.
It is only this case, a list nested inside another list, where the inner list arrives null.
My code:
Example request input:
{
"Name": "TestName",
"Related1":
[{
"id1": "TestNameRelated1",
"Related2":
[{
"id2": "TestNameRelated2"
}]
}]
}
[HttpPost]
public resultExample Test([FromBody]TestClass test)
{
//do something
}
[DataContract]
public class TestClass
{
[DataMember]
public string Name { get; set; }
[DataMember]
public List<TestClassArray> Related1 { get; set; }
}
[DataContract]
public class TestClassArray
{
[DataMember]
public string id1 { get; set; }
[DataMember]
public List<TestClassArray2> Related2 { get; set; }
}
[DataContract]
public class TestClassArray2
{
[DataMember]
public string id2 { get; set; }
}
This API was previously built on .NET Framework 4.8, and this case worked correctly there.
Now I'm porting the API to .NET 5.
Could it be that .NET 5 does not allow passing lists nested inside other lists?
Do you have to enable some kind of configuration to be able to do this now?
You need to use a class/DTO with a constructor as shown below and you should be good to go. I have uploaded this sample API app's code, working with .NET 5.0, to my GitHub here.
public class TestClass
{
public TestClass()
{
Related1 = new List<TestClassArray>();
}
public string Name { get; set; }
public List<TestClassArray> Related1 { get; set; }
}
public class TestClassArray
{
public TestClassArray()
{
Related2 = new List<TestClassArray2>();
}
public string id1 { get; set; }
public List<TestClassArray2> Related2 { get; set; }
}
public class TestClassArray2
{
public string id2 { get; set; }
}
public class ResultExample
{
public string StatusCode { get; set; }
public string Message { get; set; }
}
Controller Post Method
[HttpPost]
[ProducesResponseType(typeof(ResultExample), 200)]
public ResultExample Post([FromBody] TestClass test)
{
ResultExample testResult = new ResultExample();
TestClass test2 = new TestClass();
TestClassArray testClassArray = new TestClassArray();
TestClassArray2 testClassArray2 = new TestClassArray2();
test2.Name = test.Name;
foreach (var item in test.Related1)
{
foreach (var item2 in item.Related2)
{
testClassArray2.id2 = item2.id2;
}
testClassArray.Related2.Add(testClassArray2);
}
test2.Related1.Add(testClassArray);
Console.WriteLine(test2);
testResult.Message = "New Result added successfullly....";
testResult.StatusCode = "201";
return testResult;
}
(Screenshots in the original answer show the Swagger input sample payload and the POST controller result for that payload; you can change it to return the default 201 response code as well.)
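As a quick way to exercise the endpoint outside Swagger, here is a hypothetical smoke test (not part of the original answer; the host, port, and route are assumptions to adjust for your launch profile) that POSTs the sample payload with HttpClient:
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
class PostSmokeTest
{
    static async Task Main()
    {
        // Same nested payload as the question's example request input.
        const string payload = @"{""Name"":""TestName"",""Related1"":[{""id1"":""TestNameRelated1"",""Related2"":[{""id2"":""TestNameRelated2""}]}]}";
        using var client = new HttpClient();
        var content = new StringContent(payload, Encoding.UTF8, "application/json");
        var response = await client.PostAsync("https://localhost:5001/Test", content);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}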
I had a similar issue: the API method showed the list as null.
In my case a date field was not well formatted, so I used SimpleDateFormat in Android Studio with a correct datetime format:
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss", Locale.US);
item.setDate(dateFormat.format(calendar.getTime()));
and it works fine.

How to index multiple blobs under a main record in Azure Search?

I followed the steps described in this tutorial. My case is a little bit different:
Instead of indexing Hotels and Rooms, I am indexing Candidates and Resumes.
Instead of using CosmosDB I am using an Azure SQL Database.
Following the tutorial, I am able to create the Index, the 2 Indexers (one for the SQL DB and one for the Blobs storage), and the 2 data sources.
The SQL DB contains all my candidates, and the storage contains all their resumes (files with PDF/DOC/DOCX formats). Each blob has a metadata "ResumeCandidateId" that contains the same value as the "CandidateId" for the Candidate.
I have the following fields for my Index:
[SerializePropertyNamesAsCamelCase]
public partial class Candidate
{
[Key]
[IsFilterable, IsRetrievable(true), IsSearchable]
public string CandidateId { get; set; }
[IsFilterable, IsRetrievable(true), IsSearchable, IsSortable]
public string LastName { get; set; }
[IsFilterable, IsRetrievable(true), IsSearchable, IsSortable]
public string FirstName { get; set; }
[IsFilterable, IsRetrievable(true), IsSearchable, IsSortable]
public string Notes { get; set; }
public ResumeBlob[] ResumeBlobs { get; set; }
}
[SerializePropertyNamesAsCamelCase]
public class ResumeBlob
{
[IsRetrievable(true), IsSearchable]
[Analyzer(AnalyzerName.AsString.StandardLucene)]
public string content { get; set; }
[IsRetrievable(true)]
public string metadata_storage_content_type { get; set; }
public long metadata_storage_size { get; set; }
public DateTime metadata_storage_last_modified { get; set; }
public string metadata_storage_name { get; set; }
[Key]
[IsRetrievable(true)]
public string metadata_storage_path { get; set; }
[IsRetrievable(true)]
public string metadata_content_type { get; set; }
public string metadata_author { get; set; }
public DateTime metadata_creation_date { get; set; }
public DateTime metadata_last_modified { get; set; }
public string ResumeCandidateId { get; set; }
}
As you can see, one Candidate can have multiple Resumes. The challenge is to populate the ResumeBlobs property...
The data from the SQL DB is indexed and mapped correctly by the Indexer. When I run the Blobs Indexer, it loads documents, however it does not map them and they never show up in the search (ResumeBlobs is always empty). Here is the code used to create the Blobs Indexer:
var blobDataSource = DataSource.AzureBlobStorage(
name: "azure-blob-test02",
storageConnectionString: "DefaultEndpointsProtocol=https;AccountName=yyy;AccountKey=xxx;EndpointSuffix=core.windows.net",
containerName: "2019");
await searchService.DataSources.CreateOrUpdateAsync(blobDataSource);
List<FieldMapping> map = new List<FieldMapping> {
new FieldMapping("ResumeCandidateId", "CandidateId")
};
Indexer blobIndexer = new Indexer(
name: "hotel-rooms-blobs-indexer",
dataSourceName: blobDataSource.Name,
targetIndexName: indexName,
fieldMappings: map,
//parameters: new IndexingParameters().SetBlobExtractionMode(BlobExtractionMode.ContentAndMetadata).IndexFileNameExtensions(".DOC", ".DOCX", ".PDF", ".HTML", ".HTM"),
schedule: new IndexingSchedule(TimeSpan.FromDays(1)));
bool exists = await searchService.Indexers.ExistsAsync(blobIndexer.Name);
if (exists)
{
await searchService.Indexers.ResetAsync(blobIndexer.Name);
}
await searchService.Indexers.CreateOrUpdateAsync(blobIndexer);
try
{
await searchService.Indexers.RunAsync(blobIndexer.Name);
}
catch (CloudException e) when (e.Response.StatusCode == (HttpStatusCode)429)
{
Console.WriteLine("Failed to run indexer: {0}", e.Response.Content);
}
I commented out the parameters argument for the blobIndexer, but I get the same results even when it is not commented out.
When I run a search, here is an example of what I get:
{
"#odata.context": "https://yyy.search.windows.net/indexes('index-test01')/$metadata#docs(*)",
"value": [
{
"#search.score": 1.2127206,
"candidateId": "363933d1-7e81-4ed2-b82e-d7496d98db50",
"lastName": "LAMLAST",
"firstName": "ZFIRST",
"notes": "MGA ; SQL ; T-SQL",
"resumeBlobs": []
}
]
}
"resumeBlobs" is empty. Any idea how to do such a mapping?
AFAIK, Azure Search doesn't support a collection merge feature that seems to be necessary to implement your scenario.
An alternative approach to this is to create a separate index for resumes and point the resume indexer to that index. That means that some of your search scenarios will have to hit two indexes, but it's a path forward.
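A rough, untested sketch of that alternative, reusing the SDK calls already shown in the question (the index and indexer names here are assumptions):
// Dedicated index for resumes, built from the existing ResumeBlob class.
var resumeIndex = new Index
{
    Name = "resume-index",
    Fields = FieldBuilder.BuildForType<ResumeBlob>()
};
await searchService.Indexes.CreateOrUpdateAsync(resumeIndex);
// Point the blob indexer at the resume index instead of the candidate index.
Indexer resumeIndexer = new Indexer(
    name: "resume-blobs-indexer",
    dataSourceName: blobDataSource.Name,
    targetIndexName: resumeIndex.Name,
    schedule: new IndexingSchedule(TimeSpan.FromDays(1)));
await searchService.Indexers.CreateOrUpdateAsync(resumeIndexer);
await searchService.Indexers.RunAsync(resumeIndexer.Name);
Results from the resume index still carry ResumeCandidateId, so the client can join them back to candidates returned from the SQL-backed index.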

Azure Search .NET SDK Custom Analyzer

Without too much background, here's my issue:
To create a new Azure Search index using the .NET SDK in C# (using the Hotel example provided in the documentation) my code looks something like this:
public class Hotel
{
[System.ComponentModel.DataAnnotations.Key]
[IsFilterable]
public string HotelId { get; set; }
[IsFilterable, IsSortable, IsFacetable]
public double? BaseRate { get; set; }
[IsSearchable]
public string Description { get; set; }
[IsSearchable]
[Analyzer(AnalyzerName.AsString.FrLucene)]
[JsonProperty("description_fr")]
public string DescriptionFr { get; set; }
[IsSearchable, IsFilterable, IsSortable]
public string HotelName { get; set; }
[IsSearchable, IsFilterable, IsSortable, IsFacetable]
public string Category { get; set; }
[IsSearchable, IsFilterable, IsFacetable]
public string[] Tags { get; set; }
[IsFilterable, IsFacetable]
public bool? ParkingIncluded { get; set; }
[IsFilterable, IsFacetable]
public bool? SmokingAllowed { get; set; }
[IsFilterable, IsSortable, IsFacetable]
public DateTimeOffset? LastRenovationDate { get; set; }
[IsFilterable, IsSortable, IsFacetable]
public int? Rating { get; set; }
[IsFilterable, IsSortable]
public GeographyPoint Location { get; set; }
}
private static void CreateHotelsIndex(ISearchServiceClient serviceClient)
{
var definition = new Index
{
Name = "hotels",
Fields = FieldBuilder.BuildForType<Hotel>()
};
serviceClient.Indexes.Create(definition);
}
This works fine.
The issue comes with searching using the .NET SDK. Prefix searching works fine
var results = indexClient.Documents.Search<Hotel>("cheap*");
will return all documents with strings that start with "cheap" but I need a string.Contains() kind of functionality, or at the very least, suffix searching. I'm trying to do something like
var results = indexClient.Documents.Search<Hotel>("*heap*");
to get all results containing the string "heap" in any position.
I know there are ways to do this with custom analyzers, but these analyzers can only be created and applied through the Azure Search REST API, and only at index-creation time. This makes nearly all of what I provided above unusable, as I'd have to define my "Hotels" index, fields, and analyzers in JSON through Postman, and the SDK would really only be useful for querying. It also means that I would need to define the same custom analyzer repeatedly in every index I create, since Azure Search does not appear to support global analyzer definitions.
So the question here is: Is there a way to define a custom analyzer in C# that I can reference and apply to my indexes on creation? Or, really, is there an easy way to get full wildcard support using only the .NET SDK?
You can do something like this:
private static void CreateHotelsIndex(ISearchServiceClient serviceClient)
{
var definition = new Index
{
Name = "hotels",
Fields = FieldBuilder.BuildForType<Hotel>(),
Analyzers = new[]
{
new CustomAnalyzer
{
Name = "my_analyzer",
Tokenizer = TokenizerName.Standard,
TokenFilters = new[]
{
TokenFilterName.Lowercase,
TokenFilterName.AsciiFolding,
TokenFilterName.Phonetic
}
}
}
};
serviceClient.Indexes.Create(definition);
}
... and then reference the custom analyzer in the document definition:
[IsSearchable, IsFilterable, IsSortable, Analyzer("my_analyzer")]
public string HotelName { get; set; }
See Custom analyzers in Azure Search blog post and examples from API unit tests CustomAnalyzerTests for more information.

Web Service Exception: "The formatter threw an exception while trying to deserialise the message"

Got a question. I get this error and I know it is due to the fact that Int32 has an upper limit of 2147483647. But I don't know why I am getting this error when the value in question (a telephone number of 11 digits) is defined as a string in our SQL database, a string in our web service, and a string in our web application.
I assume it is something to do with the way the service serialises and deserialises data over a connection, but I wanted to know if there is a way to force Number to be treated as a string instead of being parsed when deserialisation happens, or even to get it to parse as Int64.
Here is the error exception. I removed the namespace and service name. It is the property Number that is causing the problem.
There was an error deserializing the object of type ".ClientPhone[]". The value '07721545554' cannot be parsed as the type 'Int32'.
And here is the code for the service and the service interface.
[DataContract]
public class ClientPhone
{
[DataMember]
public int? ClientNumberID { get; set; }
[DataMember]
public int? RefID { get; set; }
[DataMember]
public string Number { get; set; }
[DataMember]
public string NumberType { get; set; }
[DataMember]
public bool? PrimaryNumber { get; set; }
}
public partial class ClientNumberEntity
{
public int ClientNumbersID { get; set; }
public Nullable<int> RefID { get; set; }
public string ClientNumberType { get; set; }
public string ClientNumber { get; set; }
public Nullable<bool> PrimaryNumber { get; set; }
public virtual ClientDataEntity ClientData { get; set; }
}
public List<ClientPhone> GetClientsPhoneByReference(int _reference)
{
OurDatabaseEntities context = new OurDatabaseEntities();
var phoneEntity = (from c in context.ClientNumberEntities
where c.RefID == _reference
select c).ToList();
if (phoneEntity != null)
{
return TranslateClientPhoneEntityToPhoneNumberList(phoneEntity);
}
else
throw new Exception("Unable to get phone data");
}
private List<ClientPhone> TranslateClientPhoneEntityToPhoneNumberList(List<ClientNumberEntity> numberEntities)
{
List<ClientPhone> phoneList = new List<ClientPhone>();
foreach (ClientNumberEntity numberEntity in numberEntities)
{
ClientPhone phoneListMember = new ClientPhone();
phoneListMember.ClientNumberID = numberEntity.ClientNumbersID;
phoneListMember.RefID = numberEntity.RefID;
phoneListMember.Number = numberEntity.ClientNumber;
phoneListMember.NumberType = numberEntity.ClientNumberType;
phoneListMember.PrimaryNumber = numberEntity.PrimaryNumber;
phoneList.Add(phoneListMember);
}
return phoneList;
}
Any advice on a solution would be greatly appreciated! Thanks :)
Got a solution, albeit it's more stupidity on my end.
I didn't realise that my .EDMX entity diagram hadn't been updated with the new values from the database (I had to manually delete the entity and re-add it to force changes).
After re-compiling and updating the service reference, it worked.

reading JSON data remotely

UPDATE 1
I tried to implement this, but when I hover over my topic I see that TopicId and TopicName are null, even though I can see the data in my myJSON string.
What else do I have to do? What am I missing?
Topic topic = new Topic();
MemoryStream stream1 = new MemoryStream(Encoding.Unicode.GetBytes(myJSON));
//stream1.Position = 0;
DataContractJsonSerializer serialize = new DataContractJsonSerializer(typeof(Topic));
//topic = (Topic)serialize.ReadObject(stream1);
Topic p2 = (Topic)serialize.ReadObject(stream1);
stream1.Close(); //later i will use in `using statement`
stream1.Dispose();
PS: I only have the Topic class. Is that enough, or do I have to create all the classes that jcolebrand showed below?
I have created a class called Topic, and in it I have two properties:
[DataContract]
public class Topic
{
[DataMember]
public string TopicId { get; set; }
[DataMember]
public string TopicName { get; set; }
}
UPDATE 1 END
I am working on a requirement that returns JSON data, and I need a way to parse the data and load it into a dropdown list; I'm looking for the element in the JSON called TopicName.
After the TopicName is extracted, I will load that data into an ASP.NET DropDownList control
(not using jQuery or JavaScript).
Here is the JSON data:
[{"NS":{"Count":1},
"Source":{"Acronym":"ABC","Name":"Name"},
"Item":[{"Id":"12312",
"Url":"http://sitename",
"ContentItem":[{"NS":{"Count":1},
"SourceUrl":"sitename",
"ContentType":"text/xml",
"PersistentUrl":"sitename",
"Title":"MY TITLE",
"SelectionSpec":{"ClassList":"","ElementList":"","XPath":null},
"Language":{"Value":"eng","Scheme":"ISO 639-2"},
"Source":{"Acronym":"ABC","Name":"Name","Id":null},
"Topics":[{"Scheme":"ABC",
"Topic":[{"TopicId":"6544","TopicName":"TOPIC NAME1"},
{"TopicId":"63453","TopicName":"TOPIC NAME2"},
{"TopicId":"2343","TopicName":"TOPIC NAME3"},
{"TopicId":"2342","TopicName":"TOPIC NAME4"}]
}],
"ContentBody":null
}]
}]
},
[{"NS":{"Count":1},"Source":{"Acronym":"ABC1","Name":"Name1"},"Item":[{"Id":"123121","Url":"http://sitename1","ContentItem":[{"NS":{"Count":1},"SourceUrl":"sitename","ContentType":"text/xml","PersistentUrl":"sitename1","Title":"MY TITLE1","SelectionSpec":{"ClassList":"","ElementList":"","XPath":null},"Language":{"Value":"eng","Scheme":"ISO 639-2"},
"Source":{"Acronym":"ABC1","Name":"Name1","Id":null},"Topics":[{"Scheme":"ABC1","Topic":[{"TopicId":"65441","TopicName":"TOPIC NAME11"},{"TopicId":"634531","TopicName":"TOPIC NAME21"},{"TopicId":"23431","TopicName":"TOPIC NAME31"},{"TopicId":"23421","TopicName":"TOPIC NAME41"}]}],"ContentBody":null}]}]},
Assuming the re-indenting applied above is correct, you have the following classes (apparently):
public class OuterWrapper {
public NS NS { get; set; }
public Source Source { get; set; }
public ContentItemWrapper[] Item { get; set; }
}
public class ContentItemWrapper {
public string Id { get; set; } // the sample JSON carries Id as a quoted string
public string Url { get; set; }
public ContentItem[] ContentItem { get; set; }
}
public class ContentItem {
public NS NS { get; set; }
public string SourceUrl { get; set; }
// I'm gonna skip a bunch of fields, you get the idea
public Topics[] Topics { get; set; }
}
public class Topics {
public string Scheme { get; set; }
public Topic[] Topic { get; set; }
}
public class Topic {
public string TopicId { get; set; }
public string TopicName { get; set; }
}
And what you do is you use that set of type declarations (specifically the OuterWrapper) to DataContractJsonSerializer decode the JSON into a C# object that you can then query using strongly typed methods, etc. This is one of those times where C# doesn't have anywhere near the flexibility of Javascript, because everything has to be explicitly declared.
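A minimal, untested sketch of that decode step, assuming myJSON holds the raw JSON array as a string (requires System.Runtime.Serialization.Json, System.IO, and System.Text):
// The outermost JSON value is an array, so deserialize to OuterWrapper[] rather than a single Topic.
var serializer = new DataContractJsonSerializer(typeof(OuterWrapper[]));
OuterWrapper[] wrappers;
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(myJSON)))
{
    wrappers = (OuterWrapper[])serializer.ReadObject(stream);
}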
Try using the built-in serializer for JSON (http://msdn.microsoft.com/en-us/library/bb412179.aspx): new DataContractJsonSerializer(typeof(Person)).ReadObject(stream1);.
If that is not enough to read your objects, consider using JSON.Net (http://json.codeplex.com/): JsonConvert.DeserializeObject<Labels>(json);
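Putting the two suggestions together, here is a hedged, untested sketch that extracts the TopicName values with JSON.Net and binds them to the dropdown (the DropDownList ID ddlTopics is an assumption; requires Newtonsoft.Json, System.Linq, and the classes from the answer above):
var wrappers = JsonConvert.DeserializeObject<List<OuterWrapper>>(myJSON);
// Flatten the nested structure down to the topic names.
List<string> topicNames = wrappers
    .SelectMany(w => w.Item)
    .SelectMany(i => i.ContentItem)
    .SelectMany(c => c.Topics)
    .SelectMany(t => t.Topic)
    .Select(t => t.TopicName)
    .ToList();
// Bind to the ASP.NET DropDownList.
ddlTopics.DataSource = topicNames;
ddlTopics.DataBind();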
