I have a little problem with my Alexa skill. I want to include display components in it, and I want to send an appropriate directive. For most of my code, I use custom classes from the API, e.g.
Resources colorsDark = new Resources();
colorsDark.description = "Color for dark theme";
colorsDark.when = "${viewport.theme == 'dark'}";
However, for one part of my skill, I use only previously-created values, so there is no need to create new objects and assign values to them. Instead, I've created a .json file that includes all the necessary information.
I'd like to point my code to this file, but here I encountered an issue.
I'd like to make it look like this:
doc.Styles = [JSON_FILE]
However, when the function is executed, it can't find this file.
I'm using JObject from Newtonsoft.Json.
I tried to use only the relative path:
JObject jObject = JObject.Parse(File.ReadAllText(".\\AlexaPresentationLanguage\\Styles\\ListStyle.json"));
as well as some other approaches, like Directory.GetCurrentDirectory and Path.Combine from System.IO.
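For example, a sketch of the kind of call I mean (assuming the AlexaPresentationLanguage folder and the .json file are set to copy to the build output):

// Build an absolute path instead of relying on the relative path alone;
// the working directory is not guaranteed to be the deployment folder.
string stylesPath = Path.Combine(
    Directory.GetCurrentDirectory(),
    "AlexaPresentationLanguage", "Styles", "ListStyle.json");
JObject jObject = JObject.Parse(File.ReadAllText(stylesPath));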
So far nothing has worked. Do you have any ideas what I can do?
Noob junior dev here, trying to fix some old code written by someone more senior who has since left. This person had a JSON file with a structure like this:
"A": {
  "B": {
    "C": "stuff1",
    "D": "stuff2",
    "E": "stuff3"
  }
},
...
The old code deserializes the JSON file into a JObject and then tries to get information from the child node like this:
stuff1 = jObject["C"];
When I run the code, it just fills stuff1 with null. When I use the whole path like this:
stuff1 = jObject["A"]["B"]["C"]
I get the information I need from the child node. Just wondering if this was an oversight on the original author's part or if there's something I'm not seeing.
Am I right that you shouldn't be able to access the information in the child node without navigating through the tree properly? Is there any more robust way to get the information from the child node directly? I can see a scenario where, if the structure of the JSON file changes, jObject["A"]["B"]["C"] will no longer work.
If you're using Json.NET, that looks like a bug in the old code, or the JSON structure changed since it was written.
However you address the situation, write a unit test that shows which JSON(s) the code is expected to work with. The next dev will have it easier, and that next dev may be you in six months.
Dealing with changing json
I can see a scenario where, if the structure of the JSON file changes, jObject["A"]["B"]["C"] will no longer work.
Dealing with changing JSON is ... let's say hard. If the JSON library you're using supports JSONPath, you could try '$..C'. If you're using Json.NET, have a look at Querying JSON with LINQ.
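A minimal sketch of the JSONPath route with Json.NET's SelectToken, using the property names from the example above (json here is just the raw JSON string):

using Newtonsoft.Json.Linq;

JObject jObject = JObject.Parse(json);

// Exact path -- breaks as soon as the structure changes
JToken exact = jObject["A"]?["B"]?["C"];

// Recursive descent -- finds the first "C" anywhere in the tree
JToken anywhere = jObject.SelectToken("$..C");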
Fixing the bug
Use jObject["A"]["B"]["C"] or deserialise to a class and move on.
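For the deserialise-to-a-class option, a rough sketch (the class and property names are just made up to mirror the example structure; json is again the raw JSON string):

using Newtonsoft.Json;

public class Root  { public Level A { get; set; } }
public class Level { public Leaf  B { get; set; } }
public class Leaf
{
    public string C { get; set; }
    public string D { get; set; }
    public string E { get; set; }
}

// Deserialize once, then navigate with compile-time checked properties
var root = JsonConvert.DeserializeObject<Root>(json);
string stuff1 = root.A.B.C;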
I'm at a loss with the KTA SDK. My intention is to pass a scanned document in PDF format, with a few headers, to KTA's jobs queue. As I'm still going through the documentation, my best guess right now is to use the Document class as a DTO and then call a method that takes that Document as a parameter:
...
[HttpPost]
public HttpResponseMessage Upload()
{
var httpRequest = System.Web.HttpContext.Current.Request;
var DocType = httpRequest.Headers["X-DocType"];
var Pages = httpRequest.Headers["X-DocPages"];
var Title = httpRequest.Headers["X-DocTitle"];
Agility.Sdk.Model.Capture.Document doc = new Agility.Sdk.Model.Capture.Document();
// doc.DocumentType = DocType; // Type DocumentTypeSummary
doc.NumberOfPages = Convert.ToInt32(Pages);
doc.FileName = Title;
...
I'm just wondering if I'm on the right track in doing this?
My other question is where we can store data from custom headers. In this example, I need to store custom headers called Comments and AccountNumber.
Lastly, what service needs to be called to send this document to KTA jobs queue? Would CaptureDocumentService be the right one?
I'd greatly appreciate any help on this.
Start with the details laid out in the Sample App example. It shows what to add to your app.config, but what it doesn't call out explicitly enough is that you should change the SdkServicesLocation value for your environment. You then simply call the functions on the services within the TotalAgility.Sdk namespaces, and they handle the web service calls for you.
The CaptureDocumentService might be part of what you need, and there is a set of samples dedicated to the functions on that service. It refers to the Sample Processes folder, which by default is here:
C:\Program Files\Kofax\TotalAgility\Sample Processes\Capture SDK Sample Package
However, what you will definitely need are the functions on the JobService. There are different functions with different options, but CreateJobWithDocuments is probably what you want to start with. You can see that this creates document(s) and a job together in one step.
There is similarity with the parameters on CaptureDocumentService.CreateDocument3, so you might cross-reference with that to best understand the parameters. The difference is that CreateDocument3 just creates a document in the abstract: you want to actually use it as an input to create a job, so use the combined function.
Finally, to pass fields in, you will be setting RuntimeField objects as part of the RuntimeDocument objects going into your CreateJobWithDocuments call.
Suppose I have a nested JSON file which looks like this:
[
  {
    "toplevel": { "firstleveldata": { "id": 12345, "no": 123 } },
    "secondleveldata": {
      "fruit": { "apples": "y" },
      "veg": { "small": { "gree": { "fresh": { "available": 3 } } } }
    },
    "thirdleveldata": {
      "fruit": {
        "changes": [
          { "itemid": 1, "subno": 1, "green": [], "red": [ { "extraid": 2, "element": 5 } ] }
        ]
      }
    }
  }
]
and the following R code, which can parse it into a nice data.frame except for the last field (if that can be fixed too, that's a bonus).
What would a neat alternative using C# look like? It's not something I've ever coded in before, so I wouldn't know where to start.
The end goal is to export the nested JSON into a CSV, or multiple CSVs with primary/foreign keys to merge on if necessary.
R code for context.
library(jsonlite)
library(tidyverse)
fun <- function(x) {
  list(
    # keys
    id = pluck(x, "toplevel", "firstleveldata", "id", .default = NA),
    no = pluck(x, "toplevel", "firstleveldata", "no", .default = NA),
    apples = pluck(x, "secondleveldata", "fruit", "apples", .default = NA),
    itemid = pluck(x, "toplevel", "thirdleveldata", "fruit", "changes", "itemid", .default = NA) # doesn't work
  )
}
out<-map_df(list.files("my_json_file",full.names=TRUE),~map_df(fromJSON(txt=., simplifyVector=FALSE), fun))
out
Thanks
A quick way to explore this structure and get started with JSON in C# is to start from the JSON packet as you expect it to be. Taking what you have in your question as an example, copy the JSON to the clipboard, create a new C# code file, and write only your default namespace, as follows:
namespace Data
{
// Next step will be here
}
After that, place your cursor between the curly braces, open the Edit menu, then choose Paste Special > Paste JSON As Classes.
This will generate classes for you that match the JSON schema you have. In this case there's a Class1, because the tool cannot determine a name for the object holding the toplevel, secondleveldata, etc. properties; you can give it a more meaningful name for your case.
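Roughly what the generator produces for the top part of the example (the exact names are whatever the tool picks; here Class1 has already been renamed to MyItem, and only the first branch is shown):

public class MyItem
{
    public Toplevel toplevel { get; set; }
    public Secondleveldata secondleveldata { get; set; }
    public Thirdleveldata thirdleveldata { get; set; }
}

public class Toplevel
{
    public Firstleveldata firstleveldata { get; set; }
}

public class Firstleveldata
{
    public int id { get; set; }
    public int no { get; set; }
}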
Update for .NET 5.0 and later versions
Following the release of .NET 5.0, a namespace, System.Text.Json, now comes out of the box that can handle JSON as easily as Newtonsoft and in a performant way.
The simplest way to parse the JSON into your objects is a one-liner, as follows...
var items = JsonSerializer.Deserialize<IEnumerable<MyItem>>(jsonData);
For this to work, you'll need to have the following in your usings...
using System.Collections.Generic; // For IEnumerable<T>
using System.Text.Json; // For JsonSerializer.*
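For the end goal of flattening to CSV, a rough follow-on sketch using the same assumed class names as above (you'd also need using System.IO and using System.Linq; the file names are just placeholders):

var jsonData = File.ReadAllText("my_json_file.json");
var items = JsonSerializer.Deserialize<IEnumerable<MyItem>>(jsonData);

// One CSV row per top-level item; extend with more columns as needed.
var rows = items.Select(i =>
    $"{i.toplevel.firstleveldata.id},{i.toplevel.firstleveldata.no},{i.secondleveldata.fruit.apples}");
File.WriteAllLines("out.csv", new[] { "id,no,apples" }.Concat(rows));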
Following is the original answer, which used the Newtonsoft.Json NuGet package. That still works with .NET 5.0, but I've included the new namespace for those who wish to avoid including an extra DLL whose functionality is now covered out of the box.
Original method using Newtonsoft.Json
Open the Package Manager Console (if you can't see it in VS, press Ctrl+Q and type "Package Manager Console"). In the console, type the following:
Install-Package Newtonsoft.Json
after ensuring that your solution is saved and that the project you want to add the package to is selected as the default project in the console,
and wait for it to complete. You'll now have the ability to deserialize JSON you have into your classes. Go to where you want to do the processing and type the following...
// at the top of the file...
using Newtonsoft.Json;
// where you need to decode it...
string jsonData = GetYourJsonDataFromAFileOrAPICall(); // replace with how you get the json
var items = JsonConvert.DeserializeObject<IEnumerable<MyItem>>(jsonData);
At this point, you'll have an IEnumerable<MyItem> (MyItem is what I renamed Class1) representing your json. You can change that to be a list, or array if you want a more specific collection.
I followed the instructions in "Put strings into resource files, instead of putting them directly in code or markup" from the article Put UI strings into resources (except I don't understand step 4f). In Solution Explorer, the resources in my project sit under the project's shared node; I opened the Resources1.resw file and added a couple of strings.
Then the "Add string resource identifiers to code and markup" section in that article has the following:
var loader = new Windows.ApplicationModel.Resources.ResourceLoader();
When I have that I get:
WinRT information: ResourceMap Not Found.
I have tried many other possibilities that I have found from searching but either I get that error or the class or method (in other solutions) do not exist for my project. I assume there is something relevant missing from that article.
Using C#, how do I get strings from that resource file?
You don't need to rename your resource file. If the name of the resource file isn't the default (Resources.resw), you can pass the name (without the .resw extension) to the GetForCurrentView method.
In your case the call should be:
var loader = Windows.ApplicationModel.Resources.ResourceLoader.GetForCurrentView("Resources1");
Source: GetForCurrentView(System.String)
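Once you have the loader, you read a string by the identifier you gave it in the .resw file, for example (the identifier name here is just a placeholder):

string title = loader.GetString("MyStringIdentifier");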
Did you try to specify the default language for your actual frame (e.g. in App.xaml.cs)?
myFrame = new Frame
{
Language = Windows.Globalization.ApplicationLanguages.Languages[0]
};
Or, to force a specific language, set the primary language override:
Windows.Globalization.ApplicationLanguages.PrimaryLanguageOverride = "en";
I've got 3 sets of 9 images in separate .resx files, and I'm trying to figure out how to loop a single set into 9 static picture boxes.
Loop through all the resources in a .resx file
I've looked through some of the solutions in the above link, like using ResXResourceReader, but it comes up with a parsing error when I use the GetEnumerator method.
When I use the ResourceSet resourceSet = MyResourceClass.ResourceManager.GetResourceSet(CultureInfo.CurrentUICulture, true, true); line, there's no definition for the ResourceManager within the Form class, or a GetResourceSet method when I create my own ResourceManager.
There is actually a method called CreateFileBasedResourceManager which I've dabbled in, but truth be told I don't understand the parameters it needs too well aside from the directory.
I've also looked at some of the solutions involving assemblies and retrieving the executing image assembly at runtime, but I think that's a little out of my depth at the moment.
Can anyone tell me what I'm doing wrong with the first two methods, or suggest something entirely different?
Looking at MSDN, you should be able to iterate the values from a RESX file like so:
string resxFile = @".\CarResources.resx";
// Get resources from .resx file.
using (ResXResourceSet resxSet = new ResXResourceSet(resxFile))
{
// Retrieve the image.
Object image = resxSet.GetObject("NAMEOFFILE", true);
}
If you wanted to iterate all objects in the RESX file, you could do something like this:
using (ResXResourceReader resxReader = new ResXResourceReader(resxFile))
{
    foreach (DictionaryEntry entry in resxReader)
    {
        // entry.Key is the resource name
        // entry.Value is the actual object...add it to the list of images you were looking to keep track of
    }
}
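To get one set into the 9 picture boxes, a rough follow-on sketch (it assumes the resource values deserialize to System.Drawing.Image objects, and that pictureBoxes is an array you've built from pictureBox1 through pictureBox9; resxFile is the same path as above):

var images = new List<Image>();
using (var resxReader = new ResXResourceReader(resxFile))
{
    foreach (DictionaryEntry entry in resxReader)
    {
        if (entry.Value is Image img)   // skip any non-image entries
            images.Add(img);
    }
}

// Assign the images to the picture boxes in order
for (int i = 0; i < pictureBoxes.Length && i < images.Count; i++)
    pictureBoxes[i].Image = images[i];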
More can be found here.
I know this is an old question, but today I hit the same problem and solved it by setting the BasePath property, like this:
oResReader = new ResXResourceReader(strInputFile);
oResReader.BasePath = Path.GetDirectoryName(strInputFile);
I found this solution here