How can I use a Dictionary parameter with Swagger UI? (C#)

I'm looking to use Swashbuckle to generate Swagger docs and Swagger UI for a Web API method that looks like the following (it's effectively a mail-merge with PDFs):
[HttpPost]
[ResponseType(typeof(byte[]))]
public HttpResponseMessage MergeValues([FromUri] Dictionary<string, string> mergeValues,
                                       [FromBody] byte[] pdfTemplate)
{
    return MergeStuff();
}
This isn't currently working, or at least I'm not sure how to interact with the resulting Swagger UI.
The generated Swagger UI seems reasonable, but I'm not sure what to do to populate it correctly (or whether it's even correct). I'm using pretty much all default Swashbuckle settings from the latest NuGet package.
Byte array: If I enter Base64-encoded text for the byte array, the byte array always shows up as null. It turns out I just need the Base64 text surrounded by double quotes, and then it works.
Dictionary: I've tried various kinds of JSON expressions (arrays, objects, etc.) and am unable to get any values into the Dictionary (the resulting Dictionary object has 0 items).
I'm able to change things and would like to know how to make this work. For example, if changing the dictionary to an array of KeyValuePair<string, string> helps, let's do it.
Options I know I have but would like to avoid:
Changing these input types to strings and doing my own manual deserialization/decoding. I'd like to be explicit with my types and not get too fancy.
Custom binders. I'd like to utilize standard/default binders so I have less code/complexity to maintain.

My question really was a two-parter. Here are the answers, although only a partial answer to the second question:
Question 1: How do I fill in the data for a byte array?
Answer 1: Paste in your Base64-encoded value, but be sure to surround that content with double quotes at both the beginning and the end.
Question 2: How do I fill in the data for the Dictionary?
Answer 2: While it doesn't work with [FromUri], this does work with [FromBody] (either Dictionary<string, string> or IDictionary<string, string> will work):
{"FirstName":"John","LastName":"Doe"}
I'm not sure why this doesn't work with FromUri but I'm going to ask a separate question that's much more focused than this one to get to the bottom of that. For the purposes of this question, both parameters can be put into a DTO, flagged as [FromBody], and all is good then.
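For illustration, moving both parameters into a single [FromBody] DTO, as described above, might look something like this (the type and property names here are hypothetical, not from the original post):

    // Hypothetical DTO so the whole payload binds from the request body.
    public class MergeRequest
    {
        // Binds from JSON like {"FirstName":"John","LastName":"Doe"}
        public Dictionary<string, string> MergeValues { get; set; }

        // Binds from a double-quoted Base64 string in the JSON body
        public byte[] PdfTemplate { get; set; }
    }

    [HttpPost]
    [ResponseType(typeof(byte[]))]
    public HttpResponseMessage MergeValues([FromBody] MergeRequest request)
    {
        return MergeStuff(request.MergeValues, request.PdfTemplate);
    }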

Related

C# custom file parsing with 2 delimiters and different record types

I have a (not quite valid) CSV file that contains rows of multiple types. Any record could be one of about 6 different types and each type has a different number of properties. The first part of any row contains the timestamp and the type of record, followed by a standard CSV of the data.
Example
1456057920 PERSON, Ted Danson, 123 Fake Street, 555-123-3214, blah
1476195120 PLACE, Detroit, Michigan, 12345
1440581532 THING, Bucket, Has holes, Not a good bucket
And to make matters more complex, I need to be able to do different things with the records depending on certain criteria. So a PERSON type can be automatically inserted into a DB without user input, but a THING type would be displayed on screen for the user to review and approve before adding to DB and continuing the parse, etc.
Normally I would use a library like CsvHelper to map the records to a type, but in this case, since the types can differ and the first part uses a space instead of a comma, I don't know how to do that with a standard CSV library. So currently, on each loop iteration, I:
Split the string on commas.
Split the first array item on the space.
Use a switch statement to determine the type and create the object.
Put that object into a List<object>.
Get confused about where to go next, because I now have a list of various types and will have to use yet another switch or if to determine the next steps.
I don't really know for sure if I will actually need that List but I have a feeling the user will want the ability to manually flip through records in the file.
By this point, this is starting to make for very long, confusing code, and my gut feeling tells me there has to be a cleaner way to do this. I thought maybe using Type.GetType(string) would help simplify the code some, but this seems like it might be terribly inefficient in a loop with 10k+ records and might make things even more confusing. I then thought maybe making some interfaces might help, but I'm not the greatest at using interfaces in this context and I seem to end up in about this same situation.
So what would be a more manageable way to parse this file? Are there any C# parsing libraries out there that would be able to handle something like this?
You can implement an IRecord interface that has a Timestamp property and a Process method (perhaps others as well).
Then, implement concrete types for each type of record.
Use a switch statement to determine the type and create and populate the correct concrete type.
Place each object in a List<IRecord>.
After that you can do whatever you need. Some examples:
Loop through each item and call Process() to handle it.
Use LINQ's .OfType<{concrete type}>() to segment the list. (Warning: with 10k records this would be slow, since it would traverse the entire list for each concrete type.)
Use an overridden ToString method to give a single text representation of the IRecord
If using WPF, you can define a datatype template for each concrete type, bind an ItemsControl derivative to a collection of IRecords and your "detail" display (e.g. ListItem or separate ContentControl) will automagically display the item using the correct DataTemplate
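A minimal sketch of the design described above might look like this (member names beyond Timestamp and Process are illustrative):

    using System;

    public interface IRecord
    {
        long Timestamp { get; }
        void Process();
    }

    public class PersonRecord : IRecord
    {
        public long Timestamp { get; set; }
        public string Name { get; set; }
        public string Address { get; set; }

        // A PERSON can go straight into the DB without user input.
        public void Process() { /* insert into DB */ }
    }

    public class ThingRecord : IRecord
    {
        public long Timestamp { get; set; }
        public string Name { get; set; }
        public string Description { get; set; }

        // A THING needs user review before being added to the DB.
        public void Process() { /* queue for review */ }
    }

    public static class RecordParser
    {
        // Split on commas, then split the first field on the space to get
        // the timestamp and type token, then switch on the token.
        public static IRecord Parse(string line)
        {
            var fields = line.Split(',');
            var header = fields[0].Split(' ');   // e.g. "1456057920 PERSON"
            long ts = long.Parse(header[0]);

            switch (header[1])
            {
                case "PERSON":
                    return new PersonRecord { Timestamp = ts, Name = fields[1].Trim(), Address = fields[2].Trim() };
                case "THING":
                    return new ThingRecord { Timestamp = ts, Name = fields[1].Trim(), Description = fields[2].Trim() };
                default:
                    throw new FormatException("Unknown record type: " + header[1]);
            }
        }
    }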
Continuing from my comment: well, that depends. What you described is actually a pretty good start. You can of course expand it into a series of factories, one per object type, so that you move from an explicit switch to searching for the first factory that can parse a line. That might prove useful if you plan to add more object types in the future: you just add another factory for the new kind of object. Whether these objects should share a common interface is up to you. An interface is generally used to define behavior, so it doesn't seem necessary here. Maybe you should just use a Dictionary instead? You need to ask yourself whether you actually need strongly typed objects here. Maybe what you need is a simple class with an ObjectType property and a Dictionary of properties, plus some helper methods for easy typed property access, like GetBool, GetInt, or a generic Get.
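The factory variation could be sketched like this (all names are illustrative; a minimal IRecord interface is included so the sketch stands alone):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public interface IRecord
    {
        long Timestamp { get; }
    }

    public interface IRecordFactory
    {
        // Each factory recognizes exactly one type token.
        bool CanParse(string typeToken);
        IRecord Parse(long timestamp, string[] fields);
    }

    public class PersonRecord : IRecord
    {
        public long Timestamp { get; set; }
        public string Name { get; set; }
    }

    public class PersonFactory : IRecordFactory
    {
        public bool CanParse(string typeToken) => typeToken == "PERSON";
        public IRecord Parse(long timestamp, string[] fields) =>
            new PersonRecord { Timestamp = timestamp, Name = fields[1].Trim() };
    }

    public static class LineParser
    {
        // Adding a new record type means adding one factory; this loop never changes.
        public static IRecord Parse(string line, IEnumerable<IRecordFactory> factories)
        {
            var fields = line.Split(',');
            var header = fields[0].Split(' ');
            long ts = long.Parse(header[0]);
            var factory = factories.First(f => f.CanParse(header[1]));
            return factory.Parse(ts, fields);
        }
    }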

Binding a TextBoxFor to a value of Dictionary<CultureInfo, string> for localization

In an ASP.NET MVC 5 web application, I have a Title string property in my model which is easily bound to a TextBoxFor.
However, because of localization needs, I turned the Title property into a Dictionary<CultureInfo, string>. I've already read around the net that I can bind the TextBoxFor to, say, Model.Title[new CultureInfo("en-US")].
Question #1: am I correct when I assume I can also bind to Model.Title[Model.CurrentLanguage] (or another variable holding the relevant CultureInfo)?
The main problem arises when there is no localization yet, i.e., I'm asking for a given localization for the first time. The dictionary does not contain the key yet, so the binding fails with an exception.
Question #2: how could I manage the missing key case? I know I could pre-fill the dictionary with all needed cultures having an empty or null string, but I would really rather not do that, as it removes flexibility, and it would create many entries in each dictionary while I may only need one or two.
EDIT
Maybe it wasn't clear from the question, but the localizations I'm talking about are user data, and will be read/written from/to a database. They are not application resources.
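For what it's worth, one minimal way to avoid the missing-key exception without pre-filling every culture would be to ensure that just the requested key exists before rendering the view (a sketch; Title and CurrentLanguage are the names from the question):

    // In the controller action, before returning the view: create the entry
    // for the current culture only, so the TextBoxFor binding doesn't throw
    // and only the cultures actually used end up in the dictionary.
    if (!model.Title.ContainsKey(model.CurrentLanguage))
    {
        model.Title[model.CurrentLanguage] = string.Empty;
    }
    return View(model);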
IMO I would go the resource-file route rather than the dictionary way of localization. Just like here: https://docs.asp.net/en/latest/fundamentals/localization.html
That way, even if you don't have, let's say, the fr version of your localized string (no xxx.fr.resx file), it will return the default string (the one in the main resource file, the one without a culture, xxx.resx).
To me, it looks like you're doing the decision-making in the wrong place. Rather than passing a dictionary containing titles for different cultures, you should do your culture/title selection in your controller and pass only what you need to the view. Then you can bind to a String title.

Deserializing a model from an api that contains Enums

We're developing a web application using ASP.NET MVC 5 and C#, and I'm having difficulty coming up with a nice solution to my problem.
We're accessing an API that returns a few enum-like values as text (i.e. "yes", "no", "notfound", "na"). On the model itself I'd like to have strongly typed enums that are stored in the database as integers (to save a little space, I think). The issue comes when we're deserializing the response from the API: the response contains text (see above), whereas the enums will be expecting an integer.
How is this done? I really hate asking questions where I haven't tried anything, but my web searches haven't turned up anything. If you need any code, please let me know and I'll update the question with the relevant segments. As of right now the model has several strongly typed enums, and the API is returning strings of the enum values. By the way, the enum texts are the same as the returned values.
In case it makes any difference, we're also using EF Code First 6.
You can use Enum.TryParse to convert the string to its enum value:
public enum MyEnum
{
    NA,
    Yes,
    No
}
then
MyEnum value = MyEnum.NA;
Enum.TryParse<MyEnum>(theTextValue, out value);
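Since the API returns lowercase text ("yes", "na", etc.) while the enum members are mixed-case, the case-insensitive overload is probably what you want (a small sketch):

    using System;

    public enum MyEnum { NA, Yes, No }

    class Demo
    {
        static void Main()
        {
            MyEnum value;
            // Passing true for ignoreCase lets "yes" match MyEnum.Yes.
            if (Enum.TryParse<MyEnum>("yes", true, out value))
            {
                Console.WriteLine(value);   // Yes
            }
        }
    }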

Url Encoding an array

This might seem dirty but it's for documentation purposes I swear!
I am accessing my services using GETs in my documentation so people can try things out without needing to get too complicated.
Appending x-http-method-override=POST to the URL forces the server to take a GET as a POST
This is all good except when I need to POST an array of objects. That would be simple in a standard POST, but today I have a new breed of nightmare.
The expected POST looks like:
{"name":"String","slug":"String","start":"String","end":"String","id":"String","artists":[{"id":"String","name":"String","slug":"String"}],"locationId":"String"}
As you can see there is an array of artists up in here.
I have tried to do the following:
model/listing?start=10:10&end=12:30&artists[0].name=wayne&artists[0].id=artists-289&locationid=locations-641&x-http-method-override=POST
But to no avail.
How can I get an array of objects into a URL so that service stack will be happy with it?!
I appreciate this is not the done thing but it's making explaining my end points infinitely easier with clickable example URLs
You can use JSV to encode complex objects in the URL. This should work for your DTO:
model/listing?name=wayne&artists=[{id:artists-289,name:sample1},{id:artists-290,name:sample2}]&locationId=locations-641
You can programmatically create JSV from an arbitrary object using the ToJsv extension method in ServiceStack.Text.
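For reference, producing JSV from an object with ServiceStack.Text might look something like this (a sketch using an anonymous type shaped like the DTO in the question):

    using ServiceStack.Text;

    var listing = new
    {
        name = "wayne",
        artists = new[]
        {
            new { id = "artists-289", name = "sample1" },
            new { id = "artists-290", name = "sample2" }
        },
        locationId = "locations-641"
    };

    // ToJsv() is the ServiceStack.Text extension method mentioned above;
    // the result can be URL-encoded and embedded in the query string.
    string jsv = listing.ToJsv();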

Parsing an Auto-Generated .NET Date Object with JavaScript/jQuery

There are some posts on this, but not an answer to this specific question.
The server is returning this: "/Date(1304146800000)/"
I would like to not change the server-side code at all and instead parse the date that is included in the .Net generated JSON object. This doesn't seem that hard because it looks like it is almost there. Yet there doesn't seem to be a quick fix, at least in these forums.
From previous posts it sounds like this can be done using REGEX but REGEX and I are old enemies that coldly stare at each other across the bar.
Is this the only way? If so, can someone point me to a REGEX reference that is appropriate to this task?
Regards,
Guido
The link from Robert is good, but we should strive to answer the question here, not to just post links.
Here's a quick function that does what you need. http://jsfiddle.net/Aaa6r/
function deserializeDotNetDate(dateStr) {
    var matches = /\/Date\((\d*)\)\//.exec(dateStr);
    if (!matches) {
        return null;
    }
    return new Date(parseInt(matches[1], 10));
}
deserializeDotNetDate("/Date(1304146800000)/");
Since you're using jQuery I've extended its $.parseJSON() functionality so it's able to do this conversion for you automatically and transparently.
It converts not only .NET dates but ISO dates as well. ISO dates are supported by native JSON parsers in all major browsers, but they work only one way because the JSON spec doesn't support a date data type.
Read all the details (don't want to copy blog post content here because it would be too much) in my blog post and get the code as well. The idea is still the same: change jQuery's default $.parseJSON() behaviour so it can detect .Net and ISO dates and converts them automatically when parsing JSON data. This way you don't have to traverse your parsed objects and convert dates manually.
How it's used?
$.parseJSON(yourJSONstring, true);
See the additional parameter? It makes sure that all your existing code works as expected without any change. But if you do provide the additional parameter and set it to true, it will detect dates and convert them accordingly.
Why is this solution better than manual conversion? (suggested by Juan)
Because you lower the risk of the human factor of forgetting to convert some variable in your object tree (objects can be deep and wide).
Because your code is in development, and if you change some server-side part that returns JSON to the client (rename variables, add new ones, remove existing ones, etc.), you have to think about these manual conversions on the client side as well. If you do it automatically, you don't have to think (or do anything) about it.
Those are the top two reasons off the top of my head.
When overriding jQuery functionality feels wrong
When you don't want to actually override the existing $.parseJSON() functionality, you can minimally change the code and rename the extension to $.parseJSONwithdates(), and then always use your own function when parsing JSON. But you may have a problem when you set your Ajax calls to dataType: "json", which automatically calls the original parser. If you use this setting, you will have to override jQuery's existing functionality.
The good thing is that you don't change the original jQuery library file. You put this extension in a separate file and use it at will. Some pages may use it, others may not. But it's wise to use it everywhere; otherwise you have the same human-factor problem of forgetting to include the extension. Just include your extension in some global JavaScript file (or master page/template) you may be using.
