Doubts about storing and processing JSON in C#

I have to implement the server-side code for a Form Builder front-end component which allows users to create forms through drag and drop, something like in this link.
When the form is created, a JSON document is generated and has to be stored in the database. From there, there are two uses: first, to edit an already stored form JSON, and second, to render the form as an HTML view using the JSON so that users can consume it. The form JSON looks something like this, although the nesting gets deeper for some form components.
[{"name":"UserForm"},
{"id":"txtID","label":"Your Name","placeholder":"your name","helptext":"Please enter your name","required":true,"inputsize":"input-xlarge"},
{"id":"selectID","options":["Cricket","Football"],"inputsize":"input-xlarge","label":"Favorite Game"}]
When storing this JSON, I can see two ways, and choosing between them is confusing me. First, I can create entity classes representing the JSON, along with separate SQL Server tables for these entities. The idea is to create complete entities with relationships representing the JSON hierarchy.
The other is to store the complete JSON in a single string (or similar) column in one SQL Server table and work on that string when processing.
The first option seems fragile, since if the JSON keys change, the database schema has to be updated. The second option can absorb such changes without requiring any changes to the database. But I am not sure how to process the JSON string as objects in C# code. I will need this to implement, say, validation logic for any form implementation. So, for example, if a field's "required" key is true in the JSON, then that field has to be validated on the server side using C#.
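To make that concrete, here is a rough sketch of the kind of server-side check I mean, assuming the JSON is deserialized into C# classes with JSON.NET (the FormField class, its properties, and the validator are invented for illustration; they are not part of the actual front-end output):
using System.Collections.Generic;
using Newtonsoft.Json;

// Hypothetical shape of one form component in the stored JSON.
public class FormField
{
    public string Id { get; set; }
    public string Label { get; set; }
    public bool Required { get; set; }
    public List<string> Options { get; set; }
}

public static class FormValidator
{
    // Deserialize the stored form definition and check that every
    // required field has a value in the submitted data.
    public static IEnumerable<string> Validate(string formJson, IDictionary<string, string> submittedValues)
    {
        var fields = JsonConvert.DeserializeObject<List<FormField>>(formJson);
        foreach (var field in fields)
        {
            if (string.IsNullOrEmpty(field.Id) || !field.Required)
                continue;

            string value;
            if (!submittedValues.TryGetValue(field.Id, out value) || string.IsNullOrWhiteSpace(value))
                yield return (field.Label ?? field.Id) + " is required.";
        }
    }
}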
So, in short: how should I store the JSON, as a string field or as entities representing the JSON hierarchy?

Related

Accessing JSON on server side (asp.net)

I have a web app that is still stuck on ASP.NET 2.5. We've started using some newer technology on our front end, including loading some of our data into JSON objects and passing them around the application using localStorage. It's working really well.
In the future we're going to change our ASP.NET Web Forms architecture into an HTML5/jQuery front end and Web API back end. So we're trying to write for that future while still being constrained by our old Web Forms postbacks and business objects. Right now we post from our search form to our search results Web Form and call a method on our business object to grab and return search results.
The criteria object we pass in has 20 or so values and a couple of collections (product line IDs, category IDs, etc.), so it's a slightly complicated object. In the old form we grabbed values from the controls, validated them, and passed them in using the ASP.NET controls; it was a single-form solution. Our new solution has a search form and a results page, and we pass our values from form to form in a JSON object in localStorage. I can't really get to that from the server side, so I also stash the values in a hidden field on the form that I can grab on the server side when I POST to the results page (eventually we'll call an API from the new form using AJAX).

So now that I can see the data, how do I parse and work with a JSON object in the code-behind of ASP.NET? I need to load the 20 or so search criteria values and iterate through the ID collections (Guid and int) to load them into the same criteria object. This object is then passed in as the search method's parameter and search results come back. I'm not sure how to manipulate the JSON on the server side.
If I understand the question, you have a JSON string on the server, and you just need to work with it.
Easiest way is to define a class (or in this case, a few classes) representing the data, then simply deserialize the JSON into an instance of the class. From there, you've got a regular object, and can do whatever you need with it, including serializing it back to JSON if you want.
With JSON.NET (aka Newtonsoft.Json), it's as simple as:
var myObject = JsonConvert.DeserializeObject<SomeType>(jsonString);
If you need help building the class, you can use json2csharp, where you can paste in a sample JSON document and it builds the appropriate C# classes for you.
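As a rough sketch of what that could look like for the search-criteria scenario above (the SearchCriteria class, its properties, and the hidden-field/service names are made up for illustration; your real class should mirror the JSON you actually post):
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

// Hypothetical shape of the criteria JSON stashed in the hidden field.
public class SearchCriteria
{
    public string Keyword { get; set; }
    public List<Guid> ProductLineIds { get; set; }
    public List<int> CategoryIds { get; set; }
    // ... the other ~20 scalar values
}

// In the code-behind of the results page:
// var criteria = JsonConvert.DeserializeObject<SearchCriteria>(criteriaHiddenField.Value);
// var results = searchService.Search(criteria);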
You can use the DataContractJsonSerializer class if the .NET Framework is upgraded to 3.5 or later.
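A minimal sketch of that approach, assuming a simple data-contract class (the class and member names are illustrative):
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Text;

[DataContract]
public class Criteria
{
    [DataMember] public string Keyword { get; set; }
    [DataMember] public int[] CategoryIds { get; set; }
}

public static class CriteriaJson
{
    // Deserialize a JSON string posted from the client.
    public static Criteria FromJson(string json)
    {
        var serializer = new DataContractJsonSerializer(typeof(Criteria));
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
        {
            return (Criteria)serializer.ReadObject(stream);
        }
    }
}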
I also found this article on the web, http://www.netfxharmonics.com/2008/01/DojrNET-Dojo-RPC-Library-NET-20, which uses the Dojr.NET library and is compatible with .NET 2.0.
Hope this helps.

Designing app to load, edit, and save hierarchical data

I am writing a GUI that will be integrated with SAP Business One. I'm having a difficult time determining how to load, edit, and save the data in the best way possible (reliable, fast, easy).
I have two tables, called Structures and StructureRows (not great names). A structure can contain other structures, generics, and specifics. The structure rows hold all of these items and have a type associated with them. The generics are placeholders for specifics, and the specifics are actual items in inventory.
A job will contain job metadata as well as n structures. On the screen where you edit the job, you can add and delete structures as well as edit the rows underneath them. For example, if you added Structure 1 to Job 1 and Structure 1 contains Generic 1, the user would be able to swap Generic 1 for a specific.
I understand how to store the data, but I don't know what the best method to load and save the data is...
I see a few different options:
When someone adds a structure to a job, load the structure, and then recursively load any structures beneath it (the generics and specifics will already be loaded). I would put this all into an object model such as a List<Structure>, where each Structure object has a List<Generic> and a List<Specific> (sketched after these options). When I save the changes back to the database, I would have to manually loop through the data and persist the changes.
Somehow load the data into a view in SQL and then group and order the DataTable/DataSet on the client side. Bind the data to a GridView so changes are automatically reflected in the DataSet. When you go to save, SQL / ADO.NET could handle this automatically? This seems like the ideal solution, but I don't know how to actually implement it...
The part that throws me off is being able to add a structure to a structure. If it weren't for this, I would select the specifics and generics from the StructureRows table and group them in the GUI based on the structure they belong to. I would have them in a DataTable and bind that to the GridView so any changes were persisted automatically to the DataTable, and then I could turn around and push them to SQL very easily...
Is loading and saving the data manually via an object model the only option I have? If not, how would you do it? I'm not sure if I'm making it more complicated than it needs to be, or if this is actually difficult to do with C#, ADO.NET, and MS SQL.
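For reference, here is a minimal sketch of the object model from the first option; the exact members are an assumption on my part, not the real schema:
using System.Collections.Generic;

// Hypothetical in-memory model for a job and its hierarchy.
public class Job
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<Structure> Structures { get; set; } = new List<Structure>();
}

public class Structure
{
    public int Id { get; set; }
    public List<Structure> ChildStructures { get; set; } = new List<Structure>();
    public List<Generic> Generics { get; set; } = new List<Generic>();
    public List<Specific> Specifics { get; set; } = new List<Specific>();
}

public class Generic { public int Id { get; set; } public string Name { get; set; } }
public class Specific { public int Id { get; set; } public string ItemCode { get; set; } }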
The hierarchyid data type was introduced in SQL Server 2008 to handle exactly this kind of thing. I haven't used it myself, but here's a place to start that gives a fair example of how to use it.
That being said, if you aren't wedded to your current tables and don't need to query the individual elements (in other words, you always deal with the job as a whole), I'd be tempted to store the data for each job as XML. (If you were doing a web-based app, you could also go with JSON.) It preserves the hierarchy, and there are many tools in .NET for working with XML. There's also a built-in TreeView class for WinForms, and doubtless other third-party controls available.
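If you go the XML route, here is a sketch of how a whole job could be round-tripped with XmlSerializer, assuming the Job/Structure classes sketched in the question (the real mapping would depend on your schema):
using System.IO;
using System.Xml.Serialization;

public static class JobXml
{
    private static readonly XmlSerializer Serializer = new XmlSerializer(typeof(Job));

    // Serialize the whole job hierarchy to an XML string, suitable for
    // storing in a single column.
    public static string Save(Job job)
    {
        using (var writer = new StringWriter())
        {
            Serializer.Serialize(writer, job);
            return writer.ToString();
        }
    }

    // Rebuild the job object graph from the stored XML.
    public static Job Load(string xml)
    {
        using (var reader = new StringReader(xml))
        {
            return (Job)Serializer.Deserialize(reader);
        }
    }
}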

C#: Save serialized object to a text field in the DB; how to maintain object format versions in the future?

I have a complex object (nested properties, collections, etc.) in my ASP.NET MVC C# application. I don't need to save it into multiple tables in the DB; serializing the whole object and storing it as a whole is fine.
I plan to serialize the whole object (into something human-readable like JSON/XML) and store it in a text field in the DB.
I later need to load this object from the DB and render it using a strongly-typed view.
Here comes the question: in the future the class of the object can change (I can add/remove fields, etc.), but serialized versions saved into the DB before that will not reflect the change.
How to deal with this?
You should write some sort of conversion utility every time you significantly change structured, serialized messages, and run it as part of an upgrade process. Adding or removing fields that are nullable isn't likely to be a problem, but larger structural changes will be.
You could do something like implement IXmlSerializable, peek at the message and figure out what version the message is and convert it appropriately, but this will quickly become a mess if you have to do this a lot and your application has a long lifecycle. So, you're better off doing it up front in an upgrade process, and outside of your application.
If you are worried about running the conversion on lots of records as part of an upgrade, you could come up with some ways to make it more efficient (for example, add a column to the table that contains the message schema version, so you can efficiently target messages that are out of date).
As long as you're using JSON or XML, added fields shouldn't be a problem (as long as no specific version schemas are enforced). The default .NET XML serializer, for instance, doesn't include fields that have their default value (which can be set with the System.ComponentModel.DefaultValue attribute). So the new fields will be treated the same as omitted fields while deserializing and get their default values (default class values, that is; the DefaultValue attribute only applies to serialization/designer behaviour).
Removed fields depend on your deserialization implementation, but it can be made so that they are ignored. Personally I tend to keep the properties but mark them as obsolete, with a message saying what they were once for. That way you'll know not to use them when coding, but they can still be filled for backwards compatibility (and they shouldn't be serialized when marked obsolete). When possible, you could implement logic in the obsolete property that fills the renewed data structure.
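A sketch of that pattern, purely as an illustration (the class and property names are invented, and whether an obsolete member is skipped depends on the serializer you use):
using System;

public class CustomerDto
{
    // Current shape of the data.
    public string FirstName { get; set; }
    public string LastName { get; set; }

    // Legacy property kept so old serialized records still deserialize.
    // When the serializer sets it, forward the value into the new structure.
    [Obsolete("Replaced by FirstName/LastName; kept for old stored records.")]
    public string FullName
    {
        get { return null; } // never written back out as a value
        set
        {
            if (string.IsNullOrEmpty(value)) return;
            var parts = value.Split(new[] { ' ' }, 2);
            FirstName = parts[0];
            if (parts.Length > 1) LastName = parts[1];
        }
    }
}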

ASP.NET MVC: save a postback object tree to the database and retrieve it (implement a save button)

For background reference: the application is built in ASP.NET MVC 3, the back end is built with the help of Entity Framework and services, and for the front end I copy the domain objects to DTO objects. The DTO objects have the validation attributes on them.
The customer who works in this application needs to fill in some rather big forms. After a form is filled in, it will be submitted to someone else who has to evaluate the information. So after the submit, the status becomes pending until validated.
But when the customer is called away, I would like them to be able to save the form without submitting it or losing information. I realize I need to do two things to accomplish this. The first is to disable the JavaScript validation on the save button; I think that wouldn't be that hard. The second is to store the form state (the DTO and some object that represents the validation result) in the database. Then, when the form is opened afterwards, those values need to be recovered. What I want to accomplish is delayed server-side validation.
So the process will be:
Fill in form -> Push save -> disable js validation -> post object to server -> store dto + validation in database -> ...... -> load data from database and attach to form?? -> post back to the client.....
Conceptually I think this could be a way to do it. (Please tell me if you have another idea or disagree with me.)
Does anyone have a clue how to build this? Especially the save and load of the data from the database: what would I need to persist? Can I reattach it back to the contexts, etc.? One extra note: I don't use cookies/session variables, etc.
The answer to this depends on the validation mechanism on your DTOs, but if you can add invalid data to the DTO, you can serialize the DTO's object graph as whatever you like (binary, DataContract, XML, JSON, etc.). Once serialized, you could store the object graph in the database, and the next time the user logs in you can deserialize the data back into the DTOs to present to the view. I would take a look at the different serializers (DataContractSerializer is in System.Runtime.Serialization and serializes to XML) to see which fits your needs best.
// Serialize the DTO graph to XML in memory, then store the resulting string in the database.
MemoryStream ms = new MemoryStream();
System.Runtime.Serialization.DataContractSerializer serializer = new System.Runtime.Serialization.DataContractSerializer(typeof(ViewModel));
serializer.WriteObject(ms, vmInstance);
string xml = System.Text.Encoding.UTF8.GetString(ms.ToArray());
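And a corresponding sketch for reading it back later, assuming the XML was stored as a string (storedXml here is a placeholder for the value loaded from the database):
// Deserialize the stored XML back into the DTO graph when the user returns.
var bytes = System.Text.Encoding.UTF8.GetBytes(storedXml);
using (var stream = new MemoryStream(bytes))
{
    var deserializer = new System.Runtime.Serialization.DataContractSerializer(typeof(ViewModel));
    var vm = (ViewModel)deserializer.ReadObject(stream);
}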

Storing objects in the database

I am using SQL Server 2008 with NHibernate for an application. In the application I need to create multiple objects of an Info class and use them in multiple places. I also need to store those objects in the database.
There are multiple types of Info class.
To store these objects of the Info class I have two options:
Store the serialized object of the class
Store the details of that class as strings
What is the advantage of storing the serialized object in the database over storing its values as multiple strings?
-Ram
If you store the serialized object into the db:
You don't have to rebuild it from partial data (i.e. write your own builder and create objects from the partial data)
You don't need to create the object "manually"
May be faster in some cases
Stores redundant infrastructure data
You may choose multiple formats (XML, custom format, blobs)
You have fully prepared serialized objects that are ready to be processed anywhere (sent over the network, stored on disk)
If you store the multiple strings, you:
Need to build the objects "manually"
May use the database data in different scenarios (from .NET, or to build other structures such as cubes)
The data is much more compact
May store the data in a relational normalized form which is (almost) always a good practice
Query the data
And get overall more versatile usage of the data.
I would definitely go for the relational normalized form to store the strings and then build the corresponding class builder in .net.
I would definitely store records and fields, and not just a chunk of bytes (either binary, text, or XML) representing the current status of your object.
It depends of course on the complexity of your business entities (in your case the Info class), but I would really avoid saving the serialized version of it in one column.
If you explode all properties into fields you can query better for records having certain values, and you can handle new columns and releases much more easily.
The most common issue with storing an object as a serialized stream is that it is difficult to search the properties of the object once it is stored, whereas if each property of the object is explicitly stored in its own strongly typed column, it can be indexed for searches, and you get the integrity benefit of strongly typed storage.
However, at least if the object is XML-serialized into an XML column in SQL Server, you can use technologies such as XQuery and OPENXML to ease your searches.
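For illustration, a sketch of querying such an XML column from C# with ADO.NET using the column's XQuery methods (the InfoStore table, InfoXml column, and XPath expressions are hypothetical; adjust them to your schema):
using System.Collections.Generic;
using System.Data.SqlClient;

public static class InfoXmlQueries
{
    // Return the Name element of every stored Info whose Type element is "A".
    public static IEnumerable<string> NamesOfTypeA(string connectionString)
    {
        const string sql =
            @"SELECT InfoXml.value('(/Info/Name)[1]', 'nvarchar(100)')
              FROM InfoStore
              WHERE InfoXml.exist('/Info[Type=""A""]') = 1";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    if (!reader.IsDBNull(0))
                        yield return reader.GetString(0);
                }
            }
        }
    }
}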
Serialized object (XML)
If you store the class as XML, you will be able to search the content of the class using XQuery. This is an easy way to search (with or without conditions). Moreover, you can create an index over the XML column; the XML index will make your application faster.
As string
Use this if you don't have business logic that needs to look at the content of the class.
