I have an Object:
public class BindObjectToFile
{
    public int Property1 { get; set; }
    public int Property2 { get; set; }

    public BindObjectToFile(string BindingFilePath)
    {
        ...
    }
}
I have a Json File:
{
    "Property1": 1,
    "Property2": 2
}
Whenever a property on the object changes, I want the JSON file to change with it.
Whenever a property in the JSON file changes, I want the object to change with it.
I'd like all children of BindObjectToFile to easily inherit this functionality, without requiring adjustments to the getters/setters on their properties.
Essentially, I want an object that feels like it is stored in a file, not in memory.
What I've Tried:
I started by (stupidly) serializing/deserializing the entire object in every single getter/setter on the object:
internal int _property1;
public int Property1
{
    get
    {
        return JsonConvert.DeserializeObject<ObjectToFile>(File.ReadAllText(JsonFilePath))._property1;
    }
    set
    {
        _property1 = value;
        File.WriteAllText(JsonFilePath, JsonConvert.SerializeObject(this));
    }
}
Newtonsoft.Json is great, but this approach seems wrong because it forces me to rewrite all of that boilerplate for each property on every new child, without communicating to anyone that this is how my BindObjectToFile class is supposed to be used.
Next, I tried implementing INotifyPropertyChanged so the newly serialized object is only saved when a property changes. This was a little better, but it still doesn't account for situations where the file is changed by something other than this instance of BindObjectToFile.
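For reference, the INotifyPropertyChanged approach described here can be sketched roughly like this (class and member names are hypothetical, and it assumes Newtonsoft.Json). Note that children still have to route every setter through SetField, which is exactly the boilerplate problem being complained about, and deserializing back into such an object would need extra care:

```csharp
using System.ComponentModel;
using System.IO;
using System.Runtime.CompilerServices;
using Newtonsoft.Json;

// Hypothetical base class: the whole object is re-serialized to its
// backing file after every property change.
public abstract class FileBackedObject : INotifyPropertyChanged
{
    [JsonIgnore]
    public string JsonFilePath { get; private set; }

    public event PropertyChangedEventHandler PropertyChanged;

    protected FileBackedObject(string jsonFilePath)
    {
        JsonFilePath = jsonFilePath;
        // Save on every change notification.
        PropertyChanged += (s, e) =>
            File.WriteAllText(JsonFilePath, JsonConvert.SerializeObject(this));
    }

    // Children must call this from their setters -- the boilerplate remains.
    protected bool SetField<T>(ref T field, T value,
                               [CallerMemberName] string name = null)
    {
        if (Equals(field, value)) return false;
        field = value;
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(name));
        return true;
    }
}
```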
I then tried creating a generic ObjectToFile<T> which also implemented INotifyPropertyChanged. I did this in hopes that, by using a generic wrapper, I could get around the problem without inheritance, so I wouldn't need to force children to write crazy getters/setters for each of their properties:
public abstract class ObjectToFile<T> : INotifyPropertyChanged
{
    public T _value;
    public string JsonFilePath;

    public event PropertyChangedEventHandler PropertyChanged;

    public T Value
    {
        get
        {
            return JsonConvert.DeserializeObject<T>(File.ReadAllText(JsonFilePath));
        }
        set
        {
            _value = value;
            File.WriteAllText(JsonFilePath, JsonConvert.SerializeObject(_value));
        }
    }

    public ObjectToFile(T value, string path)
    {
        _value = value;
        JsonFilePath = path;
    }
}
But this still forces me to either raise or manage PropertyChanged events in the child classes when their properties change.
And I don't want to do any of that. I just want a fix-the-problem parent class so that I can happily create child classes that behave correctly without needing to think about any of this.
That seems reasonable, right?
Caveats:
I don't mind rewriting/rereading the entire file every time a property is accessed or changed.
In my context, I don't care much about speed. If someone else opens up one of these files and I have to wait a few hundred ms for my turn, that's not a problem.
All the objects that I'm working with are extremely simple, containing properties with basic data types.
All the objects that I'm working with are extremely small, and rarely have more than five or six properties.
Thanks, All!
I don't believe you can change only part of a JSON file, saving just the property value that changed. If the length of the value changed, the rest of the file would have to shift up or down, and I don't think that's possible. So you will have to rewrite the entire file every time, which really sucks if some code changes more than one property at a time. I would probably add a .Save method, or maybe even add change tracking with a .Commit when done making changes, so the file is written once per set of changes. This also helps communicate to anyone using the class that this is how the object is meant to be used.
You also shouldn't be reading and deserializing the file every time a property is referenced. I would instead try the FileSystemWatcher class (setting its Filter property) to be notified of changes to the file.
If the number of consumers watching the same file is an issue, perhaps look into using a document/nosql database instead of manually doing it yourself.
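A rough sketch of both suggestions combined (class and member names are hypothetical, and it assumes Newtonsoft.Json): one explicit Save() per batch of changes, and a FileSystemWatcher to pick up external edits instead of re-reading the file on every getter. Real code would also need to ignore Changed events caused by its own Save():

```csharp
using System.IO;
using Newtonsoft.Json;

public class BoundSettings
{
    public int Property1 { get; set; }
    public int Property2 { get; set; }

    [JsonIgnore]
    public string FilePath { get; private set; }
    private FileSystemWatcher _watcher;

    public static BoundSettings Load(string path)
    {
        var obj = JsonConvert.DeserializeObject<BoundSettings>(File.ReadAllText(path));
        obj.FilePath = path;

        // Watch just this one file; on external change, re-populate in place.
        obj._watcher = new FileSystemWatcher(
            Path.GetDirectoryName(Path.GetFullPath(path)),
            Path.GetFileName(path));
        obj._watcher.Changed += (s, e) =>
            JsonConvert.PopulateObject(File.ReadAllText(path), obj);
        obj._watcher.EnableRaisingEvents = true;
        return obj;
    }

    // One write per set of changes, instead of one write per setter.
    public void Save() => File.WriteAllText(FilePath, JsonConvert.SerializeObject(this));
}
```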
Related
I have been constantly revising my code to find the best way to handle inheritance with a single-table DB structure using Entity Framework. By single table I mean there is a column for each property of the core object, and all inherited properties are stored as a JSON object in a column called Params. This object represents the DB structure:
public class Parent
{
    public string Name { get; set; }
    public DateTime Created { get; set; }
    protected ExpandoObject Params { get; set; }
    public TypeEnum Type { get; set; }
}
I pull one Parent from the DB and I do not know its type. I need to first parse it to a Parent object and then read the Type property to figure out what it is. From there I can cast it to its correct type. I have tried so many ways to handle this, and I think I have come up with the cleanest possible way, but it results in having to put some parsing and data conversion in the getters. Here is an example:
public class Child : Parent
{
    public int ChildID
    {
        get { return Convert.ToInt32(this.Params.ChildID); }
        set { this.Params.ChildID = value; }
    }

    public SomeEnum SomeOtherType
    {
        get { return (SomeEnum)Convert.ToInt32(this.Params.SomeOtherType); }
        set { this.Params.SomeOtherType = value; }
    }
}
This to me is super clean, as it hides the complexity of the implementation while also allowing me to easily cast Child to Parent without loss of data. The issue I am seeing is that integers are being stored as long and I need to convert them to int (the value never needs to be a long). Enums get the same treatment. Is this kind of data handling acceptable in the getter of an object? The main drawback I see is that the conversion needs to be done every single time the property is called, but it isn't too expensive. If this is terrible practice, how should I be doing this?
The alternatives I have already considered:
Load/save properties using methods - I don't want to add order-of-execution concerns to an already complex setup.
Table per type - This won't work because it would double all of my DB queries: I do not know the object type at query time, therefore I cannot query the type table to obtain that data in the same query.
Completely new objects - Can't do this, as I need to keep the objects related so that Entity Framework can save an object of type Parent back to the DB.
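A small, self-contained demonstration of the long-vs-int behavior described above: dynamic members of an ExpandoObject keep whatever runtime type the deserializer gave them (Json.NET stores JSON integers as Int64), so the getter normalizes with Convert.ToInt32. The long here is assigned directly to simulate what deserialization would produce, so there is no Json.NET dependency:

```csharp
using System;
using System.Dynamic;

class Demo
{
    static void Main()
    {
        dynamic p = new ExpandoObject();
        p.ChildID = 42L;                        // what deserialization stores

        Console.WriteLine(p.ChildID.GetType()); // System.Int64
        int id = Convert.ToInt32(p.ChildID);    // the getter's conversion
        Console.WriteLine(id);                  // 42
    }
}
```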
I have a Node object with a Parent that I have bound in my XAML like so:
<Label>
    <Hyperlink>
        <TextBlock Text="{Binding Path=Node.Parent.Name}" />
    </Hyperlink>
</Label>
My ViewModel looks something like this
public class NodeViewModel
{
    public Node Node { get; set; }

    public NodeViewModel(Node model)
    {
        Node = model;
        if (model.Parent != null) { } // Check if it's null, then do nothing.
        // When the above line is commented out, my label displays nothing.
    }
}
Why is it that when my if statement is commented out, the label/textblock is blank? Am I doing something wrong? Does my object both exist and not exist until I check if it's null?
Edit:
Forgot to mention: my Node class is pretty simple, and it does implement INotifyPropertyChanged for the Name property.
2nd Edit: Added my simple Node class.
[ImplementPropertyChanged] // From Fody.PropertyChanged
public class Node
{
    public int? ParentID { get; set; }
    public Node Parent { get; set; }
    public string Name { get; set; }

    public Node(Node p = null)
    {
        Parent = p;
    }
}
From comments:
It is actually. It is proxied by Entity Framework. Could that be the issue? If so, is there a less hackish way to fix this?
If the problem really lies in the proxy, then, well, no. You'll have a hard time with proxies/wrappers, WPF, and change notifications, really.
WPF's Binding engine does not work with proxies. At all. Well, maybe unless they are really well written and complex. So, usually, never. This comes from the way the INPC interface works:
void PropertyChanged(object sender, PropertyChangedEventArgs args)
WPF tracks the sender and the Binding.Source. If the Source of a Binding is an object "Z" that is a proxy to an object "A", then any notifications originating from "A" that are forwarded by proxy "Z" to WPF's engine are discarded, because for WPF the Source is "Z" and it does not match the advertised sender "A".
I battled with this issue for quite a long time, and the only solution I found is to have the proxy translate the PropertyChanged event so that the sender "A" is substituted with "Z". This can cause some nasty memory leaks when not written properly, because it's hard to map a delegate to a new delegate and provide proper disposal and GC'ing of both of them. And usually even this is only possible if you build your own proxies with e.g. Castle or a similar library. I have not found any dynamic-proxy library that supports sender replacement out of the box.
If you know any, please let me know!
Anyway, be careful with the terminology. By proxy I mean an object "Z" that wraps and trims or extends operations on another, different, original object "A". Your case may be different. If your so-called "proxy" is a dynamic subclass of your type, and it overrides virtual methods/properties/events from your original class, then it is not the proxy I meant. In that case the 'proxy' and the original object are the same object; you just see it as class A while the actual class is Z : A, and for WPF the Source matches the sender. If I remember correctly, this is how EF often works.
Therefore, the first thing I'd do is inspect who the culprit really is. Maybe Fody, not EF? Try removing that magic ImplementPropertyChanged and implement INPC manually in ALL classes. And I mean ALL: both in Node and in NodeViewModel. If you are lazy, you can use propdp and a DependencyProperty instead. If it works, then start removing the manual INPC and replacing it with e.g. Fody's. When things start to break, gather all results and reanalyze.
Also, check what Fody does - maybe it introduces a transparent proxy/wrapper at some point?
EDIT: Considering that your code starts working when you 'touch' the Fody'ied object, I'd start blaming Fody. Maybe the NodeViewModel.Node property is currently not tracked by Fody? Have you tried marking it with ImplementPropertyChanged too? It is a pure guess, and it would mean this library has some serious problems, but it's quick and worth trying.
I think you are missing an INPC implementation on the ViewModel. The binding won't update the UI when you do Node = model. Raise a PropertyChanged event in the Node setter.
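That suggestion looks like this as a minimal sketch (the Node class here is a bare stand-in for the question's model, just to keep the example self-contained):

```csharp
using System.ComponentModel;

// Minimal stand-in for the question's model class.
public class Node
{
    public Node Parent { get; set; }
    public string Name { get; set; }
}

// ViewModel implementing INPC: raising PropertyChanged from the Node
// setter lets the binding re-read Node.Parent.Name when the model is
// (re)assigned.
public class NodeViewModel : INotifyPropertyChanged
{
    private Node _node;

    public Node Node
    {
        get { return _node; }
        set
        {
            _node = value;
            PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Node)));
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    public NodeViewModel(Node model)
    {
        Node = model;
    }
}
```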
I have various complex objects that often have collections of other complex objects. Sometimes I only want to load the collections when they're needed so I need a way to keep track of whether a collection has been loaded (null/empty doesn't necessarily mean it hasn't been loaded). To do this, these complex objects inherit from a class that maintains a collection of loaded collections. Then we just need to add a call to a function in the setter for each collection that we want to be tracked like so:
public List<ObjectA> ObjectAList
{
    get { return _objectAList; }
    set
    {
        _objectAList = value;
        PropertyLoaded("ObjectAList");
    }
}
The PropertyLoaded function updates a collection that keeps track of which collections have been loaded.
Unfortunately these objects get used in a web service, so they are (de)serialized, all setters get called, and PropertyLoaded runs when the collection actually hasn't been loaded.
Ideally I'd like to use OnSerializing/OnSerialized so the function knows whether it's being called legitimately; however, we use XmlSerializer, so this doesn't work. As much as I'd like to switch to DataContractSerializer, for various reasons I can't do that at the moment.
Is there some other way to know if serialization is happening? Alternatively, is there a better way to achieve the above without having to add extra code each time a new collection needs to be tracked?
XmlSerializer does not support serialization callbacks. You have some options, though. For example, if you want to choose whether to serialize a property called ObjectAList, you can add a method:
public bool ShouldSerializeObjectAList() { /* logic */ }
If you need to know during deserialization too, you can use:
[XmlIgnore]
public bool ObjectAListSpecified
{
    get { /* logic whether to include it in serialization */ }
    set { /* logic to apply during deserialization */ }
}
(although you might find - I can't be sure - that the set is only called for the true case)
The other option, of course, is to implement IXmlSerializable, but that should only be done as a last resort. It isn't fun.
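The ShouldSerializeXxx convention above can be seen end to end in a small example (the Container class and its property names are illustrative). XmlSerializer looks for a public bool ShouldSerialize&lt;PropertyName&gt;() method by naming convention and skips the property when it returns false:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Container
{
    public List<string> ObjectAList { get; set; } = new List<string>();

    // Consulted by XmlSerializer during serialization only:
    // an empty list is treated as "not loaded" and omitted.
    public bool ShouldSerializeObjectAList() => ObjectAList.Count > 0;
}

class Program
{
    static void Main()
    {
        var serializer = new XmlSerializer(typeof(Container));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, new Container());
            // Empty list -> ShouldSerialize returns false -> element omitted.
            Console.WriteLine(writer.ToString().Contains("ObjectAList")); // False
        }
    }
}
```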
I can't find any explanation of how this attribute is meant to be used.
MSDN indicates it marks a property editor as reusable, so it does not need to be recreated each time.
This is a performance win, especially if your editor needs to do significant work on start up which can be avoided. Unless you are actually having performance issues then I wouldn't worry about it.
Imagine you have a scenario like this:
class Complex
{
    public OtherComplex1 Property1 { get; set; }
    public OtherComplex2 Property2 { get; set; }
    public OtherComplex2 Property3 { get; set; }
    ...
    public OtherComplexN PropertyN { get; set; }
}
Each of your properties has its own type designer, which displays some properties, etc.
Say you have two different instances of the Complex class, plus an instance of some other arbitrary class.
Now, when you toggle between your objects like this - complex instance 1 -> other -> complex instance 2 - everything works fine. But if you go complex instance 1 -> complex instance 2, you'd notice that the properties are not being refreshed.
That is the default behavior of the property grid, which tries to optimize the number of data refresh operations. Unless you want to bake in a lot of logic to keep your designers updated, I'd suggest marking your complex types with the editor-reuse attribute set to false; in that case, whenever the selection changes to a different instance, the property grid will still refresh your designers.
If you don't know what it does, why do you need to use it? Do you have any code that is using it currently that you could post as an example?
It sounds like it lets you declare that a property editor for your property can be reused without being recreated. I am not particularly sure why this would be useful.
Is there a known pattern to inherit data in a hierarchical object structure? I have a hierarchical 'Item' structure which needs to inherit its 'Type' from its 'Parent' (have the same data as default). The type of a sub-item can be modified on its own, and when the type of a parent Item changes, all sub-items whose type has not been changed should get the new type of the parent.
Note that I cannot fake it like
public string Type
{
    get
    {
        if (type == null)
            return Parent != null ? Parent.Type : null;
        return type;
    }
}
because I have to fill the values in the database, and the structure is too deep to use recursion without worrying about performance.
The only way I can think of right now is:
public string Type
{
    set
    {
        type = value;
        UpdateUnchangedChildren(value);
    }
}

public int AddChild(Item item)
{
    item.Type = Type;
    return Items.Add(item);
}
Is there a better way?
Thanks.
It's a common problem, usually related to maintenance of various hierarchical settings/configurations. So, I guess a solution to it can be considered "a pattern".
Anyways, from the internal architecture perspective you have 2 major options:
normalized structure
denormalized structure
"Normalized" is the one implemented with recursion. A particular piece of data is always stored in one place; all the other places hold references to it (e.g., to the parent). The structure is easily updated, but reading from it may be a problem.
"Denormalized" means that every node stores the whole set of settings for its level, and whenever you update a node it takes some time to walk down the hierarchy and correct all the children nodes. But the read operation is instant.
And so the "denormalized" version seems to be more widely used, because the common scenario with settings is that you update them rarely but read them often, hence you need better read performance. For example, the Windows ACL security model uses the "denormalized" approach to make security checks fast. You can read how it resolves conflicts between "inherited" and explicit permissions (ACEs) by checking them in a specific order. That might be overkill for your particular system, though; you could simply have a flag indicating that a particular value was overridden or, conversely, reset to "default"...
Further details depend on your system's needs; you might want a "hybrid" architecture, where some of the fields are "normalized" and others aren't. But you seem to be on the right track.
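The "flag that a particular value was overridden" idea can be sketched minimally like this (names are illustrative, not from the question's code): each node stores its effective Type plus a flag recording whether it was set explicitly, and parent updates propagate only into children that have not been overridden:

```csharp
using System.Collections.Generic;

public class Item
{
    private string _type;
    private bool _overridden;

    public List<Item> Items { get; } = new List<Item>();

    public string Type
    {
        get { return _type; }
        // An explicit set marks this node as overridden, then pushes the
        // value down to any children still using the inherited default.
        set { _overridden = true; Propagate(value); }
    }

    private void Propagate(string value)
    {
        _type = value;
        foreach (var child in Items)
            if (!child._overridden)
                child.Propagate(value);
    }
}
```

Reads are O(1) because the effective value is stored on every node; writes cost a walk over the non-overridden subtree, which matches the "update rarely, read often" trade-off described above.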
I'm not 100% sure what you are trying to do, but you could use generics to pass the type of a parent object into a child object. Having a setter there doesn't really make sense, though: the parent object's type is set when it's instantiated, so why would you have a setter to change it?
Assuming you have something like this...
public class Child<T>
{
    public string Type
    {
        get { return typeof(T).ToString(); }
    }
}
So then, when you have a parent object of any type, you can pass it to your Child property:
public class ParentA
{
    public Child<ParentA> ChildObj { get; set; }
}

public class ParentB
{
    public Child<ParentB> ChildObj { get; set; }
}

public class ParentC
{
    public Child<ParentC> ChildObj { get; set; }
}
Calling any of those ChildObj.Type properties will return ParentA, ParentB, and ParentC respectively.
But I've a funny feeling you haven't fully explained what it is you're trying to do.
Can you post some more code examples showing a parent class and a Child/Item class?
An obvious optimization would be to cache the value obtained from the parent when reading the type. That means you will traverse each path at most once (whereas the naive solution traverses each subpath again and again for each path containing it, which means up to O(h^2) instead of O(h)). That works great if you have more reads than writes.
Consider this:
class Node
{
    string _cachedParentType = null;
    string _type;

    string Type
    {
        get { return _type ?? _cachedParentType ?? (_cachedParentType = Parent.Type); }
        set
        {
            _type = value;
            foreach (var child in Children) { child._cachedParentType = null; }
        }
    }
}
This means that with enough reads and few writes, reading becomes O(1) in the best case; at worst, a "cache miss" costs O(h), with h being the height of the tree. Updating is O(k), with k being the branching factor (because we only update one layer down!). I think this will generally be better than the UpdateUnchangedChildren solution (which I presume updates nodes recursively all the way down to the leaves), unless you're doing WAY more reads than writes.
"...the structure is too deep to use recursion and not worry about the performance."
Have you actually measured this? How many items are you dealing with, how deep is the structure, and how common is it for items to not have their own "Type" value? What are your performance goals for the application, and how does the recursive solution compare with those goals?
It is very common for people to think that recursion is slow and therefore eliminate it from consideration without ever trying it. It is NEVER a good idea to reject the simplest design for performance reasons without measuring it first. Otherwise you go off and invent a more complicated solution when the simpler one would have worked just fine.
Of course, your second solution is also using recursion, just going down the hierarchy instead of up. If the child inserts are happening at a different time and can absorb the possible performance hit, then perhaps that will be more acceptable.