I have four classes: Person, NaturalPerson (inherits from Person), GroupFamilyMember (inherits from NaturalPerson), and QuotationHolder (inherits from GroupFamilyMember).
They all share the same ID.
My problem is the following one:
There is a method that returns an existing NaturalPerson (stored in the DB) based on a document number. Then I have to create a QuotationHolder, and I want that QuotationHolder object to contain the retrieved NaturalPerson object.
The issue is that I can't cast the object like this (I know the reason):
QuotationHolder quotationHolder = (QuotationHolder) naturalPerson;
I tried creating a new QuotationHolder object and setting its values from the naturalPerson object's values using reflection.
But as I lose the reference to the retrieved object, when I want to save in cascade, NHibernate gives me the following exception:
a different object with the same identifier value was already associated with the session
I guess it's trying to save the object as a new one.
Just to consider:
The IDs are set using the HiLo algorithm.
The mappings cannot be changed, neither the classes.
The way I understand your question, this is what you are trying to do:
class A {}
class SubA : A {}
A instance = new A();
instance = magic-convert-object-to-different-type<SubA>(instance);
Changing the class (type) of an existing object cannot be done in C#. NHibernate is designed to translate between the object model and a relational storage model, and therefore has no support for this either.
There are other possible ways to model objects that need to be perceived as changing class, for instance the State design pattern. Or maybe you should reconsider whether this is really what you want at all - perhaps the additional data that the subclasses hold should live in "sibling objects" that reference back to the basic person class, as sketched below.
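For example, a minimal sketch of the sibling-object idea (class and property names are hypothetical, not taken from your mappings) - the holder-specific data lives in its own class that references the persisted person, so the retrieved instance keeps its identity in the session:

public class NaturalPerson
{
    public virtual long Id { get; set; }
    public virtual string DocumentNumber { get; set; }
}

// QuotationHolder composes the person instead of inheriting from it.
public class QuotationHolder
{
    public virtual long Id { get; set; }
    public virtual NaturalPerson Person { get; set; } // reference back to the basic person
}

// Usage: wrap the retrieved object rather than converting it.
// var holder = new QuotationHolder { Person = retrievedNaturalPerson };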
It is also possible to use plain SQL to convert the data that represents the NaturalPerson into data that represents a QuotationHolder - when asked to load the converted data, NHibernate will then know to instantiate a QuotationHolder instead.
Related
Let's say I have a system that heavily relies on a single object, for example a Person.
I have a storage service (service 1) that handles basic CRUD operations for Person and saves this person in Mongo.
But this Person object is very big and nested; it has a lot of properties, some of which are irrelevant to some of the other services.
For example, we have service 2 that gets a Person object from the storage service and renders it in the UI; it only cares about some of the properties and doesn't need the whole big, nested Person object.
And we have service 3 that gets a Person object from the storage service and needs a different subset of properties than service 2.
We are using .Net so everything is strongly typed.
The straightforward solution is to define a subset of the Person class in each of the services and have some converter that converts the Person object to the object that service needs (removing irrelevant properties). But some services need the exact Person object except for 5-10 properties, and as I said, Person is a huge nested object.
What is the best practice for this scenario? We don't want to redefine a new "mini person" for every service with its relevant properties, because that feels like huge code duplication + creating heavy dependencies between every service and the storage service.
But we are using .Net, so we have to have some strongly typed object; otherwise we won't be able to make any manipulations on the object we received from the storage service, considering we don't want to use it as plain JSON and just traverse the keys.
We thought of 2 solutions:
The first is to use the same Person object between all services. Each service will get the Person object, do any manipulation it needs, and then serialize it with a custom serializer that removes some keys from the JSON; this way, whoever gets the response gets only the relevant props (see the sketch after these two options).
The second is to add some kind of annotations to the props that say "if the request came from service 2, then json-ignore this" and just not serialize that prop in the return value from the storage service. But this makes the storage service not isolated and simple, and this way in service 2 we again can't deserialize and manipulate the object, because we don't have a strongly typed "mini person", so we have to use the JSON.
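A minimal sketch of the first option's key-stripping serializer, assuming Json.NET (Newtonsoft.Json) is the serializer in use; the excluded property names are purely illustrative:

using Newtonsoft.Json.Linq;

public static class PersonSerializer
{
    // Serialize the full Person, then drop the keys this particular service should not receive.
    public static string SerializeWithout(object person, params string[] excludedKeys)
    {
        JObject json = JObject.FromObject(person);
        foreach (string key in excludedKeys)
            json.Remove(key); // removes top-level keys only; nested paths need JToken navigation
        return json.ToString();
    }
}

// e.g. PersonSerializer.SerializeWithout(person, "MedicalHistory", "AuditTrail");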
Is there a better known solution for this situation?
And again, this is under the assumption that the Person object is huge, would require a lot of work to redefine again and again, and would create heavy dependencies.
Thanks.
If we're talking best practices, maybe this Person object shouldn't have gotten so big to begin with. You can always break nested arrays and objects into their own separate files, entities, or Mongo collections.
But as it stands, maybe you could use dynamic or IDictionary<string, object> instead of creating a Person type and mapping every single strongly typed field in that class.
using System.Dynamic;
var person_from_database = person_repository.GetById(1);
dynamic person_result = new ExpandoObject();
person_result.name = person_from_database.fullname; //declare whatever properties your services need
person_result.first_address = person_from_database.addresses[0]; //even nested arrays and objects
person_result.name = person_result.name.ToUpper(); //modify values if needed
return Ok(person_result); //return whatever, no mapping
Suppose I have multiple entities that inherit from TableEntity.
I want to use a generic class, like CacheManager, and write different T types to Azure Storage.
The link below mentions TableEntityAdapter:
https://learn.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.table.tableentityadapter-1.-ctor?view=azure-dotnet
I am looking for code examples.
Thanks, Peter
I wrote the TableEntityAdapter class, so I will try to provide a usage example. I see the question was asked a month ago, but hopefully it is still useful.
First, have a look at how it is exercised by the SDK's unit tests:
https://github.com/Azure/azure-storage-net/blob/master/Test/ClassLibraryCommon/Table/TableOperationUnitTests.cs
Search for TableEntityAdapter to find the relevant tests. Note that the unit tests of course do not write to the actual Table Storage service; they simulate what would happen when you read and write in terms of API calls, but they should give you a good idea.
Your original ShapeEntity object does not even need to implement ITableEntity interface. It also can contain complex nested properties. TableEntityAdapter class supports these scenarios.
Below is simplified sample code for TableEntityAdapter.
To write your custom .Net object to Table Storage:
// 1. ShapeEntity is the custom .Net object that we want to write to and read from Table Storage.
ShapeEntity shapeEntity = new ShapeEntity(Guid.NewGuid().ToString(), Guid.NewGuid().ToString(), "square", 4, 4);
string partitionKey = "shapes";            // choose keys that fit your partitioning scheme
string rowKey = Guid.NewGuid().ToString();
// 2. Instantiate a TableEntityAdapter, passing the custom object to its constructor. Internally this
//    instance will keep a reference to our custom ShapeEntity object.
TableEntityAdapter<ShapeEntity> writeToTableStorage = new TableEntityAdapter<ShapeEntity>(shapeEntity, partitionKey, rowKey);
// 3. Now you can write writeToTableStorage to Table Storage just like any other object that implements
//    the ITableEntity interface. The adapter handles the boilerplate hand-shaking between your custom
//    .Net object and the Table Storage SDK / service.
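For completeness, a minimal sketch of the actual write, assuming a CloudTable reference named table (not part of the original sample):

TableOperation insertOperation = TableOperation.Insert(writeToTableStorage);
table.Execute(insertOperation);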
To read your custom object from Table Storage:
// 1. Read your entity back from Table Storage using the same partition / row key you specified (see step 2 above). You can use the Retrieve<T> table operation, specifying T as TableEntityAdapter<ShapeEntity>.
// 2. This should return you the TableEntityAdapter<ShapeEntity> object that you wrote to TableStorage at step 3 above.
// 3. To access the original ShapeEntity object, just refer to the OriginalEntity property of the returned TableEntityAdapter<ShapeEntity> object.
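Those three steps as code - again a sketch, assuming the same table reference and keys used at write time:

// Retrieve returns the adapter; its OriginalEntity property exposes the wrapped ShapeEntity.
TableOperation retrieveOperation = TableOperation.Retrieve<TableEntityAdapter<ShapeEntity>>(partitionKey, rowKey);
TableResult result = table.Execute(retrieveOperation);
TableEntityAdapter<ShapeEntity> readFromTableStorage = (TableEntityAdapter<ShapeEntity>)result.Result;
ShapeEntity originalShape = readFromTableStorage.OriginalEntity;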
Note: the TableEntityAdapter API does not support objects with collection-type properties, i.e. List, Array, IEnumerable, ICollection, etc.
If your objects contain these kinds of properties (directly under the root or somewhere in their object graph), then you should consider using the original NuGet package that I wrote, which supports collection-type properties as well.
ObjectFlattenerRecomposer API version 2.0, which supports Collection/Enumerable-type properties:
https://www.nuget.org/packages/ObjectFlattenerRecomposer/
and the recently uploaded .Net Core version:
https://www.nuget.org/packages/ObjectFlattenerRecomposer.Core/1.0.0/
I load types dynamically through reflection, instantiate the classes, fill them with data and then save them to RavenDB. They all implement an interface IEntity.
In the RavenDB UI, I can see the classes correctly displayed and the meta data has the correct CLR type.
Now I want to get the data back out, change it and then save it.
What I'd like to do
I have the System.Type that matches the entity in RavenDB's CLR metadata; assuming that's called myType, I would like to do this:
session.Query(myType).ToList(); // session is IDocumentSession
But you can't do that, as RavenDB queries like so:
session.Query<T>();
I don't have the generic type T at compile time; I'm loading that dynamically.
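One hypothetical workaround for the dispatch problem - a sketch only, and it does not address the custom deserializer issue described in the update below - is to close Query<T> over the runtime type via reflection:

using System.Linq;
using System.Reflection;

// Find the parameterless generic Query<T>() definition on IDocumentSession.
MethodInfo openQuery = typeof(IDocumentSession).GetMethods()
    .First(m => m.Name == "Query"
             && m.IsGenericMethodDefinition
             && m.GetGenericArguments().Length == 1
             && m.GetParameters().Length == 0);

// Close it over the runtime type and invoke it on the open session.
// The result is an IQueryable over myType, usable only non-generically from here on.
object queryable = openQuery.MakeGenericMethod(myType).Invoke(session, null);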
Solution 1 - the Big Document
The first way I tried (for a proof of concept) was to wrap all the entities in a single data object and save that:
public class Data {
    public List<IEntity> Entities = new List<IEntity>();
}
Assuming the session is opened/closed properly and that I have an id:
var myDataObject = session.Load<Data>(Id);
myDataObject.Entities.First(); // or whatever query
As my types are all dynamic, specified in a dynamically loaded DLL, I need my custom json deserializer to perform the object creation. I specify that in the answer here.
I would rather not do this as loading the entire document in production would not be efficient.
Possible solution 2
I understand that Lucene can be used to query the type metadata and get the data out as a dynamic type. I can then do a horrible cast and make the changes.
Update after #Tung-Chau
Thank you to Tung; unfortunately, neither of the solutions works. Let me explain why:
I am storing the entities with:
session.Store(myDataObject);
In the database, that will produce a document with the name of myDataObject.GetType().Name. If you do:
var myDataObject = session.Load<IEntity>(Id);
Then it won't find the document, because it is not saved under the name IEntity; it is saved under the name of the dynamic type.
Now the Lucene solution doesn't work either, but for a slightly more complex reason. When Lucene finds the type (which it does), Raven passes it to the custom JsonDeserialiser I mentioned. The custom JsonDeserialiser does not have access to the metadata, so it does not know which type to reflect.
Is there a better way to retrieve data when you know the type but not at compile time?
Thank you.
If you have an ID (or IDs):
var myDataObject = session.Load<IEntity>(Id);
//change myDataObject. Use myDataObject.GetType() as you want
//....
session.SaveChanges();
For queries, a LuceneQuery is suitable:
var tag = documentStore.Conventions.GetTypeTagName(typeof(YourDataType));
var myDataObjects = session.Advanced
    .LuceneQuery<IEntity, RavenDocumentsByEntityName>()
    .WhereEquals("Tag", tag)
    .AndAlso()
    //....
Maybe it's been a long day, but I just can't figure this out.
I retrieve a large custom object from WCF, and store it in an application variable.
This happens every 20 minutes.
For each web user, I check the existence and timeout of this application variable and, if needed, re-query my WCF service, build a new object, and re-store it in the application variable.
This is all good and well.
Now I am trying to "make a copy" of this "master" object, store it in a session variable, and modify it as needed throughout the session life cycle (modifying only the session variable).
Every time I modify the session object, the object in the application variable gets modified too.
Pseudo
application("mastervar") = object from wcf (obejct type - xcustomclass)
dim mynewobject as new xcustomclass
mynewobject = application("mastervar")
* Modifying mynewobject, also modifies application("mastervar")
I have tried:
session("mynewSessionVar") = application("mastervar")
mynewobject = session("mynewSessionVar")
Modifying mynewobject, modifies application("mastervar")
I have tried:
Manually copying all properties of the mastervar object to a new object with a For loop:
Dim mycustomobject As New xcustomclass
Dim mycustomobjectObject As xcustomclass.object
Dim mymasterobject = application("mastervar")
For Each obj In mymasterobject.objectslist
    mycustomobjectObject = New xcustomclass.object
    With mycustomobjectObject
        .property = obj.property
    End With
    mycustomobject.objectlist.Add(mycustomobjectObject)
Next
Same thing: modifying mycustomobject also modifies application("mastervar").
As I said, maybe it's been a long day, but I've been bumping my head against this for hours...
EDIT
Private Function copy_fresh_units(unitsFromWcf As WebResortUnits) As WebResortUnits
    Dim myFreshUnits As New WebResortUnits
    Dim myFreshUnit As WebResortUnits.qbunit
    For Each Unit In unitsFromWcf.resortUnits
        myFreshUnit = New WebResortUnits.qbunit
        With myFreshUnit
            ' .Availability = Unit.Availability
            .mapDetails = Unit.mapDetails
        End With
        myFreshUnits.resortUnits.Add(Unit)
    Next
    Return myFreshUnits
End Function
Modifying the availability property in myFreshUnits still updates the app var. I have had a look at reference and value types, and that is definitely my issue. But taking this last edit into account, I know I am missing something; what it is, I am not sure... :-)
You are creating a new reference to the object, then modifying the object through that reference, and expecting your two references not to be the same object?
It sounds like what you really want to do is to clone the objects or make a deepcopy of the objects.
If these are custom objects you will need custom code to make a Clone of them.
When you clone your object, be sure that you create a new object and set all of the value types on that new object from your old object. Then go through and clone all of your reference types, and set the reference properties on your clone to point to the clones you've created (see the sketch below).
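A minimal C# sketch of that manual approach (the Unit/MapDetails names just echo the question; the real types will differ):

public class MapDetails
{
    public string Zone { get; set; }

    public MapDetails Clone()
    {
        return new MapDetails { Zone = this.Zone }; // string is immutable, safe to copy
    }
}

public class Unit
{
    public int Availability { get; set; }      // value type: copying the value is enough
    public MapDetails MapDetails { get; set; } // reference type: must be cloned itself

    public Unit Clone()
    {
        return new Unit
        {
            Availability = this.Availability,
            // Clone nested reference types; don't just copy the references.
            MapDetails = this.MapDetails == null ? null : this.MapDetails.Clone()
        };
    }
}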
EDIT: To address your update
The problem is still that everything you are copying is obviously a reference type and it's not being cloned. So there is only one object in existence; therefore, when you edit either reference, it changes that one object.
EDIT 2: Serialization will help you
Serializing and deserializing your base object in memory is an easy way to clone it. I generally write a custom clone method that serializes / deserializes the object. That way you have a Clone() method that you will always call and any custom code you need that doesn't get handled properly with serialize / deserialize you can handle in that method.
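A minimal sketch of a serialization-based clone, assuming Json.NET is available (any serializer that fully round-trips your type works):

using Newtonsoft.Json;

public static class CloneExtensions
{
    // Deep-copies an object by serializing it and deserializing the result.
    // Only data the serializer round-trips (public read/write members) survives,
    // so handle any special members by hand inside your custom Clone() method.
    public static T DeepClone<T>(this T source)
    {
        string json = JsonConvert.SerializeObject(source);
        return JsonConvert.DeserializeObject<T>(json);
    }
}

// Usage: var myFreshUnits = unitsFromWcf.DeepClone(); - the clone shares nothing with the application variable.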
My guess is that the properties you are copying are not all primitives themselves (int, float, string, etc.). When you alter the corresponding property on the new object, you are actually altering the original property (since it's not a primitive and is really an object reference).
Check out https://github.com/JesseBuesking/BB.DeepCopy, which is probably a bit of overkill, but if you read what's under "The problem this addresses" you can see alternative approaches for helping in this situation.
I am accessing a service and get returned an object in the form of (for example):
Car _car = _service.FetchCar(carId);
_car.Color
_car.Tires.Right.Front
_car.Tires.Left.Front
_car.Tires.Right.Back
_car.Tires.Left.Back
_car.Spoiler
etc., etc... you get the idea. My application is receiving many different objects with many different structures. What I'd like is to have one method that can take one type of object and map it to another...
What I don't want to do is manually map all the fields from the service object to my domain object for every object type.
For example:
If I get a Car object from the service, I'd like to map it to my own Car object, and if I get a Table object, I'd like to map it to my own Table object.
Any ideas?
Have a look at tools like AutoMapper to handle these "copy all fields from object A to object B" scenarios.
AutoMapper will automatically copy all fields with identical names from one instance to the other, and you can set up additional rules to allow copying of fields where the names don't match (and you can also define custom converters if you need to convert data types along the way).
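A minimal sketch, assuming a ServiceCar type returned by the service and your own DomainCar type (both names hypothetical), using AutoMapper's MapperConfiguration API:

using AutoMapper;

// Configure the mappings once at startup; identically named properties map automatically.
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<ServiceCar, DomainCar>()
       // Extra rule for a property whose name differs between the two types:
       .ForMember(dest => dest.Colour, opt => opt.MapFrom(src => src.Color));
});
IMapper mapper = config.CreateMapper();

// Then mapping any supported object is a one-liner:
DomainCar myCar = mapper.Map<DomainCar>(_service.FetchCar(carId));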
Very useful, very helpful!
Marc