Let's say I have a system that relies heavily on a single object, for example a Person.
I have a storage service (service 1) that handles basic CRUD operations for Person and saves it in MongoDB.
But this Person object is very big and deeply nested; it has a lot of properties, some of which are irrelevant to some of the other services.
For example, we have service 2, which gets a Person object from the storage service and renders it in the UI. It only cares about some of the properties and doesn't need the whole big, nested Person object.
And we have service 3, which gets a Person object from the storage service and needs a different subset of properties than service 2.
We are using .NET, so everything is strongly typed.
The straightforward solution is to define a subset of the Person class in each of the services and have some converter that converts the Person object to the object that service needs (removing irrelevant properties). But some services need the exact Person object apart from 5-10 properties, and as I said, Person is a huge nested object.
What is the best practice for this scenario? We don't want to redefine a new "mini person" for every service with its relevant properties, because that feels like huge code duplication, plus it creates heavy dependencies between every service and the storage service.
But since we are using .NET, we have to have some strongly typed object; otherwise we won't be able to make any manipulations on the object we received from the storage service, given that we don't want to use it as plain JSON and just traverse the keys.
We thought of 2 solutions:
The first is to use the same Person object across all services. Each service gets the Person object, does whatever manipulation it needs, and then serializes it with a custom serializer that removes some keys from the JSON; this way, whoever receives the response gets only the relevant properties.
The second is to add some kind of annotation to the properties that says "if the request came from service 2, then json-ignore this" and simply not serialize that property in the storage service's response. But this means the storage service is no longer isolated and simple, and service 2 again can't deserialize and manipulate the object because it has no strongly typed "mini person", so it has to work with the raw JSON.
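A minimal sketch of this first approach, using System.Text.Json contract customization (available in .NET 7+) to strip named keys at serialization time. The Person shape below is a small stand-in for the real object, not the actual class:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization.Metadata;

public class Person
{
    public string Name { get; set; } = "";
    public string Ssn { get; set; } = "";
    public int Age { get; set; }
}

public static class PersonSerializer
{
    // Serialize a Person, dropping the properties whose JSON names are listed.
    public static string Serialize(Person person, ISet<string> propertiesToRemove)
    {
        var options = new JsonSerializerOptions
        {
            TypeInfoResolver = new DefaultJsonTypeInfoResolver
            {
                Modifiers =
                {
                    typeInfo =>
                    {
                        if (typeInfo.Type != typeof(Person)) return;
                        // Walk backwards so RemoveAt doesn't shift pending indices.
                        for (int i = typeInfo.Properties.Count - 1; i >= 0; i--)
                            if (propertiesToRemove.Contains(typeInfo.Properties[i].Name))
                                typeInfo.Properties.RemoveAt(i);
                    }
                }
            }
        };
        return JsonSerializer.Serialize(person, options);
    }
}
```

For example, `PersonSerializer.Serialize(p, new HashSet<string> { "Ssn" })` returns JSON without the Ssn key, while every service keeps working with the one strongly typed Person. (In real code the options instance should be built once per property set and reused.)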
Is there a better known solution for this situation?
And again, this is under the assumption that the Person object is huge, requires a lot of work to redefine again and again, and would create heavy dependencies.
Thanks.
If we're talking best practices, maybe this Person object shouldn't have gotten so big to begin with. You can always break nested arrays and objects out into their own separate entities or MongoDB collections.
But as it stands, maybe you could use dynamic or IDictionary<string, object> instead of creating a Person type and mapping every single strongly typed field in that class.
using System.Dynamic;
var person_from_database = person_repository.GetById(1);
dynamic person_result = new ExpandoObject();
person_result.name = person_from_database.fullname; //declare whatever properties your services need
person_result.first_address = person_from_database.addresses[0]; //even nested arrays and objects
person_result.name = person_result.name.ToUpper(); //modify values if needed
return Ok(person_result); //return whatever, no mapping
In my system, I need my clients to be able to get a List of objects, representative of the service states/context.
Say you have the following action done by a given client to retrieve the context:
List<someObjects> aList = GetListOfObjects();
Now the client is able to retrieve aList, but it can also modify this list before sending it back through the channel, which I would like to prevent (as it would not be representative of the system anymore). Therefore, the client should not be able to do:
aList.RemoveAt(1); // Should not be possible to remove object from this list
SetListOfObjects(aList);
I thought I could create a readonly "aList" class, but I still want to be able to modify the objects' properties, so I don't think it is the right thing to do.
One idea is to actually create a class where all the objects from aList would be properties instead. That way, the class would impose a structure that could not be modified by the client. However, the system's context may vary (depending on the hardware being used), meaning this class would need to be created dynamically.
However, I do not want to lose type safety, and hence would rather not use dynamic or ExpandoObject.
Maybe using something of the sort is a good idea here, see answer from danijels. But I am not sure the type safety would be preserved by doing so.
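For the list part specifically, a ReadOnlyCollection<T> wrapper already gives exactly this split: the list structure is frozen while the element objects stay mutable, with full type safety. A minimal sketch (the StateObject type is a hypothetical stand-in for someObjects):

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;

public class StateObject
{
    public string Name { get; set; } = "";
    public int Value { get; set; }
}

public static class Context
{
    private static readonly List<StateObject> states = new List<StateObject>
    {
        new StateObject { Name = "sensor-1", Value = 0 },
        new StateObject { Name = "sensor-2", Value = 0 },
    };

    // Callers get a read-only view of the list: there is no Add/RemoveAt on
    // the returned type, but the elements themselves remain mutable.
    public static ReadOnlyCollection<StateObject> GetListOfObjects()
        => states.AsReadOnly();
}
```

With this, `GetListOfObjects()[0].Value = 42;` compiles and works, while `RemoveAt` simply isn't exposed (and casting the view to `IList<StateObject>` and calling `RemoveAt` throws `NotSupportedException` at runtime). Note this protects the structure only; it cannot stop a client from constructing and sending back an entirely different list.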
Suppose I have multiple entities that inherit from TableEntity.
I want to use a generic class, like CacheManager, and write different T types to Azure Table Storage.
The link below mentions TableEntityAdapter:
https://learn.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.table.tableentityadapter-1.-ctor?view=azure-dotnet
I am looking for code examples.
Thanks, Peter
I wrote the TableEntityAdapter class, so I will try to provide a usage example. I see the question was asked a month ago, but hopefully this is still useful.
First, have a look at how it is exercised by the SDK's unit tests:
https://github.com/Azure/azure-storage-net/blob/master/Test/ClassLibraryCommon/Table/TableOperationUnitTests.cs
Search for TableEntityAdapter to find the relevant tests. Note that the unit tests do not write to the actual Table Storage service; they simulate what would happen when you read and write in terms of API calls, but they should give you a good idea.
Your original ShapeEntity object does not even need to implement ITableEntity interface. It also can contain complex nested properties. TableEntityAdapter class supports these scenarios.
Below is a simplified sample code for TableEntityAdapter:
To write your custom .Net object to Table Storage:
// 1. ShapeEntity is the custom .Net object that we want to write and read from Table Storage.
ShapeEntity shapeEntity = new ShapeEntity(Guid.NewGuid().ToString(), Guid.NewGuid().ToString(), "square", 4, 4);
OperationContext operationContext = new OperationContext();
// 2. Instantiate a TableEntityAdapter object, passing in the custom object to its constructor. Internally this instance will keep a reference to our custom ShapeEntity object.
TableEntityAdapter<ShapeEntity> writeToTableStorage = new TableEntityAdapter<ShapeEntity>(shapeEntity, partitionKey, rowKey);
// 3. Now you can write the writeToTableStorage object to Table Storage just like any other object
//    which implements the ITableEntity interface. The TableEntityAdapter generic class handles the
//    boilerplate code and handshaking between your custom .Net object and the table storage
//    sdk / service. For example (assuming table is a CloudTable reference obtained elsewhere):
table.Execute(TableOperation.Insert(writeToTableStorage));
To read your custom object from Table Storage:
// 1. Read your entity back from table storage using the same partition / row key you specified
//    (see step 2 above), via the Retrieve<T> table operation with T = TableEntityAdapter<ShapeEntity>:
TableOperation retrieve = TableOperation.Retrieve<TableEntityAdapter<ShapeEntity>>(partitionKey, rowKey);
// 2. Executing it returns the TableEntityAdapter<ShapeEntity> object you wrote at step 3 above:
TableEntityAdapter<ShapeEntity> adapter = (TableEntityAdapter<ShapeEntity>)table.Execute(retrieve).Result;
// 3. To access the original ShapeEntity object, just refer to the adapter's OriginalEntity property:
ShapeEntity originalShape = adapter.OriginalEntity;
Note: the TableEntityAdapter API does not support objects with collection-type properties, i.e. List, Array, IEnumerable, ICollection, etc.
If your objects contain these kinds of properties (directly under the root or somewhere in their object graph), then you should consider using the original NuGet package I wrote, which supports collection-type properties as well.
ObjectFlattenerRecomposer Api version 2.0 that supports Collection, Enumerable type properties:
https://www.nuget.org/packages/ObjectFlattenerRecomposer/
and the recently uploaded .NET Core version:
https://www.nuget.org/packages/ObjectFlattenerRecomposer.Core/1.0.0/
I have four classes: Person, NaturalPerson (inherits from Person), GroupFamilyMember (inherits from NaturalPerson), and QuotationHolder (inherits from GroupFamilyMember).
They all share the same ID.
My problem is the following one:
There is a method that returns an existing NaturalPerson (stored in the DB) based on a document number. Then I have to create a QuotationHolder, and I want that QuotationHolder object to contain the retrieved NaturalPerson object.
The issue is that I can't cast the object like this (I know the reason):
QuotationHolder quotationHolder = (QuotationHolder) naturalPerson;
I tried creating a new QuotationHolder object and setting its values from the naturalPerson object's values using reflection.
But since I lose the reference to the retrieved object, when I want to save in cascade, NHibernate gives me the following exception:
a different object with the same identifier value was already associated with the session
I guess it is trying to save the object as a new one.
Just to consider:
The IDs are set using the HILO Algorithm.
The mappings cannot be changed, neither the classes.
The way I understand your question, this is what you are trying to do:
class A {}
class SubA : A {}
A instance = new A();
instance = magic-convert-object-to-different-type<SubA>(instance);
Changing the class (type) of an existing object cannot be done in C#. NHibernate is designed to translate between the object model and a relational storage model, and therefore has no support for this either.
There are other possible models for handling objects that need to be perceived as changing class, for instance the State design pattern. Or maybe you should reconsider whether this is really what you want at all; perhaps the additional data that the subclasses hold should live in "sibling objects" that reference back to the basic person class.
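A minimal sketch of the sibling-object idea; the class and property names here are hypothetical, not taken from the original mappings:

```csharp
using System;

public class NaturalPerson
{
    public virtual Guid Id { get; set; }
    public virtual string DocumentNumber { get; set; } = "";
}

// Instead of QuotationHolder sitting at the bottom of the inheritance chain,
// it references the already-persisted person. Saving the holder then doesn't
// try to re-save the person under a new identity, so the session keeps
// tracking the single entity it already knows.
public class QuotationHolder
{
    public virtual Guid Id { get; set; }
    public virtual NaturalPerson Person { get; set; } = null!;
    public virtual decimal QuotedAmount { get; set; }
}
```

Usage would be `var holder = new QuotationHolder { Person = retrievedPerson };`, which keeps the reference to the session-attached NaturalPerson intact and so avoids the "different object with the same identifier" error. (This does assume the mappings can be restructured, which the question rules out for the existing classes.)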
It is also possible to use plain SQL to convert the data that represents the NaturalPerson into data that represents a QuotationHolder; when asked to load the converted data, NHibernate will know to instantiate a QuotationHolder instead.
I am trying to figure out how to keep an object usable between client sessions in DB4O. From what I understand, once a client session is closed, the object no longer resides in any cache, and despite the fact that I have a valid UUID, I cannot call Store on it without causing a duplicate to be inserted. I searched for a way to manually re-add it to the cache, but there is no such mechanism. Re-retrieving it would force me to copy over all the values from the now useless object.
Here's the above paragraph in code:
Person person = new Person() { FirstName = "Howdoyu", LastName = "Du" };
Db4oUUID uuid;
// Store the new person in one session
using (IObjectContainer client = server.OpenClient())
{
client.Store(person);
uuid = client.Ext().GetObjectInfo(person).GetUUID();
}
// Guy changed his name, it happens
person.FirstName = "Charlie";
using (var client = server.OpenClient())
{
// TODO: MISSING SOME WAY TO RE-USE UUID HERE
client.Store(person); // will create a new person, named charlie, instead of changing Mr. Du's first name
}
The latest version of Eloquera supports these scenarios, either through an [ID] attribute or via Store(uid, object).
Any thoughts?
This functionality is indeed missing in db4o =(. That makes db4o very difficult to use in many scenarios.
You basically have to write your own reattach method by copying all attributes over. Maybe a library like AutoMapper can help, but in the end you have to do it yourself.
Another question is whether you really want to use the db4o UUIDs to identify an object. db4o UUIDs are huge and not a well-known type. I personally would prefer regular .NET GUIDs.
By the way: there's the db4o .Bind() method, which binds an object to an existing id. However, it hardly does what you really want. I guess that you want to store changes made to an object, but Bind basically replaces the object and breaks the object graph. For example, if you have a partially loaded object and then bind it, you lose references to other objects. So .Bind is not usable here.
Okay, Gamlor's response about the db4o IExtContainer.Bind() method pointed me to the solution. Please note that this solution is only valid in very specific situations where access to the DB is tightly controlled, and no external queries can retrieve object instances.
Warning: This solution is dangerous. It can fill your database with all kinds of duplicates and junk objects, because it replaces the object rather than updating its values, thereby breaking any references to it.
UPDATE: Even in tightly controlled scenarios, this can cause endless headaches (like the one I'm having now) for anything other than a flat object with value-type properties only (string, int, etc.). Unless you can design your code to retrieve, edit and save objects in a single db4o connection, I recommend not using db4o at all.
Person person = new Person() { FirstName = "Charles", LastName = "The Second" };
Db4oUUID uuid;
using (IObjectContainer client = server.OpenClient())
{
// Store the new object for the first time
client.Store(person);
// Keep the UUID for later use
uuid = client.Ext().GetObjectInfo(person).GetUUID();
}
// Guy changed his name, it happens
person.FirstName = "Lil' Charlie";
using (var client = server.OpenClient())
{
// Get a reference only (not data) to the stored object (server round trip, but lightweight)
Person inactiveReference = (Person) client.Ext().GetByUUID(uuid);
// Get the temp ID for this object within this client session
long tempID = client.Ext().GetID(inactiveReference);
// Replace the object the temp ID points to
client.Ext().Bind(person, tempID);
// Replace the stored object
client.Store(person);
}
I am accessing a service and am returned an object in the form of (for example):
Car _car = _service.FetchCar(carId)
Car.Color
Car.Tires.Right.Front
Car.Tires.Left.Front
Car.Tires.Right.Back
Car.Tires.Left.Back
Car.Spoiler
etc., etc. ... you get the idea. My application receives many different objects with many different structures. What I'd like is a single method that can take one type of object and map it to another.
What I don't want is to manually map all the fields from the service object to my domain object for every object type.
For example:
If I get a Car object from the service, I'd like to map it to my own Car object, and if I get a Table object, I'd like to map it to my own Table object.
any ideas?
Have a look at tools like AutoMapper to handle these "copy all fields from object A to object B" scenarios.
AutoMapper will automatically copy all fields with identical names from one instance to the other, and you can set up additional rules to allow copying of fields where the names don't match (and you can also define custom converters if you need to convert data types along the way).
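A minimal sketch of both behaviors; the ServiceCar/DomainCar types and the Color → Colour rename are hypothetical, and in real code the MapperConfiguration should be built once at startup and reused:

```csharp
using AutoMapper;

public class ServiceCar
{
    public string Color { get; set; } = "";
    public int Doors { get; set; }
}

public class DomainCar
{
    public string Colour { get; set; } = "";
    public int Doors { get; set; }
}

public static class CarMapping
{
    public static DomainCar ToDomain(ServiceCar source)
    {
        // Identically named properties (Doors) are copied automatically;
        // ForMember covers the one property whose name differs.
        var config = new MapperConfiguration(cfg =>
            cfg.CreateMap<ServiceCar, DomainCar>()
               .ForMember(d => d.Colour, opt => opt.MapFrom(s => s.Color)));
        return config.CreateMapper().Map<DomainCar>(source);
    }
}
```

One CreateMap call per service/domain pair replaces all the hand-written field-by-field copying.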
Very useful, very helpful!
Marc