Suppose I have multiple entities that inherit from TableEntity.
I want to use a generic class, like CacheManager, and write different T to Azure Storage.
The link below mentions TableEntityAdapter:
https://learn.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.table.tableentityadapter-1.-ctor?view=azure-dotnet
I am looking for code examples.
Thanks, Peter
I wrote the TableEntityAdapter class, so I will try to provide a usage example. I see the question was asked a month ago, but hopefully it is still useful.
First, have a look at how it is exercised by the SDK's unit tests:
https://github.com/Azure/azure-storage-net/blob/master/Test/ClassLibraryCommon/Table/TableOperationUnitTests.cs
Search for TableEntityAdapter to find the relevant tests. Note that the unit tests do not, of course, write to the actual Table Storage service; they simulate what would happen when you read and write in terms of API calls, but they should give you a good idea.
Your original ShapeEntity object does not even need to implement the ITableEntity interface. It can also contain complex nested properties. The TableEntityAdapter class supports these scenarios.
Below is simplified sample code for TableEntityAdapter.
To write your custom .Net object to Table Storage:
// 1. ShapeEntity is the custom .NET object that we want to write to and read from Table Storage.
ShapeEntity shapeEntity = new ShapeEntity(Guid.NewGuid().ToString(), Guid.NewGuid().ToString(), "square", 4, 4);
OperationContext operationContext = new OperationContext();
// 2. Instantiate a TableEntityAdapter object, passing the custom object to its constructor. Internally, this instance keeps a reference to our custom ShapeEntity object.
TableEntityAdapter<ShapeEntity> writeToTableStorage = new TableEntityAdapter<ShapeEntity>(shapeEntity, partitionKey, rowKey);
// 3. Now you can write the writeToTableStorage object to Table Storage just like any other object that implements the ITableEntity interface. The TableEntityAdapter generic class handles the boilerplate code and handshaking between your custom .NET object and the Table Storage SDK/service.
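To make step 3 concrete, here is a minimal sketch, assuming a `CloudTable` instance named `table` that you have already obtained from your storage account, and the `partitionKey`/`rowKey` values used above:

```csharp
// Insert the adapter just like any other ITableEntity.
// 'table' is assumed to be an initialized CloudTable instance.
TableOperation insertOperation = TableOperation.Insert(writeToTableStorage);
table.Execute(insertOperation);
```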
To read your custom object from Table Storage:
// 1. Read your entity back from Table Storage using the same partition/row key you specified (see step 2 above). You can use the Retrieve<T> table operation, specifying T as TableEntityAdapter<ShapeEntity>.
// 2. This returns the TableEntityAdapter<ShapeEntity> object that you wrote to Table Storage in step 3 above.
// 3. To access the original ShapeEntity object, refer to the OriginalEntity property of the returned TableEntityAdapter<ShapeEntity> object.
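Putting the read steps together, a sketch under the same assumptions (a `CloudTable` named `table`, and the same `partitionKey`/`rowKey` used when writing):

```csharp
// Retrieve the adapter-wrapped entity by its keys.
TableOperation retrieveOperation =
    TableOperation.Retrieve<TableEntityAdapter<ShapeEntity>>(partitionKey, rowKey);
TableResult result = table.Execute(retrieveOperation);

// OriginalEntity holds the recomposed custom object.
TableEntityAdapter<ShapeEntity> readFromTableStorage =
    (TableEntityAdapter<ShapeEntity>)result.Result;
ShapeEntity shape = readFromTableStorage.OriginalEntity;
```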
Note: the TableEntityAdapter API does not support objects with collection-type properties, i.e. List, Array, IEnumerable, ICollection, etc.
If your objects contain these types of properties (directly on the root or somewhere in their object graph), consider using the original NuGet package I wrote, which supports collection-type properties as well.
ObjectFlattenerRecomposer Api version 2.0 that supports Collection, Enumerable type properties:
https://www.nuget.org/packages/ObjectFlattenerRecomposer/
and the recently uploaded .NET Core version:
https://www.nuget.org/packages/ObjectFlattenerRecomposer.Core/1.0.0/
Related
Let's say I have a system that heavily relies on a single object, for example a Person.
I have a storage service (service 1) that handles basic CRUD operations for Person, and saves this person in mongo.
But this Person object is very big and nested; it has a lot of properties, some of which are irrelevant for some of the other services.
For example, we have service 2, which gets a Person object from the storage service and renders it in the UI; it only cares about some of the properties and doesn't need the whole big, nested Person object.
And we have service 3, which gets a Person object from the storage service and needs a different subset of properties than service 2.
We are using .Net so everything is strongly typed.
The straightforward solution is to define a subset of the Person class in each of the services and have some converter that converts the Person object to the object each service needs (removing irrelevant properties). But some services need the exact Person object apart from 5-10 properties, and as I said, Person is a huge nested object.
What is the best practice for this scenario? We don't want to redefine a new "mini person" for every service with its relevant properties, because that feels like huge code duplication, plus it creates heavy dependencies between every service and the storage service.
But we are using .NET, so we have to have some strongly typed object; otherwise we won't be able to make any manipulations on the object we received from the storage service, considering we don't want to use it as plain JSON and just traverse the keys.
We thought of 2 solutions:
First: use the same Person object across all services. Each service gets the Person object, does any manipulation it needs, and then serializes it with a custom serializer that removes some keys from the JSON; this way the caller gets only the relevant props.
Second: add some kind of annotation to the props that says "if the request came from service 2 then json-ignore this" and just don't serialize that prop in the return value from the storage service. But this makes the storage service not isolated and simple, and service 2 still can't deserialize and manipulate the object, because we don't have a strongly typed "mini person", so we have to work with the JSON.
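The custom-serializer idea from the first solution could be sketched with Json.NET's contract-resolver mechanism. The excluded property names below are purely illustrative, not from the question:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

// Drops a caller-supplied set of property names during serialization.
public class PropertyFilterResolver : DefaultContractResolver
{
    private readonly HashSet<string> _excluded;

    public PropertyFilterResolver(IEnumerable<string> excludedProperties)
    {
        _excluded = new HashSet<string>(excludedProperties);
    }

    protected override IList<JsonProperty> CreateProperties(
        Type type, MemberSerialization memberSerialization)
    {
        return base.CreateProperties(type, memberSerialization)
                   .Where(p => !_excluded.Contains(p.PropertyName))
                   .ToList();
    }
}

// Usage: serialize the full Person but omit properties service 2 doesn't need.
// Property names here are hypothetical examples.
var settings = new JsonSerializerSettings
{
    ContractResolver = new PropertyFilterResolver(new[] { "MedicalHistory", "TaxRecords" })
};
string json = JsonConvert.SerializeObject(person, settings);
```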
Is there a better known solution for this situation?
And again, this is under the assumption that the Person object is huge, requires a lot of work to redefine again and again, and would create heavy dependencies.
Thanks.
If we're talking best practices, maybe this Person object shouldn't have gotten so big to begin with. You can always break nested arrays and objects out into their own separate files, entities, or Mongo collections.
But as it stands, maybe you could use dynamic or IDictionary<string, object> instead of creating a Person type and mapping every single strongly typed field in that class.
using System.Dynamic;
var person_from_database = person_repository.GetById(1);
dynamic person_result = new ExpandoObject();
person_result.name = person_from_database.fullname; //declare whatever properties your services need
person_result.first_address = person_from_database.addresses[0]; //even nested arrays and objects
person_result.name = person_result.name.ToUpper(); //modify values if needed
return Ok(person_result); //return whatever, no mapping
I've looked through the source and I'm not finding anything (although I'm not great at IL), but I would like to see if there is a way to give Dapper a class instance instead of it always instantiating a new one. The reason is that we may sometimes make two separate calls to two different stored procedures: one returns some columns of an 'entity', the other returns other columns. However, instead of the second query filling in the entity we received from the first call, we get two instances of essentially the same entity. It would be much preferable for Dapper to use the existing entity instance and map the query results onto it.
Is there any way to intercept Dapper's class instantiation so as to provide it with an existing instance if needed?
Excellent question. At the moment, it allows you to indicate a particular constructor, but it always news one up:
il.Emit(OpCodes.Newobj, specializedConstructor);
What we could do is make it possible to specify either a constructor or a static factory method; I suspect this would be just a three-line change to the core materializer code, plus a few other places. Not impossible, but then it gets into questions like calling context: how does Dapper provide caller-specified context to the factory? Again, all possible (protobuf-net does pretty much the same thing).
But none of that exists today. It wouldn't be impossible.
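As a present-day workaround (not a Dapper feature), you can read the second result as a loose column bag and copy the values onto the existing instance yourself. The entity, procedure, and column names below are illustrative assumptions:

```csharp
using System.Collections.Generic;
using System.Data;
using Dapper;

// Illustrative entity; names are assumptions, not from the question.
public class MyEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string ExtraColumnA { get; set; }
}

// 'connection' is assumed to be an open IDbConnection.
// First call materializes the entity...
var entity = connection.QuerySingle<MyEntity>(
    "usp_GetEntityCore", commandType: CommandType.StoredProcedure);

// ...second call is read as a dictionary of columns instead of a new MyEntity.
IDictionary<string, object> extra = connection.QuerySingle(
    "usp_GetEntityExtra", commandType: CommandType.StoredProcedure);

// Copy the additional columns onto the existing instance by hand
// (or via reflection if you want a generic helper).
entity.ExtraColumnA = (string)extra["ExtraColumnA"];
```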
I'm using v2.0 of the API via the C# dll. But this problem also happens when I pass a Query String to the v2.0 API via https://rally1.rallydev.com/slm/doc/webservice/
I'm querying at the Artifact level because I need both Defects and Stories. I tried to see what kind of query string the Rally front end uses; it passes both custom fields and built-in fields to the artifact query. I am doing the same thing, but haven't had any luck getting it to work.
I need to be able to filter the released items out of my query. Furthermore, I also need to sort by the custom c_ReleaseType field as well as the built-in DragAndDropRank field. I'm guessing this is a problem because those built-in fields are not actually on the Artifact object, but then why do the custom fields work? They're not on the Artifact object either. It might just be a limitation hidden in the API that I can't guess at. If I can query these objects by custom fields, I would expect to be able to query them by built-in fields as well, even if those fields don't exist on the ancestor object.
For the sake of the example, I am leaving out most of the setup code and only including the code that causes the issues.
var request = new Request("Artifact");
request.Order = "DragAndDropRank";
//"Could not read: could not read all instances of class com.f4tech.slm.domain.Artifact"
When I comment out the Order by DragAndDropRank line, it works.
var request = new Request("Artifact");
request.Query = (new Query("c_SomeCustomField", Query.Operator.Equals, "somevalue").
And(new Query("Release", Query.Operator.Equals, "null")));
//"Could not read: could not read all instances of class com.f4tech.slm.domain.Artifact"
When I take the Release part out of the query, it works.
var request = new Request("Artifact");
request.Query = (((new Query("TypeDefOid", Query.Operator.Equals, "someID").
And(new Query("c_SomeCustomField", Query.Operator.Equals, "somevalue"))).
And(new Query("DirectChildrenCount", Query.Operator.Equals, "0"))));
//"Could not read: could not read all instances of class com.f4tech.slm.domain.Artifact"
When I take the DirectChildrenCount part out of the query, it works.
Here's an example of the problem demonstrated by an API call.
https://rally1.rallydev.com/slm/webservice/v2.0/artifact?query=(c_KanbanState%20%3D%20%22Backlog%22)&order=DragAndDropRank&start=1&pagesize=20
When I remove the Order by DragAndDropRank querystring, it works.
I think most of your trouble is due to the fact that in order to use the Artifact endpoint you need to specify a types parameter so it knows which Artifact subclasses to include.
Simply adding that to your example WSAPI query above causes it to return successfully:
https://rally1.rallydev.com/slm/webservice/v2.0/artifact?query=(c_KanbanState = "Backlog")&order=DragAndDropRank&start=1&pagesize=20&types=hierarchicalrequirement,defect
However, I'm not totally sure whether the C# API allows you to encode additional custom parameters onto the request.
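If your version of the C# toolkit exposes a Parameters dictionary on the Request object (worth verifying against your Rally.RestApi version), the types parameter might be attachable like this sketch, assuming an authenticated RallyRestApi instance named `restApi`:

```csharp
var request = new Request("Artifact");
request.Order = "DragAndDropRank";

// Assumption: Request exposes a string-to-string Parameters dictionary
// for extra query-string parameters; check your SDK version.
request.Parameters["types"] = "hierarchicalrequirement,defect";

QueryResult result = restApi.Query(request);
```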
Your question already contains the answer.
UserStory (HierarchicalRequirement in the WS API) and Defect inherit some of their fields from Artifact, e.g. FormattedID, Name, Description, LastUpdateDate, etc. You may use those fields in the context of the Artifact type.
The fields that you are trying to access do not exist on the Artifact object. They exist at the child level, e.g. DragAndDropRank, Release, Iteration. It is not possible to use those fields in the context of the Artifact type.
Parent objects don't have access to attributes specific to child object.
Artifact is an abstract type.
If you need to filter by Release, you need to make two separate requests: one for stories, the other for defects.
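A sketch of the two-request approach, assuming an authenticated RallyRestApi instance named `restApi` (the query values are illustrative):

```csharp
// Query stories and defects separately so child-level fields
// like Release and DragAndDropRank are available.
var storyRequest = new Request("HierarchicalRequirement");
storyRequest.Order = "DragAndDropRank";
storyRequest.Query = new Query("Release", Query.Operator.Equals, "null");
QueryResult storyResult = restApi.Query(storyRequest);

var defectRequest = new Request("Defect");
defectRequest.Order = "DragAndDropRank";
defectRequest.Query = new Query("Release", Query.Operator.Equals, "null");
QueryResult defectResult = restApi.Query(defectRequest);
```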
I have four classes. Person, NaturalPerson (Inherits from Person), GroupFamilyMember(Inherits from NaturalPerson), QuotationHolder(Inherits from GroupFamilyMember).
They all share the same ID.
My problem is the following one:
There is a method that returns an existing NaturalPerson(stored in DB) object based on a document number. Then, I have to create a QuotationHolder, and I want that QuotationHolder object to contain the retrieved NaturalPerson object.
The issue is that I can't cast the object like this (I know the reason):
QuotationHolder quotationHolder = (QuotationHolder) naturalPerson;
I tried creating a new QuotationHolder object and setting its values from the naturalPerson object's values using reflection.
But as I lose the reference to the retrieved object, when I want to save in cascade, NHibernate gives me the following exception:
a different object with the same identifier value was already associated with the session
I guess it's trying to save the object as a new one.
Just to consider:
The IDs are set using the HILO Algorithm.
The mappings cannot be changed, neither the classes.
The way I understand your question, this is what you are trying to do:
class A {}
class SubA : A {}
A instance = new A();
instance = magic-convert-object-to-different-type<SubA>(instance);
Changing the class (type) of an existing object cannot be done in C#. NHibernate is designed to translate between the object model and a relational storage model, and therefore has no support for this either.
There are other possible models for handling objects that need to be perceived as changing class, for instance the State design pattern. Or maybe you should reconsider whether this is really what you want at all; perhaps the additional data that the subclasses hold should live in "sibling objects" that reference back to the basic person class.
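A minimal sketch of the sibling-object idea (the class shape and property names are illustrative, not taken from the question's mappings):

```csharp
// Instead of QuotationHolder inheriting from NaturalPerson, a sibling
// object references the already-persisted person. The session then keeps
// a single instance per identifier, so cascading saves stay valid.
public class QuotationHolderInfo
{
    public virtual int Id { get; set; }
    public virtual NaturalPerson Person { get; set; }
    // ...quotation-holder-specific properties...
}

// Usage: wrap the retrieved person rather than converting its type.
var holder = new QuotationHolderInfo { Person = retrievedNaturalPerson };
session.Save(holder);
```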
It is also possible to use plain SQL to convert the data that represents the NaturalPerson into data that represents a QuotationHolder; when asked to load the converted data, NHibernate will then know to instantiate a QuotationHolder instead.
This is related to a post I made yesterday but haven't been able to resolve: ASP.Net Web API showing correctly in VS but giving HTTP 500.
I think I need to simplify what I'm trying to do, and work up from there.
Can anyone please point me to an example of using the asp.net Web API (I'm using VS 2012 Express RC), to return JSON from a parent/child model?
eg: (pseudo Json):
Parent: Mark
..Child: Tom
..Child: Adam
..Child: Becki
Parent: Terry
..Child: Sophie
..Child: Robert
I can get it to return data from one table, but not from a linked table.
Thanks for any help,
Mark
There are two ways to go about this.
Create a new template class, loop through the list fetched using EF, and assign the values to the properties defined in the template class. This gives you accurate results from one or multiple tables. Finally, return the list to the JSON call.
While fetching the list from EF, create a new anonymous type and select your desired columns. For this, your web method would have IEnumerable as its return type.
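A sketch of the second option against a hypothetical parent/child EF model (the context, entity, and property names are assumptions, not from the question):

```csharp
using System.Collections.Generic;
using System.Linq;

// Project parents with their children into an anonymous shape;
// Web API serializes it into the nested JSON form shown in the question.
public IEnumerable<object> GetParents()
{
    using (var db = new FamilyContext())  // hypothetical DbContext
    {
        return db.Parents
                 .Select(p => new
                 {
                     Parent = p.Name,
                     Children = p.Children.Select(c => c.Name).ToList()
                 })
                 .ToList();
    }
}
```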
Cheers!
After looking at your original post, my guess is that you have circular references in your objects. Using Json.NET will give you more control over what is being returned to the client.
Your other option is to remove the foreign key reference tblCustomerBooking from the tblRental object (see below).
This may allow you to return the JSON objects and confirm that circular references are the issue.
[ForeignKey("customer_id")]
public virtual tblCustomerBooking tblCustomerBooking { get; set; }
I do suggest using Json.NET if you're planning on returning your domain (i.e. entity) objects, as this will avoid all circular references and allow you to keep your two-way object relationships.
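In ASP.NET Web API, circular references can be handled globally through Json.NET's serializer settings. This is standard Web API configuration rather than anything specific to this question:

```csharp
using Newtonsoft.Json;
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Ignore reference loops instead of throwing during serialization.
        config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
            ReferenceLoopHandling.Ignore;
    }
}
```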
My personal preference is to use DTOs and map your domain objects to them, giving you more control over what the client sees (seeing the 'tbl' prefix in an object name isn't good practice).
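The DTO approach could look like this sketch. The DTO shape and the column names on the entities are illustrative assumptions:

```csharp
// A client-facing shape with no 'tbl' prefixes or back-references.
public class RentalDto
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

public static class RentalMapper
{
    // Manual mapping keeps the domain object out of the response entirely,
    // so circular references never reach the serializer.
    public static RentalDto ToDto(tblRental rental)
    {
        return new RentalDto
        {
            Id = rental.rental_id,                              // assumed column name
            CustomerName = rental.tblCustomerBooking.customer_name  // assumed column name
        };
    }
}
```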