I have already set up DocumentDB and uploaded my documents to it.
My JSON documents have data nested up to 4 levels, e.g.:
{
  "id": "12345",
  "properties": {
    "Accessories": {
      "watch": 1,
      "WristBands": [{
        "blue": 1,
        "red": 2
      }]
    }
  },
  "Name": "Leo",
  "Age": 24
}
I want to use the Azure Search service to be able to search down to the last level (e.g. the count for a wristband colour, i.e. blue: 1). I have set up the service as well.
I would like some help creating the index and the indexer for such nested data in C#, so that I can query the service.
I have already found out how to use DataType.Collection(DataType.String), but this only supports nesting up to level 2.
Classes could also be made for each sub-level, but I have no idea how to define them during indexing. I referred to this example: https://github.com/Azure-Samples/search-dotnet-getting-started .
There is also a tutorial on setting up an Azure Search indexer for DocumentDB: https://azure.microsoft.com/en-us/documentation/articles/documentdb-search-indexer/.
+1 on Kirk Evans' blog post, which shows how to flatten the JSON document. The idea is to leverage the optional query property under container in the data source creation request, and to use a join query there.
Unfortunately, Azure Search does not support nested documents. As you can see from our UserVoice page, this is by far the most requested feature, so it is something that we are very interested in adding (please cast your vote there if you don't mind). Unfortunately, we do not yet have a timeline for when it will be implemented.
In the meantime, for some applications it is realistic to either flatten the JSON documents or to leverage collection types. For more details on flattening documents, Kirk Evans has a really good blog post on the topic.
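As a rough sketch of that flattening approach (the data source name, connection string, and collection name below are placeholders, and the exact query shape is an assumption based on the document in the question), the data source creation request can carry a DocumentDB join query that projects the nested wristband counts onto top-level fields:

```json
{
  "name": "docdb-datasource",
  "type": "documentdb",
  "credentials": {
    "connectionString": "AccountEndpoint=<endpoint>;AccountKey=<key>;Database=<database>"
  },
  "container": {
    "name": "<collection>",
    "query": "SELECT c.id, c.Name, c.Age, c.properties.Accessories.watch AS watch, w.blue AS blue, w.red AS red FROM c JOIN w IN c.properties.Accessories.WristBands"
  }
}
```

Each flattened field (watch, blue, red) can then be declared as an ordinary top-level field in the index definition, since the indexer only ever sees the projected shape.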
I have just recently been able to play with the Google Spreadsheet API.
I intend to use it as the database for my C# apps, so people can use either the web (via Docs) or the desktop (via C#).
There is one problem I'm afraid of: will there be a conflict if two people are inserting and updating at the same time?
Scenario:
There are 2 users:
A will be in charge of adding new rows.
B will be in charge of reviewing A's work and adding comments.
Both A and B will be working at the same time.
Conflict I'm afraid of:
When B uses the app to update a comment, the app will fetch the data and hold a row range, let's say A100:H100. Then A proceeds to add data.
I'm afraid that if the data is added above A100:H100, then when the app submits B's changes, they will not be placed in the correct row.
Is there anyway to avoid this?
Yes, you can add and update at the same time by using the Google Sheets API. batchUpdate would be the best method for this: it takes one or more request objects, each one specifying a single kind of request to perform.
When updating a spreadsheet, some kinds of requests may return responses. These are returned in an array, with each response occupying the same index as the corresponding request. Some requests do not have responses. For those requests, the response will be empty.
Typically the "Add" requests will have responses, so that you know information (such as the ID) of the newly added object. See Response for the list of supported responses.
Sample update response:
{
  // Union field kind can be only one of the following:
  "addNamedRange": {
    object(AddNamedRangeResponse)
  },
  "addSheet": {
    object(AddSheetResponse)
  },
  "addFilterView": {
    object(AddFilterViewResponse)
  }
  // ...
}
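In C#, a batchUpdate call could look roughly like this sketch, using the Google.Apis.Sheets.v4 client library (the authenticated service and the spreadsheet id are assumptions). Appending A's row with an AppendCells request also sidesteps the row-range conflict you describe, since the server decides where the row lands instead of the client holding a fixed range:

```csharp
// sketch: perform A's insert as a single atomic batchUpdate call,
// assuming an authenticated SheetsService named "service"
var requests = new List<Request>
{
    new Request
    {
        AppendCells = new AppendCellsRequest
        {
            SheetId = 0,                  // first sheet in the spreadsheet
            Fields = "userEnteredValue",  // which CellData fields to write
            Rows = new List<RowData>
            {
                new RowData
                {
                    Values = new List<CellData>
                    {
                        new CellData
                        {
                            UserEnteredValue = new ExtendedValue { StringValue = "new entry" }
                        }
                    }
                }
            }
        }
    }
};

var body = new BatchUpdateSpreadsheetRequest { Requests = requests };
var response = service.Spreadsheets.BatchUpdate(body, "your-spreadsheet-id").Execute();
// response.Replies lines up index-for-index with the requests above
```

B's comment updates could be sent the same way in their own batchUpdate; each batch is applied atomically on the server.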
My problem is the following: I have a list of items and want to index them with Elasticsearch. I have a running Elasticsearch instance, and this instance has an index called "default".
So I'm running following code:
var items = GetAListOfItem();
var response = Client.IndexMany(items);
I also tried it with Client.IndexManyAsync(items), but that didn't do anything.
Only 1 item of the list gets indexed, nothing more. I think it is the last item that gets indexed.
I thought it could be an issue with IEnumerable and multiple enumeration, but I passed it as a List<Item>.
Another question would be about best practice with Elasticsearch: is it common to use one index per model? So if I'm gathering data from, for example, Exchange and another system, would I create 2 indices?
ExchangeIndex
OtherSystemIndex
Thank you for your help.
Update: I saw that my Client.Index call succeeds for every item, but all those objects get the same ID from NEST. Normally it should generate a new one for each document, shouldn't it?
Update 2: I fixed the indexing problem. I had set up an empty ID field.
But I still have the question about best practice with Elasticsearch.
If you upload all the data with the same id, it will not increment the id; it will update the record with that id, and you will end up with only one record. So you can upload the data without an id, or give each record a unique id so it can be identified.
The other common problem is that your records do not match the mapping you defined for the index.
About the other question: in an index you store the information that is relevant to you, even if it contains content from many models. The only thing you have to avoid is mixing unrelated information; if you have an index for server logs, don't mix it with user activities, for example.
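To illustrate the id point, here is a minimal sketch (the Item POCO and its fields are hypothetical; NEST infers the document id from a property named Id, so an empty or duplicate Id makes every bulk entry overwrite the same document):

```csharp
// hypothetical POCO; NEST will use the Id property as the document _id
public class Item
{
    public string Id { get; set; }
    public string Name { get; set; }
}

var items = GetAListOfItem();

// make sure each item carries a unique id before bulk indexing
foreach (var item in items)
{
    if (string.IsNullOrEmpty(item.Id))
        item.Id = Guid.NewGuid().ToString();
}

var response = Client.IndexMany(items, "default");

// don't fail silently: inspect per-item errors in the bulk response
foreach (var itemWithError in response.ItemsWithErrors)
{
    Console.WriteLine("Failed to index {0}: {1}", itemWithError.Id, itemWithError.Error);
}
```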
Really struggling with this and not finding much helpful stuff on Google.
I set up a sitemap, and I want some breadcrumbs such that url:
/CatManagement/Cats/38
displays breadcrumbs
Cat Management > Cats > Mr. Fuzzy Wuzzy
I don't quite understand what the sitemap node structure would be for this as the 38 is sort of a parameter of Cats.
In the dynamicNodeProvider I created I can probably grab the ID somehow and do a quick lookup to get the name, but I am not sure how to bring it all together.
Any ideas?
Have a look at Routing Basics in the MvcSiteMapProvider wiki. You just need to ensure that your parameter (38) is preserved from the current request, the node matching logic already takes into consideration action method parameters. That example shows how you would do that using a custom dynamic node provider, but I recommend reading the entire document as understanding it is key to making MvcSiteMapProvider work.
I have also created a more in depth look at the problem here with working demos for download: http://www.shiningtreasures.com/post/2013/09/02/how-to-make-mvcsitemapprovider-remember-a-user-position
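For completeness, here is a sketch of what the dynamic node provider could look like (CatRepository and its GetAll() method are assumptions standing in for your own lookup):

```csharp
// one dynamic node per cat, attached under the static "Cats" node;
// the "id" route value is what lets /CatManagement/Cats/38 match this node
public class CatDetailsDynamicNodeProvider : DynamicNodeProviderBase
{
    public override IEnumerable<DynamicNode> GetDynamicNodeCollection(ISiteMapNode node)
    {
        foreach (var cat in CatRepository.GetAll())
        {
            var dynamicNode = new DynamicNode
            {
                Title = cat.Name,  // becomes the breadcrumb text, e.g. "Mr. Fuzzy Wuzzy"
                ParentKey = "Cats" // key of the static Cats node in the sitemap
            };
            dynamicNode.RouteValues.Add("id", cat.Id);
            yield return dynamicNode;
        }
    }
}
```

The static Cats node in the sitemap XML would then carry a key of "Cats" and a dynamicNodeProvider attribute pointing at this class.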
I'm using Ektron CMS version 8.5 SP2.
I have some items in a taxonomy. Some are actual pages, some are library items (documents like Word files and PDFs).
Let's say there are 3 pages and 2 library items for a total of 5 items in my taxonomy.
I use the following code...
ContentManager cManager = new Ektron.Cms.Framework.Content.ContentManager();
Ektron.Cms.Content.ContentTaxonomyCriteria ctCriteria = new Ektron.Cms.Content.ContentTaxonomyCriteria();
ctCriteria.AddFilter(1707, true); // hard coded taxonomy ID
List<ContentData> list = cManager.GetList(ctCriteria);
Label1.Text = list.Count.ToString();
When this code runs, the count of items in the list is 3. If I output the actual list, I can see it contains only the pages in the taxonomy, not the 2 library items.
It seems that the ContentManager.GetList() method does not return library items, even when those items have been added to the taxonomy. I can confirm that in the admin workarea, the library items are visible in the taxonomy.
For clarification, this is a problem with retrieving items that have already been added to the taxonomy.
Does anyone know how I can retrieve a list of all items in a taxonomy, including any library items in there?
Note: If I add the files to the Document Management System instead of the library, it works perfectly. But in the live system, I have hundreds of items in the library, and I'm hoping there's a way to view them via a taxonomy without having to move them all into the DMS.
I have posted this question on the Ektron developers forum as well, but I've had no reply. I'm hoping somebody here can help.
Cheers.
A follow-up to my comment from the other day on @nedlud's answer; I felt this deserved its own answer, though.
According to the Framework API docs:
If intent is to retrieve CMS items that have been categorized in Taxonomies, use TaxonomyItemManager.
But as already noted in the comments, the TaxonomyItemData objects returned by this API have a number of empty properties, such as QuickLink and Html. I've found that using the TaxonomyManager, one can successfully query for items assigned to particular taxonomy categories.
Here's a brief snippet using the Framework API (version >= 8.5). This feels reminiscent of working with the older (version <= 8.0) taxonomy API, wherein one would create a TaxonomyRequest and get back an object structure that encapsulated not only the taxonomy itself, but also the items categorized into it:
//e.g. for a single-level taxonomy
long taxRoot = 1707; //from OP's question
TaxonomyManager taxManager = new TaxonomyManager();
//GetTree overload supplying includeItems parameter
TaxonomyData taxTree = taxManager.GetTree(taxRoot, includeItems: true);
foreach(TaxonomyItemData taxItem in taxTree.TaxonomyItems)
{
//these should print true
Response.Write(!String.IsNullOrEmpty(taxItem.QuickLink));
Response.Write(!String.IsNullOrEmpty(taxItem.Html));
}
I'm currently refactoring some version 8.0 code into version 8.6 and converting to the Framework API. Until Ektron fixes the (bug?) of TaxonomyItemManager returning TaxonomyItemData with null properties, I'll be using the above method plus LINQ for the sorting/filtering/etc.
I would look at the TaxonomyItemManager rather than the ContentManager.
Thanks to @maddoxej's suggestion of using the TaxonomyItemManager, I now have working solution code...
TaxonomyItemCriteria criteria = new TaxonomyItemCriteria();
criteria.AddFilter(TaxonomyItemProperty.TaxonomyId, CriteriaFilterOperator.EqualTo, 1707);
TaxonomyItemManager taxonomyItemManager = new TaxonomyItemManager();
List<TaxonomyItemData> taxonomyItemList = taxonomyItemManager.GetList(criteria);
Label1.Text = taxonomyItemList.Count.ToString();
This code now shows the expected count of 5, and I can display all the items :)
So many "manager" classes in Ektron.
I'm trying to get grouping and paging (in a datagrid) to work simultaneously in RIA Services. I already have a pretty elaborate UserControl that is based on the excellent DomainCollectionView. However, I've had trouble making the grouping work.
I added this line to the sample:
this.CollectionView.GroupDescriptions.Add(new PropertyGroupDescription("Int32"));
Note about the sample: I changed how Int32 is assigned, to key % 2, so there should be two resulting groups with hundreds of items per group. Paging is set to 10 items. No grouping is applied at the query level.
So in this case, I would expect the group header to show the total number of items; however, it just shows 'Group 0' as having an item count of 10, which is clearly incorrect. It is only counting the items on the first page.
Question
Did anyone make grouping and paging work together, with or without DomainCollectionView?
For proper context refer to the article regarding DomainCollectionView and the sample posted for it.
Kyle McClellan of Microsoft replied to a personal email regarding this question and provided the detailed explanation below.
In summary, grouping and paging won't work together well - you certainly won't get a fully featured grouping experience in the DataGrid.
For me the solution will be to remove paging when grouping is present.
On Wed, Jun 6, 2012 at 7:51 PM, Kyle McClellan
wrote: Ah, now I understand what you were expecting. There are two
(competing?) things at play here. First, the client technology knows
nothing about the server. It only sees data that exists locally.
Because of this, the controls, etc. will report that there’s only a
single group and it only contains a page’s worth of data. Second, the
server technology can see all the data but has only been asked to
return a single page. It sorts and slices the data appropriately and
then returns it. It could determine the number of groups and the size
of each, but there’s no way to communicate it back to the client.
What you’re seeing is the view functioning as designed. It will show
you all the items in group 0 before all the items in group 1. At some
point in the middle you will see two groups on a page, but otherwise
the results will all be in the same group.
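Given that explanation, the compromise mentioned above (removing paging whenever grouping is present) can be as simple as this sketch, assuming the DomainCollectionView from the linked sample is exposed as this.CollectionView:

```csharp
// paging and grouping don't compose across pages, so turn paging off while grouped
if (this.CollectionView.GroupDescriptions.Count > 0)
{
    this.CollectionView.PageSize = 0;  // 0 disables paging; groups then show full counts
}
else
{
    this.CollectionView.PageSize = 10; // restore the page size when not grouping
}
```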