Is it possible (preferably using the C# Builders) to add a new item to a deeply nested array, i.e. an array within an array within an array?
My data model looks something like:
public class Company
{
    public string Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Department> Departments { get; set; }
}

public class Department
{
    public string Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Manager> Managers { get; set; }
}

public class Manager
{
    public string Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Employee> Employees { get; set; }
}

public class Employee
{
    public string Id { get; set; }
    public string Name { get; set; }
}
Which translates to:
{
    "Id": 12345,
    "Name": "Company Ltd",
    "Departments": [
        {
            "Id": 1,
            "Name": "Development",
            "Managers": [
                {
                    "Id": 5555,
                    "Name": "The Boss",
                    "Employees": [
                        {
                            "Id": 123,
                            "Name": "Developer 1"
                        },
                        {
                            "Id": 124,
                            "Name": "Developer 2"
                        }
                    ]
                }
            ]
        }
    ]
}
If I wanted to add another employee under a specific manager how would I go about doing that?
To push to a nested array, you must use the positional operator $ to specify the matching outer array element to apply the operation to. For example:
db.collection.update(
    { "my_array._id": myTargetId },
    { $push: { "my_array.$.my_inner_array": myArrayElem } }
);
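With the C# Builders, the same single-level push looks like the following (a minimal sketch against the Company model from the question; the collection variable is assumed):

// Push a new Manager into the Department whose Id matches.
var filter = Builders<Company>.Filter.Eq("Departments.Id", "1");
var update = Builders<Company>.Update.Push("Departments.$.Managers",
    new Manager { Id = "5555", Name = "The Boss" });
await collection.UpdateOneAsync(filter, update);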
This breaks down, however, for traversing nested arrays: you can only use the positional operator on a single array, not on any nested ones. This is a known limitation, as noted in the MongoDB documentation.
If you absolutely need to perform these kinds of nested array operations, then you have a couple of options available to you:
The first, and preferred, option is to update your document structure and avoid nesting arrays more than one level deep. This avoids the issue altogether, but requires migrating any existing data to the new structure, plus additional effort to assemble the data into the shape you need on retrieval. Separate client and server representations of your data will end up being required.
The second is to perform a series of less reliable steps:
1. Retrieve the original document.
2. Manually locate the index of each array element along the path to your target.
3. Attempt an update using that specific index chain, and match on the index chain in the filter as well.
4. Check the result of the update attempt; if it failed, it's possible the document was changed while the indexes were being calculated.
For example, if you wanted to update manager with ID 5555 to have the additional employee, you'd perform the following query after retrieving the indexes:
// Index chain found to be Departments.0 and Managers.0
db.collection.update(
    {
        "Id": 12345,
        "Departments.0.Managers.0.Id": 5555 // Specify index chain 0,0 and ensure that our target still has Id 5555.
    },
    { $push: {
        "Departments.0.Managers.0.Employees": myNewEmployee // Push to index chain 0,0
    }}
);
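A rough C# equivalent of that index-chain update might look like the following (a sketch; collection and myNewEmployee are assumed to exist):

// Index chain 0,0 was found by inspecting the previously retrieved document.
var filter = Builders<Company>.Filter.And(
    Builders<Company>.Filter.Eq(c => c.Id, "12345"),
    // Guard: the target must still sit at this index chain and have Id 5555.
    Builders<Company>.Filter.Eq("Departments.0.Managers.0.Id", "5555"));
var update = Builders<Company>.Update.Push("Departments.0.Managers.0.Employees", myNewEmployee);

var result = await collection.UpdateOneAsync(filter, update);
if (result.ModifiedCount == 0)
{
    // The document likely changed while the indexes were being calculated:
    // re-read it, recompute the indexes, and retry.
}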
Use a filtered positional operator ($[identifier]) for each of the arrays except the one you want to push to.
Use array filters in the update options to specify the department and manager Ids. The identifier used in each array filter must match the identifier used in the corresponding positional operator in the update definition, so "d.Id" -> "Departments.$[d]".
If you want to match on more than one property you can use a dictionary in the array filter.
private IMongoCollection<Company> _collection;

public async Task AddEmployee()
{
    var filter = Builders<Company>.Filter.Where(c => c.Id == "companyId");
    var update = Builders<Company>.Update
        .Push("Departments.$[d].Managers.$[m].Employees",
            new Employee { Id = "employeeId", Name = "employeeName" });
    var updateOptions = new UpdateOptions
    {
        ArrayFilters = new List<ArrayFilterDefinition>
        {
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("d.Id", "departmentId")),
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("m.Id", "managerId")),
        }
    };

    await _collection.UpdateOneAsync(filter, update, updateOptions);
}
The drawback here is that you need to use strings in the update definition and array filters; I'm not sure how to manage it without strings.
Removing an employee from the array is similar, but you have to specify an extra filter for the employee you want to remove.
public async Task FireEmployee()
{
    var filter = Builders<Company>.Filter.Where(c => c.Id == "companyId");
    var employeeFilter = Builders<Employee>.Filter.Where(e => e.Id == "employeeId");
    var update = Builders<Company>.Update
        .PullFilter("Departments.$[d].Managers.$[m].Employees", employeeFilter);
    var updateOptions = new UpdateOptions
    {
        ArrayFilters = new List<ArrayFilterDefinition>
        {
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("d.Id", "departmentId")),
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("m.Id", "managerId")),
        }
    };

    await _collection.UpdateOneAsync(filter, update, updateOptions);
}
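As mentioned above, to match on more than one property you can put several elements in the array filter's document. For example, a filter that could replace the "d.Id" filter above (the Name value is just an example):

var departmentFilter = new BsonDocumentArrayFilterDefinition<BsonDocument>(
    new BsonDocument
    {
        { "d.Id", "departmentId" },
        { "d.Name", "Development" } // both conditions must match the same department
    });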
I've got this entity:
internal record OrderBookEntity
{
    [BsonId]
    public required AssetDefinition UnderlyingAsset { get; init; }
    public required List<OrderEntity> Orders { get; init; }
}
which leads to this sort of document:
{
    _id: {
        Class: 0,
        Symbol: 'EURUSD'
    },
    Orders: [
        {
            _id: 'a611ffb1-c3e7-43d6-8238-14e311122125',
            Price: '-10.000000101',
            Amount: '30.000000003',
            OrderAction: 1,
            EffectiveTime: ISODate('2022-10-14T06:33:02.872Z')
        },
        {
            _id: 'a611ffb1-c3e7-43d6-8238-14e311122126',
            Price: '-10.000000101',
            Amount: '30.000000003',
            OrderAction: 1,
            EffectiveTime: ISODate('2022-10-14T06:33:08.264Z')
        }
    ]
}
I can add and remove from the Orders set without updating the whole document with:
Builders<OrderBookEntity>.Update.AddToSet(...);
Builders<OrderBookEntity>.Update.Pull(...);
I can't, however, see a way to modify one of those in place.
How would I go about changing the Amount on, say, a611ffb1-c3e7-43d6-8238-14e311122125 without having to read the document, modify the collection, and update the whole thing, or just pulling and re-adding the order? Neither of those seems particularly performant.
You can work with FieldDefinition by providing the field to be updated as a string instead of a Func expression.
MongoDB query
db.collection.update(
    { "Orders._id": "a611ffb1-c3e7-43d6-8238-14e311122125" },
    { $set: { "Orders.$.Amount": "100" } }
)
MongoDB .NET Driver syntax
var filter = new BsonDocument
{
    { "Orders._id", "a611ffb1-c3e7-43d6-8238-14e311122125" }
};
var update = Builders<OrderBookEntity>.Update.Set("Orders.$.Amount", /* value */);

UpdateResult result = await _collection.UpdateOneAsync(filter, update);
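Alternatively, the same update can be expressed with an array filter instead of the positional operator (a sketch following the arrayFilters pattern shown earlier; it also lets you match the order on more than one property):

var filter = Builders<OrderBookEntity>.Filter.Empty; // or narrow it to a specific order book
var update = Builders<OrderBookEntity>.Update.Set("Orders.$[o].Amount", "100");
var options = new UpdateOptions
{
    ArrayFilters = new List<ArrayFilterDefinition>
    {
        new BsonDocumentArrayFilterDefinition<BsonDocument>(
            new BsonDocument("o._id", "a611ffb1-c3e7-43d6-8238-14e311122125"))
    }
};
UpdateResult result = await _collection.UpdateOneAsync(filter, update, options);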
How can I create an array of unique elements with the MongoDB C# driver? I don't want to check every time whether the element is already in the array or not.
Suppose:
list = [1, 2, 3, 4]
Then I shouldn't be able to add a duplicate element (such as 3).
You can use the AddToSet or AddToSetEach method every time you create or update the array, as mentioned in the comments:
var update = Builders<Entity>.Update.AddToSetEach(e => e.Items, new [] {1, 2});
collection.UpdateOne(new BsonDocument(), update, new UpdateOptions { IsUpsert = true });
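The single-element AddToSet overload behaves the same way; pushing a duplicate is simply a no-op rather than an error. A small sketch, assuming the stored document already has Items = [1, 2, 3, 4]:

var update = Builders<Entity>.Update.AddToSet(e => e.Items, 3);
var result = collection.UpdateOne(new BsonDocument(), update);
// result.MatchedCount == 1, result.ModifiedCount == 0 -- the duplicate 3 was ignored.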
And you can define a schema validation when creating the collection, to ensure that duplicate items will never be allowed (an error would be thrown on insert/update, “Document failed validation”).
You can define the schema in the MongoDB shell, or here is how to do it in C#:
var options = new CreateCollectionOptions<Entity>
{
    ValidationAction = DocumentValidationAction.Error,
    ValidationLevel = DocumentValidationLevel.Strict,
    Validator = new FilterDefinitionBuilder<Entity>().JsonSchema(new BsonDocument
    {
        { "bsonType", "object" },
        { "properties", new BsonDocument("Items", new BsonDocument
            {
                { "type", "array" },
                { "uniqueItems", true }
            })
        }
    })
};

database.CreateCollection("entities", options, CancellationToken.None);
where Entity is an example class like this:
public class Entity
{
    public ObjectId Id { get; set; }
    public int[] Items { get; set; }
}
Here are the API docs for CreateCollectionOptions and in the unit tests you can see examples of usage - e.g. JsonSchema(). Unfortunately I don't see anything in the reference docs with more thorough explanations.
Scenario:
I have to export an Excel file which will contain a list of Parts. We have enabled the user to select the columns and get only the selected columns' data in the exported file. Since this is a dynamic report, I am not using any concrete class to map the report, as that would result in exporting empty column headers, which is unnecessary. I am using Dynamic LINQ to deal with this scenario.
I have a list of dynamic objects fetched from dynamic linq.
[
    { "CleanPartNo": "Test", "Description": "test", "AliasPartNo": ["258", "145", "2313", "12322"] },
    { "CleanPartNo": "Test1", "Description": "test1", "AliasPartNo": [] }
]
How can I get 4 rows out of this JSON, one row per alias?
Please note that I cannot use a strongly typed object to deserialize/map it using JSON.NET.
Update
Following is the code:
public class Part
{
    public int Id { get; set; }
    public string CleanPartNo { get; set; }
    public string Description { get; set; }
    public List<PartAlias> AliasPartNo { get; set; }
}

public class PartAlias
{
    public int PartId { get; set; }
    public int PartAliasId { get; set; }
    public string AliasPartNo { get; set; }
}

var aliases = new List<PartAlias> {
    new PartAlias { AliasPartNo = "258" },
    new PartAlias { AliasPartNo = "145" },
    new PartAlias { AliasPartNo = "2313" },
    new PartAlias { AliasPartNo = "12322" }
};

List<Part> results = new List<Part> {
    new Part { CleanPartNo = "Test", Description = "test", AliasPartNo = aliases },
    new Part { CleanPartNo = "Test1", Description = "test1" }
};
var filters = "CleanPartNo,Description, PartAlias.Select(AliasPartNo) as AliasPartNo";
var dynamicObject = JsonConvert.SerializeObject(results.AsQueryable().Select($"new ({filters})"));
In the dynamicObject variable I get the JSON mentioned above.
Disclaimer: The following relies on anonymous classes, which is not exactly the same as dynamic LINQ (not at all), but I figured that it may help anyway, depending on your needs, hence I decided to post it.
To flatten your list, you could go with a nested Select, followed by a SelectMany (Disclaimer: This assumes that every part has at least one alias, see below for the full code)
var flattenedResult = results
    .Select(part => part.AliasPartNo.Select(alias => new
    {
        CleanPartNo = part.CleanPartNo,
        Description = part.Description,
        AliasPartNo = alias.AliasPartNo
    }))
    .SelectMany(partAliases => partAliases);
You are first projecting your items from results (outer Select). The projection maps each item to an IEnumerable of an anonymous type, in which each element corresponds to an alias part number. Since the outer Select will yield an IEnumerable<IEnumerable> (or something alike), we use SelectMany to get a single IEnumerable of all the items from your nested IEnumerables. You can now serialize this IEnumerable of instances of an anonymous class with JsonConvert:
var json = JsonConvert.SerializeObject(flattenedResult);
Handling parts without aliases
If there are no aliases, the inner Select will yield an empty IEnumerable, hence we have to introduce a special case:
var selector = (Part part) => part.AliasPartNo?.Any() == true
    ? part.AliasPartNo.Select(alias => new
    {
        CleanPartNo = part.CleanPartNo,
        Description = part.Description,
        AliasPartNo = alias.AliasPartNo
    })
    : new[]
    {
        new
        {
            CleanPartNo = part.CleanPartNo,
            Description = part.Description,
            AliasPartNo = (string)null // no alias for this part
        }
    };

var flattenedResult = results.Select(selector).SelectMany(item => item);
From the JSON you provided, you can get the values grouped by their name in this way:
var array = JArray.Parse(json);
var lookup = array.SelectMany(x => x.Children<JProperty>()).ToLookup(x => x.Name, x => x.Value);
Then it is just a matter of a simple loop over the lookup to fill the Excel columns.
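For example, a sketch that writes to the console in place of the Excel cells:

// Each lookup key becomes a column; array-valued properties contribute one
// cell per element, scalar properties a single cell.
foreach (var column in lookup)
{
    var values = column.SelectMany(v =>
        v is JArray array
            ? array.Select(item => item.ToString())
            : new[] { v.ToString() });
    Console.WriteLine($"{column.Key}: {string.Join(", ", values)}");
}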
However, I would suggest doing the flattening before the JSON stage. I tried for some time to make it happen even without knowing the names of the columns that are arrays, but I failed, and since it's your job, I won't try anymore :P
I think the best way here would be to implement a custom converter that would just multiply objects for properties that are arrays. If you do it well, you get infinite levels completely for free.
I have a data model that's structured something like this:
{
    "_id": "1234abcd",
    "name": "The Stanley Parable",
    "notes": [
        {
            "_id": "5678efgh",
            "content": "Kind of a walking simulator."
        },
        {
            "_id": "5678efgh",
            "content": "Super trippy."
        }
    ]
}
I'm using the mongo driver for C# to interact with this model in .NET Core. I'm trying to add a new note to a particular game -- without necessarily being sure that there are any notes on that game already. This is what I've been able to find for adding to a nested list:
var filter = Builders<Mongo_VideoGame>.Filter.Eq("Id", id);
var update = Builders<Mongo_VideoGame>.Update.Push<Mongo_Note>(v => v.Notes, mongoNote);
var editedGame = _context.VideoGames.FindOneAndUpdate(filter, update);
But my problem is that this only works if there's already a "notes" element on the game model. I could just add an empty array when I create the original game, but that seems less flexible with respect to schema changes (i.e. what if I was adding notes to existing data). The error I get is:
An exception of type 'MongoDB.Driver.MongoCommandException' occurred in MongoDB.Driver.Core.dll but was not handled in user code: 'Command findAndModify failed: The field 'notes' must be an array but is of type null in document'
So I ended up finding a solution to this.
First off, .Push() does work as I originally expected. It will create an array if one doesn't already exist.
My actual problem had to do with the C# Model I was using. Previously:
public class Mongo_VideoGame
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; }

    [BsonElement("name")]
    public string Name { get; set; }

    [BsonElement("notes")]
    public List<Mongo_Note> Notes { get; set; }
}
But since there were no notes on the document it found, I was basically trying to push into a null notes array. To fix this, I set a default value for Notes so that it could be pushed into, like this:
public class Mongo_VideoGame
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; }

    [BsonElement("name")]
    public string Name { get; set; }

    [BsonElement("notes")]
    public List<Mongo_Note> Notes { get; set; } = new List<Mongo_Note>();
}
Both the case of adding to an existing array and creating the array if it does not exist work now.
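One caveat worth noting (an assumption to verify against your own data): if existing documents were written with "notes" explicitly set to null, rather than the field being absent, Push still fails with the error above. A one-off migration like this sketch normalizes them:

// Replace a null "notes" field with an empty array so $push succeeds.
// Documents where the field is missing entirely don't need this: $push creates it.
var filter = Builders<Mongo_VideoGame>.Filter.Eq("notes", BsonNull.Value);
var update = Builders<Mongo_VideoGame>.Update.Set("notes", new List<Mongo_Note>());
_context.VideoGames.UpdateMany(filter, update);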
I use Searchblox to index and search my files, which itself calls ES 2.x to do the job. Searchblox uses a "mapping.json" file to initialize a mapping upon the creation of an index. Here's the link to that file. As @Russ Cam suggested here, I created my own class Content with the following code (just like he did with the "questions" index and "Question" class):
public class Content
{
    public string type { get; set; }
    public Fields fields { get; set; }
}

public class Fields
{
    public Content1 content { get; set; }
    public Autocomplete autocomplete { get; set; }
}

public class Content1
{
    public string type { get; set; }
    public string store { get; set; }
    public string index { get; set; }
    public string analyzer { get; set; }
    public string include_in_all { get; set; }
    public string boost { get; set; }
} // got this with Paste Special -> JSON class
These fields from the Content class (type, store, etc.) come from the mapping.json file attached above. Now, when I (just like you showed me) execute the following code:
var searchResponse = highLevelclient.Search<Content>(s => s
    .Query(q => q
        .Match(m => m
            .Field(f => f.fields.content)
            .Query("service")
        )
    )
);
All I get as a response on the searchResponse variable is:
Valid NEST response built from a successful low level call on POST: /idx014/content/_search
Audit trail of this API call:
-HealthyResponse: Node: http://localhost:9200/ Took: 00:00:00.7180404
Request:
{"query":{"match":{"fields.content":{"query":"service"}}}}
Response:
{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":0,"max_score":null,"hits":[]}}
And there are no documents in searchResponse.Documents. In contrast, when I search for the "service" query on Searchblox or make an API call to localhost:9200 with the Sense extension for Google Chrome, I get 2 documents (the documents that I was looking for).
In brief, all I want is to be able to :
get all the documents (no criteria)
get all the documents within a time range and based upon keywords, such as "service"
What am I doing wrong? I can provide with more information if needed.. Thank you all for your detailed answers.
Your C# POCO is not correct with regard to your mapping: your document type is "sdoc", and each of the properties under the "properties" property is a field on that document type; these fields map to properties on your C# POCO.
As an example to get you started
public class Document
{
    [String(Name = "uid")]
    public string UId { get; set; }

    public string Content { get; set; }
}
NEST by default will camel case POCO property names, so "content" will be cased correctly according to your mapping; however, we use attribute mapping for the "uid" field in order to name it to match the mapping (we could go further here and set additional attribute property values to fully match the mapping; see the automapping documentation).
Now, to search with the document, let's create the connection settings and a client to use
void Main()
{
    var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));

    var connectionSettings = new ConnectionSettings(pool)
        .InferMappingFor<Document>(t => t
            // change the index name to the name of your index :)
            .IndexName("index-name")
            .TypeName("sdoc")
            .IdProperty(p => p.UId)
        );

    var client = new ElasticClient(connectionSettings);

    // do something with the response
    var searchResponse = client.Search<Document>(s => s
        .Query(q => q
            .Match(m => m
                .Field(f => f.Content)
                .Query("service")
            )
        )
    );
}
We set up the client with some inference rules for the Document type which will be used when interacting with Elasticsearch. The above query emits the following query JSON:
{
    "query": {
        "match": {
            "content": {
                "query": "service"
            }
        }
    }
}
As an aside, I noticed that the mapping contained a multi_field type; multi_field types were removed in Elasticsearch 1.0 (multi fields are still there, just the actual type is not), so be sure that you're actually running Elasticsearch 2.x on Searchblox, as NEST 2.x is only supported against Elasticsearch 2.x.
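For the other two requirements, here are hedged sketches (the "createdDate" field name is an assumption; substitute whatever date field your mapping actually defines): a MatchAll query returns all documents, and a date range can be combined with the match query inside a bool query.

// All documents, no criteria:
var all = client.Search<Document>(s => s.Query(q => q.MatchAll()));

// Keyword match restricted to a time range:
var inRange = client.Search<Document>(s => s
    .Query(q => q
        .Bool(b => b
            .Must(mu => mu.Match(m => m.Field(f => f.Content).Query("service")))
            .Filter(fi => fi.DateRange(r => r
                .Field("createdDate")
                .GreaterThanOrEquals(DateMath.Anchored(new DateTime(2016, 1, 1)))
                .LessThan(DateMath.Now)
            ))
        )
    )
);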