How can I create an array of unique elements using the MongoDB C# driver?
I don't want to check every time whether the element is already in the array or not.
Suppose:
list = [1, 2, 3, 4]
Then I shouldn't be able to add a duplicate element (such as 3).
You can use the AddToSet or AddToSetEach method every time you create or update the array, as mentioned in the comments:
var update = Builders<Entity>.Update.AddToSetEach(e => e.Items, new [] {1, 2});
collection.UpdateOne(new BsonDocument(), update, new UpdateOptions { IsUpsert = true });
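For a single value there is also AddToSet, which only appends the value if it is not already present; a minimal sketch reusing the empty filter from above:
// adding 3 a second time leaves the array unchanged
var addSingle = Builders<Entity>.Update.AddToSet(e => e.Items, 3);
collection.UpdateOne(new BsonDocument(), addSingle, new UpdateOptions { IsUpsert = true });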
You can also define schema validation when creating the collection to ensure that duplicate items are never allowed (the server throws a "Document failed validation" error on insert/update).
You can define the schema in the MongoDB shell, or here is how to do it in C#:
var options = new CreateCollectionOptions<Entity>
{
    ValidationAction = DocumentValidationAction.Error,
    ValidationLevel = DocumentValidationLevel.Strict,
    Validator = new FilterDefinitionBuilder<Entity>().JsonSchema(new BsonDocument
    {
        { "bsonType", "object" },
        { "properties", new BsonDocument("Items", new BsonDocument
            {
                { "type", "array" },
                { "uniqueItems", true }
            })
        }
    })
};
database.CreateCollection("entities", options, CancellationToken.None);
where Entity is an example class like this:
public class Entity
{
    public ObjectId Id { get; set; }
    public int[] Items { get; set; }
}
Here are the API docs for CreateCollectionOptions, and the driver's unit tests contain usage examples, e.g. for JsonSchema(). Unfortunately I don't see anything in the reference docs with more thorough explanations.
Solution found: see the edit at the end.
I'm using the DynamoDBContext class in ASP.NET Core (Object persistence model) and have items stored in my Inventory table in this form:
{
    "itemID": 0,
    "stockCount": 0,
    "testAttributes": {
        "color": "white",
        "additionalProp2": "string",
        "additionalProp3": "string"
    }
}
Which are mapped to my C# Objects like so:
[DynamoDBTable("Inventory")]
public class ItemDto
{
    [DynamoDBHashKey]
    public int ItemID { get; set; }

    public int StockCount { get; set; }

    public Dictionary<string, string> TestAttributes { get; set; }
}
I'm attempting to scan this table and find any item whose color is white. The TestAttributes are not constant, so I am using this mapping style. Examples of what I have tried so far include:
Dictionary<string, string> x = new Dictionary<string, string>()
{
    { "color", "white" }
};
var conditions = new List<ScanCondition>()
{
    new ScanCondition("TestAttributes", ScanOperator.Contains, x.ElementAt(0))
};
var result = await Context.ScanAsync<ItemDto>(conditions).GetRemainingAsync();
The documentation says the scan parameter (in this case x.ElementAt(0)) must be of the same type as the attribute you are scanning on, which is why I used this construction. In other posts I have seen cases where "TestAttributes.color" may work, but none of these access styles with the correct scanning object type have worked for me.
When I do match the correct parameter type, I see that you cannot use the Contains operator on M-type (map) objects, which leads me to believe the "TestAttributes.color" access style should work in some way or another, but I have not found that method as of yet. I would like to continue using the object persistence model throughout, i.e. DynamoDBContext rather than AmazonDynamoDBClient.
Any advice would help; the documentation is terrible and I have not found the answer for this issue anywhere. Thanks.
EDIT W/ SOLUTION:
On an obscure AWS forum, a dev mentioned that ScanFilter was implemented before mappings were added, so ScanFilter cannot apply scan conditions to mapped attributes. You therefore cannot use DynamoDBContext for this kind of scan; instead, use the AmazonDynamoDBClient class with a filter expression, like so:
var expressionValues = new Dictionary<string, AttributeValue>();
expressionValues.Add(":att", new AttributeValue { S = "white" });

var request = new ScanRequest
{
    TableName = "Inventory",
    FilterExpression = "testAttributes.color = :att",
    ExpressionAttributeValues = expressionValues
};
var response = await Client.ScanAsync(request);
You can then loop through the items matching the scan conditions like so:
foreach (var item in response.Items)
{
    // Do something with each item here; the line below just reads my itemID attribute
    var itemId = item["itemID"].N;
}
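If you would still like typed ItemDto instances after the low-level scan, one possible bridge (a sketch; Document.FromAttributeMap and FromDocument are part of the SDK's document model, but I haven't verified this against every SDK version) is:
// requires Amazon.DynamoDBv2.DocumentModel; Context is the same DynamoDBContext as above
var typedItems = response.Items
    .Select(attrs => Context.FromDocument<ItemDto>(Document.FromAttributeMap(attrs)))
    .ToList();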
Is it possible (preferably using the C# builders) to add a new item to a deeply nested array, i.e. an array within an array within an array?
My data model looks something like :
public class Company
{
    public string Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Department> Departments { get; set; }
}

public class Department
{
    public string Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Manager> Managers { get; set; }
}

public class Manager
{
    public string Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Employee> Employees { get; set; }
}

public class Employee
{
    public string Id { get; set; }
    public string Name { get; set; }
}
Which translates to:
{
    "Id": 12345,
    "Name": "Company Ltd",
    "Departments": [
        {
            "Id": 1,
            "Name": "Development",
            "Managers": [
                {
                    "Id": 5555,
                    "Name": "The Boss",
                    "Employees": [
                        {
                            "Id": 123,
                            "Name": "Developer 1"
                        },
                        {
                            "Id": 124,
                            "Name": "Developer 2"
                        }
                    ]
                }
            ]
        }
    ]
}
If I wanted to add another employee under a specific manager how would I go about doing that?
To push to a nested array, you must use the positional operator $ to specify a matching outer array element to apply the operation to. For example:
db.collection.update(
    { "my_array._id": myTargetId },
    { $push: { "my_array.$.my_inner_array": myArrayElem } }
);
This breaks down, however, when traversing nested arrays: the positional operator can only be used for a single array, not for any arrays nested inside it. This is a well-known limitation noted in the MongoDB documentation.
If you absolutely need to perform these kinds of nested array operations, then you have a couple of options available to you:
The first, and preferred, is to update your document structure and avoid nesting arrays more than one level deep. This will avoid the issue altogether, but will require any existing data to be migrated to the new structure and additional efforts to be made to structure the data in the way you need on the fly on retrieval. Separate client and server representations of your data will end up being required.
The second is to perform a series of less-reliable steps:
1. Retrieve the original document.
2. Manually locate the index of each array level at which your target element sits.
3. Attempt an update against that specific index chain, matching on the index chain as well.
4. Check the result of the update attempt; if it failed, the document may have been changed while the indexes were being calculated.
For example, if you wanted to update manager with ID 5555 to have the additional employee, you'd perform the following query after retrieving the indexes:
// Index chain found to be Departments.0 and Managers.0
db.collection.update(
    {
        "Id": 12345,
        "Departments.0.Managers.0.Id": 5555 // match index chain 0,0 and ensure that our target still has Id 5555
    },
    { $push: {
        "Departments.0.Managers.0.Employees": myNewEmployee // push to index chain 0,0
    }}
);
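For completeness, a rough C# equivalent of that shell update (a sketch, assuming an IMongoCollection<Company> named collection and that the index chain was computed to be 0,0; the new employee values are hypothetical):
var filter = Builders<Company>.Filter.And(
    Builders<Company>.Filter.Eq(c => c.Id, "12345"),
    // re-check that index chain 0,0 still points at manager 5555
    Builders<Company>.Filter.Eq("Departments.0.Managers.0.Id", "5555"));
var update = Builders<Company>.Update.Push(
    "Departments.0.Managers.0.Employees",
    new Employee { Id = "125", Name = "Developer 3" }); // hypothetical new employee
var result = collection.UpdateOne(filter, update);
// result.ModifiedCount == 0 means the document changed while the indexes
// were being calculated: re-read and retry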
Use a filtered positional operator for each of the arrays except the innermost one you want to push to.
Use array filters in the update options to specify the department and manager ids (this requires MongoDB 3.6 or newer). The letter used in each array filter must match the identifier used in the positional operator of the update definition, so "d.Id" -> "Departments.$[d]".
If you want to match on more than one property, you can use a dictionary in the array filter.
private IMongoCollection<Company> _collection;

public async Task AddEmployee()
{
    var filter = Builders<Company>.Filter.Where(c => c.Id == "companyId");
    var update = Builders<Company>.Update
        .Push("Departments.$[d].Managers.$[m].Employees", new Employee { Id = "employeeId", Name = "employeeName" });
    var updateOptions = new UpdateOptions
    {
        ArrayFilters = new List<ArrayFilterDefinition>
        {
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("d.Id", "departmentId")),
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("m.Id", "managerId")),
        }
    };
    await _collection.UpdateOneAsync(filter, update, updateOptions);
}
The downside here is that you need magic strings in the update definition and in the array filters; I'm not sure how to manage it entirely without strings, though the sketch below at least removes the hard-coded property names.
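A possible way to do that (a sketch, assuming the BSON element names match the C# property names, i.e. no custom mappings):
// yields "Departments.$[d].Managers.$[m].Employees" without hard-coded property names
var path = $"{nameof(Company.Departments)}.$[d].{nameof(Department.Managers)}.$[m].{nameof(Manager.Employees)}";
var update = Builders<Company>.Update
    .Push(path, new Employee { Id = "employeeId", Name = "employeeName" });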
Removing an employee from the array is similar, but you have to specify an extra filter for the employee you want to remove.
public async Task FireEmployee()
{
    var filter = Builders<Company>.Filter.Where(c => c.Id == "companyId");
    var employeeFilter = Builders<Employee>.Filter.Where(e => e.Id == "employeeId");
    var update = Builders<Company>.Update
        .PullFilter("Departments.$[d].Managers.$[m].Employees", employeeFilter);
    var updateOptions = new UpdateOptions
    {
        ArrayFilters = new List<ArrayFilterDefinition>
        {
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("d.Id", "departmentId")),
            new BsonDocumentArrayFilterDefinition<BsonDocument>(new BsonDocument("m.Id", "managerId")),
        }
    };
    await _collection.UpdateOneAsync(filter, update, updateOptions);
}
Scenario:
I have to export an Excel file containing a list of Parts. We let the user select columns, so that only the selected columns' data ends up in the exported file. Since this is a dynamic report, I am not using a concrete class to map the report, as that would export empty column headers, which is unnecessary. I am using Dynamic LINQ to deal with this scenario.
I have a list of dynamic objects fetched via Dynamic LINQ:
[
{"CleanPartNo":"Test","Description":"test","AliasPartNo":["258","145","2313","12322"]},
{"CleanPartNo":"Test1","Description":"test1","AliasPartNo":[]}
]
How can I get 4 rows out of this JSON (one row per alias, with CleanPartNo and Description repeated on each row)?
Please note that I cannot use a strongly typed object to deserialize/map it using JSON.NET.
Update
Following is the code:
public class Part
{
    public int Id { get; set; }
    public string CleanPartNo { get; set; }
    public string Description { get; set; }
    public List<PartAlias> AliasPartNo { get; set; }
}

public class PartAlias
{
    public int PartId { get; set; }
    public int PartAliasId { get; set; }
    public string AliasPartNo { get; set; }
}
var aliases = new List<PartAlias>
{
    new PartAlias { AliasPartNo = "258" },
    new PartAlias { AliasPartNo = "145" },
    new PartAlias { AliasPartNo = "2313" },
    new PartAlias { AliasPartNo = "12322" }
};
List<Part> results = new List<Part>
{
    new Part { CleanPartNo = "Test", Description = "test", AliasPartNo = aliases },
    new Part { CleanPartNo = "Test1", Description = "test1" }
};
var filters = "CleanPartNo, Description, AliasPartNo.Select(AliasPartNo) as AliasPartNo";
var dynamicObject = JsonConvert.SerializeObject(results.AsQueryable().Select($"new ({filters})"));
In the dynamicObject variable I get the JSON mentioned above.
Disclaimer: The following relies on anonymous classes, which is not the same as Dynamic LINQ (not at all), but I figured it may help anyway depending on your needs, hence I decided to post it.
To flatten your list, you could go with a nested Select followed by a SelectMany. (Disclaimer: this assumes that every part has at least one alias; see below for the full code.)
var flattenedResult = results
    .Select(part => part.AliasPartNo.Select(alias => new
    {
        CleanPartNo = part.CleanPartNo,
        Description = part.Description,
        AliasPartNo = alias.AliasPartNo
    }))
    .SelectMany(partAliases => partAliases);
You are first projecting your items from results (the outer Select). The projection maps each item to an IEnumerable of an anonymous type in which each element corresponds to one alias part number. Since the outer Select yields an IEnumerable<IEnumerable<T>> (or something similar), we use SelectMany to flatten the nested IEnumerables into a single one. You can now serialize this IEnumerable of anonymous class instances with JsonConvert:
var json = JsonConvert.SerializeObject(flattenedResult);
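For the sample data this should produce JSON along these lines (reconstructed by hand, so treat it as illustrative):
[
    { "CleanPartNo": "Test", "Description": "test", "AliasPartNo": "258" },
    { "CleanPartNo": "Test", "Description": "test", "AliasPartNo": "145" },
    { "CleanPartNo": "Test", "Description": "test", "AliasPartNo": "2313" },
    { "CleanPartNo": "Test", "Description": "test", "AliasPartNo": "12322" }
]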
Handling parts without aliases
If there are no aliases, the inner Select will yield an empty IEnumerable, hence we have to introduce a special case:
var selector = (Part part) => part.AliasPartNo?.Any() == true
    ? part.AliasPartNo.Select(alias => new
    {
        CleanPartNo = part.CleanPartNo,
        Description = part.Description,
        AliasPartNo = alias.AliasPartNo
    })
    : new[]
    {
        new
        {
            CleanPartNo = part.CleanPartNo,
            Description = part.Description,
            AliasPartNo = (string)null // no alias available for this part
        }
    };
var flattenedResult = results.Select(selector).SelectMany(item => item);
From the JSON you provided, you can get the values grouped by their property name this way:
var array = JArray.Parse(json);
var lookup = array.SelectMany(x => x.Children<JProperty>()).ToLookup(x => x.Name, x => x.Value);
Then it is just a matter of a simple loop over the lookup to fill the Excel columns.
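For illustration, a rough sketch of such a loop (just printing instead of writing to Excel; the variable names are mine):
foreach (var column in lookup)
{
    Console.WriteLine(column.Key); // column header, e.g. "CleanPartNo"
    foreach (var value in column)
    {
        // array-valued properties such as AliasPartNo arrive as JArray tokens
        // and would still need to be expanded into multiple rows
        Console.WriteLine(value);
    }
}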
However, I would suggest doing the flattening before serializing to JSON. I tried for some time to make it happen even without knowing the names of the columns that are arrays, but I failed, and since it's your job, I won't try anymore :P
I think the best way here would be to implement a custom converter that multiplies objects for properties that are arrays. If you do it well, you get arbitrary nesting levels for free.
Problem statement:
I have a collection in MongoDB that has a field with the type Int32. I would like to add a document to this collection. I need to increment the value by 1 for each insert as that field is indexed and must be unique.
Options:
[preferable] Increment the value on the DB side. That is, not specifying a new (higher) value. Just instruct MongoDB to auto increment upon insert.
Reading first. Executing a find query against the DB to get the current (pre-insert) highest value, incrementing it in memory, and inserting the new doc. This might fail due to race conditions (the operation is not atomic).
Keeping an index counter in memory. Not an option for me, as there are multiple apps writing to the same collection (legacy limitation).
Other Ideas?
Example:
{
    _id: ....,
    index: 123,
    open: true
}
await collection.InsertOneAsync(record.ToBsonDocument());
The newly inserted doc should then have an index value of 124.
Language:
C#
Questions:
Can you provide a sample code (C#) to achieve the first option?
Extra info:
I do not have access to the code of the other app (which keeps its own index number). So having another collection and adding a sequence resolver function will not work, as this would require a change to the legacy app.
MongoDB has an official tutorial on how to achieve that here:
1 - Create a counters collection and insert the id there:
db.counters.insert(
    {
        _id: "userid",
        seq: 0
    }
)
2 - Create a custom function to retrieve the next value:
function getNextSequence(name) {
    var ret = db.counters.findAndModify(
        {
            query: { _id: name },
            update: { $inc: { seq: 1 } },
            new: true
        }
    );
    return ret.seq;
}
3 - Use getNextSequence to retrieve the next value:
db.users.insert(
    {
        _id: getNextSequence("userid"),
        name: "Sarah C."
    }
)
db.users.insert(
    {
        _id: getNextSequence("userid"),
        name: "Bob D."
    }
)
I had to do this in a project using MongoDB C# Driver.
Here's what I did: I created a separate collection called Sequence, holding the sequence name and its value, and I also created a repository for it.
Here is the code of class Sequence:
public class Sequence
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    [BsonElement("_id")]
    public string Id { get; set; }

    public string SequenceName { get; set; }
    public int SequenceValue { get; set; }
}
And now the code of the method to generate the sequence value:
public class SequenceRepository
{
    protected readonly IMongoDatabase _database;
    protected readonly IMongoCollection<Sequence> _collection;

    public SequenceRepository(IMongoDatabase database)
    {
        _database = database;
        _collection = _database.GetCollection<Sequence>(typeof(Sequence).Name);
    }

    public int GetSequenceValue(string sequenceName)
    {
        var filter = Builders<Sequence>.Filter.Eq(s => s.SequenceName, sequenceName);
        var update = Builders<Sequence>.Update.Inc(s => s.SequenceValue, 1);
        var result = _collection.FindOneAndUpdate(filter, update, new FindOneAndUpdateOptions<Sequence, Sequence> { IsUpsert = true, ReturnDocument = ReturnDocument.After });
        return result.SequenceValue;
    }
}
Finally, I call this method before inserting a document:
public void Inserir(Order order)
{
    order.Code = new SequenceRepository(_database).GetSequenceValue("orderSequence");
    _collection.InsertOne(order);
}
You can create a Mongo sequence in a separate counter collection:
db.counter.insert({ _id: "mySeq", seq: 0 })
You can encapsulate the sequence logic in a simple function like this:
function getNextMySeq(name) {
    var ret = db.counter.findAndModify({
        query: { _id: name },
        update: { $inc: { seq: 1 } },
        new: true
    });
    return ret.seq;
}
Now simply use the function call during the insert:
db.collection.insert({
    index: getNextMySeq("mySeq")
})
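In C#, the same findAndModify maps onto FindOneAndUpdate, much like the SequenceRepository shown earlier; a sketch, assuming the counter document created above already exists and the collection is accessed as plain BsonDocuments:
var counters = database.GetCollection<BsonDocument>("counter");
var next = counters.FindOneAndUpdate(
    Builders<BsonDocument>.Filter.Eq("_id", "mySeq"),
    Builders<BsonDocument>.Update.Inc("seq", 1),
    new FindOneAndUpdateOptions<BsonDocument> { ReturnDocument = ReturnDocument.After })["seq"].AsInt32;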
This is my MongoDB document structure:
{
    string _id;
    ObservableCollection<DataElement> PartData;
    ObservableCollection<DataElement> SensorData;
    ... other ObservableCollection<DataElement> fields ...
    ... other types and fields ...
}
Is there any possibility to retrieve a concatenation of all fields of type ObservableCollection<DataElement>? Using LINQ I would do something like:
var query = dbCollection
    .AsQueryable()
    .Select(x => new
    {
        data = x
            .OfType(typeof(ObservableCollection<DataElement>))
            .SelectMany(x => x)
            .ToList()
    });
or alternatively
data = x.Where(y => typeof(y) == typeof(ObservableCollection<DataElement>))
    .SelectMany(x => x)
    .ToList()
Unfortunately .Where() and .OfType() do not work on documents, only on queryables/lists, so is there another possibility to achieve this? The document structure must stay the same.
Edit:
After dnickless' answer I tried method 1b), which works pretty well for getting the fields the way they are stored in the collection. Thank you!
Unfortunately it wasn't precisely what I was looking for, as I wanted all those fields of that specific type put together in one list, as would be returned by the OfType or Where(typeof) statement.
E.g. data = [x.PartData, x.SensorData, ...] with data being an ObservableCollection<DataElement>[], so that I can use SelectMany() on it to finally get the concatenation of all sequences.
Sorry for asking the question imprecisely and not including the last step of doing a SelectMany()/Concat().
Finally I found a solution doing this, but it doesn't seem very elegant to me, as it needs one Concat() for every field (and I have more of them) and it needs to create a new collection whenever a field is missing:
query.Select(x => new
{
    part = x.PartData ?? new ObservableCollection<DataElement>(),
    sensor = x.SensorData ?? new ObservableCollection<DataElement>()
})
.Select(x => new
{
    dataElements = x.part.Concat(x.sensor)
})
.ToList()
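If the one-Concat()-per-field part is the main annoyance, a client-side alternative (a sketch, assuming documents are materialized as the Test entity from the answer below, and accepting that the flattening then happens after retrieval) is to collect the matching properties once via reflection:
// find every ObservableCollection<DataElement> property once
var collectionProps = typeof(Test).GetProperties()
    .Where(p => typeof(ObservableCollection<DataElement>).IsAssignableFrom(p.PropertyType))
    .ToList();

var data = collection.Find(FilterDefinition<Test>.Empty).ToList()
    .Select(doc => collectionProps
        .SelectMany(p => (ObservableCollection<DataElement>)p.GetValue(doc)
                         ?? new ObservableCollection<DataElement>())
        .ToList())
    .ToList();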
In order to limit the fields returned you would need to use the MongoDB Projection feature in one way or the other.
There are a few alternatives I can think of, depending on your specific requirements:
Option 1a (fairly static approach): Create a custom type with only the fields that you are interested in if you know them upfront. Something like this:
public class OnlyWhatWeAreInterestedIn
{
    public ObservableCollection<DataElement> PartData { get; set; }
    public ObservableCollection<DataElement> SensorData { get; set; }
    // ...
}
Then you can query your Collection like that:
var collection = new MongoClient().GetDatabase("test").GetCollection<OnlyWhatWeAreInterestedIn>("test");
var result = collection.Find(FilterDefinition<OnlyWhatWeAreInterestedIn>.Empty);
Using this approach you get a nicely typed result back without the need for custom projections.
Option 1b (still pretty static): A minor variation of Option 1a, just without a new explicit type but a projection stage instead to limit the returned fields. Kind of like that:
var collection = new MongoClient().GetDatabase("test").GetCollection<Test>("test");
var result = collection.Find(FilterDefinition<Test>.Empty).Project(t => new { t.PartData, t.SensorData }).ToList();
Again, you get a nicely typed C# entity back that you can continue to operate on.
Option 2: Use some dark reflection magic in order to dynamically create a projection stage. Downside: You won't get a typed instance reflecting your properties but instead a BsonDocument so you will have to deal with that afterwards. Also, if you have any custom MongoDB mappings in place, you would need to add some code to deal with them.
Here's the full example code:
First, your entities:
public class Test
{
    string _id;

    public ObservableCollection<DataElement> PartData { get; set; }
    public ObservableCollection<DataElement> SensorData { get; set; }

    // just to have one additional property that will not be part of the returned document
    public string TestString { get; set; }
}

public class DataElement
{
}
And then the test program:
public class Program
{
    static void Main(string[] args)
    {
        var collection = new MongoClient().GetDatabase("test").GetCollection<Test>("test");

        // insert test record
        collection.InsertOne(
            new Test
            {
                PartData = new ObservableCollection<DataElement>
                {
                    new DataElement(),
                    new DataElement()
                },
                SensorData = new ObservableCollection<DataElement>
                {
                    new DataElement(),
                    new DataElement()
                },
                TestString = "SomeString"
            });

        // here, we use reflection to find the relevant properties
        var allPropertiesThatWeAreLookingFor = typeof(Test)
            .GetProperties()
            .Where(p => typeof(ObservableCollection<DataElement>).IsAssignableFrom(p.PropertyType));

        // create a string of all properties that we are interested in, each with ":1" appended,
        // so MongoDB will return these fields only; in our example this will look like
        // "PartData:1,SensorData:1"
        var mongoDbProjection = string.Join(",", allPropertiesThatWeAreLookingFor.Select(p => $"{p.Name}:1"));

        // we do not want MongoDB to return the _id field because it's not of the selected
        // type but would be returned by default otherwise
        mongoDbProjection += ",_id:0";

        var result = collection.Find(FilterDefinition<Test>.Empty).Project($"{{{mongoDbProjection}}}").ToList();

        Console.ReadLine();
    }
}