MongoDB Map Property 'new' in findAndModify using FindOneAndUpdateOptions class C# Driver - c#

I'm trying to implement a getNextSequence function for MongoDB as explained in this link. I'm using the latest C# driver, but I'm not sure how to map the new: true property in the FindOneAndUpdateOptions.
MongoDB Code
function getNextSequence(name) {
    var ret = db.counters.findAndModify(
        {
            query: { _id: name },
            update: { $inc: { seq: 1 } },
            new: true,
            upsert: true
        }
    );
    return ret.seq;
}
C# Code
public async Task<long> GetNextObjectSequenceAsync(string objectName)
{
    var collection = this.Context.GetCollection<ObjectSequence>("Counters");
    var filter = new FilterDefinitionBuilder<ObjectSequence>().Where(x => x.Name == objectName);
    var options = new FindOneAndUpdateOptions<ObjectSequence, ObjectSequence>() { IsUpsert = true };
    var update = new UpdateDefinitionBuilder<ObjectSequence>().Inc(x => x.Sequence, 1);
    ObjectSequence seq = await collection.FindOneAndUpdateAsync<ObjectSequence>(filter, update, options);
    return seq.Sequence;
}

FindOneAndUpdateOptions has a ReturnDocument enum, where:
ReturnDocument.Before corresponds to new: false
ReturnDocument.After corresponds to new: true
In your case, the options should be:
var options = new FindOneAndUpdateOptions<ObjectSequence, ObjectSequence>() { ReturnDocument = ReturnDocument.After, IsUpsert = true };
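Putting it together, the asker's method with ReturnDocument.After wired in would look like this (a sketch reusing the question's own types; Context, ObjectSequence, Name and Sequence all come from the question):

```csharp
public async Task<long> GetNextObjectSequenceAsync(string objectName)
{
    var collection = this.Context.GetCollection<ObjectSequence>("Counters");
    var filter = new FilterDefinitionBuilder<ObjectSequence>().Where(x => x.Name == objectName);
    var update = new UpdateDefinitionBuilder<ObjectSequence>().Inc(x => x.Sequence, 1);
    var options = new FindOneAndUpdateOptions<ObjectSequence, ObjectSequence>
    {
        // ReturnDocument.After is the driver's equivalent of new: true,
        // so the incremented value is returned
        ReturnDocument = ReturnDocument.After,
        // IsUpsert = true creates the counter document on first use
        IsUpsert = true
    };
    ObjectSequence seq = await collection.FindOneAndUpdateAsync<ObjectSequence>(filter, update, options);
    return seq.Sequence;
}
```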

Elasticsearch.NET.InMemoryConnection not applying filters on responseData

I have an Elastic Client from Elasticsearch.Net which fetches data from an InMemoryConnection. I add a query filter on the search, but the result is not filtered: the entire data from responseBody is returned as the result.
Am I missing something, or is this how InMemoryConnection works?
CurrenciesDTO.cs
internal class CurrenciesDTO
{
    [Keyword(Name = "CCY")]
    public string CCY { get; set; }
}
Program.cs
using ConsoleApp_Elastic;
using Elasticsearch.Net;
using Nest;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;

List<CurrenciesDTO> listCurrencies = new List<CurrenciesDTO>
{
    new CurrenciesDTO() { CCY = "GEL" },
    new CurrenciesDTO() { CCY = "INR" },
    new CurrenciesDTO() { CCY = "JPY" },
    new CurrenciesDTO() { CCY = "USD" }
};

var response = new
{
    took = 1,
    timed_out = false,
    _shards = new
    {
        total = 1,
        successful = 1,
        skipped = 0,
        failed = 0
    },
    hits = new
    {
        total = new
        {
            value = 193,
            relation = "eq"
        },
        max_score = 1.0,
        hits = Enumerable.Range(0, listCurrencies.Count).Select(i => (object)new
        {
            _index = "test.my.currencies",
            _type = "_doc",
            _id = listCurrencies[i].CCY,
            _score = 1.0,
            _source = new
            {
                CCY = listCurrencies[i].CCY,
            }
        })
    }
};

string json = JsonConvert.SerializeObject(response);
var responseBody = Encoding.UTF8.GetBytes(json);

ConnectionSettings connectionSettings = new ConnectionSettings(new InMemoryConnection(responseBody, 200));
connectionSettings.OnRequestCompleted(apiCallDetails =>
{
    if (apiCallDetails.RequestBodyInBytes != null)
    {
        // not reaching here
        Console.WriteLine(
            $"{apiCallDetails.HttpMethod} {apiCallDetails.Uri} " +
            $"{Encoding.UTF8.GetString(apiCallDetails.RequestBodyInBytes)}");
    }
});

var client = new ElasticClient(connectionSettings);

var filterItems = new List<Func<QueryContainerDescriptor<CurrenciesDTO>, QueryContainer>>();
filterItems.Add(p => p.Term(v => v.Field(f => f.CCY).Value("USD")));

var result = await client.SearchAsync<CurrenciesDTO>(s => s
    .Index("test.my.currencies")
    .Query(q => q.Bool(x => x.Filter(filterItems))), CancellationToken.None);
// .Query(q => q.Term(p => p.CCY, "USD")));

// expected 1 record but 4 records are returned
foreach (var a in result.Documents.ToArray())
{
    Console.WriteLine(a.CCY);
}
Console.ReadLine();
Yes, this is by design. InMemoryConnection was created to make unit testing easier and won't be much help with validating actual queries.
To make sure that Elasticsearch is configured the way you expect and that the queries sent to it are valid, I would suggest using Testcontainers.
A simple test would look like this:
spin up a new Docker instance of Elasticsearch with Testcontainers' help
index some data
run your code against the Elasticsearch instance running inside the container
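A hedged sketch of those steps, assuming the Testcontainers.Elasticsearch NuGet package and its ElasticsearchBuilder API (builder and method names may differ between package versions):

```csharp
using System;
using Nest;
using Testcontainers.Elasticsearch;

// 1. spin up a disposable Elasticsearch container
var container = new ElasticsearchBuilder().Build();
await container.StartAsync();

// 2. point a real client at the container (no InMemoryConnection involved)
var settings = new ConnectionSettings(new Uri(container.GetConnectionString()));
var client = new ElasticClient(settings);

// 3. index some data, run the query under test, and assert on the result
// ...

await container.DisposeAsync();
```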

MongoDb - cursor option is required after upgrade of mongodb

Since we were forced to upgrade our mongo installation, we're receiving an error during some aggregation function calls:
MongoDB.Driver.MongoCommandException: "Command 'aggregate' failed: The
'cursor' option is required, except for aggregate with the explain
argument (response: { "ok" : 0.0, "errmsg" : "The 'cursor' option is
required, except for aggregate with the explain argument", "code" : 9,
"codeName" : "FailedToParse" })"
BsonArray arr = BsonSerializer.Deserialize<BsonArray>("[{ \"$match\" : { \"Param1\" : \"VAL\" } }, { \"$unwind\" : \"$Entries\" }, { \"$match\" : { \"PARAM\" : \"VALUE\" } }]");
var pipeline = arr.Select(x => x.AsBsonDocument).ToList();
// AggregateArgs aArgs = new AggregateArgs { Pipeline = bsonList };
var cursor = collection.Aggregate(pipeline).ResultDocuments;
I already figured out that we have to manually add the cursor configuration to the BsonDocument, but we weren't able to figure out how the query should be configured.
Is there any workaround for this exception (without changing drivers)?
Give this a shot:
var cursor = collection.Aggregate<BsonDocument>(pipeline);

// either just get a list of documents and be done with it
var results = cursor.ToList();

// or iterate over the cursor instead (note: pick one of the two -
// ToList() consumes the cursor, so you can't do both)
while (cursor.MoveNext())
{
    foreach (var doc in cursor.Current)
    {
        // access your documents here
    }
}
You have an extra brace at the end of the query string.
I was finally able to fix it by building the command myself:
var cmd = new CommandDocument()
{
{"aggregate", "collection_name" },
{"pipeline", arr},
{"cursor", BsonDocument.Parse("{}") }
};
var res = db.RunCommand(cmd);
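With the cursor option present, the documents come back under cursor.firstBatch in the command reply. A sketch of reading them (assuming the legacy driver, where RunCommand returns a CommandResult whose Response is a BsonDocument):

```csharp
var res = db.RunCommand(cmd);

// The first batch of result documents sits under cursor.firstBatch.
// (A very large result set would need follow-up getMore commands;
// this sketch only reads the first batch.)
foreach (var doc in res.Response["cursor"]["firstBatch"].AsBsonArray)
{
    Console.WriteLine(doc.AsBsonDocument);
}
```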
This is what worked in my situation (MongoDB C# driver v1.9.0-rc0, MongoDB server 4.4.0): setting OutputMode = AggregateOutputMode.Cursor in the AggregateArgs.
public IEnumerable<BsonDocument> Run(MongoCollection<Item> items)
{
var priceRange = new BsonDocument(
"$subtract",
new BsonArray
{
"$Price",
new BsonDocument(
"$mod",
new BsonArray{"$Price", 100})
});
var grouping = new BsonDocument(
"$group",
new BsonDocument
{
{"_id", priceRange},
{"count", new BsonDocument("$sum", 1)}
});
var sort = new BsonDocument(
"$sort",
new BsonDocument("_id", 1)
);
var args = new AggregateArgs
{
Pipeline = new[] { grouping, sort },
OutputMode = AggregateOutputMode.Cursor,
};
return items.Aggregate(args);
}

Looping through MongoDB collections and joining them in C#

I have a collection in MongoDB whose document holds the names of the collections I need to work with. I need to query this collection, get all the collection names from the document inside it, then query those collections and join them based on ParentId references. The following is the collection which stores the names of the other collections:
db.AllInfoCollection.find()
{
"_id" : ObjectId("5b83b982a5e17c383c8424f3"),
"CollName" : "Collection1",
},
{
"_id" : ObjectId("5b83b9aaa5e17c383c8424f7"),
"CollName" : "Collection2",
},
{
"_id" : ObjectId("5b83b9afa5e17c383c8424f8"),
"CollName" : "Collection3",
},
{
"_id" : ObjectId("5b83b9b5a5e17c383c8424f9"),
"CollName" : "Collection4",
},
{
"_id" : ObjectId("5b83b9b9a5e17c383c8424fa"),
"CollName" : "Collection5",
},
{
"_id" : ObjectId("5b84f41bc5eb3f1f7c291f94"),
"CollName" : "Collection6",
}
All the above collections (Collection1, Collection2, ..., Collection6) are created at run time with empty documents. They are connected to each other through Id and ParentId fields.
Now I need to query this AllInfoCollection, get the collection names, join them, and generate the final joined ($lookup) output. I am able to query and get the collection list, but I am not sure how to add the lookup projection inside the for loop. Any help would be appreciated.
public void RetrieveDynamicCollection()
{
    IMongoDatabase _db = client.GetDatabase("MyDb");
    var collectionList = _db.GetCollection<AllInfoCollection>("AllInfoCollection")
        .AsQueryable()
        .Distinct()
        .Select(x => x.CollectionName)
        .ToList();
    for (int i = 0; i < collectionList.Count; i++)
    {
        var collectionName = collectionList[i];
        IMongoCollection<BsonDocument> collection = _db.GetCollection<BsonDocument>(collectionName);
        var options = new AggregateOptions()
        {
            AllowDiskUse = false
        };
        // not able to proceed here
    }
}
Finally I was able to retrieve the collections dynamically with all the required joins ($lookup aggregation) as below; hope it helps someone:
public async Task<string> RetrieveDynamicCollection()
{
    try
    {
        IMongoDatabase _db = client.GetDatabase("MyDB");
        var list = _db.GetCollection<HazopCollectionInfo>("AllCollectionInfo").AsQueryable().ToList();
        var collectionList = list.OrderBy(x => x.CollectionOrder).Select(x => x.CollectionName).Distinct().ToList();
        var listOfJoinDocuments = new List<BsonDocument>();
        var firstCollection = _db.GetCollection<BsonDocument>(collectionList[0]);
        var options = new AggregateOptions()
        {
            AllowDiskUse = false
        };
        var previousCollectionName = "";
        for (int i = 0; i < collectionList.Count; i++)
        {
            var collectionName = collectionList[i];
            IMongoCollection<BsonDocument> collection = _db.GetCollection<BsonDocument>(collectionName);
            if (i == 0)
            {
                firstCollection = collection;
                var firstarray = new BsonDocument("$project", new BsonDocument()
                    .Add("_id", 0)
                    .Add(collectionName, "$$ROOT"));
                listOfJoinDocuments.Add(firstarray);
            }
            else
            {
                var remainingArray = new BsonDocument("$lookup", new BsonDocument()
                    .Add("localField", previousCollectionName + "." + "Id")
                    .Add("from", collectionName)
                    .Add("foreignField", "ParentId")
                    .Add("as", collectionName));
                listOfJoinDocuments.Add(remainingArray);
                remainingArray = new BsonDocument("$unwind", new BsonDocument()
                    .Add("path", "$" + collectionName)
                    .Add("preserveNullAndEmptyArrays", new BsonBoolean(true)));
                listOfJoinDocuments.Add(remainingArray);
            }
            previousCollectionName = collectionName;
        }

        // Project the columns (OrderBy returns a new sequence, so reassign it)
        list = list.OrderBy(x => x.ColumnOrder).ToList();
        var docProjection = new BsonDocument();
        for (int i = 0; i < list.Count; i++)
        {
            docProjection.Add(list[i].ColumnName, "$" + list[i].CollectionName + "." + list[i].FieldName);
        }
        listOfJoinDocuments.Add(new BsonDocument("$project", docProjection));

        PipelineDefinition<BsonDocument, BsonDocument> pipeline = listOfJoinDocuments;
        var listOfDocs = new List<BsonDocument>();
        using (var cursor = await firstCollection.AggregateAsync(pipeline, options))
        {
            while (await cursor.MoveNextAsync())
            {
                var batch = cursor.Current;
                foreach (BsonDocument document in batch)
                {
                    listOfDocs.Add(document);
                }
            }
        }
        var jsonString = listOfDocs.ToJson(new MongoDB.Bson.IO.JsonWriterSettings { OutputMode = MongoDB.Bson.IO.JsonOutputMode.Strict });
        return jsonString;
    }
    catch (Exception)
    {
        // rethrow without resetting the stack trace
        throw;
    }
}

Retrieve only internal _id with NEST ElasticClient

I'm trying to execute a search with the NEST ElasticClient and get only the _id of the hits.
Here is my Code:
var client = new ElasticClient();
var searchResponse = client.Search<ElasticResult>(new SearchRequest
{
    From = this.query.Page * 100,
    Size = 100,
    Source = new SourceFilter
    {
        Includes = "_id"
    },
    Query = new QueryStringQuery
    {
        Query = this.query.Querystring
    }
});

public class ElasticResult
{
    public string _id;
}
But the _id of the documents (the ElasticResult objects) is always null. What am I doing wrong?
The _id is not part of the _source document, but part of the hit metadata for each hit in the hits array.
The most compact way to return just the _id fields is to use response filtering, which is exposed as FilterPath in NEST:
private static void Main()
{
    var defaultIndex = "documents";
    var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
    var settings = new ConnectionSettings(pool)
        .DefaultIndex(defaultIndex)
        .DefaultTypeName("_doc");
    var client = new ElasticClient(settings);

    if (client.IndexExists(defaultIndex).Exists)
        client.DeleteIndex(defaultIndex);

    client.Bulk(b => b
        .IndexMany<object>(new[]
        {
            new { Message = "hello" },
            new { Message = "world" }
        })
        .Refresh(Refresh.WaitFor)
    );

    var searchResponse = client.Search<object>(new SearchRequest<object>
    {
        From = 0 * 100,
        Size = 100,
        FilterPath = new[] { "hits.hits._id" },
        Query = new QueryStringQuery
        {
            Query = ""
        }
    });

    foreach (var id in searchResponse.Hits.Select(h => h.Id))
    {
        // do something with the ids
        Console.WriteLine(id);
    }
}
The JSON response from Elasticsearch to the search request looks like
{
"hits" : {
"hits" : [
{
"_id" : "6gs8lmQB_8sm1yFaJDlq"
},
{
"_id" : "6Qs8lmQB_8sm1yFaJDlq"
}
]
}
}

MongoDb c# 2.0 driver AddToSet method

I have the following code, which was implemented with the MongoDB C# 2.0 driver. But I need access to the MailLists collection of each Profile that will be inserted. I've written the expected solution using p in the constructor, but how can I implement it when the update targets multiple documents?
IMongoCollection<Profile> dbCollection = DetermineCollectionName<Profile>();
var filter = Builders<Profile>.Filter.In(x => x.ID, profiles.Select(x => x.ID));
var updateMl = Builders<Profile>.Update.AddToSet(p => p.MailLists, new Profile2MailList
{
    MailListId = maillistId,
    Status = p.MailLists.MergeMailListStatuses(), // 'p' is not in scope here - this is the problem
    SubscriptionDate = DateTime.UtcNow
});
dbCollection.UpdateManyAsync(filter, updateMl, new UpdateOptions { IsUpsert = true });
I found the following solution:
IMongoCollection<Profile> dbCollection = DetermineCollectionName<Profile>();
var filter = Builders<Profile>.Filter.In(x => x.ID, profiles.Select(x => x.ID));
var profile2maillists = new List<Profile2MailList>();
foreach(var profile in profiles)
{
profile2maillists.Add(
new Profile2MailList
{
MailListId = maillistId,
Status = profile.MailLists.MergeMailListStatuses(),
SubscriptionDate = DateTime.UtcNow
});
}
var updateMl = Builders<Profile>.Update.AddToSetEach(p => p.MailLists, profile2maillists);
dbCollection.UpdateManyAsync(filter, updateMl, new UpdateOptions { IsUpsert = true });
