Creating a MongoDB map-reduce from an aggregation pipeline in .NET (C#)

I need help converting an aggregation pipeline to map-reduce. I know that map-reduce is deprecated, but I need it anyway. I am using MongoDB 4.4 and MongoDB.Driver 2.11.0, so it should still be available. I have the following aggregation pipeline:
PipelineDefinition<Airplane, BsonDocument> pipeline = new BsonDocument[]
{
    new BsonDocument("$unwind",
        new BsonDocument("path", "$Tickets")),
    new BsonDocument("$group",
        new BsonDocument
        {
            { "_id", "$Tickets.Price" },
            { "Count", new BsonDocument("$sum", 1) }
        })
};
var results = await _airplanesCollection.Aggregate(pipeline).ToListAsync();
var pricesCounts = new Dictionary<int, int>();
foreach (var result in results)
{
    pricesCounts.Add(result.AsBsonDocument[0].ToInt32(), result.AsBsonDocument[1].ToInt32());
}
return pricesCounts;
It works exactly as I want. However, when I tried several ways to write an equivalent map-reduce, it either threw exceptions or returned no results:
/*string map = @"
    function() {
        var ticket = this;
        emit(ticket.Price, { count: 1 });
    }";
string reduce = @"
    function(key, values) {
        var result = { count: 0 };
        values.forEach(function(value) {
            result.count += value.count;
        });
        return result;
    }";*/
string map = @"
    function() {
        for (var i = 0; i < this.Tickets.lenght; i++) {
            emit(this.Tickets[i].Price, { count: 1 });
        }
    }";
string reduce = @"
    function(key, values) {
        reducedVal = { count: 0 };
        for (var idx = 0; idx < values.length; idx++) {
            reducedVal.count += values[idx].count;
        }
        return reducedVal;
    }";
var options = new MapReduceOptions<Airplane, KeyValuePair<int, int>>();
options.OutputOptions = MapReduceOutputOptions.Inline;
var results = _airplanesCollection.MapReduceAsync(map, reduce, options).Result.ToList();
var pricesCounts = new Dictionary<int, int>();
foreach (var result in results)
{
    pricesCounts.Add(result.Key, result.Value);
}
return pricesCounts;
The commented-out version throws an exception, and the non-commented-out version returns an empty list.
The result of the aggregation pipeline is exactly the output I want from the map-reduce.
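For completeness, here is a minimal, untested sketch of how the inline output could be read back as plain BsonDocument results instead of KeyValuePair<int, int>. Inline map-reduce results come back shaped like { "_id": <key>, "value": { "count": <n> } }, so a strongly typed KeyValuePair may not bind to them; apart from the collection and the map/reduce strings, everything below is an assumption rather than working code.
// Minimal sketch (assumption): read the inline map-reduce output as raw
// BsonDocuments shaped like { "_id", "value" } and unpack them by hand.
var mrOptions = new MapReduceOptions<Airplane, BsonDocument>
{
    OutputOptions = MapReduceOutputOptions.Inline
};

var cursor = await _airplanesCollection.MapReduceAsync<BsonDocument>(
    new BsonJavaScript(map), new BsonJavaScript(reduce), mrOptions);
var docs = await cursor.ToListAsync();

var pricesCounts = new Dictionary<int, int>();
foreach (var doc in docs)
{
    // "_id" holds the emitted key (the price); "value.count" holds the reduced count.
    pricesCounts.Add(doc["_id"].ToInt32(), doc["value"]["count"].ToInt32());
}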
This is how I expose it in the controller (via Swagger):
[ProducesResponseType(typeof(List<string>), 200)]
[ProducesResponseType(typeof(BadRequestObjectResult), 400)]
[HttpGet("groupTickets")]
public async Task<List<string>> GroupTickets()
{
    var groupedTickets = await _mongoService.GroupTickets();
    var result = new List<string>();
    foreach (var item in groupedTickets)
    {
        result.Add($"Price: {item.Key} Bought tickets: {item.Value}");
    }
    return result.Any() ? result : new List<string>() { "No tickets were bought yet." };
}
This is what my Airplane and Ticket models look like:
public class Airplane
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; }
    public string Name { get; set; }
    public int Capacity { get; set; }
    public DateTime? FlightTime { get; set; }
    public string Destination { get; set; }
    public IEnumerable<Ticket> Tickets { get; set; }

    public Airplane()
    {
        Tickets = new List<Ticket>();
    }
}

public class Ticket
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; }
    public int TicketClass { get; set; }
    public int Price { get; set; }
    public string Destination { get; set; }
    public string PassengerId { get; set; }

    public Ticket() { Id = ObjectId.GenerateNewId().ToString(); }
}
Thank you for any help in advance.

Related

In C#, How can I use MongoDb's Aggregation/Lookup to make this join more efficient?

We have a microservice which caches Salesforce data to mitigate Salesforce limits. While this cache doesn't even attempt to mimic the nuanced/complicated data-security policies of Salesforce, we do have some logic to secure fields using roles.
We wish to join records from multiple SObjects (tables), each of which is stored in a separate collection once it is in MongoDb. I have in fact implemented a solution for this, but (not surprisingly) the performance leaves much to be desired, since the join takes place in C# instead of MongoDb.
The relevant part of the solution looks like this:
public async Task<(List<JObject> records, int totalCount)> FindRecords(GenericQueryParameters parameters, CancellationToken cancellationToken)
{
(List<BsonDocument> resultRecords, int count) = await FindMongoRecords(parameters, cancellationToken);
List<BsonDocument> joinedResults = await Join(resultRecords, parameters.ExtractedJoinCriteria, cancellationToken);
List<JObject> jObjectRecords = resultRecords.Select(ToJObject)
.ToList();
return (jObjectRecords, (int)count);
}
private async Task<(List<BsonDocument> records, int totalCount)> FindMongoRecords(GenericQueryParameters parameters, CancellationToken cancellationToken)
{
ISObjectConfigurable config = await _sObjectConfigurationManager.Get(parameters.SObjectName);
IMongoCollection<BsonDocument> collection = GetReadCollection(config);
FilterDefinition<BsonDocument> filters = BuildFilters(config, parameters);
IFindFluent<BsonDocument, BsonDocument> filteredCollection = collection.Find(filters);
IFindFluent<BsonDocument, BsonDocument> sortedCollection = (parameters.SortTransition == Transitions.Ascending)
? filteredCollection.SortBy(x => x[parameters.SortField])
: filteredCollection.SortByDescending(x => x[parameters.SortField]);
List<BsonDocument> resultRecords = await sortedCollection
.Skip((parameters.PageNumber - 1) * parameters.PageSize)
.Limit(parameters.PageSize)
.Project(GetReadableFields(config))
.ToListAsync(cancellationToken);
long count = await collection.CountDocumentsAsync(filters, cancellationToken: cancellationToken);
return (resultRecords, (int)count);
}
private async Task<List<BsonDocument>> Join(List<BsonDocument> resultRecords, List<JoinCriteria> joinCriteria, CancellationToken cancellationToken)
{
foreach (JoinCriteria joinCriterium in joinCriteria)
{
HashSet<string> targets = resultRecords.Select(x => x[joinCriterium.ParentFieldName])
.Select(x => $"\"{x}\"")
.ToHashSet();
GenericQueryParameters childQueryParameters = new()
{
SObjectName = joinCriterium.ChildSObjectName,
PageSize = 50,
Filters = new List<string> {
$"{joinCriterium.ChildFieldName} IN {{{string.Join(",", targets)}}}"
}
};
(List<BsonDocument> allChildRecords, int _) = await FindMongoRecords(childQueryParameters, cancellationToken);
Dictionary<BsonValue, List<BsonDocument>> childRecordsByChildField = allChildRecords
.GroupBy(x => x[joinCriterium.ChildFieldName], x => x)
.ToDictionary(group => group.Key, group => group.ToList());
foreach (BsonDocument resultRecord in resultRecords)
{
BsonValue parentFieldValue = resultRecord[joinCriterium.ParentFieldName];
resultRecord[joinCriterium.Collection] = childRecordsByChildField.TryGetValue(parentFieldValue, out List<BsonDocument> childRecords)
? ToBsonDocumentArray(childRecords)
: new BsonArray();
}
}
return resultRecords;
}
private static BsonArray ToBsonDocumentArray(List<BsonDocument> childRecords)
{
BsonArray array = new();
foreach (BsonDocument childRecord in childRecords)
{
_ = array.Add(childRecord);
}
return array;
}
private ProjectionDefinition<BsonDocument, BsonDocument> GetReadableFields(ISObjectConfigurable config)
{
ProjectionDefinitionBuilder<BsonDocument> projectionDefinitionBuilder = Builders<BsonDocument>.Projection;
IEnumerable<ProjectionDefinition<BsonDocument>> projectionDefinitions = _oAuthRoleValidator.IsAdmin()
? new List<ProjectionDefinition<BsonDocument>>()
: CreateSecureProjection(config, projectionDefinitionBuilder);
return projectionDefinitionBuilder.Combine(projectionDefinitions);
}
private IEnumerable<ProjectionDefinition<BsonDocument>> CreateSecureProjection(
ISObjectConfigurable config,
ProjectionDefinitionBuilder<BsonDocument> projectionDefinitionBuilder
)
{
IEnumerable<string> hiddenFields = config.SObjectFields.Where(field => !_oAuthRoleValidator.UserCanReadField(config.FieldConfigByName, field));
IEnumerable<string> hiddenShadows = hiddenFields.Select(x => "_" + x);
return hiddenFields.Concat(hiddenShadows)
.Select(field => projectionDefinitionBuilder.Exclude(field));
}
GenericQueryParameters looks like this:
public class GenericQueryParameters
{
private static readonly Regex GenericJoinRegex = new(@"(\w+)\s*:(\w+).(\w+)\s*==\s*(\w+)", RegexOptions.Compiled);
[JsonProperty(PropertyName = "sObjectName")]
public string SObjectName { get; set; }
[JsonProperty(PropertyName = "filters")]
public List<string> Filters { get; set; } = new List<string>();
[JsonProperty(PropertyName = "joins")]
public List<string> JoinCriteria { get; set; } = new List<string>();
[JsonIgnore]
[SuppressMessage("Style", "IDE1006:Naming Styles", Justification = "This is a hidden backing field.")]
private List<JoinCriteria> _extractedJoinCriteria { get; set; }
[JsonIgnore]
public List<JoinCriteria> ExtractedJoinCriteria
{
get
{
if (_extractedJoinCriteria == null)
{
_extractedJoinCriteria = JoinCriteria.Select(x => ExtractCriterium(x))
.ToList();
}
return _extractedJoinCriteria;
}
}
[JsonProperty(PropertyName = "pageNumber")]
public int PageNumber { get; set; } = 1;
private const int MaxPageSize = 50;
private int _pageSize = 10;
[JsonProperty(PropertyName = "pageSize")]
public int PageSize
{
get => _pageSize;
set => _pageSize = (value > MaxPageSize)
? MaxPageSize
: value;
}
[JsonProperty(PropertyName = "sortField")]
public string SortField { get; set; } = "_CreatedDate";
[JsonProperty(PropertyName = "sortTransition")]
public Transitions SortTransition { get; set; } = Transitions.Ascending;
[JsonProperty(PropertyName = "includeDeleted")]
public bool IncludeDeleted { get; set; } = false;
[JsonProperty(PropertyName = "syncNow")]
public bool SynchronizeFirst { get; set; } = false;
[JsonProperty(PropertyName = "transformationTemplateName")]
public string TransformationTemplateName { get; set; }
private static JoinCriteria ExtractCriterium(string joinCriteriumString)
{
Match match = GenericJoinRegex.Match(joinCriteriumString);
return match.Success
? new()
{
Collection = match.Groups[1].Value,
ChildSObjectName = match.Groups[2].Value,
ChildFieldName = match.Groups[3].Value,
ParentFieldName = match.Groups[4].Value
}
: throw new MalformedDatabaseJoinException($"The filter '{joinCriteriumString}' could not be parsed.");
}
and JoinCriteria looks like:
public class JoinCriteria
{
public string Collection { get; init; }
public string ChildSObjectName { get; init; }
public string ChildFieldName { get; init; }
public string ParentFieldName { get; init; }
}
As you can see, in the present solution each SObject/collection is queried separately, the data is then redacted to conform to the consumer's permissions, and finally the data is assembled for return to the consumer.
How can I refactor this solution to perform the join within MongoDb?
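For illustration, the kind of server-side pipeline I have in mind is something like the rough sketch below. It is not working code: the collection and field names are placeholders standing in for what a JoinCriteria carries, and the existing filter, sort, paging and role-based projection stages would need to be folded into the same pipeline.
// Hypothetical sketch only: a $lookup built from a JoinCriteria-like object,
// run against the IMongoCollection<BsonDocument> returned by GetReadCollection.
// "childCollectionName", "ParentField", "ChildField" and "Children" are placeholders.
PipelineDefinition<BsonDocument, BsonDocument> pipeline = new[]
{
    new BsonDocument("$match", new BsonDocument()),          // existing filters would go here
    new BsonDocument("$lookup", new BsonDocument
    {
        { "from", "childCollectionName" },                   // collection for joinCriterium.ChildSObjectName
        { "localField", "ParentField" },                     // joinCriterium.ParentFieldName
        { "foreignField", "ChildField" },                    // joinCriterium.ChildFieldName
        { "as", "Children" }                                 // joinCriterium.Collection
    }),
    new BsonDocument("$skip", 0),                            // paging
    new BsonDocument("$limit", 50),
    new BsonDocument("$project", new BsonDocument("hiddenField", 0)) // role-based field exclusion
};

List<BsonDocument> joined = await collection.Aggregate(pipeline).ToListAsync();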

Remove ObjectId from mongodb document serialized to JSON in C#

I'm having a bit of an issue here. I am getting all my products from a MongoDB collection with this function:
public async Task<string> getAllProducts()
{
List<string> all = new List<string>();
var document = await getCollection("produits").Find(new BsonDocument()).ToCursorAsync();
foreach (var doc in document.ToEnumerable())
{
var res = doc.ToJson();
all.Add(res);
}
return JsonConvert.SerializeObject(all);
}
and it returns a JSON that looks like this to my React front end:
{ "_id" : ObjectId("5e49bdf5f040e808847a17d7"),
  "email" : "example@gmail.com",
  "quantite" : 1,
  "matricule" : 1 }
The problem is I can't parse this in my JavaScript because of this: ObjectId("5e49bdf5f040e808847a17d7").
Of course I could do some string magic before I parse it, but I'd rather it be corrected on the server side. So is there a way I can get rid of this problem and get a result like this?
{ "_id" : "5e49bdf5f040e808847a17d7",
"email" : "example#gmail.com",
"quantite" : 1,
"matricule" : 1}
Give this a try. It will serialize string ids without the ObjectId wrapper.
public static async Task<string> getAllProducts()
{
var collection = db.GetCollection<object>("produits");
var all = new List<object>();
using (var cursor = await collection.FindAsync("{}"))
{
while (await cursor.MoveNextAsync())
{
foreach (var doc in cursor.Current.ToArray())
{
all.Add(doc);
}
}
}
return Newtonsoft.Json.JsonConvert.SerializeObject(all);
}
Fixed it by creating a class for the MongoDB object:
public class Product
{
[BsonId]
[BsonRepresentation(BsonType.ObjectId)]
public string Id { get; set; }
public int matricule { get; set; }
public int quantite { get; set; }
public string email { get; set; }
public float prix { get; set; }
public string image { get; set; }
}
Get them and deserialize with BsonSerializer:
public async Task<List<Product>> getAllProducts(){
var collection = await getCollection("produits").Find(new BsonDocument()).ToListAsync();
List<Product> all = new List<Product>();
foreach(var doc in collection){
all.Add(BsonSerializer.Deserialize<Product>(doc));
}
return all;
}
Return them on request:
[HttpGet]
public async Task<string> ShowProductsAsync()
{
MongodbModel model = new MongodbModel();
var products = await model.getAllProducts();
Console.WriteLine(products);
return JsonConvert.SerializeObject(products);
}
string GetAllProduits(){
var collection = database.GetCollection<BsonDocument>("produits");
var project = Builders<BsonDocument>.Projection.Exclude("_id");
var filter = Builders<BsonDocument>.Filter.Empty;
var rlt=collection.Find(filter).Project(project);
return rlt.ToList().ToJson();
}
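Another lightweight option, offered only as a sketch along the same lines as the answers above (it assumes the question's getCollection("produits") helper and keeps the BsonDocument approach): overwrite _id with its string form on each document before serializing, so the output contains a plain string instead of ObjectId(...).
// Sketch only: the method name is hypothetical, not from the question.
public async Task<string> getAllProductsAsPlainJson()
{
    var docs = await getCollection("produits").Find(new BsonDocument()).ToListAsync();
    foreach (var doc in docs)
    {
        // Replace the ObjectId value with its 24-character hex string,
        // so the serialized output contains "_id" : "5e49bd..." instead of ObjectId(...).
        doc["_id"] = doc["_id"].ToString();
    }
    return docs.ToJson();
}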

Map one class data to another class with iteration

I have a C# project and I'm looking for a simple solution to map one class's data to a list of objects of another class.
This is my input class
public class RatesInput
{
public string Type1 { get; set; }
public string Break1 { get; set; }
public string Basic1 { get; set; }
public string Rate1 { get; set; }
public string Type2 { get; set; }
public string Break2 { get; set; }
public string Basic2 { get; set; }
public string Rate2 { get; set; }
public string Type3 { get; set; }
public string Break3 { get; set; }
public string Basic3 { get; set; }
public string Rate3 { get; set; }
}
This is the structure of my other class:
public class RateDetail
{
public string RateType { get; set; }
public decimal Break { get; set; }
public decimal Basic { get; set; }
public decimal Rate { get; set; }
}
It has an object like the one below. (To make it easier to understand I use hardcoded values; the actual values are assigned from a CSV file.)
RatesInput objInput = new RatesInput();
objInput.Type1 = "T";
objInput.Break1 = "100";
objInput.Basic1 = "50";
objInput.Rate1 = "0.08";
objInput.Type2 = "T";
objInput.Break2 = "200";
objInput.Basic2 = "50";
objInput.Rate2 = "0.07";
objInput.Type3 = "T";
objInput.Break3 = "500";
objInput.Basic3 = "50";
objInput.Rate3 = "0.06";
Then I need to assign the values to a list of "RateDetail" objects like below.
List<RateDetail> lstDetails = new List<RateDetail>();
//START Looping using foreach or any looping mechanism
RateDetail obj = new RateDetail();
obj.RateType = //first iteration this should be assigned objInput.Type1, 2nd iteration objInput.Type2 etc....
obj.Break = //first iteration this should be assigned objInput.Break1 , 2nd iteration objInput.Break2 etc....
obj.Basic = //first iteration this should be assigned objInput.Basic1 , 2nd iteration objInput.Basic2 etc....
obj.Rate = //first iteration this should be assigned objInput.Rate1, 2nd iteration objInput.Rate2 etc....
lstDetails.Add(obj); //Add obj to the list
//END looping
Is there any way to convert the "RatesInput" class data into "RateDetail" objects like the method above in C#? If yes, how do I iterate over the data?
Try this:
public class RatesList : IEnumerable<RateDetail>
{
public RatesList(IEnumerable<RatesInput> ratesInputList)
{
RatesInputList = ratesInputList;
}
private readonly IEnumerable<RatesInput> RatesInputList;
public IEnumerator<RateDetail> GetEnumerator()
{
foreach (var ratesInput in RatesInputList)
{
yield return new RateDetail
{
RateType = ratesInput.Type1,
Break = Convert.ToDecimal(ratesInput.Break1, new CultureInfo("en-US")),
Basic = Convert.ToDecimal(ratesInput.Basic1, new CultureInfo("en-US")),
Rate = Convert.ToDecimal(ratesInput.Rate1, new CultureInfo("en-US"))
};
yield return new RateDetail
{
RateType = ratesInput.Type2,
Break = Convert.ToDecimal(ratesInput.Break2, new CultureInfo("en-US")),
Basic = Convert.ToDecimal(ratesInput.Basic2, new CultureInfo("en-US")),
Rate = Convert.ToDecimal(ratesInput.Rate2, new CultureInfo("en-US"))
};
yield return new RateDetail
{
RateType = ratesInput.Type3,
Break = Convert.ToDecimal(ratesInput.Break3, new CultureInfo("en-US")),
Basic = Convert.ToDecimal(ratesInput.Basic3, new CultureInfo("en-US")),
Rate = Convert.ToDecimal(ratesInput.Rate3, new CultureInfo("en-US"))
};
}
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
And use:
var list = new RatesList(new List<RatesInput>() { objInput });
foreach (var item in list)
{
Console.WriteLine(item.Basic);
}
You can use reflection to get the property info like this:
var props = objInput.GetType().GetProperties();
var types = props.Where(x => x.Name.StartsWith("Type"))
.Select(x => x.GetValue(objInput)).ToList();
var breaks = props.Where(x => x.Name.StartsWith("Break"))
.Select(x => x.GetValue(objInput)).ToList();
var basics = props.Where(x => x.Name.StartsWith("Basic"))
.Select(x => x.GetValue(objInput)).ToList();
var rates = props.Where(x => x.Name.StartsWith("Rate"))
.Select(x => x.GetValue(objInput)).ToList();
List<RateDetail> lstDetails = new List<RateDetail>();
for (int i = 0; i < types.Count; i++)
{
lstDetails.Add(new RateDetail
{
RateType = types[i].ToString(),
Break = Convert.ToDecimal(breaks[i]),
Basic = Convert.ToDecimal(basics[i]),
Rate = Convert.ToDecimal(rates[i])
});
}
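If reflection feels heavier than needed for three fixed groups, a plainer alternative (a sketch, not taken from the answers above) is to map each numbered group explicitly through a small helper, pinning the culture so decimal parsing does not depend on the machine's locale.
// Sketch: explicit mapping of the three numbered property groups, no reflection.
static RateDetail ToDetail(string type, string brk, string basic, string rate)
{
    var culture = new CultureInfo("en-US");
    return new RateDetail
    {
        RateType = type,
        Break = Convert.ToDecimal(brk, culture),
        Basic = Convert.ToDecimal(basic, culture),
        Rate = Convert.ToDecimal(rate, culture)
    };
}

var lstDetails = new List<RateDetail>
{
    ToDetail(objInput.Type1, objInput.Break1, objInput.Basic1, objInput.Rate1),
    ToDetail(objInput.Type2, objInput.Break2, objInput.Basic2, objInput.Rate2),
    ToDetail(objInput.Type3, objInput.Break3, objInput.Basic3, objInput.Rate3)
};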

Get correct JSON output from a C# class

Please look at the JSON output I want (produced with JavaScriptSerializer), then at the helper classes I created using json2csharp.com. The problem is in the controller: I am not getting the correct output as per my required JSON. Am I doing this correctly in the controller? Where could the problem be? Please ask if you want to know something specific; sorry, it's hard to describe more clearly.
Helper Class Code:
public class ItemsFromFile
{
public string ASIN { get; set; }
public string Title { get; set; }
public List<Product> products { get; set; }
}
public class ItemsDeatails
{
public List<ItemsFromFile> ItemsFromFile { get; set; }
}
public class File
{
public string nameLocator { get; set; }
public ItemsDeatails itemsDeatails { get; set; }
}
public class RootObject
{
public string Token { get; set; }
public File file { get; set; }
}
Controller code:
if (type == "Salefreaks")
{
var token = ctx.BulkStores.FirstOrDefault(x => x.StoreName == store && x.Type == 1).Token;
var ItemsFromFile = new ItemsFromFile()
{
products = new List<Product>()
};
var ItemsDeatails = new ItemsDeatails()
{
};
var File = new File()
{
nameLocator = "testimport1"
};
var RootObject = new RootObject()
{
Token = token
};
var singleItems = ctx.BulkScannedItems.Where(x => x.UserSellerScanRequestId == id).ToList();
foreach (var item in singleItems)
{
ItemsFromFile.products.Add(new Product { ASIN = item.ASIN, Title = item.EbayTitle });
}
var json = new JavaScriptSerializer().Serialize(RootObject);
}
Required JSON output:
{
"Token": "7f3099b0-36b1",
"file": {
"nameLocator": "testimport1",
"itemsDeatails": {
"ItemsFromFile": [
{
"ASIN": "B011KVFT9Y",
"Title": "Disney Batman Durable Party Beach Outdoor Adventure Camp Chair w/ Storage Bag"
},
{
"ASIN": "B01D4KRBW2",
"Title": "High Quality Diy Oil Painting Paint Number Kit Theme-Romantic Street A Frameless"
}
]
}
}
}
You can initialize the internal objects in the constructors as well:
public class RootObject
{
public string Token { get; set; }
public File file { get; set; }
}
public class File
{
public File()
{
this.itemsDeatails = new ItemsDeatails();
}
public string nameLocator { get; set; }
public ItemsDeatails itemsDeatails { get; set; }
}
public class ItemsDeatails
{
public ItemsDeatails(){
this.ItemsFromFile = new List<ItemsFromFile>();
}
public List<ItemsFromFile> ItemsFromFile { get; set; }
}
public class ItemsFromFile
{
public ItemsFromFile(){
this.products = new List<Product>();
}
public List<Product> products { get; set; }
}
public class Product {
public string ASIN { get; set; }
public string Title { get; set; }
}
Initialize your items properly and build the RootObject from the ground up: populate the inner classes first, then the outer ones.
var itemDetails = new ItemsDeatails();
itemDetails.ItemsFromFile = new List<ItemsFromFile>();
var itemsFromFile = new ItemsFromFile();
itemsFromFile.products = new List<Product>();
var singleItems = ctx.BulkScannedItems.Where(x => x.UserSellerScanRequestId == id).ToList();
foreach (var item in singleItems)
{
itemsFromFile.products.Add(new Product { ASIN = item.ASIN, Title = item.EbayTitle });
}
itemDetails.ItemsFromFile.Add(itemsFromFile);
var fl = new File()
{
nameLocator = "testimport1",
itemsDeatails = itemDetails
};
var token = ctx.BulkStores.FirstOrDefault(x => x.StoreName == store && x.Type == 1).Token;
var root = new RootObject()
{
Token = token,
file = fl
};
var json = new JavaScriptSerializer().Serialize(root);
Ensure that all your objects are assigned appropriately.
var token = ctx.BulkStores.FirstOrDefault(x => x.StoreName == store && x.Type == 1).Token;
var RootObject = new RootObject() {
Token = token,
file = new File() {
nameLocator = "testimport1",
itemsDeatails = new ItemsDeatails() {
ItemsFromFile = new List<ItemsFromFile>()
}
}
};
var itemsFromFile = new ItemsFromFile();
itemsFromFile.products = new List<Product>();
var singleItems = ctx.BulkScannedItems.Where(x => x.UserSellerScanRequestId == id).ToList();
foreach (var item in singleItems) {
itemsFromFile.products.Add(new Product { ASIN = item.ASIN, Title = item.EbayTitle });
}
RootObject.file.itemsDeatails.ItemsFromFile.Add(itemsFromFile);
var json = new JavaScriptSerializer().Serialize(RootObject);
That being said, it appears that you do not need the list of products inside of the ItemsFromFile class. This definition likely makes more sense:
public class ItemsFromFile {
public string ASIN { get; set; }
public string Title { get; set; }
}
Then your code would be something like this:
var token = ctx.BulkStores.FirstOrDefault(x => x.StoreName == store && x.Type == 1).Token;
var RootObject = new RootObject() {
Token = token,
file = new File() {
nameLocator = "testimport1",
itemsDeatails = new ItemsDeatails() {
ItemsFromFile = new List<ItemsFromFile>()
}
}
};
var singleItems = ctx.BulkScannedItems.Where(x => x.UserSellerScanRequestId == id).ToList();
foreach (var item in singleItems) {
RootObject.file.itemsDeatails.ItemsFromFile.Add(new ItemsFromFile { ASIN = item.ASIN, Title = item.EbayTitle });
}
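With the simplified ItemsFromFile definition above, serializing the assembled root should then produce the nested Token / file / itemsDeatails / ItemsFromFile shape shown in the required output. As a small sketch, either serializer works; JavaScriptSerializer is what the question already uses, and Json.NET appears elsewhere in this thread.
// Sketch: either call should yield the required nested JSON shape.
var json = new JavaScriptSerializer().Serialize(RootObject);
// or, with Json.NET:
var prettyJson = Newtonsoft.Json.JsonConvert.SerializeObject(RootObject, Newtonsoft.Json.Formatting.Indented);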

C# Reactive Extensions (rx) FirstOrDefault enumerates entire collection

It seems that the expected behavior of FirstOrDefault is to complete after finding an item that matches the predicate, and the expected behavior of Concat is to evaluate lazily. However, the following example enumerates the entire collection even though the predicate matches the first item.
(Thanks for the friendlier code, Shlomo.)
void Main()
{
var entities = Observable.Defer(() => GetObservable().Concat());
Entity result = null;
var first = entities.FirstOrDefaultAsync(i => i.RowId == 1).Subscribe(i => result = i);
result.Dump();
buildCalled.Dump();
}
// Define other methods and classes here
public IEnumerable<IObservable<Entity>> GetObservable()
{
var rows = new List<EntityTableRow>
{
new EntityTableRow { Id = 1, StringVal = "One"},
new EntityTableRow { Id = 2, StringVal = "Two"},
};
return rows.Select(i => Observable.Return(BuildEntity(i)));
}
public int buildCalled = 0;
public Entity BuildEntity(EntityTableRow entityRow)
{
buildCalled++;
return new Entity { RowId = entityRow.Id, StringVal = entityRow.StringVal };
}
public class Entity
{
public int RowId { get; set; }
public string StringVal { get; set; }
}
public class EntityTableRow
{
public int Id { get; set; }
public string StringVal { get; set; }
}
Is this the expected behavior? Is there a way to defer the enumeration of the objects (specifically the building in this case) until truly needed?
The following is Linqpad-friendly code equivalent to what you have:
void Main()
{
var entities = Observable.Defer(() => GetObservable().Concat());
Entity result = null;
var first = entities.FirstOrDefaultAsync(i => i.RowId == 1).Subscribe(i => result = i);
result.Dump();
buildCalled.Dump();
}
// Define other methods and classes here
public IEnumerable<IObservable<Entity>> GetObservable()
{
var rows = new List<EntityTableRow>
{
new EntityTableRow { Id = 1, StringVal = "One"},
new EntityTableRow { Id = 2, StringVal = "Two"},
};
return rows.Select(i => Observable.Return(BuildEntity(i)));
}
public int buildCalled = 0;
public Entity BuildEntity(EntityTableRow entityRow)
{
buildCalled++;
return new Entity { RowId = entityRow.Id, StringVal = entityRow.StringVal };
}
public class Entity
{
public int RowId { get; set; }
public string StringVal { get; set; }
}
public class EntityTableRow
{
public int Id { get; set; }
public string StringVal { get; set; }
}
If you change GetObservable to the following, you'll get the desired result:
public IObservable<IObservable<Entity>> GetObservable()
{
var rows = new List<EntityTableRow>
{
new EntityTableRow { Id = 1, StringVal = "One"},
new EntityTableRow { Id = 2, StringVal = "Two"},
};
return rows.ToObservable().Select(i => Observable.Return(BuildEntity(i)));
}
It appears the implementation of Concat<TSource>(IEnumerable<IObservable<TSource>>) is eager in evaluating the enumerable, whereas the implementation of Concat<TSource>(IObservable<IObservable<TSource>>) and ToObservable<TSource>(IEnumerable<TSource>) maintain laziness appropriately. I can't say I know why.
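One further workaround sketch (my own suggestion, not part of the answer above): keep the IEnumerable<IObservable<Entity>> shape but move the expensive BuildEntity call inside Observable.Defer, so that even if Concat walks the whole enumerable eagerly, BuildEntity only runs when an inner observable is actually subscribed to.
// Sketch: defer BuildEntity into each inner observable so eager enumeration
// of the outer IEnumerable no longer forces every entity to be built.
public IEnumerable<IObservable<Entity>> GetObservable()
{
    var rows = new List<EntityTableRow>
    {
        new EntityTableRow { Id = 1, StringVal = "One" },
        new EntityTableRow { Id = 2, StringVal = "Two" },
    };
    return rows.Select(i => Observable.Defer(() => Observable.Return(BuildEntity(i))));
}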
