Calculations on grouped rows using lambda functions in C#

I am trying to calculate Best and Worst body measurement changes for people who go on a fitness trip.
I have a database full of before and after body composition measurements for several people who go on various trips. Every participant and every trip has an Id. There are 3 types of readings, B(efore), M(iddle) and A(fter). Here is an example of the data:
ParticipantId  TripId  Type  Weight  BodyFatPct
1              2       B     195     22.8
1              2       B     189.6   24.1
1              2       A     186.6   21.2
1              2       A     187.6   23.8
2              3       B     199.2   23.7
2              3       B     198.4   25.1
2              3       A     193     22.4
Here is the class I'm using to represent the data:
public partial class Detail
{
    public int ParticipantId { get; set; }
    public int TripId { get; set; }
    public string Type { get; set; }
    public double? Weight { get; set; }
    public double? BodyFatPct { get; set; }
}
Here is my highly inefficient C# code to calculate best and worst for weight and body fat.
List<Detail> result = new List<Detail>();  // populated from the database
var _result = result.GroupBy(x => new { x.ParticipantId, x.TripId });
foreach (var res in _result)
{
    var beforeHighWeight = res.Where(x => x.Type == "B").Max(x => x.Weight);
    var beforeLowWeight = res.Where(x => x.Type == "B").Min(x => x.Weight);
    var afterWeight = res.Where(x => x.Type == "A").Min(x => x.Weight);
    var beforeHighFat = res.Where(x => x.Type == "B").Max(x => x.BodyFatPct);
    var beforeLowFat = res.Where(x => x.Type == "B").Min(x => x.BodyFatPct);
    var afterFat = res.Where(x => x.Type == "A").Min(x => x.BodyFatPct);
    var BestWeightDiff = beforeHighWeight - afterWeight;
    var WorstWeightDiff = beforeLowWeight - afterWeight;
    var BestFatDiff = beforeHighFat - afterFat;
    var WorstFatDiff = beforeLowFat - afterFat;
}
In actuality, I have about 15 fields to calculate, not just two. Is there a lambda function that does row-wise calculations on grouped data? Any help appreciated.

Performance optimizations normally come at the cost of less maintainable code. So if performance is already close to sufficient, you should probably follow the hint Prasad Telkikar gave in the comments and avoid filtering res more than once with the same predicate, i.e. assign the filtered lists once and work with them:
// Materialize each filter once so the group is not re-enumerated for every aggregate.
var resA = res.Where(x => x.Type == "A").ToList();
var resB = res.Where(x => x.Type == "B").ToList();
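For example, the rest of the loop body can then compute every aggregate against those two cached lists (a minimal sketch reusing the names from the question; the other fields follow the same pattern):
var bestWeightDiff = resB.Max(x => x.Weight) - resA.Min(x => x.Weight);
var worstWeightDiff = resB.Min(x => x.Weight) - resA.Min(x => x.Weight);
var bestFatDiff = resB.Max(x => x.BodyFatPct) - resA.Min(x => x.BodyFatPct);
var worstFatDiff = resB.Min(x => x.BodyFatPct) - resA.Min(x => x.BodyFatPct);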
If performance is an issue and worth the extra work, you can enumerate the list only once, and perhaps even stream the values from the database, by using Aggregate. For example, create a class that holds all the values you want to calculate and has a method to update them given a new entry, plus a small wrapper that keeps one such statistic per (participant, trip) key:
public class Statistic
{
    // Nullable doubles to match Detail.Weight / Detail.BodyFatPct.
    public double? BeforeHighWeight { get; private set; }
    public double? BeforeLowWeight { get; private set; }
    // Add the dimensions you are interested in

    public Statistic(Detail detail)
    {
        // initialize with the given Detail
    }

    public Statistic AddDetail(Detail detail)
    {
        // update the Statistic with the given Detail
        return this;
    }
}
public class Statistics
{
    private readonly ConcurrentDictionary<(int, int), Statistic> _statistics = new ConcurrentDictionary<(int, int), Statistic>();

    public Statistics AddDetail(Detail detail)
    {
        _statistics.AddOrUpdate(
            (detail.ParticipantId, detail.TripId),
            key => new Statistic(detail),
            (key, statistic) => statistic.AddDetail(detail)
        );
        return this;
    }
}
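As a minimal sketch of the update logic (tracking only the before-weight extremes here, and assuming the constructor simply delegates to AddDetail; the other dimensions follow the same pattern):
public Statistic AddDetail(Detail detail)
{
    if (detail.Type == "B" && detail.Weight.HasValue)
    {
        // Track the highest and lowest "before" weight seen so far.
        if (BeforeHighWeight == null || detail.Weight > BeforeHighWeight)
            BeforeHighWeight = detail.Weight;
        if (BeforeLowWeight == null || detail.Weight < BeforeLowWeight)
            BeforeLowWeight = detail.Weight;
    }
    return this;
}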
Then you could aggregate your values like so:
var rand = new Random();
var result = Enumerable.Range(1, 1000000)
    .Select(i => new Detail { ParticipantId = i % 10000, TripId = rand.Next(100000) /*, ...*/ })
    .Aggregate(
        new Statistics(),
        (statistics, detail) => statistics.AddDetail(detail)
    );

Related

Atomic updates in Entity Framework

This is my simple example of incrementing a record's value in EF:
public class context : DbContext
{
    public DbSet<Result> Results { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder.UseSqlServer(@"Server=(localdb)\mssqllocaldb;Database=Test;Trusted_Connection=True;");
    }
}

public class Result
{
    public int Id { get; set; }
    public List<History> Histories { get; set; }
}

public class History
{
    public int Id { get; set; }
    public DateTime Date { get; set; }
    public int Value { get; set; }
}
public static int Get()
{
    using (var db = new context())
    {
        var entity = db.Results
            .Include(x => x.Histories)
            .First(x => x.Id == 1);
        return entity.Histories.Last().Value;
    }
}
public static void Increment(int newValue)
{
    using (var db = new context())
    {
        using (var transaction = db.Database.BeginTransaction())
        {
            var entity = db.Results
                .Include(x => x.Histories)
                .First(x => x.Id == 1);
            if (entity.Histories.Last().Value != newValue)
            {
                entity.Histories.Add(new History() { Value = newValue });
                db.SaveChanges();
                transaction.Commit();
            }
        }
    }
}
Next, I launch a simple console app several times:
while (true)
{
    var val = Service.Get();
    Service.Increment(val + 1);
}
What I expect is for Increment to be atomic, which means there should be no History records with the same value (because of the if (entity.Histories.Last().Value != newValue) check). Unfortunately, when I run this SQL:
SELECT [value], COUNT(value) AS amount
FROM history
WHERE ResultId = 1
GROUP BY [value]
ORDER BY COUNT(value) DESC
I can see that, in fact, some values are duplicated:
value  amount
5      5
7      7
12     4
What's wrong here? How can I make Increment atomic?
First a crash course on projection. EF is an ORM that allows you to query data in a relational database and map it to entities (classes) or project pieces of information from the relevant tables, working out the necessary SQL for you.
So for instance instead of this:
public static int Get()
{
    using (var db = new context())
    {
        var entity = db.Results
            .Include(x => x.Histories)
            .First(x => x.Id == 1);
        return entity.Histories.Last().Value;
    }
}
It is much faster to do this:
public static int Get()
{
    using (var db = new context())
    {
        var lastHistoryValue = db.Results
            .Where(x => x.Id == 1)
            .SelectMany(x => x.Histories)
            .OrderByDescending(x => x.Date)
            .Select(x => x.Value)
            .FirstOrDefault();
        return lastHistoryValue;
    }
}
The first example loads a Result and all Histories for that Result into memory, then attempts to get the last history just to return its value. (It also has the bug that there is no ordering to determine which item counts as the "last" one.)
The second example builds a query that will return the latest Historical Value. This doesn't load any entities into memory. You will get back a value of "0" if no History record is found.
Now when it comes to incrementing, this is where we typically do want to load entities and their related entities. EF wraps SaveChanges in a transaction by default, so there is no need to complicate things by introducing explicit ones:
public static void Increment(int newValue)
{
    using (var db = new context())
    {
        var entity = db.Results
            .Include(x => x.Histories)
            .Single(x => x.Id == 1);
        var latestHistory = entity.Histories
            .OrderByDescending(x => x.Date)
            .FirstOrDefault();
        if (latestHistory == null || latestHistory.Value != newValue)
        {
            entity.Histories.Add(new History() { Value = newValue });
            db.SaveChanges();
        }
    }
}
This also doesn't mean your check is particularly strict, since there is nothing stopping a value from being repeated later. For instance:
Increment(4);
Increment(5);
Increment(4);
... would be perfectly valid. When the first call happens, provided the latest value wasn't already 4, a history row would be added with a value of 4. Then a row would be added for 5. Then a row could be added for 4 again, since the latest history at that point would be 5, so you can still end up with duplicates.

LINQ Group By Year to Form a Tree View

I am trying to create a tree view that would essentially break down like so:
- Year
- Month
- Related Item
So we might have the Year 2022, that has several related items within the several months.
I have created the following model:
public class TreeYear
{
    public string NodeYear { get; set; }
    public DateTime CreatedDateTime { get; set; }
    public List<TreeMonth> Months { get; set; }
}

public class TreeMonth
{
    public int MonthID { get; set; }
    public string MonthName { get; set; }
    public quoteSummary QuoteSummary { get; set; }
}
I have written some code in my controller which currently returns every item like so:
var allQuotes = QuoteSummary.ToList();
var tree = new TreeYear();
foreach (var quote in allQuotes)
{
    tree.NodeYear = quote.CreatedTime.Year.ToString();
    tree.CreatedDateTime = quote.CreatedTime;
    tree.Months = new List<TreeMonth>()
    {
        new TreeMonth() {
            MonthID = quote.CreatedTime.Month,
            MonthName = getAbbreviatedName(quote.CreatedTime.Month),
            QuoteSummary = quote
        }
    };
}
But as you can see, this just runs through all 41 records without grouping any of them by year.
I thought maybe I could write some LINQ along the following lines, but at the moment it is incorrect:
var groups = TheResponse.Details
.GroupBy(
d => Int32.Parse(d.NodeYear),
(key, g) => g.GroupBy(
d => d.Months.Select(x => x.MonthID)),
(key2, g2) => g2.GroupBy(d => d.CreatedDateTime)
)
);
Or would I need to change the model for this idea to work?
If I understood your question correctly, then you need to flatten the inner list and then group by months again.
var groups = TheResponse.Details
    .GroupBy(d => Int32.Parse(d.NodeYear))
    .Select(d => new
    {
        Year = d.Key,
        MonthObj = d.SelectMany(m => m.Months)
            .GroupBy(m => m.MonthID)
            .Select(x => new
            {
                MonthID = x.Key,
                RelatedItem = x.ToList()
            })
    });
I have simplified it by using anonymous types, but you can obviously tweak it based on your respective model.
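If you would rather project straight into your TreeYear/TreeMonth models from the original quote list, a sketch along these lines could work; it assumes you change TreeMonth to hold a list of quotes (here called Quotes) instead of a single QuoteSummary, since one month can contain several related items:
// Assumed model change on TreeMonth: public List<quoteSummary> Quotes { get; set; }
var treeNodes = allQuotes
    .GroupBy(q => q.CreatedTime.Year)
    .Select(y => new TreeYear
    {
        NodeYear = y.Key.ToString(),
        CreatedDateTime = y.Min(q => q.CreatedTime),
        Months = y.GroupBy(q => q.CreatedTime.Month)
            .Select(m => new TreeMonth
            {
                MonthID = m.Key,
                MonthName = getAbbreviatedName(m.Key),
                Quotes = m.ToList()
            })
            .ToList()
    })
    .ToList();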

C# Dynamic Test for fields before aggregation functions like GroupBy

Assume the following code:
var citizens = await _stateProvider.SelectWhere(whereParams);
var retDto = new PercentGroupBy()
{
    Total = citizens.Count,
    Elements = citizens.GroupBy(p => p.Content.Current.AggState.ToString()).ToDictionary(g => g.Key, g => g.Count())
};
return retDto;
citizens is a list of the following class:
public class Citizen {
    public string ETag { get; set; }
    public string Name { get; set; }
    public dynamic Content { get; set; }
}
What is the best option to test whether the p.Content.Current.AggState property exists?
SelectWhere might return a few citizens where Content.Current is null, and therefore asking for AggState throws an error.
Oops, I found the answer while posting the question, so here it is to share the knowledge: add a fluent Where before the GroupBy:
var citizens = await _stateProvider.SelectWhere(whereParams);
var retDto = new PercentGroupBy()
{
    Total = citizens.Count,
    Elements = citizens
        .Where(p => p.Content.Current != null)
        .GroupBy(p => p.Content.Current.AggState.ToString())
        .ToDictionary(g => g.Key, g => g.Count())
};
return retDto;
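If you also need to guard against Current existing but having no AggState member at all, that depends on what the dynamic actually is. As a minimal sketch, assuming Content is backed by an ExpandoObject (an assumption; a JObject or other dynamic type would need a different check), the member can be tested through the dictionary interface instead of letting the runtime binder throw:
// Hypothetical helper; assumes the dynamic object is an ExpandoObject,
// which exposes its members via IDictionary<string, object> (System.Collections.Generic).
static bool HasAggState(object current)
{
    return current is IDictionary<string, object> members
        && members.ContainsKey("AggState");
}

// Usage in the query:
// .Where(p => p.Content.Current != null && HasAggState(p.Content.Current))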

Intersect list of list of object and list of object

I have a list of a Transactions class:
class Transactions
{
    public Transactions()
    {
        Products = new List<Product>();
    }

    public string Date { get; set; }
    public string TransactionID { get; set; }
    public List<Product> Products { get; set; }
}
And a Product Class:
class Product
{
    public decimal ProductCode { get; set; }
}
I have a list of Products like this:
List<Product> unioned = product.Union(secondProduct).ToList();
And I want the intersection of unioned with the transaction products.
This code does not work:
var intersection = transactions.Where(q => q.Products.Intersect(unioned).Any());
I think the reason is that the length of each transaction's products list varies while the unioned list's length is fixed.
How can I do it?
Intersect uses the default equality comparer, so it will do a reference check, i.e. compare whether the object references are the same.
You need to use the overload that allows you to specify an equality comparer.
Thus:
public class ProductComparer : IEqualityComparer<Product>
{
    public bool Equals(Product x, Product y)
    {
        // TODO - Add null handling.
        return x.ProductCode == y.ProductCode;
    }

    public int GetHashCode(Product obj)
    {
        return obj.ProductCode.GetHashCode();
    }
}
And then:
var intersection = transactions
.Where(q => q.Products.Intersect(unioned, new ProductComparer()).Any());
This test will now pass:
[TestMethod]
public void T()
{
    Product p = new Product { ProductCode = 10M };
    List<Product> product = new List<Product> { p };
    List<Product> secondProduct = new List<Product> { new Product { ProductCode = 20M } };
    List<Product> unioned = product.Union(secondProduct).ToList();

    var transaction = new Transactions();
    // add a different object reference
    transaction.Products.Add(new Product { ProductCode = 10M });
    IList<Transactions> transactions = new List<Transactions> { transaction };

    var intersection = transactions
        .Where(q => q.Products.Intersect(unioned, new ProductComparer()).Any());

    Assert.AreEqual(1, intersection.Count());
}
Try this solution without using Intersect. I use ProductCode to check whether the Product is the same:
transactions.Where(q => q.Products.Exists(x => unioned.Exists(z => z.ProductCode == x.ProductCode))).ToList();
You could do something like:
List<Product> allproductsTrans = new List<Product>();
// AddRange mutates the list; Concat alone would only build a query that is never used.
transactions.ForEach(p => allproductsTrans.AddRange(p.Products));
var result = allproductsTrans.Intersect(unioned);
but as Slava Utesinov said in the comments, the following code would do the same:
transactions.SelectMany(x => x.Products).Intersect(unioned)
Instead of using an equality comparer, you can intersect the ids and then, if needed, find the objects by id.
Your example:
transactions.Where(q => q.Products.Select(x => x.ProductCode).Intersect(unioned.Select(x => x.ProductCode)).Any());
General case:
var ids = list1.Select(x => x.Id).Intersect(list2.Select(x => x.Id));
var result = list1.Where(x => ids.Contains(x.Id));
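Note that ids here is a deferred query, so every Contains call in the Where re-runs the Intersect. For larger lists it is worth materializing the ids into a set first (a small sketch, assuming an int Id as in the general case above):
// Materialize the intersected ids once; HashSet makes Contains an O(1) lookup.
var idSet = new HashSet<int>(list1.Select(x => x.Id).Intersect(list2.Select(x => x.Id)));
var result = list1.Where(x => idSet.Contains(x.Id)).ToList();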

Linq Get totals using group by

I have the following class:
class Item
{
    public decimal TransactionValue { get; set; }
    public string TransactionType { get; set; }
}
And I have this list:
var items = new List<Item>
{
    new Item
    {
        TransactionValue = 10,
        TransactionType = "Income"
    },
    new Item
    {
        TransactionValue = 10,
        TransactionType = "Income"
    },
    new Item
    {
        TransactionValue = -5,
        TransactionType = "Outgoing"
    },
    new Item
    {
        TransactionValue = -20,
        TransactionType = "Outgoing"
    }
};
And I am trying to get the sums per TransactionType. I tried the code below, but it adds everything up and gives me a single total of -5. What I want is a total for each transaction type, i.e. the Totals class below populated with TotalIncoming: 20 and TotalOutgoing: -25.
var r = items.Sum(x => x.TransactionValue);
class Totals
{
    public decimal TotalIncoming { get; set; }
    public decimal TotalOutgoing { get; set; }
}
Thanks
You can achieve your desired result with the following query:
Totals result = new Totals
{
    TotalIncoming = items.Where(x => x.TransactionType == "Income")
                         .Sum(x => x.TransactionValue),
    TotalOutgoing = items.Where(x => x.TransactionType == "Outgoing")
                         .Sum(x => x.TransactionValue)
};
But, as you can see with your Totals type, we need to hard-code the TransactionType values, and nothing in the result itself tells us which type each sum belongs to apart from the property names.
I would create the type below instead:
class ItemTotals
{
    public string ItemType { get; set; }
    public decimal Total { get; set; }
}
Here we will have the TransactionType along with its corresponding Total in the result. We can simply group by TransactionType and calculate the sum; here is the query for that:
List<ItemTotals> query = items.GroupBy(x => x.TransactionType)
    .Select(x => new ItemTotals
    {
        ItemType = x.Key,
        Total = x.Sum(z => z.TransactionValue)
    }).ToList();
You can choose whichever of the two approaches fits better.
I'm sure there is probably a clever way to do this in one line using LINQ, but everything I could come up with was quite ugly, so I went with something a bit more readable.
var results = items.GroupBy(x => x.TransactionType)
    .ToDictionary(x => x.Key, x => x.Sum(y => y.TransactionValue));

var totals = new Totals
{
    TotalIncoming = results["Income"],
    TotalOutgoing = results["Outgoing"]
};
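One caveat with the dictionary approach: if the list happens to contain no "Income" or no "Outgoing" items, the indexer throws a KeyNotFoundException. A slightly more defensive sketch uses TryGetValue, which leaves the value at decimal's default of 0 when a type is missing:
// TryGetValue leaves income/outgoing at 0 if the key is absent.
results.TryGetValue("Income", out var income);
results.TryGetValue("Outgoing", out var outgoing);

var totals = new Totals
{
    TotalIncoming = income,
    TotalOutgoing = outgoing
};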
