I have an IQueryable that should be able to sum the points of all records in a subquery like so (This is meant to work with OData):
public Task<IQueryable<TemplatePMPResultViewModel>> GetTemplatePMPResultViewModel()
{
var result = _dbSet.Select(x => new TemplatePMPResultViewModel
{
Id = x.Id,
InsertDate = x.InsertDate,
Latitude = x.Latitude,
Longitude = x.Longitude,
CompanyAssetName = x.TemplatePMP.CompanyAsset.CodeAsset,
ConstructionSiteName = x.ConstructionSite.Name,
MechanicName = x.Mechanic.Name,
SupervisorName = x.Supervisor.Name,
Points = _dbContext.ItemCheckListResults.Where(x => x.TemplatePMPResultId.Equals(x.Id)).Sum(x => x.Points),
PendingPoints = _dbContext.ItemCheckListResults
.Where(x => x.TemplatePMPResultId.Equals(x.Id) &&
x.ExecuteActionPlan &&
(x.EndDate ?? DateTime.MaxValue) <= DateTime.Now)
.Sum(x => x.Points),
RegionName = x.ConstructionSite.Region.Name,
TypeName = x.TemplatePMP.CompanyAsset.Group.GroupType.Description
});
return Task.FromResult(result);
}
However, the sum of points is returning the wrong value. When I grab the generated query from the output window and execute it in my database manager's query tool it works just fine (I just change the Id of the record I want to see).
From what I can tell, the lambda expression returns only the sum of the first record of ItemCheckListResults, not the sum of all records.
Can I run subqueries like that, or should I try another approach? If so, how should I do it?
Best regards!
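For what it's worth, one possible cause (an assumption, since only this snippet is visible): the inner Where lambdas reuse the name x, so TemplatePMPResultId ends up being compared against the checklist item's own Id instead of the outer record's Id. A minimal sketch of the correlated form, renaming the inner parameter to c:
var result = _dbSet.Select(x => new TemplatePMPResultViewModel
{
    Id = x.Id,
    // ... other properties as in the original ...
    // c is the checklist item, x is the outer TemplatePMPResult being projected
    Points = _dbContext.ItemCheckListResults
        .Where(c => c.TemplatePMPResultId.Equals(x.Id))
        .Sum(c => c.Points),
    PendingPoints = _dbContext.ItemCheckListResults
        .Where(c => c.TemplatePMPResultId.Equals(x.Id) &&
                    c.ExecuteActionPlan &&
                    (c.EndDate ?? DateTime.MaxValue) <= DateTime.Now)
        .Sum(c => c.Points)
});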
We have around 140K documents in a collection, and a typical search returns around 6 MB of data.
This process is taking around 30 to 40 seconds, and I would like to know why, and how I can improve it.
Sample data to see what the structure looks like: https://pastebin.com/0eJJfVKB
_id is the primary key, and there's a compound index on HotelID, GiataCityID, and CountryCode.
This is how I'm getting my data from Cosmos DB in code.
The first part is a projection so we only fetch the fields we need; I figured, why return everything? Just getting what I need should speed things up.
Would appreciate any input, and I'm ready to provide more details.
var hoteldetails = new List<HotelDetails>();
var projection = Builders<HotelDetails>.Projection.Expression(
t => new HotelDetails
{
Id = t.Id,
PropertyName = t.PropertyName,
GiataCityId = t.GiataCityId,
City = t.City,
CityId = t.CityId,
SupplierList = t.SupplierList,
CountryCode = t.CountryCode,
CountryName = t.CountryName,
longitude = t.longitude,
latitude = t.latitude,
Images = t.Images,
HotelId = t.HotelId,
SupplierRoomList = t.SupplierRoomList
});
this.GetDBClient();
var database = this.client.GetDatabase(this.settings.DataBase);
var collection = database.GetCollection<HotelDetails> (MongoCollections.HotelDetails.ToString());
if (HotelIDs != null && HotelIDs.Count > 0)
{
hoteldetails = await collection
.Aggregate()
.Match(x => x.CountryCode == CountryCode && (x.CityId == CityId || x.GiataCityId == CityId.ToString())
&& x.SupplierList.Any(y => HotelIDs.Contains(y.HotelId) && y.SupplierName == SupplierNameType))
.Project(projection)
.Limit(Limit)
.ToListAsync();
}
The RU data page (screenshot of RU consumption omitted).
I tried to use projection to only get the properties I need; that sped things up a bit, but not by a whole lot.
I tried to play with the filter, but really to no effect.
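One thing worth checking (a hedged suggestion, not a verified fix): the Match stage filters on CountryCode, CityId/GiataCityId and SupplierList, but the existing compound index leads with HotelID, so it may not be usable for this query. An index that leads with the equality fields actually used in the filter could help; the exact field choice below is an assumption and should be confirmed against the query plan or RU metrics.
// Sketch: add an index shaped like the Match filter (verify with explain/query metrics).
var indexKeys = Builders<HotelDetails>.IndexKeys
    .Ascending(x => x.CountryCode)
    .Ascending(x => x.CityId)
    .Ascending(x => x.GiataCityId);
await collection.Indexes.CreateOneAsync(new CreateIndexModel<HotelDetails>(indexKeys));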
I'm currently creating a site that showcases all my patients in a data table, and I have to use FromSqlRaw in order to get the data from my database. I have a search function that allows me to search the patients within the table, but upon entering the page I get an error when I use AsQueryable and no data is displayed in the table. The error recommends using AsEnumerable, but when I do that I get an IntelliSense error. Any ideas on how to fix this?
public async Task<IActionResult> Search(StaySearchViewModel model)
{
if (model.Cleared)
{
return Json(new
{
draw = model.Draw,
data = new object[] { },
recordsFiltered = 0,
recordsTotal = 0,
total = 0
});
}
var records = getSearchData(model);
//var records = System.Linq.Enumerable.AsEnumerable(getSearchData(model)); // Hard coding this an enumerable will break line 55, 57, and 64
//Sorting
if (!string.IsNullOrEmpty(model.SortOrder))
records = records.OrderBy(model.SortOrder);
var count = await records.CountAsync().ConfigureAwait(false);
records = records.Skip(model.Start);
if (model.Length != -1) records = records.Take(model.Length);
// Create models
var result = new List<SpStaySearchResultViewModel>();
try
{
await records.ForEachAsync(r =>
{
result.Add(new SpStaySearchResultViewModel()
{
BuildingName = r.BuildingName,
CaseManager = r.CaseManager,
CaseManagerId = r.CaseManagerId,
OccupantFileAs = r.OccupantFileAs,
StayOCFSNumber = r.StayOCFSNumber,
StayId = r.StayId,
MaxOfBillSentDate = r.MaxOfBillSentDate,
CountOfChildren = r.CountOfChildren,
StartDate = r.StartDate,
EndDate = r.EndDate,
OccupantId = r.OccupantId,
IsActive = r.IsActive,
});
}).ConfigureAwait(false);
}
catch (Exception e) { }
return Json(new
{
draw = model.Draw,
data = result,
recordsFiltered = count,
recordsTotal = count,
});
}
private IQueryable<spStaysSearch> getSearchData(StaySearchViewModel model)
{
var records = db.SpStaySearches.FromSqlRaw("dbo.spStaysSearch").AsQueryable();
if (model.OccupantId.HasValue)
records = records.Where(x => x.OccupantId == model.OccupantId);
if (!string.IsNullOrWhiteSpace(model.OccupantFileAs))
records = records.Where(x => x.OccupantFileAs == model.OccupantFileAs);
if (!string.IsNullOrWhiteSpace(model.BuildingName))
records = records.Where(x => x.BuildingName == model.BuildingName);
if (!string.IsNullOrWhiteSpace(model.CaseManager))
records = records.Where(x => x.CaseManager == model.CaseManager);
if (!string.IsNullOrWhiteSpace(model.BuildingName))
records = records.Where(x => x.BuildingName == model.BuildingName);
if (model.IntakeDateStart.HasValue && model.IntakeDateEnd.HasValue)
{
records = records.Where(x => x.StartDate >= model.IntakeDateStart && x.StartDate <= model.IntakeDateEnd);
}
else
{
if (model.IntakeDateStart.HasValue)
records = records.Where(x => x.StartDate >= model.IntakeDateStart);
if (model.IntakeDateEnd.HasValue)
records = records.Where(x => x.StartDate <= model.IntakeDateEnd);
}
if (model.ExitDateStart.HasValue && model.ExitDateEnd.HasValue)
{
records = records.Where(x => x.EndDate >= model.ExitDateStart && x.EndDate <= model.ExitDateEnd);
}
else
{
if (model.ExitDateStart.HasValue)
records = records.Where(x => x.EndDate >= model.ExitDateStart);
if (model.ExitDateEnd.HasValue)
records = records.Where(x => x.EndDate <= model.ExitDateEnd);
}
if (model.IsActive.HasValue)
records = records.Where(x => x.IsActive == model.IsActive);
return records;
}
Try this
var records = getSearchData(model).ToList();
var count = records.Count;
.....
You can't order records by model.SortOrder directly, since a plain string means nothing to LINQ unless you're using a dynamic LINQ library.
You can only do something like this:
if (!string.IsNullOrEmpty(model.SortOrder)) records = records.OrderBy(r => r.StayId).ToList();
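If the sort column really does need to come from the request, one option (a sketch only; the property names are taken from the question's projection) is to map the SortOrder string onto a typed OrderBy:
records = (model.SortOrder switch
{
    "BuildingName" => records.OrderBy(r => r.BuildingName),
    "OccupantFileAs" => records.OrderBy(r => r.OccupantFileAs),
    "StartDate" => records.OrderBy(r => r.StartDate),
    _ => records.OrderBy(r => r.StayId)   // default sort; adjust as needed
}).ToList();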
Because your source data is a stored procedure, you cannot compose additional query expressions over the top of it. Instead you must load it into memory, as the error suggests, by enumerating the result set.
From the documentation section "Including Related Data":
SQL Server doesn't allow composing over stored procedure calls, so any attempt to apply additional query operators to such a call will result in invalid SQL. Use AsEnumerable or AsAsyncEnumerable method right after FromSqlRaw or FromSqlInterpolated methods to make sure that EF Core doesn't try to compose over a stored procedure.
The obvious way to apply this to your code is to call .ToList() on the results from the SP, then, to match your existing code pattern, convert that result back to an IQueryable:
var records = db.SpStaySearches.FromSqlRaw("dbo.spStaysSearch")
    .ToList()
    .AsQueryable();
Using AsEnumerable() can be ambiguous, as many different libraries you may have referenced can each provide an AsEnumerable() extension method.
We have to do this because even in SQL you cannot simply select from an SP and then add where clauses to it; you first have to read the results into a temporary table or a table variable, and then you can re-query that result set. That is effectively what we are doing here: reading the results into a C# variable (.ToList()) and then composing an in-memory query over the top of that result.
If your search logic must be encapsulated in a stored procedure, then given these technical limitations, the usual expectation is that you would add the search arguments as optional parameters to the stored procedure, rather than trying to add filter clauses on top of the results in C#.
We can help with how to move your filter logic into dbo.spStaysSearch but you'll have to post the content of that SP, ideally as a new question.
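As a rough illustration only (the parameter names here are hypothetical and would have to match whatever optional parameters you add to the SP), the call site would then pass the filters through FromSqlRaw instead of composing Where clauses:
// Sketch: pass search arguments to the SP; nulls are sent as DBNull so the
// procedure can treat them as "no filter". SqlParameter comes from
// Microsoft.Data.SqlClient (or System.Data.SqlClient, depending on your EF Core version).
var occupantId = new SqlParameter("@OccupantId", (object)model.OccupantId ?? DBNull.Value);
var building = new SqlParameter("@BuildingName", (object)model.BuildingName ?? DBNull.Value);
var records = db.SpStaySearches
    .FromSqlRaw("EXEC dbo.spStaysSearch @OccupantId, @BuildingName", occupantId, building)
    .ToList();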
Instead of using an SP at all (with which we lose practically all the goodness that EF can offer), an alternative approach is to replace your SP entirely with raw SQL; then the rest of your logic will work as expected.
var sql = #"
SELECT
tblStays.*, tblOccupant.OccupantID,
tblOccupant.FileAs AS OccupantFileAs,
IIF(tblStays.BuildingName LIKE 'Main Shelter',
tblOccupant.OCFSMainNumber,
tblOccupant.OCFSNorthNumber) AS StayOCFSNumber,
COALESCE([CountOfOccupantStayID], 0) AS CountOfChildren,
tblCaseManager.FileAs AS CaseManager,
StaysMaxBillSentDate.MaxOfBillSentDate
FROM tblStays
LEFT JOIN tblOccupantStays ON tblStays.StayID = tblOccupantStays.StayID
LEFT JOIN tblOccupant ON tblOccupantStays.OccupantID = tblOccupant.OccupantID
LEFT JOIN (
SELECT lkpOccStays.StayID
, COUNT(tblOccupantStays.OccupantStayID) AS CountOfOccupantStayID
FROM tblOccupantStays lkpOccStays
INNER JOIN tblOccupant lkpChild ON lkpOccStays.OccupantID = lkpChild.OccupantID
WHERE lkpChild.OccupantType LIKE 'Child'
GROUP BY lkpOccStays.StayID
) OccupantStays_CountOfChildren ON tblStays.StayID = OccupantStays_CountOfChildren.StayID
LEFT JOIN tblCaseManager ON tblStays.CaseManagerID = tblCaseManager.CaseManagerID
LEFT JOIN (SELECT tblStayBillingHx.StayID
, MAX(tblStayBillingHx.BillSentDate) AS MaxOfBillSentDate
FROM tblStayBillingHx
GROUP BY tblStayBillingHx.StayID
) StaysMaxBillSentDate ON tblStays.StayID = StaysMaxBillSentDate.StayID
";
var records = db.SpStaySearches.FromSqlRaw(sql);
In this way the SP is only providing the structure of the result set, which might be necessary if you are using the database-first approach, but you are no longer executing the SP at all.
The SQL in this answer is provided as a guide to the syntax only; there is not enough information available to verify the query or to confirm that the results conform to your business requirements.
I've been using Stopwatch, and it looks like the query below is very expensive in terms of performance, even though I thought what I have was already fairly optimal based on various reading (replace the foreach loop with for, use arrays instead of collections, use an anonymous type so as not to pull the whole table from the DB). Is there a way to make it faster? I need to fill the prices array, which needs to be nullable. Am I missing something?
public float?[] getPricesOfGivenProducts(string[] lookupProducts)
{
var idsAndPrices = from r in myReadings select
new { ProductId = r.ProductId, Price = r.Price };
float?[] prices = new float?[lookupProducts.Length];
for(int i=0;i<lookupProducts.Length;i++)
{
string id = lookupProducts[i];
if (idsAndPrices.Any(r => r.ProductId == id))
{
prices[i] = idsAndPrices.Where(p => p.ProductId == id)
.Select(a=>a.Price).FirstOrDefault();
}
else
{
prices[i] = null;
}
}
return prices;
}
It's likely every time you call idsAndPrices.Any(r => r.ProductId == id), you are hitting the database, because you haven't materialized the result (.ToList() would somewhat fix it). That's probably the main cause of the bad performance. However, simply loading it all into memory still means you're searching the list for a productID every time (twice per product, in fact).
Use a Dictionary when you're trying to do lookups.
public float?[] getPricesOfGivenProducts(string[] lookupProducts)
{
var idsAndPrices = myReadings.ToDictionary(r => r.ProductId, r => r.Price);
float?[] prices = new float?[lookupProducts.Length];
for (int i = 0; i < lookupProducts.Length; i++)
{
string id = lookupProducts[i];
if (idsAndPrices.ContainsKey(id))
{
prices[i] = idsAndPrices[id];
}
else
{
prices[i] = null;
}
}
return prices;
}
To improve this further, we can identify that we only care about products passed to us in the array. So let's not load the entire database:
var idsAndPrices = myReadings
.Where(r => lookupProducts.Contains(r.ProductId))
.ToDictionary(r => r.ProductId, r => r.Price);
Now, we might want to avoid the 'return null price if we can't find the product' scenario. Perhaps the validity of the product id should be handled elsewhere. In that case, we can make the method a lot simpler (and we won't have to rely on having the array in order, either):
public Dictionary<string, float> getPricesOfGivenProducts(string[] lookupProducts)
{
return myReadings
.Where(r => lookupProducts.Contains(r.ProductId))
.ToDictionary(r => r.ProductId, r => r.Price);
}
And a note unrelated to performance: you should use decimal for money.
Assuming that idsAndPrices is an IEnumerable<T>, you should make its initialization:
var idsAndPrices = (from r in myReadings select
new { ProductId = r.ProductId, Price = r.Price })
.ToList();
It's likely that the calls to:
idsAndPrices.Any(r => r.ProductId == id)
and:
idsAndPrices.Where(p => p.ProductId == id)
...are causing the IEnumerable<T> to be evaluated every time they're called.
Based on
using anonymous type not to take the whole table from DB
I assume myReadings is the database table and
var idsAndPrices =
from r in myReadings
select new { ProductId = r.ProductId, Price = r.Price };
is the database query.
Your implementation is far from optimal (I would rather say quite inefficient) because the above query is executed twice for each element of the lookupProducts array - once in the idsAndPrices.Any(...) statement and once in idsAndPrices.Where(...).
The optimal way I see is to filter the database query as much as possible, and then use the most efficient LINQ to Objects method for correlating two in-memory sequences - a join, in your case a left outer join:
var dbQuery =
from r in myReadings
where lookupProducts.Contains(r.ProductId)
select new { ProductId = r.ProductId, Price = r.Price };
var query =
from p in lookupProducts
join r in dbQuery on p equals r.ProductId into rGroup
from r in rGroup.DefaultIfEmpty().Take(1)
select r?.Price;
var result = query.ToArray();
The Any and FirstOrDefault are each O(n), and together they're redundant. You can get roughly a 50% speed-up just by removing the Any call: FirstOrDefault will give you back null when there's no match, so use it to get the product object directly (remove the Select). If you want to really speed it up, you should just loop through the products and check whether prices[p.ProductId] != null before setting prices[p.ProductId] = p.Price.
You have a bit of extra code there. Try:
var idsAndPrices = (from r in myReadings
                    select new { ProductId = r.ProductId, Price = r.Price })
                   .ToList();
float?[] prices = new float?[lookupProducts.Length];
for (int i = 0; i < lookupProducts.Length; i++)
{
    string id = lookupProducts[i];
    // FirstOrDefault returns null when there is no match, so ?.Price yields a null price
    prices[i] = idsAndPrices.FirstOrDefault(p => p.ProductId == id)?.Price;
}
Better yet:
// Build a dictionary keyed by product id so each lookup is O(1).
var dp = new Dictionary<string, float?>();
foreach (var reading in myReadings)
    dp[reading.ProductId] = reading.Price;
float?[] prices = new float?[lookupProducts.Length];
for (int i = 0; i < lookupProducts.Length; i++)
{
    string id = lookupProducts[i];
    if (dp.ContainsKey(id))
        prices[i] = dp[id];
    else
        prices[i] = null;
}
First off, this worked with MSSQL Server but we've switched over to MySQL and now it isn't working.
I have a simple query which returns total values based off a certain year.
It looks like this:
public Dictionary<string, decimal> GetYearValues(int[] SupplierID) {
_ctx.Database.CommandTimeout = 5 * 60;
var Total = (from x in _ctx.Invoices
where x.Turnover == true &&
SupplierID.Contains(x.FK_SupplierID)
group x by new { x.InvoiceDate.Year } into summ
select new {
Year = summ.Key.Year,
TurnOver = summ.Sum(s => s.NetAmount_Home ?? 0)
}).ToDictionary(x => x.Year.ToString(), x => x.TurnOver);
return Total;
}
I've noticed that nothing is wrong with the query itself; it's only the ToDictionary call that fails and gives the error message. Can anyone tell me why?
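Without the actual error text this is only a guess, but a common workaround when a provider cannot translate part of the final shaping (here the Year.ToString() key) is to finish the grouping on the server and build the dictionary in memory:
// Sketch only: run the grouped query against MySQL, then do ToDictionary client-side.
var Total = (from x in _ctx.Invoices
             where x.Turnover == true && SupplierID.Contains(x.FK_SupplierID)
             group x by x.InvoiceDate.Year into summ
             select new { Year = summ.Key, TurnOver = summ.Sum(s => s.NetAmount_Home ?? 0) })
            .AsEnumerable()
            .ToDictionary(x => x.Year.ToString(), x => x.TurnOver);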
I have a query that processes about 500 records pulled from various tables, grouped and then summarised (if that's a word) into a working report. Everything works fine, but it takes about 30 seconds to run this one report and I'm getting complaints from my users.
The procedure in question is this one:
public static List<LabourEfficiencies> GetLabourEfficienciesByTimeSheet(DateTime dateFrom, DateTime dateTo)
{
CS3Entities ctx = new CS3Entities();
//get all relevant timesheetline items
var tsItems = from ti in ctx.TimeSheetItems
where ti.TimeSheetHeader.Date >= dateFrom && ti.TimeSheetHeader.Date <= dateTo && ti.TimeSheetHeader.TimeSheetCategory != "NON-PROD"
select new TimesheetLine
{
TimesheetNo = ti.TimeSheetNo,
HoursProduced = ti.HoursProduced,
HoursProducedNet = ti.HoursProducedNet,
ItemID = ti.ItemID,
ProcessID = ti.ProcessID,
ProcessDuration = ti.ProcessDuration,
DowntimeHours = 0M
};
//get all relevant downtimeline items
var tsDownT = from dt in ctx.DowntimeItems
where dt.TimeSheetHeader.Date >= dateFrom && dt.TimeSheetHeader.Date <= dateTo && dt.TimeSheetHeader.TimeSheetCategory != "NON-PROD"
select new TimesheetLine
{
TimesheetNo = dt.TimeSheetNo,
HoursProduced = 0M,
HoursProducedNet = 0M,
ItemID = "",
ProcessID = "",
ProcessDuration = 0M,
DowntimeHours = dt.DowntimeHours
};
//combine them into single table
var tsCombi = tsItems.Concat(tsDownT);
var flatQuery = (from c in tsCombi
join th in ctx.TimeSheetHeaders on c.TimesheetNo equals th.TimeSheetNo
select new
{
th.TimeSheetNo,
th.EmployeeNo,
th.TimeSheetCategory,
th.Date,
c.HoursProduced,
c.ProcessDuration,
th.HoursWorked,
c.HoursProducedNet,
c.DowntimeHours,
c.ItemID
});
//add employee details & group by timesheet no (1 line per timesheet no)
//NB. FnTlHrs checks whether there are any indirect hrs & deducts them if there are
var query = flatQuery.GroupBy(f => f.TimeSheetNo).Select(g => new LabourEfficiencies
{
    Eno = g.FirstOrDefault().EmployeeNo,
    Dept = g.FirstOrDefault().TimeSheetCategory,
    Date = g.FirstOrDefault().Date,
    FnGrHrs = g.Where(w => w.TimeSheetCategory == "FN" && !w.ItemID.StartsWith("090")).Sum(h => h.HoursProduced),
    FnTlHrs = g.Where(w => w.ItemID.StartsWith("090")).Sum(h => h.ProcessDuration) > 0
        ? (g.FirstOrDefault(w => w.TimeSheetCategory == "FN").HoursWorked) - (g.Where(w => w.ItemID.StartsWith("090")).Sum(h => h.ProcessDuration))
        : g.FirstOrDefault(w => w.TimeSheetCategory == "FN").HoursWorked,
    RmGrHrs = g.Where(w => w.TimeSheetCategory == "RM").Sum(h => h.HoursProduced),
    RmGrHrsNet = g.Where(w => w.TimeSheetCategory == "RM").Sum(h => h.HoursProducedNet),
    RmTlHrs = g.FirstOrDefault(w => w.TimeSheetCategory == "RM").HoursWorked,
    MpGrHrs = g.Where(w => w.TimeSheetCategory == "MATPREP").Sum(h => h.HoursProduced),
    MpTlHrs = g.FirstOrDefault(w => w.TimeSheetCategory == "MATPREP").HoursWorked,
    DtHrs = g.Sum(s => s.DowntimeHours),
    Indirect = g.Where(w => w.ItemID.StartsWith("090")).Sum(h => h.ProcessDuration)
});
return query.ToList();
}
The first few bits just gather the data; it's the last query that is the "meat" of the procedure and takes the time.
I'm fairly sure I've done something horrid, as the SQL it spits out is terrible, but for the life of me I can't see how to improve it.
Any hints greatly appreciated.
Gordon
Your expression gets optimized both during IQueryable compilation and by the SQL Server query optimizer, and it still takes that long. It's highly probable that you are missing the column indexes needed for a faster execution plan. Copy/paste the rendered SQL expression into SSMS, run it, and look at the actual execution plan. Optimize the database structure if needed (add indexes). Otherwise, you simply have a large enough amount of data to make the process slow.
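If it helps to get at the generated SQL, here is a small sketch (assuming this is Entity Framework 6, which the CS3Entities context suggests; adjust if you're on EF Core): attach EF's built-in logger to the context that runs the report, then copy the logged SELECT into SSMS to inspect the plan.
// Sketch: EF6 SQL logging; place this where the context for the report query is created.
CS3Entities ctx = new CS3Entities();
ctx.Database.Log = sql => System.Diagnostics.Debug.WriteLine(sql);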