Creating a Linq group by query - c#

//SELECT table1.GG_ITEM, Sum(table1.REM_QTY) AS SumPerGG_ITEM
//FROM table1
//WHERE (table1.SUGG_DOCK_DATE Is Not Null)
//GROUP BY table1.GG_ITEM
//ORDER BY table1.GG_ITEM;
var try1 = (from row in db2.Dumps select new { Type1 = row.GA_ITEM, Type2 = row.REM_QTY });
Debug.Print(":::::try1:::::");
foreach (var row in try1)
{
    Debug.Print(row.Type1.ToString());
    Debug.Print(row.Type2.ToString());
}
var try2 = (from row in db2.Dumps group row by row.GA_ITEM into g select new { Type1 = g.Key, Type2 = g.ToList() });
Debug.Print("::::try2:::::");
foreach (var row in try2)
{
    Debug.Print(row.Type1.ToString());
    Debug.Print(row.Type2.ToString());
}
I'm converting an Access SQL query to LINQ. The two columns I am selecting from my table Dumps are GA_ITEM and REM_QTY. My try1 is working out just fine and I see the contents of both columns printed out; however, it is not yet duplicating the functionality of the Access SQL query.
My try2 is an attempt at grouping. In try2, row.Type1.ToString() is readable; however, row.Type2.ToString() shows up in the output window as:
System.Collections.Generic.List`1[garminaspsandbox3.Models.Dump]
What I would really like to do in try2 is select GA_ITEM and REM_QTY like I did in try1 and group by GA_ITEM; however, those fields aren't showing up in my autocomplete for the g object.
Does anyone know how to do this in LINQ?
Thank you for posting...

Your Type2 property holds a List, not a single item, so you need another loop to iterate over the items in that group:
foreach (var row in try2)
{
    Debug.Print(row.Type1.ToString());
    foreach (var item in row.Type2)
    {
        Debug.Print(item.GA_ITEM.ToString());
        Debug.Print(item.REM_QTY.ToString());
    }
}
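To actually duplicate the Access query from the top of the question (sum of REM_QTY per item, filtered on SUGG_DOCK_DATE, ordered by item), the grouping can project the Sum directly. This is a minimal sketch; it assumes the Dump entity exposes SUGG_DOCK_DATE as a nullable date and REM_QTY as a numeric property, matching the columns in the Access SQL:
var try3 = from row in db2.Dumps
           where row.SUGG_DOCK_DATE != null           // WHERE SUGG_DOCK_DATE Is Not Null
           group row by row.GA_ITEM into g
           orderby g.Key                               // ORDER BY the item
           select new
           {
               Item = g.Key,                           // the grouped item column
               SumPerItem = g.Sum(x => x.REM_QTY)      // Sum(REM_QTY) per item
           };
foreach (var row in try3)
{
    Debug.Print(row.Item.ToString());
    Debug.Print(row.SumPerItem.ToString());
}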

Related

Group by, get max value of each group and update the group, linq to sql

From a table, I group the elements by name and get the maximum value of the qty (quantity) column for each group, then I update each group with that max value. I do it this way:
var result = from skuTableElements in tableElements
             group skuTableElements by skuTableElements.Material
             into grouped
             select new
             {
                 Material = grouped.First().Material,
                 Qty = grouped.Max(x => x.Qty)
             };
foreach (var item in result)
{
    var result2 = from elements in tableElements
                  where elements.Material == item.Material
                  select elements;
    foreach (var item2 in result2)
    {
        item2.Qty = item.Qty;
    }
}
It works fine with a few elements, but I'm guessing that with a lot of rows this might take too long.
Is there any better way to do this?
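One way to avoid re-querying tableElements for every group is to group once and update the rows of each group in the same pass. This is only a sketch, assuming tableElements is an in-memory collection (or has been materialized, e.g. with ToList()) whose elements have the Material and Qty properties from the question:
foreach (var grouped in tableElements.GroupBy(x => x.Material))
{
    var maxQty = grouped.Max(x => x.Qty);   // max Qty within this Material group
    foreach (var row in grouped)
        row.Qty = maxQty;                   // push the max back onto every row in the group
}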

How to simplify LINQ query just to filter out some special rows

I am very unfamiliar with Entity Framework and LINQ. I have a single entity set with some columns, and I want to filter out some special rows.
Four of the columns are named Guid (string), Year (short), Month (short) and FileIndex (short). I want to get all rows which have the maximum FileIndex for each existing combination of Guid-Year-Month.
My current solution looks like this:
var maxFileIndexRecords = from item in context.Udps
                          group item by new { item.Guid, item.Year, item.Month }
                          into gcs
                          select new
                          {
                              gcs.Key.Guid,
                              gcs.Key.Year,
                              gcs.Key.Month,
                              gcs.OrderByDescending(x => x.FileIndex).FirstOrDefault().FileIndex
                          };
var result = from item in context.Udps
             join j in maxFileIndexRecords on
                 new { item.Guid, item.Year, item.Month, item.FileIndex }
                 equals
                 new { j.Guid, j.Year, j.Month, j.FileIndex }
             select item;
I think there should be a shorter solution with better performance. Does anyone have a hint for me?
Thank you
You were close. It's not necessary to actually select the grouping key. You can simply select the first item of each group:
var maxFileIndexRecords =
    from item in context.Udps
    group item by new { item.Guid, item.Year, item.Month }
    into gcs
    select gcs.OrderByDescending(x => x.FileIndex).FirstOrDefault();
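Because each group now yields the Udps entity itself rather than an anonymous projection, the second query and the join are no longer needed; maxFileIndexRecords can be enumerated directly. A small sketch, using the property names given in the question:
foreach (var udp in maxFileIndexRecords)
{
    // udp is the row with the highest FileIndex for its Guid-Year-Month combination
    Console.WriteLine($"{udp.Guid} {udp.Year}-{udp.Month}: {udp.FileIndex}");
}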

linq group by statement not working

I have a LINQ statement like the following:
var v1 = from s in context.INFOONEs group s by s.DATET into xyz select xyz;
I am trying the following to display the results:
foreach (var x in v1)
{
    Console.WriteLine(x.);
}
But IntelliSense is not showing the columns when I type x.
What am I doing wrong? And what is the right way to achieve what I am trying to accomplish?
Thanks
That's because there are no columns on x; x is a group, and there are records under each group. So you need to get those records:
foreach (var x in v1)
{
    Console.WriteLine(x.Key);    // display the Key of the current group
    foreach (var item in x)      // iterate over the records in the group
    {
        Console.WriteLine(item.) // here you can access your columns
    }
}
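For example, if INFOONEs had a hypothetical NAME column, the complete loop could look like the sketch below (NAME is only a placeholder for whatever columns your table actually has):
foreach (var x in v1)
{
    Console.WriteLine(x.Key);          // the DATET value this group was keyed on
    foreach (var item in x)            // the INFOONEs records in that group
    {
        Console.WriteLine(item.NAME);  // NAME is a hypothetical column, for illustration only
    }
}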

c# Using Select Linq statements?

I have a Rows object that is IEnumerable&lt;dynamic&gt;; it has 5 properties (columns) and 100 rows. One of the properties/columns is Group, with only 2 distinct groups across the 100 rows, so first I run a Distinct against it:
IEnumerable<dynamic> Groups = Rows.Select(x => x.Group).Distinct();
This works, no error.
Then I want to go back to my Rows object and loop through the rows where the group matches the current group, like this:
foreach (string Group in Groups)
{
    IEnumerable<dynamic> GroupData =
        from rowdata in Rows
        where rowdata.Group = #Group
        select rowdata;
But I get this error on the last line:
'WebMatrix.Data.DynamicRecord' does not contain a definition for 'Group'
Does anyone know why this isn't working?
Surely I can do this another way, but I wanted to use a C# select statement instead. How can I, though?
Edit to show usage:
foreach (var row in GroupData)
{
    string ThisGroup = row.Group;
}
...
Instead of selecting twice, group on the Group value:
IEnumerable<IGrouping<string, dynamic>> groups = Rows.GroupBy(x => (string)x.Group);
Now you can just loop through the result:
foreach (IGrouping<string, dynamic> group in groups) {
...
}
The IGrouping<> object has a Key property which is the value that you grouped on, and it's also a collection of the values in the group.
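Putting that together, here is a sketch of the loop: the Key is the Group value, and enumerating the group yields the dynamic rows themselves.
foreach (IGrouping<string, dynamic> group in groups)
{
    string thisGroup = group.Key;   // the Group value shared by these rows
    foreach (dynamic row in group)  // each row that belongs to this group
    {
        // access the other columns of the row here, e.g. row.SomeColumn (placeholder name)
    }
}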

Super slow performance of querying a DataTable with LINQ

I have code similar to this structure; my table has 108,000 rows.
The DataTable really just holds a tab-delimited text file that I read in for processing.
private void Foo(DataTable myDataTable)
{
    List<string> alreadyProcessed = new List<string>();
    foreach (DataRow row in myDataTable.Rows)
    {
        string current = row["Emp"].ToString().Trim();
        if (alreadyProcessed.Contains(current))
            continue;
        alreadyProcessed.Add(current);
        var empRows = from p in myDataTable.AsEnumerable()
                      where p["Emp"].ToString().Trim() == current
                      select new EmpClass
                      {
                          LastName = (string)p["lName"],
                          // some more stuff similar to this
                      };
        // do some stuff with that EmpClass, but it shouldn't take too long
    }
}
Running such a thing is taking more than 15 minutes. How can I improve this?
Here is a rather naive rewrite of your code.
Instead of tracking which employees you have already processed, let's just group all rows by their employees and process them each separately.
var rowsPerEmployee =
    (from DataRow row in dt.Rows
     let emp = row["Emp"].ToString().Trim()
     group row by emp into g
     select g)
    .ToDictionary(
        g => g.Key,
        g => g.ToArray());
foreach (var current in rowsPerEmployee.Keys)
{
    var empRows = rowsPerEmployee[current];
    // ... rest of your code here; note that empRows is now all the rows for a single employee,
    // ... not just the last name or similar
}
This will first group the entire datatable by the employee and create a dictionary from employee to rows for that employee, and then loop on the employees and get the rows.
You should group by "Emp"; otherwise you're going through each row and, for some rows, querying the whole table. Something like this:
from p in myDataTable.AsEnumerable()
group p by p.Field<string>("Emp") into g
select new
{
    Emp = g.Key,
    Data = g.Select(gg => new EmpClass
    {
        LastName = gg.Field<string>("lName")
    })
}
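Assuming the query above is assigned to a variable, say perEmployee (a name chosen here just for illustration), it could be consumed like this:
foreach (var emp in perEmployee)
{
    // emp.Emp is the employee key; emp.Data enumerates that employee's EmpClass rows
    foreach (EmpClass empRow in emp.Data)
    {
        // do the per-employee work here, without re-querying the whole table
    }
}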
One thing that might slow things down in the LINQ statement is how much data you're selecting. You write 'select new EmpClass', and depending on how many columns (and rows, for that matter) you're selecting into your output, that may slow things down drastically. Other tips and tricks for this problem can be found in: http://visualstudiomagazine.com/articles/2010/06/24/five-tips-linq-to-sql.aspx
