I am trying to create new rows from the filtered results of a LINQ query. I want the syntax to be something like the following, but I don't know how to do it. Each new row should be built from a selection of fields from the old row.
dv = (from row in MV.Data.AsEnumerable()
      where !(row["eCode"].ToString().Contains("OP"))
      select DataRow(row["time"], row["value"],
                     row["size"], row["condition"],
                     row["eCode"]))
     .CopyToDataTable().DefaultView;
Without knowing your specific application, you might try something like this:
// where dt is your new/target DataTable and MV.Data is the source...
var source = MV.Data;
var newRows = source.Where(d => !d.eCode.ToString().Contains("OP"))
                    .Select(d => dt.Rows.Add(d.time, d.value, d.size, d.condition, d.eCode));
I write most of my LINQ like this, as I find the lambdas easier to read. Note that Rows.Add accepts a params list of values, so there is no need to construct a DataRow first. The Where clause applies your eCode filter, and Select adds each row to your DataTable and returns it to newRows in case you still need the rows (because the query is lazily evaluated, the rows are only added once newRows is enumerated). If you don't need them, you can either iterate with a foreach loop or, if there are few enough rows, do this:
// where dt is your new/target DataTable...
var source = MV.Data;
source.Where(d => !d.eCode.ToString().Contains("OP"))
      .ToList() // copies everything into memory
      .ForEach(d => dt.Rows.Add(d.time, d.value, d.size, d.condition, d.eCode));
This approach just incorporates an inline ForEach iteration.
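If MV.Data is an untyped DataTable (which the indexer syntax in the question suggests), the same idea can be sketched with indexer access; the target table dt and its column names below are assumptions based on the question:
// build a target table with the five columns from the question
// (column types defaulted to string here; use the typed Columns.Add overload if needed)
var dt = new DataTable();
dt.Columns.Add("time");
dt.Columns.Add("value");
dt.Columns.Add("size");
dt.Columns.Add("condition");
dt.Columns.Add("eCode");

foreach (var row in MV.Data.AsEnumerable()
                           .Where(r => !r["eCode"].ToString().Contains("OP")))
{
    dt.Rows.Add(row["time"], row["value"], row["size"], row["condition"], row["eCode"]);
}

var dv = dt.DefaultView;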
I am filtering data from a DataTable using LINQ. It works fine, except that I can't handle the case where the search criteria don't match any rows.
Below is my code:
var table = dtTokensInfo.AsEnumerable()
            .Where(r => r.Field<string>("tokenName").Contains(txtTokenName.Text))
            .CopyToDataTable();
I want table to contain the values filtered from dtTokensInfo, but I am not able to handle the case where the search criteria don't match.
You should detach the call to CopyToDataTable from the LINQ expression and only call it when the result actually contains rows:
DataTable table = null;
var temp = dtTokensInfo.AsEnumerable()
           .Where(r => r.Field<string>("tokenName")
                        .Contains(txtTokenName.Text));
if (temp.Any())
    table = temp.CopyToDataTable();
If you look at the MSDN page for the CopyToDataTable IEnumerable extension, you will notice this among the possible exceptions thrown:
ArgumentNullException: The source IEnumerable(Of T) sequence is
Nothing and a new table cannot be created.
An empty (but non-null) result likewise throws an InvalidOperationException, which is why the sequence should be checked before it is copied.
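If you would rather always end up with a (possibly empty) DataTable instead of a null reference, one option is to fall back to a clone of the source schema; a small sketch using the same dtTokensInfo and txtTokenName as above:
var matches = dtTokensInfo.AsEnumerable()
              .Where(r => r.Field<string>("tokenName").Contains(txtTokenName.Text));

// CopyToDataTable throws on an empty sequence, so fall back to an empty
// table with the same schema when nothing matches
DataTable table = matches.Any()
    ? matches.CopyToDataTable()
    : dtTokensInfo.Clone();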
Here is a list of column names:
var colNames = new List<string> { "colE", "colL", "colO", "colN" };
Based on the position of the column names in the list, I want to make that column's visible index equal to the position of the column name, but without returning a list. In other words, the following lambda expression without "ToList()" at the end:
colNames.Select((x, index) => { grid_ctrl.Columns[x].VisibleIndex = index; return x; }).ToList();
Can this be coded in a one-line lambda expression?
Use a loop to make side-effects. Use queries to compute new data from existing data:
var updates = colNames
    .Select((x, index) => new { col = grid_ctrl.Columns[x], index })
    .ToList();

foreach (var u in updates)
    u.col.VisibleIndex = u.index;
Hiding side-effects in queries can make for nasty surprises. We can still use a query to do the bulk of the work.
You could also use List.ForEach to make those side-effects. That approach is not very extensible, however. It is not as general as a query.
Yes, here you are:
colNames.ForEach((x) => grid_ctrl.Columns[x].VisibleIndex = colNames.IndexOf(x));
Note that you need unique strings in your list; otherwise IndexOf will always return the position of the first occurrence.
Unfortunately List<T>.ForEach, like its relative foreach, doesn't provide an enumeration index.
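If the strings might not be unique, a plain for loop (not a one-liner, but free of the IndexOf pitfall) does the same job with the colNames and grid_ctrl from the question:
// the positional index is used directly, so duplicate names are not a problem
for (int i = 0; i < colNames.Count; i++)
{
    grid_ctrl.Columns[colNames[i]].VisibleIndex = i;
}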
The following C# code takes a large DataTable with many columns and an array of two column names. It gives a new DataTable with a row for each duplicate combination of the two fields supplied, staff no and skill.
This is too specific; I need to be able to supply any number of fields for the group by.
Can someone help me?
string[] excelField = new string[2]; // the field names that define uniqueness
excelField[0] = "staff No";
excelField[1] = "skill";

DataTable dataTableDuplicateRows = new DataTable();
dataTableDuplicateRows.Clear();
dataTableDuplicateRows.Columns.Clear();

foreach (string fieldName in excelField)
{
    dataTableDuplicateRows.Columns.Add(fieldName);
}

var duplicateValues = dataTableCheck.AsEnumerable()
    .GroupBy(row => new { Field0 = row[excelField[0]], Field1 = row[excelField[1]] })
    .Where(group => group.Count() > 1)
    .Select(g => g.Key);

foreach (var duplicateValuesRow in duplicateValues)
{
    dataTableDuplicateRows.Rows.Add(duplicateValuesRow.Field0, duplicateValuesRow.Field1);
}
I think what you require is a way to make the LINQ query more dynamic. Although you could achieve this with expression trees, the DynamicLinq library solves the issue in an easier way.
For your case, with the library, you can just use the GroupBy extension method that takes a string.
More info about the DynamicLinq library:
Scott Gu's blog
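If you would rather avoid the extra library, a composite string key built from the selected columns also makes the grouping dynamic; a rough sketch reusing dataTableCheck, excelField and dataTableDuplicateRows from the question:
// group by a composite key built from whatever column names are in excelField
// (the separator should be something unlikely to occur in the data)
var duplicateValues = dataTableCheck.AsEnumerable()
    .GroupBy(row => string.Join("\u001F", excelField.Select(f => row[f].ToString())))
    .Where(g => g.Count() > 1)
    .Select(g => g.First()); // one representative DataRow per duplicated combination

foreach (var row in duplicateValues)
{
    // copy only the key columns into the duplicates table
    dataTableDuplicateRows.Rows.Add(excelField.Select(f => row[f]).ToArray());
}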
I have an array of ProgramIDs and would like to create a number of Select statements dynamically depending on how many ProgramIds there are.
For example:
var surveyProgramVar = surveyProgramRepository.Find().Where(x => x.ProgramId == resultsviewmodel.ProgramIds.FirstOrDefault());
This is an example of the select statement working with a single ProgramId (via FirstOrDefault()). How do I create a list/array of SurveyProgramVars and select for each ProgramId in the array?
It won't necessarily be optimal, but you might try:
var surveyProgramVar = surveyProgramRepository.Find()
.Where(x => resultsviewmodel.ProgramIds.Contains(x.ProgramId));
You could try something like:
var surveyProgramVar = surveyProgramRepository.Find().Where(x => resultsviewmodel.ProgramIds.Contains(x.ProgramId));
Tip: if the Find() method hits a database, it would be better to create a specific method that issues an IN statement in the query. If you do not do this, it will pull every record in the table and filter them in memory (LINQ to Objects), which works but is not very efficient. Your code could be something like:
var surveyProgramVar = surveyProgramRepository.FindByProgramsId(resultsviewmodel.ProgramIds);
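A rough sketch of what such a repository method could look like, assuming the repository wraps a LINQ provider (such as Entity Framework) exposing an IQueryable; all names here are hypothetical:
public IQueryable<SurveyProgram> FindByProgramsId(IEnumerable<int> programIds)
{
    // Contains over the id list is translated into a SQL IN clause by the
    // LINQ provider, so the filtering happens in the database, not in memory
    return context.SurveyPrograms.Where(x => programIds.Contains(x.ProgramId));
}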
I have a DataSet with several rows and columns (as DataSets tend to have). I need to create a tally row on the bottom with sums of each column. I'd like to do this with a single LINQ expression, as it will simplify a bunch of my code. I can get the total of a single column like so:
var a = (from m in month
         where <some long expression>
         select m["BGCO_MINUTES"] as Decimal?).Sum();
However, I want totals for other columns as well. I don't want to use multiple LINQ expressions because there's also a complicated where clause in there, and I'm doing several tally rows with various expressions and only want to loop through this set once. I also don't want to manually loop through the dataset myself and add up the totals since I'm creating many of these tally rows and think it would be messier.
What I want is an anonymous type that contains a total of BGCO_MINUTES, 800IB_MINUTES and TSDATA_MINUTES.
Is there any way to do this?
You could do this:
// run the filters once and get a List<DataRow> with the matching rows
var list = (from m in month
            where <some long expression>
            select m).ToList();

// build the summary object
var result = new
{
    BGCO_MINUTES = list.Sum(m => m["BGCO_MINUTES"] as Decimal?),
    _800IB_MINUTES = list.Sum(m => m["800IB_MINUTES"] as Decimal?),
    TSDATA_MINUTES = list.Sum(m => m["TSDATA_MINUTES"] as Decimal?)
};
And that's assuming your where clause is not just long to type, but computationally expensive to evaluate. That will iterate through the list once per column.
If you really want to only iterate the list once, you can probably do it with Enumerable.Aggregate, but the code is less elegant (in my opinion):
// run the filters and compute all the totals in a single pass
var a = (from m in month
         where <some long expression>
         select m)
        .Aggregate(new { BGCO_MINUTES = (decimal?)0m,
                         _800IB_MINUTES = (decimal?)0m },
                   (ac, v) => new { BGCO_MINUTES = ac.BGCO_MINUTES + (decimal?)v["BGCO_MINUTES"],
                                    _800IB_MINUTES = ac._800IB_MINUTES + (decimal?)v["800IB_MINUTES"] });
Like I said, I think it's less elegant than the first version, but it should work. Even though the first one requires a temporary copy of the values that match the where clause (memory cost) and 1 pass through the list for each field (CPU cost), I think it's a lot more readable than the latter version - make sure the performance difference is worth it before using the less-understandable version.
Use Aggregate instead of Sum, as it is more flexible: you can use an object (or simply a dictionary) to hold the sums for the individual columns while iterating through the rows once.
(non-compiled code ahead)
class SumObject {
    public decimal First;
    public decimal Second;
}

var filtered = (from m in month
                where <some long expression>
                select m);

var sums = filtered.Aggregate(new SumObject(), (currentSum, item) => {
    currentSum.First += item.Field<decimal?>("BGCO_MINUTES") ?? 0m;
    currentSum.Second += item.Field<decimal?>("800IB_MINUTES") ?? 0m;
    return currentSum;
});
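To finish the original task, the accumulated totals can then be written into the tally row; a minimal sketch, where monthTable is a hypothetical reference to the DataTable the query enumerates and sums is the result of the Aggregate call above:
// append the tally row with the accumulated column totals
DataRow tally = monthTable.NewRow();
tally["BGCO_MINUTES"] = sums.First;
tally["800IB_MINUTES"] = sums.Second;
monthTable.Rows.Add(tally);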