Deleting duplicate rows from a DataTable in C#

I'm currently having a problem with my C# code; it is producing duplicate rows in my data table but not in the database itself. I can't for the life of me find what's causing it, so I've tried creating a workaround, but that doesn't seem to be working either. Any help would be appreciated.
for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
    for (int n = 0; n < ds.Tables[0].Rows.Count; n++)
    {
        if (n == i)
        {
            //do nothing
        }
        else
        {
            if (ds.Tables[0].Rows[i] == ds.Tables[0].Rows[n])
            {
                ds.Tables[0].Rows[n].Delete();
            }
        }
    }
}

I suppose your issue is related to the equality comparer. Try to map the DataRow (difficult to handle as is) onto a simpler POCO class (a small class used only as a data model).
Then use GroupBy or Distinct to filter your data.
Here is a little example.
The POCO class:
The POCO class
public class DataRowModel
{
    public object FirstCellData { get; set; }
    public object SecondCellData { get; set; }
    //... for every column in the DataRow
    public object LastCellData { get; set; }

    public DataRowModel(DataRow row)
    {
        this.FirstCellData = row["FIELD_1"];
        this.SecondCellData = row["FIELD_2"];
        this.LastCellData = row["FIELD_3"];
    }
}
The code to achieve unique rows:
List<DataRowModel> rows = new List<DataRowModel>();
foreach (DataRow row in ds.Tables[0].Rows)
    rows.Add(new DataRowModel(row));

List<DataRowModel> uniqueRows = new List<DataRowModel>();
foreach (var groupedRow in rows.GroupBy(x => new { x.FirstCellData, x.SecondCellData, /*.. for each cell..*/ x.LastCellData }))
    uniqueRows.Add(groupedRow.First());
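As an alternative, here is a minimal sketch (not from the original answer) that skips the POCO entirely. It assumes .NET 3.5+ with System.Data.DataSetExtensions referenced; DataRowComparer.Default compares rows field by field, whereas == in the question only compares object references:
using System.Data;
using System.Linq;

// Assumes the same DataSet "ds" as in the question.
// DataRowComparer.Default is a value-based IEqualityComparer<DataRow>.
DataTable distinctTable = ds.Tables[0]
    .AsEnumerable()
    .Distinct(DataRowComparer.Default)
    .CopyToDataTable();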


Bulk insert of ICollection into an Oracle Table

I'm writing an API client from a spec that came from Nswag studio. I am able to retrieve that data using the supplied client.PlansAsync(apikey).GetAwaiter().GetResult(), but I'm struggling to turn the returned ICollection into something that I'm able to bulk insert into an Oracle database table.
I have attempted to create a DataTable, but during the transformation into a DataTable an exception is being thrown. I suspect it has something to do with nullable types in the collection.
My guess is that I should be attempting to do the inserts using Entity Framework, but it seems like adding all of the extra EF Core stuff is overkill for this particular client.
I feel like the Oracle bulk copy methods are perfect for what I'm intending, but I've been running into the issue listed above.
Any help would be greatly appreciated.
TIA
EDIT: Here's the code in question.
//first, in the calling class
ICollection<Plans> plansFromApi = client_.PlansAsync(apiKey).GetAwaiter().GetResult();
ListToDataTable listToDt = new();
List<Plans> ps = plansFromApi.ToList();
DataTable dt = listToDt.ToDataTable<Plans>(ps);
//second, the ListToDataTable class
public class ListToDataTable
{
    public DataTable ToDataTable<T>(List<T> items)
    {
        DataTable dataTable = new(typeof(T).Name);
        PropertyInfo[] Props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
        foreach (T item in items)
        {
            var values = new object[Props.Length];
            for (int i = 0; i < Props.Length; i++)
            {
                values[i] = Props[i].GetValue(item);
            }
            //line where the exception is thrown
            //System.ArgumentException: 'Input array is longer than the number of columns in this table.'
            dataTable.Rows.Add(values);
        }
    }
}
EDIT 2:
Here is what came out of Nswag studio. This is just one of 15 datasets that I need to retrieve. This isn't the one I'm currently testing, as that one has 25 properties, so for brevity I'm including one of the smaller ones. In the end they all will be the same, since they are all going to be processed the exact same way, and yes, I have tested with this dataset as well, and received the same exception.
[System.CodeDom.Compiler.GeneratedCode("NJsonSchema", "10.5.2.0 (Newtonsoft.Json v12.0.0.0)")]
public partial class ContactGroupedManufacturer
{
    [Newtonsoft.Json.JsonProperty("lastContacted", Required = Newtonsoft.Json.Required.Default, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
    public System.DateTimeOffset? LastContacted { get; set; }

    [Newtonsoft.Json.JsonProperty("vendorContactId", Required = Newtonsoft.Json.Required.DisallowNull, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
    public int VendorContactId { get; set; }

    [Newtonsoft.Json.JsonProperty("ManufacturerId", Required = Newtonsoft.Json.Required.DisallowNull, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
    public int ManufacturerId { get; set; }

    [Newtonsoft.Json.JsonProperty("website", Required = Newtonsoft.Json.Required.Default, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
    public string Website { get; set; }
}
Here are a few rows of data:
lastContacted | vendorContactId | manufacturerId | website
              | 6575            | 1848           |
              | 6599            | 2693           |
              | 6604            | 8878           |
06/08/2018    | 6692            | 6879           |
              | 6930            | 4040           | some url
UPDATE 2021/11/10: I found a NuGet package called MoreLinq that contained an extension method that handled the transformation to a DataTable.
ICollection<ActionPlans> actionPlans = client.ActionPlansAsync(apiKey).GetAwaiter().GetResult();
_logger.LogInformation($"{actionPlans.Count} APs returned");
DataTable actionPlansDt = actionPlans.ToDataTable();
You can wrap the IEnumerable in an IDataReader to pass to the bulk copy method. See, for example:
ObjectDataReader
Once you have an IDataReader, pass it to OracleBulkCopy.WriteToServer(IDataReader). See OracleBulkCopy.
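A minimal sketch of that bulk copy call, assuming ODP.NET's OracleBulkCopy; the connection string, the reader variable, and the target table name are placeholders, and the namespace differs between the unmanaged and managed drivers:
using Oracle.DataAccess.Client; // Oracle.ManagedDataAccess.Client in newer ODP.NET releases

using (var connection = new OracleConnection(connectionString)) // connectionString is assumed
{
    connection.Open();
    using (var bulkCopy = new OracleBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "PLANS"; // hypothetical target table
        bulkCopy.WriteToServer(reader);          // any IDataReader, e.g. an ObjectDataReader over the collection
    }
}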
You probably have a problem with the number of columns in the DataTable.
Try this code:
public class ListToDataTable
{
    public DataTable ToDataTable<T>(List<T> items)
    {
        DataTable dataTable = new DataTable();
        PropertyInfo[] Props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
        bool columnsAlreadyCreated = false;
        foreach (T item in items)
        {
            if (columnsAlreadyCreated == false)
            {
                // Create one column per property, unwrapping nullable types
                // (DataColumn types cannot be Nullable<T>; nulls are stored as DBNull).
                for (int i = 0; i < Props.Length; i++)
                {
                    dataTable.Columns.Add(Props[i].Name, Nullable.GetUnderlyingType(
                        Props[i].PropertyType) ?? Props[i].PropertyType);
                }
                columnsAlreadyCreated = true;
            }
            var values = new object[Props.Length];
            for (int i = 0; i < Props.Length; i++)
            {
                values[i] = Props[i].GetValue(item);
            }
            // The line that previously threw
            // System.ArgumentException: 'Input array is longer than the number of columns in this table.'
            // now succeeds because the columns have been created above.
            dataTable.Rows.Add(values);
        }
        return dataTable;
    }
}
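A short usage sketch of the corrected helper, reusing the names from the question:
// plansFromApi comes from the API call shown in the question.
var listToDt = new ListToDataTable();
DataTable dt = listToDt.ToDataTable(plansFromApi.ToList());
// dt now has one column per Plans property and can be handed to the bulk copy.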

How to query a DataTable using LINQ and add values to a List?

Sample Class:
public class ProductData
{
    private Guid ProductID { get; set; }
    private string ProductDescription { get; set; }

    public ProductData(Guid pID, string pDescription)
    {
        this.ProductID = pID;
        this.ProductDescription = pDescription;
    }
}
Create a list of ProductData:
private static List<ProductData> GetProductDataList()
{
// code to populate DataSet ds here
DataTable dtReport = ds.Tables[0];
List<AssetData> lstProductData = new List<ProductData>();
int index = 1;
foreach (DataRow row in dtReport.Rows)
{
lstProductData.Add(new ProductData(new Guid(row["ProductID"].ToString()), row["Product"].ToString()));
index++;
}
return lstProductData.ToList();
}
The code works perfectly fine and as expected, but I think the foreach loop can be avoided using LINQ. I try to utilize LINQ as much as possible for various reasons (cleaner-looking code is one of the reasons - correct me if I am wrong).
Is there any way I can achieve the same thing as above using LINQ, with minimal code?
This can be done using Select and ToList:
var lstProductData = dtReport.Rows.Cast<DataRow>()
    .Select(row => new ProductData(new Guid(row["ProductID"].ToString()),
                                   row["Product"].ToString()))
    .ToList();
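An equivalent sketch using the typed Field<T> accessor from System.Data.DataSetExtensions; this assumes the ProductID column is actually typed as Guid in the DataTable (otherwise keep the new Guid(...) conversion):
var lstProductData = dtReport.AsEnumerable()
    .Select(row => new ProductData(row.Field<Guid>("ProductID"),
                                   row.Field<string>("Product")))
    .ToList();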

Importing an Excel Sheet and Validating the Imported Data in a Loosely Coupled Way

I am trying to develop a module which will read Excel sheets (possibly from other data sources too, so it should be loosely coupled) and convert them into entities so they can be saved.
The logic will be this:
The Excel sheet can be in different formats; for example, column names in the Excel sheet can be different, so my system needs to be able to map different fields to my entities.
For now I will assume the format defined above stays the same and is hardcoded, instead of coming dynamically from the database after being set up on some configuration-mapping UI.
The data needs to be validated before it even gets mapped, so I should be able to validate it beforehand against something. We're not using XSD or anything similar, so I should validate it against the object structure I am using as a template for importing.
The problem is, I put some things together, but I can't say I like what I did. My question is how I can improve the code below, make things more modular, and fix the validation issues.
The code below is a mock-up and is not expected to work; it is just meant to show the structure of the design.
This is the code I've come up with so far. I've realized I need to improve my design-pattern skills, but for now I need your help:
//The Controller, a placeholder
class UploadController
{
//Somewhere here we call appropriate class and methods in order to convert
//excel sheet to dataset
}
After we upload the file using an MVC controller, there could be different importers specialized for certain behaviors; in this example I will be uploading person-related tables:
interface IDataImporter
{
void Import(DataSet dataset);
}
//We can use many other importers besides PersonImporter
class PersonImporter : IDataImporter
{
//We divide dataset to approprate data tables and call all the IImportActions
//related to Person data importing
//We call inserting to database functions here of the DataContext since this way
//we can do less db roundtrip.
public string PersonTableName {get;set;}
public string DemographicsTableName {get;set;}
public void Import(DataSet dataset)
{
CreatePerson(dataset);
CreateDemographics(dataset);
}
//We put different things in different methods to keep things clear. High cohesion.
private void CreatePerson(DataSet dataset)
{
var personDataTable = GetDataTable(dataset,PersonTableName);
IImportAction addOrUpdatePerson = new AddOrUpdatePerson();
addOrUpdatePerson.MapEntity(personDataTable);
}
private void CreateDemographics(DataSet dataset)
{
var demographicsDataTable = GetDataTable(dataset,DemographicsTableName);
IImportAction demoAction = new AddOrUpdateDemographic();
demoAction.MapEntity(demographicsDataTable);
}
private DataTable GetDataTable(DataSet dataset, string tableName)
{
return dataset.Tables[tableName];
}
}
I have IDataImporter and the specialized concrete class PersonImporter. However, I am not sure it looks good so far; things should be SOLID, i.e. easy to extend later in the project cycle, since this will be a foundation for future improvements. Let's keep going:
IImportActions are where the magic mostly happens. Instead of designing things table-based, I am developing them behavior-based, so one can call any of them to import things in a more modular way. For example, a table may have two different actions.
interface IImportAction
{
void MapEntity(DataTable table);
}
//A sample import action, AddOrUpdatePerson
class AddOrUpdatePerson : IImportAction
{
//Consider using default values as well?
public string FirstName {get;set;}
public string LastName {get;set;}
public string EmployeeId {get;set;}
public string Email {get;set;}
public void MapEntity(DataTable table)
{
//Each action is producing its own data context since they use
//different actions.
using(var dataContext = new DataContext())
{
foreach(DataRow row in table.Rows)
{
var emailValidation = ValidationFactory.EmailValidation.Value;
if (!emailValidation.Validate(row[Email]))
{
LoggingService.LogWarning(emailValidation.ValidationMessage);
}
var person = new Person(){
FirstName = row[FirstName],
LastName = row[LastName],
EmployeeId = row[EmployeeId],
Email = row[Email]
};
dataContext.SaveObject(person);
}
dataContext.SaveChangesToDatabase();
}
}
}
class AddOrUpdateDemographic: IImportAction
{
static string Name {get;set;}
static string EmployeeId {get;set;}
//So here, for example, we will need to save the dataContext first before passing it in
//to get the PersonId from Person (we're assuming that we need PersonId for Demographics)
public void MapEntity(DataTable table)
{
using(var dataContext = new DataContext())
{
foreach(DataRow row in table.Rows)
{
var demographic = new Demographic(){
Name = row[Name],
PersonId = dataContext.People.First(t => t.EmployeeId == int.Parse(row["EmpId"]))
};
dataContext.SaveObject(demographic);
}
dataContext.SaveChangesToDatabase();
}
}
}
And then the validation, which is the part I struggle with most, unfortunately. The validation needs to be easy to extend and loosely coupled, and I also need to be able to call it beforehand instead of wiring everything in.
public static class ValidationFactory
{
public static Lazy<IFieldValidation> PhoneValidation = new Lazy<IFieldValidation>(()=>new PhoneNumberValidation());
public static Lazy<IFieldValidation> EmailValidation = new Lazy<IFieldValidation>(()=>new EmailValidation());
//etc.
}
interface IFieldValidation
{
string ValidationMessage { get; set; }
bool Validate(object value);
}
class PhoneNumberValidation : IFieldValidation
{
public string ValidationMessage { get; set; }
public bool Validate(object value)
{
var validated = true; //lets say...
var innerValue = (string) value;
//validate innerValue using Regex or something
//if validation fails, then set the ValidationMessage property for logging.
return validated;
}
}
class EmailValidation : IFieldValidation
{
public string ValidationMessage { get; set; }
public bool Validate(object value)
{
var validated = true; //lets say...
var innerValue = (string) value;
//validate innerValue using Regex or something
//if validation fails, then set the ValidationMessage property for logging.
return validated;
}
}
I have done the same thing on a project. The difference is that I didn't have to import Excel sheets, but CSV files. I created a CSVValueProvider, and as a result the CSV data was bound to my IEnumerable model automatically.
As for validation, I figured that going through all rows and cells and validating them one by one is not very efficient, especially when the CSV file has thousands of records. So what I did was create some validation methods that go through the CSV data column by column instead of row by row, run a LINQ query on each column, and return the row numbers of the cells with invalid data. Then the invalid row number/column name pairs are added to ModelState.
UPDATE:
Here is what I have done...
CSVReader Class:
// A class that can read and parse the data in a CSV file.
public class CSVReader
{
// Regex expression that's used to parse the data in a line of a CSV file
private const string ESCAPE_SPLIT_REGEX = "({1}[^{1}]*{1})*(?<Separator>{0})({1}[^{1}]*{1})*";
// String array to hold the headers (column names)
private string[] _headers;
// List of string arrays to hold the data in the CSV file. Each string array in the list represents one line (row).
private List<string[]> _rows;
// The StreamReader class that's used to read the CSV file.
private StreamReader _reader;
public CSVReader(StreamReader reader)
{
_reader = reader;
Parse();
}
// Reads and parses the data from the CSV file
private void Parse()
{
_rows = new List<string[]>();
string[] row;
int rowNumber = 1;
var headerLine = "RowNumber," + _reader.ReadLine();
_headers = GetEscapedSVs(headerLine);
rowNumber++;
while (!_reader.EndOfStream)
{
var line = rowNumber + "," + _reader.ReadLine();
row = GetEscapedSVs(line);
_rows.Add(row);
rowNumber++;
}
_reader.Close();
}
private string[] GetEscapedSVs(string data)
{
if (!data.EndsWith(","))
data = data + ",";
return GetEscapedSVs(data, ",", "\"");
}
// Parses each row by using the given separator and escape characters
private string[] GetEscapedSVs(string data, string separator, string escape)
{
string[] result = null;
int priorMatchIndex = 0;
MatchCollection matches = Regex.Matches(data, string.Format(ESCAPE_SPLIT_REGEX, separator, escape));
// Skip empty rows...
if (matches.Count > 0)
{
result = new string[matches.Count];
for (int index = 0; index <= result.Length - 2; index++)
{
result[index] = data.Substring(priorMatchIndex, matches[index].Groups["Separator"].Index - priorMatchIndex);
priorMatchIndex = matches[index].Groups["Separator"].Index + separator.Length;
}
result[result.Length - 1] = data.Substring(priorMatchIndex, data.Length - priorMatchIndex - 1);
for (int index = 0; index <= result.Length - 1; index++)
{
if (Regex.IsMatch(result[index], string.Format("^{0}.*[^{0}]{0}$", escape)))
result[index] = result[index].Substring(1, result[index].Length - 2);
result[index] = result[index].Replace(escape + escape, escape);
if (result[index] == null || result[index] == escape)
result[index] = "";
}
}
return result;
}
// Returns the number of rows
public int RowCount
{
get
{
if (_rows == null)
return 0;
return _rows.Count;
}
}
// Returns the number of headers (columns)
public int HeaderCount
{
get
{
if (_headers == null)
return 0;
return _headers.Length;
}
}
// Returns the value in a given column name and row index
public object GetValue(string columnName, int rowIndex)
{
if (rowIndex >= _rows.Count)
{
return null;
}
var row = _rows[rowIndex];
int colIndex = GetColumnIndex(columnName);
if (colIndex == -1 || colIndex >= row.Length)
{
return null;
}
var value = row[colIndex];
return value;
}
// Returns the column index of the provided column name
public int GetColumnIndex(string columnName)
{
int index = -1;
for (int i = 0; i < _headers.Length; i++)
{
if (_headers[i].Replace(" ","").Equals(columnName, StringComparison.CurrentCultureIgnoreCase))
{
index = i;
return index;
}
}
return index;
}
}
CSVValueProviderFactory Class:
public class CSVValueProviderFactory : ValueProviderFactory
{
public override IValueProvider GetValueProvider(ControllerContext controllerContext)
{
var uploadedFiles = controllerContext.HttpContext.Request.Files;
if (uploadedFiles.Count > 0)
{
var file = uploadedFiles[0];
var extension = file.FileName.Split('.').Last();
if (extension.Equals("csv", StringComparison.CurrentCultureIgnoreCase))
{
if (file.ContentLength > 0)
{
var stream = file.InputStream;
var csvReader = new CSVReader(new StreamReader(stream, Encoding.Default, true));
return new CSVValueProvider(controllerContext, csvReader);
}
}
}
return null;
}
}
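For completeness, a sketch of how such a factory is typically registered with ASP.NET MVC, e.g. in Application_Start in Global.asax.cs; this registration step is assumed, not taken from the original answer:
protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    // ... other startup code ...
    // Register the custom value provider factory so MVC can bind uploaded CSV data.
    ValueProviderFactories.Factories.Add(new CSVValueProviderFactory());
}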
CSVValueProvider Class:
// Represents a value provider for the data in an uploaded CSV file.
public class CSVValueProvider : IValueProvider
{
private CSVReader _csvReader;
public CSVValueProvider(ControllerContext controllerContext, CSVReader csvReader)
{
if (controllerContext == null)
{
throw new ArgumentNullException("controllerContext");
}
if (csvReader == null)
{
throw new ArgumentNullException("csvReader");
}
_csvReader = csvReader;
}
public bool ContainsPrefix(string prefix)
{
if (prefix.Contains('[') && prefix.Contains(']'))
{
if (prefix.Contains('.'))
{
var header = prefix.Split('.').Last();
if (_csvReader.GetColumnIndex(header) == -1)
{
return false;
}
}
int index = int.Parse(prefix.Split('[').Last().Split(']').First());
if (index >= _csvReader.RowCount)
{
return false;
}
}
return true;
}
public ValueProviderResult GetValue(string key)
{
if (!key.Contains('[') || !key.Contains(']') || !key.Contains('.'))
{
return null;
}
object value = null;
var header = key.Split('.').Last();
int index = int.Parse(key.Split('[').Last().Split(']').First());
value = _csvReader.GetValue(header, index);
if (value == null)
{
return null;
}
return new ValueProviderResult(value, value.ToString(), CultureInfo.CurrentCulture);
}
}
For the validation, as I mentioned before, I figured that it would not be efficient to do it using DataAnnotation attributes. A row by row validation of the data would take a long time for CSV files with thousands of rows. So, I decided to validate the data in the Controller after the Model Binding is done. I should also mention that I needed to validate the data in the CSV file against some data in the database. If you just need to validate things like Email Address or Phone Number, you might as well just use DataAnnotation.
Here is a sample method for validating the Email Address column:
private void ValidateEmailAddress(IEnumerable<CSVViewModel> csvData)
{
var invalidRows = csvData.Where(d => ValidEmail(d.EmailAddress) == false).ToList();
foreach (var invalidRow in invalidRows)
{
var key = string.Format("csvData[{0}].{1}", invalidRow.RowNumber - 2, "EmailAddress");
ModelState.AddModelError(key, "Invalid Email Address");
}
}
private static bool ValidEmail(string email)
{
if(email == "")
return false;
else
return new System.Text.RegularExpressions.Regex(@"^[\w-\.]+@([\w-]+\.)+[\w-]{2,6}$").IsMatch(email);
}
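A hypothetical controller action tying the pieces together; the parameter name csvData matches the key prefix used in the value provider above, and CSVViewModel is assumed to expose the bound columns plus the RowNumber used during validation:
[HttpPost]
public ActionResult Upload(IEnumerable<CSVViewModel> csvData)
{
    // Custom, column-wise validation after model binding.
    ValidateEmailAddress(csvData);
    if (!ModelState.IsValid)
    {
        return View();
    }
    // ... save the valid data ...
    return RedirectToAction("Index");
}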
UPDATE 2:
For validation using DataAnnotation, you just use DataAnnotation attributes in your CSVViewModel like below (the CSVViewModel is the class that your CSV data will be bound to in your Controller Action):
public class CSVViewModel
{
// Use proper names for your CSV columns; these are just examples...
[Required]
public int Column1 { get; set; }
[Required]
[StringLength(30)]
public string Column2 { get; set; }
}

DataGridView data binding when creating a new object on a new row

I've got a little problem with data binding between a DataGridView and a PropertyGrid.
Here is the code for the object I am binding to and for the DataGridView:
public class Effort
{
public BindingList<EffortCalculationRelation> CalculationRelations { get; set; }
public int ID { get; set; }
// more properties
public Effort()
{
CalculationRelations = new BindingList<EffortCalculationRelation>();
CalculationRelations.Clear();
for (int i=0;i<10;i++)
{
CalculationRelations.Add( new EffortCalculationRelation() { ID = i, Name = "Round:" + i.ToString(), calculation = "Some calc" });
}
}
public Effort(int id) : this()
{
this.ID = id;
// Load all other properties
}
public class EffortCalculationRelation
{
public int ID { get; set; }
public string Name { get; set; }
public string calculation { get; set; }
public int Save()
{
// save or insert and return id or 0 on fail
if (this.ID > 0)
{
return this.Update();
}
else
{
return this.Insert();
}
}
public string Delete()
{
// delete and return "" or errormsg on fail
return "";
}
private int Insert()
{
// insert and return id or 0 on fail
return ID;
}
private int Update()
{
// return affected rows or 0 on fail
return 1;
}
public string Representation
{
get { return String.Format("{0}: {1}", ID, Name); }
}
}
}
The DataGridView connection is really simple, with just a little styling:
public test()
{
effort = new Effort(1209);
dgv.DataSource = effort.CalculationRelations;
dgv.SelectionMode = System.Windows.Forms.DataGridViewSelectionMode.FullRowSelect;
dgv.AllowUserToAddRows = true;
//this.dgv.AllowUserToDeleteRows = false;
dgv.AllowUserToResizeRows = false;
dgv.ReadOnly = true;
dgv.SelectionChanged += (sender, args) =>
{
var selectedObjects =
(from System.Windows.Forms.DataGridViewRow r in dgv.SelectedRows
where r.DataBoundItem != null && r.DataBoundItem.GetType() == typeof(EffortCalculationRelation)
select r.DataBoundItem).ToArray();
// pg is a propertygrid
this.pg.SelectedObjects = selectedObjects;
};
}
My problem is that when I select the new row in the DataGridView, no properties are displayed in the PropertyGrid.
When I select a row whose object was already in the list when I loaded it, then I can edit the properties.
So could you please help?
The reason the new row does not show in the property grid is that its DataBoundItem is null, so it is removed by your LINQ clause where r.DataBoundItem != null. 1
I agree with you that this is very annoying behaviour, particularly since the object has in some sense been created.
I had thought there was a viable workaround: in certain circumstances, bind the property grid to the new object in either the parent Effort object or in a BindingSource, using some code like:
var selectedObjects =
(from System.Windows.Forms.DataGridViewRow r in dataGridView1.SelectedRows
where r.DataBoundItem != null && r.DataBoundItem.GetType() == typeof(Effort.EffortCalculationRelation)
select r.DataBoundItem).ToArray();
if (dataGridView1.CurrentRow.IsNewRow && dataGridView1.SelectedRows.Count == 1)
{
// I tried accessing the parent object like this:
//Effort.EffortCalculationRelation ecr = effort.CalculationRelations[effort.CalculationRelations.Count - 1];
//propertyGrid1.SelectedObject = ecr;
// Or accessing a binding source like:
propertyGrid1.SelectedObject = calculationRelations.Current;
}
else
{
propertyGrid1.SelectedObjects = selectedObjects;
}
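The calculationRelations binding source used above is not shown in the question; a minimal assumed setup would look roughly like this:
// Bind the grid through a BindingSource instead of directly to the list.
var calculationRelations = new BindingSource();
calculationRelations.DataSource = effort.CalculationRelations;
dataGridView1.DataSource = calculationRelations;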
I experimented with these variations a bit, as well as adding these items into the SelectedObjects array, thinking something would meet your requirements. What I eventually realised was that shifting focus from the new row to the property grid before the DataGridView had committed the new row meant that the new row was lost and could no longer be edited.
So - what to do?
If I were in your spot I'd consider one of two things:
Allow direct editing in the grid in some form - maybe just for the new row.
Something like this in the selection changed event would work:
if (dataGridView1.CurrentRow.IsNewRow)
{
dataGridView1.CurrentRow.ReadOnly = false;
}
else
{
dataGridView1.CurrentRow.ReadOnly = true;
}
Keep the grid as is but don't allow new rows - instead handle creating new objects via a separate row-creation panel.
1 This works this way apparently by design - the DataBoundItem is not committed until you leave the grid. There is a little discussion, including the DataGridView code in question, here.

How can I add sequence IDs to classes in a list of another class?

I have the following class. Inside of the Parent class is a List of ParentDetail. Now I need to add a new field to the ParentDetail class. The field is called Id. What I need is a method in the main class that will iterate through the ParentDetails and populate the Id field with a number starting at 1.
Can anyone think of an easy way to do this? I am not sure how I can iterate through the List.
public class Parent {
    public IList<ParentDetail> ParentDetails {
        get { return _ParentDetails; }
    }
    private List<ParentDetail> _ParentDetails = new List<ParentDetail>();
    public Parent() {
        this._ParentDetails = new List<ParentDetail>();
    }
}

public class ParentDetail {
    public int Id { get; set; }   // <<<<<<<< new field
}
for (int i = 0; i < _ParentDetails.Count; i++)
{
    _ParentDetails[i].Id = i + 1;
}
You could do a straight for (int i = 0; i < Count; i++) loop as suggested by Roy Dictus (+1 from me) - I'm just throwing this up here as an alternative, which is very useful in situations where you don't know the count of an enumerable.
foreach (var detail in _ParentDetails.
    Select((d, i) => new { Item = d, Index = i + 1 }))
{
    detail.Item.Id = detail.Index;
}
In your case you do know the count, however, since you have an IList.
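If you want the numbering exposed as a method on the Parent class itself, as the question describes, a minimal sketch (assuming the classes shown above) could be:
public void AssignDetailIds()
{
    // Number the details sequentially, starting at 1.
    for (int i = 0; i < _ParentDetails.Count; i++)
    {
        _ParentDetails[i].Id = i + 1;
    }
}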
