Fastest coding way to add rows to an ASP.net DataTable - c#

What's the fastest way, in terms of coding speed, to add rows to a DataTable? I don't know either the names of the columns or their data types. Is it possible to add rows without previously specifying the number or names of the DataTable's columns?
DataTable t = new DataTable();
t.Rows.Add(value1,
value2,
value3,
...
valueN
);
DataSet ds = new DataSet();
ds.Tables.Add(t);

If the input comes out of a collection, you could loop over it to create the DataColumns with the correct types:
var data = new Object[] { "A", 1, 'B', 2.3 };
DataTable t = new DataTable();
// create all DataColumns
for (int i = 0; i < data.Length; i++)
{
t.Columns.Add(new DataColumn("Column " + i, data[i].GetType()));
}
// add the row to the table
t.Rows.Add(data);

To answer your first question: no, you have to have columns defined on the table. You can't just say, "Hey, make a column for all these values." Nothing is stopping you from creating columns on the fly, though, as Mr. Schmelter says.
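Combining the two answers above, here is a minimal sketch of building a whole table from a collection of value arrays. BuildTable and the "Column" + i names are illustrative choices, not a framework API; the columns are inferred from the first row's value types:

```csharp
using System.Collections.Generic;
using System.Data;

static DataTable BuildTable(IEnumerable<object[]> rows)
{
    var t = new DataTable();
    foreach (object[] values in rows)
    {
        // Create the columns on the fly from the first row's value types.
        if (t.Columns.Count == 0)
            for (int i = 0; i < values.Length; i++)
                t.Columns.Add("Column" + i, values[i].GetType());
        t.Rows.Add(values); // params object[] overload
    }
    return t;
}
```

Note the sketch assumes every row has the same shape as the first and no null values in the first row (GetType would throw otherwise).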

Without knowing the columns it is not possible; you first need to add the columns. Once they exist, you can assign the values like this:
for(int intCount = 0;intCount < dt.Rows.Count; intCount++)
{
for(int intSubCount = 0; intSubCount < dt.Columns.Count; intSubCount++)
{
dt.Rows[intCount][intSubCount] = yourValue; // or assign to something
}
}


Copy a range of data rows to another datatable - C#

I have a DataTable that contains 100 rows. I want to copy a range of rows (the 31st to the 50th) to another DataTable.
I am using the following logic:
DataTable dtNew = table.Clone();
for(int k=30; k < 50 && k < table.Rows.Count; k++)
{
dtNew.ImportRow(table.Rows[k]);
}
Is there any better approach to do this?
Using LINQ you can do something like:
DataTable dtNew = table.Select().Skip(30).Take(20).CopyToDataTable();
Note that Skip is zero-based: skipping 30 rows starts the copy at the 31st row. Performance-wise, LINQ won't do any better; it does, however, make the code more readable.
EDIT: Added a guard, since CopyToDataTable throws if the selection is empty:
int rowsToTake = 20;
int startIndex = 30; // zero-based index of the 31st row
if (table.Rows.Count >= startIndex + rowsToTake)
{
DataTable dtNew = table.Select().Skip(startIndex).Take(rowsToTake).CopyToDataTable();
}
If it's about readability, then a good idea would be to throw this into an extension method.
Without changing your logic:
public static class Utils
{
public static void CopyRows(this DataTable from, DataTable to, int min, int max)
{
for (int i = min; i < max && i < from.Rows.Count; i++)
to.ImportRow(from.Rows[i]);
}
}
Then you can reuse it without all the fancy syntax, and know that it does exactly what you need if performance is a concern:
DataTable dt1 = new DataTable();
DataTable dt2 = new DataTable();
dt1.CopyRows(dt2, 30, 50);

C# - Specified cast is not valid using DataTable and Field<int>

I have a csv file with 8 columns, and I am trying to populate an object with 8 variables, each being a list to hold the columns in the csv file. Firstly, I am populating a DataTable with my csv data.
I am now trying to populate my object with the data from the DataTable
DataTable d = GetDataTableFromCSVFile(file);
CoolObject l = new CoolObject();
for (int i = 0; i < d.Rows.Count; i++)
{
l.column1[i] = d.Rows[i].Field<int>("column1"); // <-- error here
}
And here is my CoolObject
public class CoolObject
{
public List<int> column1 { set; get; }
public CoolObject()
{
column1 = new List<int>();
}
}
Unfortunately I am receiving an error on the highlighted line:
System.InvalidCastException: Specified cast is not valid
Why is this not allowed? How do I work around it?
Obviously your DataTable contains columns of type string, so do the integer validation in the GetDataTableFromCSVFile method; then consumers of that method don't need to worry about it.
private DataTable GetDataTableFromCSVFile()
{
var data = new DataTable();
data.Columns.Add("Column1", typeof(int));
// Read lines of the file.
// "line" is an imaginary object which contains the values of one row of CSV data
foreach (var line in lines)
{
var row = data.NewRow();
int.TryParse(line.Column1Value, out int column1Value);
row.SetField("Column1", column1Value); // will set 0 if the value is invalid
// other columns
data.Rows.Add(row); // NewRow does not add the row to the table; add it explicitly
}
return data;
}
Another problem with your code: you assign new values to the List<int> through an index while the list is empty:
l.column1[i] = d.Rows[i].Field<int>("column1");
The line above will throw an exception, because an empty list has no item at index i.
So in the end your method will look like this:
DataTable d = GetDataTableFromCSVFile(file);
CoolObject l = new CoolObject();
foreach (DataRow row in d.Rows)
{
l.column1.Add(row.Field<int>("column1"));
}
In case you are using some third-party library for retrieving data from csv to DataTable - you can check if that library provide possibility to validate/convert string values to expected types in DataTable.
Sounds like someone didn't enter a number in one of the cells. You'll have to perform a validation check before reading the value.
for (int i = 0; i < d.Rows.Count; i++)
{
object o = d.Rows[i]["column1"];
if (!(o is int)) continue;
l.column1.Add((int)o);
}
Or perhaps it is a number but for some reason is coming through as a string. You could try it this way:
for (int i = 0; i < d.Rows.Count; i++)
{
int n;
bool ok = int.TryParse(d.Rows[i]["column1"].ToString(), out n);
if (!ok) continue;
l.column1.Add(n);
}
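A variant of the same idea that avoids the string round-trip when the column is already numeric (a sketch; "column1", d and l come from the question's code):

```csharp
foreach (DataRow row in d.Rows)
{
    object o = row["column1"];
    if (o is int typed)
        l.column1.Add(typed);                                   // column already typed as int
    else if (int.TryParse(Convert.ToString(o), out int parsed))
        l.column1.Add(parsed);                                  // string fallback
    // anything else (DBNull, garbage text) is silently skipped
}
```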

Coded UI - Filling DataTable from a UI Table (HtmlTable)

In Coded UI, I am facing a problem filling my DataTable from an HtmlTable on the UI.
It takes a lot of time to fill the DataTable when the UI table has 1000s of rows. I am working like this:
DataTable TestDataTable = new DataTable();
for (int i = 0; i < Table.RowCount; i++)
{
HtmlRow hr = (HtmlRow)Table.Rows[i];
for (int k = 0; k < hr.CellCount; k++)
{
TestDataTable.Rows[i][k] = hr.Cells[k].FriendlyName;
}
}
It's working fine, but as said it takes a lot of time. So is there any way I could fill the DataTable faster?
Thanks,
Aashish Gupta
Possibly by moving TestDataTable.Rows[i] out of the inner loop, as it may be doing a full table (i, k) index evaluation every time:
HtmlRow hr = (HtmlRow)Table.Rows[i];
DataRow dest = TestDataTable.Rows[i];
for (int k = 0; k < hr.CellCount; k++)
{
dest[k] = hr.Cells[k].FriendlyName;
}
The data type in the code above was altered based on the asker's comment. As the member types within TestDataTable are not specified, DataRow is assumed for dest.
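If the column count is known up front, a variant that defines the columns once and adds each row with a single Rows.Add call may also help, since the original snippet indexes an empty DataTable cell by cell. This is a sketch: HtmlRow, RowCount, CellCount and FriendlyName are the Coded UI types and members from the question, and the "Column" + k names are illustrative:

```csharp
var result = new DataTable();
int columnCount = ((HtmlRow)Table.Rows[0]).CellCount;
for (int k = 0; k < columnCount; k++)
    result.Columns.Add("Column" + k, typeof(string));

var buffer = new object[columnCount];
for (int i = 0; i < Table.RowCount; i++)
{
    HtmlRow hr = (HtmlRow)Table.Rows[i];
    for (int k = 0; k < hr.CellCount; k++)
        buffer[k] = hr.Cells[k].FriendlyName;
    result.Rows.Add(buffer); // one Add per row instead of per-cell indexing
}
```

Most of the cost is likely in the UI automation calls themselves, so the main win is minimizing how often the DataTable is touched per row.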

Fastest way to copy a list of ado.net datatables

I have a list of DataTables like
List<DataTable> a = new List<DataTable>();
I want to make a deep copy of this list (i.e. copying each DataTable). My code currently looks like
List<DataTable> aCopy = new List<DataTable>();
for(int i = 0; i < a.Count; i++) {
aCopy.Add(a[i].Copy());
}
The performance is absolutely terrible, and I am wondering if there is a known way to speed up such a copy?
Edit: do not worry about why I have this or need to do this, just accept that it is part of a legacy code base that I cannot change
If you have to copy a DataTable, it is essentially an O(N) operation. If the table is very large and causes a lot of allocation, you may be able to speed the operation up by doing a section at a time, but you are essentially bounded by the working set.
You can try the following: it gave me a performance boost, although your mileage might vary! I've adapted it to your example to demonstrate how to copy a DataTable using an alternative mechanism: clone the table, then stream the data in. You could easily put this in an extension method.
List<DataTable> aCopy = new List<DataTable>();
for(int i = 0; i < a.Count; i++) {
DataTable sourceTable = a[i];
DataTable copyTable = sourceTable.Clone(); // clones structure only
copyTable.Load(sourceTable.CreateDataReader());
aCopy.Add(copyTable); // don't forget to add the copy to the list
}
This was many times faster (around 6 in my use case) than the following:
DataTable copyTable = sourceTable.Clone();
foreach(DataRow dr in sourceTable.Rows)
{
copyTable.ImportRow(dr);
}
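As suggested above, the Clone-then-Load copy fits naturally into an extension method. FastCopy is an illustrative name, not a framework API:

```csharp
using System.Data;

public static class DataTableCopyExtensions
{
    public static DataTable FastCopy(this DataTable source)
    {
        DataTable copy = source.Clone();           // structure only, no rows
        using (var reader = source.CreateDataReader())
            copy.Load(reader);                     // stream the rows in
        return copy;
    }
}
```

The copy loop then reads as aCopy.Add(a[i].FastCopy());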
Also, If we look at what DataTable.Copy is doing using ILSpy:
public DataTable Copy()
{
IntPtr intPtr;
Bid.ScopeEnter(out intPtr, "<ds.DataTable.Copy|API> %d#\n", this.ObjectID);
DataTable result;
try
{
DataTable dataTable = this.Clone();
foreach (DataRow row in this.Rows)
{
this.CopyRow(dataTable, row);
}
result = dataTable;
}
finally
{
Bid.ScopeLeave(ref intPtr);
}
return result;
}
internal void CopyRow(DataTable table, DataRow row)
{
int num = -1;
int newRecord = -1;
if (row == null)
{
return;
}
if (row.oldRecord != -1)
{
num = table.recordManager.ImportRecord(row.Table, row.oldRecord);
}
if (row.newRecord != -1)
{
if (row.newRecord != row.oldRecord)
{
newRecord = table.recordManager.ImportRecord(row.Table, row.newRecord);
}
else
{
newRecord = num;
}
}
DataRow dataRow = table.AddRecords(num, newRecord);
if (row.HasErrors)
{
dataRow.RowError = row.RowError;
DataColumn[] columnsInError = row.GetColumnsInError();
for (int i = 0; i < columnsInError.Length; i++)
{
DataColumn column = dataRow.Table.Columns[columnsInError[i].ColumnName];
dataRow.SetColumnError(column, row.GetColumnError(columnsInError[i]));
}
}
}
It's not surprising that the operation will take a long time; not only is it row by row, but it also does additional validation.
You should specify the capacity of the list otherwise it will have to grow internally to accommodate the data. See here for the detailed explanation.
List<DataTable> aCopy = new List<DataTable>(a.Count);
I found the following approach much more efficient than other ways of filtering records (such as LINQ), provided your search criteria are simple:
public static DataTable FilterByEntityID(this DataTable table, int EntityID)
{
table.DefaultView.RowFilter = "EntityId = " + EntityID.ToString();
return table.DefaultView.ToTable();
}

Merging DataTables - disregarding the first row

How can I merge DataTable objects ignoring the first row?
The DataTable I need to merge with the one I've got comes from a parsed CSV file, and its first row (sometimes) still contains headers, which are obviously not supposed to end up in the resulting table...
The DataTable.Merge method does not seem to offer such an option. What's the best way to do this? Just removing the first row beforehand? But that alters the original, and what if I want it to stay as it was? Removing and reinserting after the merge? That smells like "clever coding". Is there really no better way?
Editing my previous answer:
I wrote code along similar lines and ended up with all rows of dt1 intact and dt2 containing only rows 2 and 3 from dt1.
var dt1 = new DataTable("Test");
dt1.Columns.Add("id", typeof(int));
dt1.Columns.Add("name", typeof(string));
var dt2 = new DataTable("Test");
dt2.Columns.Add("id", typeof(int));
dt2.Columns.Add("name", typeof(string));
dt1.Rows.Add(1, "Apple"); dt1.Rows.Add(2, "Oranges");
dt1.Rows.Add(3, "Grapes");
dt1.AcceptChanges();
dt1.Rows[0].Delete();
dt2.Merge(dt1);
dt2.AcceptChanges();
dt1.RejectChanges();
Let me know if you find it acceptable.
Vijay
You could go through the rows separately and merge them into the table, something like
public static class DataTableExtensions
{
public static void MergeRange(this DataTable dest, DataTable table, int startIndex, int length)
{
List<string> matchingColumns = new List<string>();
for (int i = 0; i < table.Columns.Count; i++)
{
// Only copy columns with the same name and type
string columnName = table.Columns[i].ColumnName;
if (dest.Columns.Contains(columnName))
{
if (dest.Columns[columnName].DataType == table.Columns[columnName].DataType)
{
matchingColumns.Add(columnName);
}
}
}
for (int i = 0; i < length; i++)
{
int row = i + startIndex;
DataRow destRow = dest.NewRow();
foreach (string column in matchingColumns)
{
destRow[column] = table.Rows[row][column];
}
dest.Rows.Add(destRow);
}
}
}
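A sketch of using the extension above for the original problem: merge the parsed CSV table while skipping its first row when it looks like a header. LooksLikeHeader is a hypothetical predicate you would supply, and target/parsed stand in for your two tables:

```csharp
bool skipHeader = LooksLikeHeader(parsed.Rows[0]); // hypothetical header check
int start = skipHeader ? 1 : 0;
target.MergeRange(parsed, start, parsed.Rows.Count - start);
```

Because MergeRange reads from the source table without modifying it, the parsed table stays exactly as it was.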
