I have a DataTable with around 100k records. It has multiple columns that contain integer values. I need to add those integer values and write the results into the columns Total Count and Common Count. I am using the following code, but it takes about 10-15 seconds. How can I do this more efficiently?
Edited Part Start
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("@pudTowerList", SqlDbType.VarChar, 8000).Value = cellId;
cmd.Parameters.Add("@pudTowerCol", SqlDbType.VarChar, 8000).Value = cellIdCol;
sqlCon.Open();
SqlDataReader dr = cmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr);
sqlCon.Close();
cmd.Dispose();
Edited Part Ended
Here I edit this DataTable and then bind it to a GridView at the end:
foreach (DataRow drow in dt.Rows)
{
    int nTotalCount = 0;
    int nCommonCount = 0;
    for (int i = 2; i < nColumnCount; i++)
    {
        nTotalCount += int.Parse(drow[i].ToString());
        if (int.Parse(drow[i].ToString()) != 0)
        {
            nCommonCount += 1;
        }
    }
    drow["Total Count"] = nTotalCount;   // On commenting out these two lines it runs fast
    drow["Common Count"] = nCommonCount; // On commenting out these two lines it runs fast
}
Find a way to replace
int.Parse(drow[i].ToString());
with a direct cast:
(int)drow[i];
If you don't already have integers in the cell values, do whatever you can to get them there.
Other than that, look out for any events that might be fired as a result of the column value updates; maybe that is what's slowing you down.
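A minimal sketch of that idea (untested; BeginLoadData/EndLoadData are documented for bulk loading, but they suspend notifications, index maintenance and constraints, which is exactly the overhead you want to avoid while filling the two columns):
dt.BeginLoadData();   // suspend notifications, index maintenance and constraints
foreach (DataRow drow in dt.Rows)
{
    int nTotalCount = 0;
    int nCommonCount = 0;
    for (int i = 2; i < nColumnCount; i++)
    {
        int val = (int)drow[i];   // direct unboxing cast; works only if the columns really are typed as int
        nTotalCount += val;
        if (val != 0)
        {
            nCommonCount++;
        }
    }
    drow["Total Count"] = nTotalCount;
    drow["Common Count"] = nCommonCount;
}
dt.EndLoadData();     // re-enable notifications and re-validate constraints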
Another idea: use SQL to calculate column values!
I suggest not performing such a heavy task in C#; instead, return the calculated values from SQL Server itself (try to manipulate this heavy data in a stored procedure) and return the calculated values.
Instead of computing in C#, you can compute this in SQL as given below (SIGN returns 1 for any positive value and 0 for zero, so summing the signs counts the non-zero columns, assuming the counts are never negative):
SELECT Id,Col1,Col2,Col3,
Col1+Col2+Col3 as TotalCount,
SIGN(Col1)+SIGN(Col2)+SIGN(Col3) as CommonCount
FROM test.SO1;
If you HAVE to use C#, an immediate improvement I can suggest is to replace the two Parse() calls with one Convert() call. You will also avoid two ToString() calls. Try this:
REPLACE
nTotalCount += int.Parse(drow[i].ToString());
if (int.Parse(drow[i].ToString()) != 0)
{
    nCommonCount += 1;
}
WITH THIS
var val = Convert.ToInt32(drow[i]);
if (val != 0)
{
    nTotalCount += val;
    nCommonCount++;
}
Related
I have a DataTable that contains 100 rows, and I want to copy a range of rows (the 31st row to the 50th row) to another DataTable.
I am using the logic below.
DataTable dtNew = table.Clone();
for (int k = 30; k < 50 && k < table.Rows.Count; k++)
{
    dtNew.ImportRow(table.Rows[k]);
}
Is there any better approach to do this?
Using LINQ you can do something like:
DataTable dtNew = table.Select().Skip(30).Take(20).CopyToDataTable();
Performance-wise, using LINQ won't do any better, but it does make the code more readable.
EDIT: Added a guard so CopyToDataTable isn't called when there aren't enough rows:
int numOfRowsToTake = 20;
int numOfRowsToSkip = 30;
if (table.Rows.Count > numOfRowsToTake + numOfRowsToSkip)
{
    DataTable dtNew = table.Select().Skip(numOfRowsToSkip).Take(numOfRowsToTake).CopyToDataTable();
}
If it's about readability, then a good idea would be to throw this into an extension method.
Without changing your logic:
public static class Utils
{
    public static void CopyRows(this DataTable from, DataTable to, int min, int max)
    {
        for (int i = min; i < max && i < from.Rows.Count; i++)
            to.ImportRow(from.Rows[i]);
    }
}
Then you can always reuse it without all the fancy syntax, and you know it does exactly what you need if performance is a concern:
DataTable dt1 = new DataTable();
DataTable dt2 = new DataTable();
dt1.CopyRows(dt2, 30, 50); // copies rows 31-50 from dt1 into dt2
I've got two arrays, and I need to copy the items of one (from the second item onward) into the other array using Array.Copy, but nothing happens; it just does not add anything.
Here's the code:
DataRow[] auxRows = rFComDataSet.TestStepNames
.Select("ScenarioName = '" + scenarioName + "'");
DataRow[] newRows = new DataRow[auxRows.Count()];
auxRows.CopyTo(newRows, 0);
foreach (DataRow row in newRows)
{
    DataRow teste = this.rFComDataSet.TestStepNames.NewRow();
    Array.Copy(row.ItemArray, 1, teste.ItemArray, 0, 4);
    row["ScenarioName"] = newScenarioName;
    this.rFComDataSet.TestStepNames.Rows.Add(row.ItemArray);
}
This behavior is a consequence of how the ItemArray property is implemented.
This is the (decompiled) code of the get accessor:
public object[] ItemArray
{
    get
    {
        int num;
        object[] objArray;
        DataColumn column;
        int num2;
        num2 = this.GetDefaultRecord();
        objArray = new object[this._columns.Count];
        num = 0;
        goto Label_0037;
    Label_001C:
        column = this._columns[num];
        objArray[num] = column[num2];
        num += 1;
    Label_0037:
        if (num < ((int) objArray.Length))
        {
            goto Label_001C;
        }
        return objArray;
    }
}
As you can see, calling DataRow.ItemArray returns a new object array into which the values from the underlying row are copied.
When you use Array.Copy you are setting values in this copied array, not in the underlying values of the DataRow, so your row keeps its null values.
A possible workaround is the following (NOT TESTED):
object[] itemArray = new object[this.rFComDataSet.TestStepNames.Columns.Count];
Array.Copy(row.ItemArray, 1, itemArray, 0, 4);
this.rFComDataSet.TestStepNames.Rows.Add(itemArray);
In this way we force the underlying values of the new row created by Rows.Add to be the values of the object array we created separately.
There are a couple of things to take note of, however. Your call auxRows.CopyTo(newRows, 0); doesn't create new rows; it just copies the row references to the new array, but they point at the same data, so changing anything in newRows changes the corresponding row in auxRows.
Finally, it is not clear why you go through all this work to copy the row and then add the same row from the foreach loop back to the TestStepNames table.
Just skip the Copy and do something like:
DataRow teste = this.rFComDataSet.TestStepNames.NewRow();
teste.ItemArray = row.ItemArray;
row.ItemArray will create a new object array for you.
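Putting the pieces together, a rough (untested) sketch of the whole loop, assuming the intent is simply to duplicate each matching row under newScenarioName:
foreach (DataRow row in auxRows)
{
    DataRow teste = this.rFComDataSet.TestStepNames.NewRow();
    teste.ItemArray = row.ItemArray;          // copies the source values into the new row
    teste["ScenarioName"] = newScenarioName;  // give the copy its new scenario name
    this.rFComDataSet.TestStepNames.Rows.Add(teste);
}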
I'm coding in C# on WebPages/Razor with an MS SQL database.
I have a table with the following columns
Sat1
Sat2
Sat3
Sat4
...
Sat25
I want to loop through each of these and assign the value to satAvail.
I have the following:
for (var i = 1; i < 26; i++)
{
    satWeek = "Sat" + i;
    satAvail = item.satWeek;
}
I want the equivalent of satAvail = item.Sat1;
I've tried a few different lines but am having no joy.
Use reflection:
var value = item.GetType().GetProperty("Sat" + i).GetValue(item, null);
and if you want a sum (assuming Sat1 ... Sat25 are integers):
var sum = 0;
for (var i = 1; i < 26; i++) {
    sum += (int)item.GetType().GetProperty("Sat" + i).GetValue(item, null);
}
satAvail = sum;
Or the LINQ way:
var sum = Enumerable.Range(1, 25)
.Select(x => (int)item.GetType().GetProperty("Sat" + x).GetValue(item, null))
.Sum();
It's not clear if you're using an ORM or ADO, but assuming ADO, you could use something like:
DataTable dt = new DataTable();
foreach (DataRow row in dt.Rows)
{
    foreach (DataColumn column in dt.Columns)
    {
        var satAvail = row[column];
    }
}
I'm not sure I'm clear on your actual requirement, but in general, when working with the Database helper, if you want to access a column value resulting from a Database.Query or Database.QuerySingle call, you can either do it using dot notation or an indexer.
For example, you may get data doing this:
var db = Database.Open("MyDatabase");
var item = db.QuerySingle("SELECT * FROM Mytable WHERE ID = 1");
If you now want to access the value of a column called Sat1, you would use item.Sat1. However, if the column name is represented as a variable, you would need to use an indexer instead:
var satWeek = "Sat" + "1";
var satAvail = item[satWeek];
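Applied to the loop in the question, a small sketch (assuming the Sat1 ... Sat25 columns hold non-null integer values) might look like this:
var total = 0;
for (var i = 1; i < 26; i++)
{
    var satWeek = "Sat" + i;        // column name built at runtime
    var satAvail = item[satWeek];   // indexer access on the query result row
    total += (int)satAvail;         // assumes the column value is an int
}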
What's the fastest way, in terms of coding speed, to add rows to a DataTable? I don't need to know either the names of the columns or their data types. Is it possible to add rows without previously specifying the number or names of the DataTable columns?
DataTable t = new DataTable();
t.Rows.Add(value1,
           value2,
           value3,
           ...
           valueN
);
DataSet ds = new DataSet();
ds.Tables.Add(t);
If the input comes from a collection, you could loop over it to create the DataColumns with the correct types:
var data = new Object[] { "A", 1, 'B', 2.3 };
DataTable t = new DataTable();
// create all DataColumns
for (int i = 0; i < data.Length; i++)
{
    t.Columns.Add(new DataColumn("Column " + i, data[i].GetType()));
}
// add the row to the table
t.Rows.Add(data);
To answer your first question: no, you have to have columns defined on the table. You can't just say, "Hey, make a column for all these values." Nothing stopping you from creating columns on the fly, though, as Mr. Schmelter says.
Without knowing the columns it's not possible (you first need to add the columns); once they exist you can fill the table like this:
for (int intCount = 0; intCount < dt.Rows.Count; intCount++)
{
    for (int intSubCount = 0; intSubCount < dt.Columns.Count; intSubCount++)
    {
        dt.Rows[intCount][intSubCount] = yourValue; // or assign to something
    }
}
I have a list of DataTables like
List<DataTable> a = new List<DataTable>();
I want to make a deep copy of this list (i.e. copying each DataTable). My code currently looks like
List<DataTable> aCopy = new List<DataTable>();
for (int i = 0; i < a.Count; i++) {
    aCopy.Add(a[i].Copy());
}
The performance is absolutely terrible, and I am wondering if there is a known way to speed up such a copy?
Edit: do not worry about why I have this or need to do this, just accept that it is part of a legacy code base that I cannot change
If you have to copy a DataTable, it is essentially an O(N) operation. If the DataTable is very large and causing a large amount of allocation, you may be able to speed up the operation by copying a section at a time, but you are essentially bounded by the working set.
You can try the following - it gave me a performance boost, although your mileage might vary! I've adapted it to your example to demonstrate how to copy a datatable using an alternative mechanism - clone the table, then stream the data in. You could easily put this in an extension method.
List<DataTable> aCopy = new List<DataTable>();
for (int i = 0; i < a.Count; i++) {
    DataTable sourceTable = a[i];
    DataTable copyTable = sourceTable.Clone();        // clones the structure only
    copyTable.Load(sourceTable.CreateDataReader());   // streams the data in
    aCopy.Add(copyTable);
}
This was many times faster (around 6x in my use case) than the following:
DataTable copyTable = sourceTable.Clone();
foreach (DataRow dr in sourceTable.Rows)
{
    copyTable.ImportRow(dr);
}
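As suggested above, the clone-and-load version could easily be wrapped in an extension method. A minimal sketch (the FastCopy name is just illustrative):
public static class DataTableExtensions
{
    public static DataTable FastCopy(this DataTable source)
    {
        DataTable copy = source.Clone();          // copy the structure only
        copy.Load(source.CreateDataReader());     // stream the rows in
        return copy;
    }
}
// usage, e.g.:
// List<DataTable> aCopy = a.Select(t => t.FastCopy()).ToList();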
Also, if we look at what DataTable.Copy is doing using ILSpy:
public DataTable Copy()
{
    IntPtr intPtr;
    Bid.ScopeEnter(out intPtr, "<ds.DataTable.Copy|API> %d#\n", this.ObjectID);
    DataTable result;
    try
    {
        DataTable dataTable = this.Clone();
        foreach (DataRow row in this.Rows)
        {
            this.CopyRow(dataTable, row);
        }
        result = dataTable;
    }
    finally
    {
        Bid.ScopeLeave(ref intPtr);
    }
    return result;
}

internal void CopyRow(DataTable table, DataRow row)
{
    int num = -1;
    int newRecord = -1;
    if (row == null)
    {
        return;
    }
    if (row.oldRecord != -1)
    {
        num = table.recordManager.ImportRecord(row.Table, row.oldRecord);
    }
    if (row.newRecord != -1)
    {
        if (row.newRecord != row.oldRecord)
        {
            newRecord = table.recordManager.ImportRecord(row.Table, row.newRecord);
        }
        else
        {
            newRecord = num;
        }
    }
    DataRow dataRow = table.AddRecords(num, newRecord);
    if (row.HasErrors)
    {
        dataRow.RowError = row.RowError;
        DataColumn[] columnsInError = row.GetColumnsInError();
        for (int i = 0; i < columnsInError.Length; i++)
        {
            DataColumn column = dataRow.Table.Columns[columnsInError[i].ColumnName];
            dataRow.SetColumnError(column, row.GetColumnError(columnsInError[i]));
        }
    }
}
It's not surprising that the operation will take a long time; not only is it row by row, but it also does additional validation.
You should specify the capacity of the list, otherwise it will have to grow internally to accommodate the data. See here for the detailed explanation.
List<DataTable> aCopy = new List<DataTable>(a.Count);
I found the following approach much more efficient than other ways of filtering records, such as LINQ, provided your search criteria are simple:
public static DataTable FilterByEntityID(this DataTable table, int EntityID)
{
    table.DefaultView.RowFilter = "EntityId = " + EntityID.ToString();
    return table.DefaultView.ToTable();
}
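Usage is then a one-liner; for example (someTable here is just a placeholder for whatever table you are filtering):
// returns a new DataTable containing only the rows whose EntityId is 42
DataTable filtered = someTable.FilterByEntityID(42);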