I need to eliminate all rows that contain either the string notUsed or the string notUsed2, where a particular identity value is 2.
I'm using a foreach loop to accomplish this, before appending any of my data to a StringBuilder.
The approach I tried looks like this:
foreach (DataRow dr in ds.Tables[0].Rows)
{
int auth = (int)dr[0];
if (auth == 2) continue;
string notUsed = "NO LONGER USED";
string notUsed2 = "NO LONGER IN USE";
if (dr.Cells[3].ToString().Contains(string)notUsed)
{
dr.Delete();
}
else
{
if (dr.Cells[3].ToString().Contains(string)notUsed2)
{
dr.Delete();
}
}
}
However, the above is... utterly wrong. It seems logical to me, but I don't quite understand how to form that method in a way that C# understands.
You should:
Remove Cells, since it is not a property of DataRow; use dr[3] instead.
Remove the (string) cast from Contains, so the check becomes: if (dr[3].ToString().Contains(notUsed))
Also, you are modifying the collection inside a foreach loop by deleting rows; use a for loop that runs backwards instead.
Like:
for (int i = ds.Tables[0].Rows.Count - 1; i >= 0; i--)
{
DataRow dr = ds.Tables[0].Rows[i];
int auth = (int)dr[0];
if (auth == 2) continue;
string notUsed = "NO LONGER USED";
string notUsed2 = "NO LONGER IN USE";
if (dr[3].ToString().Contains(notUsed))
{
dr.Delete();
}
else
{
if (dr[3].ToString().Contains(notUsed2))
{
dr.Delete();
}
}
}
You can also use LINQ to DataSet and get a new DataTable, like this:
DataTable newDataTable = ds.Tables[0].AsEnumerable()
    .Where(r => !r.Field<string>(3).Contains(notUsed) &&
                !r.Field<string>(3).Contains(notUsed2))
    .CopyToDataTable();
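If you also need to keep the rows that the loop above deliberately skips (those where auth == 2), and to guard against CopyToDataTable throwing when nothing matches, a sketch along these lines could work. It assumes column 0 is typed int and column 3 is the text column, and it needs System.Linq plus a reference to System.Data.DataSetExtensions:
string notUsed = "NO LONGER USED";
string notUsed2 = "NO LONGER IN USE";

// Keep auth == 2 rows unconditionally; otherwise keep only rows whose text
// column contains neither marker string.
var keptRows = ds.Tables[0].AsEnumerable()
    .Where(r => r.Field<int>(0) == 2 ||
                (!(r.Field<string>(3) ?? "").Contains(notUsed) &&
                 !(r.Field<string>(3) ?? "").Contains(notUsed2)));

// CopyToDataTable throws InvalidOperationException on an empty sequence,
// so fall back to an empty table with the same schema.
DataTable newDataTable = keptRows.Any()
    ? keptRows.CopyToDataTable()
    : ds.Tables[0].Clone();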
For my application there are a few separate DataTables, and I need to create a new DataTable based on matching IDs. I have to do this a few times, so to avoid duplicating code I created a function, like so:
private static DataTable CloneTable(DataTable originalTable, DataTable newTable, DataTable targetTable,
string addedColumn, string columnToExtract, bool multipleConditions = false, string secondColumnName = null, string secondColumnCondition = null)
{
newTable = originalTable.Clone();
newTable.Columns.Add(addedColumn);
foreach (DataRow row in originalTable.Rows)
{
DataRow[] rowsTarget;
if (multipleConditions == false)
{
rowsTarget = targetTable.Select(string.Format("ItemId='{0}'", Convert.ToString(row["ItemId"])));
} else
{
rowsTarget = targetTable.Select(string.Format("ItemId='{0}' AND {1} ='{2}'", Convert.ToString(row["ItemId"]), secondColumnName, secondColumnConditon));
}
if (rowsTarget != null && rowsTarget.Length > 0)
{
string data = rowsTarget[0][columnToExtract].ToString();
var lst = row.ItemArray.ToList();
lst.Add(data);
newTable.Rows.Add(lst.ToArray());
}
else
{
string data = "";
var lst = row.ItemArray.ToList();
lst.Add(data);
newTable.Rows.Add(lst.ToArray());
}
}
return newTable;
}
I then call this in a separate function like so:
private DataTable GetExtractData()
{
.........................
DataTable includeLastModified = new DataTable();
DataTable includeFunction = new DataTable();
DataTable includeDiscipline = new DataTable();
CloneTable(itemsTable, includeLastModified, lastModifiedTable, "LastModifiedDate", "LastModifiedDate");
CloneTable(includeLastModified, includeFunction, customPropertiesTable, "Function", "ItemTitle", true, "Title", "Function");
CloneTable(includeFunction, includeDiscipline, customPropertiesTable, "Discipline", "ItemTable", true, "Title", "Discipline");
return includeDiscipline;
}
The issue I am having is that the DataTable returned here is empty, and I am not sure why.
In my CloneTable function I did the following to make sure that the new table is not empty:
foreach (DataRow row in newTable.Rows)
{
foreach (var item in row.ItemArray)
{
Console.WriteLine(item);
}
}
It is not empty there, so I am not sure why it comes back empty when I return it from the separate function.
I run the same check for the includeDiscipline table in GetExtractData, but it comes back empty.
There are no errors, but there is a warning that comes and goes saying that the parameter "newTable" can be removed because its initial value isn't used. I'm not sure how that can be the case, though, as it is clearly being used.
I'm assuming it is probably the way I am calling the function, but I'm really not sure what I have done wrong here.
Okay, face-palm moment: I just realised I forgot to assign the return value to anything.
So if I do something like:
var test = CloneTable(itemsTable, includeLastModified, lastModifiedTable, "LastModifiedDate", "LastModifiedDate");
return test;
It works fine and no longer returns empty.
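For the full pipeline in GetExtractData, that means assigning each return value and feeding it into the next call, roughly like this (a sketch based on the calls above; the newTable argument can even be null, because CloneTable immediately overwrites it, which is exactly what the "initial value isn't used" warning was pointing at):
// Each call returns the newly built table; assign it and pass it on.
DataTable includeLastModified = CloneTable(itemsTable, null, lastModifiedTable, "LastModifiedDate", "LastModifiedDate");
DataTable includeFunction = CloneTable(includeLastModified, null, customPropertiesTable, "Function", "ItemTitle", true, "Title", "Function");
DataTable includeDiscipline = CloneTable(includeFunction, null, customPropertiesTable, "Discipline", "ItemTable", true, "Title", "Discipline");
return includeDiscipline;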
I have the following code
private void bgwSendMail_DoWork(object sender, DoWorkEventArgs e)
{
DataSet ds = getMailToSend();
DataTable table = ds.Tables[0];
foreach (DataRow row in table.Rows)
{
string attachment1 = ds.Tables[0].Rows[0]["Attachment1"].ToString();
string attachment2 = ds.Tables[0].Rows[0]["Attachment2"].ToString();
string attachment3 = ds.Tables[0].Rows[0]["Attachment3"].ToString();
string attachment4 = ds.Tables[0].Rows[0]["Attachment4"].ToString();
string mailTo = ds.Tables[0].Rows[0]["EmailTo"].ToString();
string mailSubject = ds.Tables[0].Rows[0]["EmailSubject"].ToString();
string mailBody = ds.Tables[0].Rows[0]["EmailBody"].ToString();
string uid = ds.Tables[0].Rows[0]["uid"].ToString();
if (String.IsNullOrEmpty(attachment1))
{
//TODO Send Email Straight away ignore rest
}
else
{
if (!String.IsNullOrEmpty(attachment1))
{
bool attachment1Exists = checkFileExists(attachment1);
if (attachment1Exists == false)
{
continue;
}
}
Now I would expect that when we hit continue (which does get hit), we go back up to the foreach and move on to the next record in the dataset.
This does not happen; it iterates over the same record the continue came from, over and over. Is this normal?
If it is normal, what is the best way to get the foreach to ignore that row in the DataTable once it has been skipped?
The continue is working as expected.
You are enumerating all rows in the table, but you aren't using the loop variable. Instead you are always accessing the first row in the table:
DataTable table = ds.Tables[0];
foreach(DataRow row in table.Rows)
{
string attachment1 = ds.Tables[0].Rows[0]["Attachment1"].ToString();
// ...
}
You are always accessing ds.Tables[0].Rows[0].
Instead you should use this code:
foreach(DataRow row in table.Rows)
{
string attachment1 = row["Attachment1"].ToString();
// ...
}
So you are actually enumerating all rows in the table as expected; it is not an infinite loop, but you are only ever reading the first row instead of the current one.
Change
string attachment1 = ds.Tables[0].Rows[0]["Attachment1"].ToString();
to
string attachment1 = row["Attachment1"].ToString();
and do the same for all the other references to the current row's columns.
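Put together, the loop body reads every value from the current row, roughly like this (a sketch; the attachment handling is abbreviated and checkFileExists/getMailToSend are the asker's own helpers):
private void bgwSendMail_DoWork(object sender, DoWorkEventArgs e)
{
    DataSet ds = getMailToSend();
    DataTable table = ds.Tables[0];

    foreach (DataRow row in table.Rows)
    {
        // Read each field from the current row, not from Rows[0].
        string attachment1 = row["Attachment1"].ToString();
        string mailTo = row["EmailTo"].ToString();
        string mailSubject = row["EmailSubject"].ToString();
        string mailBody = row["EmailBody"].ToString();

        // Skip this record if the attachment is specified but missing on disk.
        if (!string.IsNullOrEmpty(attachment1) && !checkFileExists(attachment1))
        {
            continue; // moves on to the next row in table.Rows
        }

        // ... build and send the mail for this row
    }
}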
I'm surprised that I haven't seen anything about this on here (or maybe I missed it). When parsing a CSV file, if there are rows with no data, how can/should that be handled? I'm not talking about blank rows, but empty rows, for example:
ID,Name,Quantity,Price
1,Stuff,2,5
2,Things,1,2.5
,,,
,,,
,,,
I am using TextFieldParser to handle commas in data, multiple delimiters, etc. The two solutions I've thought of are either to use ReadLine instead of ReadFields (but I assume that would remove the benefit of using TextFieldParser, because I'd then have to handle commas a different way), or to iterate through the fields and drop the row if all of them are empty. Here's what I have:
dttExcelTable = new DataTable();
using (TextFieldParser parser = new TextFieldParser(fileName))
{
parser.Delimiters = new string[] { ",", "|" };
string[] fields = parser.ReadFields();
if (fields == null)
{
return null;
}
foreach (string columnHeader in fields)
{
dttExcelTable.Columns.Add(columnHeader);
}
while (true)
{
DataRow importedRow = dttExcelTable.NewRow();
fields = parser.ReadFields();
if (fields == null)
{
break;
}
for (int i = 0; i < fields.Length; i++)
{
importedRow[i] = fields[i];
}
foreach (var field in importedRow.ItemArray)
{
if (!string.IsNullOrEmpty(field.ToString()))
{
dttExcelTable.Rows.Add(importedRow);
break;
}
}
}
}
Without using a third-party CSV reader you could change your code in this way:
.....
DataRow importedRow = dttExcelTable.NewRow();
for (int i = 0; i < fields.Length; i++)
importedRow[i] = fields[i];
if(!importedRow.ItemArray.All (ia => string.IsNullOrWhiteSpace(ia.ToString())))
dttExcelTable.Rows.Add(importedRow);
Using the All IEnumerable extension you can check every element of the ItemArray with string.IsNullOrWhiteSpace. If All returns true, every field is empty and you can skip the Add.
You can also just strip the commas out of the line and test whether anything is left:
strTemp = s.Replace(",", "");
if (!String.IsNullOrEmpty(strTemp)) { /*code here */}
http://ideone.com/8wKOVD
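In context, that check would sit on the raw line before it is parsed. A sketch, assuming the lines are read straight from the file first ("data.csv" and the surrounding loop are illustrative); note that a purely line-level check can misbehave when quoted fields themselves contain commas or line breaks:
// Drop lines that are nothing but delimiters, then parse what is left.
var nonEmptyLines = File.ReadLines("data.csv")
    .Where(s => !string.IsNullOrEmpty(s.Replace(",", "")));

using (var parser = new TextFieldParser(new StringReader(string.Join(Environment.NewLine, nonEmptyLines))))
{
    parser.SetDelimiters(",");
    while (!parser.EndOfData)
    {
        string[] fields = parser.ReadFields();
        // ... add fields to the DataTable as before
    }
}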
It doesn't seem like there's really a better solution than the one I provided: loop through all of the fields and check whether they are all empty before adding the row to my DataTable.
The only other solution I've found is Steve's answer, which is to not use TextFieldParser.
I know this is literally years later, but I recently had this issue and found a workaround similar to the previous responses. The whole fleshed-out function is below:
public static DataTable CSVToDataTable(IFormFile file)
{
DataTable dt = new DataTable();
using (StreamReader sr = new StreamReader(file.OpenReadStream()))
{
string[] headers = sr.ReadLine().Split(',');
foreach (string header in headers)
{
dt.Columns.Add(header);
}
var txt = sr.ReadToEnd();
var stringReader = new StringReader(txt);
TextFieldParser parser = new TextFieldParser(stringReader);
parser.HasFieldsEnclosedInQuotes = true;
parser.SetDelimiters(",");
while (!parser.EndOfData)
{
string[] rows = parser.ReadFields();
string tmpStr = string.Join("", rows);
if (!string.IsNullOrWhiteSpace(tmpStr))
{
DataRow dr = dt.NewRow();
for (int i = 0; i < headers.Length; i++)
{
dr[i] = rows[i];
}
dt.Rows.Add(dr);
}
}
}
return dt;
}
It works for me and has proven fairly reliable. The key part is in the while loop after calling ReadFields(): I join the returned fields into one string and then check whether it is null or whitespace. Hopefully this can help someone who stumbles upon this.
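For reference, here is one hypothetical way to call it from an ASP.NET Core upload action; the route, controller wiring, and response shape are made up for illustration:
// Inside a controller; CSVToDataTable is assumed to be in scope
// (for example on a static helper class).
[HttpPost("upload")]
public IActionResult Upload(IFormFile file)
{
    DataTable table = CSVToDataTable(file);
    return Ok(new { rows = table.Rows.Count, columns = table.Columns.Count });
}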
This is a take on some experimental code that @Tim Schmelter pointed me toward earlier this afternoon. The majority of it is almost exactly the same as what worked earlier, but it is throwing an InvalidCastException on the last line or the second-to-last line, depending on which one I try. I cannot see why this is.
Boolean test = false;
string filePathStudent = System.IO.Path.GetFullPath("StudentInfo.txt");
DataTable studentDataTable = new DataTable();
studentDataTable.Columns.Add("Id", typeof(int));
studentDataTable.Columns.Add("StudentID");
studentDataTable.Columns.Add("FirstName");
studentDataTable.Columns.Add("LastName");
studentDataTable.Columns.Add("StreetAdd");
studentDataTable.Columns.Add("City");
studentDataTable.Columns.Add("State");
studentDataTable.Columns.Add("Zip");
studentDataTable.Columns.Add("Choice1");
studentDataTable.Columns.Add("CreditHrs1");
studentDataTable.Columns.Add("Choice2");
studentDataTable.Columns.Add("CreditHrs2");
studentDataTable.Columns.Add("Choice3");
studentDataTable.Columns.Add("CreditHrs3");
studentDataTable.Columns.Add("Choice4");
studentDataTable.Columns.Add("CreditHrs4");
studentDataTable.Columns.Add("Choice5");
studentDataTable.Columns.Add("CreditHrs5");
studentDataTable.Columns.Add("Choice6");
studentDataTable.Columns.Add("CreditHrs6");
foreach (string line in File.ReadLines(filePathStudent))
{
DataRow row = studentDataTable.Rows.Add();
string[] fields = line.Split(new[] { (char)9 });
int id;
if (fields.Length == 19 && int.TryParse(fields[0], out id))
{
row.SetField("Id", id);
row.SetField("StudentID", fields[1]);
row.SetField("FirstName", fields[2]);
row.SetField("LastName", fields[3]);
row.SetField("StreetAdd", fields[4]);
row.SetField("City", fields[5]);
row.SetField("State", fields[6]);
row.SetField("Zip", fields[7]);
row.SetField("Choice1", fields[8]);
row.SetField("CreditHrs1", fields[9]);
row.SetField("Choice2", fields[10]);
row.SetField("CreditHrs2", fields[11]);
row.SetField("Choice3", fields[12]);
row.SetField("CreditHrs3", fields[13]);
row.SetField("Choice4", fields[14]);
row.SetField("CreditHrs4", fields[15]);
row.SetField("Choice5", fields[16]);
row.SetField("CreditHrs5", fields[17]);
row.SetField("Choice6", fields[18]);
row.SetField("CreditHrs6", fields[19]);
}
}
using (StreamReader reader = new StreamReader(filePathStudent))
{
String line1 = reader.ReadLine();
if (line1 == null)
maxIDStdTable = 0;
else
test = true;
reader.Dispose();
reader.Close();
}
if(test)
int maxIDStdTable = studentDataTable.AsEnumerable().Max(r => r.Field<int>("Id"));
//int maxIDStdTable = (int)studentDataTable.Compute("Max(Id)", "");
You have made two mistakes:
1) You have to create the new DataRow with DataTable.NewRow().
2) After setting the DataRow's values, you have to add it to the DataTable with DataTable.Rows.Add(yourDataRow).
Update your code like this and try again:
foreach (string line in File.ReadLines(filePathStudent))
{
DataRow row = studentDataTable.NewRow();
string[] fields = line.Split(new[] { (char)9 });
int id;
if (fields.Length == 20 && int.TryParse(fields[0], out id)) // 20 tab-separated fields: indexes 0 to 19
{
row["Id"]= id;
row["StudentID"]= fields[1];
row["FirstName"]= fields[2];
row[LastName"]= fields[3];
row["StreetAdd"]= fields[4];
row["City"]=fields[5];
row["State"]= fields[6];
row["Zip"]=fields[7];
row["Choice1"]= fields[8];
row["CreditHrs1"]= fields[9];
row["Choice2"]= fields[10];
row["CreditHrs2"]= fields[11];
row[("Choice3"]= fields[12];
row["CreditHrs3"]=, fields[13];
row["Choice4"]= fields[14];
row["CreditHrs4"]= fields[15];
row["Choice5"]= fields[16];
row["CreditHrs5"]= fields[17];
row["Choice6"]= fields[18];
row["CreditHrs6"] =fields[19];
// Add the row only when its values were actually set; otherwise "Id" stays
// DBNull and the later Max over Field<int>("Id") throws an InvalidCastException.
studentDataTable.Rows.Add(row);
}
}
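With only complete rows added, the Max call no longer hits DBNull. If the table can still legitimately end up empty, a guarded version of that last line might look like this (a sketch using the same column name):
// Avoid calling Max on an empty table; fall back to 0 as the question does.
int maxIDStdTable = studentDataTable.Rows.Count > 0
    ? studentDataTable.AsEnumerable().Max(r => r.Field<int>("Id"))
    : 0;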
It might not be the best solution, but it works.
string filePathStudent = System.IO.Path.GetFullPath("StudentInfo.txt");
DataTable studentDataTable = new DataTable();
studentDataTable.Columns.Add("Id", typeof(Int32));
studentDataTable.Columns.Add("StudentID");
studentDataTable.Columns.Add("FirstName");
studentDataTable.Columns.Add("LastName");
studentDataTable.Columns.Add("StreetAdd");
studentDataTable.Columns.Add("City");
studentDataTable.Columns.Add("State");
studentDataTable.Columns.Add("Zip");
studentDataTable.Columns.Add("Choice1");
studentDataTable.Columns.Add("CreditHrs1");
studentDataTable.Columns.Add("Choice2");
studentDataTable.Columns.Add("CreditHrs2");
studentDataTable.Columns.Add("Choice3");
studentDataTable.Columns.Add("CreditHrs3");
studentDataTable.Columns.Add("Choice4");
studentDataTable.Columns.Add("CreditHrs4");
studentDataTable.Columns.Add("Choice5");
studentDataTable.Columns.Add("CreditHrs5");
studentDataTable.Columns.Add("Choice6");
studentDataTable.Columns.Add("CreditHrs6");
// Read in a file line-by-line, and store it
var txtFileLine = File.ReadAllLines(filePathStudent).ToList();
//Reads each line and splits the data into columns at tabs (ASCII value 9)
txtFileLine.ForEach(line => studentDataTable.Rows.Add(line.Split((char)9)));
List<int> rowsForColumn1 = studentDataTable.AsEnumerable().Select(x => x.Field<int>(0)).ToList();
//Tests for empty Datatable
foreach (DataRow row in studentDataTable.Rows)
{
if (row.IsNull("Id"))
break;
else
//get the max value from the "Id" column.
maxIDStdTable = rowsForColumn1.Max();
}
I am importing data from a CSV file. Sometimes there are column headers and sometimes not; the customer chooses custom columns (from multiple drop-downs).
My problem is that I can change the column types and names, but when I import the data rows into the cloned table it just adds rows with no data in them. If I rename a column back to its old value it works: say column 0's name is "0"; if I change that to something else, which I need to do, it won't fill the rows below with data, but if I change it back to "0" it will. Any ideas?
Here is my code:
#region Manipulate headers
DataTable tblCloned = new DataTable();
tblCloned = tblDataTable.Clone();
int i = 0;
foreach (string item in lstRecord)
{
if (item != "Date")
{
var m = tblDataTable.Columns[i].DataType;
tblCloned.Columns[i].DataType = typeof(System.String);
tblCloned.Columns[i].ColumnName = item;
}
else if(item == "Date")
{
//get the proper date format
//FillDateFormatToColumn(tblCloned);
tblCloned.Columns[i].DataType = typeof(DateTime);
tblCloned.Columns[i].ColumnName = item;
}
i++;
}
tblCloned.AcceptChanges();
foreach (DataRow row in tblDataTable.Rows)
{
tblCloned.ImportRow(row);
}
tblCloned.AcceptChanges();
#endregion
In the second foreach loop, when ImportRow is called to copy the data into the cloned table, it adds empty rows.
After a couple of tries I came up with this solution, which is working:
foreach (DataRow row in tblDataTable.Rows)
{
int x = 0;
DataRow dr = tblCloned.NewRow();
foreach (DataColumn dt in tblCloned.Columns)
{
dr[x] = row[x];
x++;
}
tblCloned.Rows.Add(dr);
//tblCloned.ImportRow(row);
}
But I will accept Scottie's answer because it is less code, after all.
Instead of
foreach (DataRow row in tblDataTable.Rows)
{
tblCloned.ImportRow(row);
}
try
foreach (DataRow row in tblDataTable.Rows)
{
tblCloned.LoadDataRow(row.ItemArray, true);
}
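LoadDataRow applies the values from ItemArray to the clone's columns by ordinal position, which is why the renamed (and retyped) columns no longer get in the way, and passing true as the second argument adds each row with AcceptChanges already applied. A sketch of the whole import step under that assumption:
foreach (DataRow row in tblDataTable.Rows)
{
    // Values are matched to columns by position, and the DataTable converts
    // the strings to the new column types (e.g. the DateTime "Date" column)
    // as long as they are parseable.
    tblCloned.LoadDataRow(row.ItemArray, true);
}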