C# doesn't get past foreach code - c#

The code below reads a text file whose first line is a path, followed by a list of file names.
The code adds each file (from the second line on) to a ListView.
For some reason the last two lines are never reached.
Any ideas?
private void loadFromFile()
{
    if ((faxInOn != null) && File.Exists(@"D:\Settings.ye"))
    {
        string[] s;
        StreamReader sr = new StreamReader(@"D:\Settings.ye", Encoding.Default);
        s = sr.ReadToEnd().Split(new string[] { "\r\n", "\n" }, StringSplitOptions.None);
        faxInOn.changePath(s[0]);
        foreach (string temp in s)
            foreach (ListViewItem lvi in listView1.Items)
                if (lvi.Text == temp.Substring(1))
                    lvi.ImageIndex = int.Parse(temp.Substring(0, 1));
        sr.Close();
        sr.Dispose();
    }
}
Thanks

The way the foreach blocks are used is really inefficient for what I think you are trying to do. Also, it would be much easier and cleaner to use a "using(...)" block so your resources are properly cleaned up and handled correctly. Please see: http://msdn.microsoft.com/en-us/library/yh598w02.aspx
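For example, a rough sketch of the same read wrapped in a using block (keeping the question's file path and format as assumptions):
string[] s;
using (var sr = new StreamReader(@"D:\Settings.ye", Encoding.Default))
{
    s = sr.ReadToEnd().Split(new string[] { "\r\n", "\n" }, StringSplitOptions.None);
}
// no explicit Close/Dispose needed; the using block disposes the reader even if an exception is thrown
faxInOn.changePath(s[0]);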

Try using File.ReadLines
For example:
// Read all lines in file, skipping the first header line
foreach (var line in File.ReadLines(@"D:\Settings.ye").Skip(1))
{
// add to list view
}
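A fuller sketch of that loop, assuming the line format from the question's code (first character is the image index, the rest is the file name; Skip needs System.Linq):
foreach (var line in File.ReadLines(@"D:\Settings.ye").Skip(1))
{
    foreach (ListViewItem lvi in listView1.Items)
    {
        // first character encodes the image index, the remainder is the file name
        if (lvi.Text == line.Substring(1))
            lvi.ImageIndex = int.Parse(line.Substring(0, 1));
    }
}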

Related

C# Search file for a string and get the string's line number

I've been at this all day and I'm struggling to do it. I've searched far and wide, and though there are similar questions, there are no answers.
What I'm trying to do:
I'm trying to search a file for multiple strings and receive the line numbers so that when users wish to change said strings inside my application, my application knows what lines to replace with their string.
What I've tried so far:
I've tried the code below; my attempt was to read the file and then get the line number, but it doesn't seem to be working at all.
var lines = File.ReadAllLines(filepath);
foreach (var match in lines // File.ReadLines(filePath)
    .Select((text, index) => new { text, lineNumber = index + 1 })
    .Where(x => x.text.Contains("motd=")))
{
    lines[match.lineNumber] = "motd=" + textBox1.Text;
    File.WriteAllLines(filepath, lines);
}
}
What I expect the above code to do is find the string "motd=" in the file, get the line number and attempt to re-write it with what the user has inputted.
However, I receive this error: "Index was outside the bounds of the array".
I think a for loop makes more sense here
var lines = File.ReadAllLines(filepath);
for (int i = 0; i < lines.Length; i++)
{
    if (lines[i].Contains("motd="))
    {
        lines[i] = "motd=" + textBox1.Text;
    }
}
File.WriteAllLines(filepath, lines);
A couple of issues with your code: you were writing out the file inside the loop, and you incremented the index by 1, which then points to the wrong line in the array and results in the exception you are getting if the last line contains the text you are searching for.
It should be noted that this could be a memory hog if you are working with a really large file. In that case I'd read the lines in one at a time and write them out to a temp file, then delete the original file and rename the temp at the end.
var tempFile = Path.GetTempFileName();
using (var file = File.OpenWrite(tempFile))
using (var writer = new StreamWriter(file))
{
    foreach (var line in File.ReadLines(filepath))
    {
        if (line.Contains("motd="))
        {
            writer.WriteLine("motd=" + textBox1.Text);
        }
        else
        {
            writer.WriteLine(line);
        }
    }
}
File.Delete(filepath);
File.Move(tempFile, filepath);
This will be much faster, use less memory, and work with very large files, but you need to write to a new file:
File.WriteAllLines(filepath2,
    File
        .ReadLines(filepath)
        .Select(x => x.Contains("motd=") ? "motd=" + TextBox1.Text : x));
It uses ReadLines, which reads each line as it goes instead of reading the entire file and breaking it into lines in one step. So it reads as quickly as your file system allows, processes each line, then writes the result.
A complete replacement would look like this:
var filepath2 = Path.GetTempFileName();
File.WriteAllLines(filepath2,
    File
        .ReadLines(filepath)
        .Select(x => x.Contains("motd=") ? "motd=" + TextBox1.Text : x));
File.Delete(filepath);
File.Move(filepath2, filepath);
To do multiple values:
var filepath2 = Path.GetTempFileName();
File.WriteAllLines(filepath2,
    File
        .ReadLines(filepath)
        .Select(x => {
            if (x.Contains("motd=")) return "motd=" + TextBox1.Text;
            if (x.Contains("author=")) return "author=" + TextBox2.Text;
            return x;
        }));
File.Delete(filepath);
File.Move(filepath2, filepath);
or move the replace logic to its own function:
string ReplaceTokens(string src)
{
    if (src.Contains("motd=")) return "motd=" + TextBox1.Text;
    if (src.Contains("author=")) return "author=" + TextBox2.Text;
    return src;
}

var filepath2 = Path.GetTempFileName();
File.WriteAllLines(filepath2,
    File
        .ReadLines(filepath)
        .Select(ReplaceTokens));
File.Delete(filepath);
File.Move(filepath2, filepath);
You're writing the file on every single iteration which makes no sense, and doing a select/where/index thing is just weird.
Since you're doing a Where you have to iterate over every single line, so why not save the trouble and just iterate over everything explicitly?
var lines = File.ReadAllLines(filepath);
for (int i = 0; i < lines.Length; i++)
{
    if (lines[i].Contains("motd="))
    {
        lines[i] = "motd=" + textBox1.Text;
    }
}
File.WriteAllLines(filepath, lines);

How to display data from text file into many columns?

I have a text file which consists of many rows and 18 columns of data separated by tabs. I used this code and it displays all the data in a single column. What I need is for the data to be displayed in columns.
public static List<string> ReadDelimitedFile(string docPath)
{
    var sepList = new List<string>();
    // Read the file and display it line by line.
    using (StreamReader file = new StreamReader(docPath))
    {
        string line;
        while ((line = file.ReadLine()) != null)
        {
            var delimiters = new char[] { '\t' };
            var segments = line.Split(delimiters, StringSplitOptions.RemoveEmptyEntries);
            foreach (var segment in segments)
            {
                //Console.WriteLine(segment);
                sepList.Add(segment);
            }
        }
        file.Close();
    }
    // Suspend the screen.
    Console.ReadLine();
    return sepList;
}
You're outputting everything in one column like this (pseudo-code, to illustrate structure):
while (reading lines)
    for (reading entries)
        WriteLine(entry)
That is, for every line in the file and for every entry in that line, you output a new line. Instead, you want to only write a new line for every line in the file, and write the entries with separators (tabs?). Something more like this:
while (reading lines)
    for (reading entries)
        Write(entry)
    WriteLine(newline)
That way all the entries for any given line in the file are on the same line in the output.
How you delimit those entries in the output is up to you, of course. And to write a carriage return could be as simple as Console.WriteLine(string.Empty), though I bet there are lots of other ways to do it.
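For instance, a minimal sketch of the corrected loop (keeping the question's docPath and tab delimiter, and writing tab-separated entries to the console; the output delimiter is just one option):
using (StreamReader file = new StreamReader(docPath))
{
    string line;
    while ((line = file.ReadLine()) != null)
    {
        var segments = line.Split(new char[] { '\t' }, StringSplitOptions.RemoveEmptyEntries);
        // write all entries of this file line on one console line, tab-separated
        Console.WriteLine(string.Join("\t", segments));
    }
}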
18 columns would seem to be served best by using a dataGridView.
// Create your dataGridView with the 18 columns using your designer.
int col = 0;
foreach (var segment in segments)
{
    //Console.WriteLine(segment);
    //sepList.Add(segment);
    dataGridView1.Rows[whateverRow].Cells[col].Value = segment;
    col++; // move to the next column for this row
}
So according to your code, you have the following loop:
while {
    <reads the lines one by one>
    for each line {
        <reading each segment and adding to the list>
    }
}
Your code reads each segment of a line and appends it to the list. Ideally you should have 18 lists for 18 columns. In Java this problem can be solved with a HashMap:
HashMap<String, ArrayList<String>> hmp = new HashMap<String, ArrayList<String>>();
while (read each line) {
    ArrayList<String> newList = new ArrayList<String>();
    for (String segment : segments) {
        newList.add(segment);
    }
    hmp.put(column1, newList);
}
return hmp;
so you would have hmp.put(column2, newList), hmp.put(column3, newList) and so on, one list per column.
Hope it helps.
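Since this is a C# question, a rough C# sketch of the same idea (one list per column, keyed by a made-up "columnN" name; the key naming is an assumption, not code from the question) might look like:
var columns = new Dictionary<string, List<string>>();
foreach (var line in File.ReadAllLines(docPath))
{
    var segments = line.Split(new char[] { '\t' }, StringSplitOptions.RemoveEmptyEntries);
    for (int c = 0; c < segments.Length; c++)
    {
        string key = "column" + (c + 1); // column1 .. column18
        if (!columns.ContainsKey(key))
            columns[key] = new List<string>();
        columns[key].Add(segments[c]);
    }
}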
You should be using DataTable or similar type for that but if you want to use List you can "emulate" rows and columns like this:
var rows = new List<List<string>>();
foreach (var line in File.ReadAllLines(docPath))
{
    var columns = line.Split(new char[] { '\t' }, StringSplitOptions.RemoveEmptyEntries).ToList();
    rows.Add(columns);
}
That will give you a row/column-like structure:
foreach (var row in rows)
{
    foreach (var column in row)
    {
        Console.Write(column + ",");
    }
    Console.WriteLine();
}

Writing List<String> contents to text file after deleting string

I'm trying to get the contents of a text file, delete a line of string, and re-write it back to the text file without the deleted line. I'm using StreamReader to get the text, importing it into a List, removing the string, then rewriting using StreamWriter. My problem arises somewhere around the removing or writing of the string. Instead of writing back the existing, non-deleted contents to the text file, all the text is replaced with:
System.Collections.Generic.List`1[System.String]
My code for this function is as follows:
{
    for (int i = deleteDevice.Count - 1; i >= 0; i--)
    {
        string split = "";
        //deleteDevice[i].Split(',').ToString();
        List<string> parts = split.Split(',').ToList();
        if (parts.Contains(deviceList.SelectedItem.ToString()))
        {
            deleteDevice.Remove(i.ToString());
        }
    }
    if (deleteDevice.Count != 0) //Error Handling
    {
        writer.WriteLine(deleteDevice);
    }
}
deviceList.Items.Remove(deviceList.SelectedItem);
}
I would just like the script to write back any string that isn't deleted (If there is any), without replacing it. Any help is appreciated, Cheers
You can read all the info from the text file into a list and then remove from the list and rewrite that to the text file.
I would change the list 'deleteDevice' to store a string array instead and use the code below to determine which item to remove.
List<int> toRemove = new List<int>();
int i = 0;
/*build a list of indexes to remove*/
foreach (string[] x in deleteDevice)
{
    if (x[0].Contains(deviceList.SelectedItem.ToString()))
    {
        toRemove.Add(i);
    }
    i++;
}
/*Remove items from list, highest index first so earlier removals don't shift later indexes*/
for (int fd = toRemove.Count - 1; fd >= 0; fd--)
    deleteDevice.RemoveAt(toRemove[fd]);
/*write to text file*/
using (StreamWriter writer = new StreamWriter("Devices.txt"))
{
    if (deleteDevice.Count != 0) //Error Handling
    {
        foreach (string[] s in deleteDevice)
        {
            StringBuilder sb = new StringBuilder();
            for (int fds = 0; fds < s.Length; fds++)
            {
                sb.Append(s[fds] + ",");
            }
            string line = sb.ToString();
            writer.WriteLine(line.Substring(0, line.Length - 1));
        }
    }
}
This isn't the best solution but should work for your needs. There's probably a much easier way of doing this.
The problem is in the following line:
writer.WriteLine(deleteDevice);
You're writing deleteDevice (I assume this is of type List). List.ToString() returns the type name of the list, because this has no specific implementation. What you want is
foreach (String s in deleteDevice)
{
    writer.WriteLine(s);
}
Problems
deleteDevice is of type List<string>, and because it also doesn't overload ToString(), the default behaviour of List<string>.ToString() is to return the name of the type.
Hence your line writer.WriteLine(deleteDevice); writes the string System.Collections.Generic.List`1[System.String].
Other than that, there are many things wrong with your code...
For example, you do this:
string split = "";
and then on the line afterwards you do this:
List<string> parts = split.Split(',').ToList();
But because split is always "", Split(',') will return a list containing only a single empty string, so the Contains check can never match.
Solution
To simplify the code, you could first write a helper method that will remove from a file all the lines that match a specified predicate:
public void RemoveUnwantedLines(string filename, Predicate<string> unwanted)
{
    var lines = File.ReadAllLines(filename);
    File.WriteAllLines(filename, lines.Where(line => !unwanted(line)));
}
Then you can write the predicate something like this (this might not be quite right; I don't really know exactly what your code is doing because it's not compilable and omits some of the types):
string filename = "My Filename";
string deviceToRemove = deviceList.SelectedItem.ToString();
Predicate<string> unwanted = line =>
    line.Split(new[] { ',' })
        .Contains(deviceToRemove);
RemoveUnwantedLines(filename, unwanted);

(C#) How to read all files in a folder to find specific lines?

I'm very new to C# so please have some extra patience. What I am looking to do is read all files in a folder, to find a specific line (which can occur more than once in the same file) and get that output to show onscreen.
If anyone could point me in the direction to which methods I need to use it would be great.
Thanks!
Start with
const string lineToFind = "blah-blah";
var fileNames = Directory.GetFiles(@"C:\path\here");
foreach (var fileName in fileNames)
{
    int line = 1;
    using (var reader = new StreamReader(fileName))
    {
        // read file line by line
        string lineRead;
        while ((lineRead = reader.ReadLine()) != null)
        {
            if (lineRead == lineToFind)
            {
                Console.WriteLine("File {0}, line: {1}", fileName, line);
            }
            line++;
        }
    }
}
As Nick pointed out below, you can make the search parallel using the Task Parallel Library; just replace the 'foreach' with Parallel.ForEach(fileNames, fileName => { ... });
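A rough sketch of that parallel variant (assuming the same lineToFind, and System.Threading.Tasks for Parallel.ForEach; console output order is not guaranteed):
Parallel.ForEach(Directory.GetFiles(@"C:\path\here"), fileName =>
{
    int line = 1;
    // File.ReadLines streams the file one line at a time
    foreach (var lineRead in File.ReadLines(fileName))
    {
        if (lineRead == lineToFind)
        {
            Console.WriteLine("File {0}, line: {1}", fileName, line);
        }
        line++;
    }
});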
Directory.GetFiles: http://msdn.microsoft.com/en-us/library/07wt70x2
StreamReader: http://msdn.microsoft.com/en-us/library/f2ke0fzy.aspx
What output do you want to get on the screen?
If you want to find the first file with the given line, you can use this short code:
var firstMatchFilePath = Directory.GetFiles(@"C:\Temp", "*.txt")
    .FirstOrDefault(fn => File.ReadLines(fn)
        .Any(l => l == lineToFind));
if (firstMatchFilePath != null)
    MessageBox.Show(firstMatchFilePath);
I've used Directory.GetFiles with a search pattern to find all text files in a directory. I've used the LINQ extension methods FirstOrDefault and Any to find the first file with a given line.

Searching for line of one text file in another text file, faster

Is there a faster way to search each line of one text file for occurrence in another text file, than by going line by line in both files?
I have two text files - one has ~2500 lines (let's call it TxtA), the other has ~86000 lines(TxtB). I want to search TxtB for each line in TxtA, and return the line in TxtB for each match found.
I currently have this setup as: for each line in TxtA, search TxtB line by line for a match. However this is taking a really long time to process. It seems like it would take 1-3 hours to find all the matches.
Here is my code...
private static void getGUIDAndType()
{
    try
    {
        Console.WriteLine("Begin.");
        System.Threading.Thread.Sleep(4000);
        String dbFilePath = @"C:\WindowsApps\CRM\crm_interface\data\";
        StreamReader dbsr = new StreamReader(dbFilePath + "newdbcontents.txt");
        List<string> dblines = new List<string>();
        String newDataPath = @"C:\WindowsApps\CRM\crm_interface\data\";
        StreamReader nsr = new StreamReader(newDataPath + "HolidayList1.txt");
        List<string> new1 = new List<string>();
        string dbline;
        string newline;
        List<string> results = new List<string>();
        while ((newline = nsr.ReadLine()) != null)
        {
            //Reset
            dbsr.BaseStream.Position = 0;
            dbsr.DiscardBufferedData();
            while ((dbline = dbsr.ReadLine()) != null)
            {
                newline = newline.Trim();
                if (dbline.IndexOf(newline) != -1)
                {
                    //if found... get all info for now
                    Console.WriteLine("FOUND: " + newline);
                    System.Threading.Thread.Sleep(1000);
                    new1.Add(newline);
                    break;
                }
                else
                {
                    //the first line of db does not contain this line...
                    //go to next dbline.
                    Console.WriteLine("Lines do not match - continuing");
                    continue;
                }
            }
            Console.WriteLine("Going to next new Line");
            System.Threading.Thread.Sleep(1000);
            //continue;
        }
        nsr.Close();
        Console.WriteLine("Writing to dbc3.txt");
        System.IO.File.WriteAllLines(@"C:\WindowsApps\CRM\crm_interface\data\dbc3.txt", results.ToArray());
        Console.WriteLine("Finished. Press ENTER to continue.");
        Console.WriteLine("End.");
        Console.ReadLine();
    }
    catch (Exception ex)
    {
        Console.WriteLine("Error: " + ex);
        Console.ReadLine();
    }
}
Please let me know if there is a faster way. Preferably something that would take 5-10 minutes... I've heard of indexing but didn't find much on this for txt files. I've tested regex and it's no faster than indexof. Contains won't work because the lines will never be exactly the same.
Thanks.
There might be a faster way, but this LINQ approach should be faster than 3 hours and is a sight better to read and maintain:
var f1Lines = File.ReadAllLines(f1Path);
var f2LineInf1 = File.ReadLines(f2Path)
    .Where(line => f1Lines.Contains(line))
    .ToList();
Edit: tested and required less than 1 second for 400000 lines in file2 and 17000 lines in file1. I can use File.ReadLines for the big file, which does not load it all into memory at once. For the smaller file I need to use File.ReadAllLines, since Contains needs the complete list of lines of file1.
If you want to log the result in a third file:
File.WriteAllLines(logPath, f2LineInf1);
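If lookups ever become the bottleneck, a common variation (a sketch, not part of the original answer) is to load the smaller file into a HashSet<string> so each membership test is a hash lookup rather than a linear scan; like the LINQ version above, this only finds lines that match exactly:
// f1Path = the ~2500-line file, f2Path = the ~86000-line file, logPath = output, as above
var smallFileLines = new HashSet<string>(File.ReadLines(f1Path));
var matches = File.ReadLines(f2Path)
    .Where(line => smallFileLines.Contains(line))
    .ToList();
File.WriteAllLines(logPath, matches);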
EDIT: Note that I'm assuming it's reasonable to read at least one file into memory. You may want to swap the queries below around to avoid loading the "big" file into memory, but even 86,000 lines at (say) 1K per line is going to be less than 2G of memory - which is relatively little to do something significant.
You're reading the "inner" file each time. There's no need for that. Load both files into memory and go from there. Heck, for exact matches you can do the whole thing in LINQ easily:
var query = from line1 in File.ReadLines(newDataPath + "HolidayList1.txt")
            join line2 in File.ReadLines(dbFilePath + "newdbcontents.txt")
            on line1 equals line2
            select line1;
var commonLines = query.ToList();
But for non-joins it's still simple; just read one file completely first (explicitly) and then stream the other:
// Eagerly read the "inner" file
var lines2 = File.ReadAllLines(dbFilePath + "newdbcontents.txt");
var query = from line1 in File.ReadLines(newDataPath + "HolidayList1.txt")
            from line2 in lines2
            where line2.Contains(line1)
            select line1;
var commonLines = query.ToList();
There's nothing clever here - it's just a really simple way of writing code to read all the lines in one file, then iterate over the lines in the other file and for each line check against all the lines in the first file. But even without anything clever, I strongly suspect it would perform well enough for you. Concentrate on simplicity, eliminate unnecessary IO, and see whether that's good enough before trying to do anything fancier.
Note that in your original code, you should be using using statements for your StreamReader variables, to ensure they get disposed properly. Using the above code makes it simple to not even need that though...
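For reference, a minimal sketch of what that would look like around the question's two readers (names taken from the question's code):
using (var dbsr = new StreamReader(dbFilePath + "newdbcontents.txt"))
using (var nsr = new StreamReader(newDataPath + "HolidayList1.txt"))
{
    // ... read from both files here; both readers are disposed when the block exits
}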
Quick and dirty because I've got to go... If you can do it in memory, try working with this snippet:
//string[] searchIn = File.ReadAllLines("File1.txt");
//string[] searchFor = File.ReadAllLines("File2.txt");
string[] searchIn = new string[] { "A", "AB", "ABC", "ABCD", null, "", " " };
string[] searchFor = new string[] { "A", "BC", "BCD", null, "", " " };
var matchDictionary = new Dictionary<string, string[]>();
foreach (string item in searchFor)
{
    string[] matchingItems = Array.FindAll(searchIn, x => (x == item) || (!string.IsNullOrEmpty(x) && !string.IsNullOrEmpty(item) ? (x.Contains(item) || item.Contains(x)) : false));
    if (item != null)
        matchDictionary[item] = matchingItems; // keep the matches found for this search term
}
