I would like to repeatedly read from a text file that is generated by my program. The problem is that after parsing the file for the first time, my program reads the last line of the file before it can begin re-parsing, which causes it to accumulate unwanted data.
(Three screenshots were attached: the first showing the tournament being created and the points, the second showing the text file, and the third showing that TeamA got 3 more points.)
StreamReader rd = new StreamReader("Torneios.txt");
torneios = 0;
while (!rd.EndOfStream)
{
    string line = rd.ReadLine();
    if (line == "Tournament")
    {
        torneios++;
    }
    else
    {
        string[] arr = line.Split('-');
        equipaAA = arr[0];
        equipaBB = arr[1];
        res = Convert.ToChar(arr[2]);
    }
}
rd.Close();
That is what I'm using at the moment.
To avoid mistakes like these, I highly recommend using File.ReadAllText or File.ReadAllLines, unless you are working with large files (in which case they are not good choices). Here is an example using File.ReadAllText:
string result = File.ReadAllText("textfilename.txt");
Regarding your particular code, an example using File.ReadAllLines which achieves this is:
string[] lines = File.ReadAllLines("textfilename.txt");
for (int i = 0; i < lines.Length; i++)
{
    string line = lines[i];
    // Do whatever you want here
}
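Applied to the loop from the question, a sketch could look like this (reusing the torneios, equipaAA, equipaBB and res variables from the original code):
string[] lines = File.ReadAllLines("Torneios.txt");
torneios = 0;
foreach (string line in lines)
{
    if (line == "Tournament")
    {
        torneios++;
    }
    else
    {
        // Same parsing as before, just without keeping a StreamReader open
        string[] arr = line.Split('-');
        equipaAA = arr[0];
        equipaBB = arr[1];
        res = Convert.ToChar(arr[2]);
    }
}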
Just to make it clear, this is not a good idea if the files you intend to read from are large (such as binary files).
I created a loop that takes user names from text boxes. I would like to put the names into a text file, adding a new line each time. It's not working: it keeps overwriting the previous name. I know how to add a new line to a text file, but inside the loop statement it does not work.
Here's my code:
for (int i = 0; i < txt_user.Length; i++)
{
    File.WriteAllText(@"C:\mail\users.txt", txt_user[i].Text + Environment.NewLine);
}
Here is a sample outside the loop that writes a new line, and it works:
File.WriteAllText(@"C:\mail\users.txt", txt_user1.Text + Environment.NewLine + "abc");
You're close: there is File.AppendAllText() or File.AppendText(). You could also collect all lines in memory first and use File.AppendAllLines() (if you have enough RAM to store all lines).
WriteAllText() will write a new file or overwrite an existing one.
This will work well enough for smaller files, since the OS may apply some caching strategies. However, doing that in a loop for very large files may not be efficient. You should have a look at FileStreams then.
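For example, a minimal sketch of the loop from the question using File.AppendAllText (same txt_user array as above) could be:
for (int i = 0; i < txt_user.Length; i++)
{
    // Appends to the file (creating it if necessary) instead of overwriting it
    File.AppendAllText(@"C:\mail\users.txt", txt_user[i].Text + Environment.NewLine);
}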
If you have a lot of entries, it's better to use this code:
using (StreamWriter sw = File.CreateText(@"C:\mail\users.txt"))
{
    for (int i = 0; i < txt_user.Length; i++)
    {
        sw.WriteLine(txt_user[i].Text);
    }
}
This will open the file once, write the lines to the text file as it enumerates them, and then close the file. It doesn't open the file multiple times, nor does it try to build up a large string in the process, which likely makes it the most I/O- and memory-efficient of the answers given so far.
File.WriteAllLines(@"C:\mail\users.txt",
    Enumerable
        .Range(0, txt_user.Length)
        .Select(i => txt_user[i].Text));
Use File.AppendText if you want to add to an existing file.
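For completeness, an appending variant of the same one-liner could look like this (a sketch under the same assumptions about txt_user; File.AppendAllLines creates the file if it does not exist yet):
File.AppendAllLines(@"C:\mail\users.txt",
    Enumerable
        .Range(0, txt_user.Length)
        .Select(i => txt_user[i].Text));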
I recommend using a StringBuilder:
var builder = new StringBuilder();
for (int i = 0; i < txt_user.Length; i++)
    builder.AppendLine(txt_user[i].Text);
File.WriteAllText(@"C:\mail\users.txt", builder.ToString());
Instead of opening and closing the file all the time (because that is what File.WriteAllText does on every call), prepare the text with a StringBuilder and write it to the file in one go. Since the text comes from text boxes, I assume that the resulting string will not be very long anyway.
var sb = new StringBuilder();
for (int i = 0; i < txt_user.Length; i++)
{
    sb.AppendLine(txt_user[i].Text);
}
File.WriteAllText(@"C:\mail\users.txt", sb.ToString());
I have a master file called FileName with IDs of people. It is in sorted order.
I want to divide IDs into 27 chunks and copy each chunk into a different text file.
using (FileStream fs = File.Open(FileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    string line;
    int numOfLines = File.ReadAllLines(FileName).Length; // I have 73467
    int eachSubSet = (numOfLines / 27);
    var lines = File.ReadAllLines(dataFileName).Take(eachSubSet);
    File.WriteAllLines(FileName1, lines);
}
I have 27 different text files, so I want the 73467 IDs divided equally and copied over to the 27 files. So the 1st file will have ID#1 to ID#2721,
the 2nd file will have ID#2722 to ID#(2722+2721), and so on. I do not know how to automate this or make it run quickly.
Thanks
HR
The simplest way would be to read the lines and write them out inside a loop, deciding which file will receive each line.
I wouldn't recommend parallelizing this routine, since it's an I/O-bound operation; just copying the lines should be pretty fast.
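A rough, untested sketch of that loop idea, streaming the lines with File.ReadLines so the whole file never has to sit in memory (the output file naming is made up for the example; it needs using System.IO and using System.Linq):
const int numOfFiles = 27;
// First pass: just count the lines without keeping them in memory
int totalLines = File.ReadLines(FileName).Count();
int linesPerFile = totalLines / numOfFiles; // any remainder ends up in the last file

int fileIndex = 0;
int written = 0;
var writer = new StreamWriter(FileName + "_0.txt");
// Second pass: stream each line straight into the current output file
foreach (string line in File.ReadLines(FileName))
{
    if (written == linesPerFile && fileIndex < numOfFiles - 1)
    {
        writer.Dispose();
        fileIndex++;
        written = 0;
        writer = new StreamWriter(FileName + "_" + fileIndex + ".txt");
    }
    writer.WriteLine(line);
    written++;
}
writer.Dispose();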
Note that in your sample code you call File.ReadAllLines twice, so you actually parse your entire input file twice.
Avoiding that should speed up the process. Also, you didn't actually split the file: you only wrote the first of the 27 output files.
Untested, but something along these lines should work:
const int numOfFiles = 27;
string[] lines = File.ReadAllLines(FileName);
int numOfLines = lines.Length;
int eachSubSet = numOfLines / numOfFiles;
int firstSubset = numOfLines % numOfFiles + eachSubSet;
IEnumerable<string> linesLeftToWrite = lines;
for (int index = 0; index < numOfFiles; index++)
{
    int numToTake = index == 0 ? firstSubset : eachSubSet;
    File.WriteAllLines(string.Format("{0}_{1}.txt", FileName, index), linesLeftToWrite.Take(numToTake));
    linesLeftToWrite = linesLeftToWrite.Skip(numToTake);
}
So let me give you an example of how my file could look:
Hello
my
name
is
Now, for example, I want to go through the different lines of this file with System.IO.File.ReadAllLines(); and then check with a loop whether the current line has the word "my" in it (so the second line in this case).
As my next step, I want to add a new line right after "my", so it looks like this:
Hello
my

name
is
I have approached this with two methods now. I was hoping File.Append() would offer a method where I could append anything right after it has found the string I am looking for, but obviously it only offers methods to append strings to the end of files.
My second approach was to read in all the lines with string[] test=System.IO.File.ReadAllLines();
and then iterate through all the lines, checking each line with
for (int i = 0; i < (test.Length - 1); i++)
{
    if (test[i].Contains("my"))
    {
        test[i] = test[i] + Environment.NewLine;
    }
}
and then write it all back to the file with System.IO.File.WriteAllLines();
The problem I am facing here is that this does not really add a real new line to the file: I checked test.Length before and after, and both times I got 4 as a result.
Another option is to add the lines to a List which would give you the Insert() method:
*Only use this for relatively small files.
Something like:
string path = @"c:\some\path\file.txt";
List<String> lines = new List<string>(System.IO.File.ReadAllLines(path));
for (int i = 0; i < lines.Count; i++)
{
    if (lines[i].Contains("my"))
    {
        if (i < lines.Count - 1)
        {
            lines.Insert(i + 1, "");
        }
        else
        {
            lines.Add("");
        }
    }
}
System.IO.File.WriteAllLines(path, lines.ToArray());
First, I suggest you use a StringBuilder. It's best to use one when you're building a string from many pieces, since strings are immutable and every += creates a brand-new string.
This code will do what you're looking for, and it handles the edge case where the file does not end with a new line:
var filePath = // your file path
var test = File.ReadAllLines(filePath);
var sb = new StringBuilder();
for (int i = 0; i < (test.Length - 1); i++)
{
    sb.Append(test[i]);
    sb.Append(Environment.NewLine);
    if (test[i].Contains("my"))
    {
        // This adds that extra new line
        sb.Append(Environment.NewLine);
    }
}
sb.Append(test[test.Length - 1]);
File.WriteAllText(filePath, sb.ToString());
[TestMethod]
public void InsertLines()
{
    var test = File.ReadAllLines(@"c:\SUService.log");
    var list = new List<string>();
    for (int i = 0; i < test.Length; i++)
    {
        list.Add(test[i]);
        if (test[i].Contains("my"))
        {
            // WriteAllLines adds the line terminator itself,
            // so an empty string is enough for one blank line
            list.Add(string.Empty);
        }
    }
    File.WriteAllLines(@"c:\SUService.log", list);
}
Not an answer to the question above, but in general: if you want to start a new line after some text, you just have to add a newline character, so if you want an empty line after a line, just add two newline characters:
File.WriteAllText(filePath, "\n\n");
Here the writer will move down two lines, and the upcoming content will be added on the second line.
I'm trying to transpose a large data file that may have many rows and columns, for subsequent analysis in Excel. Currently rows might contain either 2 or 125,000 points, but I'm trying to be generic. (I need to transpose because Excel can't handle that many columns, but is fine if the large sets span many rows.)
Initially, I implemented this in Python, using the built-in zip function. I process the source file to separate long rows from short, then transpose the long rows with zip:
tempdata = zip(*csv.reader(open(tempdatafile,'r')))
csv.writer(open(outfile, 'a', newline='')).writerows(tempdata)
os.remove(tempdatafile)
This works great and takes a few seconds for a 15MB csv file, but since the program that generated the data in the first place is in C#, I thought it would be best to do it all in one program.
My initial approach in C# is a little different, since from what I've read, the zip function might not work quite the same. Here's my approach:
public partial class Form1 : Form
{
    StreamReader source;
    int Rows = 0;
    int Columns = 0;
    string filePath = "input.csv";
    string outpath = "output.csv";
    List<string[]> test_csv = new List<string[]>();

    public Form1()
    {
        InitializeComponent();
    }

    private void button_Load_Click(object sender, EventArgs e)
    {
        source = new StreamReader(filePath);
        while (!source.EndOfStream)
        {
            string[] Line = source.ReadLine().Split(',');
            test_csv.Add(Line);
            if (test_csv[Rows].Length > Columns) Columns = test_csv[Rows].Length;
            Rows++;
        }
    }

    private void button_Write_Click(object sender, EventArgs e)
    {
        StreamWriter outfile = new StreamWriter(outpath);
        for (int i = 0; i < Columns; i++)
        {
            string line = "";
            for (int j = 0; j < Rows; j++)
            {
                try
                {
                    if (j != 0) line += ",";
                    line += test_csv[j][i];
                }
                catch { }
            }
            outfile.WriteLine(line);
        }
        outfile.Close();
        MessageBox.Show("Outfile written");
    }
}
I used the List because the rows might be of variable length, and I have the load function set to give me total number of columns and rows so I can know how big the outfile has to be.
I used a try/catch when writing to deal with variable length rows. If the indices are out of range for the row, this catches the exception and just skips it (the next loop writes a comma before an exception occurs).
Loading takes very little time, but actually saving the outfile is an insanely long process. After 2 hours, I was only 1/3 of the way through the file. When I stopped the program and looked at the outfile, everything is done correctly, though.
What might be causing this program to take so long? Is it all the exception handling? I could implement a second List that stores row lengths for each row so I can avoid exceptions. Would that fix this issue?
Try using StringBuilder. Concatenation (+) of long strings is very inefficient.
Create a List<string> of lines and then make a single call to System.IO.File.WriteAllLines(filename, lines). This will reduce disk I/O.
If you don't care about the order of the points, try changing your outer for loop to System.Threading.Tasks.Parallel.For. This will run multiple threads; since they run in parallel, the order in which lines are written out won't be preserved.
Regarding your exception handling: since this is a condition you can determine ahead of time, you should not use a try/catch to handle it. Change it to this:
if (j < test_csv.Count && i < test_csv[j].Length)
{
    line += test_csv[j][i];
}
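Putting the StringBuilder, single-write and bounds-check suggestions together, the write handler might look roughly like this (a sketch reusing the test_csv, Rows, Columns and outpath members from the question; untested):
private void button_Write_Click(object sender, EventArgs e)
{
    var sb = new StringBuilder();
    for (int i = 0; i < Columns; i++)
    {
        for (int j = 0; j < Rows; j++)
        {
            if (j != 0) sb.Append(',');
            // Only index into the row if it actually has an i-th column
            if (i < test_csv[j].Length) sb.Append(test_csv[j][i]);
        }
        sb.AppendLine();
    }
    File.WriteAllText(outpath, sb.ToString());
    MessageBox.Show("Outfile written");
}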
I have a Windows Forms application, and there is a list of things I want to do.
Go to a particular line: I know how to read through the file sequentially, but is there any way to jump directly to a particular line number?
Find out the total number of lines.
If the file is not too big, you can try ReadAllLines.
This reads the whole file, into a string array, where every line is an element of the array.
Example:
var fileName = @"C:\MyFolder\MyFileName.txt";
var contents = System.IO.File.ReadAllLines(fileName);
Console.WriteLine("Line 10: " + contents[9]);
Console.WriteLine("Number of lines:");
Console.WriteLine(contents.Length);
But be aware: This reads in the whole file into memory.
If the file is too big:
Open the file (OpenText), and create a Dictionary to store the offset of every line. Scan every line, and store the offset. Now you can go to every line, and you have the number of lines.
var lineOffset = new Dictionary<int, long>();
using (var rdr = System.IO.File.OpenText(fileName)) {
    int lineNr = 0;
    lineOffset.Add(0, 0);
    while (rdr.ReadLine() != null) {
        lineNr++;
        lineOffset.Add(lineNr, rdr.BaseStream.Position);
    }

    // Goto line 10
    rdr.BaseStream.Position = lineOffset[10];
    rdr.DiscardBufferedData(); // reset the reader's internal buffer after seeking
    var line10 = rdr.ReadLine();
}
This would help for your first point: jump into file line c#