I have a list of file names that I will be creating and writing to. I have a foreach loop going through them all, something like this:
void WriteFiles(byte[] data)
{
foreach (string fileName in fileNames)//fileNames is my List<string>
{
FileStream fs = File.Create(fileName);
fs.Write(data, 0, data.Length);
fs.Close();
}
}
Let's say my list of files is 1.txt, 2.txt, and 3.txt
The strange thing is, 1.txt, 2.txt, and 3.txt are all created, but the data is just written 3 times to 1.txt, while 2.txt and 3.txt are empty. I have double-checked in the debugger, and fileName is different each time through the loop. I have written many programs that read from and write to files, but I have never encountered behavior like this. I'm very confused.
EDIT
Perhaps this will make more sense. This is actual code that I have run and produced the problem with, copied and pasted straight from Visual Studio.
using System;
using System.Collections.Generic;
using System.IO;
namespace ConsoleApplication1
{
class Program
{
static List<string> fileNames = new List<string>();
static void Main()
{
Directory.CreateDirectory("C:\\textfiles");
fileNames.AddRange(new string[] { "1.txt", "2.txt", "3.txt" });
WriteFiles();
}
static void WriteFiles()
{
foreach (string fileName in fileNames)
{
using (StreamWriter sw = new StreamWriter(File.Create("c:\\textfiles\\" + fileName)))
{
sw.Write("This is my text");
}
}
}
}
}
After executing this, I now have 3 text files (1.txt, 2.txt, 3.txt) in the folder C:\textfiles, none of which existed before.
When I open the files in notepad here's what I've got
1.txt - "This is my textThis is my textThis is my text"
2.txt - nothing
3.txt - nothing
WTF??? This doesn't make sense.
Try using a "using":
using (FileStream fs = File.Create( filename ))
{
fs.Write( data, 0, data.Length );
}
Code looks OK (a using block would be better than calling .Close(), though).
Most likely your data is either:
empty altogether (data.Length == 0), or
something that doesn't represent displayable text (if you write something like [0, 0, 0] to a text file, Notepad will show nothing).
A quick check for the first case is sketched just below.
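A hedged sketch of that check, placed at the top of WriteFiles:
if (data == null || data.Length == 0)
{
    throw new ArgumentException("No data to write"); // rules out the 'empty data' case
}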
Your code works perfectly in my test environment, so I have no idea what's going on for you. Are the actual files you're writing to in the same directory? What I'm getting at is whether some security issue in your environment could be interfering with writing to files 2 and 3.
I am very new to C# and .NET and am trying to understand it.
I am using the solution from "how to read all files inside particular folder" and trying to apply it in my code below.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
namespace HowToCopyTextFiles
{
class Program
{
static void Main(string[] args)
{
StringBuilder sb = new StringBuilder();
foreach (string txtName in Directory.GetFiles(@"C:\Users\Environment\Desktop\newfolder", "*.rtf"))
{
using (StreamReader sr = new StreamReader(txtName))
{
sb.Append(sr.ReadToEnd());
sb.AppendLine();
}
}
Console.Write(sb.ToString());
Console.ReadLine();
}
}
}
The result is OK, but at the end of my output it shows my environment's full name, like this:
this is content of first file
this is content of second file
My environment full name My environment full name My environment full name (yes, 3 times)
I am using cs-script; is it due to that?
With .txt files it works fine, so the question is: how do I properly open .rtf files as a text stream?
If an .rtf file is open, its editor sometimes saves a super-hidden temp file named ~filename.rtf (not visible even with the "show hidden files" option enabled), which is also read by the C# code.
I used code from here: C# - Get a list of files excluding those that are hidden
DirectoryInfo directory = new DirectoryInfo(#"C:\temp");
FileInfo[] files = directory.GetFiles();
var filtered = files.Where(f => !f.Attributes.HasFlag(FileAttributes.Hidden));
foreach (var f in filtered)
{
Debug.WriteLine(f);
}
This solved my problem.
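If the stray files follow the ~ prefix convention mentioned above, a hedged additional filter by name (building on the directory object and GetFiles call from the snippet above) could look like this:
var visible = directory.GetFiles("*.rtf")
    .Where(f => !f.Attributes.HasFlag(FileAttributes.Hidden) && !f.Name.StartsWith("~"));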
In my application there is a situation like this. Before creating a file, my application searches a directory for files under a particular file name. If any files are found, it should read each file's contents and write those contents (of each file) to a new file. I have googled a lot and tried some things like this:
string temp_file_format = "ScriptLog_" + DateTime.Now.ToString("dd_MM_yyyy_HH");
string[] files = Directory.GetFiles(path,temp_file_format);
foreach (FileAccess finfo in files)
{
string text = File.ReadAllText(finfo);
}
and
System.IO.DirectoryInfo dir = new DirectoryInfo(path);
System.IO.FileInfo[] files = dir.GetFiles(temp_file_format);
foreach (FileInfo finfo in files)
{
finfo.OpenRead();
}
But all of these failed. Can anyone show me an alternative?
Is there anything wrong with my temp_file_format string?
It would be nice if I could prepend these contents to the new file. If not, no worries.
Any help would be really appreciated.
This is a complete working implementation that does all of that
without reading everything in memory at one time (which doesn't work for large files)
without keeping any files open for more than the required time
using System.IO;
using System.Linq;
public static class Program {
public static void Main()
{
var all = Directory.GetFiles("/tmp", "*.cpp")
.SelectMany(File.ReadLines);
using (var w = new StreamWriter("/tmp/output.txt"))
foreach(var line in all)
w.WriteLine(line);
}
}
I tested it on Mono 2.10, and it should work on any .NET 4.0+ (File.ReadLines returns a lazy, line-wise enumerable).
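Because File.ReadLines streams lazily, the same pipeline can also filter or transform lines without ever holding a whole file in memory. A hedged sketch using the same /tmp paths as above:
var nonEmpty = Directory.GetFiles("/tmp", "*.cpp")
    .SelectMany(File.ReadLines)
    .Where(line => line.Trim().Length > 0); // nothing is read until nonEmpty is enumerated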
Here's a short snippet that reads all the files and outputs them to the path outputPath
var lines = from file in Directory.GetFiles(path,temp_file_format)
from line in File.ReadAllLines(file)
select line;
File.WriteAllLines(outputPath, lines);
The problem you are having is not really about reading files; it's simply that you are trying to use an object as a type it is not. Directory.GetFiles returns an array of strings, and File.ReadXXX and File.OpenRead expect the path as a string. So you simply need to pass each of the returned strings as the path argument to the appropriate method. The snippet above is one such example. Hope it helps both solve your problem and explain the actual issue with your code.
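As a further illustration, here is a hedged sketch that reuses the question's path and temp_file_format variables and passes each returned string straight to File.ReadAllText:
foreach (string file in Directory.GetFiles(path, temp_file_format))
{
    string text = File.ReadAllText(file); // 'file' is already a string path
    // ... use text ...
}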
try this:
foreach (FileInfo finfo in files)
{
try
{
using (StreamReader sr = new StreamReader("finfo "))
{
String line = sr.ReadToEnd();
Console.WriteLine(line);
}
}
catch (Exception e)
{
Console.WriteLine("The file could not be read:");
Console.WriteLine(e.Message);
}
}
using (var output = File.Create(outputPath))
{
foreach (var file in Directory.GetFiles(InputPath,temp_file_format))
{
using (var input = File.OpenRead(file))
{
input.CopyTo(output);
}
}
}
In C#, I'm reading a moderate size of file (100 KB ~ 1 MB), modifying some parts of the content, and finally writing to a different file. All contents are text. Modification is done as string objects and string operations. My current approach is:
Read each line from the original file by using StreamReader.
Open a StringBuilder for the contents of the new file.
Modify the string object and call AppendLine of the StringBuilder (until the end of the file)
Open a new StreamWriter, and write the StringBuilder to the write stream.
However, I've found that StreamWriter.Write truncates the output at 32768 bytes (2^15), but the length of the StringBuilder is greater than that. I could write a simple loop to guarantee the entire string gets to the file, but I'm wondering what the most efficient way to do this in C# would be.
To summarize, I'd like to modify only some parts of a text file and write the result to a different file, but the text file could be larger than 32768 bytes.
== Answer == Sorry for the confusion! It was simply that I didn't call Flush. StreamWriter.Write does not have any such (e.g., 2^15) length limitation.
StreamWriter.Write does not truncate the string and has no such limitation. Internally it uses String.CopyTo, which in turn uses unsafe code (fixed) to copy characters, so it is very efficient.
The problem is most likely related to not closing the writer. See http://msdn.microsoft.com/en-us/library/system.io.streamwriter.flush.aspx.
But I would suggest not loading the whole file in memory if that can be avoided.
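As a minimal sketch of that point (assuming sb is the StringBuilder from the question and out.txt is just a placeholder path), wrapping the writer in a using block guarantees the flush:
using (var writer = new StreamWriter("out.txt"))
{
    writer.Write(sb.ToString()); // Dispose flushes and closes the stream, so nothing gets cut off
}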
Can you try this:
void Test()
{
using (var inputFile = File.OpenText(#"c:\in.txt"))
{
using (var outputFile = File.CreateText(#"c:\out.txt"))
{
string current;
while ((current = inputFile.ReadLine()) != null)
{
outputFile.WriteLine(Process(current));
}
}
}
}
string Process(string current)
{
return current.ToLower();
}
This avoids having the full file loaded in memory by processing it line by line and writing each line directly.
Well, that entirely depends on what you want to modify. If your modifications to one part of the text file depend on another part of the text file, you obviously need to have both of those parts in memory. If, however, you only need to modify the text file on a line-by-line basis, then use something like this:
using (StreamReader sr = new StreamReader(#"test.txt"))
{
using (StreamWriter sw = new StreamWriter(#"modifiedtest.txt"))
{
while (!sr.EndOfStream)
{
string line = sr.ReadLine();
//do some modifications
sw.WriteLine(line);
sw.Flush(); //force line to be written to disk
}
}
}
Instead of running through the whole document, I would use a regex to find what you are looking for. Sample:
public List<string> GetAllProfiles()
{
List<string> profileNames = new List<string>();
using (StreamReader reader = new StreamReader(_folderLocation + "profiles.pg"))
{
string profiles = reader.ReadToEnd();
var regex = new Regex("\nname=([^\r]{0,})", RegexOptions.IgnoreCase);
var regexMatchs = regex.Matches(profiles);
profileNames.AddRange(from Match regexMatch in regexMatchs select regexMatch.Groups[1].Value);
}
return profileNames;
}
I have a few multi-million-line text files located in a directory. I want to read them line by line, replace "|" with "\", and then write each line out to a new file. This code might work just fine, but I'm not seeing any resulting text files; or it might be that I'm just being impatient.
{
string startingdir = #"K:\qload";
string dest = #"K:\D\ho\jlg\load\dest";
string[] files = Directory.GetFiles(startingdir, "*.txt");
foreach (string file in files)
{
StringBuilder sb = new StringBuilder();
using (FileStream fs = new FileStream(file, FileMode.Open))
using (StreamReader rdr = new StreamReader(fs))
{
while (!rdr.EndOfStream)
{
string begdocfile = rdr.ReadLine();
string replacementwork = docfile.Replace("|", "\\");
sb.AppendLine(replacementwork);
FileInfo file_info = new FileInfo(file);
string outputfilename = file_info.Name;
using (FileStream fs2 = new FileStream(dest + outputfilename, FileMode.Append))
using (StreamWriter writer = new StreamWriter(fs2))
{
writer.WriteLine(replacementwork);
}
}
}
}
}
DUHHHHH Thanks to everyone.
Id10t error.
Get rid of the StringBuilder, and do not reopen the output file for each line:
string startingdir = #"K:\qload";
string dest = #"K:\D\ho\jlg\load\dest";
string[] files = Directory.GetFiles(startingdir, "*.txt");
foreach (string file in files)
{
var outfile = Path.Combine(dest, Path.GetFileName(file));
using (StreamReader reader = new StreamReader(file))
using (StreamWriter writer = new StreamWriter(outfile))
{
string line = reader.ReadLine();
while (line != null)
{
writer.WriteLine(line.Replace("|", "\\"));
line = reader.ReadLine();
}
}
}
Why are you using a StringBuilder? You are just filling up memory without doing anything with it.
You should also move the FileStream and StreamWriter using statements outside of your loop - you are re-creating your output streams for every line, causing unneeded IO in the form of opening and closing the file.
Use Path.Combine(dest, outputfilename); from your code it looks like you're currently writing to the file K:\D\ho\jlg\load\destoutputfilename.txt.
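To illustrate the difference with hypothetical values:
string dest = @"K:\D\ho\jlg\load\dest";
string name = "1.txt";
string wrong = dest + name;               // K:\D\ho\jlg\load\dest1.txt
string right = Path.Combine(dest, name);  // K:\D\ho\jlg\load\dest\1.txt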
This code might work just fine, but I'm not seeing any resulting text files; or it might be that I'm just being impatient.
Have you considered putting a Console.WriteLine in there to check the progress? Sure, it will slow down performance a tiny bit, but you'll know what's going on.
It looks like you might want to do a Path.Combine, so that instead of new FileStream(dest + outputfilename) you have new FileStream(Path.Combine(dest, outputfilename)), which will create the files in the directory that you expect, rather than creating them in K:\D\ho\jlg\load.
However, I'm not sure why you're writing to a StringBuilder that you're not using, or why you're opening and closing the file stream and stream writer for each line that you write. Is that to force the writer to flush its output? If so, it might be easier to just flush the writer/stream on each write.
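If per-line durability is the goal, a hedged alternative to reopening the file for every line is to open the writer once and enable AutoFlush (rdr, dest, and outputfilename are the names from the question):
using (var writer = new StreamWriter(Path.Combine(dest, outputfilename)))
{
    writer.AutoFlush = true; // push each line to disk as soon as it is written
    while (!rdr.EndOfStream)
    {
        writer.WriteLine(rdr.ReadLine().Replace("|", "\\"));
    }
}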
You're opening and closing the output stream for each line you write, so you'll have to be very patient!
Open it once, outside the loop.
I guess the problem is here:
string begdocfile = rdr.ReadLine();
string replacementwork = docfile.Replace("|", "\\");
you're reading into the begdocfile variable but replacing characters in docfile, which I guess is empty
string replacementwork = docfile.Replace("|", "\\");
I believe the above line in your code is incorrect: it should use begdocfile.Replace(...).
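For clarity, the corrected line would read:
string replacementwork = begdocfile.Replace("|", "\\");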
I suggest you focus on getting as much of the declaration and "name manufacture" out of the inner loop as possible: right now you are creating new FileInfo objects and path names for every single line you read, in every file, and that has to be hugely expensive.
Make a single pass over the list of target files first and create the destination files all at once, perhaps storing them in a List for easy access later, or in a Dictionary where the string value is the new file path associated with each FileInfo. Another strategy: just copy the whole directory once, then operate directly on the copied files, then rename them, rename the directory, whatever.
Move every variable declaration out of that inner loop, and out of the using code blocks, that you can.
I suspect you are going to hear from someone here at more of a "guru level" shortly who might suggest a different strategy based on a more profound knowledge of streams than I have, but that's a guess.
Good luck !
I've apparently worked myself into a bad coding habit. Here is an example of the code I've been writing:
using(StreamReader sr = new StreamReader(File.Open("somefile.txt", FileMode.Open)))
{
//read file
}
File.Move("somefile.txt", "somefile.bak"); //can't move, get exception that I the file is open
I thought that because the using clause explicitly called Close() and Dispose() on the StreamReader that the FileStream would be closed as well.
The only way I could fix the problem I was having was by changing the above block to this:
using(FileStream fs = File.Open("somefile.txt", FileMode.Open))
{
using(StreamReader sr = new StreamReader(fs))
{
//read file
}
}
File.Move("somefile.txt", "somefile.bak"); // can move file with no errors
Should closing the StreamReader by disposing in the first block also close the underlying FileStream? Or, was I mistaken?
Edit
I decided to post the actual offending block of code, to see if we can get to the bottom of this. I am just curious now.
I thought I had a problem in the using clause, so I expanded everything out, and it still can't copy, every time. I create the file in this method call, so I don't think anything else has a handle open on the file. I've also verified that the strings returned from the Path.Combine calls are correct.
private static void GenerateFiles(List<Credit> credits)
{
Account i;
string creditFile = Path.Combine(Settings.CreditLocalPath, DateTime.Now.ToString("MMddyy-hhmmss") + ".credits");
StreamWriter creditsFile = new StreamWriter(File.Open(creditFile, FileMode.Create));
creditsFile.WriteLine("code\inc");
foreach (Credit c in credits)
{
if (DataAccessLayer.AccountExists(i))
{
string tpsAuth = DataAccessLayer.GetAuthCode(i.Pin);
creditsFile.WriteLine(String.Format("{0}{1}\t{2:0.00}", i.AuthCode, i.Pin, c.CreditAmount));
}
else
{
c.Error = true;
c.ErrorMessage = "NO ACCOUNT";
}
DataAccessLayer.AddCredit(c);
}
creditsFile.Close();
creditsFile.Dispose();
string dest = Path.Combine(Settings.CreditArchivePath, Path.GetFileName(creditFile));
File.Move(creditFile,dest);
//File.Delete(errorFile);
}
Yes, StreamReader.Dispose closes the underlying stream (for all public ways of creating one). However, there's a nicer alternative:
using (TextReader reader = File.OpenText("file.txt"))
{
}
This has the added benefit that it opens the underlying stream with a hint to Windows that you'll be accessing it sequentially.
Here's a test app which shows the first version working for me. I'm not trying to say that's proof of anything in particular - but I'd love to know how well it works for you.
using System;
using System.IO;
class Program
{
public static void Main(string[] args)
{
for (int i=0; i < 1000; i++)
{
using(StreamReader sr = new StreamReader
(File.Open("somefile.txt", FileMode.Open)))
{
Console.WriteLine(sr.ReadLine());
}
File.Move("somefile.txt", "somefile.bak");
File.Move("somefile.bak", "somefile.txt");
}
}
}
If that works, it suggests that it's something to do with what you do while reading...
And now here's a shortened version of your edited question code - which again works fine for me, even on a network share. Note that I've changed FileMode.Create to FileMode.CreateNew - as otherwise there could still have been an app with a handle on the old file, potentially. Does this work for you?
using System;
using System.IO;
public class Test
{
static void Main()
{
StreamWriter creditsFile = new StreamWriter(File.Open("test.txt",
FileMode.CreateNew));
creditsFile.WriteLine("code\\inc");
creditsFile.Close();
creditsFile.Dispose();
File.Move("test.txt", "test2.txt");
}
}
Note - your using blocks do not need to be nested in their own blocks - they can be sequential, as in:
using(FileStream fs = File.Open("somefile.txt", FileMode.Open))
using(StreamReader sr = new StreamReader(fs))
{
//read file
}
The order of disposal in this case is still the same as the nested blocks (ie, the StreamReader will still dispose before the FileStream in this case).
I would try to use FileInfo.Open() and FileInfo.MoveTo() instead of File.Open() and File.Move(). You could also try to use FileInfo.OpenText(). But these are just suggestions.
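A hedged sketch of what that might look like, using the same somefile.txt / somefile.bak names as above:
FileInfo info = new FileInfo("somefile.txt");
using (StreamReader sr = info.OpenText())
{
    // read the file here
}
info.MoveTo("somefile.bak"); // moves/renames the file once the reader is disposed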
Is there any possibility that something else has a lock to somefile.txt?
A simple check from a local (to the file) cmd line
net files
may well give you some clues if anything else has a lock.
Alternatively you can get something like FileMon to take even more details, and check that your app is releasing properly.
Since this doesn't seem to be a coding issue, I'm going to put my sysadmin hat on and offer a few suggestions.
Virus scanner on either the client or server that's scanning the file as it's created.
Windows opportunistic locking has a habit of screwing things up on network shares. I recall it being mostly an issue with multiple read/write clients with flat file databases, but caching could certainly explain your problem.
Windows file open cache. I'm not sure if this is still a problem in Win2K or not, but FileMon would tell you.
Edit: If you can catch it in the act from the server machine, then Sysinternals' Handle will tell you what has it open.