C# StreamWriter won't work after calling StreamReader

I'm trying to make a betting program in C#, storing the user's data in a txt file. I have no problem reading the data from it; however, I can't manage to overwrite it.
From what I've tested, if I call the StreamWriter part alone, the overwriting works fine. When I put the same code after the StreamReader part, execution reaches the Console.WriteLine("reached"); line and ignores everything after it (username is never written to the console). No error is reported and the program doesn't stop either.
Here's the class code:
class Dinero
{
    private List<string> data;
    private string path = @"C:\Users\yy\Documents\Visual Studio 2015\Projects\ErikaBot\ErikaBot\img\bank_data.txt";

    ...
    some other methods here
    ...

    public void thing(string username, int money)
    {
        FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None);
        data = new List<string>();
        using (StreamReader sr = new StreamReader(fs))
        {
            string a = sr.ReadLine();
            for (int i = 0; a != null; i++)
            {
                if (a != username)
                {
                    data.Add(a);
                }
                else i++;
                a = sr.ReadLine();
            }
        }
        string b = Convert.ToString(money);
        Console.WriteLine("reached");
        using (StreamWriter tw = new StreamWriter(fs))
        {
            Console.WriteLine(username);
            if (data != null)
            {
                for (int i = 0; i < data.Count; i++)
                {
                    tw.WriteLine(data.ElementAt(i));
                }
            }
            string money2 = Convert.ToString(money);
            tw.WriteLine(username);
            tw.WriteLine(money2);
        }
    }
}

By disposing the StreamReader you also dispose the underlying FileStream.
Either repeat the FileStream initialisation before the using statement for the StreamWriter, or put the latter inside the using statement for the StreamReader.
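As a minimal sketch of the second option (not tested against the original program, and reusing the path and data fields from the class): it assumes .NET 4.5 or later for the StreamReader overload with a leaveOpen parameter, and it assumes the stored amount sits on the line directly after the username, so adjust that part to your actual file layout. The stream also has to be truncated before writing, otherwise the new content is appended after the old content instead of replacing it.
public void thing(string username, int money)
{
    using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None))
    {
        data = new List<string>();
        // leaveOpen: true keeps fs usable after the reader is disposed
        using (StreamReader sr = new StreamReader(fs, Encoding.UTF8, detectEncodingFromByteOrderMarks: true, bufferSize: 1024, leaveOpen: true))
        {
            string a = sr.ReadLine();
            while (a != null)
            {
                if (a == username)
                {
                    sr.ReadLine(); // skip the stored amount on the next line (assumption about the file layout)
                }
                else
                {
                    data.Add(a);
                }
                a = sr.ReadLine();
            }
        }

        fs.SetLength(0); // truncate so the old contents are replaced, not appended to

        using (StreamWriter tw = new StreamWriter(fs))
        {
            foreach (string line in data)
            {
                tw.WriteLine(line);
            }
            tw.WriteLine(username);
            tw.WriteLine(money);
        }
    }
}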

Related

How to remove lag during data fetching from large text files from my code in C#?

I have a text file consisting of 21,000 lines. I have an attribute that I need to search for in the .txt file and return a value from it. All the code is done; I tried async as well as a new thread, but there is a five-second lag during the button click. How can I remove the lag?
Tried on new Unity and C#.
public async void read()
{
    string[] lines = await ReadAllLinesAsync("Assets/Blockchain Module/" + File + ".csv");
    fields = null;
    for (int j = 0; j < lines.Length; j++)
    {
        fields = lines[j].Split(',');
        x[j] = System.Convert.ToDouble(fields[1]);
        y[j] = System.Convert.ToDouble(fields[2]);
        z[j] = System.Convert.ToDouble(fields[3]);
        temp[j] = System.Convert.ToDouble(fields[4]);
    }
}
public void Start()
{
    Thread thread = new Thread(read);
    thread.Start();
    //gradient.Evaluate()
    //var main = particleSystem.main;
    //main.maxParticles = 200;
}
private const FileOptions DefaultOptions = FileOptions.Asynchronous | FileOptions.SequentialScan;
public static Task<string[]> ReadAllLinesAsync(string path) => ReadAllLinesAsync(path, Encoding.UTF8);
public static async Task<string[]> ReadAllLinesAsync(string path, Encoding encoding)
{
    var lines = new List<string>();
    // Open the FileStream with the same FileMode, FileAccess
    // and FileShare as a call to File.OpenText would've done.
    using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read, DefaultBufferSize, DefaultOptions))
    using (var reader = new StreamReader(stream, encoding))
    {
        string line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            lines.Add(line);
        }
    }
    return lines.ToArray();
}
Reading and parsing the file apparently costs 5 seconds on your system. I don't think reading it line by line is the fastest approach, but either way, don't parse the file for every request.
Read it once at application startup and cache it in an appropriate data type.
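A minimal sketch of that caching idea, assuming the attribute you search for is the first column of the CSV (a hypothetical layout; adjust the key selector to your real file) and the usual System.Collections.Generic and System.Threading.Tasks usings:
// Built once at startup, then reused for every lookup.
private static Dictionary<string, string[]> cache;

public static async Task LoadCacheAsync(string path)
{
    string[] lines = await ReadAllLinesAsync(path); // the helper from the question
    cache = new Dictionary<string, string[]>();
    foreach (string line in lines)
    {
        string[] fields = line.Split(',');
        cache[fields[0]] = fields; // key on the first column (assumption)
    }
}

public static string[] Lookup(string attribute)
{
    string[] fields;
    return cache != null && cache.TryGetValue(attribute, out fields) ? fields : null;
}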
In general, if your search is line-based, it is better to read the file line by line instead of reading it all at once:
using (StreamReader reader = new StreamReader("filename"))
{
    while (true)
    {
        string line = await reader.ReadLineAsync();
        if (line == null)
        {
            break;
        }
        //logic here...
    }
}

StreamWriter only writes one line

I am trying to write from a .csv file to a new file.
Every time StreamWriter writes, it writes to the first line of the new file. It then overwrites that line with the next string, and continues to do so until StreamReader reaches EndOfStream.
Has anybody ever experienced this? How did you overcome it?
This is my first solution outside of those required by my school work. There is an unknown number of rows in the original file. Each row of the .csv file has only 17 columns. I need to write only three of them, in the order found in the code snippet below.
Before coding the StreamWriter I used Console.WriteLine() to make sure that each line was in the correct order.
Here is the code snippet:
{
    string path = @"c:\directory\file.csv";
    string newPath = @"c:\directory\newFile.csv";
    using (FileStream fs = new FileStream(path, FileMode.Open))
    {
        using (StreamReader sr = new StreamReader(fs))
        {
            string line;
            string[] columns;
            while ((line = sr.ReadLine()) != null)
            {
                columns = line.Split(',');
                using (FileStream aFStream = new FileStream(
                    newPath,
                    FileMode.OpenOrCreate,
                    FileAccess.ReadWrite))
                using (StreamWriter sw = new StreamWriter(aFStream))
                {
                    sw.WriteLine(columns[13] + ',' + columns[10] + ',' + columns[16]);
                    sw.Flush();
                    sw.WriteLine(sw.NewLine);
                }
            }
        }
    }
}
You should open the target file in the same scope as the source, rather than inside the loop; opening it in the loop with FileMode.OpenOrCreate causes the file to be overwritten on every iteration.
var path = @"c:\directory\file.csv";
var newPath = @"c:\directory\newFile.csv";
using (var sr = new StreamReader(new FileStream(path, FileMode.Open)))
using (var sw = new StreamWriter(new FileStream(newPath, FileMode.OpenOrCreate, FileAccess.ReadWrite)))
{
    while (!sr.EndOfStream)
    {
        string line = sr.ReadLine();
        var columns = line.Split(',');
        sw.WriteLine(columns[13] + ',' + columns[10] + ',' + columns[16]);
        sw.WriteLine(sw.NewLine);
    }
    sw.Flush();
}
I also hope you are sure about your CSV layout, since you are hard-coding the column positions in your code.
To fix your code properly, you'll want to give it a bit more structure:
public void CopyFileContentToLog()
{
    var document = ReadByLine();
    WriteToFile(document);
}
public IEnumerable<string> ReadByLine()
{
    string line;
    using (StreamReader reader = File.OpenText(...))
        while ((line = reader.ReadLine()) != null)
            yield return line;
}
public void WriteToFile(IEnumerable<string> contents)
{
    using (StreamWriter writer = new StreamWriter(...))
    {
        foreach (var line in contents)
            writer.WriteLine(line);
        writer.Flush();
    }
}
You could obviously tailor it and make it a bit more flexible, but this should demonstrate and resolve some of the issues you have with your loop and streams.
First off, you are creating and closing a write stream to the same file for every single line, which means the file gets overwritten each time. You want to move your using block outside of the while loop; if you insist on opening and closing the write stream for every single line, you need to use FileMode.Append instead.
{
    string path = @"c:\directory\file.csv";
    string newPath = @"c:\directory\newFile.csv";
    using (StreamReader sr = new StreamReader(new FileStream(path, FileMode.Open))) // no need for 2 usings
    using (FileStream aFStream = new FileStream(newPath, FileMode.OpenOrCreate, FileAccess.ReadWrite))
    using (StreamWriter sw = new StreamWriter(aFStream))
    {
        string line;
        string[] columns;
        while ((line = sr.ReadLine()) != null)
        {
            columns = line.Split(',');
            sw.WriteLine(columns[13] + ',' + columns[10] + ',' + columns[16]);
            sw.Flush();
            sw.WriteLine(sw.NewLine);
        }
    }
}
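And if you really do want to reopen the writer per line, here is a rough sketch of the FileMode.Append variant mentioned above (same hypothetical paths as in the question):
while ((line = sr.ReadLine()) != null)
{
    var columns = line.Split(',');
    // Append keeps what was written in earlier iterations instead of overwriting it
    using (var aFStream = new FileStream(newPath, FileMode.Append, FileAccess.Write))
    using (var sw = new StreamWriter(aFStream))
    {
        sw.WriteLine(columns[13] + ',' + columns[10] + ',' + columns[16]);
    }
}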

The process cannot access the file because it is being used by another process

public bool ReadFile()
{
    string fname = "text.txt";
    FileStream fs = null;
    fs = new FileStream(fname, FileMode.OpenOrCreate, FileAccess.Read);
    StreamReader sr = new StreamReader(fs);
    string res = sr.ReadToEnd();
    if (res == "1")
        return true;
    else
        return false;
}
public void WriteToFile()
{
    string fname = "text.txt";
    FileStream fs = null;
    fs = new FileStream(fname, FileMode.Open, FileAccess.Write);
    StreamWriter sw = new StreamWriter(fs);
    sw.Write("1");
}
So it should work like this: if ReadFile returns false, then I call WriteToFile.
But when it reaches WriteToFile, it throws an IOException:
The process cannot access the file ... because it is being used by another process
You aren't closing the file when you read it.
Put your FileStream and StreamReader objects in using statements:
using (var fs = new FileStream(fname, FileMode.OpenOrCreate, FileAccess.Read))
{
    using (var sr = new StreamReader(fs))
    {
        //read file here
    }
}
Make sure you do the same when you write to the file.
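As a rough sketch of both methods rewritten with using statements (closing the streams also flushes them, which the original WriteToFile never did):
public bool ReadFile()
{
    string fname = "text.txt";
    using (var fs = new FileStream(fname, FileMode.OpenOrCreate, FileAccess.Read))
    using (var sr = new StreamReader(fs))
    {
        return sr.ReadToEnd() == "1"; // stream is closed as soon as the method returns
    }
}

public void WriteToFile()
{
    string fname = "text.txt";
    using (var fs = new FileStream(fname, FileMode.Open, FileAccess.Write))
    using (var sw = new StreamWriter(fs))
    {
        sw.Write("1"); // disposing sw flushes and closes the file
    }
}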
You need to dispose the StreamReader object in the ReadFile method. StreamReader implements IDisposable, and therefore you need to dispose the object.
Check this link for more info: StreamReader Class

C# ZipArchive losing data

I'm trying to copy the contents of one Excel file to another Excel file while replacing a string inside of the file on the copy. It's working for the most part, but the file is losing 27 kb of data. Any suggestions?
public void ReplaceString(string what, string with, string path)
{
    List<string> doneContents = new List<string>();
    List<string> doneNames = new List<string>();
    using (ZipArchive archive = ZipFile.Open(_path, ZipArchiveMode.Read))
    {
        int count = archive.Entries.Count;
        for (int i = 0; i < count; i++)
        {
            ZipArchiveEntry entry = archive.Entries[i];
            using (var entryStream = entry.Open())
            using (StreamReader reader = new StreamReader(entryStream))
            {
                string txt = reader.ReadToEnd();
                if (txt.Contains(what))
                {
                    txt = txt.Replace(what, with);
                }
                doneContents.Add(txt);
                string name = entry.FullName;
                doneNames.Add(name);
            }
        }
    }
    using (MemoryStream zipStream = new MemoryStream())
    {
        using (ZipArchive newArchive = new ZipArchive(zipStream, ZipArchiveMode.Create, true, Encoding.UTF8))
        {
            for (int i = 0; i < doneContents.Count; i++)
            {
                int spot = i;
                ZipArchiveEntry entry = newArchive.CreateEntry(doneNames[spot]);
                using (var entryStream = entry.Open())
                using (var sw = new StreamWriter(entryStream))
                {
                    sw.Write(doneContents[spot]);
                }
            }
        }
        using (var fileStream = new FileStream(path, FileMode.Create))
        {
            zipStream.Seek(0, SeekOrigin.Begin);
            zipStream.CopyTo(fileStream);
        }
    }
}
I've used Microsoft's DocumentFormat.OpenXML and Excel Interop, however, they are both lacking in a few main components that I need.
Update:
using (var fileStream = new FileStream(path, FileMode.Create))
{
    var wrapper = new StreamWriter(fileStream);
    wrapper.AutoFlush = true;
    zipStream.Seek(0, SeekOrigin.Begin);
    zipStream.CopyTo(wrapper.BaseStream);
    wrapper.Flush();
    wrapper.Close();
}
Try the process without changing the string and see if the file size is the same. If so, then it would seem that your copy is working correctly; however, as Marc B suggested, with compression even a small change can result in a larger change in the overall size.
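A sketch of that check, assuming System.IO.Compression: it copies each entry byte-for-byte with no text decoding, so any remaining size difference can only come from recompression (the method and parameter names here are placeholders).
public void CopyWithoutChanges(string sourcePath, string destPath)
{
    using (ZipArchive source = ZipFile.Open(sourcePath, ZipArchiveMode.Read))
    using (FileStream outStream = new FileStream(destPath, FileMode.Create))
    using (ZipArchive dest = new ZipArchive(outStream, ZipArchiveMode.Create))
    {
        foreach (ZipArchiveEntry entry in source.Entries)
        {
            ZipArchiveEntry copy = dest.CreateEntry(entry.FullName);
            using (Stream from = entry.Open())
            using (Stream to = copy.Open())
            {
                from.CopyTo(to); // byte-for-byte copy, no string replacement involved
            }
        }
    }
}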

c# exception file is being used by another process

I have trouble with the following two functions. Both have an identical basic scheme, but the first one works while the second one causes an exception at the marked line ("File is used by another process").
// this works
public static void EncryptFile(string FileName)
{
    string ToEncrypt = null;
    using (StreamReader sr = new StreamReader(FileName))
    {
        ToEncrypt = sr.ReadToEnd();
    }
    using (StreamWriter sw = new StreamWriter(FileName, false))
    {
        string Encrypted = Encrypt(ToEncrypt, true);
        sw.Write(Encrypted);
    }
}
// this does not work - see commented line
public static void DecryptFile(string FileName)
{
    string ToDecrypt = null;
    using (StreamReader sr = new StreamReader(FileName))
    {
        ToDecrypt = sr.ReadToEnd();
    }
    // here comes the exception
    using (StreamWriter sw = new StreamWriter(FileName, false))
    {
        string Decrypted = Decrypt(ToDecrypt, true);
        sw.Write(Decrypted);
    }
}
I have tried an additional Close() after reading and writing, but that doesn't work either.
I hope somebody can help.
Thanks
Torsten
Is the function called from multiple threads? If so, you may want to declare a static object at class level and place a lock statement around the entire body of the method, like this:
private static Object syncObject = new Object();

// this does not work - see commented line
public static void DecryptFile(string FileName)
{
    lock (syncObject)
    {
        string ToDecrypt = null;
        using (StreamReader sr = new StreamReader(FileName))
        {
            ToDecrypt = sr.ReadToEnd();
        }
        // here comes the exception
        using (StreamWriter sw = new StreamWriter(FileName, false))
        {
            string Decrypted = Decrypt(ToDecrypt, true);
            sw.Write(Decrypted);
        }
    }
}
Also, just for fun, could you comment out the StreamReader statement and try to run the method again? If it still doesn't work, check whether you have that file open in a text editor or something similar, for example with Process Explorer.
Edit:
Could you comment out the StreamReader part, so that it looks like this:
public static void DecryptFile(string FileName)
{
    //string ToDecrypt = null;
    //using (StreamReader sr = new StreamReader(FileName))
    //{
    //    ToDecrypt = sr.ReadToEnd();
    //}
    // here comes the exception
    using (StreamWriter sw = new StreamWriter(FileName, false))
    {
        string Decrypted = Decrypt(ToDecrypt, true);
        sw.Write(Decrypted);
    }
}
Also, could you try to open an exclusive FileStream on that file before the StreamReader, and once after the StreamReader but before the StreamWriter? http://msdn.microsoft.com/de-de/library/tyhc0kft%28v=vs.110%29.aspx
And could you try using another file for that method?
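A sketch of that diagnostic, using a hypothetical AssertExclusiveAccess helper around the existing method: if the second probe throws, something (possibly the reader or another process) still holds a handle on the file between the read and the write.
private static void AssertExclusiveAccess(string fileName)
{
    using (var probe = new FileStream(fileName, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
    {
        // If we get here, no other handle is currently open on the file.
    }
}

public static void DecryptFile(string FileName)
{
    AssertExclusiveAccess(FileName); // before the StreamReader

    string ToDecrypt;
    using (StreamReader sr = new StreamReader(FileName))
    {
        ToDecrypt = sr.ReadToEnd();
    }

    AssertExclusiveAccess(FileName); // after the StreamReader, before the StreamWriter

    using (StreamWriter sw = new StreamWriter(FileName, false))
    {
        sw.Write(Decrypt(ToDecrypt, true));
    }
}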
