We are having an issue with one server and its use of the StreamWriter class. Has anyone experienced something similar to the issue below? If so, what was the solution?
using (StreamWriter logWriter = File.CreateText(logFileName))
{
    for (int i = 0; i < 500; i++)
        logWriter.WriteLine("Process completed successfully.");
}
When writing out the file, the following output is generated:
Process completed successfully.
... (497 more lines)
Process completed successfully.
Process completed s
I tried adding logWriter.Flush() before the close, without any help. The more lines of text I write out, the more data loss occurs.
I had a very similar issue myself. I found that if I enabled AutoFlush before doing any writes to the stream, it started working as expected.
logWriter.AutoFlush = true;
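For completeness, a minimal sketch of that fix applied to the code from the question (logFileName is assumed to be a valid path, as in the original):

using (StreamWriter logWriter = File.CreateText(logFileName))
{
    // Enable AutoFlush before any writes so every WriteLine
    // is pushed through the writer's buffer immediately.
    logWriter.AutoFlush = true;

    for (int i = 0; i < 500; i++)
        logWriter.WriteLine("Process completed successfully.");
}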
Sometimes, even if you call Flush(), it won't do the trick, because Flush() causes the stream to write most of the data in its buffer but can leave out the last block.
try
{
    // ... write method
    // I don't recommend using 'using' for unmanaged resources
}
finally
{
    stream.Flush();
    stream.Close();
    stream.Dispose();
}
Cannot reproduce this.
Under normal conditions, this should not and will not fail.
Is this the actual code that fails? The text "Process completed" suggests it's an extract.
Any threading involved?
Network drive or local?
etc.
This certainly appears to be a "flushing" problem to me, even though you say you added a call to Flush(). The problem may be that your StreamWriter is just a wrapper for an underlying FileStream object.
I don't typically use the File.CreateText method to create a stream for writing to a file; I usually create my own FileStream and then wrap it with a StreamWriter if desired. Regardless, I've run into situations where I've needed to call Flush on both the StreamWriter and the FileStream, so I imagine that is your problem.
Try adding the following code:
logWriter.Flush();
if (logWriter.BaseStream != null)
    logWriter.BaseStream.Flush();
In my case, this is what I found with the output file:
Case 1: Without Flush() and without Close()
Character Length = 23,371,776
Case 2: With Flush() and without Close()
logWriter.Flush();
Character Length = 23,371,201
Case 3: When properly closed
logWriter.Close();
Character Length = 23,375,887 (required)
So, in order to get the proper result, you always need to close the writer instance.
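The safest way to guarantee that Close/Dispose always runs, even when an exception is thrown, is a using block; a minimal sketch (outputPath is a placeholder):

using (StreamWriter logWriter = new StreamWriter(outputPath))
{
    logWriter.WriteLine("Process completed successfully.");
} // Dispose() runs here, which flushes the buffer and closes the file.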
I faced the same problem. The following worked for me:
using (StreamWriter tw = new StreamWriter(@"D:\Users\asbalach\Desktop\NaturalOrder\NatOrd.txt"))
{
    tw.Write(abc.ToString()); // + Environment.NewLine
}
Using framework 4.6.1 and under heavy stress, it still has this problem. I'm not sure why it does this, though I found a way to solve it very differently (which strengthens my feeling that it's indeed a .NET bug).
In my case I tried to write huge jagged arrays to disk (video caching).
Since the jagged array is quite large, it had to do a lot of repeated writes to store a large set of video frames, and although the frames were uncompressed and each cache file got exactly 1000 frames, the logged cache files all had different sizes.
I had the problem when I used this:
// note: generateLogfileName is just a function that creates a filename
using (FileStream fs = new FileStream(generateLogfileName(), FileMode.OpenOrCreate))
{
    using (StreamWriter sw = new StreamWriter(fs))
    {
        // do your stuff, but it will be unreliable
    }
}
However, when I provided it an Encoding type, all logged files came out the same size, and the problem was gone.
using (FileStream fs = new FileStream(generateLogfileName(), FileMode.OpenOrCreate))
{
    using (StreamWriter sw = new StreamWriter(fs, Encoding.Unicode))
    {
        // all data written correctly, no data lost
    }
}
Note: also read the file back with the same encoding type!
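For illustration, a minimal sketch of reading the file back with the matching encoding (path is a placeholder):

using (FileStream fs = new FileStream(path, FileMode.Open))
using (StreamReader sr = new StreamReader(fs, Encoding.Unicode)) // same encoding the writer used
{
    string contents = sr.ReadToEnd();
}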
This did the trick for me:
streamWriter.Flush();
Related
I am trying to download a file by its ServerRelativeUrl using the OpenBinaryDirect method, and it works fine. But often, irrespective of size, extension, or any other file metadata, it just gets stuck.
Please see the code below. It gets stuck at either OpenBinaryDirect or ExecuteQuery() (more often at ExecuteQuery()) and throws an operation timeout error. I later tried setting the timeouts to -1 (the RequestTimeout and WriteTimeout lines below), but then it got stuck for an infinite time until the main thread was killed.
clientContext.RequestTimeout = -1;
FileInformation fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(clientContext, file.ServerRelativeUrl);
clientContext.ExecuteQuery();
using (var fileStream = new FileStream(location, FileMode.Create))
{
    fileInfo.Stream.WriteTimeout = -1;
    fileInfo.Stream.CopyTo(fileStream);
}
fileInfo.Stream.Dispose();
The size of the file in Explorer remains 0 KB.
Can anyone please help me out with this?
I may have had a similar problem: I got timeouts after downloading a lot of files. I solved it by disposing the "fileInfo" object by putting it into a using statement:
using (var fileInfo = File.OpenBinaryDirect(_context, relativeUrl))
{
    if (Directory.Exists(targetDirectory))
    {
        targetDirectory = Path.Combine(targetDirectory,
            Path.GetFileName(new Uri(relativeUrl).LocalPath));
    }

    using (Stream destination = System.IO.File.Create(targetDirectory))
    {
        // copy byte by byte until the end of the SharePoint stream
        for (var a = fileInfo.Stream.ReadByte(); a != -1; a = fileInfo.Stream.ReadByte())
            destination.WriteByte((byte)a);
    }
}
After that it worked. It seems that if the object is not disposed, you run into a connection limit.
I just migrated ~100 files from OnPrem to SPO using the same method.
Some differences in my solution:
clientContext.ExecuteQuery();
is only required to ensure file.ServerRelativeUrl is initialized, so you can remove it from this position and load it earlier.
As I had to download and upload multiple dependent files, I copied each SP binary stream into a MemoryStream with async methods and disposed the SP streams as fast as possible. I think there is a connection limit.
I disposed the fileInfo itself instead of the inner Stream object.
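A rough sketch of that approach, assuming the same CSOM types as in the question (serverRelativeUrl is a placeholder, and the surrounding method would need to be async):

using (var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(clientContext, serverRelativeUrl))
{
    var buffer = new MemoryStream();
    // Copy the SharePoint stream into memory so the underlying
    // connection can be released as soon as the using block ends.
    await fileInfo.Stream.CopyToAsync(buffer);
    buffer.Position = 0;
    // ... work with 'buffer' after the SP stream is disposed
}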
It's me again, and I have another problem. Somewhere, I found the following code:
private T DeepDeserialize<T>(string fileName)
{
    T returnValue;
    using (FileStream str = new FileStream(fileName, FileMode.Open))
    {
        BinaryFormatter binaryFormatter = new BinaryFormatter();
        returnValue = (T)binaryFormatter.Deserialize(str);
    }
    return returnValue;
}
I've modified some classes today, and now it always throws an error that could be translated roughly as "end of stream encountered before parsing was completed" (the exact message is localized on my machine, not in English).
I've tried inserting str.Position = 0; between the two lines in the using block, which I found somewhere here, but it doesn't help.
Can someone help me make it work again? I have no idea what to do...
You have changed the binary layout of your classes but are most likely trying to deserialize old files. That is not going to work; you have to serialize new versions first.
P.S. If you had considered versioning and a custom formatter at an early stage, you might be able to deserialize old data with new classes, depending on how drastic your change was.
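For illustration only, BinaryFormatter does offer limited version tolerance if you plan for it; a hypothetical sketch using the [OptionalField] attribute so that fields added later don't break deserialization of older files (SaveData, Score, and PlayerName are made-up names):

using System;
using System.Runtime.Serialization;

[Serializable]
public class SaveData
{
    public int Score;       // field that existed in the old layout

    [OptionalField]         // added later; old files deserialize with this left at null
    public string PlayerName;
}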
I am doing the following:
if (File.Exists(filePath))
{
    string base64 = File.ReadAllText(filePath);
    return new ImageContentDTO
    {
        ImageContentGuid = imageContentGuid,
        Base64Data = base64
    };
}
This works perfectly fine. What I want to ask is if I need to Close the file or anything similar after I am done reading from it. And if so, how?
No, you don't have to explicitly close the file, File.ReadAllText takes care of that for you.
The documentation contains this information very explicitly:
This method opens a file, reads each line of the file, and then adds each line as an element of a string. It then closes the file.
[...]
The file handle is guaranteed to be closed by this method, even if exceptions are raised.
You don't need to close anything when using File.ReadAllText, since the underlying stream reader is closed implicitly.
MSDN: File.ReadAllText
Opens a text file, reads all lines of the file, and then closes the
file.
Here's the implementation in .NET 4 (ILSpy):
string result;
using (StreamReader streamReader = new StreamReader(path, encoding))
{
    result = streamReader.ReadToEnd();
}
return result;
The using statement disposes the StreamReader (even on error), that also closes it.
I know this question has been answered, and it is almost a year old now, but for those who search for and read this question, I would like to suggest that you close a file when you are done with it, or at least investigate as my answer shows.
I am no programming expert but I have come across this situation recently.
I created a WinForms C# program and used File.ReadAllText to copy text to a string. Afterwards I tried to delete the file directly from the folder, not through the program, but I got an error that the file was still open in another program. I then stopped running the program and was able to delete the file.
That's my experience in Visual Studio 2012 Ultimate. It might be supposed to do something different, but that's what it did for me.
When I used StreamReader.ReadToEnd then StreamReader.Close on the same file, I had no problem deleting the file while running the program.
You have to close IDisposable instances only, usually by means of using, e.g.:
// StreamReader is IDisposable and should be Closed/Disposed
// either explicitly or by using
using (StreamReader sr = new StreamReader(filePath)) {
String base64 = sr.ReadToEnd();
...
}
Since you don't have an IDisposable instance in your code (File.ReadAllText returns String, which is not IDisposable), you have nothing to Close/Dispose.
StreamWriter outputFile = new StreamWriter(@"C:\Users\Marc\Desktop\_App\files\Data" + dat1 + ".txt");
outputFile.WriteLine(sb.ToString());
outputFile.Close();

StreamWriter outputFileex = new StreamWriter(@"C:\Users\Marc\Desktop\_App\files\DataEx" + dat1 + ".txt");
outputFileex.WriteLine(sbex.ToString());
outputFileex.Close();
Here's a working example I just did with a StringBuilder, "sb". If I remove one of those Close() calls, the file gets generated but shows up blank with no data. I had to add the Close() to get it to work properly.
I have two FileStreams which collect different information from different files:
FileStream dataStruc = new FileStream("c:\\temp\\dataStruc.txt", FileMode.Create, FileAccess.ReadWrite);
FileStream csvFile = new FileStream("c:\\temp\\" + fileName + ".txt", FileMode.Create, FileAccess.ReadWrite);
StreamWriter sw = new StreamWriter(csvFile);
StreamWriter swc = new StreamWriter(dataStruc);
when both streamwriters are used to get the same piece of information like shown below:
sw.WriteLine(sheet);
swc.WriteLine(sheet);
then sw streamwriter has information from file. Have I set up my filestreams incorrectly?
Assuming you don't get any exceptions/errors, and that basic stuff like the correct path for the csvFile FileStream has been verified: try adding a Flush() or properly closing the stream using Close(). Even better: use a using statement.
EDIT
After reading your question again: are you sure you just didn't switch the filestreams?
StreamWriter sw = new StreamWriter(csvFile);
StreamWriter swc = new StreamWriter(dataStruc);
as opposed to
StreamWriter sw = new StreamWriter(dataStruc);
StreamWriter swc = new StreamWriter(csvFile);
Your question and description are rather vague: "both streamwriters are used to get the same piece of information". How would stream writers be used to get information? Also: "sw streamwriter has information from file": could you be more specific? This doesn't make sense.
Whatever the case may be: use the debugger, Luke!
I suppose that you have conflicting concurrent access to the file by both StreamWriters.
You open the streams with FileMode.Create. See the MSDN documentation (highlights by me):
Specifies that the operating system should create a new file. If the
file already exists, it will be overwritten. This operation requires
FileIOPermissionAccess.Write permission. System.IO.FileMode.Create is
equivalent to requesting that if the file does not exist, use
CreateNew; otherwise, use Truncate.
I am not sure if the second StreamWriter, depending on the order of the initialization, overwrites the file of the first StreamWriter or simply fails. Either way, they do try conflicting work.
Possible solutions:
Make sure the streams access the file only one after the other, e.g. by closing the first stream with a using block before the second one accesses the file (see the sketch after this list).
Change the FileMode on the streams so that an existing file does not get overwritten, if possible. (See the documentation above.)
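A minimal sketch of the first option, assuming both writers really do target the same file (path is a placeholder; sheet is the variable from the question):

using (StreamWriter swc = new StreamWriter(new FileStream(path, FileMode.Create, FileAccess.ReadWrite)))
{
    swc.WriteLine(sheet);
} // the first stream is flushed and closed here

using (StreamWriter sw = new StreamWriter(new FileStream(path, FileMode.Append, FileAccess.Write)))
{
    sw.WriteLine(sheet);
}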
Here is my code, which opens an XML file (old.xml), filters invalid characters, and writes to another XML file (abc.xml). Finally, I load the XML (abc.xml) again. When executing the following line, there is an exception saying the XML file is in use by another process:
xDoc.Load("C:\\abc.xml");
Does anyone have any idea what is wrong? Are there any leaks in my code, and why? (I am using the "using" keyword all the time, so I'm confused to see leaks...)
Here is my whole code, I am using C# + VSTS 2008 under Windows Vista x64.
// Create an instance of StreamReader to read from a file.
// The using statement also closes the StreamReader.
Encoding encoding = Encoding.GetEncoding("utf-8", new EncoderReplacementFallback(String.Empty), new DecoderReplacementFallback(String.Empty));
using (TextWriter writer = new StreamWriter(new FileStream("C:\\abc.xml", FileMode.Create), Encoding.UTF8))
{
    using (StreamReader sr = new StreamReader("C:\\old.xml", encoding))
    {
        int bufferSize = 10 * 1024 * 1024; // could be anything
        char[] buffer = new char[bufferSize];

        // Read from the file until the end of the file is reached.
        int actualsize = sr.Read(buffer, 0, bufferSize);
        writer.Write(buffer, 0, actualsize);
        while (actualsize > 0)
        {
            actualsize = sr.Read(buffer, 0, bufferSize);
            writer.Write(buffer, 0, actualsize);
        }
    }
}
try
{
    XmlDocument xDoc = new XmlDocument();
    xDoc.Load("C:\\abc.xml");
}
catch (Exception ex)
{
    Console.WriteLine(ex.Message);
}
EDIT1: I have tried changing the size of the buffer from 10 MB to 1 MB, and it works! I am so confused; any ideas?
EDIT2: I find this issue is very easy to reproduce when the input old XML file is very big, like 100 MB or so. I suspect it may be a known .NET bug. I am going to use tools like Process Explorer/Process Monitor to see which process locks the file and keeps it from being accessed by XmlDocument.Load.
That works fine for me.
Purely a guess, but maybe a virus checker is scanning the file?
To investigate, try disabling your virus checker and see if it works (and then re-enable your virus checker).
As an aside, there is one way it can leave the file open: if the StreamWriter constructor throws an exception; but then you won't reach the XmlDocument stuff anyway... but consider:
using (FileStream fs = new FileStream("C:\\abc.xml", FileMode.Create))
using (TextWriter writer = new StreamWriter(fs, Encoding.UTF8))
{
...
}
Now fs is disposed in the edge-case where new StreamWriter(...) throws. However, I do not believe that this is the problem here.
Are you running a FileSystemWatcher on the root, perhaps?
You can also use ProcessMonitor to see who accesses that file.
The problem is your char[], which seems to be too big. If it is too big, it is allocated on the large object heap rather than the normal heap. Since the large object heap is not compacted while the program is running, space allocated there may not be reused, which looks like a memory leak. Try splitting your array up into smaller chunks.
I second Leppie's suggestion to use ProcessMonitor (or equivalent) to see for sure who is locking the file. Anything else is just speculation.
Your buffer isn't being deallocated, is it?
Have you checked that no other process tries to access the file?
Code works fine. Just checked.
using will call Dispose, but will Dispose call Close on the writing stream? If it does not, the system may still consider the file to be open for writing.
I'd try putting in a Close of the writer just before the end of its using block.
Edit: I just tried out the code myself as well. It compiled and ran without the problem you are seeing. Try turning off virus scanners, as others have mentioned, and make sure you don't have a window somewhere with the file open.
The fact that it works for some people and not for others makes me think that the file isn't being closed. Close the writer before trying to load the file.
My bet is that you have some antivirus solution running which locks the file after it is closed. To verify, try adding a delay (like 1 second) before loading the file. If that works, you have probably found the cause.
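For example, a crude way to test that theory (not a real fix):

// give the antivirus scanner a moment to release the file
System.Threading.Thread.Sleep(1000);
xDoc.Load("C:\\abc.xml");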
Run Process Explorer
Make sure it's your program locking the file first.