File access exception that I can't shake - C#

I have a third-party API that is dropping a file into a directory every 100 milliseconds. Sometimes I can grab it and read it, but I am swallowing a lot of "process cannot access the file because it is being used by another process" exceptions just to get one good read. How do I stop this exception from happening?
Here is the code I am using. I read that using "using" statements should prevent this error, but it's not helping.
string content = "";
string path = "C:\\output.txt";
using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.ReadWrite))
{
    byte[] b = new byte[1024];
    UTF8Encoding temp = new UTF8Encoding(true);
    int bytesRead;
    while ((bytesRead = fs.Read(b, 0, b.Length)) > 0)
    {
        content += temp.GetString(b, 0, bytesRead); // decode only the bytes actually read
    }
}
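The open call above demands read/write access, which conflicts with the writer's handle, and reading needs none of that. A common pattern is to request only read access with `FileShare.ReadWrite`, and to retry on `IOException`, since a writer that replaces the file every 100 ms will still occasionally win the race. A minimal sketch (the helper name and retry/back-off values are illustrative, not from the question):

```csharp
using System;
using System.IO;
using System.Threading;

// Hypothetical helper: request only read access, share with the writer,
// and retry a few times when the writer still holds the file.
static string ReadWithRetry(string path, int maxAttempts = 5)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
            using (var reader = new StreamReader(fs))
            {
                return reader.ReadToEnd();
            }
        }
        catch (IOException) when (attempt < maxAttempts)
        {
            Thread.Sleep(50); // writer probably mid-replace; back off and try again
        }
    }
}
```

On the final attempt the `IOException` is allowed to propagate, so a persistently locked file still surfaces as an error instead of hanging forever.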

Related

Detecting a corrupted file when loading data using binaryformatter / deserialize

I'm working on a videogame where I save/load player savegames using C#'s BinaryFormatter. This works 99% of the time, but sometimes a user's savegame will get corrupted somehow, and then the game won't be able to read the file. If I could detect when the game encounters this problem, I could tell it to load a backup copy of the last good savegame instead, which would be helpful for everyone.
This is how I'm loading the data:
if (File.Exists(Application.persistentDataPath + "/" + saveLoad.saveFileName))
{
    BinaryFormatter bf = new BinaryFormatter();
    using (FileStream file = File.Open(Application.persistentDataPath + "/" + saveLoad.saveFileName, FileMode.Open))
    {
        saveLoad.savedGames = (List<savedGame_latest>)bf.Deserialize(file);
    }
    success = true;
}
By the way, this is the error when the game loads a corrupted file:
EndOfStreamException: Failed to read past end of stream
Any ideas? What I want is basically a way for the system to detect "oops no, that's corrupted" and to then be shunted to try and load the last safe backup instead.
Well, you have to open the file to check whether it can be opened or not. What you can do is write a function that checks whether the file can be opened:
protected virtual bool IsFileCorrupted(FileInfo file)
{
    FileStream stream = null;
    try
    {
        stream = File.Open(file.FullName, FileMode.Open, FileAccess.Read, FileShare.None);
    }
    catch (IOException)
    {
        // file could not be opened; treat it as corrupted
        return true;
    }
    finally
    {
        if (stream != null)
            stream.Close();
    }
    // file could be opened
    return false;
}
Since BinaryFormatter stops reading the stream when it finishes, you can simply append a hash or checksum value after the saved content without breaking the functionality.
Catching just EndOfStreamException detects only one possible corruption anyway.
Saving (hashAlg can be any HashAlgorithm implementation):
new BinaryFormatter().Serialize(stream, savegame); // regular data
var hash = hashAlg.ComputeHash(stream.ToArray());
stream.Write(hash, 0, hash.Length); // hash
And loading:
int hashLength = hashAlg.HashSize / 8; // hash size in bytes
var bytes = stream.ToArray();
var hash = hashAlg.ComputeHash(bytes, 0, (int)stream.Length - hashLength);
if (!hash.SequenceEqual(bytes.Skip(bytes.Length - hashLength)))
    throw new ArgumentException("Savegame Corrupted"); // gotcha!
savegame = (SaveGame)new BinaryFormatter().Deserialize(stream);
return savegame;
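The fragments above assume `stream` is a `MemoryStream` positioned at the start when loading. The trailer idea itself is independent of BinaryFormatter (which is obsolete in current .NET); a minimal sketch of the same append-and-verify scheme over raw payload bytes, using SHA-256 (the helper names are illustrative, not from the answer):

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;

// Append a SHA-256 hash of the payload as a trailer.
static byte[] WithHashTrailer(byte[] payload)
{
    using var sha = SHA256.Create();
    byte[] hash = sha.ComputeHash(payload);
    return payload.Concat(hash).ToArray(); // payload followed by its hash
}

// Recompute the hash over everything before the trailer and compare.
static bool VerifyHashTrailer(byte[] blob, out byte[] payload)
{
    using var sha = SHA256.Create();
    int hashLength = sha.HashSize / 8; // 32 bytes for SHA-256
    payload = blob.Take(blob.Length - hashLength).ToArray();
    byte[] expected = sha.ComputeHash(payload);
    return expected.SequenceEqual(blob.Skip(blob.Length - hashLength));
}
```

A single flipped bit anywhere in the payload or the trailer makes the comparison fail, which covers far more corruption cases than catching `EndOfStreamException` alone.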

Await for a process to finish execution

I have a process that takes a varying amount of time to execute, and the code that follows depends on its results.
The process creates a printable (PRN) file. The following section then reads that file and returns its byte contents.
When I put a breakpoint at the using statement, I can read the bytes of the created file and return them to where they are being requested. But when I execute as usual, I get the error:
The process cannot access the file 'linkToFile' because it is being used by another process
lbl.PrintSettings.PrinterName = printerName;
byte[] fileBytes = null;
Task.Run(() => { lbl.Print(int.Parse(qty)); }).Wait(2000);
using (var strm = File.Open(outPutPrintFile,
    FileMode.Open, FileAccess.Read, FileShare.Read))
{
    using (var ms = new MemoryStream())
    {
        strm.CopyTo(ms);
        fileBytes = ms.ToArray();
    }
}
return Ok(fileBytes);
I tried wrapping the longer-running part in a Task with a Wait, but I am still getting the same error.
Try using FileShare.ReadWrite instead of FileShare.Read. It's not for some unknown reason, as you commented; ReadWrite makes sure that further read/write operations can still be performed on the file once it is open. From your posted code it looks like the option to choose.
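Note also that `Task.Run(...).Wait(2000)` only waits up to two seconds and then proceeds whether or not the print job has released the file. Combining `FileShare.ReadWrite` with a short retry loop is more robust; a sketch, with an illustrative helper name and timeout values (not from the answer):

```csharp
using System;
using System.IO;
using System.Threading;

// Hypothetical helper: poll until the writer releases the file, then read it whole.
static byte[] ReadWhenReady(string path, int timeoutMs = 5000)
{
    var deadline = DateTime.UtcNow.AddMilliseconds(timeoutMs);
    while (true)
    {
        try
        {
            using var strm = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
            using var ms = new MemoryStream();
            strm.CopyTo(ms);
            return ms.ToArray();
        }
        catch (IOException) when (DateTime.UtcNow < deadline)
        {
            Thread.Sleep(100); // writer still has the file open; wait and retry
        }
    }
}
```

Past the deadline the `IOException` propagates, so a file that never gets released shows up as an error instead of an infinite loop.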

What happens to a filestream when the file is deleted by a different process?

In C#, I open a file with FileShare.Delete. This allows me to open the file without restricting other processes from deleting it. For example:
using (FileStream fs = new FileStream(@"C:\temp\1.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite | FileShare.Delete))
{
    int len = (int)fs.Length;
    byte[] array = new byte[len];
    int bytesRead = fs.Read(array, 0, len);
}
My questions are:
What happens if the file is deleted by a different process after we created the stream, but before we read it? Does the operating system keep a copy of the file until the stream/handle is closed?
Can I rely on reading the deleted file without getting any errors or the wrong content?
The file is marked for deletion, but is not actually deleted until the last open handle to it is closed, as described in the documentation for DeleteFile.
Note that you cannot open a new handle to a file that is marked for deletion, but the file will still appear in directory listings and cannot be replaced by a file of the same name until it has actually been deleted. This is unlike Unix systems in which the file disappears from the directory (is "unlinked") immediately. As Ben suggests in the comments, you can work around this by renaming and/or moving the file before deleting it.
Also, as MoonRabbit pointed out, you can "delete" an open file using Explorer, but that is because that only moves the file to the recycle bin. The Shift+Delete option to delete a file immediately won't work.
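The rename-before-delete workaround mentioned above can be sketched as follows; the point is that the original name becomes reusable immediately, even while another process still holds a handle to the renamed file (the helper name and temp-name scheme are illustrative):

```csharp
using System;
using System.IO;

// Free the file name immediately, even if another process still holds a handle.
// The renamed file lingers (marked for deletion) until the last handle closes.
static void DeleteReleasingName(string path)
{
    string doomed = path + "." + Guid.NewGuid().ToString("N") + ".deleted";
    File.Move(path, doomed); // the original name is now free
    File.Delete(doomed);     // actual removal waits for open handles to close
}
```

After this call, a new file with the original name can be created right away, which is the behavior Unix users expect from unlink.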
Yes, another process can delete the file, but you will not get any exception: the handle to the file on disk was already created, so your process will continue reading. However, when you try to open the stream again you will get an exception, because the entry in the file system no longer exists.
Here is a full example to reproduce your case. Execute it, then go to Explorer and delete the file:
class Program
{
    static void Main(string[] args)
    {
        for (int i = 0; i < 10000; i++)
        {
            File.AppendAllText(@"c:\temp\1.txt", Guid.NewGuid().ToString());
        }

        // read the file
        using (FileStream fs = new FileStream(@"C:\temp\1.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite | FileShare.Delete))
        {
            while (fs.CanRead)
            {
                // read a chunk of 1000 bytes at a time to keep the stream open
                int len = 1000;
                Thread.Sleep(1000);
                byte[] array = new byte[len];
                int bytesRead = fs.Read(array, 0, len);
            }
        }
    }
}

copying file, file not disposed

I have the following code for copying file:
var copiedFile = ConfigurationManager.AppSettings["PathToFirebirdDB"] + ".001";
using (var inputFile = new FileStream(ConfigurationManager.AppSettings["PathToFirebirdDB"],
    FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (var outputFile = new FileStream(copiedFile, FileMode.Create))
    {
        var buffer = new byte[0x10000];
        int bytes;
        while ((bytes = inputFile.Read(buffer, 0, buffer.Length)) > 0)
        {
            outputFile.Write(buffer, 0, bytes);
        }
    }
}
This code works fine only the first time. The next time I get the following message:
The process cannot access the file 'D:\Programs\IBExpert\db.fdb.001' because it is being used by another process. System.IO.IOException: The process cannot access the file 'D:\Programs\IBExpert\db.fdb.001' because it is being used by another process.
Why? There are using blocks.
If you try to reopen the file just after closing it, there is a chance the file is still considered open by the system because it actually is.
A typical reason is that a virus scanner is keeping the file open to ensure it is not infected, this happens in the background and might continue running after you have closed the file yourself.
Probably because you are not closing the files.
BTW why don't you just use File.Copy?
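For the simple case, the whole block can indeed collapse to a single call. A sketch (the helper name is illustrative; `overwrite: true` lets it replace an existing ".001" copy on later runs, which is exactly where the question's code failed the second time):

```csharp
using System;
using System.IO;

// One-call equivalent of the manual buffered copy loop.
static void CopyDatabase(string sourcePath)
{
    // overwrite: true replaces an existing ".001" copy on subsequent runs
    File.Copy(sourcePath, sourcePath + ".001", overwrite: true);
}
```

This does not remove the virus-scanner race the other answer describes, but it does eliminate any chance of the copy's own streams being left open.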

Decompressing using GZipStream returns only the first line

I've been working on a function parsing third-party FMS logs. The logs are gzipped, so I use a decompression function that works for all the other gzip files we use.
When decompressing these files I only get the first line of the compressed file. There's no exception; it just doesn't find the rest of the bytes, as if there were an EOF after the first line.
I tried using Ionic.Zlib instead of System.IO.Compression, but the result was the same. The files don't seem to be corrupted in any way; decompressing them with WinRAR works.
If anybody has any idea of how to solve this, I’ll appreciate your help.
Thanks
You can download a sample file here:
http://www.adjustyourset.tv/fms_6F9E_20120621_0001.log.gz
This is my decompression function:
public static bool DecompressGZip(String fileRoot, String destRoot)
{
    try
    {
        using (FileStream fileStream = new FileStream(fileRoot, FileMode.Open, FileAccess.Read))
        {
            using (FileStream fOutStream = new FileStream(destRoot, FileMode.Create, FileAccess.Write))
            {
                using (GZipStream zipStream = new GZipStream(fileStream, CompressionMode.Decompress, true))
                {
                    byte[] buffer = new byte[4096];
                    int numRead;
                    while ((numRead = zipStream.Read(buffer, 0, buffer.Length)) != 0)
                    {
                        fOutStream.Write(buffer, 0, numRead);
                    }
                    return true;
                }
            }
        }
    }
    catch (Exception ex)
    {
        LogUtils.SaveToLog(DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss.fff"), "Error decompressing " + fileRoot + " : " + ex.Message, Constants.systemLog, 209715200, 6);
        return false;
    }
}
I've spent the last 45 minutes wrapping my head around this problem, but I just can't explain why it isn't working. Somehow the DeflateStream class isn't decoding your data properly. I wrote my own GZip parser (I can share the code if anyone wants to check it) which reads all the headers and checks them for validity (to make sure there is no funny stuff there) and then uses DeflateStream to inflate the actual data, but with your file it still just gets me the first line.
If I recompress your logfile using GZipStream (after first decompressing it with WinRAR), then it is decompressed just fine again by both my own parser and your own sample.
There is some criticism on the net of Microsoft's implementation of Deflate (http://www.virtualdub.org/blog/pivot/entry.php?id=335), so it might be that you found one of its quirks.
However, a simple solution to your problem is to switch to SharpZipLib (http://www.icsharpcode.net/opensource/sharpziplib/). I tried it out and it can decompress your file just fine.
public static void DecompressGZip(String fileRoot, String destRoot)
{
    using (FileStream fileStream = new FileStream(fileRoot, FileMode.Open, FileAccess.Read))
    using (GZipInputStream zipStream = new GZipInputStream(fileStream))
    using (StreamReader sr = new StreamReader(zipStream))
    {
        string data = sr.ReadToEnd();
        File.WriteAllText(destRoot, data);
    }
}
