Randomly Thrown IO Exceptions in C#

I have been working on a program to archive old client data for the company I work for. The program copies the data from the work server to the local machine, creates a .zip file of all the data, then copies it to the archive server.
After it does all that, it deletes the original files and the local copies. Every now and then, the program errors because it can't delete the local copies off my computer. It does not error on every folder that it zips; it may error after doing 300 of them, or after 5. It throws one of the three following errors: "The directory is not empty", "File is being used by another process", or "Access to the file is denied". I have tried setting the file attributes to normal, using a forced garbage collection, and ending the WinZip process manually.
I really do not understand why it does this only sometimes. I am the admin on my computer and it should be able to delete the files. I figured something else has to be using it, but there should be nothing else using it on my machine except the program in Visual Studio. Thanks.
Below is the cleanup method where it is not deleting the files and the method that zips the files.
[MethodImplAttribute(MethodImplOptions.NoInlining)]
static void CleanUp(SqlConnection Connection, string jNumber, DirectoryInfo dir, bool error, string prefix)
{
    if (!error | (!error & emptyFolder))
    {
        try
        {
            SqlCommand updateJob = new SqlCommand(string.Format("update job set archived = 1 where job = {0}", jNumber), Connection);
            updateJob.ExecuteNonQuery();
        }
        catch
        {
            WriteToLog("SQL Error: " + jNumber, "There was an error changing the archive bit to 1 after the job has been archived");
        }
        try
        {
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        catch
        {
            WriteToLog("Error cleaning up after processing job", "There was an error garbage collecting.");
        }
        try
        {
            // path of the temporary folder created by the program
            string tempDir = Path.Combine(Path.Combine(System.Environment.CurrentDirectory, "Temp"), jNumber);
            // path of the destination folder
            string destDir = Path.Combine(dir.ToString(), jNumber);
            //SetFileAttributes(tempDir);
            try
            {
                File.Delete(tempDir + ".zip");
            }
            catch (System.IO.IOException)
            {
                File.Delete(tempDir + ".zip");
            }
            try
            {
                Directory.Delete(destDir, true);
            }
            catch (System.IO.IOException)
            {
                Directory.Delete(destDir, true);
            }
            try
            {
                Directory.Delete(tempDir, true);
            }
            catch (System.IO.IOException)
            {
                Directory.Delete(tempDir, true);
            }
        }
        catch
        {
            WriteToLog("File Error: " + jNumber, "There was an error removing files and/or folders after the job has been archived. Please check the source server and destination server to make sure everything copied correctly. The archive bit for this job was set.");
            Directory.Delete(Path.Combine(System.Environment.CurrentDirectory, "Temp"), true);
            Directory.CreateDirectory(Path.Combine(System.Environment.CurrentDirectory, "Temp"));
        }
    }
}
static bool ZipJobFolder(string jNumber, string jPath)
{
    try
    {
        string CommandStr = @"L:\ZipFiles\winzip32.exe";
        string parameters = "-min -a -r \"" + jNumber + "\" \"" + jPath + "\"";
        ProcessStartInfo starter = new ProcessStartInfo(CommandStr, parameters);
        starter.CreateNoWindow = true;
        starter.RedirectStandardOutput = false;
        starter.UseShellExecute = false;
        Process process = new Process();
        process.StartInfo = starter;
        Console.WriteLine("Creating .zip file");
        process.Start();
        process.WaitForExit();
        Process[] processes;
        string procName = "winzip32.exe";
        processes = Process.GetProcessesByName(procName);
        foreach (Process proc in processes)
        {
            Console.WriteLine("Closing WinZip Process...");
            proc.Kill();
        }
    }
    catch
    {
        WriteToLog(jNumber, "There was an error zipping the files of this job");
        return false;
    }
    return true;
}

I have noticed this behavior using Windows Explorer while deleting large folders with a lot of files and sub-folders. After waiting a bit and then deleting again, it appears to work fine.
Because of that, I have always assumed it was flaky behavior of the operating system.
Although this is not a solution, you could try making your application sleep for a short amount of time before attempting to delete those files, and see if the error still occurs.
If the errors go away, it would appear to be a timing issue. I would still want to know the source of the issue myself, though.
Commenters are pointing to the antivirus program. That would make sense. If that is true, then you need to write some code that checks whether the file is locked before trying to delete it. If it is locked, sleep for a bit, then check again, until it is no longer locked and you can go ahead and delete it.
You just need to be careful not to get stuck in an endless wait loop.
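For illustration, a minimal sketch of that check-then-retry idea (the names are mine, and the probe is inherently racy, since the file can be locked again between the check and the delete):
using System;
using System.IO;
using System.Threading;

static class SafeDelete
{
    // Probe the file by trying to open it exclusively.
    static bool IsLocked(string path)
    {
        try
        {
            using (File.Open(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
                return false;
        }
        catch (IOException)
        {
            return true; // another process still holds a handle
        }
    }

    // Bounded retry loop, so we never wait forever.
    public static bool TryDelete(string path, int maxAttempts = 10)
    {
        for (int attempt = 0; attempt < maxAttempts; attempt++)
        {
            if (!IsLocked(path))
            {
                File.Delete(path);
                return true;
            }
            Thread.Sleep(500); // give the other process time to let go
        }
        return false; // still locked; let the caller log it and move on
    }
}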
Edit:
There is a related question, How to best wait for a filelock to release; check it out for ideas.
Edit2:
Here is another possible solution: Wait until file is unlocked in .Net

Most likely you are getting a sharing violation: the delete can't get an exclusive file handle. One way this happens is that the AV software gets triggered and doesn't finish scanning before the call to delete. It could also be that the WinZip process isn't fully dead yet.
Several ways to handle it:
1) On failure, sleep for a few seconds and try again.
2) I would probably not use WinZip and instead use ZipStorer (http://zipstorer.codeplex.com/). It will zip the file in-process, so you won't have to do the kill step, and you will have much more granular control. You could also do the zips in parallel by spinning up multiple threads.
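If ZipStorer isn't an option, the framework's own ZipFile class does the same job in-process. A minimal sketch, assuming .NET 4.5+ with a reference to System.IO.Compression.FileSystem:
using System.IO;
using System.IO.Compression;

static bool ZipJobFolder(string jNumber, string jPath)
{
    try
    {
        string zipPath = jNumber + ".zip";
        if (File.Exists(zipPath))
            File.Delete(zipPath);
        // Recursive, and runs in-process: no external WinZip process to kill.
        ZipFile.CreateFromDirectory(jPath, zipPath);
        return true;
    }
    catch (IOException)
    {
        return false;
    }
}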

One thing I found that helps is to not try to create temp files and directories in a single File.Move or File.Copy call. Rather, manually create the parent directories, starting at the highest level and working downwards. Finally, when all parent directories exist, Move or Copy the file.
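A rough sketch of that pattern (the helper name is mine; Directory.CreateDirectory builds every missing level in one call and is a no-op if the path already exists):
using System.IO;

static void MoveWithParents(string sourceFile, string destFile)
{
    // Make sure the whole destination tree exists before touching the file.
    Directory.CreateDirectory(Path.GetDirectoryName(destFile));
    File.Move(sourceFile, destFile);
}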

Antivirus software could be an issue: if the antivirus is currently reading your file, it will cause exactly this. To be honest, I've seen this pop up many a time when using the .NET Framework, and I just toss the file operation in a loop and keep attempting it until it no longer throws the exception. This also happens if a file is currently being copied, or is still being registered in the kernel's buffers when some kind of watcher is implemented.
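One hedged way to express that retry loop, with a cap so a permanently locked file can't hang the program (the helper name is mine):
using System;
using System.IO;
using System.Threading;

static void RetryIo(Action operation, int maxAttempts = 20)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            operation();
            return;
        }
        catch (IOException)
        {
            if (attempt >= maxAttempts)
                throw; // give up and surface the real error
            Thread.Sleep(250); // wait out the AV scan or pending copy
        }
    }
}

// Usage: RetryIo(() => File.Delete(path));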

Related

System.IO.File.Delete throws "The process cannot access the file because it is being used by another process"

Every time I save a file and delete it right away using the function below, I keep getting this error message: "System.IO.IOException: The process cannot access the file because it is being used by another process".
Waiting for a couple of minutes or closing Visual Studio seems to unlock only the files that were uploaded previously.
public static bool DeleteFiles(List<String> paths)
{
    // Returns true on success
    try
    {
        foreach (var path in paths)
        {
            if (File.Exists(HostingEnvironment.MapPath("~") + path))
                File.Delete(HostingEnvironment.MapPath("~") + path);
        }
    }
    catch (Exception ex)
    {
        return false;
    }
    return true;
}
I think that the way I'm saving the files may cause them to be locked. This is the code for saving the file:
if (FileUploadCtrl.HasFile)
{
    filePath = Server.MapPath("~") + "/Files/" + FileUploadCtrl.FileName;
    FileUploadCtrl.SaveAs(filePath);
}
When looking for an answer I've seen someone say that you need to close the StreamReader, but from what I understand the SaveAs method closes and disposes automatically, so I really have no idea what's causing this.
After some testing, I found the problem. It turns out I had forgotten about a function of mine that was called every time I saved a media file. The function returned the duration of the file and used the NAudio.Wave.WaveFileReader and NAudio.Wave.Mp3FileReader classes, which I forgot to close after calling them.
I fixed these issues by putting those readers inside using statements.
Here is the working function:
public static int GetMediaFileDuration(string filePath)
{
    filePath = HostingEnvironment.MapPath("~") + filePath;
    if (Path.GetExtension(filePath) == ".wav")
        using (WaveFileReader reader = new WaveFileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);
    else if (Path.GetExtension(filePath) == ".mp3")
        using (Mp3FileReader reader = new Mp3FileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);
    return 0;
}
The moral of the story is to check whether you are opening the file anywhere else in your project.
I think the problem here is not the StreamReader.
When you run the program, it runs in a specific folder, and that folder is essentially locked by your program. When you close the program, it is unlocked.
To fix the issue, I would suggest writing/deleting/updating in a different folder.
Another solution could be to check the file's read-only attribute and change it, as explained here.
A last solution could be to use different users. What I mean is that if you create a file with a different, non-admin user, you can delete it with an admin user. However, I would definitely not go with this solution, because managing different users is too tricky if you are not an advanced Windows user.
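For the read-only suggestion above, a minimal sketch that clears the attribute before deleting (the path is a placeholder):
using System.IO;

static void DeleteEvenIfReadOnly(string path)
{
    // Clear the read-only bit so the delete doesn't throw UnauthorizedAccessException.
    File.SetAttributes(path, File.GetAttributes(path) & ~FileAttributes.ReadOnly);
    File.Delete(path);
}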

jpegoptim on ASP.Net - "error opening temporary file"

I suspect I'm failing to understand where jpegoptim tries to write its temp files.
I have IIS 7.5 running an ASP.Net 4 AppDomain. In it I have a process that optimizes JPEGs with jpegoptim like so:
FileHelper.Copy(existingPath, optimizerPath);
var jpegOptimResult = await ImageHelper.JpegOptim(optimizerPath, 30);
Running locally I get an optimized image. Running on the above server I get:
D:\www\hplusf.com\b\pc\test.jpg 4096x2990 24bit N Adobe [OK] jpegoptim: error opening temporary file.
I can show the code for FileHelper.Copy(), but it's basically just File.Copy() that overwrites if the file already exists.
Here's ImageHelper.JpegOptim:
public static async Task<string> JpegOptim(string path, int quality)
{
    string jpegOptimPath = Path.GetDirectoryName(new Uri(Assembly
        .GetExecutingAssembly().CodeBase).LocalPath)
        + @"\Lib\jpegoptim.exe";
    var jpegOptimResult = await ProcessRunner.O.RunProcess(
        jpegOptimPath,
        "-m" + quality + " -o -p --strip-all --all-normal \"" + path + "\"",
        false, true
    );
    return jpegOptimResult;
}
jpegOptimResult is what you're seeing there as the error message it's producing. And here's ProcessRunner.RunProcess:
public async Task<string> RunProcess(string command, string args,
    bool window, bool captureOutput)
{
    var processInfo = new ProcessStartInfo(command, args);
    if (!window)
        makeWindowless(processInfo);
    string output = null;
    if (captureOutput)
        output = await runAndCapture(processInfo);
    else
        runDontCapture(processInfo);
    return output;
}
protected void makeWindowless(ProcessStartInfo processInfo)
{
    processInfo.CreateNoWindow = true;
    processInfo.WindowStyle = ProcessWindowStyle.Hidden;
}
protected async Task<string> runAndCapture(ProcessStartInfo processInfo)
{
    processInfo.UseShellExecute = false;
    processInfo.RedirectStandardOutput = true;
    processInfo.RedirectStandardError = true;
    var process = Process.Start(processInfo);
    var output = process.StandardOutput;
    var error = process.StandardError;
    while (!process.HasExited)
    {
        await Task.Delay(100);
    }
    string s = output.ReadToEnd();
    s += '\n' + error.ReadToEnd();
    return s;
}
So:
jpegOptim runs properly on my local machine, and optimizes the file, so it's not how I'm calling jpegOptim.
The Copy operation succeeds without Exception, so it's not a Permissions issue with the ASP.Net user reading/writing from that directory
jpegOptim just optimizes and overwrites the file, so if it is in fact running under the same ASP.Net user, it should have no problem writing this file, but...
It's unclear where jpegOptim attempts to write its temp file, so perhaps the underlying issue is where this temporary file is being written.
However, judging by the Windows source:
http://sourceforge.net/p/jpegoptim/code/HEAD/tree/jpegoptim-1.3.0/trunk/jpegoptim.c
jpegOptim's "temporary file" appears to just be the destination file when used with the above options. Relevant lines of jpegOptim source:
int dest = 0;
int main(int argc, char **argv)
{
...
There's some code here looking for the -d argument that sets dest=1 - meaning here dest remains 0. It then hits an if branch, and the else clause, for dest == 0, does this:
if (!splitdir(argv[i],tmpdir,sizeof(tmpdir)))
  fatal("splitdir() failed!");
strncpy(newname,argv[i],sizeof(newname));
That's copying the directory name portion of the input image filename to the variable tmpdir - so like C:\Blah\18.jpg would assign tmpdir="C:\Blah\". Then it dumps the entire input image filename to newname, meaning it's just going to overwrite it in place.
At this point in the code the variables it's using should be:
dest=0
argv[i]=D:\www\hplusf.com\b\pc\test.jpg
tmpdir=D:\www\hplusf.com\b\pc\
newname=D:\www\hplusf.com\b\pc\test.jpg
It then in fact opens the file, and there's an opportunity to error out there, suggesting jpegoptim is successfully opening the file. It also decompresses the file further confirming it's successfully opening it.
The specific error message I'm seeing occurs in these lines - I'll confess I don't know if MKSTEMPS is set or not for a default build (which I'm using):
snprintf(tmpfilename,sizeof(tmpfilename),
         "%sjpegoptim-%d-%d.XXXXXX.tmp", tmpdir, (int)getuid(), (int)getpid());
#ifdef HAVE_MKSTEMPS
if ((tmpfd = mkstemps(tmpfilename,4)) < 0)
  fatal("error creating temp file: mkstemps() failed");
if ((outfile=fdopen(tmpfd,"wb"))==NULL)
#else
tmpfd=0;
if ((outfile=fopen(tmpfilename,"wb"))==NULL)
#endif
  fatal("error opening temporary file");
So snprintf is like C# String.Format(), which should produce a path like:
D:\www\hplusf.com\b\pc\jpegoptim-1-2.XXXXXX.tmp
Judging by what I can find, it's likely MKSTEMPS is not defined, meaning fopen is being called with "wb" (writing a binary file) and returning NULL, meaning it failed to open, and out comes the error message.
So - possible causes:
Bad path in tmpdir. It's possible I'm following the C poorly (likely), but from the looks of it, tmpdir should be identical to the source path of the image. Perhaps it's mangled by jpegoptim? The input path is clearly clean, because jpegoptim actually emits it cleanly in the error message.
Permissions issue. Seems fairly unlikely. The ASP.Net user this runs under can clearly read and write, because it copies to the dir before jpegoptim fires, and the only user on the machine with any permissions to this dir is that user, so jpegoptim should have failed before this point if it were permissions. It could be attempting to access a different dir, but that would really be the bad-tmpdir scenario.
Something else I've not thought of.
Ideas?
Note: This question is similar:
Using jpegtran, jpegoptim, or other jpeg optimization/compression in C#
However, that question is asking about a shared env on GoDaddy, causing answers to spiral around the likelihood he can't spin up processes. We have full control over our server, and as should be clear from the above, the jpegoptim Process is definitely starting successfully, so it's a different scenario.
As it turns out, my reading of jpegoptim was incorrect. The tmpdir it uses is wherever the executable's Working Directory points, not where the input images are, and not where the executable sits. So the solution was two-fold:
Give the exe permissions to write to its own directory* (but deny it access to modify itself)
Modify ProcessRunner to run processes in-place - set the Working Directory to where the exe resides.
The second modification looks like this:
var processInfo = new ProcessStartInfo(command, args);
// Ensure the exe runs in the path where it sits, rather than somewhere
// less safe like the website root
processInfo.WorkingDirectory = (new FileInfo(command)).DirectoryName;
*Note: I happen to have jpegoptim.exe isolated on the server to its own dir to limit risk. If you had it someplace more global like Program Files, you definitely should not do this - instead set the Working Directory as above, but to someplace isolated/safe like a tmp dir or even better a scratch disk. If you've got the RAM for it a RAMdrive would be fastest.
**Second Note: Because of how hard drives and jpegoptim work, if the tmp location is not on the same disk as the ultimate destination of the output, a partial race condition is introduced between jpegoptim and any other code that depends on its outputs. In particular, if you use the same disk, the output JPEG is complete the moment jpegoptim is done: the OS changes the entry in its file table, but the data for the image has already been written to completion. When tmp and destination are separate disks, jpegoptim finishes by telling the OS to move the file from the tmpdir to the output dir; that data move finishes some time after jpegoptim is done running. If your waiting code is fast enough, it will start its work with an incomplete JPEG.

Dispose slow on very big filestreams?

I've got this code
string archiveFileName = BuildArchiveFileName(i, null);
string tmpArchiveFileName = BuildArchiveFileName(i, "tmp");
try
{
    using (FileStream tmpArchiveMemoryStream = new FileStream(tmpArchiveFileName, FileMode.Create))
    {
        using (BinaryWriter pakWriter = new BinaryWriter(tmpArchiveMemoryStream))
        {
            if (i == 0)
            {
                WriteHeader(pakWriter, pakInfo.Header);
                WriteFileInfo(pakWriter, pakInfo.FileList);
                uint remainingBytesToDataOffset = pakInfo.Header.DataSectionOffset - CalculateHeaderBlockSize(pakInfo.Header);
                pakWriter.Write(Util.CreatePaddingByteArray((int)remainingBytesToDataOffset));
            }
            foreach (String file in pakInfo.FileList.Keys)
            {
                DosPak.Model.FileInfo info = pakInfo.FileList[file];
                if (info.IndexArchiveFile == i)
                {
                    //Console.WriteLine("Writing " + file);
                    byte[] fileData = GetFileAsStream(file, false);
                    int paddingSize = (int)CalculateFullByteBlockSize((uint)fileData.Length) - fileData.Length;
                    pakWriter.Write(fileData);
                    pakWriter.Write(Util.CreatePaddingByteArray(paddingSize));
                }
            }
        }
    }
}
finally
{
    File.Delete(archiveFileName);
    File.Move(tmpArchiveFileName, archiveFileName);
}
I've tested this with NUnit on small file sizes and it works perfectly. Then when I tried it on a real-life example, with files over 1 GB, I got in trouble on the delete: it states the file is still in use by another process, while it shouldn't be, since the file should have been disposed of after exiting the using block. So I'm wondering if the Dispose of the FileStream is slow to execute and that is the reason I'm getting in trouble. A small note: in all my file handling I use a FileStream with the using keyword.
While it shouldn't be, since the file should have been disposed of after exiting the using block
That's not what it is complaining about: you can't delete archiveFileName. Some other process has the file opened, just as the exception message says. If you have no idea what process that might be, then start killing them off one by one with Task Manager's Processes tab. That being your own process is not entirely unusual, btw. The best way is with SysInternals' Handle utility; it can show you the process name.
Deleting files is in general a perilous adventure on a multi-tasking operating system; there are always non-zero odds that some other process is interested in the file as well. They ought to open the file with FileShare.Delete, but that's often overlooked.
The safest way to do this is with File.Replace(). The third argument, the backup filename, is crucial: it allows the file to be renamed and continue to exist, so that the other process can continue to use it. You should try to delete that backup file at the start of your code. If that doesn't succeed, then File.Replace() cannot work either. But do check that it isn't a bug in your program first; run the Handle utility.
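Using the variable names from the question, a File.Replace() version of the cleanup might look like this (the backup name is my own, and File.Replace() requires the destination file to exist):
string backupFileName = archiveFileName + ".bak";
// Try to remove the previous run's backup first; if this delete fails,
// File.Replace() could not have succeeded either.
if (File.Exists(backupFileName))
    File.Delete(backupFileName);
File.Replace(tmpArchiveFileName, archiveFileName, backupFileName);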

Directory.Move(): Access to Path is Denied

I'm writing this Windows Form Application in Visual Studio 2010 using C#.
There is an Execute button on the form. When the user hits the button, the program generates some files, which are stored in the Output folder (created by the program using Directory.CreateDirectory()).
I want to create an Archive folder to save the output files from previous runs.
In the beginning of each run, I try to move the existing Output folder to the Archive folder, then create a new Output folder. Below is the function I ran to move directory.
static void moveToArchive()
{
    if (!Directory.Exists("Archive")) Directory.CreateDirectory("Archive");
    string timestamp = DateTime.Now.ToString("yyyyMMddHHmms");
    try
    {
        Directory.Move("Output", "Archive\\" + timestamp);
    }
    catch (Exception e)
    {
        Console.WriteLine("Can not move folder: " + e.Message);
    }
}
The problem I ran into confuses me a lot...
There are some times that I can successfully move the Output folder to archive, but sometimes it fails.
The error message I got from catching the exception is Access to path 'Output' is denied.
I have checked that all the files in the Output folder are not in use. I don't understand how access is denied sometimes but not all the time.
Can someone explain to me and show me how to resolve the problem?
--Edit--
After Hans Passant's comment, I modified the function a little to get the current directory and use the full path. However, I'm still having the same issue.
The function now looks like this:
static void moveToArchive()
{
    string currentDir = Environment.CurrentDirectory;
    Console.WriteLine("Current Directory = " + currentDir);
    if (!Directory.Exists(currentDir + "\\Archive")) Directory.CreateDirectory(currentDir + "\\Archive");
    string timestamp = DateTime.Now.ToString("yyyyMMddHHmms");
    try
    {
        Directory.Move(currentDir + "\\Output", currentDir + "\\Archive\\" + timestamp);
    }
    catch (Exception e)
    {
        Console.WriteLine("Can not move folder: " + e.Message);
    }
}
I printed out the current directory and it is just as what I was expecting, and I'm still having trouble using full path. Access to path 'C:\Users\Me\Desktop\FormApp\Output' is denied.
--Edit--
Thank you everyone for answering and commenting.
I think some of you missed this part, so I'm going to stress it a bit more.
The Directory.Move() sometimes works and sometimes fails.
When the function succeeds, there is no problem: the Output folder is moved to Archive.
When the function fails, the exception message I get is "Access to path denied".
Thank you all for the replies and help. I have figured out what the issue was.
It was because there was a file that was not completely closed.
I was checking the files that were generated, and missed the files the program was reading from.
All the files that were generated were closed completely. It was one file I had opened with a StreamReader but didn't close. I modified the code and am no longer having the problem, so I figure that's where the issue was.
Thanks for all the comments and answers; they definitely helped me think through and figure out the problem.
See http://windowsxp.mvps.org/processlock.htm
Sometimes, you try to move or delete a file or folder and receive access-violation or file-in-use errors. To successfully delete a file, you will need to identify the process which has locked it. You need to exit that process first and then delete the particular file. To find out which process has locked a file, you may use one of the methods discussed in this article.
Using Process Explorer - download from http://download.sysinternals.com/files/ProcessExplorer.zip
Process Explorer shows you information about which handles and DLLs processes have opened or loaded.
Download Process Explorer from Microsoft site and run the program.
Click the Find menu, and choose Find Handle or DLL...
Type the file name (name of the file which is locked by some process.)
After typing the search phrase, click the Search button
You should see the list of applications which are accessing the file.
I bumped into the same problem recently. Using Process Explorer I figured out that the only process using that particular directory was explorer.exe. I had opened a few Explorer windows, one pointing to the parent directory of the one I was about to move.
It appeared that after I visited that sub-folder and then returned (even to root level!), the handle was still being kept by Explorer, so C# was not able to modify the directory in any way (changing flags, attributes, etc.).
I had to kill that Explorer window in order to make C# operate properly.
File.SetAttributes(Application.dataPath + "/script", FileAttributes.Normal);
Directory.Move(Application.dataPath + "/script", Application.dataPath + "/../script");
This fixed my problem.
Try this:
If this does not solve it, check or change the antivirus; otherwise some other program may be locking a file in the folder.
static object moveLocker = new object();
static void moveToArchive()
{
    lock (moveLocker)
    {
        System.Threading.Thread.Sleep(2000); // Give some time to ensure all files are closed.
        //Environment.CurrentDirectory = System.AppDomain.CurrentDomain.BaseDirectory;
        string applicationPath = System.AppDomain.CurrentDomain.BaseDirectory;
        string archiveBaseDirectoryPath = System.IO.Path.Combine(applicationPath, "Archive");
        if (!Directory.Exists(archiveBaseDirectoryPath)) Directory.CreateDirectory(archiveBaseDirectoryPath);
        String timestamp = DateTime.Now.ToString("yyyyMMddHHmms");
        String outputDirectory = System.IO.Path.Combine(Environment.CurrentDirectory, "Output");
        String destinationTS = System.IO.Path.Combine(archiveBaseDirectoryPath, timestamp);
        try
        {
            Directory.Move(outputDirectory, destinationTS);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Can not move folder " + outputDirectory + " to: " + destinationTS + "\n" + ex.Message);
        }
    }
}
I had the same problem; it failed sometimes, but not all the time. I thought I'd wrap it in a Try Catch block and present the user with an Access Denied message, and once I wrapped it in the Try Catch block it stopped failing. I can't explain why.
If existingFile.FileName <> newFileName Then
    Dim dir As New IO.DirectoryInfo(existingFile.FilePath)
    Dim path As String = System.IO.Path.GetDirectoryName(dir.FullName)
    newFileName = path & "\" & newFileName
    File.SetAttributes(existingFile.FilePath, FileAttributes.Normal)
    Try
        IO.File.Move(existingFile.FilePath, newFileName)
    Catch ex As Exception
    End Try
End If
I had a similar problem: I renamed many directories in a loop following a certain template, and from time to time the program crashed on different directories. It helped to add a Thread.Sleep before Directory.Move to create some delay, but it slows down the copying process.
foreach (var currentFullDirPath in Directory.GetDirectories(startTargetFullDirectory, "*", SearchOption.AllDirectories))
{
    var shortCurrentFolderName = new DirectoryInfo(currentFullDirPath).Name.ToLower();
    if (shortCurrentFolderName.Contains(shortSourceDirectoryName))
    {
        // Give the previous move time to settle before the next one.
        Thread.Sleep(1000);
        var newFullDirName = ...;
        Directory.Move(currentFullDirPath, newFullDirName);
    }
}

Deleting a temp file that is open c#

I have written some PDF files to a temp directory, and these get displayed as thumbnails that the user can view. When I close my form, I clean up all of the files in the temp directory.
If, however, the user has one of the thumbnails open and then closes my application, it deletes the files and then throws an exception, because the PDF is open in another process and can't be cleaned up.
I guess this is a shocking programming decision by me, but I am still a novice! How should I account for this in my code?
Thanks
You can detect if the file is in use by using code similar to below, then use that to warn the user that a file can't be deleted.
Unfortunately you can't delete a file that is in use.
public static bool IsFileInUse(string pathToFile)
{
    if (!System.IO.File.Exists(pathToFile))
    {
        // File doesn't exist, so we know it's not in use.
        return false;
    }
    bool inUse = false;
    System.IO.FileStream fs;
    try
    {
        fs = System.IO.File.Open(pathToFile, System.IO.FileMode.OpenOrCreate, System.IO.FileAccess.Read, System.IO.FileShare.None);
        fs.Close();
    }
    catch (System.IO.IOException ex)
    {
        string exMess = ex.Message;
        inUse = true;
    }
    return inUse;
}
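A hypothetical caller might poll it before deleting, keeping in mind that the check is only advisory (the file can be reopened between the check and the delete):
for (int i = 0; i < 5 && IsFileInUse(path); i++)
    System.Threading.Thread.Sleep(200); // give the viewer a moment to release it
if (!IsFileInUse(path))
    System.IO.File.Delete(path);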
You should catch that exception (in the catch block you can inform the user to close that file, or it will not be deleted). If the temp directory is yours, you can try to delete it when the application starts (or when it ends, again); if it's the Windows temp directory, then it does not matter that much.
Tools like File Unlocker can release a file. However, I think this could make programs depending on the file crash...
Maybe you can look up how they unlock files, or manage to execute the unlocker via Process.Start to unlock your file and delete it.
However, if it's you blocking the file, you should try to fix this in your program. Maybe you should dispose of all loaded files (file streams etc.) before trying to clean up.
