What process is preventing my file from getting deleted in C#

I have a single-threaded C# application that creates a file, uses it, and then deletes it. Sometimes the app has trouble deleting that file. The error I get is:
"The process cannot access the file --file path and file name-- because it is being used by another process."
How can I find out what process has a hold on this file, and how can I make that process let go so that the file can be deleted?

This thing rocks for that very gotcha:
Process Monitor v3.05
http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx
It has a "Filter" submenu so you can fine-tune the capture to the file that is locked.

You need to post the relevant code so we can see.
It is, however, always important to make sure that your app closes the files it has opened.
Usually something like this will ensure that:
using (var f = File.OpenRead("myfile")) {
    ...
}
or the equivalent (note that the variable has to be declared outside the try block so the finally block can see it):
FileStream f = null;
try {
    f = File.OpenRead("myfile");
    // ...
} finally {
    if (f != null)
        f.Close();
}

Make sure that you are closing the file before deleting it.
If you are using the StreamWriter class, make sure that you close it through its variable, e.g.:
StreamWriter sw = new StreamWriter(path); // StreamWriter has no parameterless constructor
// some writing operation
sw.Close();
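Better still, let a using block do the closing; Dispose runs even if an exception is thrown partway through. A minimal sketch, with a hypothetical path variable:
using (var sw = new StreamWriter(path))
{
    // some writing operation
    sw.WriteLine("data");
}
// Dispose() flushes and closes the file here, even on exceptions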

Related

System.IO.File.Delete throws "The process cannot access the file because it is being used by another process"

Every time I save a file and delete it right away using the function below, I keep getting this error message: "System.IO.IOException: The process cannot access the file because it is being used by another process".
Waiting a couple of minutes or closing Visual Studio seems to unlock only the files that were uploaded previously.
public static bool DeleteFiles(List<string> paths)
{
    // Returns true on success
    try
    {
        // Resolve the application root once instead of per file
        string root = HostingEnvironment.MapPath("~");
        foreach (var path in paths)
        {
            string fullPath = root + path;
            if (File.Exists(fullPath))
                File.Delete(fullPath);
        }
    }
    catch (Exception)
    {
        return false;
    }
    return true;
}
I think that the way I'm saving the files may cause them to be locked. This is the code for saving the file:
if (FileUploadCtrl.HasFile)
{
    filePath = Server.MapPath("~") + "/Files/" + FileUploadCtrl.FileName;
    FileUploadCtrl.SaveAs(filePath);
}
When looking for an answer I saw someone say that you need to close the StreamReader, but from what I understand the SaveAs method closes and disposes automatically, so I really have no idea what's causing this.
After some testing, I found the problem. It turns out I had forgotten about a function of mine that was called every time I saved a media file. The function returns the duration of the file and uses the NAudio.Wave.WaveFileReader and NAudio.Wave.Mp3FileReader classes, which I forgot to close after calling them.
I fixed the issue by putting those readers inside using statements.
Here is the working function:
public static int GetMediaFileDuration(string filePath)
{
    filePath = HostingEnvironment.MapPath("~") + filePath;
    if (Path.GetExtension(filePath) == ".wav")
    {
        using (WaveFileReader reader = new WaveFileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);
    }
    else if (Path.GetExtension(filePath) == ".mp3")
    {
        using (Mp3FileReader reader = new Mp3FileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);
    }
    return 0; // unknown extension
}
The moral of the story: check whether you are opening the file anywhere else in your project.
I think the problem here is not about a StreamReader.
When you run the program, it runs from a specific folder, and that folder is essentially locked by your program. When you close the program, the folder is unlocked.
To work around the issue, I would suggest writing/deleting/updating in a different folder.
Another solution could be to check the file's ReadOnly attribute and change it, as explained here.
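For that attribute approach, a minimal sketch, assuming path points at the file in question:
// Clear the ReadOnly flag if it is set; File.Delete throws
// UnauthorizedAccessException on read-only files.
FileAttributes attributes = File.GetAttributes(path);
if (attributes.HasFlag(FileAttributes.ReadOnly))
    File.SetAttributes(path, attributes & ~FileAttributes.ReadOnly);
File.Delete(path);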
A last solution could be using different users. What I mean is that if you create a file as a non-admin user, you can delete it as an Admin user. However, I would definitely not go with this solution, because managing different users is too tricky if you are not an advanced Windows user.

Lines from a file deleting on start C#

The problem I'm having is that every time I run my code, lines from "Titles.txt" are getting deleted, and I don't know why. Basically, I run the program, write to the file with a textbox, then close the program and check whether it wrote to the file, and it did. Then I run it again, check the file again, and it is empty. What can I do?
public Form1()
{
    InitializeComponent();
    if (!File.Exists(mainFolder))
    {
        Directory.CreateDirectory(mainFolder);
        Directory.CreateDirectory(tabTitlesFolder);
        var file = File.Create(tabTitles);
        file.Close();
    }
}
You need to check for the file, not the folder.
public Form1()
{
    InitializeComponent();
    if (!File.Exists(tabTitles)) // check if the file exists (you had a check on mainFolder)
    {
        Directory.CreateDirectory(mainFolder);
        Directory.CreateDirectory(tabTitlesFolder);
        var file = File.Create(tabTitles); // this is what you are creating, so it is also what you should check for above
        file.Close();
    }
}
Also, File.Create will overwrite the file if it already exists; see the documentation.
Finally, types that implement IDisposable should be wrapped in a using block or a try/finally block to ensure they are released even if an exception is thrown. File.Create returns a FileStream, which is disposable, so it should be wrapped:
using (File.Create(tabTitles)) { }
As you are not using the result, you do not need to assign it to anything, but you could if you wanted to write to the file:
using (var file = File.Create(tabTitles))
{
    // do something with file
}
File.Exists returns false for directories, hence you recreate the file on each run.
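A quick illustration of that distinction, assuming a hypothetical directory C:\SomeFolder exists:
Console.WriteLine(File.Exists(@"C:\SomeFolder"));      // False: it is a directory, not a file
Console.WriteLine(Directory.Exists(@"C:\SomeFolder")); // True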

Dispose slow on very big filestreams?

I've got this code
string archiveFileName = BuildArchiveFileName(i, null);
string tmpArchiveFileName = BuildArchiveFileName(i, "tmp");
try
{
    using (FileStream tmpArchiveMemoryStream = new FileStream(tmpArchiveFileName, FileMode.Create))
    {
        using (BinaryWriter pakWriter = new BinaryWriter(tmpArchiveMemoryStream))
        {
            if (i == 0)
            {
                WriteHeader(pakWriter, pakInfo.Header);
                WriteFileInfo(pakWriter, pakInfo.FileList);
                uint remainingBytesToDataOffset = pakInfo.Header.DataSectionOffset - CalculateHeaderBlockSize(pakInfo.Header);
                pakWriter.Write(Util.CreatePaddingByteArray((int)remainingBytesToDataOffset));
            }
            foreach (String file in pakInfo.FileList.Keys)
            {
                DosPak.Model.FileInfo info = pakInfo.FileList[file];
                if (info.IndexArchiveFile == i)
                {
                    //Console.WriteLine("Writing " + file);
                    byte[] fileData = GetFileAsStream(file, false);
                    int paddingSize = (int)CalculateFullByteBlockSize((uint)fileData.Length) - fileData.Length;
                    pakWriter.Write(fileData);
                    pakWriter.Write(Util.CreatePaddingByteArray(paddingSize));
                }
            }
        }
    }
}
finally
{
    File.Delete(archiveFileName);
    File.Move(tmpArchiveFileName, archiveFileName);
}
I've tested this with NUnit on small file sizes and it works perfectly. Then I tried it on a real-life example, with files over 1 GB, and I get in trouble on the delete. It states the file is still in use by another process. While it shouldn't, the file should have been disposed of after exiting the using block. So I'm wondering if the Dispose of the FileStream is slow to execute and that is the reason I'm getting in trouble. A small note: in all my file handling I use a FileStream with the using keyword.
While it shouldn't, the file should have been disposed of after exiting the using block
That's not what it is complaining about: you can't delete archiveFileName. Some other process has that file open, just as the exception message says. If you have no idea what process that might be, then start killing them off one by one with Task Manager's Processes tab. That being your own process is not entirely unusual, btw. The best way is with SysInternals' Handle utility; it can show you the process name.
Deleting files is in general a perilous adventure on a multi-tasking operating system; there are always non-zero odds that some other process is interested in the file as well. They ought to open the file with FileShare.Delete, but that's often overlooked.
The safest way to do this is with File.Replace(). The third argument, the backup filename, is crucial: it allows the file to be renamed and yet continue to exist, so that the other process can continue to use it. You should try to delete that backup file at the start of your code. If that doesn't succeed, then File.Replace() cannot work either. But do check that it isn't a bug in your program first; run the Handle utility.
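A sketch of that approach, reusing the variable names from the question; the .bak name is made up for illustration:
string backupFileName = archiveFileName + ".bak";

// Try to get rid of a leftover backup first. If this fails,
// File.Replace() would fail too, so it's better to find out now.
if (File.Exists(backupFileName))
    File.Delete(backupFileName);

// Atomically swap the new archive into place. A process that still
// has the old archive open keeps a valid handle to it, now under
// the backup name, so it isn't disturbed.
File.Replace(tmpArchiveFileName, archiveFileName, backupFileName);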

Streamwriter crashes when writing to existing file

Here is the code I have that writes to a couple of files:
StreamWriter writer = new StreamWriter(@"C:\TotalStock\data\points\" + stockName.ToUpper() + ".txt");
for (int i = 0; i < lines; i++)
{
    writer.WriteLine(lineData[i]);
    postGui.Send((object state) =>
    {
        progressBar2.PerformStep();
    }, null);
}
writer.Close();
When I delete the text files and run the code there is no issue, but then when I close the application and run it once more the program gives me the following error. What is it that causes this error and what can I do to stop it?
Error:
An unhandled exception of type 'System.IO.IOException' occurred in mscorlib.dll
Additional information: The process cannot access the file 'C:\TotalStock\data\points\IBM.txt' because it is being used by another process
As marc_s pointed out, the error occurs because the file you are trying to edit is open in another application. If you are certain that you don't have the file open in any other editor/viewer, the problem may be the program itself, for example when several instances of the same code run concurrently and require access to the same file, as in a multithreaded environment.
Does another component in your application read from the text file while you are trying to write to it?
Does your application hang, so that a second instance requires the same file?
Do you run the same tests each time?
Probably the problem is that you open the file but do not close it when you exit the program. I see you are using a StreamWriter to write the data to the file.
Maybe you get an exception and therefore never close the file. When playing with files you should always do something like:
StreamWriter writer = null;
try
{
    writer = new StreamWriter(path);
    // Your code with files
}
finally
{
    if (writer != null)
        writer.Close();
}
Other reasons might be that you are using some other File/Stream/etc. Please make sure that you close everything that needs to be closed before closing the program.
Please share all your code if you want us to check whether you forgot something else.
As Sayse said, another way to make sure you close your writers is a using statement:
using (StreamWriter writer = new StreamWriter(@"C:\TotalStock\data\points\" + stockName.ToUpper() + ".txt"))
{
    // Your code playing with files
}

Show every file in a directory, but don't show files that are currently being copied

I have a function that checks every file in a directory and writes a list to the console. The problem is, I don't want it to include files that are currently being copied to the directory; I only want it to show the files that are complete. How do I do that? Here is my code:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    Console.WriteLine(file);
}
There's really no way to tell "being copied" vs "locked for writing by something". Relevant: How to check for file lock? and Can I simply 'read' a file that is in use?
If you want to simply display a list of files that are not open for writing, you can do that by attempting to open them:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    try
    {
        // Attempt a shared read; this throws if the writer
        // didn't share read access with us
        using (File.Open(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            Console.WriteLine(file);
        }
    }
    catch (IOException)
    {
        // file is in use
        continue;
    }
}
However -- lots of caveats.
Immediately after displaying the filename (end of the using block) the file could be opened by something else
The process writing the file may have used FileShare.Read which means the call will succeed, despite it being written to.
I'm not sure what exactly you're up to here, but it sounds like two processes sharing a queue directory: one writing, one reading/processing. The biggest challenge is that writing a file takes time, so your "reading" process ends up picking the file up and trying to read it before the whole thing is there, which will fail in some way depending on the sharing mode, how your apps are written, etc.
A common pattern to deal with this situation is to use an atomic file operation like Move:
Do the (slow) write/copy operation to a temporary directory that's on the same file system (very important) as the queue directory
Once complete, do a Move from the temporary directory to the queue directory.
Since move is atomic, the file will either not be there, or it will be 100% there -- there is no opportunity for the "reading" process to ever see the file while it's partially there.
Note that if you do the move across file systems, it will act the same as a copy.
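A minimal sketch of that pattern, with hypothetical tmpDir and queueDir paths on the same volume, and data standing in for whatever you are writing:
string tmpPath = Path.Combine(tmpDir, fileName);     // staging area the reader never scans
string queuePath = Path.Combine(queueDir, fileName); // directory the reader watches
// The slow write happens out of the reader's sight...
File.WriteAllBytes(tmpPath, data);
// ...and the rename is atomic on the same volume, so the reader
// sees either the complete file or no file at all.
File.Move(tmpPath, queuePath);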
There's no "current files being copied" list stored anywhere in Windows/.NET/whatever. Probably the most you could do is attempt to open each file for append and see if you get an exception. Depending on the size and location of your directory, and on the security setup, that may not be an option.
There isn't a clean way to do this, but this... works...
foreach (var file in new DirectoryInfo(@"C:\Folder").GetFiles())
{
    try
    {
        // Dispose the probe stream, otherwise this loop itself keeps the file locked
        using (file.OpenRead()) { }
    }
    catch (IOException)
    {
        continue;
    }
    Console.WriteLine(file.Name);
}
