When a file is created (FileSystemWatcher_Created) in one directory, I copy it to another. But when I create a big (>10 MB) file, the copy fails because it starts before the file has finished being written.
This causes a Cannot copy the file, because it's used by another process exception to be raised. ;(
Any help?
class Program
{
static void Main(string[] args)
{
string path = @"D:\levan\FolderListenerTest\ListenedFolder";
FileSystemWatcher listener;
listener = new FileSystemWatcher(path);
listener.Created += new FileSystemEventHandler(listener_Created);
listener.EnableRaisingEvents = true;
while (Console.ReadLine() != "exit") ;
}
public static void listener_Created(object sender, FileSystemEventArgs e)
{
Console.WriteLine
(
"File Created:\n"
+ "ChangeType: " + e.ChangeType
+ "\nName: " + e.Name
+ "\nFullPath: " + e.FullPath
);
File.Copy(e.FullPath, @"D:\levan\FolderListenerTest\CopiedFilesFolder\" + e.Name);
Console.Read();
}
}
There is only a workaround for the issue you are facing.
Check whether the file is in use before starting the copy. You can call the following function in a loop until it returns False.
1st Method, copied directly from this answer:
private bool IsFileLocked(FileInfo file)
{
FileStream stream = null;
try
{
stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
}
catch (IOException)
{
//the file is unavailable because it is:
//still being written to
//or being processed by another thread
//or does not exist (has already been processed)
return true;
}
finally
{
if (stream != null)
stream.Close();
}
//file is not locked
return false;
}
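For example, you could poll it before copying, along these lines (a sketch only; the WaitAndCopy name, the 1-second interval, and the destination path are placeholders of mine, not part of the original answer):
private static void WaitAndCopy(string sourcePath, string destPath)
{
    var info = new FileInfo(sourcePath);
    // Keep polling until the writer releases the file.
    while (IsFileLocked(info))
    {
        Thread.Sleep(1000);
    }
    File.Copy(sourcePath, destPath);
}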
2nd Method:
const int ERROR_SHARING_VIOLATION = 32;
const int ERROR_LOCK_VIOLATION = 33;
private bool IsFileLocked(string file)
{
//check that problem is not in destination file
if (File.Exists(file) == true)
{
FileStream stream = null;
try
{
stream = File.Open(file, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
}
catch (Exception ex2)
{
//_log.WriteLog(ex2, "Error in checking whether file is locked " + file);
int errorCode = Marshal.GetHRForException(ex2) & ((1 << 16) - 1);
if ((ex2 is IOException) && (errorCode == ERROR_SHARING_VIOLATION || errorCode == ERROR_LOCK_VIOLATION))
{
return true;
}
}
finally
{
if (stream != null)
stream.Close();
}
}
return false;
}
From the documentation for FileSystemWatcher:
The OnCreated event is raised as soon as a file is created. If a file
is being copied or transferred into a watched directory, the
OnCreated event will be raised immediately, followed by one or more
OnChanged events.
So, if the copy fails (catch the exception), add the file to a list of files that still need to be moved, and attempt the copy during the OnChanged event. Eventually, it should work.
Something like this (incomplete; catch specific exceptions, initialize variables, etc.):
public static void listener_Created(object sender, FileSystemEventArgs e)
{
Console.WriteLine
(
"File Created:\n"
+ "ChangeType: " + e.ChangeType
+ "\nName: " + e.Name
+ "\nFullPath: " + e.FullPath
);
try {
File.Copy(e.FullPath, @"D:\levani\FolderListenerTest\CopiedFilesFolder\" + e.Name);
}
catch {
_waitingForClose.Add(e.FullPath);
}
Console.Read();
}
public static void listener_Changed(object sender, FileSystemEventArgs e)
{
if (_waitingForClose.Contains(e.FullPath))
{
try {
File.Copy(...);
_waitingForClose.Remove(e.FullPath);
}
catch {}
}
}
It's an old thread, but I'll add some info for other people.
I experienced a similar issue with a program that writes PDF files; sometimes they take 30 seconds to render, which is the same period my watcher_FileCreated class waited before copying the file.
The files were not locked.
In this case I checked the size of the PDF, waited 2 seconds, and then compared the new size; if they were unequal, the thread would sleep for 30 seconds and try again.
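Roughly like this (a sketch of the size-comparison idea just described; the method name and the fixed delays are assumptions of mine):
private static void WaitForStableSize(string path)
{
    var info = new FileInfo(path);
    while (true)
    {
        long size = info.Length;
        Thread.Sleep(2000);       // wait a moment before re-checking
        info.Refresh();           // pick up the current length from disk
        if (size == info.Length)
            break;                // size unchanged: assume the writer is done
        Thread.Sleep(30000);      // still growing: give the writer more time
    }
}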
You're actually in luck - the program writing the file locks it, so you can't open it. If it hadn't locked it, you would have copied a partial file without having any idea there was a problem.
When you can't access a file, you can assume it's still in use (better yet - try to open it in exclusive mode, and see if someone else is currently opening it, instead of guessing from the failure of File.Copy). If the file is locked, you'll have to copy it at some other time. If it's not locked, you can copy it (there's a slight potential for a race condition here).
When is that 'other time'? I don't remember when FileSystemWatcher sends multiple events per file - check it out; it might be enough for you to simply ignore the event and wait for another one. If not, you can always set up a timer and recheck the file in 5 seconds.
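For the timer option, something like this could work (a sketch only; the RecheckLater name and the 5-second interval are mine, and a real implementation would also handle more than one pending file):
private static System.Threading.Timer _recheckTimer;

private static void RecheckLater(string sourcePath, string destPath)
{
    _recheckTimer = new System.Threading.Timer(_ =>
    {
        try
        {
            File.Copy(sourcePath, destPath, true); // throws IOException while the source is locked
            _recheckTimer.Dispose();               // copy succeeded: stop rechecking
        }
        catch (IOException)
        {
            // still locked; the timer fires again in 5 seconds
        }
    }, null, TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
}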
Well, you already gave the answer yourself: you have to wait for the creation of the file to finish. One way to do this is to check whether the file is still in use. An example of this can be found here: Is there a way to check if a file is in use?
Note that you will have to modify this code for it to work in your situation. You might want something like this (simplified; CheckFileInUse and CopyFile stand in for your own implementations):
public static void listener_Created(object sender, FileSystemEventArgs e)
{
    while (CheckFileInUse(e.FullPath))
        Thread.Sleep(1000);   // wait 1000 milliseconds
    CopyFile(e.FullPath);
}
Obviously you should protect yourself from an infinite while loop in case the owning application never releases the lock. Also, it might be worth checking out the other events from FileSystemWatcher you can subscribe to. There might be an event you can use to circumvent this whole problem.
When the file is being written in binary (byte by byte), creating a FileStream as in the solutions above may not work, because the file looks ready even though it is still being written to. In this situation you need a different workaround, like this:
Do this when the file is created, or whenever you want to start processing it:
long fileSize = 0;
currentFile = new FileInfo(path);
while (fileSize < currentFile.Length)//check size is stable or increased
{
fileSize = currentFile.Length;//get current size
System.Threading.Thread.Sleep(500);//wait a moment for processing copy
currentFile.Refresh();//refresh length value
}
//Now file is ready for any process!
So, having glanced quickly through some of these and other similar questions I went on a merry goose chase this afternoon trying to solve a problem with two separate programs using a file as a synchronization (and also file save) method. A bit of an unusual situation, but it definitely highlighted for me the problems with the 'check if the file is locked, then open it if it's not' approach.
The problem is this: the file can become locked between the time that you check it and the time you actually open it. It's really hard to track down the sporadic Cannot copy the file, because it's used by another process error if you aren't looking for it too.
The basic resolution is to just try to open the file inside a try/catch block so that if it's locked, you can try again. That way there is no elapsed time between the check and the opening; the OS does them at the same time.
The code here uses File.Copy, but it works just as well with any of the static methods of the File class: File.Open, File.ReadAllText, File.WriteAllText, etc.
/// <param name="timeout">how long to keep trying in milliseconds</param>
static void safeCopy(string src, string dst, int timeout)
{
while (timeout > 0)
{
try
{
File.Copy(src, dst);
//don't forget to either return from the function or break out of the while loop
break;
}
catch (IOException)
{
//you could do the sleep in here, but it's probably a good idea to exit the error handler as soon as possible
}
Thread.Sleep(100);
//if it's a very long wait this will accumulate very small errors.
//For most things it's probably fine, but if you need precision over a long time span, consider
// using some sort of timer or DateTime.Now as a better alternative
timeout -= 100;
}
}
Another small note on parallelism:
This is a synchronous method, which will block its thread both while waiting and while working on the file. This is the simplest approach, but if the file remains locked for a long time your program may become unresponsive. Parallelism is too big a topic to go into in depth here (and the number of ways you could set up asynchronous read/write is kind of preposterous), but here is one way it could be parallelized.
public class FileEx
{
public static async Task CopyWaitAsync(string src, string dst, int timeout, Action doWhenDone = null)
{
while (timeout > 0)
{
try
{
File.Copy(src, dst);
doWhenDone?.Invoke();
break;
}
catch (IOException) { }
await Task.Delay(100);
timeout -= 100;
}
}
public static async Task<string> ReadAllTextWaitAsync(string filePath, int timeout)
{
while (timeout > 0)
{
try {
return File.ReadAllText(filePath);
}
catch (IOException) { }
await Task.Delay(100);
timeout -= 100;
}
return "";
}
public static async Task WriteAllTextWaitAsync(string filePath, string contents, int timeout)
{
while (timeout > 0)
{
try
{
File.WriteAllText(filePath, contents);
return;
}
catch (IOException) { }
await Task.Delay(100);
timeout -= 100;
}
}
}
And here is how it could be used:
public static void Main()
{
test_FileEx();
Console.WriteLine("Me First!");
}
public static async void test_FileEx()
{
await Task.Delay(1);
//you can do this, but it gives a compiler warning because it can potentially return immediately without finishing the copy
//As a side note, if the file is not locked this will not return until the copy operation completes. Async functions run synchronously
//until the first 'await'. See the documentation for async: https://msdn.microsoft.com/en-us/library/hh156513.aspx
FileEx.CopyWaitAsync("file1.txt", "file1.bat", 1000);
//this is the normal way of using this kind of async function. Execution of the following lines will always occur AFTER the copy finishes
await FileEx.CopyWaitAsync("file1.txt", "file1.readme", 1000);
Console.WriteLine("file1.txt copied to file1.readme");
//The following line doesn't cause a compiler error, but it doesn't make any sense either.
FileEx.ReadAllTextWaitAsync("file1.readme", 1000);
//To get the return value of the function, you have to use this function with the await keyword
string text = await FileEx.ReadAllTextWaitAsync("file1.readme", 1000);
Console.WriteLine("file1.readme says: " + text);
}
//Output:
//Me First!
//file1.txt copied to file1.readme
//file1.readme says: Text to be duplicated!
You can use the following code to check if the file can be opened with exclusive access (that is, it is not opened by another application). If the file isn't closed, you could wait a few moments and check again until the file is closed and you can safely copy it.
You should still check if File.Copy fails, because another application may open the file between the moment you check the file and the moment you copy it.
public static bool IsFileClosed(string filename)
{
try
{
using (var inputStream = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None))
{
return true;
}
}
catch (IOException)
{
return false;
}
}
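Used from the Created handler, that might look something like this (a sketch; the CopyWhenClosed name, the 500 ms interval, and the destination path are assumptions of mine):
private static void CopyWhenClosed(string sourcePath, string destPath)
{
    // Poll until no other process holds the file open.
    while (!IsFileClosed(sourcePath))
    {
        Thread.Sleep(500);
    }
    try
    {
        File.Copy(sourcePath, destPath, true);
    }
    catch (IOException)
    {
        // Another application grabbed the file between the check and the copy.
    }
}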
I would like to add an answer here, because this worked for me. I used time delays, while loops, everything I could think of.
I had the Windows Explorer window of the output folder open. I closed it, and everything worked like a charm.
I hope this helps someone.
Related
I am developing a Windows program using the .NET Framework.
I want to create a program that executes a function when a file is created in a specific folder, using FileSystemWatcher.
Below is my code.
public async Task<int> CollectFunc() {
string path = @"C:\test";
try
{
FileSystemWatcher watcher = new FileSystemWatcher
{
Path = path,
Filter = "test.log"
};
watcher.Created += new FileSystemEventHandler(WatcherFunc);
watcher.IncludeSubdirectories = true;
watcher.EnableRaisingEvents = true;
}
catch
{
Console.WriteLine("Error");
}
while(true)
{
await Task.Delay(100000);
}
}
public async void WatcherFunc(object source, FileSystemEventArgs e) {
Console.WriteLine("File Created: " + e.FullPath);
}
When I start the program, file creation is monitored until I close the program.
An example is shown below.
On September 1st, the following file is created.
C:\test\20200901\test.log
The program then prints "File Created: C:\test\20200901\test.log".
And on September 2nd
C:\test\20200902\test.log file is created,
The program will then output "File Created: C:\test\20200902\test.log".
...
But sometimes the Watcher doesn't work and I have to restart the program.
Please let me know if there is any better or more stable logic than my source code.
I look forward to your kind reply.
Try these changes:
// Introduce a class field, to prevent the watcher reference from going out of scope.
private FileSystemWatcher watcher = null;
public void CollectFunc() { // no need for async any more ...
string path = @"C:\test";
try
{
// Init class field
watcher = new FileSystemWatcher
{
Path = path,
Filter = "test.log"
};
watcher.Created += new FileSystemEventHandler(WatcherFunc);
watcher.IncludeSubdirectories = true;
watcher.EnableRaisingEvents = true;
}
catch (Exception ex)
{
// Better know what the problem actually was.
Console.WriteLine($"Error: {ex.Message}");
}
// It's a winforms app - we don't need to block this => away with while(true)
}
public async void WatcherFunc(object source, FileSystemEventArgs e)
{
// Just in case, catch and log exceptions
try{
Console.WriteLine("File Created: " + e.FullPath);
} catch( Exception ex ) {
// TODO: Log Exception or handle it.
}
}
On top of that: it is a known issue that a high number and frequency of changes can lead to a buffer overflow in the watcher (if that still applies; I remember running into it some years ago).
The problem with buffer overflow is mentioned here : https://learn.microsoft.com/en-us/dotnet/api/system.io.filesystemwatcher.internalbuffersize?view=netcore-3.1#remarks
It may also be worthwhile to register a handler to the Error event: https://learn.microsoft.com/en-us/dotnet/api/system.io.filesystemwatcher.error?view=netcore-3.1
I guess that your Console.WriteLine in the event handler is just example code and you are actually doing more than that. In the past, I found that it relieves stress on the FileSystemWatcher's buffer if I keep the code here very small and handle the event as quickly as possible.
So, what I did was enqueue the file path in a queue and have that queue handled on a different thread. This ensures that events are handled as quickly as possible while not losing any. Peaks can be caught by the queue getting bigger and dealt with independently by another thread. In other words: things pile up outside the watcher's buffers.
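A rough sketch of that idea, using a BlockingCollection as the queue (the buffer size, the event wiring, and the ProcessFile placeholder are illustrative assumptions of mine, not code from the original answer):
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

class WatcherQueue
{
    private readonly BlockingCollection<string> _queue = new BlockingCollection<string>();
    private readonly FileSystemWatcher _watcher;

    public WatcherQueue(string path)
    {
        _watcher = new FileSystemWatcher(path)
        {
            InternalBufferSize = 64 * 1024   // larger buffer to ride out bursts (64 KB is the documented maximum)
        };
        _watcher.Created += (s, e) => _queue.Add(e.FullPath);   // keep the handler tiny
        _watcher.Error += (s, e) => Console.WriteLine("Watcher error: " + e.GetException());
        _watcher.EnableRaisingEvents = true;

        // Drain the queue on a separate thread so slow processing never blocks the watcher.
        Task.Run(() =>
        {
            foreach (var file in _queue.GetConsumingEnumerable())
            {
                ProcessFile(file);
            }
        });
    }

    private static void ProcessFile(string file)
    {
        // Do the real (possibly slow) work here.
        Console.WriteLine("Processing " + file);
    }
}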
I'm writing a FileSystemWatcher which is to copy images from folder A to folder B whenever an image is uploaded to folder A. I'm trying to use this as a Windows service on the server PC, but I'm having some issues where my files are locked when they are to be copied. I think I've found the root of my issue, but I'm not having any luck solving it. So, when I run my Windows service it always ends unexpectedly at either the first or the second picture upload. The error message I'm getting says this: The process cannot access the file 'filepath' because it is being used by another process.
Relevant parts of my code:
public void WatchForChanges()
{
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = Program.SourceFolder;
watcher.Created += new FileSystemEventHandler(OnImageAdded);
watcher.EnableRaisingEvents = true;
watcher.IncludeSubdirectories = true;
}
public void OnImageAdded(object source, FileSystemEventArgs e)
{
FileInfo file = new FileInfo(e.FullPath);
ImageHandler handler = new ImageHandler();
if (handler.IsImage(file))
{
handler.CopyImage(file);
}
}
And here is my CopyImage method, which includes one of my proposed solutions to this problem: a while loop that catches the error and retries the copying of the image:
public void CopyImage(FileSystemInfo file)
{
// code that sets folder paths
// code that sets folder paths
bool retry = true;
if (!Directory.Exists(targetFolderPath))
{
Directory.CreateDirectory(targetFolderPath);
}
while (retry)
{
try
{
File.Copy(file.FullName, targetPath, true);
retry = false;
}
catch (Exception e)
{
Thread.Sleep(2000);
}
}
}
But this CopyImage solution just keeps copying the same file, which is not ideal in my case. I wish it were enough, but sadly I've got a queue of images waiting.
The image file is probably being created by another application that holds an exclusive lock, preventing external processes from both reading and writing (for more information, read this, especially the paragraph related to Microsoft Windows). You have to either:
stop/kill the process which is using the file;
wait until the file isn't being used anymore.
Since the other process is probably still writing the file at the moment you try to copy it with your application, the first option is not recommended. It could also be an anti-virus program checking the new file, and even in that case the first option would not be recommended.
You could try to integrate the following code into your CopyImage method so that your application will wait until the file will be no longer in use before proceeding:
private Boolean WaitForFile(String filePath)
{
Int32 tries = 0;
while (true)
{
++tries;
Boolean wait = false;
FileStream stream = null;
try
{
stream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.None);
break;
}
catch (Exception ex)
{
Logger.LogWarning("CopyImage({0}) failed to get an exclusive lock: {1}", filePath, ex.ToString());
if (tries > 10)
{
Logger.LogWarning("CopyImage({0}) skipped the file after 10 tries.", filePath);
return false;
}
wait = true;
}
finally
{
if (stream != null)
stream.Close();
}
if (wait)
Thread.Sleep(250);
}
Logger.LogWarning("CopyImage({0}) got an exclusive lock after {1} tries.", filePath, tries);
return true;
}
While it seems straightforward, it's really unsatisfactorily complex.
The problem is, the application that's writing the file isn't done with it when you get the notification...and so you have a concurrency problem. There is no great way to know when the file closes. Well..one way is to subscribe to journal events - which is what FileSystemWatcher does - but this is fairly involved and requires a lot of moving parts. Going this route, you can be notified when the file closes. If you're interested, see https://msdn.microsoft.com/en-us/library/windows/desktop/aa363798(v=vs.85).aspx.
I'd divide the work into two parts. I think I'd start a ThreadPool thread to do the work, and have it read its work from a list that the FileSystemWatcher's event handler writes to. That way, the event handler returns quickly. The ThreadPool thread would go through its list, attempting to get an exclusive lock (similar to Tommaso's code) on each file. If it can't, it just moves on to the next file. Every time it successfully copies, it removes that file from the list.
You need to be concerned about thread safety...so you'd want to make a static object to coordinate writes to the list. Both the event handler and the ThreadPool thread would hold the lock while writing.
Here's a scaffold of the whole approach:
internal sealed class Copier: IDisposable
{
static object sync = new object();
bool quit;
FileSystemWatcher watcher;
List<string> work;
internal Copier( string pathToWatch )
{
work = new List<string>();
watcher = new FileSystemWatcher();
watcher.Path = pathToWatch;
watcher.Created += QueueWork;
watcher.EnableRaisingEvents = true; //--> without this the watcher never raises events
ThreadPool.QueueUserWorkItem( TryCopy );
}
public void Dispose()
{
lock( sync ) quit = true;
}
void QueueWork( object source, FileSystemEventArgs args )
{
lock ( sync )
{
work.Add( args.FullPath );
}
}
void TryCopy( object args )
{
List<string> localWork;
while( true )
{
lock ( sync )
{
if ( quit ) return; //--> we've been disposed
localWork = new List<string>( work );
}
foreach( var fileName in localWork )
{
var locked = true;
try
{
using
( var throwAway = new FileStream
( fileName,
FileMode.Open,
FileAccess.Read,
FileShare.None
)
); //--> no-op - will throw if we can't get exclusive read
locked = false;
}
catch { }
if (!locked )
{
File.Copy( fileName, ... );
lock( sync ) work.Remove( fileName );
}
}
Thread.Sleep( 500 ); //--> don't spin hot while the list is empty or files remain locked
}
}
}
Not tested - wrote it right here in the answer...but it, or something like it will cover the bases.
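For what it's worth, usage might look something like this (a sketch of mine; the watch folder path is a placeholder):
using (var copier = new Copier(@"C:\watched"))
{
    Console.WriteLine("Watching... press Enter to stop.");
    Console.ReadLine();
}   // Dispose sets quit, so the worker thread exits on its next pass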
I'm working on writing my own FTPDownload class and am encountering an issue after having changed some things around that I'm trying to address.
The code I'll paste below effectively starts a download and runs through it for some time, pulling full arrays of data in each iteration of the 'main' loop. What I can say for sure is that small files (for instance, a meeting agenda in HTML) get copied quickly and seemingly completely - however something is still quite off.
As we speak I'm downloading a 1.05 GB file, and my download status control is showing me 325 MB complete - I can attest to the fact that this much data has been collected over the stream. BUT, this is where things get weird... so far only 548 KB has been written to my output file.
The following method is doing the collecting, counting, and writing of the data - and before my recent changes it was performing rather well, with the exception of some performance issues I was intending to resolve by turning this into an extension of another series of classes.
class DownloadThread : StreamReader
{
protected override void Cycle()
{
if (this.IsComplete)
{
this.FireEvent("OnComplete", new Event(this, "DownloadThread has confirmed completion of the targetted download."));
this.Stop();
return;
}
else if ((this._InputSource = this.GetResponseStream(this.RemotePath)) == null)
{
this.FireEvent("OnException",new Event(this,"DownloadThread could not execute. A Response Stream could not be retrieved from the Input Source '" + this.RemotePath + "'."));
this.Stop();
return;
}
else
{
try
{
this._StartTime = DateTime.Now;
Byte[] Bytes = new Byte[4096];
while ((this.BytesLastRead = this.Read(Bytes, 0, 4096)) > 0)
{
this.LocalStream.Write(Bytes, 0, DataMetrics.Length(Bytes));
this.LocalStream.Flush();
this.BytesSinceLast += this.BytesLastRead;
this.TotalRead += this.BytesLastRead;
this.TriggerProgress();
continue;
}
return;
}
catch (Exception e)
{
this.FireEvent("OnException", new Event(this, "An exception was encountered while reading data for the download." + Environment.NewLine + e.Message + Environment.NewLine + e.StackTrace));
return;
}
}
}
protected override Stream GetResponseStream(string RemotePath)
{
this.Request = (FtpWebRequest)FtpWebRequest.Create(this.RemotePath);
if (this.Credentials != null)
this.Request.Credentials = this.Credentials;
this.Response = (FtpWebResponse)this.Request.GetResponse();
Stream ResponseStream = this.Response.GetResponseStream();
return ResponseStream;
}
}
Something of note - before anyone jumps to conclusions - is that this 'cycle' method is automatically run inside of a loop in its own thread; and will recur every 'ThreadDelay' milliseconds ( assuming it returns ).
The 'LocalStream' variable in this class is as follows:
protected Stream _LocalStream = null;
protected Stream LocalStream
{
get
{
if (this.LocalFile == null)
return null;
else if (this._LocalStream == null)
{
if (!this.LocalDirectory.Exists)
this.LocalDirectory.Create();
this._LocalStream = this.LocalFile.OpenWrite();
return this.LocalStream;
}
else
return this._LocalStream;
}
}
I've considered that this could cause some conflicts - and ensuring some thread safety is on my to-do list, but I'm not encountering any issues in this field at the moment - I just assumed someone might want to take a glance.
Per the 'DownloadThread.Read' method - since I'm sure someone will be curious... This is inherited from my own 'StreamReader' class; again inherited from my 'InputReader' class. Affiliated fields included below.
protected Object _InputSource = null;
public Object InputSource
{
get
{
return this._InputSource;
}
}
public Stream InputStream
{
get
{
if (this.InputSource == null)
return null;
else
{
try
{
return (Stream)this.InputSource;
}
catch (Exception)
{
return null;
}
}
}
}
public override double Read(byte[] buffer, int offset, int count)
{
try
{
return this.InputStream.Read(buffer, offset, count);
}
catch (Exception e)
{
this.Stop();
this.FireEvent("OnException", new Events.Event(this, "An exception was encountered while reading from the InputStream." + Environment.NewLine + e.Message + Environment.NewLine + e.StackTrace));
return 0;
}
}
Now - to reiterate - my issue here is that for some reason the data I am receiving is not being pushed out to the file properly. I'm just about to finish the ~1 GB download as I finish typing this, and thus far no exceptions have been thrown and the stream has not 'fallen apart' - but my output file is only 17 MB.
After waiting another 30 seconds to allow the download to finish, my 1.05 GB file has turned into an 18 MB file; and I see no particular reason why the above code shouldn't handle this properly.
Anyone's advice would be great.
I think the problem might be in this line of code:
this.LocalStream.Write(Bytes, 0, DataMetrics.Length(Bytes));
I assume that your DataMetrics.Length() method returns the wrong number of bytes, so I suggest you try:
this.LocalStream.Write(Bytes, 0, this.BytesLastRead);
instead.
Something of note - before anyone jumps to conclusions - is that this
'cycle' method is automatically run inside of a loop in its own
thread; and will recur every 'ThreadDelay' milliseconds ( assuming it
returns ).
My first guess is that the OS is discarding data on the network card since it isn't actively being dequeued, which is why you're getting 17 MB instead of 1 GB.
When a file is being copied to the file watcher folder, how can I identify whether the file is completely copied and ready to use? I am getting multiple events during the file copy. (The file is copied via another program using File.Copy.)
When I ran into this problem, the best solution I came up with was to continually try to get an exclusive lock on the file; while the file is being written, the locking attempt will fail, essentially the method in this answer. Once the file isn't being written to any more, the lock will succeed.
Unfortunately, the only way to do that is to wrap a try/catch around opening the file, which makes me cringe - having to use try/catch is always painful. There just doesn't seem to be any way around that, though, so it's what I ended up using.
Modifying the code in that answer does the trick, so I ended up using something like this:
private void WaitForFile(FileInfo file)
{
FileStream stream = null;
bool FileReady = false;
while(!FileReady)
{
try
{
using(stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None))
{
FileReady = true;
}
}
catch (IOException)
{
//File isn't ready yet, so we need to keep on waiting until it is.
}
//We'll want to wait a bit between polls, if the file isn't ready.
if(!FileReady) Thread.Sleep(1000);
}
}
Here is a method that will retry file access up to X number of times, with a Sleep between tries. If it never gets access, the application moves on:
private static bool GetIdleFile(string path)
{
var fileIdle = false;
const int MaximumAttemptsAllowed = 30;
var attemptsMade = 0;
while (!fileIdle && attemptsMade <= MaximumAttemptsAllowed)
{
try
{
using (File.Open(path, FileMode.Open, FileAccess.ReadWrite))
{
fileIdle = true;
}
}
catch
{
attemptsMade++;
Thread.Sleep(100);
}
}
return fileIdle;
}
It can be used like this:
private void WatcherOnCreated(object sender, FileSystemEventArgs e)
{
if (GetIdleFile(e.FullPath))
{
// Do something like...
foreach (var line in File.ReadAllLines(e.FullPath))
{
// Do more...
}
}
}
I had this problem when writing a file. I got events before the file was fully written and closed.
The solution is to use a temporary filename and rename the file once it is finished. Then watch for the file Renamed event instead of the Created or Changed event.
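A minimal sketch of that pattern, assuming you also control the writer (the paths, the .tmp extension, and the *.csv filter are illustrative, not from the original answer):
// Writer side: write to a temporary name, then rename when finished.
string finalPath = @"C:\watched\data.csv";
string tempPath = finalPath + ".tmp";
File.WriteAllText(tempPath, "some content");
File.Move(tempPath, finalPath);                     // the rename signals "done"

// Watcher side: react to the rename instead of Created/Changed.
var watcher = new FileSystemWatcher(@"C:\watched", "*.csv");
watcher.Renamed += (s, e) => Console.WriteLine("Ready to use: " + e.FullPath);
watcher.EnableRaisingEvents = true;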
Note: this problem is not solvable in the general case. Without prior knowledge of how the file is used, you can't know whether other program(s) have finished working with it.
In your particular case you should be able to figure out what operations File.Copy consists of.
Most likely the destination file is locked during the whole operation. In this case you should be able to simply try to open the file and handle the "sharing mode violation" exception.
You can also wait for some time... - a very unreliable option, but if you know the size range of the files you may be able to choose a reasonable delay to let Copy finish.
You can also "invent" some sort of transaction system - i.e. create another file like "destination_file_name.COPYLOCK", which the program that copies the file would create before copying "destination_file_name" and delete afterward.
private Stream ReadWhenAvailable(FileInfo finfo, TimeSpan? ts = null) => Task.Run(() =>
{
ts = ts == null ? new TimeSpan(long.MaxValue) : ts;
var start = DateTime.Now;
while (DateTime.Now - start < ts)
{
Thread.Sleep(200);
try
{
return new FileStream(finfo.FullName, FileMode.Open);
}
catch { }
}
return null;
})
.Result;
...of course, you can modify aspects of this to suit your needs.
One possible solution (it worked in my case) is to use the Changed event. In the Created event you can log the name of the file that was just created, then catch the Changed event and verify whether that file was just created. When I manipulated the file in the Changed event, it didn't throw the "File is in use" error.
If you are doing some sort of inter-process communication, as I do, you may want to consider this solution:
App A writes the file you are interested in, eg "Data.csv"
When done, app A writes a 2nd file, eg. "Data.confirmed"
In your C# app B make the FileWatcher listen to "*.confirmed" files. When you get this event you can safely read "Data.csv", as it is already completed by app A.
(Edit: inspired by comments) Delete the *.confirmed file with app B when done processing the "Data.csv" file.
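A sketch of app B's side of that protocol (the folder path and the ProcessCsv placeholder are assumptions of mine; the file names follow the example above):
var watcher = new FileSystemWatcher(@"C:\exchange", "*.confirmed");
watcher.Created += (s, e) =>
{
    // "Data.confirmed" means "Data.csv" is complete and safe to read.
    string dataFile = Path.ChangeExtension(e.FullPath, ".csv");
    ProcessCsv(dataFile);          // placeholder for app B's real work
    File.Delete(e.FullPath);       // remove the marker once processing is done
};
watcher.EnableRaisingEvents = true;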
I have solved this issue with two features:
Implement the MemoryCache pattern seen in this question: A robust solution for FileSystemWatcher firing events multiple times
Implement a try\catch loop with a timeout for access
You need to collect average copy times in your environment and set the memory cache timeout to be at least as long as the shortest lock time on a new file. This eliminates duplicate events in your processing and allows some time for the copy to finish. You will have much better success on the first try, which means less time spent in the try\catch loop.
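A rough sketch of the MemoryCache de-duplication part (the 10-second window, the class name, and the wiring are assumptions of mine; the linked question has the full pattern, and GetFileLines is the method shown below):
using System;
using System.IO;
using System.Runtime.Caching;

class DebouncedHandler
{
    private static readonly MemoryCache _cache = MemoryCache.Default;

    public static void OnCreated(object sender, FileSystemEventArgs e)
    {
        // AddOrGetExisting returns null only the first time a key is seen within
        // the expiration window, so duplicate Created/Changed events are ignored.
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddSeconds(10) // tune to your copy times
        };
        if (_cache.AddOrGetExisting(e.FullPath, true, policy) == null)
        {
            var lines = GetFileLines(e.FullPath); // first event for this file: process it
        }
    }
}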
Here is an example of the try\catch loop:
public static IEnumerable<string> GetFileLines(string theFile)
{
DateTime startTime = DateTime.Now;
TimeSpan timeOut = TimeSpan.FromSeconds(TimeoutSeconds);
TimeSpan timePassed;
do
{
try
{
return File.ReadLines(theFile);
}
catch (FileNotFoundException ex)
{
EventLog.WriteEntry(ProgramName, "File not found: " + theFile, EventLogEntryType.Warning, ex.HResult);
return null;
}
catch (PathTooLongException ex)
{
EventLog.WriteEntry(ProgramName, "Path too long: " + theFile, EventLogEntryType.Warning, ex.HResult);
return null;
}
catch (DirectoryNotFoundException ex)
{
EventLog.WriteEntry(ProgramName, "Directory not found: " + theFile, EventLogEntryType.Warning, ex.HResult);
return null;
}
catch (Exception ex)
{
// We swallow all other exceptions here so we can try again
EventLog.WriteEntry(ProgramName, ex.Message, EventLogEntryType.Warning, ex.HResult);
}
Task.Delay(777).Wait();
timePassed = DateTime.Now.Subtract(startTime);
}
while (timePassed < timeOut);
EventLog.WriteEntry(ProgramName, "Timeout after waiting " + timePassed.ToString() + " seconds to read " + theFile, EventLogEntryType.Warning, 258);
return null;
}
Where TimeoutSeconds is a setting that you can put wherever you hold your settings. This can be tuned for your environment.
I need to modify a text file from multiple .NET processes, nothing I've tried works reliably. I have a C# GUI app which starts multiple processes to do some number crunching. Those need to append lines to the same text file every few milliseconds. The master process monitors the size of the file and once it reaches some threshold uploads it and deletes it.
The way these are currently coded, the processes that append text create the file if it doesn't exist, but that would be easy to change.
How can I implement this?
This method will repeatedly attempt to open the file until it can write to it, timing out after 10ms.
private static readonly TimeSpan timeoutPeriod = new TimeSpan(100000); // 10ms
private const string filename = "Output.txt";
public void WriteData(string data)
{
StreamWriter writer = null;
DateTime timeout = DateTime.Now + timeoutPeriod;
try
{
do
{
try
{
// Try to open the file for appending.
writer = new StreamWriter(filename, append: true);
}
catch (IOException)
{
// If this is taking too long, throw an exception.
if (DateTime.Now >= timeout) throw new TimeoutException();
// Let other threads run so one of them can unlock the file.
Thread.Sleep(0);
}
}
while (writer == null);
writer.WriteLine(data);
}
finally
{
if (writer != null) writer.Dispose();
}
}