Delete file using FileSystemWatcher not allowing me to delete a second time - C#

My task is to delete a file once processing is completed. I am using a FileSystemWatcher for this. It is watching a specific folder. If I copy a file into the watched folder, it is deleted. But the second time I copy the same file into the same watched folder, it says that another process is using that file, and an exception is thrown. I think I am missing something. Here is my code:
private static void Main(string[] args)
{
    var fw = new FileSystemWatcher(EmailSetting.DataFolder)
    {
        IncludeSubdirectories = false,
        EnableRaisingEvents = true
    };
    fw.Created += (sender, e) =>
    {
        File.Delete(e.FullPath);
    };
    Console.ReadLine();
}

You receive the Created event when the file is created (hence the name). But at that point in time the other process that is actually creating it hasn't finished writing content into that file. So the file might already be there, but the other process is still working on it (imagine copying an 8 GB file).
It would be wiser to simply write the path of the file into a collection within the event handler and let another thread check this concurrent collection regularly (e.g. once a second). First it checks whether the file exists and, if so, tries to delete it. If that succeeds, remove it from the collection; otherwise, try again next time.
Code example
private static readonly ConcurrentQueue<FileInfo> _FileCandidates = new ConcurrentQueue<FileInfo>();

private static void Main(string[] args)
{
    var watcher = new FileSystemWatcher
    {
        Path = @"R:\TestFolder",
        IncludeSubdirectories = false,
        Filter = "*.*",
    };
    Console.WriteLine("Start watching folder... " + watcher.Path);
    watcher.Created += OnFileCreated;
    watcher.EnableRaisingEvents = true;

    var timer = new Timer
    {
        AutoReset = true,
        Interval = 1000,
    };
    timer.Elapsed += OnTimerElapsed;
    timer.Enabled = true;

    Console.ReadKey();
}
static void OnTimerElapsed(object sender, ElapsedEventArgs e)
{
    FileInfo file;
    var stillInUseFiles = new List<FileInfo>();
    Console.WriteLine("Check for file candidates...");
    while (_FileCandidates.TryDequeue(out file))
    {
        try
        {
            Console.WriteLine("Delete " + file.FullName);
            if (file.Exists)
                file.Delete();
        }
        catch (IOException)
        {
            Console.WriteLine("Could not delete file, try again next time.");
            stillInUseFiles.Add(file);
        }
    }
    foreach (var unhappyFile in stillInUseFiles)
    {
        _FileCandidates.Enqueue(unhappyFile);
    }
}

static void OnFileCreated(object sender, FileSystemEventArgs e)
{
    Console.WriteLine("Found new file candidate " + e.FullPath);
    _FileCandidates.Enqueue(new FileInfo(e.FullPath));
}

Related

Optimizer won't stop after reaching the end of new files found by FSW

I have created a CLI application that watches a directory and optimizes any new PDFs that are moved into it. There are no errors as of my last build.
The issue I'm having is that when I run it, the application detects a change and optimizes the changed files, but it doesn't stop the cycle of optimizing the new files.
How would I set a stopping point in the optimization process once it reaches the end of the new files?
public class Methods
{
    [PermissionSet(SecurityAction.Demand, Name = "FullTrust")]
    public static void Optimize()
    {
        Thread.Sleep(1000);
        PDFNet.Initialize();
        string input_Path = @"C:\Users\user\Desktop\testinpactive\";
        string output_Path = @"C:\Users\user\Desktop\output\";
        string[] files = Directory.GetFiles(input_Path, "*.pdf", SearchOption.AllDirectories);
        foreach (string file in files)
        {
            string fileName = Path.GetFileName(file);
            Console.WriteLine($"Optimizing {fileName}");
            string sub = file.Substring(41, 7);
            CreateFolder(output_Path + sub);
            try
            {
                using (PDFDoc doc = new PDFDoc(file))
                {
                    // Example 1) Simple optimization of a pdf with default settings.
                    doc.InitSecurityHandler();
                    Optimizer.Optimize(doc);
                    doc.Save(output_Path + sub + fileName, SDFDoc.SaveOptions.e_linearized);
                    // File Delete Process
                    //File.Delete(input_Path + files);
                    //Console.WriteLine("File Deleted");
                    Console.WriteLine("Done..\n");
                }
            }
            catch (PDFNetException e)
            {
                Console.WriteLine(e.Message);
            }
        }
    }
    public static void Run()
    {
        // Create a new FileSystemWatcher and set its properties.
        // Params: Path, and filter
        using (FileSystemWatcher watcher = new FileSystemWatcher(@"C:\Users\user\Desktop\testinpactive", "*.pdf"))
        {
            // To watch subdirectories
            watcher.IncludeSubdirectories = true;
            FswHandler Handler = new FswHandler();
            // Add event handlers.
            watcher.Created += Handler.OnCreated;
            // Begin watching.
            watcher.EnableRaisingEvents = true;
            // Wait for the user to quit the program.
            Console.WriteLine("Press 'q' to quit the sample.");
            while (Console.Read() != 'q');
        }
    }
    public class FswHandler
    {
        public void OnCreated(Object source, FileSystemEventArgs e)
        {
            // Write out Path (Testing)
            Console.WriteLine($"FILE: {e.FullPath} CHANGE-TYPE: {e.ChangeType}");
            // Specify what is done when a file is changed, created, or deleted.
            Optimize();
        }
    }
} // closes class Methods (brace missing in the original)
A big thanks to @BenVoigt for explaining what I was doing wrong.
Here are the changes I made to my code to fix the issue:
public static void Optimize(string filePath, string outputPath, string fileName, string fileNum)
{
    PDFNet.Initialize();
    try
    {
        using (PDFDoc doc = new PDFDoc(filePath))
        {
            // Example 1) Simple optimization of a pdf with default settings.
            doc.InitSecurityHandler();
            Optimizer.Optimize(doc);
            Directory.CreateDirectory(outputPath + filePath.Substring(41, 7));
            doc.Save(filePath, SDFDoc.SaveOptions.e_linearized);
            //doc.Save(outputPath + fileNum + fileName, SDFDoc.SaveOptions.e_linearized);
            Console.WriteLine("Done..\n");
        }
    }
    catch (PDFNetException e)
    {
        Console.WriteLine(e.Message);
    }
}
public static void Run()
{
    // Create a new FileSystemWatcher and set its properties.
    // Params: Path, and filter
    using (FileSystemWatcher watcher = new FileSystemWatcher(@"C:\Users\user\Desktop\testinpactive", "*.pdf"))
    {
        // To watch subdirectories
        watcher.IncludeSubdirectories = true;
        FswHandler Handler = new FswHandler();
        // Add event handlers.
        watcher.Created += Handler.OnCreated;
        // Begin watching.
        watcher.EnableRaisingEvents = true;
        // Wait for the user to quit the program.
        Console.WriteLine("Press 'q' to quit the sample.");
        while (Console.Read() != 'q');
    }
}
public class FswHandler
{
    public void OnCreated(Object source, FileSystemEventArgs e)
    {
        string output = @"C:\Users\user\Desktop\output";
        // Write out Path (Testing)
        Console.WriteLine($"FILE: {e.FullPath} CHANGE-TYPE: {e.ChangeType}");
        Thread.Sleep(800);
        // Specify what is done when a file is changed, created, or deleted.
        Optimize(e.FullPath, output, e.Name.Substring(7), e.FullPath.Substring(40, 8));
    }
}

Add new files to upload queue in real time

What I want to do is add new files to the upload list while older files are being uploaded.
This is the code for uploading and the Watcher:
private static void Main(string[] args)
{
    // get files from local path
    List<string> localFilesWithFullPath =
        Directory.GetFiles(@"c:\myfiles").ToList();
    foreach (var file in localFilesWithFullPath)
    {
        // upload each file
        ftpClient.Upload("/" + Path.GetFileName(file), file);
    }
    Run(ftpClient, localFilesWithFullPath);
}

public static void Run(Ftp ftpClient, List<string> localFilesWithFullPath)
{
    var watcher = new FileSystemWatcher { Path = @"c:\myfiles" };
    watcher.Created += (source, e) =>
    {
        ftpClient.Upload("/" + e.Name, e.FullPath);
    };
    watcher.EnableRaisingEvents = true;
    Console.Read();
}
This is the Ftp class used for uploading, and it works perfectly.
The problem with this code is that it only adds new files after the older files have been uploaded completely, NOT while they are uploading, so it will not add new files to the queue in the middle of an upload.
Q: How can I use the watcher (or something else) to add files to the upload queue in real time?
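One way to decouple detection from uploading is a producer/consumer queue: the watcher only enqueues paths, and a separate consumer thread drains the queue. This is a sketch, not from the original post; `UploadFile` is a placeholder standing in for the poster's `ftpClient.Upload` call.

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

class UploadQueueSketch
{
    // BlockingCollection is thread-safe: the watcher can add while the
    // consumer is busy uploading an earlier file.
    static readonly BlockingCollection<string> _queue = new BlockingCollection<string>();

    static void Main()
    {
        // Consumer: uploads one file at a time in the background.
        var consumer = Task.Run(() =>
        {
            foreach (var file in _queue.GetConsumingEnumerable())
                UploadFile(file);
        });

        // Seed the queue with the files that already exist.
        foreach (var file in Directory.GetFiles(@"c:\myfiles"))
            _queue.Add(file);

        // Producer: the watcher only enqueues; it never blocks on an upload.
        var watcher = new FileSystemWatcher { Path = @"c:\myfiles" };
        watcher.Created += (s, e) => _queue.Add(e.FullPath);
        watcher.EnableRaisingEvents = true;

        Console.Read();
        _queue.CompleteAdding(); // let the consumer drain and finish
        consumer.Wait();
    }

    static void UploadFile(string path)
    {
        // Placeholder for: ftpClient.Upload("/" + Path.GetFileName(path), path);
        Console.WriteLine("Uploading " + path);
    }
}
```

Because `GetConsumingEnumerable` blocks until items arrive, files detected mid-upload simply wait in the queue instead of being missed.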

C# FileSystemWatcher Copy folder complete

I am using FileSystemWatcher to monitor a folder that will be used to do some file renaming.
The only thing that will be copied will be folders containing files. There will not be single files put into the monitored folder. This is the code for setting up the FileSystemWatcher
watcher.Path = path;
watcher.NotifyFilter = NotifyFilters.DirectoryName | NotifyFilters.FileName;
watcher.IncludeSubdirectories = true;
watcher.Filter = "*.*";
watcher.Created += new FileSystemEventHandler(watcher_Created);
watcher.Changed += new FileSystemEventHandler(watcher_Changed);
watcher.Renamed += new RenamedEventHandler(watcher_Renamed);
watcher.EnableRaisingEvents = true;
There don't seem to be any issues with this setup.
The folders being copied can be between 50 and 200 MB in size. Is there a way to check/make sure that all the files have finished copying before starting the renaming process?
I tried this, thinking that I would get an IOException if the copying was still happening when GetFiles() was called:
bool finishedCopying = false;
while (!finishedCopying)
{
    try
    {
        List<FileInfo> fileList = directoryInfo.GetFiles().ToList();
        AlbumSearch newAlbum = new AlbumSearch(directoryInfo);
        return newAlbum;
    }
    catch (IOException)
    {
        finishedCopying = false;
    }
}
If any more information is required, just ask and I can provide it.
Ta.
I gave this a go using a timer. It may not be the prettiest solution out there, but on first testing it seems to be working so far. Essentially, when a folder is copied to the monitored folder, its path is added to AlbumList. The files in that folder will trigger the Created event. The handler waits for each file to finish copying, and once finished it starts a timer. If a new Created event is triggered, the timer resets itself.
When the timer's Elapsed event is triggered, it assumes (and I know assumption is the mother of all f*&k ups) that there are no more files to be copied, and the fully copied folder can be processed.
System.Timers.Timer eventTimer = new System.Timers.Timer();
List<string> AlbumList = new List<string>();

private void watcher_Created(object sender, FileSystemEventArgs e)
{
    if (Directory.Exists(e.FullPath))
    {
        AlbumList.Add(e.FullPath);
    }
    if (File.Exists(e.FullPath))
    {
        eventTimer.Stop();
        FileInfo newTrack = new FileInfo(e.FullPath);
        while (IsFileLocked(newTrack))
        {
            // File is locked. Do nothing..
        }
        eventTimer.Start();
    }
}

private void eventTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    List<string> ItemToRemove = new List<string>();
    foreach (var item in AlbumList)
    {
        DirectoryInfo di = new DirectoryInfo(item);
        AlbumSearch newAlbum = new AlbumSearch(di);
        if (DoSomethingMethod(newAlbum))
        {
            ItemToRemove.Add(item);
        }
        else
        {
            // why did it fail
        }
    }
    foreach (var path in ItemToRemove)
    {
        AlbumList.Remove(path);
    }
}

// Parameter renamed from "as", which is a reserved keyword and would not compile.
private bool DoSomethingMethod(AlbumSearch album)
{
    // Do stuff here
    return true;
}
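The IsFileLocked helper referenced above is not shown in the original post. A common implementation (an assumption on my part, not the poster's code) tries to open the file exclusively and treats an IOException as "still locked":

```csharp
using System.IO;

static class FileLockCheck
{
    // Try to open the file with an exclusive share mode; if another process
    // (e.g. an in-progress copy) still holds it, Open throws an IOException.
    public static bool IsFileLocked(FileInfo file)
    {
        try
        {
            using (file.Open(FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return false;
            }
        }
        catch (IOException)
        {
            return true;
        }
    }
}
```

Note that the `while (IsFileLocked(...))` loop above busy-waits; in production you would normally sleep briefly between checks.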
This is a small demo app that checks files at the beginning and then uses two hash sets to track copied files. This will only work if the source directory is known: there is no way to tell whether a file was created by a file copy or created directly, so you can only compare two known directories with Directory.GetFiles. And, as already said in the comments, you will still have to check whether other files were added / removed / renamed in the old directory during the copy.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    class Program
    {
        static HashSet<string> oldDirFiles = new HashSet<string>();
        static HashSet<string> newDirFiles = new HashSet<string>();
        static string oldDir = "C:\\New Folder";
        static string newDir = "C:\\New Folder 2";
        static System.Threading.ManualResetEvent resetEvent = new System.Threading.ManualResetEvent(false);

        static void Main(string[] args)
        {
            System.IO.FileSystemWatcher watcher = new System.IO.FileSystemWatcher();
            watcher.Path = newDir;
            watcher.NotifyFilter = NotifyFilters.DirectoryName | NotifyFilters.FileName;
            watcher.IncludeSubdirectories = true;
            watcher.Filter = "*.*";
            watcher.Created += watcher_Created;
            watcher.Changed += watcher_Changed;
            watcher.Renamed += watcher_Renamed;
            watcher.EnableRaisingEvents = true;

            // get all files in the old directory
            var oldFiles = Directory.GetFiles(oldDir, "*.*", SearchOption.AllDirectories);
            foreach (var file in oldFiles)
                oldDirFiles.Add(file);

            resetEvent.WaitOne();
            // now launch the directory copy
            // then you have to check if, in the meantime, new files were added or renamed
            // that could also be done with a watcher on the old directory
        }

        static void watcher_Renamed(object sender, RenamedEventArgs e)
        {
            throw new NotImplementedException();
        }

        static void watcher_Changed(object sender, FileSystemEventArgs e)
        {
            throw new NotImplementedException();
        }

        static void watcher_Created(object sender, FileSystemEventArgs e)
        {
            // check if the copied file was in the old directory before starting
            if (oldDirFiles.Contains(e.FullPath.Replace(newDir, oldDir)))
            {
                newDirFiles.Add(e.FullPath);
                // if all the files have been copied, the file count will be the same in the two hashsets
                // resetEvent.Set() signals the waiting thread and the program can proceed
                if (newDirFiles.Count == oldDirFiles.Count)
                    resetEvent.Set();
            }
        }
    }
}

File watcher in windows service

I am a beginner trying to develop a Windows service which keeps checking a folder (or group of folders) for any new or changed files. As soon as it detects any new file or changes (to files or folders), it should copy the files (and any new folders) to another location.
I have done the same thing with a Windows Forms application, but I don't know what to do in a Windows service - how can I do this in a Windows service?
You could use the FileSystemWatcher class.
Here is how we can complete this task using a file watcher, without a timer:
// Keep the watcher in a field so it is not garbage collected after OnStart returns.
private FileSystemWatcher fw;

public void ProcessFile(string filepath)
{
    var fileN = Path.GetFileName(filepath);
    string destfile = "E:\\2nd folder\\" + fileN; // note the trailing separator
    File.Copy(filepath, destfile, true);
}

protected override void OnStart(string[] args)
{
    // process the files that already exist before the watcher starts
    String[] files = Directory.GetFiles("E:\\", "*.*");
    foreach (string file in files)
    {
        ProcessFile(file);
    }
    fw = new FileSystemWatcher("folderpath");
    fw.IncludeSubdirectories = false;
    fw.EnableRaisingEvents = true;
    fw.Created += Newfileevent;
}

// Must not be static, since it calls the instance method ProcessFile.
void Newfileevent(object sender, FileSystemEventArgs e)
{
    ProcessFile(e.FullPath);
}
Here is the correct answer for this question:
public static void MyMethod()
{
    String[] files = Directory.GetFiles("E:\\", "*.*");
    foreach (string file in files)
    {
        var fileN = file.Substring(2);
        string destfile = "E:\\2nd folder" + fileN;
        File.Copy(file, destfile, true);
    }
}
Polling is needed to check the folder after a fixed interval of time. This code checks the folder every 2 seconds:
protected override void OnStart(string[] args)
{
    timer.Elapsed += new ElapsedEventHandler(OnElapsedTime);
    timer.Interval = 2000;
    timer.Enabled = true;
    MyMethod();
}
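The OnElapsedTime handler is referenced above but not defined in the original answer. Presumably (this is an assumption, not the poster's code) it just re-runs the copy method on each tick:

```csharp
// Assumed shape of the missing handler: System.Timers.ElapsedEventHandler
// signature, re-running MyMethod every time the 2-second timer fires.
private void OnElapsedTime(object sender, ElapsedEventArgs e)
{
    MyMethod();
}
```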

FileSystemWatcher triggers for filestream open

I have a FileSystemWatcher that triggers an event when a file is modified. I want to read from that file once the lock has been removed. At the moment I am just trying to open the file once the event is triggered; when a large file is being copied, the file lock stays on for a while after the events have been sent, preventing the file from being opened for read access.
Any suggestions?
This one's actually a bit of a doozy, unless the problem space has changed significantly since I last had to deal with it.
The easiest way is to simply try to open the file, catch the resulting IOException, and if the file is locked, add it to a queue to be checked later. You can't just try to process every file that comes in, because there are all kinds of cases where multiple events will be generated for the same file, so setting up a retry loop on every single received event can turn into a disaster, fast. You need to queue them up instead and check the queue at a regular interval.
Here is a basic class template that should help you out with this problem:
public class FileMonitor : IDisposable
{
    private const int PollInterval = 5000;
    private FileSystemWatcher watcher;
    private HashSet<string> filesToProcess = new HashSet<string>();
    private Timer fileTimer; // System.Threading.Timer

    public FileMonitor(string path)
    {
        if (path == null)
            throw new ArgumentNullException("path");
        watcher = new FileSystemWatcher();
        watcher.Path = path;
        watcher.NotifyFilter = NotifyFilters.FileName;
        watcher.Created += new FileSystemEventHandler(FileCreated);
        watcher.EnableRaisingEvents = true;
        fileTimer = new Timer(new TimerCallback(ProcessFilesTimer),
            null, PollInterval, Timeout.Infinite);
    }

    public void Dispose()
    {
        fileTimer.Dispose();
        watcher.Dispose();
    }

    private void FileCreated(object source, FileSystemEventArgs e)
    {
        lock (filesToProcess)
        {
            filesToProcess.Add(e.FullPath);
        }
    }

    private void ProcessFile(FileStream fs)
    {
        // Your code here...
    }

    private void ProcessFilesTimer(object state)
    {
        string[] currentFiles;
        lock (filesToProcess)
        {
            currentFiles = filesToProcess.ToArray();
        }
        foreach (string fileName in currentFiles)
        {
            TryProcessFile(fileName);
        }
        fileTimer.Change(PollInterval, Timeout.Infinite);
    }

    private void TryProcessFile(string fileName)
    {
        FileStream fs = null;
        try
        {
            FileInfo fi = new FileInfo(fileName);
            fs = fi.OpenRead();
        }
        catch (IOException)
        {
            // Possibly log this error
            return;
        }
        using (fs)
        {
            ProcessFile(fs);
        }
        lock (filesToProcess)
        {
            filesToProcess.Remove(fileName);
        }
    }
}
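As a usage sketch (the watched path here is a placeholder), the class above is driven entirely by its constructor and timer, so the caller only needs to keep it alive:

```csharp
// Hypothetical usage of the FileMonitor class above; the path is a placeholder.
using (var monitor = new FileMonitor(@"C:\watched"))
{
    // FileMonitor queues created files and retries locked ones every
    // 5 seconds until they can be opened and handed to ProcessFile.
    Console.WriteLine("Watching... press Enter to stop.");
    Console.ReadLine();
} // Dispose stops both the timer and the watcher
```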
(Note - I'm recalling this from memory here so it might not be perfect - let me know if it's buggy.)
