I have a requirement where one process triggers the generation of multiple files (N files). An application must notify an external process of the success (1) or failure (0) of the file generation process. The success criterion is that all N files are generated in their specific folders. The destination folders and completion times differ per file, and the overall generation time varies from day to day depending on data volume, network congestion, and so on. Scheduling a job to count files at a fixed time is therefore not a feasible solution here. Could you please suggest an approach to this problem?
You can try the FileSystemWatcher class, for example:
private FileSystemWatcher wt; // keep a reference so the watcher is not garbage collected

private void KeepChecking(string path)
{
    wt = new FileSystemWatcher();
    wt.Path = path;
    wt.NotifyFilter = NotifyFilters.LastWrite;
    wt.Filter = "*.*";
    wt.Changed += new FileSystemEventHandler(OnChanged);
    wt.EnableRaisingEvents = true;
}

private void OnChanged(object source, FileSystemEventArgs e)
{
    MessageBox.Show("New file detected");
}
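Building on that, here is a hedged sketch of how the original N-file requirement could be addressed: watch each destination folder and signal success once every expected file has appeared. The folder paths, file names, and the way success is signalled are illustrative assumptions, not part of the original system.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class CompletionWatcher
{
    // Illustrative: map each expected file name to the folder it should appear in.
    static readonly Dictionary<string, string> expected = new Dictionary<string, string>
    {
        { "file1.dat", @"C:\out\a" },
        { "file2.dat", @"C:\out\b" },
    };
    static readonly HashSet<string> seen = new HashSet<string>();
    static readonly List<FileSystemWatcher> watchers = new List<FileSystemWatcher>(); // keep references alive

    static void Main()
    {
        foreach (var folder in expected.Values.Distinct())
        {
            var w = new FileSystemWatcher(folder);
            w.Created += OnCreated;
            w.EnableRaisingEvents = true;
            watchers.Add(w);
        }
        Console.ReadLine();
    }

    static void OnCreated(object sender, FileSystemEventArgs e)
    {
        lock (seen)
        {
            string dir;
            if (expected.TryGetValue(e.Name, out dir) &&
                string.Equals(dir, Path.GetDirectoryName(e.FullPath), StringComparison.OrdinalIgnoreCase))
            {
                seen.Add(e.Name);
            }
            if (seen.Count == expected.Count)
                Console.WriteLine("1"); // success signal to the external process (illustrative)
        }
    }
}
```

Because the watchers are event-driven rather than scheduled, this avoids guessing a completion time; it only fires when the last expected file actually arrives.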
I am following this example of FileSystemWatcher. On top of it, I have created a Windows Forms application that opens whenever any .txt file is created or renamed on the Z drive.
I built the console application and deployed it to two systems; both listen to the same network drive (I have mapped a network drive as Z: on both systems).
However, the problem is that whenever I create or rename a .txt file on the network drive, the form opens on both systems, which is logical, since both deployed applications are listening to the same location.
But my requirement is: "The form should open only on the system
that performed the create or rename of that .txt file."
Is there any way I can achieve this, or is it even possible with the FileSystemWatcher class?
Here is the code snippet.
public class Watcher
{
    public static void Main()
    {
        Run();
    }

    [PermissionSet(SecurityAction.Demand, Name = "FullTrust")]
    public static void Run()
    {
        string[] args = System.Environment.GetCommandLineArgs();
        FileSystemWatcher watcher = new FileSystemWatcher("Z:\\", "*.txt");
        watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
                             | NotifyFilters.FileName | NotifyFilters.DirectoryName;
        watcher.IncludeSubdirectories = true;

        // Add event handlers.
        //watcher.Changed += new FileSystemEventHandler(OnChanged); // Fires every time a file is changed (multiple times in a copy operation)
        watcher.Created += new FileSystemEventHandler(OnChanged);
        watcher.Deleted += new FileSystemEventHandler(OnChanged);
        watcher.Renamed += new RenamedEventHandler(OnRenamed);

        // Begin watching.
        watcher.EnableRaisingEvents = true;

        // Wait for the user to quit the program.
        Console.WriteLine("Press 'q' to quit the sample.");
        while (Console.Read() != 'q') ;
    }

    // Define the event handlers.
    private static void OnChanged(object source, FileSystemEventArgs e)
    {
        // Specify what is done when a file is changed, created, or deleted.
        Console.WriteLine("File: " + e.FullPath + " " + e.ChangeType);
        Application.EnableVisualStyles();
        Application.Run(new Feedback.Form1(e.FullPath)); // Here I am opening a new form for feedback
    }

    private static void OnRenamed(object source, RenamedEventArgs e)
    {
        // Specify what is done when a file is renamed.
        Console.WriteLine("File: {0} renamed to {1}", e.OldFullPath, e.FullPath);
        Application.EnableVisualStyles();
        Application.Run(new Feedback.Form1(e.FullPath)); // Here I am opening a new form for feedback
    }
}
FileSystemWatcher may notify you that something happened, and you might also be able to deduce what happened, but don't count on it. It's a quite limited and unreliable component in my (and others') experience. So if there is any chance of even moderate contention on the target folder, I would use some kind of polling solution instead of a file watcher.
That said, it won't tell you who made the change. Once you have deduced what has changed, you need to take additional steps for the "who" part. The filesystem stores quite sparse info; you won't find any source machine info there. You could try mapping the file shares that create these changes with different users, as you may then deduce the modifying system from that:
Finding the user who modified the shared drive folder files.
If that is not an option, other solutions are much more complicated.
If you have access to the server hosting Z: you could turn on the file audit log for that resource and deduce which machine it was from the event log (event IDs 4663 / 5145). The source machine name will be logged in this case. It should be a breeze to enable if it's a Windows server (directory properties/Security/Advanced/Auditing), but reading and synchronizing logs is more complicated.
If none of the solutions above is possible, you may be able to implement a user-space filesystem to proxy your file share, using something like dokan. Source processes would map to your application instead of the fileshare, that way you could raise your own events or just write a detailed audit log to a database or whatever, and then you forward the actual commands to the fileshare. Very expensive and non-trivial solution though. But probably very fun.
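If you go the audit-log route, here is a hedged sketch of reading those Security-log entries with System.Diagnostics.Eventing.Reader. This assumes the code runs on the file server (or can remotely query it) with permission to read the Security log; the exact field layout of event 5145 should be verified against your server's log.

```csharp
using System;
using System.Diagnostics.Eventing.Reader;

class AuditLogReader
{
    static void Main()
    {
        // Query the Security log for "detailed file share access" events (ID 5145);
        // their description includes the source address of the accessing machine.
        var query = new EventLogQuery("Security", PathType.LogName,
                                      "*[System[(EventID=5145)]]");
        using (var reader = new EventLogReader(query))
        {
            for (EventRecord record = reader.ReadEvent(); record != null; record = reader.ReadEvent())
            {
                using (record)
                {
                    Console.WriteLine(record.FormatDescription());
                }
            }
        }
    }
}
```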
FileSystemWatcher gives you notifications of file changes.
If you want to use the file system for unique notifications, you'll need to create an isolated folder for each instance.
Something like:
Z:\Machine1\
Z:\Machine2\
Another option is to check who owns/created the file, but that can be really complicated in domain setups.
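A hedged sketch of the isolated-folder idea (paths are illustrative): each machine watches only the subfolder named after itself, so events caused by other machines never reach it.

```csharp
using System;
using System.IO;

class PerMachineWatcher
{
    static void Main()
    {
        // Each machine watches Z:\<its own machine name> only; if every system
        // writes into its own folder, no instance sees another machine's events.
        string myFolder = Path.Combine(@"Z:\", Environment.MachineName);
        Directory.CreateDirectory(myFolder); // ensure the per-machine folder exists

        var watcher = new FileSystemWatcher(myFolder, "*.txt");
        watcher.Created += (s, e) => Console.WriteLine("Created here: " + e.FullPath);
        watcher.Renamed += (s, e) => Console.WriteLine("Renamed here: " + e.FullPath);
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the watcher alive
    }
}
```

This only works if the producing process can be pointed at the per-machine folder; it does not help when all machines must write to one shared location.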
I've created a simple FileSystemWatcher service that's running on my PC:
public static void Run()
{
    var watcher = new FileSystemWatcher
    {
        Path = @"C:\Users\XXX\Google Drive",
        NotifyFilter = NotifyFilters.LastAccess
                     | NotifyFilters.LastWrite
                     | NotifyFilters.FileName
                     | NotifyFilters.DirectoryName,
        Filter = "*.*",
    };
    watcher.Created += OnChanged;
    watcher.EnableRaisingEvents = true;
}

private static void OnChanged(object source, FileSystemEventArgs e)
{
    FooPrintClass.SendToPrinter(e.FullPath);
}
As you see, I'm watching a Google Drive folder. That folder is also synced on my server. From time to time a system on my server will create pairs of files with the same name but different types:
(Foo.pdf, Foo.txt)
Sometimes the system will create over 50 of those pairs, and they will all be synced to my Google Drive folder.
So far so good; now to my problem:
My FileSystemWatcher service does work as expected, but it doesn't process the files in any sorted order at all.
I need my service to process one pair at a time.
Expected result:
Foo.pdf, Foo.txt
Bar.pdf, Bar.txt
Actual result:
Bar.txt, Foo.pdf
Foo.txt, Bar.pdf
As the expected result shows, I need to print the pairs in order, pair by pair.
There are many ways to implement a "queue" solution, but in my case I don't know how many files there will be. So I don't know the total number of files, and therefore it's harder to build a queue and a sorting algorithm.
Any tips?
As you use a third-party system for syncing files, you have no control over how it is done. You may run into problems: no control over the order in which files are synced, and no guarantee that a file is not locked when you get a notification from your watcher.
To ease the ordering problem, you could sync files in bundles.
If you can modify the system that creates these files, you can ZIP both files into one archive. Having Foo.zip, you can print both files in whatever order you want.
That doesn't solve the possible locking problem. If you could somehow notify your service about a new pair of files, you could download the files directly from Google Drive using the API; then you would have full control over the files and the order in which you get them.
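A minimal sketch of the bundling idea, assuming the producing system can be changed to emit one archive per pair (the method name and paths are illustrative):

```csharp
using System.IO;
using System.IO.Compression;

static class PairBundler
{
    // Pack a (pdf, txt) pair into a single zip so the pair syncs as one unit.
    public static void BundlePair(string pdfPath, string txtPath, string zipPath)
    {
        using (var archive = ZipFile.Open(zipPath, ZipArchiveMode.Create))
        {
            archive.CreateEntryFromFile(pdfPath, Path.GetFileName(pdfPath));
            archive.CreateEntryFromFile(txtPath, Path.GetFileName(txtPath));
        }
    }
}
```

The consuming service then watches for *.zip only, which also sidesteps half-synced pairs: the zip appears as one file.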
You could use Reactive Extensions to buffer a number of events and sort them before continuing.
An example would be something like this:
Observable
    .FromEventPattern<FileSystemEventArgs>(watcher, "Created")
    .Buffer(TimeSpan.FromSeconds(10))
    .Subscribe(onNext);

public void onNext(IList<EventPattern<FileSystemEventArgs>> events) { ... }
The example buffers all changes happening in 10 seconds and passes them to onNext as a list. This allows you to sort the files before doing anything else.
This ignores some edge cases like files being created right at the time when the buffer window ends. But there are multiple ways to solve those issues.
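A more complete sketch of that approach (assuming the System.Reactive NuGet package; the watched path, sort key, and 10-second window are illustrative):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Reactive.Linq;

class BufferedWatcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\Users\XXX\Google Drive")
        {
            EnableRaisingEvents = true
        };

        Observable
            .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>(
                h => watcher.Created += h,
                h => watcher.Created -= h)
            .Select(ep => ep.EventArgs.FullPath)
            .Buffer(TimeSpan.FromSeconds(10))
            .Where(batch => batch.Count > 0)
            .Subscribe(batch =>
            {
                // Sort each batch by base name so Foo.pdf/Foo.txt come out together.
                foreach (var path in batch.OrderBy(Path.GetFileNameWithoutExtension))
                    Console.WriteLine(path);
            });

        Console.ReadLine();
    }
}
```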
My case is the following:
Files are created inside a directory by some process (about 10 files per minute, with a maximum file size of 5 MB each). Let's call this folder MYROOT.
Those files need to be moved and categorized into subdirectories according to specific logic based on the filename and external settings.
There will be 5 subdirectories under MYROOT, with some subdirectories inside them as well; let's name them C1, C2, C3, C4, C5, and inside them A1, A2, A3, An...
So, we will have the following categorization: C:\MYROOT\C1\A1, C:\MYROOT\C1\A2, C:\MYROOT\C2\B1 and so on...
All files are written to C:\MYROOT and have the same file type and naming convention. No renames, deletes, or changes are made in this folder.
The FileSystemWatcher is set as follows:
this._watcher = new FileSystemWatcher();
this._watcher.NotifyFilter = NotifyFilters.CreationTime;
this._watcher.Filter = string.Empty;
this._watcher.Path = @"C:\MYROOT";
this._watcher.IncludeSubdirectories = false;
this._watcher.Created += this._watcher_Created;
The event handler:
private void _watcher_Created(object sender, FileSystemEventArgs e)
{
    string filePath = string.Empty;
    if (e.ChangeType != WatcherChangeTypes.Created)
        return;
    lock (FileOrganizer.Manager.s_LockObject)
    {
        filePath = e.FullPath;
        //Further processing here like creating a custom object that
        //contains the file name and some other info, add that object inside a queue
        //that is processed async with logic applied to wait the file to be written,
        //validate file contents, etc.. and finally move it where it should be moved.
        //The code here is very fast, exception free and in a fire-and-forget manner.
    }
}
When the MYROOT directory is empty and files are created by another process, they are moved inside the folders as I described. The folders are only created once, if they do not exist.
The number of files inside the subfolders keeps increasing; we are talking about ~200 GB and counting.
Now hear this:
When no files are being created (the "creator" process is not running), nothing should trigger the watcher, and I can confirm that from the debug logs I enabled before posting this question. But my process containing the watcher sits at a constant 13 to 14% CPU on an octa-core processor, and this increases as the size of the subdirectories increases. During processing, even if I create (copy-paste) 2000 files at once, it only goes about 1% above that.
The funny part is that when I change the monitored path to an empty one, or to the same one with less volume inside it, the CPU utilization of my process is 0%, and only when files are created does it go up to a maximum of 2% - 5%.
Again, as said, the observation is clear: when the monitored path's subdirectories contain data, that data affects the watcher internals even if you set it not to monitor subdirectories. And if that data volume is big, the file system watcher needs a lot of CPU. That is the case even when no changes are taking place to trigger any watcher events.
Is this the normal or by design behavior of the FileSystemWatcher?
PS.
When I monitor MYROOT folder and move the files outside that folder everything seems OK, but that is not an acceptable solution.
Thanx.
Marios
I have browsed around but cannot find any information on what I am seeking; if there is another post that already covers this, I apologize.
I am seeking help with code that will monitor a specific folder for when the folder is opened by another person, or when a file under that folder is opened. At this point I can see when a user opens and modifies any file, but if they just open the file to view it, no event is raised, even when I add LastAccess. Any information or help would be appreciated.
Folder name is C:\Junk
Code in C# 4.0:
[PermissionSet(SecurityAction.Demand, Name = "FullTrust")]
public static void Run()
{
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.Path = @"C:\";
    watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
                         | NotifyFilters.FileName | NotifyFilters.DirectoryName;
    watcher.Filter = "junk";

    // Add event handlers.
    watcher.Changed += new FileSystemEventHandler(OnChanged);
    watcher.Created += new FileSystemEventHandler(OnChanged);
    watcher.Deleted += new FileSystemEventHandler(OnChanged);
    watcher.Renamed += new RenamedEventHandler(OnRenamed);
    watcher.IncludeSubdirectories = true;
    watcher.EnableRaisingEvents = true;

    // Wait for the user to quit the program.
    Console.WriteLine("Press 'q' to quit the sample.");
    while (Console.Read() != 'q') ;
}
// Define the event handlers.
private static void OnChanged(object source, FileSystemEventArgs e)
{
    // Specify what is done when a file is changed, created, or deleted.
    Console.WriteLine("File: " + e.FullPath + " " + e.ChangeType);
}

private static void OnRenamed(object source, RenamedEventArgs e)
{
    // Specify what is done when a file is renamed.
    Console.WriteLine("File: {0} renamed to {1}", e.OldFullPath, e.FullPath);
}
it does not throw an event even when I add LastAccessed.
Because NotifyFilters.LastAccess specifies which property changes you wish to be notified about, not an event to subscribe to. The available events are Changed, Created, and Deleted, none of which does what you want.
You should take a look at the ReadDirectoryChangesW Win32 function, documented here. It can be passed a FILE_NOTIFY_CHANGE_LAST_ACCESS flag, which seems to deliver what you want:
Any change to the last access time of files in the watched directory or subtree causes a change notification wait operation to return.
Edit: disregard this; the FileSystemWatcher does internally pass NotifyFilters.LastAccess, as integer 32, which is the same as FILE_NOTIFY_CHANGE_LAST_ACCESS, to ReadDirectoryChangesW. That function still does not notify on file access; I've tried.
Perhaps this is caused by this:
Last Access Time has a loose granularity that only guarantees that the time is accurate to within one hour. In Windows Vista, we've disabled updates to Last Access Time to improve NTFS performance. If you are using an application that relies on this value, you can enable it using the following command:
fsutil behavior set disablelastaccess 0
You must restart the computer for this change to take effect.
If you execute that at the command prompt, perhaps the LastAccess time will then be written and the event will fire. I'm not going to try it on my SSD and don't have a VM ready, but on Windows 7 disablelastaccess seems to be enabled out of the box.
If it still doesn't work when you disable that behavior, wait for Raymond Chen's suggestion box (or himself) to come by, usually there's a quite logical explanation for why the documentation does not seem to correctly describe the behaviour you encounter. ;-)
You may as well just scan the directory in a loop and look at the LastAccessTime property of the files. What are you trying to do when a user opens a certain file?
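A hedged sketch of that polling approach (the folder path and interval are illustrative; note that, per the discussion above, LastAccessTime is only meaningful if disablelastaccess is turned off):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

class AccessPoller
{
    static void Main()
    {
        // Remember each file's last access time and report when it moves forward.
        var lastSeen = new Dictionary<string, DateTime>();
        while (true)
        {
            foreach (var file in new DirectoryInfo(@"C:\Junk").GetFiles())
            {
                DateTime previous;
                if (lastSeen.TryGetValue(file.FullName, out previous) &&
                    file.LastAccessTime > previous)
                {
                    Console.WriteLine("Accessed: " + file.FullName);
                }
                lastSeen[file.FullName] = file.LastAccessTime;
            }
            Thread.Sleep(5000); // poll every 5 seconds
        }
    }
}
```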
To get the file path on access, one solution is a minifilter driver; you would have to implement a minifilter driver to meet this requirement.
You should set
watcher.Path = @"C:\junk";
and delete the watcher.Filter line if the event should fire for all files.
Using the Filter property you can set wildcards to match files, for example *.txt.
What you really need is the NtQuerySystemInformation enumeration and a timer; that way you can scan the directory and see if any of the files are open. FileSystemWatcher will not give you this information.
public void OnChanged(object sender, FileSystemEventArgs e)
{
    string fileName = System.IO.Path.GetFileName(e.FullPath);
    if (IsAvailable(System.IO.Path.Combine(RecievedPath, fileName)))
    {
        ProcessMessage(fileName);
    }
}

private void ProcessMessage(string fileName)
{
    try
    {
        File.Copy(System.IO.Path.Combine(RecievedPath, fileName), System.IO.Path.Combine(SentPath, fileName));
        MessageBox.Show("File Copied");
    }
    catch (Exception)
    {
    }
}

private static bool IsAvailable(string filePath)
{
    try
    {
        // Try to open the file exclusively; if this succeeds, the file
        // is no longer locked by the writer.
        using (FileStream inputStream = File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return inputStream.Length > 0;
        }
    }
    catch (Exception)
    {
        return false;
    }
}
Digvijay Rathore already gave an answer, in my opinion the only good one, even if it's a bit short.
I just want to add a few words and a link where interested users can start.
The FileSystemWatcher is useful only for monitoring what is happening inside the watched folder; it is not able to monitor and intercept what the user (or the OS) is doing.
For example, with a FSW (FileSystemWatcher) you can monitor when a file/folder is created, deleted, renamed or changed in some way, and those events are raised AFTER the action is completed, not before or during.
A simple FSW is not able to know if the user is launching an executable from the monitored folder; in that case it will simply generate no events at all.
To catch when an executable is launched (and tons of other "events") before it is launched, and to take some action before the executable's code is loaded into memory, you need to write something at a lower (kernel) level; that is, you need to build a driver, in this specific case a (minifilter) file system driver.
This is a good starting point for understanding the basics of minifilter Windows drivers:
https://learn.microsoft.com/en-us/windows-hardware/drivers/ifs/file-system-minifilter-drivers
I have an application that launches other applications and then waits for them to create a specific data file (it watches one application at a time). Each time an application is launched, it watches a specific directory for a specific file to be created. I am using the FileSystemWatcher to do this (set it to the directory, then filter for the correct file name). This works great the first time (always), but the second application launched never fires the event. The only way it seems to fire is if I place a break-point in the event handler, or if I have a Thread.Sleep command in the event handler. This seems very strange to me... is there some race condition that I'm not aware of? Here is the code. Notice I have a Thread.Sleep(500). With this line the code works every time; without it, it will fail. I'm really not comfortable relying on a Sleep command, and I'm not sure under what conditions it might stop working as well.
public static void watchFiles(string path)
{
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.Path = path;
    watcher.Created += new FileSystemEventHandler(watcher_Handler);
    watcher.EnableRaisingEvents = true;
}

public static void watcher_Handler(object sender, FileSystemEventArgs e)
{
    //Hack - the sleep allows the second and third application to be caught by this event
    Thread.Sleep(500);
    switch (e.ChangeType.ToString())
    {
        case "Changed":
            break;
        case "Deleted":
            break;
        case "Created":
            if (e.Name == "log.dat")
            {
                parseDataFile();
                moveHTMLtoLMS();
            }
            break;
        default:
            break;
    }
}
Anyone know why I need to have that Sleep (or break-point) to get the code to work a second time?
According to the documentation of the System.IO.FileSystemWatcher class:
The Windows operating system notifies your component of file changes in a buffer created by the FileSystemWatcher. If there are many changes in a short time, the buffer can overflow. This causes the component to lose track of changes in the directory, and it will only provide blanket notification. Increasing the size of the buffer with the InternalBufferSize property is expensive, as it comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small yet large enough to not miss any file change events. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties so you can filter out unwanted change notifications.
It might be that the event isn't being consumed fast enough and the internal buffer isn't large enough to handle all the notifications. By default, the watcher handles FileName, DirectoryName, LastWrite notifications yet you only consume creation events (both file and directory). Are your applications running in quick succession? I'd try putting a delay between the invocations of your applications (instead of the event handler), use more specific filters (just the FileName notification or watch only for log files using the Filter property), increase the internal buffer size or any combination of the above. I think that should fix your problem.
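A hedged sketch of that tuning (values are illustrative): narrow the notifications to just what is consumed, and enlarge the internal buffer.

```csharp
using System;
using System.IO;

class TunedWatcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\watched")
        {
            NotifyFilter = NotifyFilters.FileName, // only file-name events, not LastWrite etc.
            Filter = "log.dat",                    // only the file we actually care about
            InternalBufferSize = 64 * 1024         // default is 8 KB; 64 KB is the maximum
        };
        watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the watcher (and process) alive
    }
}
```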
public static void watchFiles(string path)
{
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.Path = path;
    watcher.Created += new FileSystemEventHandler(watcher_Handler);
    watcher.EnableRaisingEvents = true;
}

The watcher variable is eligible for garbage collection at the end of this method. Instead of being a local variable, make it a class-level member, as such:

private static FileSystemWatcher watcher;

public static void watchFiles(string path)
{
    if (watcher != null)
    {
        watcher.EnableRaisingEvents = false;
        watcher.Created -= new FileSystemEventHandler(watcher_Handler);
    }
    watcher = new FileSystemWatcher();
    watcher.Path = path;
    watcher.Created += new FileSystemEventHandler(watcher_Handler);
    watcher.EnableRaisingEvents = true;
}
You are listening to only one event, "Created". You need to listen to the other ones too - OnChanged, OnDeleted - http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
EDIT: Most programs will not "create" a file when one already exists. You can use FileMon (now Process Monitor - http://technet.microsoft.com/en-us/sysinternals/bb896645 ) to see what operations each program performs on your file.
I'm facing the exact same problem here (running Windows XP). Your hack solves the problem. I would like to add some notes that might be relevant.
In my case the filename is always the same: C:\blah.txt is created, deleted, created, and so forth. Also, I'm using a trick to hide my application:

Integrator.StartMonitor(); // Start the file monitor!
Form f = new Form();
f.ShowInTaskbar = false;
f.ShowIcon = false;
f.StartPosition = FormStartPosition.Manual;
f.Location = new Point(-32000, -32000);
f.Show();
f.Hide();
Application.Run();

My file watcher works in debug mode or when I add your sleep hack. It certainly looks like a bug in the FileSystemWatcher.