My application reads DLL data from a cache, but whenever a developer changes the DLLs, the DLL cache must change too. So I have been using FileSystemWatcher to detect any changes to the DLLs.
My watcher mechanism is below; this project is in ASP.NET.
public void CreateFileWatcher(string path)
{
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = path;
watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite | NotifyFilters.FileName | NotifyFilters.DirectoryName;
watcher.IncludeSubdirectories = true;
watcher.Filter = "*.dll";
// Add event handlers.
watcher.Changed += new FileSystemEventHandler(OnChanged);
watcher.Created += new FileSystemEventHandler(OnChanged);
watcher.Deleted += new FileSystemEventHandler(OnChanged);
watcher.Renamed += new RenamedEventHandler(OnChanged);
// Begin watching.
watcher.EnableRaisingEvents = true;
}
private static void OnChanged(object source, FileSystemEventArgs e)
{
//FillCache
}
protected void Button1_Click(object sender, EventArgs e)
{
    CreateFileWatcher(@"C:\data");
    // like that:
    myarray = CachData;
}
How can I make this work? How do I reload the DLLs when they change?
Clearing a cache is a hurdle when you use caching in your application, and it becomes tough when you implement custom logic to clear it. If you are using the Enterprise Library cache, you can add a file dependency while adding items to the cache; any change to that file will clear the cache.
Below is sample code. The same overload is available for the ASP.NET cache as well.
FileDependency cacheFileDependency = new FileDependency(@"\\mynetworkpath\abc.txt");
cacheMgr.Add(cacheName, cacheValueList,
Microsoft.Practices.EnterpriseLibrary.Caching.CacheItemPriority.Normal,
null, cacheFileDependency);
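For comparison, here is a minimal sketch of the same idea using the built-in ASP.NET cache; the key and value names are carried over from the example above, so treat them as placeholders.
using System.Web;
using System.Web.Caching;

// The entry is evicted automatically when the dependency file changes.
HttpRuntime.Cache.Insert(
    cacheName,                                        // cache key, as above
    cacheValueList,                                   // value to cache
    new CacheDependency(@"\\mynetworkpath\abc.txt")); // file dependency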
You do not understand the way ASP.NET works, I suppose. Here is my view; I hope it helps you understand the problem and find an appropriate solution.
If you instantiate a FileSystemWatcher in a server-side control, that does not mean single-instance usage; it only means that the particular client accessing your web form will be granted (not guaranteed, actually) at least one thread. It also means that IIS may have fewer threads available than the clients require.
Additionally, this means you cannot effectively use the Singleton pattern, nor Session storage or cookies (they break down in the case of web-farm async calls).
The only really workable option is to implement a web service that accumulates changes, deletions, and additions under a specified server path (not a physical path, but an IIS (or any web server, e.g. Mono) path such as \etc\bin\dlls\, which can be translated to a physical server path using the web-related HTTP utility classes in the BCL), and to query the required information periodically.
If you want to obtain this information directly, that will not be possible, because there will always be gaps between monitoring instances in which the folder is in an uncontrolled state: for example, between one client thread shutting down and the next client connecting to the server, or when a user uploads a big file on a page and then hits F5 to refresh, but the page has gone to an expired state (meaning the meaningful state of the page for the client is lost, possibly because the client's consumer thread on the server finished its work or moved on to the next client in the queue).
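As a rough sketch of that idea (illustrative only: the class and method names are mine, and it assumes the DLLs live under the application's bin folder), the virtual path can be resolved with HostingEnvironment.MapPath and scanned on a schedule instead of being watched from a request thread:
using System;
using System.IO;
using System.Web.Hosting;

public static class DllChangeScanner
{
    // Returns the most recent write time of any DLL under ~/bin.
    // A background job can compare this value between polls and
    // refill the cache whenever it moves forward.
    public static DateTime GetLatestDllWriteUtc()
    {
        string bin = HostingEnvironment.MapPath("~/bin");
        DateTime latest = DateTime.MinValue;
        foreach (string dll in Directory.GetFiles(bin, "*.dll"))
        {
            DateTime writeTime = File.GetLastWriteTimeUtc(dll);
            if (writeTime > latest)
                latest = writeTime;
        }
        return latest;
    }
}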
I am currently implementing file content watchers for OpenFOAM output files. These files get written by OpenFOAM in a Unix environment and consumed by my applications in a Windows environment.
Please consider my first, working watcher for convergence files (these files get updated after each iteration of the solution):
FileSystemWatcher watcher;
watcher = new FileSystemWatcher(WatchPath, "convergenceUp*.out");
watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Attributes | NotifyFilters.FileName | NotifyFilters.Size;
watcher.Changed += Watcher_Changed;
watcher.EnableRaisingEvents = true;
private void Watcher_Changed(object sender, FileSystemEventArgs e)
{
Files = Directory.GetFiles(WatchPath, "convergenceUp*.out").OrderBy(x => x).ToList(); // Update List of all files in the directory
ReadFiles(); // Do fancy stuff with the files
}
This works as expected. Every time a file matching the pattern changes in the watched directory (Notepad++ notifies me that the file has changed as well), the files are processed.
Moving on from this simple "all files are in one directory" scenario, I started to build a watcher for a different type of file (force function objects, for those familiar with OpenFOAM). These files are saved in a hierarchical folder structure like this:
NameOfFunctionObject
|_StartTimeOfSolutionSetup#1
| |_forces.dat
|_StartTimeOfSolutionSetup#2
|_forces.dat
My goal is to read every forces.dat under "NameOfFunctionObject" and do some trickery with all the contained data. Additionally, I'd like the option of reading and watching just one file. So my implementation (which borrows heavily from the above) currently looks like this:
FileSystemWatcher watcher;
if (isSingleFile)
watcher = new FileSystemWatcher(Directory.GetParent(WatchPath).ToString(), Path.GetFileName(WatchPath));
else
watcher = new FileSystemWatcher(WatchPath, "forces.dat");
watcher.IncludeSubdirectories = !isSingleFile;
watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Attributes | NotifyFilters.FileName | NotifyFilters.Size | NotifyFilters.DirectoryName | NotifyFilters.LastAccess | NotifyFilters.CreationTime | NotifyFilters.Security;
watcher.Changed += Watcher_Changed;
watcher.Created += Watcher_Created;
watcher.Deleted += Watcher_Deleted;
watcher.Error += Watcher_Error;
watcher.Renamed += Watcher_Renamed;
watcher.EnableRaisingEvents = isWatchEnabled;
So depending on whether I want to watch just one file or multiple files, I set up the directory to watch and the file filter accordingly. If I watch multiple files, I set the watcher to watch subdirectories as well. For the sake of vigorous testing I filter on all notifications and catch all watcher events.
If I test the single-file option, everything works as expected: changes to the file are reported and processed correctly (again, the check with trusty old Notepad++ works).
On testing the multi-file option, though, things go pear-shaped.
The file paths are correct and the initial read works as expected, but no watcher event ever fires. Here comes the curious bit: Notepad++ still beeps away, saying the file has changed, and Windows Explorer shows a new file date and a new file size. If I save the file from within Notepad++, the watcher is triggered. If I create a new file matching the pattern inside the watched directory (top level or below does not matter!), the watcher is triggered. Even watching with a filter of *.* to catch the creation of temporary files does not trigger anything, so it is safe to assume that no temporary files are created.
In general, the watcher behaves as expected: it can detect changes to a single file, and it can detect the creation of files in the root watched folder and its subfolders. It just fails to recognise non-Windows changes to a file once that file is located in a subfolder. Is this behaviour by design? And more importantly: how can I work around it elegantly without resorting to a timer and polling by hand?
I think this might be relevant to you.
FileSystemWatcher uses the ReadDirectoryChangesW Win32 API call with a few relevant flags:
When you first call ReadDirectoryChangesW, the system allocates a buffer to store change information. This buffer is associated with the directory handle until it is closed, and its size does not change during its lifetime. Directory changes that occur between calls to this function are added to the buffer and then returned with the next call. If the buffer overflows, the entire contents of the buffer are discarded.
The analogue in FileSystemWatcher is the FileSystemWatcher.InternalBufferSize property:
Remarks: You can set the buffer to 4 KB or larger, but it must not exceed 64 KB. If you try to set the InternalBufferSize property to less than 4096 bytes, your value is discarded and the InternalBufferSize property is set to 4096 bytes. For best performance, use a multiple of 4 KB on Intel-based computers.
The system notifies the component of file changes, and it stores those changes in a buffer the component creates and passes to the APIs. Each event can use up to 16 bytes of memory, not including the file name. If there are many changes in a short time, the buffer can overflow. This causes the component to lose track of changes in the directory, and it will only provide blanket notification. Increasing the size of the buffer can prevent missing file system change events. However, increasing buffer size is expensive, because it comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small as possible. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties to filter out unwanted change notifications.
If worse comes to worst, you can use a mix of polling and event tracking; it has helped me out of trouble a few times.
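As an illustrative sketch of that advice, reusing WatchPath and ReadFiles from the question, you could enlarge the buffer and treat an overflow as a cue to rescan by hand:
using System.IO;

var watcher = new FileSystemWatcher(WatchPath, "forces.dat")
{
    IncludeSubdirectories = true,
    InternalBufferSize = 64 * 1024, // documented maximum; use multiples of 4 KB
    NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size
};
watcher.Error += (sender, e) =>
{
    // The buffer overflowed and events were dropped: fall back to a full rescan.
    if (e.GetException() is InternalBufferOverflowException)
        ReadFiles();
};
watcher.EnableRaisingEvents = true;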
I am following this example of FileSystemWatcher. On top of it, I have created a Windows Forms application which opens whenever any .txt file is created or renamed on the Z: drive.
I have built the console application and deployed it to two systems, and both systems are listening to the same network drive (I have mapped a network drive as Z: on both systems).
However, the problem is that whenever I create or rename a .txt file on the network drive, both systems' forms open, which is logical, since both deployed console applications are listening to the same location.
But my requirement is: "The form should be opened only on the system that performs the action of creating or renaming that .txt file."
Is there any way I can achieve this, or is this even possible with the FileSystemWatcher class?
Here is the code snippet.
public class Watcher
{
public static void Main()
{
Run();
}
[PermissionSet(SecurityAction.Demand, Name = "FullTrust")]
public static void Run()
{
string[] args = System.Environment.GetCommandLineArgs();
FileSystemWatcher watcher = new FileSystemWatcher("Z:\\", "*.txt");
watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName;
watcher.IncludeSubdirectories = true;
// Add event handlers.
//watcher.Changed += new FileSystemEventHandler(OnChanged); //Fires every time the file is changed (multiple times during a copy operation)
watcher.Created += new FileSystemEventHandler(OnChanged);
watcher.Deleted += new FileSystemEventHandler(OnChanged);
watcher.Renamed += new RenamedEventHandler(OnRenamed);
// Begin watching.
watcher.EnableRaisingEvents = true;
// Wait for the user to quit the program.
Console.WriteLine("Press \'q\' to quit the sample.");
while (Console.Read() != 'q') ;
}
// Define the event handlers.
private static void OnChanged(object source, FileSystemEventArgs e)
{
// Specify what is done when a file is changed, created, or deleted.
Console.WriteLine("File: " + e.FullPath + " " + e.ChangeType);
Application.EnableVisualStyles();
Application.Run(new Feedback.Form1(e.FullPath));//Here I am opening new form for feedback
}
private static void OnRenamed(object source, RenamedEventArgs e)
{
// Specify what is done when a file is renamed.
Console.WriteLine("File: {0} renamed to {1}", e.OldFullPath, e.FullPath);
Application.EnableVisualStyles();
Application.Run(new Feedback.Form1(e.FullPath));//Here I am opening new form for feedback
}
}
FileSystemWatcher may notify you that something happened, and you might also be able to deduce what happened, but don't count on it. It is quite a limited and unreliable component in my (and others') experience, so if there is any chance of even moderate contention on the target folder, I would use some kind of polling solution instead of the file watcher.
That said, it won't tell you who made the change. Once you have deduced what changed, you need to take additional steps for the "who" part. The filesystem stores quite sparse information; you won't find any source-machine info in it. You could try mapping the file shares that produce these changes with different users, as you may then be able to deduce the modifying system from that:
Finding the user who modified the shared drive folder files.
If that is not an option, other solutions are much more complicated.
If you have access to the server hosting Z: you could turn on the file audit log for that resource and deduce which machine it was from the event log (event IDs 4663 / 5145). The source machine name is logged in this case. It should be a breeze to enable if it's a Windows server (directory Properties/Security/Advanced/Audit), but reading and synchronizing the logs is more complicated.
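If you go the audit-log route, reading those events back could look roughly like this; this is only a sketch, and it assumes auditing is already enabled and that the code runs with permission to read the Security log on the server:
using System;
using System.Diagnostics.Eventing.Reader;

class AuditLogScan
{
    static void Main()
    {
        // Query object-access (4663) and detailed file share (5145) events.
        var query = new EventLogQuery("Security", PathType.LogName,
            "*[System[(EventID=4663 or EventID=5145)]]");
        using (var reader = new EventLogReader(query))
        {
            for (EventRecord record = reader.ReadEvent();
                 record != null;
                 record = reader.ReadEvent())
            {
                // The rendered description includes the account and,
                // for 5145, the source machine of the access.
                Console.WriteLine(record.FormatDescription());
                record.Dispose();
            }
        }
    }
}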
If none of the solutions above is possible, you may be able to implement a user-space filesystem to proxy your file share, using something like Dokan. Source processes would map to your application instead of the file share; that way you could raise your own events, or just write a detailed audit log to a database or whatever, and then forward the actual commands to the file share. A very expensive and non-trivial solution, though. But probably very fun.
FileSystemWatcher gives you notifications of file changes.
If you want to use the file system for per-machine notifications, you'll need to create an isolated folder for each instance.
Something like :
Z:\Machine1\
Z:\Machine2\
Another option is to check who owns or created the file, but that can get really complicated in domain setups.
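A minimal sketch of the isolated-folder idea, with each machine watching a subfolder named after itself; the Z:\MachineName layout is an assumption for illustration, and OnChanged/OnRenamed are the handlers from the question:
using System;
using System.IO;

// Each machine only watches (and writes to) its own folder, so the form
// opens only on the machine that performed the action.
string machineFolder = Path.Combine(@"Z:\", Environment.MachineName);
Directory.CreateDirectory(machineFolder); // no-op if it already exists

FileSystemWatcher watcher = new FileSystemWatcher(machineFolder, "*.txt");
watcher.NotifyFilter = NotifyFilters.FileName;
watcher.Created += new FileSystemEventHandler(OnChanged);
watcher.Renamed += new RenamedEventHandler(OnRenamed);
watcher.EnableRaisingEvents = true;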
I'm working on a service where, in my OnStart method, I have the following lines of code to set up my FileSystemWatcher:
Log.Info($"File location {_location}");
var watcher = new FileSystemWatcher(_location);
watcher.Changed += new FileSystemEventHandler(OnChanged);
Then in my OnChanged method I am wanting to start a timer like so:
private void OnChanged(object source, FileSystemEventArgs e)
{
Log.Info($"A file has been placed in {_location} starting timer");
OnTimer(null, null); //run immediately at startup
StartEventTimer();
}
The timer code works, so I know that isn't the issue; likewise, my log shows it is checking the correct location. What is it that I'm missing?
All I want my code to do is trigger my timer the moment a file is placed in my target location, yet I've not been able to manage it. Am I correct that I should be using FileSystemWatcher to do this, or should I use something else, given that this code runs inside a service?
You might well find that the Changed event fires more than once for a new file, a common problem with some applications, which can create unwanted side effects later on. Have a look, and try switching to Created instead.
If you're looking for a new file appearing in a folder, you should use:
watcher.NotifyFilter = NotifyFilters.FileName;
watcher.Created += OnCreated;
Here is a Gist demonstrating it firing twice using Changed on LastWrite, and, for predictable behaviour, a Gist demonstrating a single fire on file creation using Created and NotifyFilters.FileName.
Just run it up in a Console App and copy a file into c:\temp.
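Something along these lines, as a minimal console sketch of that setup (c:\temp is just the test folder suggested above):
using System;
using System.IO;

class Program
{
    static void Main()
    {
        using (var watcher = new FileSystemWatcher(@"c:\temp"))
        {
            watcher.NotifyFilter = NotifyFilters.FileName; // react to new names only
            watcher.Created += (sender, e) =>
                Console.WriteLine("Created: " + e.FullPath); // fires once per new file
            watcher.EnableRaisingEvents = true;

            Console.WriteLine(@"Copy a file into c:\temp, then press Enter to quit.");
            Console.ReadLine();
        }
    }
}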
There are a couple of things it could be, based on what you've said.
The first thing of note is that var watcher is declared as a local variable, so it goes out of scope when OnStart() exits and the watcher becomes eligible for garbage collection. You'll need to move the declaration out to class level.
The second item of interest is that EnableRaisingEvents doesn't appear to be set. A working example of the FileSystemWatcher is below.
public class SomeService
{
    private FileSystemWatcher _watcher; // class-level so it outlives OnStart()
    private string _location;           // watch folder, set elsewhere (e.g. from config)

    public void OnStart()
    {
        // set up the watcher
        _watcher = new FileSystemWatcher(_location);
        _watcher.NotifyFilter = NotifyFilters.LastWrite;
        _watcher.Filter = "*.*";
        _watcher.Changed += new FileSystemEventHandler(OnChanged);
        _watcher.EnableRaisingEvents = true;
    }
}
EDIT
As Ben Hall mentioned, it is possible for multiple events to be raised for the same file when a file is moved into the folder. As per the MSDN documentation:
Common file system operations might raise more than one event. For example, when a file is moved from one directory to another, several OnChanged and some OnCreated and OnDeleted events might be raised. Moving a file is a complex operation that consists of multiple simple operations, therefore raising multiple events. Likewise, some applications (for example, antivirus software) might cause additional file system events that are detected by FileSystemWatcher.
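If those duplicates turn out to be a problem, one common workaround (a sketch, not part of the answer above) is to debounce events by path and time:
using System;
using System.Collections.Concurrent;
using System.IO;

public static class DebouncedHandler
{
    // Watcher events arrive on thread-pool threads, hence the concurrent map.
    private static readonly ConcurrentDictionary<string, DateTime> LastSeen =
        new ConcurrentDictionary<string, DateTime>();

    public static void OnChanged(object sender, FileSystemEventArgs e)
    {
        DateTime now = DateTime.UtcNow;
        if (LastSeen.TryGetValue(e.FullPath, out DateTime last) &&
            (now - last).TotalMilliseconds < 500) // 500 ms window, chosen arbitrarily
        {
            return; // duplicate of an event we just handled
        }
        LastSeen[e.FullPath] = now;

        // ...actual processing goes here...
    }
}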
I need to detect when either of two file types is accessed in any way across an entire Windows file system.
As I understand it, the only way to do this without causing serious slowdowns for the operating system is to create a file system filter driver?
Essentially all I need to do is take a copy of any doc(x) and pdf files that are opened. I decided on this approach as it was either that or use file monitors in C#, which wouldn't be effective for an entire drive.
My question is twofold: is there an easier way, and secondly, how would I go about taking a copy of each doc(x)/pdf file as it's accessed?
The solution needs to be deployable with the package we're currently producing.
UPDATE
I'm going to benchmark the file system watcher. After discussing it with people here, I think it may be acceptable; my concern is that I need to monitor the common user directories where downloads occur (so "C:\Users\SomeUser*" as well as the Outlook temporary folder).
You will need to create a file system watcher. Here is a code example that watches for changes to docx files.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Security.Permissions;
namespace filewatchtest
{
class Program
{
static void Main(string[] args)
{
Run();
}
[PermissionSet(SecurityAction.Demand, Name="FullTrust")]
public static void Run()
{
string[] args = System.Environment.GetCommandLineArgs();
// if directory not specified then end program
if (args.Length != 2)
{
Console.WriteLine("Usage: filewatchtest.exe directory");
return;
}
// create a new fileSystemWatcher and set its properties
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = args[1];
// set the notify filters
watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite | NotifyFilters.FileName | NotifyFilters.DirectoryName;
// set the file extension filter
watcher.Filter = "*.docx";
// add event handlers
watcher.Changed += new FileSystemEventHandler(OnChanged);
watcher.Created += new FileSystemEventHandler(OnChanged);
watcher.Deleted += new FileSystemEventHandler(OnChanged);
watcher.Renamed += new RenamedEventHandler(OnRenamed);
// begin watching
watcher.EnableRaisingEvents = true;
// wait for the user to quit the program
Console.WriteLine("Plress q to quit the program");
while (Console.Read()!='q');
}
static void OnRenamed(object sender, RenamedEventArgs e)
{
Console.WriteLine("File: {0} renamed to {1}", e.OldFullPath, e.FullPath);
}
static void OnChanged(object sender, FileSystemEventArgs e)
{
Console.WriteLine("File:" + e.FullPath + " " + e.ChangeType);
}
}
}
I think that creating a copy on read will cause a lot of problems. For instance: virus scanners. Consider the following:
I open file "test.pdf"
Your program creates "test_copy.pdf"
Virus scanner detects new file and checks (reads) "test_copy.pdf"
Your program detects read access, and creates "test_copy_copy.pdf"
Virus scanner...
Now of course you could create the copies with a different extension to prevent this, but there will still be a lot of read actions on files. I sometimes open a file ten times in a row, just because I closed it accidentally or want to recheck something I just read. Will you now have ten copies?
I would definitely go with Hans Passant's suggestion of creating a copy on change/create. That happens a lot less, by definition: you always need to open a file to alter it, but you don't have to alter it when you open it.
The second problem is detecting a read of a file. With docx you could check for the creation of hidden files like '~$_____.docx', but that doesn't work for PDF. Also, as you mentioned, you will have to check an entire disk; there is no way around it. If a file can be in any folder, you'll have to check all the folders. Keeping an internal list of docx and PDF files in a service could be faster, but since you'd have to loop through every file again at set intervals, it depends on how many files are on the system.
So if you really need to detect read access, a file system filter driver is all you've got. But since it will be called on every file access, causing problems or slowing the system down would be a major concern.
If you still want to, check out this File System Filter Driver Tutorial to learn how to do it. Personally, I wouldn't go there.
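To make the copy-on-change idea concrete, here is a rough sketch that copies into a mirror folder outside the watched tree, which avoids the feedback loop described above. The paths are illustrative, and only *.docx is shown (a second watcher would be needed for *.pdf):
using System;
using System.IO;

class CopyOnChange
{
    static void Main()
    {
        string mirror = @"D:\mirror"; // outside the watched tree on purpose
        Directory.CreateDirectory(mirror);

        var watcher = new FileSystemWatcher(@"C:\Users", "*.docx")
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName
        };
        watcher.Created += (s, e) => TryCopy(e.FullPath, mirror);
        watcher.Changed += (s, e) => TryCopy(e.FullPath, mirror);
        watcher.EnableRaisingEvents = true;

        Console.ReadLine(); // keep the process alive
    }

    static void TryCopy(string source, string mirror)
    {
        try
        {
            File.Copy(source, Path.Combine(mirror, Path.GetFileName(source)), true);
        }
        catch (IOException)
        {
            // The writing application may still hold the file;
            // a real implementation would retry after a short delay.
        }
    }
}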
From what I read in the comments, a File System Watcher would probably work well. I am not exactly sure whether Search Everything uses one, but if it does, I cannot notice any impact.
Another option might be ETW - Windows Event Tracing as used by Process Monitor. Even with millions of changes, I can also hardly notice the impact.
If you want to go for Volume Shadow Copies as proposed by Hans Passant, Alpha Volume Shadow Copies might be a suitable library offering support for them.
Conclusion: a filter driver is probably not needed, and skipping it keeps you away from other problems, although I admit that the description of hierarchical storage management systems might match your approach, if you think of the upload store as the next hierarchy level after the hard disk.
I have an application that launches other applications and then waits for them to create a specific data file (it watches one application at a time). Each time an application is launched, it watches a specific directory for a specific file to be created. I am using the FileSystemWatcher to do this (set it to the directory, then filter for the correct file name). This works great the first time (always), but the event never fires for the second application launched. The only way it seems to fire is if I place a breakpoint in the event handler, or if I have a Thread.Sleep command in the event handler. This seems very strange to me... is there some race condition that I'm not aware of? Here is the code. Notice the Thread.Sleep(500): with this line the code works every time; without it, it fails. I'm really not comfortable relying on a Sleep command, and I'm not sure what conditions might cause it to stop working too.
public static void watchFiles(string path)
{
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = path;
watcher.Created += new FileSystemEventHandler(watcher_Handler);
watcher.EnableRaisingEvents = true;
}
public static void watcher_Handler(object sender, FileSystemEventArgs e)
{
//Hack - the sleep allows the second and third application to be caught by this event
Thread.Sleep(500);
switch (e.ChangeType)
{
    case WatcherChangeTypes.Changed:
        break;
    case WatcherChangeTypes.Deleted:
        break;
    case WatcherChangeTypes.Created:
        if (e.Name == "log.dat")
        {
            parseDataFile();
            moveHTMLtoLMS();
        }
        break;
    default:
        break;
}
}
Anyone know why I need to have that Sleep (or break-point) to get the code to work a second time?
According to the documentation of the System.IO.FileSystemWatcher class:
The Windows operating system notifies your component of file changes in a buffer created by the FileSystemWatcher. If there are many changes in a short time, the buffer can overflow. This causes the component to lose track of changes in the directory, and it will only provide blanket notification. Increasing the size of the buffer with the InternalBufferSize property is expensive, as it comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small yet large enough to not miss any file change events. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties so you can filter out unwanted change notifications.
It might be that the events aren't being consumed fast enough and the internal buffer isn't large enough to handle all the notifications. By default, the watcher handles FileName, DirectoryName, and LastWrite notifications, yet you only consume creation events (both file and directory). Are your applications launched in quick succession? I'd try putting a delay between the invocations of your applications (instead of in the event handler), using more specific filters (just the FileName notification, or watching only for log files using the Filter property), increasing the internal buffer size, or any combination of the above. I think that should fix your problem.
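Applied to the watcher from the question, those suggestions might look like this; a sketch only, reusing the log.dat name from the handler above and keeping the watcher in a static field so it is not collected:
using System.IO;

public class LogWatcher
{
    private static FileSystemWatcher watcher; // class-level reference

    public static void watchFiles(string path)
    {
        watcher = new FileSystemWatcher(path);
        watcher.Filter = "log.dat";                    // only the file the handler parses
        watcher.NotifyFilter = NotifyFilters.FileName; // name events only
        watcher.InternalBufferSize = 32 * 1024;        // a multiple of 4 KB
        watcher.Created += new FileSystemEventHandler(watcher_Handler);
        watcher.EnableRaisingEvents = true;
    }

    private static void watcher_Handler(object sender, FileSystemEventArgs e)
    {
        // handle the Created event for log.dat (see the question's handler)
    }
}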
public static void watchFiles(string path)
{
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = path;
watcher.Created += new FileSystemEventHandler(watcher_Handler);
watcher.EnableRaisingEvents = true;
}
The watcher variable is eligible for garbage collection at the end of this method. Instead of being a local variable, make it a class-level member as such:
private static FileSystemWatcher watcher;
public static void watchFiles(string path)
{
if (watcher != null)
{
watcher.EnableRaisingEvents = false;
watcher.Created -= new FileSystemEventHandler(watcher_Handler);
}
watcher = new FileSystemWatcher();
watcher.Path = path;
watcher.Created += new FileSystemEventHandler(watcher_Handler);
watcher.EnableRaisingEvents = true;
}
You are listening to only the "Created" event. You need to listen to the other ones too - OnChanged, OnDeleted - http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
EDIT: Most programs will not "create" a file when one already exists. You can use FileMon (now Process Monitor - http://technet.microsoft.com/en-us/sysinternals/bb896645) to see what operations each program performs on your file.
I'm facing the exact same problem here (running Windows XP). Your hack solves the problem. I would like to add some notes that might be relevant.
In my case the filename is always the same: C:\blah.txt is created, deleted, created and so forth. Also, I'm using a trick to hide my application:
Integrator.StartMonitor(); // Start the file monitor!
Form f = new Form();
f.ShowInTaskbar = false;
f.ShowIcon = false;
f.StartPosition = FormStartPosition.Manual;
f.Location = new Point(-32000, -32000);
f.Show();
f.Hide();
Application.Run();
My file watcher works in debug mode or when I add your sleep hack. It certainly looks like a bug in FileSystemWatcher.