I have about 5-6 Server Manager programs that each write their own configuration file out to a particular folder, such as C:\ACME. The config files all end with "*ServerConfig.cfg", where * is the name of the program that created it.
I have a Windows service with a FileSystemWatcher set up so that I can FTP each configuration file whenever a program updates it. I've gotten everything to work, but I'm noticing that the different Server Manager programs behave differently.
When saving a configuration file, the FileSystemWatcher is picking up two "change" events. This is causing my program to FTP the configuration file twice where I only need it once.
In other instances I'm seeing where it may create 4, 5, or 6 "change" events when saving a configuration file.
What is the best way to handle processing/FTPing these files exactly once, when they are really done saving?
I really don't want to set something up to poll the directory for file changes every so often... and I do like the idea that each time a configuration is saved, a duplicate copy, with a date/timestamp appended to the filename, is copied elsewhere.
I have seen lots of suggestions Googling around and even here on Stack Overflow, but nothing that seems to be all-in-one for me.
I suppose I could put the filename in a queue when a "change" event occurs, if it isn't already in the queue. Not sure if this is the best approach.
Here is my sample code:
Startup-code:
private DateTime _lastTimeFileWatcherEventRaised = DateTime.Now;
_watcherCFGFiles = new FileSystemWatcher();
_watcherCFGFiles.Path = @"C:\ACME";
_watcherCFGFiles.IncludeSubdirectories = true;
_watcherCFGFiles.Filter = "*ServerConfig.cfg";
_watcherCFGFiles.NotifyFilter = NotifyFilters.Size;
//_watcherCFGFiles.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.FileName;
_watcherCFGFiles.Changed += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Created += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Deleted += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Renamed += new RenamedEventHandler(LogFileSystemRenaming);
_watcherCFGFiles.Error += new ErrorEventHandler(LogBufferError);
_watcherCFGFiles.EnableRaisingEvents = true;
Here is the actual handler for the "change" event. I'm skipping the first "change" event if the second arrives within 700 ms. But this doesn't account for the files that raise 3-4 change events...
void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    string log = string.Format("{0} | {1}", e.FullPath, e.ChangeType);
    if (e.ChangeType == WatcherChangeTypes.Changed)
    {
        if (DateTime.Now.Subtract(_lastTimeFileWatcherEventRaised).TotalMilliseconds < 700)
        {
            return;
        }
        _lastTimeFileWatcherEventRaised = DateTime.Now;
        LogEvent(log);
        // Process file
        FTPConfigFileUpdate(e.FullPath);
    }
}
I had the exact same issue. I used a hash map that mapped filenames to write times, and used it as a lookup table to check whether a change event for that file had fired very recently. I defined some epsilon (for me it was about 2 seconds, to make sure events were flushed); if the time found in the map was older than that, I put the file on a queue to be processed. Essentially all I had to do was keep the map up to date with events and changes, and this worked out (although you may want to tune your epsilon value depending on your application).
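A minimal sketch of that approach in C# (the `process` delegate and the 2-second epsilon are illustrative assumptions, and you would want a lock around the dictionary if watcher events can fire concurrently):

```csharp
using System;
using System.Collections.Generic;

// Debounce duplicate FileSystemWatcher events: only process a file if no
// event for it has been seen within the last `epsilon` interval.
class ChangeDebouncer
{
    private readonly Dictionary<string, DateTime> _lastEvent =
        new Dictionary<string, DateTime>();
    private readonly TimeSpan _epsilon;
    private readonly Action<string> _process;

    public ChangeDebouncer(TimeSpan epsilon, Action<string> process)
    {
        _epsilon = epsilon;
        _process = process;
    }

    // Wire this into the watcher's Changed handler: OnChanged(e.FullPath)
    public void OnChanged(string fullPath)
    {
        DateTime now = DateTime.Now;
        DateTime last;
        bool seenRecently = _lastEvent.TryGetValue(fullPath, out last)
                            && now - last < _epsilon;
        _lastEvent[fullPath] = now;   // always record the newest event
        if (!seenRecently)
        {
            _process(fullPath);       // e.g. queue the file for FTP
        }
    }
}
```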
This behavior is normal, because the antivirus or other programs perform extra writes when a file's content changes. I usually create a (global) hash table and check whether the filename exists in it; if it doesn't, I put the filename in it and start an asynchronous operation to remove the filename after 3-5 seconds.
This is expected behavior - so you need to figure out how to handle it in your particular case.
The file system does not have a concept of "this program is done working with this file". I.e. one could write an editor that updates (opens/writes/closes) the file on every keystroke. The file system will report a lot of updates, but from the user's point of view there is only one update, when the editor is closed.
Related
My Windows service application writes logs to files in a folder every second. I want to monitor the logging activity: if there is no logging for some time, it should signal an error.
Below is an example of the log file names. Only the latest log file is written to at a time; the other files are not. The monitor app cares only about the last log file.
MyApplication.MachineName.2018-06-05.log
MyApplication.MachineName.2018-06-04.001.log
Below is the code that monitors the log activities.
private void WatchFileChanges()
{
    // Create a new FileSystemWatcher and set its properties.
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.Path = @"C:\Logs";
    /* Watch for changes in LastWrite times */
    watcher.NotifyFilter = NotifyFilters.LastWrite;
    // Only watch log files.
    watcher.Filter = "*.log";
    // Add event handlers.
    watcher.Changed += new FileSystemEventHandler(OnChanged);
    watcher.Created += new FileSystemEventHandler(OnChanged);
    // Begin watching.
    watcher.EnableRaisingEvents = true;
}
// Define the event handlers.
private void OnChanged(object source, FileSystemEventArgs e)
{
    Console.WriteLine("File: " + e.FullPath + " " + e.ChangeType);
}
Question:
Is there a way to reduce the number of log files being monitored in the folder? For example, monitor only today's log files, or the last two days', rather than all files in the folder. Because the internal buffer is limited, an excess of log files might cause the monitoring to stop working.
You need to get a little bit creative here, and the simplest approach is just the FileSystemWatcher.Filter property:
Gets or sets the filter string used to determine what files are
monitored in a directory.
Remarks
To watch changes in all files, set the Filter property to an empty
string (""). To watch a specific file, set the Filter property to the
file name. For example, to watch for changes in the file MyDoc.txt,
set the Filter property to "MyDoc.txt". You can also watch for changes
in a certain type of file. For example, to watch for changes in any
text files, set the Filter property to "*.txt". Use of multiple
filters such as "*.txt|*.doc" is not supported.
Some further examples
*.* : All files (default). An empty string ("") also watches all files.
*.txt : All files with a "txt" extension.
*recipe.doc : All files ending in "recipe" with a "doc" extension.
win*.xml : All files beginning with "win" with an "xml" extension.
Sales*200?.xls :
Matches the following:
Sales July 2001.xls
Sales Aug 2002.xls
Sales March 2004.xls
but does not match:
Sales Nov 1999.xls
MyReport.Doc : Watches only MyReport.doc
Now with this information you can determine whether you can create a filter for the current day's logs; if you can, then you can change the filter dynamically each day to target those logs. As noted in the documentation:
The Filter property can be changed after the FileSystemWatcher object
has started receiving events.
Or as mentioned in the comments,
Either put the days logs in a different folder
If you are using a logging framework name your logs different for the current day so they "are" targetable
Or increase the FileSystemWatcher buffer and monitor everything
This is your mission, if you choose to accept it.
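A rough sketch of the dynamic-filter idea, assuming the date-stamped file naming shown in the question (the exact pattern and the midnight rescheduling timer are assumptions, not part of the original answer):

```csharp
using System;
using System.IO;
using System.Threading;

class DailyLogWatcher
{
    private readonly FileSystemWatcher _watcher;
    private Timer _midnightTimer;

    public DailyLogWatcher(string path)
    {
        _watcher = new FileSystemWatcher(path);
        _watcher.NotifyFilter = NotifyFilters.LastWrite;
        UpdateFilter();
        _watcher.EnableRaisingEvents = true;
        ScheduleNextUpdate();
    }

    // Narrow the watcher to today's log files only, e.g.
    // "MyApplication.MachineName.2018-06-05*.log".
    private void UpdateFilter()
    {
        _watcher.Filter = string.Format(
            "MyApplication.{0}.{1:yyyy-MM-dd}*.log",
            Environment.MachineName, DateTime.Today);
    }

    // Re-apply the filter shortly after midnight; Filter may be changed
    // while the watcher is already raising events.
    private void ScheduleNextUpdate()
    {
        TimeSpan untilMidnight =
            DateTime.Today.AddDays(1).AddSeconds(1) - DateTime.Now;
        _midnightTimer = new Timer(_ =>
        {
            UpdateFilter();
            ScheduleNextUpdate();
        }, null, untilMidnight, Timeout.InfiniteTimeSpan);
    }
}
```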
I have a problem with the FileSystemWatcher in C#.
I watch a file used by another program.
That is not the problem. The problem is that the only value of the file that changes is its size; the other program writes the file without updating the change or write date.
And the size value only updates when (Windows 7) Explorer refreshes (F5, or clicking on the file).
FileSystemWatcher fileWatcher = new FileSystemWatcher();
fileWatcher.Changed += new FileSystemEventHandler(fileWatcher_Changed);
fileWatcher.Path = Path.GetDirectoryName(path); // get directory of file path.
fileWatcher.Filter = Path.GetFileName(path);    // only this file
fileWatcher.NotifyFilter = NotifyFilters.Size;  // and maybe others
fileWatcher.EnableRaisingEvents = true;

private void fileWatcher_Changed(object sender, FileSystemEventArgs e)
{
    // ...
}
I guess this problem can only be solved by polling, because the file has to be touched for its file-info data to refresh.
But I hope for another solution without polling.
About my application:
Reading a file which is in use by another program. Getting the text of the file is possible with a FileStream and FileShare.ReadWrite; that works fine. I want to update the textbox (by reading the file) when the file has changed. But the only value that updates while the other program is accessing it is the file size, and only when Explorer refreshes or I click on the file. That is the issue of this question. If the problem is unsolvable, the alternative is: re-read the file content every x time (polling), without a file watcher.
Perhaps the following would solve your problem and avoid polling. Try using WMI, querying the root\cimv2 namespace with something like:
SELECT FileSize FROM CIM_DataFile WHERE Name = '_your_path_'
It might differ slightly when applied from C#, but along those lines. Regarding WMI, there are lots of tutorials on how to set up a WMI listener in .NET; search around.
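A sketch of what such a WMI listener might look like in C# (the watched path is hypothetical; also note that WMI intrinsic events are themselves polled internally by WMI at the WITHIN interval, so this moves the polling out of your code rather than truly eliminating it):

```csharp
using System;
using System.Management; // reference System.Management.dll

class WmiFileSizeWatcher
{
    static void Main()
    {
        // Hypothetical path; WQL requires doubled backslashes.
        string wqlPath = @"C:\\Data\\watched.txt";

        // Fire an event whenever the CIM_DataFile instance for this path
        // is modified (WMI checks every 2 seconds here).
        var query = new WqlEventQuery(
            "SELECT * FROM __InstanceModificationEvent WITHIN 2 " +
            "WHERE TargetInstance ISA 'CIM_DataFile' " +
            "AND TargetInstance.Name = '" + wqlPath + "'");

        using (var watcher = new ManagementEventWatcher(@"root\cimv2", query.QueryString))
        {
            watcher.EventArrived += (s, e) =>
            {
                var file = (ManagementBaseObject)e.NewEvent["TargetInstance"];
                Console.WriteLine("New size: " + file["FileSize"]);
            };
            watcher.Start();
            Console.ReadLine(); // keep listening until Enter is pressed
            watcher.Stop();
        }
    }
}
```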
I have a situation here. I want to read files based on their creation or last-modified time. Initially I used a FileSystemWatcher so that I was notified when a new file arrived, but later I realized that if the system my software runs on goes down or restarts, files will still keep arriving in the watched location.
To make it easier to understand, I will give an example:
System A - File Server (Files are created every 2 min in a directory on this server)
System B - My Software will run and Monitor files from the Path of System A
If System B restarts and is up again after 10 min, the FileSystemWatcher will have missed all the files generated in those 10 min.
How can I ensure that the files generated in those 10 minutes are also captured?
Let me know if my question is still not understandable.
If you don't want to split it up in two systems, you have to persist a little bit of information.
You could store the current timestamp in a file every time a new event fires on the FileSystemWatcher. Every time your service starts, you can read all files from the filesystem that are newer than the last timestamp. This way you shouldn't miss a file.
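A minimal sketch of that persist-and-catch-up idea (the stamp-file location and the `process` callback are assumptions):

```csharp
using System;
using System.Globalization;
using System.IO;

// On every watcher event, persist "now"; on startup, process any files
// newer than the persisted timestamp before re-enabling the watcher.
class CatchUpScanner
{
    private readonly string _stampFile; // hypothetical location

    public CatchUpScanner(string stampFile) { _stampFile = stampFile; }

    // Call from the FileSystemWatcher's event handlers.
    public void RecordEvent()
    {
        File.WriteAllText(_stampFile, DateTime.UtcNow.ToString("o"));
    }

    // Call once at service startup.
    public void CatchUp(string watchedDir, Action<string> process)
    {
        DateTime last = File.Exists(_stampFile)
            ? DateTime.Parse(File.ReadAllText(_stampFile), null,
                             DateTimeStyles.RoundtripKind)
            : DateTime.MinValue;

        foreach (string file in Directory.GetFiles(watchedDir))
        {
            if (File.GetLastWriteTimeUtc(file) > last)
            {
                process(file); // handle a file missed while offline
            }
        }
    }
}
```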
I would split this application into two parts, running a FileSystemWatcher WCF service that buffers the files created in those 10 minutes and sends them to System B when it is restarted. I can't see another way, sorry.
I think the FileSystemWatcher should write info about the file system into a DB (or other type of storage). When System B starts, the watcher compares the current file system with this info and raises events about the changes.
Copy all the files from the source machine and paste them into the destination, based on a condition:
string dirPath = @"C:\A";
string DestinPath = @"C:\B";
if (Directory.Exists(dirPath) && Directory.Exists(DestinPath))
{
    DirectoryInfo di = new DirectoryInfo(dirPath);
    foreach (var file in di.GetFiles())
    {
        string destinFile = DestinPath + "\\" + file.Name;
        if (File.Exists(destinFile))
        {
            continue;
        }
        file.CopyTo(destinFile);
    }
}
Not sure if I understood your question correctly, but based on what I get, and assuming both systems are in sync in terms of time, if for example you want to get files that have been modified within the last ten minutes:
DateTime tenMinutesAgo = DateTime.Now.AddMinutes(-10);
string[] systemAFiles = System.IO.Directory.GetFiles(systemAPath);
foreach (string file in systemAFiles)
{
    DateTime lastWriteTime = System.IO.File.GetLastWriteTime(file);
    if (lastWriteTime > tenMinutesAgo) // produced after ten minutes ago
    {
        // read file
    }
}
I understood these files to be "generated", i.e. created or modified. If they have simply been moved from one folder to another, this will not work. In that case the best way is to write a snapshot of the file list (saving it to some sort of save file) and compare against it when the service runs again.
I am developing a commenting system with ASP.NET. A user can attach an image with an "Attach" button and post the comment with a "Post" button. Uploading the image starts when the user attaches it. An ASHX handler saves the uploaded file to a "temp" folder. If the user clicks "Post", I move the image into a safe place. If he doesn't click "Post", closes the browser, and goes away, the file remains in the "temp" folder. How can I delete a file from this "temp" folder one hour after it is uploaded?
Details:
I thought of using System.Timers.Timer in the ashx file used for uploading:
System.Timers.Timer timer;
string fileName;

public void Cleaner()
{
    timer = new System.Timers.Timer(3000); // 3 seconds
    timer.Elapsed += new System.Timers.ElapsedEventHandler(timer_Elapsed);
    timer.Start();
}

protected void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs a)
{
    timer.Stop();
    timer.Close();
    string path = "temp";
    string mapPath = HttpContext.Current.Server.MapPath("../" + path);
    FileInfo TheFile = new FileInfo(mapPath + "\\" + fileName);
    if (TheFile.Exists) File.Delete(mapPath + "\\" + fileName);
}

public void ProcessRequest(HttpContext context)
{
    // Saving uploaded file
    Cleaner();
}
but I feel that I am not doing it right.
The timer ticks after 3 seconds, but HttpContext.Current in the timer_Elapsed() function returns null. Besides, fileName is also null by the time the timer ticks; I couldn't find a way to pass the file name as a parameter when binding the event. Simply, it is problematic. I am looking for a more elegant way to delete the uploaded file after one hour.
I would avoid timers as you will create one timer per file which will not scale very well at all.
How about this: run a clean-up process on another thread in the web app, started at app start, that deletes temp files each time a session expires. That way you need no timers, as the process is prompted each time a session expires. You will need a class to store a reference (by unique name, I guess) to the files which are still live (by that I mean the session to which they belong is still alive), which the clean-up process can check.
LMK if you want some code pointers.
HttpContext.Current should be null, as the context dies as soon as the response is sent.
If you were using Unix, I would suggest writing a script and running it with cron. But it seems you are using Windows.
So, write a program (exe) which deletes files (even better, only image files) from the temp directory based on creation date. Google and you will find lots of tutorials on how to do it. Deleting a file is one line of code. If you are using the system temp dir, that is another line of code. If you are using a custom temp dir, you already know the path.
If you want to check the creation-time property (or last-modified-time property), you need to write a few more lines.
Now schedule the exe as required using the Windows Task Scheduler, or use one of the 3rd-party task schedulers available for Windows.
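A sketch of such a cleanup exe (the temp path here is hypothetical; schedule the exe to run every few minutes):

```csharp
using System;
using System.IO;

// Delete files older than one hour from the temp folder, judged by
// creation time; files still in use are skipped and retried next run.
class TempCleaner
{
    static void Main()
    {
        string tempDir = @"C:\inetpub\wwwroot\temp"; // hypothetical path
        DateTime cutoff = DateTime.UtcNow.AddHours(-1);

        foreach (string file in Directory.GetFiles(tempDir))
        {
            if (File.GetCreationTimeUtc(file) < cutoff)
            {
                try { File.Delete(file); }
                catch (IOException) { /* file may still be in use; skip */ }
            }
        }
    }
}
```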
I have a utility which processes a set of files in a directory. The process is relatively slow (and there are a lot of files), so I've tried to optimize it by only processing files whose "last modified" date is later than the last processing date.
Usually this works well; however, I've found that copying a file doesn't change its last-modified date, so there are various scenarios involving copied files in which certain files that have changed are skipped by the process. For example:
The user processes the directory at 9:00.
A file is then copied from this directory and modified so that it has a last modified date of 9:30
The directory is then processed again at 10:00
The modified file is then copied back into the directory at 10:30
Finally the directory is processed again at 11:00
As the modified date of the given file is 9:30, and the directory was last processed at 10:00 the file is skipped when it shouldn't be.
Unfortunately the above happens far too often in certain situations (such as in a collaborative environment with source control etc.). Clearly my logic is flawed. What I really need is a "last modified or copied" date; does such a thing exist?
Failing that, is there another way to quickly determine with reasonable reliability if a given file has changed?
You might want to look at using the FileSystemWatcher class. This class lets you monitor a directory for changes and will fire an event when something is modified. Your code can then handle the event and process the file.
From MSDN:
// Create a new FileSystemWatcher and set its properties.
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = args[1];
/* Watch for changes in LastAccess and LastWrite times, and
the renaming of files or directories. */
watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName;
// Only watch text files.
watcher.Filter = "*.txt";
// Add event handlers.
watcher.Changed += new FileSystemEventHandler(OnChanged);
watcher.Created += new FileSystemEventHandler(OnChanged);
watcher.Deleted += new FileSystemEventHandler(OnChanged);
watcher.Renamed += new RenamedEventHandler(OnRenamed);
Have you thought of computing MD5 checksums of the files and storing them for later comparison? If you're always processing a certain directory, this might be feasible.
You can use the FileInfo class to get the required change information (you might already be using it). You need to check two properties of a file: LastWriteTime and CreationTime. If either of them is later than your last processing date, you need to process the file. It is a common misconception that CreationTime is always earlier than LastWriteTime. It's not: if a file is copied to another file, the new file retains the LastWriteTime of the source, but its CreationTime is the time of the copy.
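A small sketch of that check (where `lastProcessed` is assumed to come from wherever you record the last processing date):

```csharp
using System;
using System.IO;

// Treat a file as changed if either its LastWriteTime or its CreationTime
// is later than the last processing date; this catches files copied into
// the directory (a copy resets CreationTime but preserves LastWriteTime).
static class ChangeCheck
{
    public static bool NeedsProcessing(string path, DateTime lastProcessed)
    {
        var info = new FileInfo(path);
        return info.LastWriteTime > lastProcessed
            || info.CreationTime > lastProcessed;
    }
}
```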
Have you considered adding a process to watch your directory instead? Using a FileSystemWatcher? Then you move from using a batch process and a real time system for monitoring your files.
As you've observed, copying a file onto an existing destination file keeps the existing file's CreationTime, and sets LastWriteTime to the source file's LastWriteTime rather than the system time at the moment of the copy. Two possible solutions:
Do a delete-and-copy, ensuring a destination CreationTime will be system's current time.
Check for file's Archived attribute as well, and clear it while processing. When copying source->dest, dest +A attribute will be set.
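A sketch of the second option, using the Archive attribute as a "dirty" flag (the method names here are illustrative):

```csharp
using System;
using System.IO;

// Windows sets the +A (Archive) attribute whenever a file is written or
// copied; clear it after processing so it acts as a per-file dirty flag.
static class ArchiveFlag
{
    public static bool IsDirty(string path)
    {
        return (File.GetAttributes(path) & FileAttributes.Archive) != 0;
    }

    public static void MarkProcessed(string path)
    {
        File.SetAttributes(path,
            File.GetAttributes(path) & ~FileAttributes.Archive);
    }
}
```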