Here is the scenario:
I copy a file to a folder
and I expect it to be picked up by another program.
Is there a way to monitor how long a file has existed in a folder?
I want to raise an event if the file stays in the folder for more than X time.
Please advise what I can use to accomplish that.
I had in mind using FileSystemWatcher to save the file's creation time
and then check whether the file still exists after X time.
If you only care about the timeout, then simply check for it after copying, e.g. using a Timer:
var timer = new System.Timers.Timer(5000);
timer.Elapsed += (s, a) =>
{
    // check whether the file is still there
    if (File.Exists(...))
    {
        // do something, e.g. raise your event
    }
};
timer.Start();
Depending on how precise you need to be, you could just have a process that checks the contents of the folder every so often and lets you know about any files that have been in the folder for too long.
To list files in a directory you could use Directory.GetFiles as below.
Then iterate through all the files, checking each file's age.
string[] files = System.IO.Directory.GetFiles(@"c:\temp", "*.txt", System.IO.SearchOption.TopDirectoryOnly);
foreach(string file in files)
{
if (System.DateTime.UtcNow.Subtract(System.IO.File.GetCreationTimeUtc(file)).TotalMinutes > 5)
System.Diagnostics.Debug.WriteLine("TODO: Alert, file older than 5 minutes...");
}
I have a base folder Data on a drive, and under it I have around 100 folders.
Into each folder (Folder1 ... Folder100), a 3rd-party application pushes a zip file (each zip contains 1 or more files).
I have to write a Windows service which will watch all 100 folders for file arrival.
Once a file is available, I need to extract the zip and place all the extracted files into a second folder, and I need to do this for each folder (Folder1 ... Folder100) as soon as files arrive.
The code below suggests that with C#'s FileSystemWatcher I can watch one folder at a time and act on it.
The question is: how do I watch 100 folders in parallel?
class ExampleAttributesChangedFiringTwice
{
public ExampleAttributesChangedFiringTwice(string demoFolderPath)
{
var watcher = new FileSystemWatcher()
{
Path = demoFolderPath,
NotifyFilter = NotifyFilters.LastWrite,
Filter = "*.txt"
};
watcher.Changed += OnChanged;
watcher.EnableRaisingEvents = true;
}
private static void OnChanged(object source, FileSystemEventArgs e)
{
// extract zip file, do the validation, copy file into other destination
}
}
The target folder, is it the same folder regardless of the source folder of the zip? That is, it doesn't matter if it's from Folder1 or Folder2, both will be extracted to FolderX?
The target folder is common for all: "C:\ExtractedData".
So every folder under Data will be watched? No "blacklisted" folders? What if a zip appears in Data itself instead of a subfolder? What if a new subfolder is created, should it be watched too?
The zip always arrives inside a subfolder; it will never be created directly inside the Data folder.
Yes, there is a chance that more subfolders will be added in the future and will need to be watched.
And do the extracted files go into a separate subfolder inside the target folder based on the zip filename, or do they just get extracted into the target folder? E.g., if it's A.zip, does the content go to Target\A or just to Target?
For example, if A.zip contains 2 files, "1.txt" and "2.txt", then both files go to "C:\ExtractedData". This is the same for every zip file, whichever subfolder it arrives in.
The "100 folders in parallel" part turn out to be a red herring. Since all the new zip files are treated the same regardless of where they show up, just adding IncludeSubdirectories=true is enough. Note the following codes are prone to exceptions, read the comments
class WatchAndExtract
{
string inputPath, targetPath;
public WatchAndExtract(string inputPath, string targetPath)
{
this.inputPath = inputPath;
this.targetPath = targetPath;
var watcher = new FileSystemWatcher()
{
Path = inputPath,
NotifyFilter = NotifyFilters.FileName,
//add other filters if your 3rd-party app doesn't immediately copy a new file, but instead creates and writes it
Filter = "*.zip",
IncludeSubdirectories = true
};
watcher.Created += OnCreated; //use Changed if the file isn't immediately copied
watcher.EnableRaisingEvents = true;
}
private void OnCreated(object source, FileSystemEventArgs e)
{
//add filters if you're using Changed instead
//https://stackoverflow.com/questions/1764809/filesystemwatcher-changed-event-is-raised-twice
//dispose the archive so the zip file isn't left open (requires System.IO.Compression)
using (var archive = ZipFile.OpenRead(e.FullPath))
{
archive.ExtractToDirectory(targetPath);
}
//this will throw an exception if the zip file is still being written.
//Catch it and add a delay before retrying, or wait until LastWrite events have stopped firing for a few seconds
}
}
If it skips some files, you either have too many files created at once or zips that are too big to process inside the event handler. Either increase the watcher's buffer size or hand the extraction off to a new thread. On an HDD with busy IO, or with extremely large zip files, the events might outpace what the storage can handle and files will be skipped after a prolonged busy period; in that case, consider writing the extracted files to a different physical drive (not just a different partition on the same device). Always verify against your expected usage pattern.
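As a minimal sketch of those two mitigations (the 64 KB buffer value and the Task.Run offload are illustrative choices of mine, and it assumes the same inputPath/targetPath values plus the System.IO.Compression and System.Threading.Tasks namespaces):
var watcher = new FileSystemWatcher(inputPath, "*.zip")
{
    NotifyFilter = NotifyFilters.FileName,
    IncludeSubdirectories = true,
    // larger internal buffer so bursts of events are less likely to overflow (64 KB is the maximum)
    InternalBufferSize = 64 * 1024
};
watcher.Created += (s, e) =>
{
    // hand the slow work (unzipping) off to the thread pool so the
    // watcher's callback returns quickly and can receive the next event
    Task.Run(() =>
    {
        try { ZipFile.ExtractToDirectory(e.FullPath, targetPath); }
        catch (IOException) { /* zip still being written; delay and retry */ }
    });
};
watcher.Error += (s, e) => { /* buffer overflow etc.; rescan the folders here */ };
watcher.EnableRaisingEvents = true;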
Technology Used: C#, IonicZip library.
I have a list of log files (let's say 10,000, each of a reasonable size). I have to zip these files into archives in a folder, but each zipped archive must stay approximately under 4 MB. How can I produce the minimum possible number of zip archives?
private static string ZipAndReturnFolderPath(IEnumerable<string> files, string saveToFolder)
{
int listToSkip = 0;
using (var zip = new ZipFile())
{
do
{
zip.AddFiles(files.Skip(listToSkip * 10).Take(10));
zip.Save(saveToFolder);
listToSkip++;
}
while ((new FileInfo(saveToFolder).Length < _lessThan4MB) && totalFilesRemaining > 0);
}
return saveToFolder;
}
Here, to keep it concise, I have removed a few lines of code. Parameters: files holds the paths of all remaining files to be zipped (don't worry about how I maintain that); saveToFolder is the destination path for the zip archive (it will be unique each time the function is called).
I believe this works: I have checked the files it zips and found no duplication. But zipping a batch of files, checking the size, and then repeating the process by adding the next few files to the already saved archive doesn't seem like a good approach.
Am I doing anything wrong or is there any efficient way I can achieve this?
I think this has been answered here already; ZipOutputStream could be what you're after.
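If you stay with the DotNetZip ZipFile API you're already using, one rough alternative sketch (the ZipInBatches name, the maxBytes parameter, and batching by uncompressed size are my assumptions, not from the linked answer) is to group files by their uncompressed size as a conservative bound, since the compressed archive can only be smaller, and save one archive per batch:
using System.Collections.Generic;
using System.IO;
using Ionic.Zip;

// Batch files so each archive's *uncompressed* content stays under maxBytes,
// which conservatively keeps the compressed archive under that size too.
static List<string> ZipInBatches(IEnumerable<string> files, string saveToFolder, long maxBytes)
{
    var archives = new List<string>();
    var batch = new List<string>();
    long batchSize = 0;

    void SaveBatch()
    {
        if (batch.Count == 0) return;
        string path = Path.Combine(saveToFolder, $"logs_{archives.Count}.zip");
        using (var zip = new ZipFile())
        {
            zip.AddFiles(batch);
            zip.Save(path);
        }
        archives.Add(path);
        batch.Clear();
        batchSize = 0;
    }

    foreach (var file in files)
    {
        long size = new FileInfo(file).Length;
        if (batchSize + size > maxBytes) SaveBatch();
        batch.Add(file);
        batchSize += size;
    }
    SaveBatch(); // flush the final partial batch
    return archives;
}
This never reopens an already saved archive, at the cost of slightly under-filling each one.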
I have a situation here. I want to read files based on their creation or last modified time. Initially I used FileSystemWatcher so that I was notified when a new file arrived, but later I realized that if the system on which my software is running goes down or restarts, files will still keep being dropped in that location.
To make it easier to understand, I will give an example:
System A - File Server (files are created every 2 min in a directory on this server)
System B - My software runs here and monitors files in the path on System A
If System B restarts and is up again after 10 min, the FileSystemWatcher will have missed all the files generated in those 10 min.
How can I ensure that the files generated in those 10 min are also captured?
Let me know if my question is still not understandable.
If you don't want to split it into two systems, you have to persist a little bit of information.
You could store the current timestamp in a file every time a new event is fired on the FileSystemWatcher. Every time your service starts, you read all files from the filesystem that are newer than the last stored timestamp. This way you shouldn't miss a file.
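A minimal sketch of that idea, assuming a single watched folder (the lastprocessed.txt state file name and the ProcessFile method are placeholders of mine):
using System;
using System.Globalization;
using System.IO;

class WatermarkedWatcher
{
    readonly string watchPath, statePath;

    public WatermarkedWatcher(string watchPath, string statePath)
    {
        this.watchPath = watchPath;
        this.statePath = statePath; // e.g. "lastprocessed.txt"

        CatchUp(); // handle anything that arrived while we were down

        var watcher = new FileSystemWatcher(watchPath);
        watcher.Created += (s, e) => { ProcessFile(e.FullPath); SaveWatermark(DateTime.UtcNow); };
        watcher.EnableRaisingEvents = true;
    }

    void CatchUp()
    {
        DateTime last = File.Exists(statePath)
            ? DateTime.Parse(File.ReadAllText(statePath), null, DateTimeStyles.RoundtripKind)
            : DateTime.MinValue;

        // pick up every file that appeared or changed after the stored watermark
        foreach (var file in Directory.EnumerateFiles(watchPath))
            if (File.GetLastWriteTimeUtc(file) > last)
                ProcessFile(file);

        SaveWatermark(DateTime.UtcNow);
    }

    void SaveWatermark(DateTime utc) => File.WriteAllText(statePath, utc.ToString("o"));

    void ProcessFile(string path) { /* read the file here */ }
}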
I would split this application into two parts and run a FileSystemWatcher WCF service that buffers the files created in those 10 minutes and sends them to System B when it is back up. I can't see another way, sorry.
I think the FileSystemWatcher should write information about the file system into a DB (or another type of storage). When System B starts, the watcher compares the current file system with this stored info and raises events for the changes.
Copy all the files from the source machine and paste them into the destination based on a condition:
string dirPath = @"C:\A";
string DestinPath = @"C:\B";
if (Directory.Exists(dirPath) && Directory.Exists(DestinPath))
{
DirectoryInfo di = new DirectoryInfo(dirPath);
foreach (var file in di.GetFiles())
{
string destinFile = DestinPath + "\\" + file.Name;
if (File.Exists(destinFile))
{
continue;
}
else
file.CopyTo(destinFile);
}
}
Not sure if I understood your question correctly, but based on what I gather, and assuming both systems' clocks are in sync: if, for example, you want to get files that have been modified within the last ten minutes:
DateTime tenMinutesAgo = DateTime.Now.AddMinutes(-10);
string[] systemAFiles = System.IO.Directory.GetFiles(systemAPath);
foreach (string file in systemAFiles)
{
DateTime lastWriteTime = System.IO.File.GetLastWriteTime(file);
if (lastWriteTime > tenMinutesAgo) //modified within the last ten minutes
{
//read file
}
}
I understood that these files are "generated", i.e. they have been created or modified. If they have simply been moved from one folder to another, this will not work. In that case, the best way is to take a snapshot of the file list (writing it to some sort of save file) and compare against it when the service runs again.
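A rough sketch of that snapshot idea (the snapshot file name and the newline-separated format are just illustrative assumptions):
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Compare the folder against a saved snapshot to find files that appeared
// while the service was down, then overwrite the snapshot for next time.
static IEnumerable<string> FindNewFiles(string folder, string snapshotFile)
{
    var previous = File.Exists(snapshotFile)
        ? new HashSet<string>(File.ReadAllLines(snapshotFile))
        : new HashSet<string>();

    var current = Directory.GetFiles(folder);
    var newFiles = current.Where(f => !previous.Contains(f)).ToList();

    File.WriteAllLines(snapshotFile, current); // save the new snapshot
    return newFiles;
}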
I have a function that checks every file in a directory and writes a list to the console. The problem is, I don't want it to include files that are currently being copied to the directory, I only want it to show the files that are complete. How do I do that? Here is my code:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
Console.WriteLine(file);
}
There's really no way to tell "being copied" vs "locked for writing by something". Relevant: How to check for file lock? and Can I simply 'read' a file that is in use?
If you want to simply display a list of files that are not open for writing, you can do that by attempting to open them:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
try {
using (var stream = File.Open(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)) {
Console.WriteLine(file);
}
} catch {
// file is in use
continue;
}
}
However -- lots of caveats.
Immediately after displaying the filename (end of the using block) the file could be opened by something else
The process writing the file may have used FileShare.Read which means the call will succeed, despite it being written to.
I'm not sure what exactly you're up to here, but it sounds like two processes sharing a queue directory: one writing, one reading/processing. The biggest challenge is that writing a file takes time, and so your "reading" process ends up picking it up and trying to read it before the whole file is there, which will fail in some way depending on the sharing mode, how your apps are written, etc.
A common pattern to deal with this situation is to use an atomic file operation like Move:
Do the (slow) write/copy operation to a temporary directory that's on the same file system (very important) as the queue directory
Once complete, do a Move from the temporary directory to the queue directory.
Since move is atomic, the file will either not be there, or it will be 100% there -- there is no opportunity for the "reading" process to ever see the file while it's partially there.
Note that if you do the move across file systems, it will act the same as a copy.
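A small sketch of that pattern on the writer's side (the temp/queue directory parameters are placeholders; the key point is that both directories sit on the same file system):
using System.IO;

// Writer side: do the slow copy into a temp directory on the same volume,
// then atomically move the finished file into the queue directory.
static void PublishToQueue(string sourceFile, string tempDir, string queueDir)
{
    string name = Path.GetFileName(sourceFile);
    string tempPath = Path.Combine(tempDir, name);
    string queuePath = Path.Combine(queueDir, name);

    File.Copy(sourceFile, tempPath, overwrite: true); // slow part, invisible to the reader
    File.Move(tempPath, queuePath);                   // atomic on the same file system
}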
There's no "current files being copied" list stored anywhere in Windows/.NET/whatever. Probably the most you could do is attempt to open each file for append and see if you get an exception. Depending on the size and location of your directory, and on the security setup, that may not be an option.
There isn't a clean way to do this, but this... works...
foreach (var file in new DirectoryInfo(@"C:\Folder").GetFiles())
{
try
{
file.OpenRead().Dispose(); // dispose immediately; we only care whether the open succeeds
}
catch
{
continue;
}
Console.WriteLine(file.Name);
}
I am currently using this code:
if (!Directory.Exists(command2)) Directory.CreateDirectory(command2);
if (Directory.Exists(vmdaydir)) Directory.Delete(vmdaydir,true);
if (!Directory.Exists(vmdaydir)) Directory.CreateDirectory(vmdaydir);
var dir = Path.GetDirectoryName(args[0]);
sb.AppendLine("Backing Up VM: " + DateTime.Now.ToString(CultureInfo.InvariantCulture));
Microsoft.VisualBasic.FileIO.FileSystem.CopyDirectory(dir, vmdaydir);
sb.AppendLine("VM Backed Up: " + DateTime.Now.ToString(CultureInfo.InvariantCulture));
As you can see, I am deleting the directory and then copying the folder back. This is taking way too long since the directory is ~80 GB in size. I realized that I do not need to copy all the files, only the ones that have changed.
How would I copy the files from one folder to another but only copying the files that are newer? Anyone have any suggestions?
==== edit ====
I assume I can just do a file compare of each file and then copy it to the new directory, iterating through each folder/file? Is there a simpler way to do this?
Use the FileInfo class, and use the LastWriteTime property to get the last modified time of the file. Compare it to the time you're checking against and take only files that are later.
Loop through the files in the directory, checking the last modified time (FileInfo.LastWriteTime) - any files that are newer are copied over.
See FileInfo Class for more information.
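A minimal sketch of that incremental copy, recursing into subfolders (the CopyNewer name is mine): copy a file only when the destination is missing or older than the source.
using System.IO;

static void CopyNewer(string sourceDir, string destDir)
{
    Directory.CreateDirectory(destDir);

    foreach (var sourceFile in new DirectoryInfo(sourceDir).GetFiles())
    {
        string destPath = Path.Combine(destDir, sourceFile.Name);
        // copy only if the destination is missing or older than the source
        if (!File.Exists(destPath) || sourceFile.LastWriteTimeUtc > File.GetLastWriteTimeUtc(destPath))
            sourceFile.CopyTo(destPath, true);
    }

    foreach (var subDir in new DirectoryInfo(sourceDir).GetDirectories())
        CopyNewer(subDir.FullName, Path.Combine(destDir, subDir.Name));
}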
You need to be careful when doing this that you can get a lock on the file; otherwise another application may not be finished with it and you may copy it before it is ready.
So follow these steps...
1) attempt to lock file
2) if(got lock) copy file
3) else wait a short time
4) goto 1
:)
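Those steps might look roughly like this (a sketch; the retry delay and attempt count are arbitrary):
using System.IO;
using System.Threading;

// Try to open the source exclusively; if someone is still writing it,
// wait a short time and try again (steps 1-4 above).
static bool TryCopyWhenUnlocked(string sourcePath, string destPath, int maxAttempts = 10)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            using (var source = File.Open(sourcePath, FileMode.Open, FileAccess.Read, FileShare.None))
            using (var dest = File.Create(destPath))
            {
                source.CopyTo(dest); // we hold the lock, so we copy a complete file
            }
            return true;
        }
        catch (IOException)
        {
            Thread.Sleep(500); // file still in use; wait, then goto 1
        }
    }
    return false;
}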