I am currently using this code:
if (!Directory.Exists(command2)) Directory.CreateDirectory(command2);
if (Directory.Exists(vmdaydir)) Directory.Delete(vmdaydir,true);
if (!Directory.Exists(vmdaydir)) Directory.CreateDirectory(vmdaydir);
var dir = Path.GetDirectoryName(args[0]);
sb.AppendLine("Backing Up VM: " + DateTime.Now.ToString(CultureInfo.InvariantCulture));
Microsoft.VisualBasic.FileIO.FileSystem.CopyDirectory(dir, vmdaydir);
sb.AppendLine("VM Backed Up: " + DateTime.Now.ToString(CultureInfo.InvariantCulture));
As you can see, I am deleting the directory and then copying the folder back. This is taking way too long since the directory is ~80 GB in size. I realized that I do not need to copy all the files, only the ones that have changed.
How would I copy the files from one folder to another, but only copy the files that are newer? Does anyone have any suggestions?
==== edit ====
I assume I can just do a file compare of each file and then copy it to the new directory, iterating through each folder/file? Is there a simpler way to do this?
Use the FileInfo class, and use the LastWriteTime property to get the last modified time of the file. Compare it to the time you're checking against and take only files that are later.
Loop through the files in the directory, checking the last modified time (FileInfo.LastWriteTime) - any files that are newer are copied over.
See FileInfo Class for more information.
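A minimal sketch of that approach (the method name and folder parameters are placeholders; instead of a stored cutoff time, it compares each file against the destination's existing copy):

using System.IO;

static void CopyNewerFiles(string sourceDir, string destDir)
{
    Directory.CreateDirectory(destDir); // no-op if it already exists

    foreach (string sourceFile in Directory.GetFiles(sourceDir, "*", SearchOption.AllDirectories))
    {
        // Mirror the source's relative path under the destination.
        string relative = sourceFile.Substring(sourceDir.Length)
                                    .TrimStart(Path.DirectorySeparatorChar);
        string destFile = Path.Combine(destDir, relative);
        Directory.CreateDirectory(Path.GetDirectoryName(destFile));

        // Copy only when the destination copy is missing or older.
        if (!File.Exists(destFile) ||
            File.GetLastWriteTimeUtc(sourceFile) > File.GetLastWriteTimeUtc(destFile))
        {
            File.Copy(sourceFile, destFile, true); // true = overwrite
        }
    }
}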
You need to be careful when doing this that you can actually get a lock on the file; otherwise another application may not be finished with it, and you may try to copy it before it is ready.
So follow these steps (a rough code sketch follows below)...
1) attempt to lock file
2) if(got lock) copy file
3) else wait a short time
4) goto 1
:)
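Here is a minimal sketch of that loop, assuming placeholder paths and an arbitrary retry count and delay; opening with FileShare.None stands in for "attempt to lock", and the copy happens while the exclusive handle is still held:

using System.IO;
using System.Threading;

static bool TryCopyWhenUnlocked(string sourceFile, string destFile, int maxAttempts)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            // 1) attempt to lock: FileShare.None fails while anyone else has the file open
            using (var source = File.Open(sourceFile, FileMode.Open, FileAccess.Read, FileShare.None))
            using (var dest = File.Create(destFile))
            {
                source.CopyTo(dest); // 2) got the lock: copy while we still hold it
                return true;
            }
        }
        catch (IOException)
        {
            Thread.Sleep(500); // 3) wait a short time, then 4) try again
        }
    }
    return false; // gave up without ever getting the lock
}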
Technology Used: C#, IonicZip library.
From a list of multiple log files (let's say 10,000, each of reasonable size), I have to zip these files into folders, but each zipped folder's size must stay approximately under 4 MB. How can I get the minimum possible number of zipped folders?
private static string ZipAndReturnFolderPath(IEnumerable<string> files, string saveToFolder)
{
    int listToSkip = 0;
    using (var zip = new ZipFile())
    {
        do
        {
            zip.AddFiles(files.Skip(listToSkip * 10).Take(10));
            zip.Save(saveToFolder);
            listToSkip++;
        }
        while ((new FileInfo(saveToFolder).Length < _lessThan4MB) && totalFilesRemaining > 0);
    }
    return saveToFolder;
}
Here, to keep it concise, I have removed a few lines of code. Parameters: files holds the paths of the remaining files to be zipped (don't worry about how I maintain that), and saveToFolder is the destination for the zipped folder (this will be unique each time the function is called).
I believe this works. I have checked the files it has been zipping and I find no duplication. But zipping files to a folder, checking the condition, and then repeating the same process for the next few files in the already-zipped folder doesn't sound like a good approach.
Am I doing anything wrong or is there any efficient way I can achieve this?
I think what you're after has already been answered here; using ZipOutputStream could be the way to go.
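For illustration, a rough sketch of that approach with DotNetZip's ZipOutputStream; the ZipInBatches name and the archive naming are made up, it assumes unique file names, and the size check against the underlying stream's Position is only approximate since entry data may still be buffered when it runs (a single large file can also push one archive past the limit):

using System.Collections.Generic;
using System.IO;
using Ionic.Zip;

static void ZipInBatches(IEnumerable<string> files, string outputFolder)
{
    const long limit = 4 * 1024 * 1024; // target ~4 MB per archive
    int archiveIndex = 0;
    FileStream target = null;
    ZipOutputStream zip = null;

    foreach (string file in files)
    {
        // Start a new archive if none is open or the current one is near the limit.
        if (zip == null || target.Position >= limit)
        {
            if (zip != null) zip.Dispose(); // finalizes and closes the current archive
            string path = Path.Combine(outputFolder, "archive" + archiveIndex++ + ".zip");
            target = File.Create(path);
            zip = new ZipOutputStream(target);
        }

        zip.PutNextEntry(Path.GetFileName(file));
        using (var source = File.OpenRead(file))
            source.CopyTo(zip); // stream the file's bytes into the current entry
    }

    if (zip != null) zip.Dispose();
}

This writes each file exactly once, so there is no re-zipping of batches that were already saved.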
I have a situation here. I want to read files based on their creation or last modified time. Initially I used FileSystemWatcher so that I was notified when a new file arrived, but later I realized that if the system on which my software is running goes down or restarts, files will keep being dropped into the watched location in the meantime.
To make it easier for understanding i will give an example:
System A - File Server (Files are created every 2 min in a directory on this server)
System B - My Software will run and Monitor files from the Path of System A
If System B restarts and is up again after 10 minutes, the FileSystemWatcher will skip all the files which were generated in those 10 minutes.
How can I ensure that the files generated in those 10 minutes are also captured?
Let me know if my question is still not understandable.
If you don't want to split it up in two systems, you have to persist a little bit of information.
You could store the current timestamp in a file, every time a new event was fired on the filesystem watcher. Every time your service starts, you can read all files from the filesystem that are newer than the last timestamp. This way you shouldn't miss a file.
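A small sketch of that idea; the state file format, the paths, and the ProcessFile helper are all hypothetical:

using System;
using System.Globalization;
using System.IO;

static void CatchUp(string watchedFolder, string stateFile)
{
    // Read the last persisted high-water mark, if we have one.
    DateTime lastSeen = File.Exists(stateFile)
        ? DateTime.Parse(File.ReadAllText(stateFile), null, DateTimeStyles.RoundtripKind)
        : DateTime.MinValue;

    // Catch up on anything that arrived while the service was down.
    foreach (string file in Directory.GetFiles(watchedFolder))
    {
        if (File.GetLastWriteTimeUtc(file) > lastSeen)
            ProcessFile(file); // hypothetical: whatever the watcher's event handler does
    }

    // Persist the new mark; "o" round-trips the value exactly.
    File.WriteAllText(stateFile, DateTime.UtcNow.ToString("o"));
}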
I would split this application into two parts and run a FileSystemWatcher WCF service that buffers the files created in those 10 minutes and sends them to System B when it is restarted. I can't see any other way, sorry.
I think the FileSystemWatcher must write info about the file system into a DB (or another type of storage). When System B starts, the watcher compares the current file system with this info and raises events about the changes.
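For illustration, a sketch of that snapshot idea using a flat file instead of a real DB; the "path|ticks" line format and the helper name are made up:

using System.Collections.Generic;
using System.IO;
using System.Linq;

static IEnumerable<string> FindChangedFiles(string folder, string snapshotFile)
{
    // Load the previous snapshot: one "path|ticks" pair per line.
    var previous = File.Exists(snapshotFile)
        ? File.ReadAllLines(snapshotFile)
              .Select(l => l.Split('|'))
              .ToDictionary(p => p[0], p => long.Parse(p[1]))
        : new Dictionary<string, long>();

    var changed = new List<string>();
    var lines = new List<string>();

    foreach (string file in Directory.GetFiles(folder))
    {
        long ticks = File.GetLastWriteTimeUtc(file).Ticks;
        long seen;
        if (!previous.TryGetValue(file, out seen) || seen != ticks)
            changed.Add(file); // new or modified since the last run

        lines.Add(file + "|" + ticks);
    }

    File.WriteAllLines(snapshotFile, lines.ToArray()); // save the new snapshot
    return changed;
}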
Copy the files from the Source Machine and paste them into the destination based on a condition:
string dirPath = @"C:\A";
string DestinPath = @"C:\B";

if (Directory.Exists(dirPath) && Directory.Exists(DestinPath))
{
    DirectoryInfo di = new DirectoryInfo(dirPath);
    foreach (var file in di.GetFiles())
    {
        // Skip files that already exist in the destination.
        string destinFile = DestinPath + "\\" + file.Name;
        if (File.Exists(destinFile))
            continue;

        file.CopyTo(destinFile);
    }
}
Not sure if I understood your question correctly, but assuming both systems are in sync in terms of time, if for example you want to get files that have been modified within the last ten minutes:
DateTime tenMinutesAgo = DateTime.Now.AddMinutes(-10);
string[] systemAFiles = System.IO.Directory.GetFiles(systemAPath);

foreach (string file in systemAFiles)
{
    DateTime lastWriteTime = System.IO.File.GetLastWriteTime(file);
    if (lastWriteTime > tenMinutesAgo) // produced within the last ten minutes
    {
        // read file
    }
}
I understood that these files are "generated", so they have been created or modified. If they have simply been moved from one folder to another, this will not work. In that case the best way is to write a snapshot of the files in that list (saving it to some sort of file) and compare against it when the program runs again.
I have a function that checks every file in a directory and writes a list to the console. The problem is, I don't want it to include files that are currently being copied to the directory, I only want it to show the files that are complete. How do I do that? Here is my code:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    Console.WriteLine(file);
}
There's really no way to tell "being copied" vs "locked for writing by something". Relevant: How to check for file lock? and Can I simply 'read' a file that is in use?
If you want to simply display a list of files that are not open for writing, you can do that by attempting to open them:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    try
    {
        // Opening the file succeeds only if nobody holds an exclusive lock on it.
        using (File.Open(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            Console.WriteLine(file);
        }
    }
    catch
    {
        // file is in use
        continue;
    }
}
However -- lots of caveats.
Immediately after displaying the filename (end of the using block) the file could be opened by something else
The process writing the file may have opened it with FileShare.Read, which means the open will succeed despite the file still being written to.
I'm not sure what exactly you're up to here, but it sounds like two processes sharing a queue directory: one writing, one reading/processing. The biggest challenge is that writing a file takes time, and so your "reading" process ends up picking it up and trying to read it before the whole file is there, which will fail in some way depending on the sharing mode, how your apps are written, etc.
A common pattern to deal with this situation is to use an atomic file operation like Move:
Do the (slow) write/copy operation to a temporary directory that's on the same file system (very important) as the queue directory
Once complete, do a Move from the temporary directory to the queue directory.
Since move is atomic, the file will either not be there, or it will be 100% there -- there is no opportunity for the "reading" process to ever see the file while it's partially there.
Note that if you do the move across file systems, it will act the same as a copy.
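A minimal sketch of the pattern with placeholder paths (both directories must be on the same volume for the move to be atomic):

using System.IO;

string tempDir  = @"C:\queue\incoming.tmp";
string queueDir = @"C:\queue\incoming";

string tempFile  = Path.Combine(tempDir, "data.txt");
string finalFile = Path.Combine(queueDir, "data.txt");

// The slow write happens into the temp directory first.
File.WriteAllText(tempFile, "...contents...");

// Move is atomic on the same volume: the reader either sees no file at all
// or the complete file, never a partial one. (Move throws if finalFile exists.)
File.Move(tempFile, finalFile);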
There's no "current files being copied" list stored anywhere in Windows/.NET/whatever. Probably the most you could do is attempt to open each file for append and see if you get an exception. Depending on the size and location of your directory, and on the security setup, that may not be an option.
There isn't a clean way to do this, but this... works...
foreach (var file in new DirectoryInfo(@"C:\Folder").GetFiles())
{
    try
    {
        // Dispose the stream right away; we only care whether the open succeeds.
        using (file.OpenRead()) { }
    }
    catch
    {
        continue;
    }
    Console.WriteLine(file.Name);
}
I am trying to write out a text file to: C:\Test folder\output\, but without putting C:\ in.
i.e.
This is what I have at the moment, which currently works, but has the C:\ in the beginning.
StreamWriter sw = new StreamWriter(#"C:\Test folder\output\test.txt");
I really want to write the file to the output folder, but without having to have C:\ at the front.
I have tried the following, but my program just hangs (doesn't write the file out):
(@"\\Test folder\output\test.txt");
(#".\Test folder\output\test.txt");
("//Test folder//output//test.txt");
("./Test folder//output//test.txt");
Is there any way I could do this?
Thanks.
Thanks for helping guys.
A colleague of mine chipped in and helped as well, but @Kami helped a lot too.
It is now working when I have:
string path = string.Concat(Environment.CurrentDirectory, #"\Output\test.txt");
As he said: "The CurrentDirectory is where the program is run from."
I understand that you would want to write data to a specified folder. The first method is to specify the folder in code or through configuration.
If you need to write to a specific drive or the current drive, you can do the following:
string driveLetter = Path.GetPathRoot(Environment.CurrentDirectory);
string path = driveLetter + @"Test folder\output\test.txt";
StreamWriter sw = new StreamWriter(path);
If the directory needs to be relative to the current application directory, then use AppDomain.CurrentDomain.BaseDirectory to get the current directory and use a ../ combination to navigate to the required folder.
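For example, a short sketch using BaseDirectory, with placeholder folder names:

using System;
using System.IO;

string baseDir = AppDomain.CurrentDomain.BaseDirectory;
string path = Path.Combine(baseDir, "Test folder", "output", "test.txt");

Directory.CreateDirectory(Path.GetDirectoryName(path)); // make sure the folders exist
using (var sw = new StreamWriter(path))
{
    sw.WriteLine("hello");
}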
You can use System.IO.Path.GetDirectoryName to get the directory of your running application and then append the rest of the path to it.
I didn't clearly get what you want from this question; I hope this covers it.
A common technique is to make the directory relative to your exe's runtime directory, e.g., a sub-directory, like this:
string exeRuntimeDirectory =
System.IO.Path.GetDirectoryName(
System.Reflection.Assembly.GetExecutingAssembly().Location);
string subDirectory =
System.IO.Path.Combine(exeRuntimeDirectory, "Output");
if (!System.IO.Directory.Exists(subDirectory))
{
// Output directory does not exist, so create it.
System.IO.Directory.CreateDirectory(subDirectory);
}
This means wherever the exe is installed to, it will create an "Output" sub-directory, which it can then write files to.
It also has the advantage of keeping the exe and its output files together in one location, and not scattered all over the place.
My question is about handling temporary files in a small .NET program. This program/utility just does the following with two URLs:
Download each URL to a string (WebClient.DownloadString).
Save each string to a temporary file.
Call WinMerge (Process.Start) to diff the two files (in read-only mode for both files).
I currently have the Simplest Thing That Could Possibly Work:
save URL1 to windows/temp/leftFileToDiff.txt
save URL2 to windows/temp/rightFileToDiff.txt.
This works great - as WinMerge only needs to run in Read Only mode the files can be overwritten by running my program multiple times and nothing bad happens.
However, I would now like to change the temporary file names to something meaningful (related to the URL) so that I can see which is which in the WinMerge view. I also want to clean these files up when they are no longer needed. What are my options for this?
My next simplest idea is to have a specified folder where these are stored and just to zap this every time my program exits. Is there a better/more elegant/standard way?
Thanks.
Create a Guid-based folder under the user's temp area and use that?
string path = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("n"));
Directory.CreateDirectory(path);
try
{
    // work inside path
}
finally
{
    try { Directory.Delete(path, true); }
    catch (Exception ex) { Trace.WriteLine(ex); }
}
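For the "meaningful names" part of the question, one option is to flatten each URL into a file name inside that per-run folder; this helper is just an illustration:

using System;
using System.IO;

static string FileNameFromUrl(string url, string folder)
{
    var uri = new Uri(url);
    string name = uri.Host + uri.AbsolutePath;

    // '/' and ':' are both in the invalid set, so the result is flat and safe.
    foreach (char c in Path.GetInvalidFileNameChars())
        name = name.Replace(c, '_');

    return Path.Combine(folder, name + ".txt");
}

That way the WinMerge title bar shows which side came from which URL, and deleting the GUID folder in the finally block cleans everything up.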