I am currently writing some background processes for a website that are to be run nightly on a webserver.
My main issue is that I need to detect whether an image file has changed in the last 24 hours. I thought that this would be easily achievable using the following code:
DateTime lastWrite = System.IO.File.GetLastWriteTimeUtc(HttpContext.Current.Server.MapPath(image.FileName));
if (lastWrite > DateTime.UtcNow.AddHours(-24) && lastWrite < DateTime.UtcNow)
{
var a = "This item has been modified";
}
else
{
var b = "This item has not been modified";
}
However, it seems that this only gives me the DateTime when the file was last modified. That works if the image has been edited with something like Paint, but it does not tell me whether the image has been overwritten by a copy-and-paste over the existing image, because the modified and created dates stay the same as those of the original file that was in place.
My question therefore is: how do I detect whether a file (mainly images) has truly been modified (edited, copied over, or removed and replaced) within a 24-hour period?
Any help would be greatly appreciated.
For a long-running job like this you'll need to keep a database containing information about the old files. One way of doing this would be to store the MD5 hash of each file, then compare and copy (or process) the things that have changed.
// Requires: using System.Security.Cryptography; and using System.IO;
using (var md5 = MD5.Create())
using (var stream = File.OpenRead(filename))
{
    // Hash the file contents and return the digest as a Base64 string
    return Convert.ToBase64String(md5.ComputeHash(stream));
}
This has the added benefit of allowing you to sync even if some problem means that your "housekeeping" doesn't get run overnight, which will inevitably happen!
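For illustration, here is a minimal sketch of how the comparison step might look, assuming the previously computed hashes are loaded into a dictionary keyed by file path (the ComputeFileHash and storedHashes names are placeholders, not from the original post; requires using System; using System.IO; using System.Collections.Generic; and using System.Security.Cryptography;):
// Sketch only: wraps the MD5 snippet above in a helper.
static string ComputeFileHash(string path)
{
    using (var md5 = MD5.Create())
    using (var stream = File.OpenRead(path))
    {
        return Convert.ToBase64String(md5.ComputeHash(stream));
    }
}

// "storedHashes" stands in for whatever database/table the hashes are persisted to.
static bool HasChanged(string path, IDictionary<string, string> storedHashes)
{
    string current = ComputeFileHash(path);
    // Files never seen before count as changed so they get processed at least once.
    return !storedHashes.TryGetValue(path, out string previous) || previous != current;
}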
If you have no other software using the archive file attribute, it will do what you need: once you've copied the file to another location, clear the archive attribute. Overwriting the file, even with a copy from another directory, will set it again!
// Check for the archive flag
FileAttributes attributes = File.GetAttributes(path);
bool isArchiveSet = (attributes & FileAttributes.Archive) == FileAttributes.Archive;

// Remove the archive flag
attributes &= ~FileAttributes.Archive;
File.SetAttributes(path, attributes);
(Untested code; if you have a problem, let me know and I'll look at it, but I have tested the behaviour of the archive bit and it does what you want it to.)
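As a rough sketch of how the nightly job could tie this together (the folder path, file pattern, and processing step are placeholders, not from the original post):
// Sketch: process every image whose archive bit is set, then clear the bit so the
// file is only picked up again if something edits or overwrites it.
foreach (string imagePath in Directory.EnumerateFiles(@"C:\Images", "*.jpg"))
{
    FileAttributes attrs = File.GetAttributes(imagePath);
    if ((attrs & FileAttributes.Archive) == FileAttributes.Archive)
    {
        // ... handle the modified image here ...
        File.SetAttributes(imagePath, attrs & ~FileAttributes.Archive);
    }
}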
I have a program that checks a directory of zip files which get uploaded (via FTP) to my server daily from different clients. Each zip needs to contain a series of particular data files, so I am using the ZipArchive class to open the file, check the contents, make sure it contains the files we need, and also make sure they have been recently updated (so we're not looking at an old copy).
My program works for the most part. The problem I am facing is that if a file is currently being uploaded while I'm trying to check it, my checker program freezes, and I can't seem to find any way around it.
Here is what I've tried:
using (var zip = ZipFile.OpenRead("FileName")) // <--This is where it freezes.
{
// Check the contents
}
and I tried this:
using (var fs = File.OpenRead("FileName"))
{
using (var zip = new ZipArchive(fs, ZipArchiveMode.Read)) // <-- This is where it freezes.
{
// Check the contents
}
}
Ultimately I would just like it to throw an exception so I can continue on to the next file.
If anyone has had a similar issue or has any suggestions on what I can try, it would be appreciated.
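Not part of the original post, but one hedged sketch of a way to fail fast: assuming the FTP server keeps the file open while the upload is still in progress, requesting exclusive access should throw rather than hang, and the exception can be caught to skip the file (requires using System.IO; and using System.IO.Compression;):
// Sketch only: an exclusive open should fail while the uploader still holds the file.
static bool TryCheckZip(string fileName)
{
    try
    {
        using (var fs = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.None))
        using (var zip = new ZipArchive(fs, ZipArchiveMode.Read))
        {
            // Check the contents
            return true;
        }
    }
    catch (IOException)
    {
        // Still being uploaded (or otherwise locked) - move on to the next file.
        return false;
    }
}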
I have a situation here. I want to read files based on their creation or last-modified time. Initially I used FileSystemWatcher so that I was notified when a new file arrived, but later I realized that if the system my software is running on goes down or restarts, files will keep being dropped into the watched location anyway.
To make it easier to understand, I will give an example:
System A - File Server (Files are created every 2 min in a directory on this server)
System B - My Software will run and Monitor files from the Path of System A
If System B restarts and is up again after 10 minutes, the FileSystemWatcher will miss all the files that were generated in those 10 minutes.
How can I ensure that the files generated during those 10 minutes are also captured?
Let me know if my question is still not understandable.
If you don't want to split it up into two systems, you have to persist a little bit of information.
You could store the current timestamp in a file, every time a new event was fired on the filesystem watcher. Every time your service starts, you can read all files from the filesystem that are newer than the last timestamp. This way you shouldn't miss a file.
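A minimal sketch of that idea, assuming the timestamp lives in a small text file and the watched path is a UNC share (both paths are made up for illustration; requires using System; and using System.IO;):
// Sketch: on startup, catch up on anything written after the last recorded timestamp.
string stampFile = @"C:\MyService\last-seen.txt";
DateTime lastSeen = File.Exists(stampFile)
    ? DateTime.Parse(File.ReadAllText(stampFile), null, System.Globalization.DateTimeStyles.RoundtripKind)
    : DateTime.MinValue;

foreach (string file in Directory.EnumerateFiles(@"\\SystemA\Drop"))
{
    if (File.GetLastWriteTimeUtc(file) > lastSeen)
    {
        // ... process the file that arrived while the service was down ...
    }
}

// In the FileSystemWatcher event handler (and after the catch-up loop), record "now".
File.WriteAllText(stampFile, DateTime.UtcNow.ToString("o"));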
I would split this application into two parts and run a FileSystemWatcher WCF service that buffers the files created in those 10 minutes and sends them to System B when it is restarted. I can't see another way, sorry.
I think the FileSystemWatcher process must write info about the file system into a DB (or another type of storage). When System B starts, the watcher compares the current file system with this info and raises events about the changes.
Copy all the files from the source machine and paste them into the destination based on a condition:
string dirPath = @"C:\A";
string DestinPath = @"C:\B";
if (Directory.Exists(dirPath) && Directory.Exists(DestinPath))
{
DirectoryInfo di = new DirectoryInfo(dirPath);
foreach (var file in di.GetFiles())
{
string destinFile = DestinPath + "\\" + file.Name;
if (File.Exists(destinFile))
{
continue;
}
else
file.CopyTo(destinFile);
}
}
Not sure if I understood your question correctly, but based on what I get, and assuming both systems are in sync in terms of time, if for example you want to get files that have been modified within the last ten minutes:
DateTime tenMinutesAgo = DateTime.Now.AddMinutes(-10);
string[] systemAFiles = System.IO.Directory.GetFiles(systemAPath);
foreach (string file in systemAFiles)
{
    DateTime lastWriteTime = System.IO.File.GetLastWriteTime(file);
    if (lastWriteTime > tenMinutesAgo) // modified within the last ten minutes
    {
        // read the file
    }
}
I understood that these files are "generated", so they have been created or modified. If they have simply been moved from one folder to another, this will not work. In that case the best way is to take a snapshot of the files in that folder (writing it out to some sort of save file) and compare it against the current contents when the program runs again.
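A hedged sketch of that snapshot approach (the file names and paths are placeholders; requires using System.IO; and using System.Collections.Generic;): save the directory listing at the end of each run and diff it against the current listing at startup.
// Sketch: files present now but missing from the last saved listing were added
// (or moved in) while the watcher was not running.
string snapshotFile = @"C:\MyService\snapshot.txt";
string watchedDir = @"\\SystemA\Drop";

var previous = File.Exists(snapshotFile)
    ? new HashSet<string>(File.ReadAllLines(snapshotFile))
    : new HashSet<string>();

string[] current = Directory.GetFiles(watchedDir);
foreach (string file in current)
{
    if (!previous.Contains(file))
    {
        // ... process the file that appeared while the watcher was down ...
    }
}

File.WriteAllLines(snapshotFile, current); // save the new snapshot for the next run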
I feel kind of stupid posting this, but it seems like a genuine issue that I've made sufficiently simple so as to demonstrate that it should not fail. As part of my work I am responsible for maintaining build systems that take files under version control, and copy them to other locations. Sounds simple, but I've constantly experienced file access violations when attempting to copy files that I've supposedly already set as 'Normal'.
The code sample below simply creates a set of test files, makes them read only, and then copies them over to another folder. If the files already exist in the destination folder, the RO attribute is cleared so that the file copy will not fail.
The code works to a point, but at seemingly random points an exception is thrown when the file copy is attempted. The code is all single threaded, so unless .NET is doing something under the hood that causes a delay on the setting of attributes I can't really explain the problem.
If anyone can explain why this is happening I'd be interested. I'm not looking for a solution unless there is something I am definitely doing wrong, as I've handled the issue already, I'm just curious as no one else seems to have reported anything related to this.
After a few iterations I get something like:
A first chance exception of type 'System.UnauthorizedAccessException' occurred in mscorlib.dll
Additional information: Access to the path 'C:\TempFolderB\TEMPFILENAME8903.txt' is denied.
One other fact: if you get the file attributes BEFORE the file copy, the result says the attributes are indeed Normal, yet examining the local file shows it as Read Only.
/// <summary>
/// Test copying multiple files from one folder to another while resetting RO attr
/// </summary>
static void MultiFileCopyTest()
{
/// Temp folders for our test files
string folderA = @"C:\TempFolderA";
string folderB = @"C:\TempFolderB";
/// Number of files to create
const int fileCount = 10000;
/// If the test folders do not exist populate them with some test files
if (System.IO.Directory.Exists(folderA) == false)
{
const int bufferSize = 32768;
System.IO.Directory.CreateDirectory(folderA);
System.IO.Directory.CreateDirectory(folderB);
byte[] tempBuffer = new byte[bufferSize];
/// Create a bunch of files and make them all Read Only
for (int i = 0; i < fileCount; i++)
{
string filename = folderA + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
if (System.IO.File.Exists(filename) == false)
{
System.IO.FileStream str = System.IO.File.Create(filename);
str.Write(tempBuffer, 0, bufferSize);
str.Close();
}
/// Ensure files are Read Only
System.IO.File.SetAttributes(filename, System.IO.FileAttributes.ReadOnly);
}
}
/// Number of iterations around the folders
const int maxIterations = 100;
for (int idx = 0; idx < maxIterations; idx++)
{
Console.WriteLine("Iteration {0}", idx);
/// Loop for copying all files after resetting the RO attribute
for (int i = 0; i < fileCount; i++)
{
string filenameA = folderA + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
string filenameB = folderB + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
try
{
if (System.IO.File.Exists(filenameB) == true)
{
System.IO.File.SetAttributes(filenameB, System.IO.FileAttributes.Normal);
}
System.IO.File.Copy(filenameA, filenameB, true);
}
catch (System.UnauthorizedAccessException ex)
{
Console.WriteLine(ex.Message);
}
}
}
}
(This isn't a full answer, but I don't have enough reputation yet to post comments...)
I don't think you are doing anything wrong; when I run your test code I can reproduce the problem every time. I have never got past 10 iterations without the error occurring.
I did a further test, which might shed some light on the issue:
I set all of the files in TempFolderA to hidden.
I then ensured all of the files in TempFolderB were NOT hidden.
I put a break-point on Console.WriteLine(ex.Message)
I ran the code, if it got past iteration 1 then I stopped, reset the hidden attributes and ran again.
After a couple of tries, I got a failure in the 1st iteration so I opened Windows Explorer on TempFolderB and scrolled down to the problematic file.
The file was 0 bytes in size, but had RHA attributes set.
Unfortunately I have no idea why this is. Process Monitor doesn't show any other activity which could be relevant.
Well, right in the documentation for the System.IO.File.SetAttributes(string path, System.IO.FileAttributes attributes) method, I found the following:
Exceptions:
System.UnauthorizedAccessException:
path specified a file that is read-only.
-or- This operation is not supported on the current platform.
-or- The caller does not have the required permission.
So, if I had to guess, the file in the destination (e.g. filenameB) did in fact already exist. It was marked Read-Only, and so, the exception was thrown as per the documentation above.
Instead, what you need to do is remove the Read-Only attribute via an inverse bit mask:
if (File.Exists(filenameB))
{
// Remove the read-only attribute
FileAttributes attributes = File.GetAttributes(filenameB);
attributes &= ~FileAttributes.ReadOnly;
File.SetAttributes(filenameB, attributes);
// You can't OR the Normal attribute with other attributes--see MSDN.
File.SetAttributes(filenameB, FileAttributes.Normal);
}
To be fair, the documentation on the SetAttributes method isn't really clear about how to set file attributes once a file is marked as ReadOnly. Sure, there's an example (using the Hidden attribute), but it doesn't explicitly say that you need to use an inverted bitmask to remove the Hidden or ReadOnly attributes. One could easily assume that's just how they chose to "unset" the attribute. It's also not clear from the documentation what would happen, for instance, if you marked the file like this:
File.SetAttributes(pathToFile, FileAttributes.Normal);
File.SetAttributes(pathToFile, FileAttributes.Archive);
Does this result in the file first having the Normal attribute set and then only Archive, or does it result in the file having Normal set and then additionally having Archive set, resulting in a Normal but Archived file? I believe it's the former rather than the latter, based on how attributes are "removed" from a file using the inverted bitmask.
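One way to see which behaviour you actually get is simply to read the attributes back afterwards; a throwaway sketch:
// Sketch: if the second SetAttributes call replaces the first, this prints only "Archive".
File.SetAttributes(pathToFile, FileAttributes.Normal);
File.SetAttributes(pathToFile, FileAttributes.Archive);
Console.WriteLine(File.GetAttributes(pathToFile));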
If anyone finds anything contrary, please post a comment and I'll update my answer accordingly.
HTH.
This may be caused by the user not having sufficient permission to access the file system.
Workarounds:
1. Try to run the application in administrative mode,
OR
2. Try to run Visual Studio in administrative mode (if you are using the debugger).
I have a function that checks every file in a directory and writes a list to the console. The problem is, I don't want it to include files that are currently being copied to the directory; I only want it to show the files that are complete. How do I do that? Here is my code:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
Console.WriteLine(file);
}
There's really no way to tell "being copied" vs "locked for writing by something". Relevant: How to check for file lock? and Can I simply 'read' a file that is in use?
If you want to simply display a list of files that are not open for writing, you can do that by attempting to open them:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    try
    {
        // Opening the file succeeds only if nothing else has it locked for writing.
        using (File.Open(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            Console.WriteLine(file);
        }
    }
    catch
    {
        // file is in use
        continue;
    }
}
However -- lots of caveats.
Immediately after displaying the filename (end of the using block) the file could be opened by something else
The process writing the file may have used FileShare.Read which means the call will succeed, despite it being written to.
I'm not sure what exactly you're up to here, but it sounds like two processes sharing a queue directory: one writing, one reading/processing. The biggest challenge is that writing a file takes time, and so your "reading" process ends up picking it up and trying to read it before the whole file is there, which will fail in some way depending on the sharing mode, how your apps are written, etc.
A common pattern to deal with this situation is to use an atomic file operation like Move:
Do the (slow) write/copy operation to a temporary directory that's on the same file system (very important) as the queue directory
Once complete, do a Move from the temporary directory to the queue directory.
Since move is atomic, the file will either not be there, or it will be 100% there -- there is no opportunity for the "reading" process to ever see the file while it's partially there.
Note that if you do the move across file systems, it will act the same as a copy.
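A rough sketch of that pattern (directory and variable names such as fileName and sourcePath are placeholders): stage the slow copy somewhere temporary on the same volume, then move the finished file into the queue directory.
// Sketch: the reader never sees a partially written file because the final step is a move.
string stagingDir = @"C:\Queue\staging";   // same file system as the queue directory
string queueDir = @"C:\Queue";

string stagedPath = Path.Combine(stagingDir, fileName);
string finalPath = Path.Combine(queueDir, fileName);

File.Copy(sourcePath, stagedPath);   // the slow part, invisible to the reader
File.Move(stagedPath, finalPath);    // atomic when staging and queue share a volume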
There's no "current files being copied" list stored anywhere in Windows/.NET/whatever. Probably the most you could do is attempt to open each file for append and see if you get an exception. Depending on the size and location of your directory, and on the security setup, that may not be an option.
There isn't a clean way to do this, but this... works...
foreach (var file in new DirectoryInfo(@"C:\Folder").GetFiles())
{
    try
    {
        // Opening (and immediately disposing) the stream fails if the file is locked.
        using (file.OpenRead()) { }
    }
    catch
    {
        continue;
    }
    Console.WriteLine(file.Name);
}
I have code that reads encrypted credentials from a text file. I updated that text file to include a connection string. Everything else is read and decrypted fine, but not the connection string (naturally, I updated my code accordingly, too).
So I got to wondering: is it reading the correct file? The answer: No! The file in \bin\debug is dated 6/5/2012 9:41 am, but this code:
using (StreamReader reader = File.OpenText("Credentials.txt"))
{
    string line = null;
    MessageBox.Show(File.GetCreationTime("Credentials.txt").ToString());
}
...shows 6/4/2012 2:00:44 pm
So I searched my hard drive for all instances of "Credentials.txt" to see where it was reading the file from. It only found one instance, the one with today's date in \bin\debug.
???
Note: Credentials.txt is not a part of my solution; should it be? (IOW, I simply copied it into \bin\debug, I didn't perform an "Add | Existing Item")
Provided you don't change the current directory, the file in bin\Debug is going to be the one being read, as you're not specifying a full path.
The problem is likely due to the difference between the various file dates. The creation date (which is what you are fetching and displaying as 6/4 @ 2:00:44 pm) is likely different from the date modified (which is what is shown by default in Windows Explorer). That date can be fetched using File.GetLastWriteTime instead of GetCreationTime.
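A quick diagnostic sketch to see both values side by side:
// Creation time (what the original code shows) vs. last write time (what Explorer shows).
Console.WriteLine(File.GetCreationTime("Credentials.txt"));
Console.WriteLine(File.GetLastWriteTime("Credentials.txt"));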
That being said, I would recommend using the full path to the file, and not assuming that the current directory is the same as the executable path. Specifying the full path (which can be determined based on the executable path) will be safer, and less likely to cause problems later. This can be done via:
var exePath = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetEntryAssembly().Location);
var file = System.IO.Path.Combine(exePath, "Credentials.txt");
using (StreamReader reader = File.OpenText(file))
{
    // ...
}