Every time I create a FileInfo object and access its LastAccessTime property, the value is always a few minutes off. The file's Properties window stays constant, yet the application usually shows a time a few minutes after the Properties window time.
I also noticed that if I drag the file onto the cmd window to pass the filename as an argument, the access time updates most of the time, but not always.
What could be causing this?
Below is an example:
static void Main(string[] args)
{
    if (args.Length > 0)
    {
        // args[0] is already a string, so no ToString() call is needed
        FileInfo fi = new FileInfo(args[0]);
        Console.WriteLine(args[0]);
        if (fi.Exists)
        {
            Console.WriteLine("Current: " + DateTime.Now);
            Console.WriteLine("LAT: " + fi.LastAccessTime);
            Console.WriteLine("LWT: " + fi.LastWriteTime);
            Console.WriteLine("CT: " + fi.CreationTime);
        }
        Console.ReadKey();
    }
}
Screenshots: the file's Properties dialog (http://img407.imageshack.us/img407/4728/propertiesox6.png) and the application output (http://img380.imageshack.us/img380/7752/appgt0.png).
In my experience, last access time is notoriously unreliable. According to http://technet.microsoft.com/en-us/library/cc781134.aspx...
The Last Access Time on disk is not always current because NTFS looks for a one-hour interval before forcing the Last Access Time updates to disk. NTFS also delays writing the Last Access Time to disk when users or programs perform read-only operations on a file or folder, such as listing the folder’s contents or reading (but not changing) a file in the folder.
Apparently the in-memory copy will be correct, but in my experience you may still get a cached value that is out of date. Also note that last-access-time updating can be turned off by the user, and is off by default in Windows Vista and Windows Server 2008.
The MSDN article with basic info about file times has this to say about file time resolution and Last Access times:
For example, on FAT, create time has a resolution of 10 milliseconds, write time has a resolution of 2 seconds, and access time has a resolution of 1 day (really, the access date). NTFS delays updates to the last access time for a file by up to one hour after the last access.
This would imply that on both FAT and NTFS, the Last Access Time will generally not be very precise, although I'm not sure the exact values they quote are correct.
Hmm, possibly this from MSDN:
When first called, FileSystemInfo calls Refresh and returns the cached information on APIs to get attributes and so on. On subsequent calls, you must call Refresh to get the latest copy of the information.
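In other words, call Refresh before reading the property again; a minimal sketch (path stands in for your file's path):
FileInfo fi = new FileInfo(path);
// ... later, after the file may have been accessed again ...
fi.Refresh(); // re-read the file's metadata from disk
Console.WriteLine("LAT: " + fi.LastAccessTime);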
But you are seeing the LAT always being a few minutes in the future (or is it the past)?
I want to know how I can tell whether a task has started or been shut down. My idea was to have a loop that constantly checks whether a new task has started by searching for a specific string in the Task Manager's process list. Although this is possible, I'd rather not use that method because I think it would eat a lot of performance, so I wanted to ask whether there is a better way to check if a program has started or shut down. This is how I imagined it:
while(!"notepad.exe found")
{
SearchForTask("notepad.exe");
if(notepad.exe found)
//Do Something
}
If there is another way, please let me know.
Regards
Checking for a running process is easy:
// requires: using System.Diagnostics;
if (Process.GetProcessesByName("notepad").Length == 0)
{
    // No "notepad" process is running
}
else
{
    // At least one "notepad" process is running
}
You can also check for Length being greater than zero, or store the number of running processes the last time you checked and see if it has changed (if less than the last count, one of the processes closed; if more, one started), since you can actually have more than one "notepad" running.
This uses the "friendly name" (generally the executable name without the .exe extension or path); if you are interested in a very specific process with a very specific path, you'd need to iterate through the array that GetProcessesByName returns.
If you are doing this in a loop, I'd leave some free time between iterations so you are not checking constantly (how much depends on what you are doing with all of this); otherwise you can use a timer (one of the many available) and poll every n milliseconds.
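A rough sketch of that polling approach (the one-second interval and the console messages are just illustrative):
// requires: using System; using System.Diagnostics; using System.Timers;
int lastCount = -1;
var timer = new System.Timers.Timer(1000); // poll every second; tune to taste
timer.Elapsed += (s, e) =>
{
    int count = Process.GetProcessesByName("notepad").Length;
    if (lastCount >= 0 && count > lastCount)
        Console.WriteLine("A notepad instance started");
    else if (lastCount >= 0 && count < lastCount)
        Console.WriteLine("A notepad instance closed");
    lastCount = count;
};
timer.Start();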
What I'm doing
I'm working on a web service that copies files from one location to another. The files are being updated (their size should increase every 3 seconds, since text is appended).
1st option:
I check every 10 seconds whether any of the files has been modified (they are modified approximately every 5 seconds) so I can copy (and overwrite) them to the final destination. At the moment I'm using code that compares the file's last write time with the current time minus some interval (1 minute at the moment).
DateTime lastEditTime = File.GetLastWriteTime(myFile);
if (lastEditTime > DateTime.Now.AddMinutes(-1))
{
    File.Copy(myFile, newFileName, true);
}
But I think this is a fragile approach, since a gap between checks could make me miss some changes.
2nd option
I could check the file sizes (probably using the FileInfo.Length property) of each file in the source directory and compare them to the ones in the final destination.
This should work too, since text is only ever appended, so the file sizes should only grow.
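A sketch of that size comparison, reusing myFile and newFileName from the snippet above:
// Copy when the destination is missing or its size differs from the source
if (!File.Exists(newFileName) ||
    new FileInfo(myFile).Length != new FileInfo(newFileName).Length)
{
    File.Copy(myFile, newFileName, true);
}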
3rd option
I have read that a lot of people recommend the FileSystemWatcher, but I don't want to miss any changes, which apparently can happen - at least according to other SO questions (see https://stackoverflow.com/a/240008/2296407).
What is my question?
What is the best option to know whether any file was changed (whether the file in the source differs from the file in the final destination) in the last x minutes or seconds? I don't want to copy everything, since there might be a lot of files.
By "best option" I mean: is it faster to compare each file's size, or to compare File.GetLastWriteTime(myFile) with the current time minus some interval? In the second case there is also the question of how big the time span should be: if I make it large I will probably copy more files than I actually need, but if I make it small I might miss some changes.
If you have some better options feel free to share them with me!
Although you already mentioned it in your option 3, I still think you should give the FileSystemWatcher class a try. As far as I understood you, you have not yet done so, right?
Although it is true that the watcher may lose some events in its default configuration, you can still make it work reliably if you do some tweaking.
Have a look at the "Remarks" section in the documentation (highlights by me):
The Windows operating system notifies your component of file changes in a buffer created by the FileSystemWatcher. If there are many changes in a short time, the buffer can overflow. This causes the component to lose track of changes in the directory, and it will only provide blanket notification. Increasing the size of the buffer with the InternalBufferSize property is expensive, as it comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small yet large enough to not miss any file change events. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties so you can filter out unwanted change notifications.
Things you can do to make it reliably work:
Note that a FileSystemWatcher may miss an event when the buffer size is exceeded. To avoid missing events, follow these guidelines:
Increase the buffer size by setting the InternalBufferSize property.
Avoid watching files with long file names, because a long file name contributes to filling up the buffer. Consider renaming these files using shorter names.
Keep your event handling code as short as possible.
For example, user Nomix says he raised the buffer size (the InternalBufferSize property) to 16 MB and has never had a problem with the FileSystemWatcher class (the SO post is here). I can confirm this with a project in my company that has worked fine for years, ever since we found out about the buffer.
Initialization of the object might look like this for example:
private void InitWatcher()
{
    // Create a new FileSystemWatcher and set its properties.
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.Path = "Your path to watch";

    // You only want to watch a single folder
    watcher.IncludeSubdirectories = false;

    // You mentioned both LastWrite and Size.
    // You can combine them or watch for only a specific property;
    // simply configure it to your needs.
    watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size;

    // Only watch text files.
    watcher.Filter = "*.txt";

    // Add event handlers; omit those you are not interested in.
    watcher.Changed += new FileSystemEventHandler(OnChanged);

    // Begin watching.
    watcher.EnableRaisingEvents = true;
}
You can then subscribe to the events that are of interest to you, like the Changed event, and react to them as easily as:
private static void OnChanged(object source, FileSystemEventArgs e)
{
    // newFileName is the destination path from your own snippet
    File.Copy(e.FullPath, newFileName, true);
}
I need to get a list of all Word documents (*.doc and *.docx) stored in a Windows folder with many subfolders, sub-subfolders, and so on.
Searching for a file with C# has an answer that works, but it is 2 years old and takes 10 seconds to search through 1500 files (and in the future there may be 10,000 or more). I will post my code, which is basically a copy from the above link. Does anyone have a better solution?
DateTime dt = DateTime.Now;
DirectoryInfo dir = new DirectoryInfo(MainFolder);
List<FileInfo> matches = new List<FileInfo>(dir.GetFiles("*.doc*", SearchOption.AllDirectories));
TimeSpan ts = DateTime.Now - dt;
MessageBox.Show(matches.Count + " matches in " + ts.TotalSeconds + " seconds");
You can use Directory.EnumerateFiles instead of GetFiles. This has the advantage of returning the files as an IEnumerable<T>, which allows you to begin your processing of the result set immediately (instead of waiting for the entire list to be returned).
If you're merely counting the number of files or listing all files, it may not help. If, however, you can do your processing and/or filtering of the results, and especially if you can do any of it in other threads, it can be significantly faster.
From the documentation:
The EnumerateFiles and GetFiles methods differ as follows: When you use EnumerateFiles, you can start enumerating the collection of names before the whole collection is returned; when you use GetFiles, you must wait for the whole array of names to be returned before you can access the array. Therefore, when you are working with many files and directories, EnumerateFiles can be more efficient.
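For example, a sketch reusing MainFolder from the question:
// requires: using System.IO;
int count = 0;
foreach (string path in Directory.EnumerateFiles(MainFolder, "*.doc*", SearchOption.AllDirectories))
{
    // Each path is yielded as soon as it is found, so processing can
    // begin before the whole directory tree has been walked.
    count++;
}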
Doubt there's much you can do with that. A more restrictive pattern might help, but note that GetFiles does not accept multiple patterns in a single call (a pattern like "*.doc|*.docx" is not valid), so you would need one dir.GetFiles("*.doc", SearchOption.AllDirectories) call per extension, or keep the broader "*.doc*" pattern and filter the results yourself.
If you want the full list, other than making sure the Windows Indexing Service is enabled on the target folders, not really. Your main delay is going to be reading from the hard drive, and no optimizing of your C# code will make that process any faster. You could create your own simple indexing service, perhaps using a FileSystemWatcher; that would give you sub-second response times no matter how many documents are added.
First, I suggest you use Stopwatch instead of DateTime to measure the elapsed time.
Second, to make your search faster, you shouldn't copy the result of GetFiles into a List; use the returned array directly.
Finally, optimize your search pattern: you want every .doc and .docx file, so try "*.doc?".
Here is my suggestion:
var sw = new Stopwatch();
sw.Start();
var matches = Directory.GetFiles(MainFolder, "*.doc?", SearchOption.AllDirectories);
sw.Stop();
MessageBox.Show(matches.Length + " matches in " + sw.Elapsed.TotalSeconds + " seconds");
I want to name my file after the current time in milliseconds since 1970.
At the moment I just have a counter and increment it after every new file. But when the app restarts, the counter goes back to zero and I overwrite the files when I start saving them again.
So I was thinking that if I just use the time in seconds or milliseconds, I won't have this problem.
So my question is: how do I get the time in milliseconds on Windows Mobile?
This is what I am currently doing to generate my file names:
string fileName = savedCounter + ".jpg";
You can use Ticks
A single tick represents one hundred nanoseconds or one ten-millionth
of a second. There are 10,000 ticks in a millisecond.
DateTime unixEpoch = new DateTime(1970, 1, 1);
DateTime currentDate = DateTime.Now;
// 10,000 ticks per millisecond
long totalMilliseconds = (currentDate.Ticks - unixEpoch.Ticks) / 10000;
Console.WriteLine(totalMilliseconds);
string fileName = string.Concat(totalMilliseconds, ".jpg");
Console.WriteLine(fileName);
Are you just using the milliseconds to generate a unique filename? If so, you might be better off using Guid.NewGuid().ToString().
DateTime.UtcNow gives you the current UTC time, and new DateTime(1970,1,1,0,0,0,DateTimeKind.Utc) gives you 1970.
So you could use:
var savedCounter = Math.Round((DateTime.UtcNow - new DateTime(1970,1,1,0,0,0,DateTimeKind.Utc)).TotalSeconds);
Some alternative naming strategies include:
Given that not many copies of your app were around in 1970, you could probably use a baseline date like new DateTime(2012,1,1,0,0,0)
You could also use a DateTime.ToString format like yyyyMMddHHmmss (capital HH for the 24-hour clock, so names stay in order and don't repeat after noon) to get a string based on a date; this can also be easier for a human to read (e.g. using the debugger or the isolated storage explorer). See the small example after this list.
Aside - for performance reasons be aware that you shouldn't create too many files in one directory - http://appangles.com/blogs/mickn/wp7/how-many-files-are-too-many-files-for-windows-phone-7-isolated-storage-performance/ - at some point it makes sense to use a single file instead (e.g. a database)
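For instance, a one-line sketch of such a date-based name:
// UtcNow also avoids duplicate names around daylight saving transitions
string fileName = DateTime.UtcNow.ToString("yyyyMMddHHmmss") + ".jpg";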
Either use time or, even better with your current architecture, save the current counter into IsolatedStorageSettings. It's easy to use: http://msdn.microsoft.com/en-us/library/cc221360(v=vs.95).aspx
Even if you use a timestamp to generate the name of the file, if multiple instances of your application can run concurrently, then there's still a chance of conflicts. Regardless of whether or not you use a timestamp, you may want to do something like the following:
Initialize a counter to 0
Generate a name for the file, incorporating the counter into its name.
Try to create the file, opening it for exclusive R/W access and requiring that the file not already exist.
If the file creation failed for some reason, increment the counter and repeat steps 2-4.
In fact, this is most likely what routines like System.IO.Path.GetTempFileName() do.
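A minimal sketch of that loop (the naming scheme is just an example):
// requires: using System.IO;
int counter = 0;
FileStream stream = null;
while (stream == null)
{
    string name = "file_" + counter + ".dat"; // hypothetical naming scheme
    try
    {
        // FileMode.CreateNew throws if the file already exists,
        // which reserves the name atomically.
        stream = new FileStream(name, FileMode.CreateNew, FileAccess.ReadWrite, FileShare.None);
    }
    catch (IOException)
    {
        counter++; // name taken; try the next one
    }
}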
It would be better to use a Guid instead of DateTime:
string fileName = System.Guid.NewGuid() + ".jpg";
I need to build a unique filename in a multithreaded application which is serializing some data on the disk.
What approach could I use to ensure a unique name.
The app was not multithreaded before and was using Ticks. When using multiple threads, it failed much faster than I expected.
I have now added the current thread ID to the filename, and that should do it:
string.Format("file_{0}_{1}.xml", DateTime.Now.Ticks, Thread.CurrentThread.ManagedThreadId)
Is there any "smarter" way of doing this?
What about Guid.NewGuid() instead of the thread id?
string.Format("file_{0}_{1:N}.xml", DateTime.Now.Ticks, Guid.NewGuid())
By keeping the ticks as part of the name, the names will still be in approximate date order if ordering is important.
The {1:N} gets rid of the curly braces and dashes in the guid.
Also, consider using DateTime.UtcNow.Ticks, so as to guarantee incremental ticks when daylight saving time kicks in.
Depending on your needs you can try:
System.IO.Path.GetTempFileName();
This creates a uniquely named temporary file in the %temp% directory. I suppose you can copy the file to your target location before or after writing to it.
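A sketch of that flow (targetPath is a hypothetical final destination):
// GetTempFileName creates, and thereby reserves, a zero-byte file in %temp%
string tempPath = Path.GetTempFileName();
File.WriteAllText(tempPath, "serialized data");
// Move it to the final location; this throws if targetPath already exists
File.Move(tempPath, targetPath);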
To actually reserve the filename you will have to create the file immediately and check for exceptions.
Another option would be to include a Guid value in your filename.
How about
Guid.NewGuid().ToString()