I just noticed a strange behavior when deleting and re-creating a file: the file creation time does not get updated if the interval between the two operations is short enough.
I ran the following code:
File.Delete("hello");
using (var stream = new StreamWriter("hello"))
{
    stream.WriteLine("hello");
}
var f = new FileInfo("hello");
Console.WriteLine("{0} {1}ms", f.CreationTime, f.CreationTime.Millisecond);
If I put a breakpoint on the using(...) line, I can see the file disappear after the delete, but at the end it still gives me the old creation date. I can even delete the file manually from Explorer and then run this code; it still shows the old creation time.
However, if I wait some time (around a minute) between the deletion and recreation, the creation time is set correctly (this also works if I wait with the debugger on the breakpoint mentioned above).
Where does this come from? Is it a documented Windows behavior? Am I forgetting something?
PS: I'm testing this on Windows XP, for what it's worth.
This is a known problem/feature in Windows called file tunneling.
Ref:
http://support.microsoft.com/kb/172190
Related:
https://serverfault.com/questions/92757/incorrect-file-creation-date-in-windows-xp-vista
Why Windows sets new created file's "created time" property to old time?
Windows filesystem: Creation time of a file doesn't change when it is deleted and created again
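If the stale timestamp itself is what hurts you, one workaround (my suggestion, not from the KB article) is to stamp the creation time explicitly after recreating the file:

File.Delete("hello");
using (var stream = new StreamWriter("hello"))
{
    stream.WriteLine("hello");
}
// Tunneling may have carried the old creation time over; overwrite it.
File.SetCreationTime("hello", DateTime.Now);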
Try forcing a refresh of your FileInfo (because the data is cached):
var f = new FileInfo("hello");
f.Refresh(); // Force to read all props again
Console.WriteLine("{0} {1}ms", f.CreationTime, f.CreationTime.Millisecond);
I have three ObservableCollections in my ViewModel and one class, all of which I load when the app starts.
To ensure the ObservableCollections are deserialized, I just have:
if (SomeCollection.Count == 0)
    ThisCollection = await deserializationMethod<ObservableCollection<T>>(filename);
If there is no file, deserializationMethod will create a new object with
return Activator.CreateInstance<T>();
That works fine - no problem with that.
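For context, a minimal sketch of what such a deserializationMethod might look like (this is my reconstruction using DataContractSerializer, not the asker's actual code):

// Hypothetical reconstruction: read T from local storage, or return a
// fresh instance when the file does not exist yet.
public async Task<T> deserializationMethod<T>(string filename)
{
    try
    {
        using (var stream = await ApplicationData.Current.LocalFolder.OpenStreamForReadAsync(filename))
        {
            var serializer = new DataContractSerializer(typeof(T));
            return (T)serializer.ReadObject(stream);
        }
    }
    catch (FileNotFoundException)
    {
        return Activator.CreateInstance<T>(); // no file yet: start empty
    }
}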
And for the class I have:
if (ClassObject.Loaded != true)
    ThisObject = await deserializationMethod<T>(filename);
I added a property that is set to true once the file is deserialized.
It looks like it works, but it does NOT. It happens very rarely, but sometimes the file is not deserialized, and as you use the app the file gets overwritten, so all the data is destroyed. I cannot find what is causing the problem. You just run the app and it happens - maybe once per 100 runs.
How can I be sure that if the file exists, it will actually be deserialized?
Or maybe I should put these ObservableCollections plus the class into one list and serialize it to a single file? Is there any good practice for that?
EDIT:
I used a SemaphoreSlim to ensure that everything is used as it is supposed to be, but today it happened again.
The thing is, it happens right when the app starts, before anything has even been tapped. There is no way something is writing at that moment. It looks like the data is not deserialized, or the existing file is not being read. Because all changes are written when the app closes, everything is then gone.
Any other ideas what it might be, or how to be sure the data is deserialized?
EDIT FINAL - reproduced problem:
I finally reproduced what is going on, so I've removed the earlier edits with code that isn't necessary here.
I have a BackPressed event handler for when the user navigates back, or exits the app if on MainPage.
This part of the code was apparently causing the problem. Here is what is going on exactly.
First, the problem cannot be reproduced on the emulator.
My BackPressed method contained an await on the serializing method that saved the data that was later gone (so, as Ondrej Svejdar wrote, it was writing before reading). BUT I started to test it, and there is strange behaviour, and I still have some questions about it.
How it happens:
When I start the app (by accident, say) and the loading screen appears, I tap the back button a few times -> the app never really runs, it closes ASAP and I can't even see the UI (sometimes I can see the AppBar for a moment). Then when I try to open the app again (immediately or later, it doesn't matter) it is "resuming", and after this exact moment my data is gone. But not all of the data - only the last one saved with an await within the BackPressed method. Only that one. I tried saving one, two and three ObservableCollections, with and without the class, and ALWAYS the last one was saved "empty". After these awaits I call Application.Current.Exit(), which might cause this, but I'm not sure why it should matter when the serializing method is a Task and only the last one is wrongly serialized.
When I remove these awaits from the BackPressed method I can't reproduce the issue, so that's it.
Questions I still have: Is this behavior expected? Is there a better way to close an app and make sure data is serialized, or should I just save while the app is being used rather than while exiting it?
If someone is interested in how to do this properly, I thought about it and came up with a few conclusions.
Keep in mind that these are my suggestions and there may be a better approach.
While handling the BackPressed event (the hardware button), I had an implementation that goes back to the previous page (if not on MainPage) or leaves the app if on MainPage.
I was closing the app using Application.Current.Exit(), and that wasn't causing problems (because I was saving very small files) until I started doing strange things (read "EDIT FINAL - reproduced problem:" in the question for details).
The thing was that the file wasn't saved because the app was closed before writing had finished. The solution is actually very simple: my Save method, which is a Task, should just return true when writing has finished, and that value should be checked before closing the app.
bool saved = await saveDataAsync<T>(whichData, dataFileName);
if (saved)
    Application.Current.Exit();
and the serializing method looks like this (I'm using a SemaphoreSlim in case two methods try to reach the same file at once):
public async Task<bool> saveDataAsync<T>(T whichData, string dataFileName)
{
    var Serializer = new DataContractSerializer(typeof(T));
    await mySemaphoreSlim.WaitAsync();
    try
    {
        using (var stream = await ApplicationData.Current.LocalFolder.OpenStreamForWriteAsync(dataFileName, CreationCollisionOption.ReplaceExisting))
        {
            Serializer.WriteObject(stream, whichData);
        }
    }
    finally
    {
        mySemaphoreSlim.Release();
    }
    return true;
}
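An alternative worth considering (my suggestion, not part of the original answer) is to save state in the application's Suspending event rather than on back-press, taking a deferral so the system waits for the async write to finish before tearing the app down:

// Sketch: register in App's constructor; assumes the saveDataAsync<T>
// method above and a collection named MyData exist.
Application.Current.Suspending += async (sender, e) =>
{
    var deferral = e.SuspendingOperation.GetDeferral(); // keep the app alive
    try
    {
        await saveDataAsync(MyData, "mydata.xml");      // finish the write
    }
    finally
    {
        deferral.Complete();                            // let suspension proceed
    }
};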
(I know it's a common problem, but I couldn't find an exact answer.)
I need to write a Windows service that monitors a directory and, upon the arrival of a file, opens it, parses the text, does something with it, and moves it to another directory afterwards. I used the IsFileLocked method mentioned in this post to find out if a file is still being written. My problem is that I don't know how long it takes the other party to finish writing the file. I could wait a few seconds before opening the file, but this is not a perfect solution, since I don't know at what rate the file is being written and a few seconds may not suffice.
Here's my code:
while (true)
{
    var d = new DirectoryInfo(path);
    // Note: FileInfo is not IComparable, so order by name (or creation time).
    var files = d.GetFiles("*.txt").OrderBy(f => f.Name);
    foreach (var file in files)
    {
        if (!IsFileLocked(file))
        {
            //process file
        }
        else
        {
            //???
        }
    }
}
I think you could use a FileSystemWatcher (more info about it here: http://msdn.microsoft.com/it-it/library/system.io.filesystemwatcher(v=vs.110).aspx).
Specifically, you could hook the Changed event, and when it is raised, check IsFileLocked to verify whether the file is still being written.
This strategy saves you from actively waiting by polling.
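A minimal sketch of that approach (my code, assuming a console host; the exclusive open stands in for the IsFileLocked check):

using System;
using System.IO;
using System.Threading;

class IncomingFileWatcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\inbox", "*.txt"); // hypothetical path
        watcher.Created += (s, e) => TryProcess(e.FullPath);
        watcher.Changed += (s, e) => TryProcess(e.FullPath);
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the host alive
    }

    static void TryProcess(string path)
    {
        // The writer may still hold the file open, so retry a few times.
        for (int attempt = 0; attempt < 10; attempt++)
        {
            try
            {
                // An exclusive open only succeeds once the writer is done.
                using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                }
                // parse and move the file here
                return;
            }
            catch (IOException)
            {
                Thread.Sleep(500); // still locked: wait and try again
            }
        }
    }
}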
Occasionally our site slows down and the RAM usage goes up massively. Then the app pool stops and I have to restart it. Then it's OK for a few days before the RAM suddenly spikes again and the app pool soon stops. The CPU isn't high.
Before the app pool stops, I've noticed that one of our pages always hangs. The line it hangs on is a foreach over a ResourceSet:
var englishLocations = Lang.Countries.ResourceManager.GetResourceSet(new CultureInfo("en-GB"),true,true);
foreach(DictionaryEntry entry2 in englishLocations) // THIS LINE HANGS
We have the same code deployed on a different box where this doesn't happen. The main differences between the two boxes are:
Bad box
Windows Server 2008 R2 Standard SP1
IIS 7.5.7600.16385
.NET 4.5
24GB RAM
Good box
Windows Server 2008 SP2
IIS 7.0.6000.16386 SP2
.NET 4.0
24GB RAM
I've tried adding uploadReadAheadSize="0" to the web.config, as described here:
http://rionscode.wordpress.com/2013/03/11/resolving-controller-blocking-within-net-4-5-and-asp-net-mvc/
Which didn't work.
Why would foreach hang? It's hanging on the very first item, actually on the foreach.
Thanks.
I know it is an old post, but nevertheless... There is the potential for a deadlock when iterating over a ResourceSet and at the same time retrieving some other object from the same resources.
The problem is that when using a ResourceSet, the iterator takes a lock on the internal resource cache of the ResourceReader (http://referencesource.microsoft.com/#mscorlib/system/resources/resourcereader.cs,1389) and then, in the method AllocateStringForNameIndex, takes a lock on the reader itself (http://referencesource.microsoft.com/#mscorlib/system/resources/resourcereader.cs,447):
lock (_reader._resCache) {
    key = _reader.AllocateStringForNameIndex(_currentName, out _dataPosition); // locks the reader
Getting an object takes the same locks in the opposite order:
http://referencesource.microsoft.com/#mscorlib/system/resources/runtimeresourceset.cs,300
and http://referencesource.microsoft.com/#mscorlib/system/resources/runtimeresourceset.cs,335
lock (Reader) {
    ....
    lock (_resCache) {
        _resCache[key] = resLocation;
    }
}
This can lead to a deadlock. We had this exact issue recently.
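To illustrate the lock-ordering problem in isolation (a contrived sketch, not the actual framework code):

using System;
using System.Threading;

class LockOrderDeadlockDemo
{
    static readonly object ResCache = new object(); // stands in for _resCache
    static readonly object Reader = new object();   // stands in for the reader

    static void Main()
    {
        // Thread A mimics the enumerator: cache lock first, then reader lock.
        var a = new Thread(() =>
        {
            lock (ResCache)
            {
                Thread.Sleep(100); // widen the window for the bad interleaving
                lock (Reader) { }
            }
        });
        // Thread B mimics the resource lookup: reader lock first, then cache lock.
        var b = new Thread(() =>
        {
            lock (Reader)
            {
                Thread.Sleep(100);
                lock (ResCache) { } // both threads now wait on each other forever
            }
        });
        a.Start(); b.Start();
        a.Join(); b.Join(); // never returns once the deadlock hits
        Console.WriteLine("finished (only if the timing happened to avoid the deadlock)");
    }
}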
I experienced a very similar problem.
Every once in a while IIS would hang, and I would see a number of requests just sitting there. They were all in the ExecuteRequestHandler state, with the ManagedPipelineHandler module name.
After investigating with Process Explorer, I could see that all of them were sitting at mscorlib.dll!ResourceEnumerator.get_Entry; the stack trace also suggested some NGen action and then ntdll.dll!WaitForMultipleObjects.
My working hypothesis is that when multiple threads start enumerating those resources, we can run into a deadlock (possibly on some native code file generation), and all subsequent threads then just keep piling up.
To resolve it, I just created a critical section around this code block, to ensure that it is executed sequentially - I haven't experienced the issue since.
private static readonly object ResourceLock = new object();

public static MvcHtmlString SerializeGlobalResources(this HtmlHelper helper)
{
    lock (ResourceLock)
    {
        // Existing code goes here ....
    }
}
Building on another answer, to give you some idea: how about using a try/catch model?
Perhaps it hangs because that resource isn't available, is locked, has the wrong permissions, etc.
var englishLocations = Lang.Countries.ResourceManager.GetResourceSet(new CultureInfo("en-GB"),true,true);
foreach(DictionaryEntry entry2 in englishLocations) // THIS LINE HANGS
ResourceManager CultureResourceManager = new ResourceManager("My.Language.Assembly", System.Reflection.Assembly.GetExecutingAssembly());
ResourceSet resourceSet = CultureResourceManager.GetResourceSet(new CultureInfo("sv-SE"), true, true);

try { resourceSet.GetString("my_language_resource"); }
catch (Exception ex) { /* log your error ex to wherever you like */ }
I am writing some C# code and I need to detect if a specific folder on my Windows file system has been opened while the application is running. Is there any way to do it? WinAPI maybe?
There are three API things I think you should check out:
FindFirstChangeNotification() http://msdn.microsoft.com/en-us/library/aa364417%28VS.85%29.aspx
That gives you a handle you can wait on and use to find changes to a file in a particular file, directory, or tree of directories. It won't tell you when a directory is browsed, but it will tell you when a file is saved, renamed, and so on (see the P/Invoke sketch after this list).
SetWindowsHookEx() http://msdn.microsoft.com/en-us/library/ms644990%28v=VS.85%29.aspx
You can set that up to give you a callback when any number of events occur - in fact I'm pretty positive that you CAN get this callback when a directory is opened, but it will probably be inordinately difficult because you'll be intercepting messages to Explorer's window. So you'll be rebooting during debugging.
Windows Shells http://msdn.microsoft.com/en-us/library/bb776778%28v=VS.85%29.aspx
If that wasn't painful enough, you can try writing a shell program.
If you're trying to write a rootkit, I suppose you don't want me to spoil the details for you. If you're NOT trying to write a rootkit, I suggest you look it up - carefully. There are open source rootkits, and they all basically have to monitor file access this way to hide from the user / OS.
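For the FindFirstChangeNotification() route above, here is a minimal P/Invoke sketch (the folder path is hypothetical; as noted, this fires on file changes in the folder, not on the folder merely being browsed):

using System;
using System.Runtime.InteropServices;

class ChangeNotificationDemo
{
    const uint FILE_NOTIFY_CHANGE_FILE_NAME  = 0x00000001;
    const uint FILE_NOTIFY_CHANGE_LAST_WRITE = 0x00000010;
    const uint INFINITE = 0xFFFFFFFF;

    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
    static extern IntPtr FindFirstChangeNotification(string lpPathName, bool bWatchSubtree, uint dwNotifyFilter);

    [DllImport("kernel32.dll")]
    static extern bool FindCloseChangeNotification(IntPtr hChangeHandle);

    [DllImport("kernel32.dll")]
    static extern uint WaitForSingleObject(IntPtr hHandle, uint dwMilliseconds);

    static void Main()
    {
        // Watch a hypothetical folder for renames and saves.
        IntPtr handle = FindFirstChangeNotification(
            @"C:\watched-folder", false,
            FILE_NOTIFY_CHANGE_FILE_NAME | FILE_NOTIFY_CHANGE_LAST_WRITE);
        if (handle == new IntPtr(-1))
            throw new InvalidOperationException("could not set up the watch");

        WaitForSingleObject(handle, INFINITE); // blocks until something changes
        Console.WriteLine("Change detected");
        FindCloseChangeNotification(handle);
    }
}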
Go with the Windows Shell Extensions. You can use Shell Namespace Extensions to make a "virtual" folder that isn't there (or hides a real one), like the GAC (C:\Windows\assembly).
Here are several examples of Shell Extension coding in .Net 4.0.
A Column Handler would let you know when a folder is "Opened", and even let you provide extra data for each of the files (new details columns).
Check out the FileSystemWatcher class.
The closest thing I can think of that may be useful to you is the static Directory class. It provides methods to determine the last time a file or directory was accessed. You could set up a BackgroundWorker to monitor whether the directory was accessed during a specified interval. Keep track of the start and end of the interval using DateTime, and if the last access time falls between them, you can use the BackgroundWorker's ProgressChanged event to notify the application.
BackgroundWorker folderWorker = new BackgroundWorker();
folderWorker.WorkerReportsProgress = true;
folderWorker.WorkerSupportsCancellation = true;
folderWorker.DoWork += FolderWorker_DoWork;
folderWorker.ProgressChanged += FolderWorker_ProgressChanged;
folderWorker.RunWorkerAsync();
void FolderWorker_DoWork(object sender, DoWorkEventArgs e)
{
    BackgroundWorker worker = (BackgroundWorker)sender;
    while (!worker.CancellationPending)
    {
        DateTime lastAccess = Directory.GetLastAccessTime(DIRECTORY_PATH);
        //Check to see if lastAccess falls between the last time the loop started
        //and came to an end.
        if (/*your check*/)
        {
            object state = null; //Modify this if you need to send back data.
            worker.ReportProgress(0, state);
        }
    }
}
void FolderWorker_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    //Take action here from the worker.ReportProgress being invoked.
}
You could use FileSystemInfo's LastAccessTime property. The problem, though, is that it can be cached.
FileSystemInfo: http://msdn.microsoft.com/en-us/library/975xhcs9.aspx
LastAccessTime Property: http://msdn.microsoft.com/en-us/library/system.io.filesysteminfo.lastaccesstimeutc.aspx
As noted, this value can be pre-cached:
"The value of the LastAccessTimeUtc property is pre-cached if the current instance of the FileSystemInfo object was returned from any of the following DirectoryInfo methods:
GetDirectories
GetFiles
GetFileSystemInfos
EnumerateDirectories
EnumerateFiles
EnumerateFileSystemInfos
To get the latest value, call the Refresh method."
Therefore, call the Refresh method, but the value still might not be up to date due to Windows caching it. (According to the MSDN docs: "FileSystemInfo.Refresh takes a snapshot of the file from the current file system. Refresh cannot correct the underlying file system even if the file system returns incorrect or outdated information. This can happen on platforms such as Windows 98." - link: http://msdn.microsoft.com/en-us/library/system.io.filesysteminfo.refresh.aspx)
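A minimal sketch of the Refresh-then-read pattern (the path is hypothetical; note also that NTFS last-access updates can be disabled system-wide via the NtfsDisableLastAccessUpdate setting, in which case the value is stale no matter what):

var dir = new DirectoryInfo(@"C:\watched-folder"); // hypothetical path
dir.Refresh();                                     // discard the cached snapshot
Console.WriteLine("Last accessed: {0}", dir.LastAccessTime);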
I think the only way you can reliably achieve this is by monitoring the currently running processes and watch closely for new Explorer.exe instances and/or new Explorer.exe spawned threads (the "Run every window on a separate process" setting gets in the way here).
I admit I don't have a clue about how to code this, but that's what I would look for.
In our C# WinForms application, we generate PDF files and launch Adobe Reader (or whatever the default system .pdf handler is) via the Process class. Since our PDF files can be large (approx. 200K), we handle the Exited event to clean up the temp file afterwards.
The system works as required when a file is opened and then closed again. However, when a second file is opened (before closing Adobe Reader), the second process immediately exits (since Reader is now using its MDI powers), and in our Exited handler the File.Delete call should fail because the file is locked by the now-joined Adobe process. However, in Reader we instead get:
There was an error opening this document. This file cannot be found.
The unusual thing is that if I put a debugger breakpoint before the file deletion and allow it to attempt (and fail) the deletion, then the system behaves as expected!
I'm positive that the file exists and fairly positive that all handles/file streams to the file are closed before starting the process.
We are launching with the following code:
// Open the file for viewing/printing (if the default program supports it)
var pdfProcess = new Process();
pdfProcess.StartInfo.FileName = tempFileName;
if (pdfProcess.StartInfo.Verbs.Contains("open", StringComparer.InvariantCultureIgnoreCase))
{
    var verb = pdfProcess.StartInfo.Verbs.First(v => v.Equals("open", StringComparison.InvariantCultureIgnoreCase));
    pdfProcess.StartInfo.Verb = verb;
}
pdfProcess.StartInfo.Arguments = "/N"; // Specifies a new window will be used! (But not definitely...)
pdfProcess.SynchronizingObject = this;
pdfProcess.EnableRaisingEvents = true;
pdfProcess.Exited += new EventHandler(pdfProcess_Exited);
_pdfProcessDictionary.Add(pdfProcess, tempFileName);
pdfProcess.Start();
Note: We are using the _pdfProcessDictionary to store references to the Process objects so that they stay in scope and the Exited event can successfully be raised.
Our cleanup/exited event is:
void pdfProcess_Exited(object sender, EventArgs e)
{
    Debug.Assert(!InvokeRequired);
    var p = sender as Process;
    try
    {
        if (_pdfProcessDictionary.ContainsKey(p))
        {
            var tempFileName = _pdfProcessDictionary[p];
            if (File.Exists(tempFileName)) // How else can I check if I can delete it!!??
            {
                // NOTE: Will fail if the Adobe Reader application instance has been re-used!
                File.Delete(tempFileName);
                _pdfProcessDictionary.Remove(p);
            }
            CleanOtherFiles(); // This function will clean up files for any other previously exited processes in our dictionary
        }
    }
    catch (IOException ex)
    {
        // Just swallow it up, we will deal with trying to delete it at another point
    }
}
Possible solutions:
Detect that the file is still open in another process
Detect that the second process hasn't really been fully exited and that the file is opened in the first process instead
I just dealt with this a couple of days ago.
When there is no instance open already, the document opens in a new instance directly.
When there is an instance already open, I believe that instance spawns a new instance which you don't actually get a handle to. What happens is control returns to your function immediately, which then goes and deletes the file before the new instance has had a chance to read the file -- hence it appears to not be there.
I "solved" this by not deleting the files immediately, but keeping track of the paths in a list, and then nuking all of them when the program exits (wrap each delete in a try/catch with an empty catch block in case the file has disappeared in the meantime).
I would suggest the following approach:
Create the files in the user's temp directory (Path.GetTempPath); you can create a sub-folder under it.
Attempt to delete the files only when the last launched process has exited (i.e. count the processes you have launched, decrement the count on each exit, and when it reaches zero, attempt to delete all the files created so far); a sketch follows below.
Try to clean up the created sub-folder (under the temp directory) when starting and exiting the application. You can even attempt periodic clean-up using a timer.
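A minimal sketch of that reference-counting idea (all names are mine, not from the answer; assume the sub-folder is created at startup):

// Tracks how many viewer processes we have launched and sweeps the
// dedicated temp sub-folder once the last one exits.
private static int _openViewerCount = 0;
private static readonly string _pdfTempDir =
    Path.Combine(Path.GetTempPath(), "MyAppPdfCache"); // hypothetical sub-folder

private void LaunchViewer(string tempFileName)
{
    var p = new Process();
    p.StartInfo.FileName = tempFileName;
    p.EnableRaisingEvents = true;
    p.Exited += (s, e) =>
    {
        // When the last viewer goes away, try to remove everything at once.
        if (Interlocked.Decrement(ref _openViewerCount) == 0)
            TrySweepTempDir();
    };
    Interlocked.Increment(ref _openViewerCount);
    p.Start();
}

private static void TrySweepTempDir()
{
    foreach (var file in Directory.GetFiles(_pdfTempDir))
    {
        try { File.Delete(file); }
        catch (IOException) { } // a straggler still holds it: the next sweep gets it
    }
}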