I'm new to writing Windows services. I used to make web applications.
I need to make this service that checks directories for files. If a certain trigger-file is present, the files in the directory get copied, and other stuff gets done.
My service is almost ready.
It works fine, but I have one issue.
If the trigger-file is present, the copying and processing starts.
But at the same time, the service keeps checking my directories.
So at a given point, it comes back to the directory that is being copied.
How can I prevent it from recopying the directory?
(I hope I am clear in my explanation)
I found my solution just as I was re-reading my question and looking at your comments.
I can simply delete my trigger file, and then the service will skip the folder.
Sorry for your trouble.
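For anyone doing something similar, a minimal sketch of that approach might look like the following. The root path, trigger-file name, and ProcessDirectory method are all illustrative, not taken from the actual service:

using System.IO;

// Delete the trigger file before processing, so the next scan pass
// skips this directory instead of copying it again.
foreach (string dir in Directory.GetDirectories(rootPath))
{
    string triggerFile = Path.Combine(dir, "trigger.txt"); // hypothetical trigger name

    if (!File.Exists(triggerFile))
        continue; // no trigger: nothing to do here

    File.Delete(triggerFile); // remove the trigger first, so re-scans skip this folder
    ProcessDirectory(dir);    // copy the files and do the other work
}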
Related
I've got an application I'm taking over with a very strange issue.
The background: 60+ identical IIS applications running on Windows Server 2012, which I RDP into.
Each application is identical except for some image files and the web.config files. (yeah, I know)
The applications are not compiled; they just run as .cs files. There are no .csproj or .sln files either.
There is one compiled app, which runs as a scheduled task, and uses some of the files in each of the application folders.
The code is C# and I'm editing it with notepad++.
The issue:
I've been trying to update some of the code in one of the test applications (specifically, to update a log file and send emails), but my changes don't seem to be taking effect. The existing emails work, but my new one never appears, nor do my log files show up.
I tried to test it in another test application just to see if it failed there too, and found that that website came up with an error in some code on a specific line of a specific file.
The thing is, that line of code is not at the reported line number in the actual .cs file.
I then added another line higher up to see if I could get a divide by zero error.
Same result. Same line of code failed with the same line number. No change at all.
Seems like my code is being cached and I can't refresh it.
I tried making sure it wasn't being cached by the one scheduled task, and I recycled IIS entirely (at the root).
Still happening.
I know for a fact that it's not a matter of an exe hiding somewhere as two weeks ago I made a change to the code and it worked. My change showed up. I also know for a fact that I am editing the correct file. I opened the folder using Explore in IIS.
There are no obj folders. There is a bin folder in each application folder but nothing in the bin folder except nuget package dlls.
OK. So after I spent a week trying to set each app's web.config to use a tempDirectory based on the app name, I finally discovered that the previous coders had made two copies of the relevant file. Identical copies: one used by the scheduled task and one used by the website itself.
So all the time I was trying to detect changes in the result of my code, I was changing the one the scheduled task used.
Of course, in the several times I discussed the problem with the previous developer (who was still working with us), he never bothered to suggest the solution, although he's been working with this code for several years.
Updating the correct copy solved the problem. I've also discovered my head hair was shedding faster in the last few weeks than ever before.
I'm facing a very weird problem.
I have a C# project with a FolderWatcher (a FileSystemWatcher) handling Created events on a specific LAN path.
FolderWatcher.Created += new FileSystemEventHandler(FSW_Created);
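For context, a minimal sketch of such a watcher setup might look like this; the UNC path and filter are placeholders, and only the Created hookup itself is from the question:

using System.IO;

// Sketch of the watcher setup; path and filter are placeholders.
FileSystemWatcher FolderWatcher = new FileSystemWatcher(@"\\nas\dropLocation");
FolderWatcher.Filter = "*.*";
FolderWatcher.Created += new FileSystemEventHandler(FSW_Created);
FolderWatcher.EnableRaisingEvents = true;

// FSW_Created(object sender, FileSystemEventArgs e) receives e.FullPath,
// the path of the newly copied notification file.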
I'm working simultaneously with two systems on the NAS, so when one system finishes its work, it "tells" the other by copying a file (File.Copy) to some location.
The second system is listening to that location (don't ask why it's implemented that way... it's a long story).
When the copy of the original file is created, the second system searches for the original file.
This works for all these files, except that sometimes I get a message that the original's directory doesn't exist...
I see the directory is correct, and it happens only for a few such files. It has to exist (and it actually exists) because the copy came from there....
If anyone has a clue - I'll be glad.
When the error happens, I end up inside the following line of code:
if (!Directory.Exists(directoryPath))
Currently I just added a patch: try 50 times, once per second, until the directory is found... I can't check it now, but I guess it will work.
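A minimal sketch of that retry patch, using the 50 attempts and one-second delay mentioned above (the exception choice at the end is illustrative):

using System.IO;
using System.Threading;

bool found = false;
for (int attempt = 0; attempt < 50 && !found; attempt++)
{
    found = Directory.Exists(directoryPath);
    if (!found)
        Thread.Sleep(1000); // wait a second and check again
}

if (!found)
    throw new DirectoryNotFoundException(directoryPath); // give up after ~50 seconds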
Thanks
I have the following piece of code in my application:
if (!Directory.Exists(myPath))
Directory.CreateDirectory(myPath);
If I run it in a regular unit test, sometimes it passes and sometimes it doesn't. The directory is always there (I made sure of it, so technically it should never be "created" by code). But every once in a while Directory.Exists(myPath) returns false, which makes the code try to create the folder, and then I get an UnauthorizedAccessException!
The funny thing is that if I put a breakpoint on the CreateDirectory line and then drag the yellow arrow back up to the test, the test returns true!
What's going on?
myPath is \\nameOfLocalMachine\sharedFolder. The share is reliable and constantly used... .NET 4.0
I just made Fiddler simulate 3000 sequential requests. 175 failed... all with the same message:
Access to the path '\\nameOfLocalMachine\sharedFolder\randomFileName.json' is denied
This mishap is pretty normal on Windows. Programs open a handle on a directory like this and specify delete sharing. Which permits anybody to delete the directory, even though the program is using it. The directory won't actually disappear from the file system until that handle is closed. What follows is that trying to recreate that directory cannot work, it still exists. Windows generates an "access denied" error, reported in your C# program with the UnauthorizedAccessException.
While that sounds like an obscure feature, every program in Windows does this. Every process has a default working directory, the value of Environment.CurrentDirectory. Creating a handle on such a directory ensures that it cannot disappear while the program is using it. There are other cases: FileSystemWatcher would be another example, or a program busy iterating the directory. Anti-malware scanners and search indexers are notorious, hard-to-diagnose sources of such errors.
Otherwise this is a standard hazard of a multi-tasking operating system: you are not the only one using the file system. Not repeatedly deleting and creating the same directory ought to be very high on your list. If it is absolutely necessary, then rename the directory before you delete it. You'd still fail to delete the renamed directory, but you won't fail recreating it; you can delete it later, the next time you need to do this. The odds of trouble are much lower then, because more time has passed.
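A minimal sketch of that rename-first idea (the temporary name scheme and cleanup policy are just examples):

using System;
using System.IO;

// Move the directory out of the way, recreate the original name right away,
// and delete the renamed copy later. Anyone still holding a handle blocks
// the delete of the renamed folder, not the recreation of the original one.
string doomed = path + "." + Guid.NewGuid().ToString("N"); // example temp name
Directory.Move(path, doomed);
Directory.CreateDirectory(path);

try
{
    Directory.Delete(doomed, recursive: true);
}
catch (IOException)
{
    // still in use; leave it for a later cleanup pass
}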
I currently store a serialized XML file in the application directory that contains all changes specific to the program operation (not typical system or user configuration). Weeks ago, we started running into problems where it would not save correctly (read my previous question about this).
Long story short, we finally discovered that Windows 7 (and sometimes Vista) has an issue with writing into the application directory (specifically anything under Program Files). Now, if this were a normal configuration file I would simply store it under the user's APPDATA folder, but it is not normal. We run this on our own instrumentation, and misconfigurations are 99% of the reason customers have issues running our software. So we need this file to be accessible such that they can easily find it and email it to us. APPDATA is hard enough for experienced users to find, much less very non-technical people.
We've also tried running it as Administrator and making the folder permissions wide open (we have control over every computer it runs on; it will never run on some random person's machine). But these sometimes work and sometimes do not.
The worst part is that when I write the file back out, it doesn't even throw an error; it simply writes it to some temporary directory that expires at some unknown point in time. Weeks later, our user will have an issue, and the configuration file is all messed up.
So, my question is where should I be storing this file, if not in Program Files? Should I just put it in APPDATA anyway, and make a small utility that emails it to us automatically in case of a problem? Or can I leave it in Program Files, but change some specific permission or registry key to allow it to operate normally?
It depends on whether or not the user needs to edit the file directly. If not, you should put it in %APPDATA%, which you can access via:
Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData)
Otherwise, you might put it in My Documents:
Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments)
Either way, putting it in Program Files is not a good idea. As you discovered, there are permission issues, even if running as Administrator.
For those users, you could build in a button that opens this directory. You could put it in an inconspicuous place that you can later direct them to.
For users that have an email client on their box, you could have a button that creates a new email with a subject line and automatically attaches the file.
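A minimal sketch of both ideas; the company, product, and file names here are only examples:

using System;
using System.Diagnostics;
using System.IO;

// Build the config path under %APPDATA%.
string configDir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
    "MyCompany", "MyInstrument");
Directory.CreateDirectory(configDir); // no-op if it already exists
string configPath = Path.Combine(configDir, "settings.xml");

// An "open config folder" button could simply shell out to Explorer,
// so non-technical users can find the file and attach it to an email.
Process.Start("explorer.exe", configDir);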
When I call FileInfo(path).LastAccessTime or FileInfo(path).LastWriteTime on a file that is in the process of being written, it returns the time the file was created, not the last time it was written to (i.e. now).
Is there a way to get this information?
Edit: To all the responses so far: I hadn't tried Refresh(), but that does not do it either. I still get back the time the file was first written to. The same goes for the static method and for creating a new instance of FileInfo.
Codymanix might have the answer, but I'm not running Windows Server (using Windows 7), and I don't know where the setting is to test.
Edit 2: Nobody finds it interesting that this function doesn't seem to work?
The FileInfo values are only loaded once and then cached. To get the current value, call Refresh() before getting a property:
FileInfo f = new FileInfo(path);
f.Refresh();
DateTime t = f.LastAccessTime;
Another way to get the current value is by using the static methods on the File class:
DateTime t = File.GetLastAccessTime(path);
Starting in Windows Vista, last access time is not updated by default. This is to improve file system performance. You can find details here:
http://blogs.technet.com/b/filecab/archive/2006/11/07/disabling-last-access-time-in-windows-vista-to-improve-ntfs-performance.aspx
To reenable last access time on the computer, you can run the following command:
fsutil behavior set disablelastaccess 0
As James has pointed out, LastAccessTime is not updated.
LastWriteTime has also undergone a twist since Vista. While the writing process still has the file open, another process checking LastWriteTime will not see the new write time for a long time -- not until the writing process has closed the file.
As a workaround, you can open and close the file from your external process. After you have done that, you can read the LastWriteTime again, which is then up to date.
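A minimal sketch of that open-and-close workaround, assuming the writing process allows shared read access:

using System;
using System.IO;

// Open and immediately close the file; per the observation above, this is
// enough to make an up-to-date LastWriteTime visible to other processes.
using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    // nothing to do: opening and closing the handle is the point
}

DateTime lastWrite = File.GetLastWriteTime(path); // now reflects recent writes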
File System Tunneling:
If an application implements something like a rolling logger, which closes the file and then renames it to a different file name, you will also run into issues, since the OS remembers the creation time and file size of the "old" file even though you created a new one. This includes wrong reports of the file size, even if you recreated log.txt from scratch and it is still 0 bytes in size. This feature is called file system tunneling, and it is still present on Windows 8.1. For an example of how to work around this issue, check out the RollingFlatFileTraceListener from Enterprise Library.
You can see the effects of file system tunneling on your own machine from the cmd shell.
echo test > file1.txt
ren file1.txt file2.txt
Wait one minute
echo test > file1.txt
dir /tc file*.txt
...
05.07.2015 19:26 7 file1.txt
05.07.2015 19:26 7 file2.txt
The file system is a state machine. Keeping states correctly synchronized is hard if you care about performance and correctness.
This strange tunneling syndrome is apparently still relied on by applications which, e.g., autosave a file, move it to a safe location, and then recreate the file at the same location. For these applications it makes no sense to give the file a new creation date, because it was only copied around. Some installers also do such tricks, moving files temporarily to a different location and writing the contents back later to get past a file-exists check in some install hooks.
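A minimal sketch of one way to defeat tunneling when recreating a log file: stamp the creation time yourself after the file is recreated (this is an illustration, not the exact Enterprise Library code):

using System;
using System.IO;

// Recreate the log file and explicitly reset its creation time, so the
// tunneled timestamp from the just-deleted file does not carry over.
File.Delete(logPath);                        // or rename the old log away first
using (File.Create(logPath)) { }             // recreate an empty log file
File.SetCreationTime(logPath, DateTime.Now); // overwrite the tunneled creation date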
Have you tried calling Refresh() just before accessing the property (to avoid getting a cached value)? If that doesn't work, have you looked at what Explorer shows at the same time? If Explorer is showing the wrong information, then it's probably something you can't really address - it might be that the information is only updated when the file handle is closed, for example.
There is a setting in Windows, often enabled especially on server systems, that stops modified and accessed times for files from being updated, for better performance.
From MSDN:
When first called, FileSystemInfo calls Refresh and returns the cached information on APIs to get attributes and so on. On subsequent calls, you must call Refresh to get the latest copy of the information.
FileSystemInfo.Refresh()
If your application is the one doing the writing, I think you are going to have to "touch" the file by setting the LastWriteTime property yourself between each buffer of data you write. Some pseudocode:
while (bytesWritten < totalBytes)
{
    br.Write(buffer);                         // write the next chunk
    bytesWritten += buffer.Length;
    myFileInfo.LastWriteTime = DateTime.Now;  // "touch" the file
}
I'm not sure how severely this will affect write performance.
Tommy Carlier's answer got me thinking....
A good way to visualise the differences is to separately run the two snippets below (I just used LINQPad) while also running Sysinternals Process Monitor.
while (true)
    File.GetLastAccessTime([file path here]);
and
FileInfo bob = new FileInfo(path);
while (true)
{
    string accessed = bob.LastAccessTime.ToString();
}
If you look at Process Monitor while running the first snippet, you will see repeated, constant access attempts to the file from the LINQPad process. The second snippet will do an initial access of the file, for which you will see activity in Process Monitor, and then very little afterwards.
However, if you go and modify the file (I just opened the text file I was monitoring via FileInfo, added a character, and saved it), you will see a series of access attempts by the LINQPad process to the file in Process Monitor.
This illustrates the non-cached and cached behaviour of the two approaches, respectively.
Will the non-cached approach wear a hole in the hard drive?!
EDIT
I went away feeling all clever about my testing and then used the caching behaviour of FileInfo in my Windows service (basically to sit in a loop asking "has-the-file-changed-has-the-file-changed..." before doing processing).
While this approach worked on my dev box, it did not work in the production environment, i.e. the process just kept running regardless of whether the file had changed or not. I ended up changing my approach to the check and just used GetLastAccessTime as part of it. I don't know why it would behave differently on the production server... but I am not too concerned at this point.
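For reference, a minimal sketch of a polling check built on the static, non-cached calls; the interval and ProcessFile handler are illustrative, and GetLastWriteTime is used here because, as noted above, last-access updates may be disabled:

using System;
using System.IO;
using System.Threading;

// Poll with the static File methods, which hit the file system on every call
// instead of returning a cached FileSystemInfo value.
DateTime lastSeen = File.GetLastWriteTime(path);

while (true)
{
    Thread.Sleep(5000); // illustrative polling interval
    DateTime current = File.GetLastWriteTime(path);

    if (current != lastSeen)
    {
        lastSeen = current;
        ProcessFile(path); // hypothetical handler for the changed file
    }
}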