I have written a C# program that copies itself and moves into a specific directory. It works fine the first time it moves, but the second time the value of the current directory is wrong. To determine the file path I use:
string current = Directory.GetCurrentDirectory();
The second time the program wants to move, I call GetCurrentDirectory again, but the value of "current" is still the old path, and it gives me a FileNotFoundException.
What can I do to make GetCurrentDirectory() read the new path ?
You need to execute your application in a new process. You are likely still using the original process, and its current directory won't change to wherever you copied your application.
I don't know what you are trying to achieve with your application, but definitely execute each copy as a new process and let the current process terminate itself.
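If it helps, here is a minimal sketch of that relaunch pattern. It assumes you have already copied the exe to newExePath, a hypothetical destination of your choosing:

using System;
using System.Diagnostics;
using System.IO;

class Relauncher
{
    // newExePath is hypothetical; pass whatever destination you copied the exe to.
    static void RelaunchFrom(string newExePath)
    {
        var psi = new ProcessStartInfo(newExePath)
        {
            // Give the new process the copy's own folder as its working directory.
            WorkingDirectory = Path.GetDirectoryName(newExePath)
        };
        Process.Start(psi);

        // Let the current process terminate itself.
        Environment.Exit(0);
    }
}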
Keep track of where you moved it.
GetCurrentDirectory() is the working directory the program was executed from. That won't change when you simply move the executable (it would change if you executed it again from the new location). So either create a new process and exec the exe in its new location, or just keep track of the location you moved it to. The latter is far easier.
As a note, copying the exe file doesn't have any effect on the running instance of the program, since it's still running from the "old" location (in fact it's likely already loaded into memory at this point). To the running program, the copied exe is just a file. Why are you doing this?
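For the "keep track of it yourself" approach, a short sketch (targetDir here is just a hypothetical destination):

using System.IO;
using System.Reflection;

string exePath = Assembly.GetExecutingAssembly().Location;
string targetDir = @"C:\Target";   // hypothetical destination
string newPath = Path.Combine(targetDir, Path.GetFileName(exePath));

File.Copy(exePath, newPath, overwrite: true);

// Remember the destination yourself instead of asking GetCurrentDirectory(),
// which still reports the directory the process was started from.
string current = targetDir;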
If you want to execute the new exe and exit the old one, then use this API and call:
Process.Start("new/path/to/exe");
Environment.Exit(0);
If you are just in Main, you could simply return instead of calling Exit.
This should work. It might cause both programs to exit (if exiting kills child processes), but I don't think it does.
I am looking for an idea on how I can launch a C# process based on something happening on a Windows server. My first challenge is to determine when to start the first process. It needs to monitor an SFTP folder to see if a certain file type has been delivered. My initial thought was to have the Task Scheduler start a Perl script, have the script look to see if the file exists, and then start the process. But once it has started the process, I don't want it to look for the file again until the next day.
The second issue is that the first process moves files to another folder, and then a third-party application starts converting those files from PDF to text. The second process needs to start when this is done. I am not sure how to make this happen.
Thoughts?
Write a Windows service which uses a FileSystemWatcher to monitor for new files: https://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher(v=vs.110).aspx
That can then use File.Move to move the file out and into the alternate directory for further processing. https://msdn.microsoft.com/en-us/library/system.io.file.move(v=vs.110).aspx
I would use a Task for this and Task.ContinueWith to kick off the next 'stage' of your workflow, etc. You might also want to do a file copy first, then a file delete (instead of a move); that way, if something goes wrong during the copy, you still have your original to work with.
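Roughly, the shape could look like the sketch below. The folder paths and the ConvertFile step are placeholders standing in for your SFTP drop folder, processing folder and third-party conversion stage:

using System.IO;
using System.Threading.Tasks;

class IncomingFileHandler
{
    private readonly FileSystemWatcher watcher;

    public IncomingFileHandler()
    {
        // Hypothetical path and filter; point these at your SFTP drop folder.
        watcher = new FileSystemWatcher(@"C:\SftpDrop", "*.pdf");
        watcher.Created += OnCreated;
        watcher.EnableRaisingEvents = true;
    }

    private void OnCreated(object sender, FileSystemEventArgs e)
    {
        string destination = Path.Combine(@"C:\Processing", e.Name);

        Task.Run(() =>
        {
            // Copy first, delete after, so a failed copy leaves the original intact.
            File.Copy(e.FullPath, destination, overwrite: true);
            File.Delete(e.FullPath);
        })
        .ContinueWith(t => ConvertFile(destination));   // kick off the next 'stage'
    }

    private void ConvertFile(string path)
    {
        // Next stage of the workflow goes here.
    }
}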
I have the following piece of code in my application:
if (!Directory.Exists(myPath))
Directory.CreateDirectory(myPath);
If I run it in a regular unit test, sometimes it passes, sometimes it doesn't. The directory is always there (I made sure of it, so technically it will never be "created" by the code). But every once in a while Directory.Exists(myPath) returns false, which makes the code try to create the folder, and then I get an UnauthorizedAccessException!
The funny thing here is that if I put a breakpoint on the CreateDirectory call and then move the yellow arrow back up to the Exists test, the test returns true!
What's going on?
myPath is \\nameOfLocalMachine\sharedFolder. The share is reliable and constantly used... .NET 4.0
I just made Fiddler simulate 3000 sequential requests. 175 failed, all with the same message:
Access to the path '\\nameOfLocalMachine\sharedFolder\randomFileName.json' is denied
This mishap is pretty normal on Windows. Programs open a handle on a directory like this and specify delete sharing, which permits anybody to delete the directory even though the program is using it. The directory won't actually disappear from the file system until that handle is closed. What follows is that trying to recreate that directory cannot work; it still exists. Windows generates an "access denied" error, reported in your C# program as an UnauthorizedAccessException.
While that sounds like an obscure feature, every program in Windows does this. Every process has a default working directory, the value of Environment.CurrentDirectory. Creating a handle on such a directory ensures that it cannot disappear while the program is using it. There are other cases, FileSystemWatcher would be another example. Or a program busy iterating the directory. Anti-malware and search indexers are notorious for hard to diagnose sources of such errors.
Otherwise this is a standard hazard of a multi-tasking operating system: you are not the only one using the file system. Not repeatedly deleting and creating the same directory ought to be very high on your list. If it is absolutely necessary, then rename the directory before you delete it. You'd still fail to delete the renamed directory, but you won't fail recreating it. You can delete it later, the next time you need to do this; the odds of trouble are much lower then, because more time has passed.
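A sketch of that rename-before-delete idea (the ".stale" suffix is an arbitrary choice of mine, not anything Windows requires):

using System;
using System.IO;

static class DirectoryRecycler
{
    public static void Recreate(string path)
    {
        if (Directory.Exists(path))
        {
            // Rename first: the delete of the renamed directory may still be pending,
            // but recreating the original name no longer collides with it.
            string stale = path + "." + Guid.NewGuid().ToString("N") + ".stale";
            Directory.Move(path, stale);

            try { Directory.Delete(stale, recursive: true); }
            catch (IOException) { /* still in use; clean it up on a later pass */ }
            catch (UnauthorizedAccessException) { /* same: try again later */ }
        }

        Directory.CreateDirectory(path);
    }
}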
To put it simply: process A starts application B.
Whenever application B tries to do a relative file access such as
using(StreamReader sr = new StreamReader("log.txt"))
It accesses log.txt in the folder that process A is inside of, not the folder that application B is inside of.
Right now my current fix for this is to get my application's module file name + path, remove the file name and prepend the path variable onto all my relative file access calls.
What causes this and how can I avoid it?
In process A, where you launch application B, you should have specified the working folder.
Take a look at this: http://msdn.microsoft.com/en-us/library/system.diagnostics.processstartinfo.workingdirectory.aspx
EDIT (after clarification from OP that process A is the task scheduler, and therefore cannot be modified)
I think the Task Scheduler allows you to specify the working directory for the application that you schedule. But in any case, even if you cannot, you can always set the current directory of your application to the right place as soon as it starts, using Directory.SetCurrentDirectory().
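For example, something along these lines at the top of Main (a sketch, assuming relative paths like "log.txt" should resolve next to the exe):

using System;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // Point the working directory at the folder the exe actually lives in,
        // regardless of what working directory the scheduler handed us.
        Directory.SetCurrentDirectory(AppDomain.CurrentDomain.BaseDirectory);

        using (StreamReader sr = new StreamReader("log.txt"))
        {
            Console.WriteLine(sr.ReadToEnd());
        }
    }
}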
Once you have access to the Process, you can try getting the Module. From here, you can access the full path of the process (using the FileName property) and, in turn, its directory.
string fullPath = myProcess.Modules[0].FileName;
string workingDirectory = System.IO.Directory.GetParent(fullPath).FullName;
According to this thread, a 32-bit process won't be able to enumerate the modules of a 64-bit process, so you'll need to recompile your program as 64-bit if your target processes will be running in that mode.
I need to get the temp file to see what happened because the actual file is never output. However, I can't seem to find where the temp file is created.
I need to find this out without writing code or building the application because there are too many dependencies scattered all over the place. I would not be able to deploy a debug version.
That method returns the path of a temporary file. The path will tell you where it's pointing.
For example:
Console.WriteLine(Path.GetTempFileName());
produces:
C:\Users\will\AppData\Local\Temp\tmp9BD5.tmp
for me on this machine, because the TEMP environment variable is pointing to C:\Users\will\AppData\Local\Temp\
But the whole point of a method like GetTempFileName is that you shouldn't have to care where the file ends up. On the off chance that you do, you can always get there at a command prompt or in a file-open dialog by using %TEMP%.
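If you only need the folder rather than a new file, Path.GetTempPath() gives you the same %TEMP% location, for example:

using System;
using System.IO;

class TempDemo
{
    static void Main()
    {
        // The directory %TEMP% points to, e.g. C:\Users\<you>\AppData\Local\Temp\
        Console.WriteLine(Path.GetTempPath());

        // A newly created, zero-byte temp file inside that directory.
        Console.WriteLine(Path.GetTempFileName());
    }
}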
When I call FileInfo(path).LastAccessTime or FileInfo(path).LastWriteTime on a file that is in the process of being written, it returns the time the file was created, not the last time it was written to (i.e. now).
Is there a way to get this information?
Edit: To all the responses so far: I hadn't tried Refresh(), but that does not do it either. I am returned the time the file started being written to. The same goes for the static method, and for creating a new instance of FileInfo.
Codymanix might have the answer, but I'm not running Windows Server (using Windows 7), and I don't know where the setting is to test.
Edit 2: Nobody finds it interesting that this function doesn't seem to work?
The FileInfo values are only loaded once and then cached. To get the current value, call Refresh() before getting a property:
f.Refresh();
t = f.LastAccessTime;
Another way to get the current value is by using the static methods on the File class:
t = File.GetLastAccessTime(path);
Starting in Windows Vista, last access time is not updated by default. This is to improve file system performance. You can find details here:
http://blogs.technet.com/b/filecab/archive/2006/11/07/disabling-last-access-time-in-windows-vista-to-improve-ntfs-performance.aspx
To reenable last access time on the computer, you can run the following command:
fsutil behavior set disablelastaccess 0
As James has pointed out, LastAccessTime is not updated.
The LastWriteTime has also undergone a twist since Vista. When a process still has the file open and another process checks the LastWriteTime, it will not see the new write time for a long time -- not until the writing process has closed the file.
As a workaround, you can open and close the file from your external process. After you have done that, you can read the LastWriteTime again and it will be up to date.
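A small sketch of that workaround, assuming you just want a refreshed LastWriteTime for a file another process still has open for writing:

using System;
using System.IO;

static class LastWriteProbe
{
    public static DateTime GetFreshLastWriteTime(string path)
    {
        // Opening and closing the file from this process prompts the file system
        // to update the metadata that the writing process has not flushed yet.
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
        }

        return File.GetLastWriteTime(path);
    }
}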
File System Tunneling:
If an application implements something like a rolling logger, which closes the file and then renames it to a different file name, you will also run into issues, because the OS remembers the creation time and file size of the "old" file even though you created a new one. This includes wrong reports of the file size, even if you recreated log.txt from scratch and it is still 0 bytes in size. This feature is called file system tunneling, and it is still present in Windows 8.1. For an example of how to work around this issue, check out the RollingFlatFileTraceListener from Enterprise Library.
You can see the effects of file system tunneling on your own machine from the cmd shell.
echo test > file1.txt
ren file1.txt file2.txt
Wait one minute
echo test > file1.txt
dir /tc file*.txt
...
05.07.2015 19:26 7 file1.txt
05.07.2015 19:26 7 file2.txt
The file system is a state machine. Keeping states correctly synchronized is hard if you care about performance and correctness.
This strange tunneling behaviour is obviously still relied on by applications which, for example, autosave a file, move it to a safe location and then recreate the file at the same location. For these applications it makes no sense to give the file a new creation date, because it was only copied around. Some installers also do such tricks, moving files temporarily to a different location and writing the contents back later, to get past file-exists checks in install hooks.
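If you roll files like this in your own code, one workaround (my own sketch, roughly what the rolling-logger approach boils down to) is to reset the creation time explicitly after recreating the file:

using System;
using System.IO;

static class LogRoller
{
    public static void RollAndRecreate(string logPath, string archivePath)
    {
        File.Move(logPath, archivePath);

        // Recreate the log file...
        using (File.Create(logPath)) { }

        // ...and overwrite the creation time, because tunneling would otherwise
        // let the new file inherit the old file's creation timestamp.
        File.SetCreationTime(logPath, DateTime.Now);
    }
}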
Have you tried calling Refresh() just before accessing the property (to avoid getting a cached value)? If that doesn't work, have you looked at what Explorer shows at the same time? If Explorer is showing the wrong information, then it's probably something you can't really address - it might be that the information is only updated when the file handle is closed, for example.
There is a setting in Windows, sometimes enabled especially on server systems, so that modified and accessed times for files are not updated, for better performance.
From MSDN:
When first called, FileSystemInfo calls Refresh and returns the cached information on APIs to get attributes and so on. On subsequent calls, you must call Refresh to get the latest copy of the information.
FileSystemInfo.Refresh()
If your application is the one doing the writing, I think you are going to have to "touch" the file by setting the LastWriteTime property yourself between each buffer of data you write. Some pseudocode:
while (bytesWritten < totalBytes)
{
    fileStream.Write(buffer, 0, buffer.Length);   // write the next chunk
    bytesWritten += buffer.Length;
    myFileInfo.LastWriteTime = DateTime.Now;      // "touch" the file
}
I'm not sure how severely this will affect write performance.
Tommy Carlier's answer got me thinking....
A good way to visualise the difference is to separately run the two snippets below (I just used LINQPad) while also running Sysinternals Process Monitor.
while (true)
    File.GetLastAccessTime([file path here]);
and
FileInfo bob = new FileInfo(path);
while (true)
{
    string accessed = bob.LastAccessTime.ToString();
}
If you look at Process Monitor while running the first snippet, you will see repeated and constant access attempts to the file from the LINQPad process. The second snippet will do an initial access of the file, for which you will see activity in Process Monitor, and then very little afterwards.
However, if you go and modify the file (I just opened the text file I was monitoring with FileInfo, added a character and saved), you will see a series of access attempts by the LINQPad process in Process Monitor.
This illustrates the non-cached and cached behaviour of the two approaches respectively.
Will the non-cached approach wear a hole in the hard drive?!
EDIT
I went away feeling all clever over my testing and then used the caching behaviour of FileInfo in my Windows service (basically to sit in a loop asking "has the file changed, has the file changed..." before doing processing).
While this approach worked on my dev box, it did not work in the production environment; the process just kept running regardless of whether the file had changed or not. I ended up changing my approach and just used GetLastAccessTime as part of the check. I don't know why it would behave differently on the production server, but I am not too concerned at this point.