Create Directory on Network Path - C#

I have a server application which receives packets of information (basically a file path on a network) from various client applications via WCF. When it receives an incoming packet, it adds the file path to a list and then launches further processing on a BackgroundWorker thread. In my BackgroundWorker's DoWork function, I call a function called ProcessFiles() (note: I've simplified this function to make the sample easier).
private bool ProcessFiles()
{
    while (FileList.Count > 0)
    {
        var path = Path.Combine(FileList[0].TargetPath, @"\Translated Files");
        if (!Directory.Exists(path))
        {
            Directory.CreateDirectory(path);
        }
        FileList.RemoveAt(0);
    }
    return true;
}
The function above simply works through FileList, and right now the only action I'm trying to perform is to create a new directory inside the target path destination. Now, this path is likely going to be on a network drive (I'm not sure if that matters), but the server has access to that location.

When I run my server/client applications and send it a file to process, theoretically it should create the new "Translated Files" folder in the TargetPath destination... however, nothing ever gets created. The odd thing is that the DoWork function in my BackgroundWorker completes its process correctly. I know this because I have a print statement in my RunWorkerCompleted event handler and it appears to be processing normally. Does anyone have any ideas why this directory is not being created?
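(Not part of the original post, but worth noting: the leading backslash in @"\Translated Files" is a likely culprit. Path.Combine returns the second argument unchanged when it is rooted, so the folder would be created at the root of the server's current drive rather than under TargetPath. A quick illustration, using a made-up share path:)

// Path.Combine ignores the first argument when the second one is rooted:
var combined = Path.Combine(@"\\server\share\jobs", @"\Translated Files");
// combined == @"\Translated Files"

// Without the leading backslash, the paths combine as expected:
var fixedPath = Path.Combine(@"\\server\share\jobs", "Translated Files");
// fixedPath == @"\\server\share\jobs\Translated Files"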

Related

Windows: Delete EXE after execution

I am working on an application in C# which does the following things:
Write an EXE to disk
Execute the EXE through Process.Start()
I am now trying to ensure that the EXE will be deleted once it is closed.
The easiest way to do so is to set the FileOptions.DeleteOnClose parameter when creating the EXE using File.Create.
However, this means that the EXE cannot be executed while the handle is in use. Once the file handle is closed, the EXE is immediately deleted, before it can be executed.
Is there any way to retain a "weak reference" to the EXE in my application which does not lock the file and allows it to be executed? Alternatively, is there any way to unlock the EXE for execution with the file handle still open? Are there any other obvious solutions I am missing?
CLARIFICATION A: I am aware of other methods to delete files in use which will delete the file eventually (e.g. upon reboot). I am however looking for a method to delete the file immediately once it starts executing which is handled by the OS (e.g. when running a batch that first executes the file and then deletes it, the file would remain on disk if the batch job is terminated).
CLARIFICATION B: To explain the bigger picture: The application receives and decrypts an executable file. After decryption, the file should be executed. However, I want to make sure the decrypted version of the EXE does not stay on disk. Ideally, I also want to prevent users from copying the decrypted EXE. However, since the decryption application runs as the same user, this will be impossible to achieve in a truly secure fashion as both have the same privileges on the system.
You could use Process.WaitForExit:
var process = Process.Start(processPath);
process.WaitForExit();
// File.Delete(processPath); // Not strong enough (thanks to Binary Worrier)
DeleteOrDie(processPath); // Will attempt anything to delete the file.
But this leaves open the possibility of copying the exe from where you wrote it.
A good solution is to run it from memory.
If your target exe is a CLR program, you can use the Assembly.Load function:
// read the file and put the data in a byte[] called bin
...
Assembly a = Assembly.Load(bin);
MethodInfo method = a.EntryPoint;
if (method == null) throw new NoEntryPointException(); // custom exception
// The entry point is normally static, so no instance is needed; pass an
// empty string[] if the target's Main expects command-line arguments.
object[] args = method.GetParameters().Length == 0 ? null : new object[] { new string[0] };
method.Invoke(null, args);
More details here.
If you want to load/execute any exe in memory, you could use the Nebbett’s Shuttle approach but you will need to code it in C/C++ and make a call to it from C#.
Also it looks like Microsoft doesn't like it (security issues) and I don't think you can achieve it from C# only. Good anti-virus will probably detect it.
It's not an elegant approach, but here is a way that can give you what you want:
(I use a Console Application with some input arguments for this solution)
1. Write a function that checks whether a process is running:
/// <summary>
/// Checks whether an application is running by the name of its process ("processName").
/// </summary>
static bool IsProcessOpen(string processName)
{
    foreach (Process clsProcess in Process.GetProcesses()) // System.Diagnostics
    {
        if (clsProcess.ProcessName.ToUpper().Contains(processName.ToUpper()))
            return true;
    }
    return false;
}
2. Define some variables:
static bool IamRunning = true;
static int checkDuration = 1; // in seconds
3. Use a Thread to run the checking loop:
Thread t = new Thread(delegate()
{
    DateTime lastCheck = DateTime.MinValue;
    while (IamRunning)
    {
        var now = DateTime.Now;
        if ((now - lastCheck).TotalSeconds >= checkDuration)
        {
            lastCheck = now;
            if (!IsProcessOpen("ProcessName"))
            {
                delApplication(); // your function that deletes the file
                break;
            }
        }
    }
});
t.SetApartmentState(ApartmentState.STA);
t.Start();
4. Use a loop at the end of the program:
while (t.ThreadState == ThreadState.Running)
{
    // just wait.
}
Note: on my slow computer, this console application solution uses about 50% of the CPU, since the loops spin without sleeping.

race condition while working with file system

I'm using a System.IO.FileSystemWatcher to get notified when files are renamed inside a directory. These files are log files, created by a different process.
The event handler looks like this:
private async void FileRenamedHandler(object sender, RenamedEventArgs e)
{
    // when a file is renamed:
    // try to upload it to storage
    // if the upload is successful, delete it from disk
}
All looks good until now, but I need to add a second method that iterates through the directory when the application starts, in order to upload existing log files to storage. So:
public async Task UploadAllFilesInDirectory()
{
    foreach (var file in Directory.GetFiles(_directoryPath))
    {
        await TryUploadLogAsync(file);
    }
}
The problem is that I get into race conditions, for example: a file has just been renamed and FileRenamedHandler is triggered, but the same file would also be picked up by the UploadAllFilesInDirectory method. At that moment I may upload the same file twice, or I might get an exception when trying to delete it from disk because it has already been deleted.
I can see more race condition cases with this code.
Any idea how I can solve this?
Thanks
You can use a ConcurrentDictionary to keep track of the items currently being processed, and let it worry about the thread safety.
Create the dictionary in which the key is the file path (or some other identifying object) and the value is...whatever. We're treating this as a set, not a dictionary, but there is no ConcurrentSet, so this will have to do.
Then for each file you have to process call TryAdd. If it returns true you added the object, and you can process the file. If it returns false then the file was there, and it's being processed elsewhere.
You can then remove the object when you're done processing it:
// store this somewhere
var dic = new ConcurrentDictionary<string, string>();

// to process each file
if (dic.TryAdd(path, path))
{
    // process the file at "path"
    dic.TryRemove(path, out path);
}
I would suggest building a queue and storing the files to be uploaded as jobs in that queue. When you process the items in the queue, you can check the existence of each file before trying to upload it (see the sketch below).
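A minimal sketch of that idea, using a BlockingCollection (from System.Collections.Concurrent), a single consumer task, and the TryUploadLogAsync helper from the question; the queue and method names here are illustrative:

// single queue fed by both the FileSystemWatcher handler and the start-up scan
private readonly BlockingCollection<string> _uploadQueue = new BlockingCollection<string>();

// producers (the rename handler and UploadAllFilesInDirectory) just enqueue paths
private void EnqueueForUpload(string path) => _uploadQueue.Add(path);

// a single consumer drains the queue, so each file is handled exactly once
private async Task ProcessUploadQueueAsync()
{
    foreach (var path in _uploadQueue.GetConsumingEnumerable())
    {
        if (!File.Exists(path))
            continue; // already uploaded and deleted by an earlier job

        await TryUploadLogAsync(path); // upload helper from the question
        // on success, delete the file here, as in the original handler
    }
}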

one process waiting for another process's output via the file system

I have a process A that reads in some data produced by some other process B. The data is 'exchanged' via the file system. To ensure that the file exists, process A currently checks for the file's existence like this:
while (!File.Exists(FileLocation))
{
    Thread.Sleep(100);
}
This only seems to work 99 percent of the time. The other 1 percent of the time, process A establishes that the file exists but process B has not written everything yet (i.e. some data is missing).
Is there a simpler way to make the above situation more bulletproof? Thanks.
Is there a simpler way to make the above situation more bulletproof?
You could use a Mutex for reliable inter-process synchronization. Another possibility is to use a FileSystemWatcher.
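As a sketch of the Mutex idea (the mutex name and structure here are illustrative, not from the original answer): the writer holds a named mutex while it produces the file, and the reader acquires the same mutex before reading, so it never observes a half-written file.

// Process B (writer): hold the mutex while producing the file.
using (var mutex = new Mutex(false, "MyApp.FileExchangeMutex"))
{
    mutex.WaitOne();
    try
    {
        // write the file here
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}

// Process A (reader): acquire the same named mutex before reading.
using (var mutex = new Mutex(false, "MyApp.FileExchangeMutex"))
{
    mutex.WaitOne();
    try
    {
        // read the file here
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}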
After determining that the file exists, you can try opening the file for exclusive access, which will fail if another process still has the file open:
try
{
    using (File.Open("foo", FileMode.Open, FileAccess.Read, FileShare.None))
    {
        // the writer has finished; it is now safe to read the file
    }
}
catch (IOException)
{
    // the file is still locked by the writer; go back to waiting
}
Given that you say that you can change both processes' code, you can use an EventWaitHandle to communicate between the processes.
In your program that creates the file, in the Main() method you can create an EventWaitHandle and keep it around until the end of the program. You'll need to pass the EventWaitHandle object around in your program so that it is available to the bit of code that creates the file (or provide some method that the file-creating code can call to set the event).
using (EventWaitHandle readySignaller = new EventWaitHandle(false, EventResetMode.ManualReset, "MySignalName"))
{
    // Rest of program goes here...

    // When your program creates the file, do this:
    readySignaller.Set();
}
Then have some code like this in the program that's waiting for the file:
// Returns true if the wait was successful.
// Once this has returned true, it will return false until the file is created again.
public static bool WaitForFileToBeCreated(int timeoutMilliseconds) // Pass Timeout.Infinite to wait infinitely.
{
    using (EventWaitHandle readySignaller = new EventWaitHandle(false, EventResetMode.ManualReset, "MySignalName"))
    {
        bool result = readySignaller.WaitOne(timeoutMilliseconds);
        if (result)
        {
            readySignaller.Reset();
        }
        return result;
    }
}
NOTE: if the wait succeeds, the signal is reset and will remain reset until the other process sets it again. You can handle the logic differently if you need to; this is just an example.
Essentially what we are (logically) doing here is sharing a bool between two processes. You have to be careful about the order in which you set and reset that shared bool.
Try the FileSystemWatcher.
Listens to the file system change notifications and raises events when
a directory, or file in a directory, changes.
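For illustration, a minimal watcher set-up (the path and filter are placeholders, not from the original question):

var watcher = new FileSystemWatcher(@"C:\exchange", "*.log");
watcher.Created += (sender, e) =>
{
    // The writer may still have the file open when Created fires,
    // so combine this with the exclusive-open check shown above.
    Console.WriteLine("File created: " + e.FullPath);
};
watcher.EnableRaisingEvents = true;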

Handle system folder events in Windows

I am writing some C# code and I need to detect if a specific folder on my windows file system has been opened while the application is running. Is there any way to do it? WinAPI maybe?
There are three API things I think you should check out:
FindFirstChangeNotification() http://msdn.microsoft.com/en-us/library/aa364417%28VS.85%29.aspx
That gives you a handle you can wait on, and it notifies you of changes to files in a particular directory or tree of directories. It won't tell you when a directory is browsed, but it will tell you when a file is saved, renamed, and so on and so forth.
SetWindowsHookEx() http://msdn.microsoft.com/en-us/library/ms644990%28v=VS.85%29.aspx
You can set that up to give you a callback when any number of events occur - in fact I'm pretty positive that you CAN get this callback when a directory is opened, but it will probably be inordinately difficult because you'll be intercepting messages to explorer's window. So you'll be rebooting during debugging.
Windows Shells http://msdn.microsoft.com/en-us/library/bb776778%28v=VS.85%29.aspx
If that wasn't painful enough, you can try writing a shell program.
If you're trying to write a rootkit, I suppose you don't want me to spoil the details for you. If you're NOT trying to write a rootkit, I suggest you look it up - carefully. There are open source rootkits, and they all basically have to monitor file access this way to hide from the user / OS.
Go with the Windows Shell Extensions. You can use Shell Namespace Extensions to make a "virtual" folder that isn't there (or hides a real one), like the GAC (C:\Windows\assembly)
Here are several examples of Shell Extension coding in .Net 4.0.
A Column Handler would let you know when a folder is "Opened", and even let you provide extra data for each of the files (new details columns).
Check out the FileSystemWatcher class.
The closest thing that I can think of, that may be useful to you, is using the static Directory class. It provides methods to determine the last time a file or directory was accessed. You could setup a BackgroundWorker to monitor if the directory was accessed during a specified interval. Keep track of the start and end of the interval by using DateTime, and if the last access time falls between those, then you can use the BackgroundWorker's ProgressChanged event to notify the application.
BackgroundWorker folderWorker = new BackgroundWorker();
folderWorker.WorkerReportsProgress = true;
folderWorker.WorkerSupportsCancellation = true;
folderWorker.DoWork += FolderWorker_DoWork;
folderWorker.ProgressChanged += FolderWorker_ProgressChanged;
folderWorker.RunWorkerAsync();

void FolderWorker_DoWork(object sender, DoWorkEventArgs e)
{
    BackgroundWorker worker = (BackgroundWorker)sender;
    while (!worker.CancellationPending)
    {
        DateTime lastAccess = Directory.GetLastAccessTime(DIRECTORY_PATH);
        // Check to see if lastAccess falls between the last time the loop
        // started and came to an end.
        if (/* your check */)
        {
            object state = null; // Modify this if you need to send back data.
            worker.ReportProgress(0, state);
        }
    }
}

void FolderWorker_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    // Take action here from the worker.ReportProgress being invoked.
}
You could use FileSystemInfo's LastAccessTime property. The problem, though, is that it can be cached.
FileSystemInfo: http://msdn.microsoft.com/en-us/library/975xhcs9.aspx
LastAccessTime Property: http://msdn.microsoft.com/en-us/library/system.io.filesysteminfo.lastaccesstimeutc.aspx
As noted in the documentation, this value can be pre-cached:
"The value of the LastAccessTimeUtc property is pre-cached if the current instance of the FileSystemInfo object was returned from any of the following DirectoryInfo methods:
GetDirectories
GetFiles
GetFileSystemInfos
EnumerateDirectories
EnumerateFiles
EnumerateFileSystemInfos
To get the latest value, call the Refresh method."
Therefore, call the Refresh method, but it still might not be up to date because Windows caches the value. According to the MSDN docs: "FileSystemInfo.Refresh takes a snapshot of the file from the current file system. Refresh cannot correct the underlying file system even if the file system returns incorrect or outdated information. This can happen on platforms such as Windows 98." (Link: http://msdn.microsoft.com/en-us/library/system.io.filesysteminfo.refresh.aspx)
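For example (the path is a placeholder):

var info = new DirectoryInfo(@"C:\watched\folder");
info.Refresh(); // re-read the metadata from the file system
DateTime lastAccess = info.LastAccessTimeUtc;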
I think the only way you can reliably achieve this is by monitoring the currently running processes and watch closely for new Explorer.exe instances and/or new Explorer.exe spawned threads (the "Run every window on a separate process" setting gets in the way here).
I admit I don't have a clue about how to code this, but that's what I would look for.

Adobe Reader process fails when starting second instance

In our C# WinForms application, we generate PDF files and launch Adobe Reader (or whatever the default system .pdf handler is) via the Process class. Since our PDF files can be large (approx 200K), we handle the Exited event to then clean up the temp file afterwards.
The system works as required when a file is opened and then closed again. However, when a second file is opened (before closing Adobe Reader), the second process immediately exits (since Reader is now using its MDI powers) and in our Exited handler our File.Delete call should fail because the file is locked by the now-joined Adobe process. However, in Reader we instead get:
There was an error opening this document. This file cannot be found.
The unusual thing is that if I put a debugger breakpoint before the file deletion and allow it to attempt (and fail) the deletion, then the system behaves as expected!
I'm positive that the file exists and fairly positive that all handles/file streams to the file are closed before starting the process.
We are launching with the following code:
// Open the file for viewing/printing (if the default program supports it)
var pdfProcess = new Process();
pdfProcess.StartInfo.FileName = tempFileName;
if (pdfProcess.StartInfo.Verbs.Contains("open", StringComparer.InvariantCultureIgnoreCase))
{
    var verb = pdfProcess.StartInfo.Verbs.First(v => v.Equals("open", StringComparison.InvariantCultureIgnoreCase));
    pdfProcess.StartInfo.Verb = verb;
}
pdfProcess.StartInfo.Arguments = "/N"; // Specifies a new window will be used! (But not definitely...)
pdfProcess.SynchronizingObject = this;
pdfProcess.EnableRaisingEvents = true;
pdfProcess.Exited += new EventHandler(pdfProcess_Exited);
_pdfProcessDictionary.Add(pdfProcess, tempFileName);
pdfProcess.Start();
Note: We are using the _pdfProcessDictionary to store references to the Process objects so that they stay in scope so that Exited event can successfully be raised.
Our cleanup/exited event is:
void pdfProcess_Exited(object sender, EventArgs e)
{
    Debug.Assert(!InvokeRequired);
    var p = sender as Process;
    try
    {
        if (_pdfProcessDictionary.ContainsKey(p))
        {
            var tempFileName = _pdfProcessDictionary[p];
            if (File.Exists(tempFileName)) // How else can I check if I can delete it!!??
            {
                // NOTE: Will fail if the Adobe Reader application instance has been re-used!
                File.Delete(tempFileName);
                _pdfProcessDictionary.Remove(p);
            }
            CleanOtherFiles(); // This function will clean up files for any other previously exited processes in our dictionary
        }
    }
    catch (IOException ex)
    {
        // Just swallow it up; we will deal with trying to delete it at another point
    }
}
Possible solutions:
Detect that the file is still open in another process
Detect that the second process hasn't really been fully exited and that the file is opened in the first process instead
I just dealt with this a couple of days ago.
When there is no instance open already, the document opens in a new instance directly.
When there is an instance already open, I believe that instance spawns a new instance which you don't actually get a handle to. What happens is control returns to your function immediately, which then goes and deletes the file before the new instance has had a chance to read the file -- hence it appears to not be there.
I "solved" this by not deleting the files immediately, but keeping track of the paths in a list, and then nuking all of them when the program exits (wrap each delete in a try/catch with an empty catch block in case the file has disappeared in the meantime).
I would suggest the following approach:
Create files in the user's temp directory (Path.GetTempPath). You can create a sub-folder under it.
Attempt to delete files only when the last instance of the process exits (i.e. count the number of processes you have launched; on exit, decrement the count, and when it reaches zero, attempt to delete all of the files opened so far).
Try to clean up the created sub-folder (under the temp directory) when starting and ending the application. You can even attempt periodic clean-up using a timer. (A small sketch of the first and third points follows below.)
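For illustration (the sub-folder name is an assumption):

// dedicated working folder under the user's temp directory
string workDir = Path.Combine(Path.GetTempPath(), "MyAppTempPdfs");
Directory.CreateDirectory(workDir);

// best-effort clean-up of leftovers from previous runs
foreach (var leftover in Directory.GetFiles(workDir))
{
    try { File.Delete(leftover); } catch { /* still in use: skip */ }
}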
