I have a UI application, written in C#, that lives in the user's taskbar. The EXE for the tool is checked into our source control system on a number of projects that use it, so we can update the version they run simply by checking in an updated EXE.
The problem is that when users get the latest revision of the EXE, the program is often running and the sync fails on their machine. I want to fix it so the program doesn't lock the EXE or any dependent DLLs while it runs, so users can sync without having to shut the program down.
Currently, I have a program that takes an executable as a parameter and launches it from memory by reading the assembly contents into memory ahead of time. Unfortunately, this completely fails when it comes to the DLLs that the program requires.
The code I have right now looks something like this:
using System;
using System.IO;
using System.Reflection;

public class ExecuteFromMemory
{
    public static void Main(string[] args)
    {
        // Figure out the name of the EXE to launch and the arguments to forward to it
        string fileName = args[0];
        string[] realArgs = new string[args.Length - 1];
        Array.Copy(args, 1, realArgs, 0, args.Length - 1);

        // Read the assembly from disk so the file itself isn't locked
        byte[] binary = File.ReadAllBytes(fileName);

        // Load the assembly from the in-memory image
        Assembly memoryAssembly = null;
        try
        {
            memoryAssembly = Assembly.Load(binary);
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine("Failed to load assembly: " + ex.Message);
            return;
        }

        // Execute the loaded assembly's entry point using reflection
        MethodInfo method = memoryAssembly.EntryPoint;
        if (method != null && method.IsStatic)
        {
            try
            {
                method.Invoke(null, new object[] { realArgs });
            }
            catch (Exception ex)
            {
                Console.Error.WriteLine("Failed to execute entry point: " + ex.Message);
                return;
            }
        }
        else
        {
            Console.Error.WriteLine("Assembly has no static entry point.");
        }
    }
}
My question is, am I doing something totally stupid? Is there a better way to handle this? If not, what should I do to support handling external dependencies?
For example, the above code fails to load any dependent files: if you run 'Foo.exe', which uses functions from 'Bar.dll', then 'Foo.exe' becomes overwritable, but 'Bar.dll' is still locked and can't be overwritten.
I tried getting the list of referenced assemblies from the 'GetReferencedAssemblies()' method on the loaded assembly, but that doesn't seem to give any indication of where the assemblies should be loaded from... Do I need to search for them myself? If so, what's the best way to do this?
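One direction I've been poking at (not sure it's sound) is hooking AppDomain.CurrentDomain.AssemblyResolve and loading the dependent DLLs from bytes myself. A rough sketch that would slot into the Main above, assuming the dependencies sit in the same folder as the EXE and that the handler is registered before the entry point is invoked:

// Register this before invoking the entry point; assumes dependencies live next to the EXE.
string baseDir = Path.GetDirectoryName(Path.GetFullPath(fileName));
AppDomain.CurrentDomain.AssemblyResolve += (sender, e) =>
{
    // e.Name is the full assembly name, e.g. "Bar, Version=1.0.0.0, Culture=neutral, ..."
    string dllPath = Path.Combine(baseDir, new AssemblyName(e.Name).Name + ".dll");
    if (!File.Exists(dllPath))
        return null;  // fall back to normal resolution (and its failure)

    // Loading from a byte array keeps the file on disk unlocked
    return Assembly.Load(File.ReadAllBytes(dllPath));
};

But I don't know if that's the sanctioned way to do it, or whether it falls apart for native/mixed-mode DLLs.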
It seems like other people might have come across this before, and I don't want to re-invent the wheel.
Update:
The EXE is checked in because that's how we distribute our in-house tools to the teams that use them. It's not optimal for this use case, but I don't have the opportunity to change that policy.
Disclaimer: I don't use Windows, though I am familiar with its strange way of locking things.
In order to update your application while it is running, you'll likely need two processes: the executable itself, and an update “helper” application that finishes the update. Let's say that your application is ProcessA.exe and your update helper is Updater.exe. Your main program downloads a new copy of the executable and saves it under a random name, then runs your updater program, which watches for the termination of your current process. When your process terminates, the updater displays a quick window showing the status of the update, moves the new executable into the place of the old one, and then restarts that program.
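A minimal sketch of what Updater.exe could do; the command-line contract here (old process id, path of the downloaded copy, path of the installed copy) is just a placeholder, not a prescription:

using System;
using System.Diagnostics;
using System.IO;

class Updater
{
    // Usage: Updater.exe <oldProcessId> <newExePath> <targetExePath>
    static void Main(string[] args)
    {
        int oldPid = int.Parse(args[0]);
        string newExe = args[1];
        string targetExe = args[2];

        // Wait for the running instance to exit so the target file is unlocked
        try
        {
            Process.GetProcessById(oldPid).WaitForExit();
        }
        catch (ArgumentException)
        {
            // Process already exited; nothing to wait for
        }

        // Swap in the new executable and relaunch it
        File.Copy(newExe, targetExe, true); // true = overwrite
        Process.Start(targetExe);
    }
}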
It'd be more elegant to be able to emulate POSIX filesystem semantics and be able to delete the currently-running process disk image and replace it with a new file, but I don't know if that is even possible on Windows. On a POSIX system, you can delete an in-use file and it won't actually be deleted until any remaining file handles are closed, though you can then reuse the filename.
You might want to check out an article written at CodeProject that talks about this. It also has a follow-up article.
Good luck!
Does the program need to keep running while updating?
Typically, to update a program which is running, you would copy any of the files that are to be replaced into a temporary folder. Then shut down the old instance, delete it, move the new files over to the correct locations, and re-launch it.
This allows for minimal downtime of the application, since the longest part is usually the copy, and the file move is very fast if the temporary folder is on the same logical drive.
Although Michael's answer is one way of doing this, there are tools out there that are explicitly for managing what's installed on the desktop.
What you are doing with the exe being checked into source control is not normal. If you have a Windows domain controller, you can use Group Policy to push programs down to the clients. Alternatively, you could use something like Altiris to handle it.
If you must continue the way you are going, then you have two options. One is a helper/loader app which does a version check on launch; this is similar to how Firefox works (a sketch follows below).
The second way is to build a helper service that sits in memory and polls every so often for updates. This is how Google Chrome, Adobe, etc. work.
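A rough sketch of the loader/version-check option; the deployment share path and file names are assumptions you would replace with whatever your distribution mechanism provides:

using System;
using System.Diagnostics;
using System.IO;

class Loader
{
    static void Main()
    {
        // Hypothetical locations: a share (or synced folder) that always holds the latest
        // build, and the local copy that actually gets run.
        string latest = @"\\deployserver\tools\TaskbarTool.exe";
        string local = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "TaskbarTool.exe");

        // Copy the newer build over the local copy before launching it,
        // so the running instance never locks the distributed file.
        if (!File.Exists(local) ||
            File.GetLastWriteTimeUtc(latest) > File.GetLastWriteTimeUtc(local))
        {
            File.Copy(latest, local, true); // true = overwrite
        }

        Process.Start(local);
    }
}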
Related
I want to be able to programmatically (C#) detect when a program attempts to load (or otherwise access) a non-existent DLL from a directory that I control/own on Windows.
I can manually accomplish this using Sysinternals Process Monitor (ProcMon). For example, using the filters shown below, I am able to detect an attempt by the ClientPrj.exe program to load a DLL (GetEvenOdd.dll) from a directory I control (C:\MyDirectory):
How do I accomplish something similar programmatically?
My attempts thus far have involved manually enabling Windows Auditing on the folder, running the program, and then checking the Windows Event log for any Audit entries, but no new entries appear in the event log related to this folder.
Note, I am not looking to exactly replicate procmon, I simply want to detect when a file (in this case a DLL) is attempted to be loaded from the directory I control.
Side note: it's unclear to me why ProcMon lists the attempt to load the DLL as a "CreateFile" operation, because the "ClientPrj.exe" program is simply trying to load the DLL (in C++ the "ClientPrj.exe" program is using the LoadLibrary method to load the DLL).
I think it's pretty safe to answer this question:
I want to be able to programmatically (C#) detect when a program
attempts to load (or otherwise access) a non-existent DLL from a
directory that I control/own on Windows.
There is really only one reliable way to achieve this, and it's really not for the faint of heart.
I can manually accomplish this using Sysinternals Process Monitor
(ProcMon).
ProcMon is a very complex application and makes use of all sorts of black magic in kernel mode to achieve what it does.
DLL injection
In computer programming, DLL injection is a technique used for running code within the address space of another process by forcing it to load a dynamic-link library. DLL injection is often used by external programs to influence the behavior of another program in a way its authors did not anticipate or intend. For example, the injected code could hook system function calls, or read the contents of password textboxes, which cannot be done the usual way. A program used to inject arbitrary code into arbitrary processes is called a DLL injector.
The premise is exceedingly simple, and it's a well-used technique for doing various things.
Basically, you need to inject a DLL into the program's address space. The best known and most often used method is "Import Table Patching". Each win32 module (application/DLL) has a so-called "import table", which is basically a list of all the APIs this module calls. Patching this import table is quite an easy job and works very nicely.
Another, more robust method is to directly manipulate the API's binary code in memory. The most often used approach is to overwrite the first 5 bytes of the API code with a JMP instruction, which then jumps to your callback function.
In your case you want to find LoadLibrary in your target application, JMP to a proxy where you can monitor the libraries being loaded and the results of the call, then pass the results back to the original caller.
This is pretty intense stuff, however it is more common than you think. There are libraries (some use drivers that work in kernel mode and support both 64-bit and 32-bit applications) that take care of all the hard work: you basically give them a scope, a DLL, and a signature of the APIs you want to hook, write a proxy, and they take care of the rest. At that point you can IPC the results anywhere you like.
Your first problem is setting a hook before it loads your target lib. However, once again, this has all been done for you; take a look at http://help.madshi.net/madCodeHook.htm
The only downside here is that it has to be done with a traditional DLL and not in .NET.
Anyway, good luck.
I can only assume that you have not even tried. I just did:

private FileSystemWatcher fsWatcher;

public void SetupWatcher()
{
    fsWatcher = new FileSystemWatcher();
    fsWatcher.Path = @"C:\SomePath\SomeSubFolder\";
    fsWatcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
                           | NotifyFilters.FileName | NotifyFilters.DirectoryName;
    fsWatcher.Filter = "*.*";
    fsWatcher.Changed += FsWatcher_Changed;
    fsWatcher.EnableRaisingEvents = true;

    // Writing to a file that does not yet exist still raises events on the folder
    System.IO.File.WriteAllText(fsWatcher.Path + "MyFile.txt", "testing");
}

private void FsWatcher_Changed(object sender, FileSystemEventArgs e)
{
    var msg = "Action " + e.ChangeType + "\r\n"
            + "FullPath " + e.FullPath + "\r\n"
            + "Just File Name " + e.Name;
    MessageBox.Show(msg);
}
So, even though my "MyFile.txt" does not exist in the folder, as soon as I try to write to it, create, etc, the FsWatcher_Changed event handler is called. I can then check the change type, path of the file, just file name, etc. From that you can detect and confirm if the file name is what you expect (or not) and act on it as needed.
Feedback from comment...
As for doing this from another program: if I am running my program with the above FileSystemWatcher sample and am listening to the entire folder in question, and I then go to another application (e.g. Word, Excel, whatever) and try to create a file in this folder, it will still get caught.
If another program is trying to open a file that does not exist, that program will fail anyhow. Why would you care if another program fails while validating a file that does not exist? That is the other program's problem.
So I'm trying to write a program that will list all the installed programs on Windows and let the user decide which ones to block (kill).
I found a solution for listing installed programs here: Get installed applications in a system
And to kill a program you just need the .exe name, and then kill all the processes with that name.
But my problem is that the name in the list of installed programs is not necessarily the .exe name, and you can't find a process without the .exe name. For example, the installed program name is "Google Chrome" but the .exe/process name is chrome.
The current solution I can think of is to find the installation directory of each installed program, find all the .exe files under it, and kill all of them if required. But I don't know how to get the directories of all the installed programs.
So I can't help wondering, is there a more elegant way of doing it? Any suggestions would be appreciated.
You have to split this question into 2 parts:
1. Find all the programs
What does it mean? It means that you want to find all the executable files (*.exe) on the client's hard drive. These executable files contain the code that will run in memory and do some work.
BUT: the code located on the hard disk cannot be "killed"; the exe file first has to start a new instance of the application (via the OS). Once you have that instance, we can start talking about the other part of the question.
Also, as stated there, you can't easily get a single exe file, as one application might have multiple exe files. And if you really want to have them all, you have to run an analysis of the hard disk, like this:
var allExePaths =
from drive in Environment.GetLogicalDrives()
from exePath in Directory.GetFiles(drive, "*.exe", SearchOption.AllDirectories)
select exePath;
Also, as Ivar stated in the comments, there are other executable formats, like *.jar, *.pyc, ..., which are run via another executable.
2. Kill the required program
If you want to kill some program/application, you need to get its instance. Instances are distinguished by their PID (process ID). Once you have the PID of the instance, you can kill it. The answer for this is here: How do I kill a process using Vb.NET or C#?
The code below takes all processes with a particular name and kills them.
Code:
foreach ( Process p in System.Diagnostics.Process.GetProcessesByName("winword") )
{
try
{
p.Kill();
p.WaitForExit(); // possibly with a timeout
}
catch ( Win32Exception winException )
{
// process was terminating or can't be terminated - deal with it
}
catch ( InvalidOperationException invalidException )
{
// process has already exited - might be able to let this one go
}
}
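Putting the two parts together, a sketch of one possible approach: look up a program's InstallLocation in the uninstall registry keys, then kill any running process whose executable lives under that directory. Note that not every installer writes InstallLocation, so this will not cover everything:

using System;
using System.Diagnostics;
using Microsoft.Win32;

class ProgramBlocker
{
    // Look up the install directory of a program by its display name, e.g. "Google Chrome".
    // Not every installer writes InstallLocation, so this can return null.
    static string GetInstallLocation(string displayName)
    {
        string[] uninstallKeys =
        {
            @"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
            @"SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall"
        };

        foreach (string keyPath in uninstallKeys)
        {
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(keyPath))
            {
                if (key == null) continue;

                foreach (string subKeyName in key.GetSubKeyNames())
                {
                    using (RegistryKey sub = key.OpenSubKey(subKeyName))
                    {
                        if (sub == null) continue;

                        string name = sub.GetValue("DisplayName") as string;
                        if (string.Equals(name, displayName, StringComparison.OrdinalIgnoreCase))
                            return sub.GetValue("InstallLocation") as string;
                    }
                }
            }
        }
        return null;
    }

    // Kill every running process whose main executable lives under the given directory.
    static void KillProcessesUnder(string installDir)
    {
        foreach (Process p in Process.GetProcesses())
        {
            try
            {
                if (p.MainModule.FileName.StartsWith(installDir, StringComparison.OrdinalIgnoreCase))
                    p.Kill();
            }
            catch (Exception)
            {
                // Access denied, 32/64-bit mismatch, or process already exited - skip it
            }
        }
    }
}

Usage would then be something like KillProcessesUnder(GetInstallLocation("Google Chrome")), with a null check on the install location first.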
From a C# MVC controller action, is it possible to execute a gulp task and if so how would I go about doing this?
In my C# app, I'm trying to check if a given string (submitted in a form) is valid sass.
There are a couple of C# CSS parsers but I can't find one that can handle sass (*.scss).
In my projects I use a gulp task that compiles sass and reports any errors, so I was wondering if there is a way I could utilize this to do the validation in my C# app: add the text input from my C# app to a .scss file, get the gulp task to try to compile it, and if it passes without errors then I know the sass is valid.
Maybe I'm barking up the wrong tree here but any help would be much appreciated.
Ok. There are a few things here (in your question) that are scaring the absolute crap out of me, but I think the easiest summary of it is this:
Given an ASP.NET MVC Controller, how can I call -another- .exe
process?
So, assuming you still want to do this, and ignoring security vulnerabilities and stuff (I'll just roll with you here...), I think you need to create a new AppDomain and then run your exe in that AppDomain. If the exe fails, you can also capture the error message.
Updated as per comments
Random code taken from the first-ish Google result:
try
{
    // Create a new AppDomain to execute the exe in an isolated way
    AppDomain sandBox = AppDomain.CreateDomain("sandBox");
    try
    {
        // Execute the exe (arguments can be passed via an overload)
        sandBox.ExecuteAssembly(".exe file name with path");
    }
    finally
    {
        // Destroy the created AppDomain so its memory is released
        AppDomain.Unload(sandBox);
    }
}
catch (Exception ex) // Any exception raised by the executable can be handled here
{
    //Logger.log(ex.Message);
}
So here, you would run gulp.exe and pass in your command-line args, including the sass content (the content saved to a file?).
Now, if the .exe isn't a .NET assembly, then you might need to use the Process method...
something like this.. (pseudo code)
Process app = new Process();
app.StartInfo.FileName = @"D:/Path to /My/Program to be run.exe";
app.Start();
Again - not testing, but this should be enough to get you started...
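For the gulp case specifically, a rough sketch of how that could look with the Process approach. The "sass" task name and file names are placeholders, and it assumes gulp is installed and the task exits with a non-zero code (or writes to stderr) when compilation fails:

using System.Diagnostics;
using System.IO;

// Write the submitted text to a .scss file, run the gulp task, and treat a failed
// exit code or anything on stderr as "the sass is not valid".
public static bool IsValidSass(string sassText, string workingDir)
{
    File.WriteAllText(Path.Combine(workingDir, "input.scss"), sassText);

    var psi = new ProcessStartInfo
    {
        FileName = "cmd.exe",          // gulp is a .cmd shim on Windows
        Arguments = "/c gulp sass",    // "sass" is a hypothetical task name
        WorkingDirectory = workingDir,
        UseShellExecute = false,
        RedirectStandardError = true,
        CreateNoWindow = true
    };

    using (var process = Process.Start(psi))
    {
        string stdErr = process.StandardError.ReadToEnd();
        process.WaitForExit();
        return process.ExitCode == 0 && string.IsNullOrEmpty(stdErr);
    }
}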
Do note -> running a Process requires serious security permissions on the server, so generally hosted servers (like Azure Websites, etc.) lock all this down to protect themselves. If you own the server, then go nuts.
Final Note:
I've never seen or used Node.
I've never seen or used gulp.
I have a C# application that loads fine on my development machine but fails to even start on Win2008 machine.
Checked the frameworks and they match up: .NET 4.0.
I immediately suspected the problem was arising from references to specific files that I was reading from and sure enough, using some test code I narrowed it down to a single line.
public static string[] salesArray = (File.ReadAllLines("sales.txt").ToArray());
If I comment out the above line, the test app starts, if I leave it in, it fails. Any ideas?
I am copying the Debug directory to the second machine, with sales.txt inside it.
This is the entire code. The app does nothing but open a blank window.
using System.IO;
using System.Linq;
using System.Windows.Forms;

namespace testServer
{
    public partial class Form1 : Form
    {
        public static string[] salesArray = (File.ReadAllLines("sales.txt").ToArray());

        public Form1()
        {
            InitializeComponent();
        }
    }
}
Two potential issues jump out:
1) The current working directory of the app isn't what you think it is:
If you can print/show this, you would know for sure.
http://msdn.microsoft.com/en-us/library/system.io.directory.getcurrentdirectory.aspx
Because you are specifying a relative path, the file is being resolved against the current working directory.
2) Perhaps permissions. You might right-click and 'Run as administrator' as a quick check of that theory.
You should use a better mechanism to find that file, and check for its existence (i.e. File.Exists) prior to opening it.
This will also let you report to the user if there is a problem, such as the file not existing where you expect it to be.
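For example, a sketch of resolving the file next to the EXE rather than against the working directory (moved out of the static field initializer so you can actually report a problem; assumes the usual System.IO and System.Windows.Forms usings):

// Resolve sales.txt next to the EXE instead of against the current working directory
string salesPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "sales.txt");

string[] salesArray;
if (File.Exists(salesPath))
{
    salesArray = File.ReadAllLines(salesPath);
}
else
{
    MessageBox.Show("Could not find " + salesPath);
    salesArray = new string[0];
}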
Put an exception handler around the code. You can get the error message and handle the failure gracefully.
It should handle all errors (file not found, file in use, permission errors, etc.).
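Something along these lines (just a sketch of the idea, not the exact code you should ship):

string[] salesArray;
try
{
    salesArray = File.ReadAllLines("sales.txt");
}
catch (IOException ex) // file not found, file in use, etc.
{
    MessageBox.Show("Could not read sales.txt: " + ex.Message);
    salesArray = new string[0];
}
catch (UnauthorizedAccessException ex) // permission errors
{
    MessageBox.Show("No permission to read sales.txt: " + ex.Message);
    salesArray = new string[0];
}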
When I call FileInfo(path).LastAccessTime or FileInfo(path).LastWriteTime on a file that is in the process of being written, it returns the time that the file was created, not the last time it was written to (i.e. now).
Is there a way to get this information?
Edit: To all the responses so far. I hadn't tried Refresh(), but that does not do it either. I am still returned the time at which the file started being written to. The same goes for the static method, and for creating a new instance of FileInfo.
Codymanix might have the answer, but I'm not running Windows Server (using Windows 7), and I don't know where the setting is to test.
Edit 2: Nobody finds it interesting that this function doesn't seem to work?
The FileInfo values are only loaded once and then cached. To get the current value, call Refresh() before getting a property:
f.Refresh();
t = f.LastAccessTime;
Another way to get the current value is by using the static methods on the File class:
t = File.GetLastAccessTime(path);
Starting in Windows Vista, last access time is not updated by default. This is to improve file system performance. You can find details here:
http://blogs.technet.com/b/filecab/archive/2006/11/07/disabling-last-access-time-in-windows-vista-to-improve-ntfs-performance.aspx
To reenable last access time on the computer, you can run the following command:
fsutil behavior set disablelastaccess 0
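If you want to check that setting programmatically, reading the corresponding registry value should work; a sketch, assuming the standard location of the setting:

// using Microsoft.Win32;
object value = Registry.GetValue(
    @"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem",
    "NtfsDisableLastAccessUpdate",
    null);

// A value of 1 means last-access updates are disabled (the default since Vista).
// Newer Windows versions reportedly use additional values, so only the lowest bit is checked.
bool lastAccessDisabled = value != null && (Convert.ToInt32(value) & 1) == 1;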
As James has pointed out LastAccessTime is not updated.
The LastWriteTime has also undergone a twist since Vista. When a process still has the file open and another process checks the LastWriteTime, it will not see the new write time for a long time -- until the writing process has closed the file.
As a workaround, you can open and close the file from your external process. After you have done that, you can read the LastWriteTime again, and it will then be up to date.
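A small sketch of that workaround, run from the process that is doing the checking (not the one writing):

// Briefly open and close the file with permissive sharing; per the workaround above,
// this nudges the file system into publishing the writer's pending timestamp.
using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    // nothing to do here; the open/close itself is the point
}

DateTime lastWrite = File.GetLastWriteTime(path); // should now be up to date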
File System Tunneling:
If an application implements something like a rolling logger, which closes the file and then renames it to a different file name, you will also run into issues, since the creation time and file size of the "old" file are remembered by the OS even though you created a new file. This includes wrong reports of the file size, even if you recreate log.txt from scratch and it is still 0 bytes in size. This feature is called OS File System Tunneling, and it is still present on Windows 8.1. For an example of how to work around this issue, check out the RollingFlatFileTraceListener from Enterprise Library.
You can see the effects of file system tunneling on your own machine from the cmd shell.
echo test > file1.txt
ren file1.txt file2.txt
Wait one minute
echo test > file1.txt
dir /tc file*.txt
...
05.07.2015 19:26 7 file1.txt
05.07.2015 19:26 7 file2.txt
The file system is a state machine. Keeping states correctly synchronized is hard if you care about performance and correctness.
This strange tunneling syndrome is obviously still relied upon by applications which, for example, autosave a file, move it to a safe location, and then recreate the file at the same location. For these applications it makes no sense to give the file a new creation date, because it was only copied around. Some installers also do such tricks, moving files temporarily to a different location and writing the contents back later, to get past file-exists checks in some install hooks.
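If tunneling bites you when recreating a file (e.g. a rolled log keeps the old creation date), one simple workaround is to stamp the creation time yourself after recreating it; a sketch:

// After recreating a file (e.g. a rolled log), set the creation time explicitly
// so the tunneled creation date of the old file does not stick to the new one.
File.WriteAllText("log.txt", string.Empty);
File.SetCreationTime("log.txt", DateTime.Now);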
Have you tried calling Refresh() just before accessing the property (to avoid getting a cached value)? If that doesn't work, have you looked at what Explorer shows at the same time? If Explorer is showing the wrong information, then it's probably something you can't really address - it might be that the information is only updated when the file handle is closed, for example.
There is a setting in Windows, which is sometimes set especially on server systems, so that modified and accessed times for files are not updated, for better performance.
From MSDN:
When first called, FileSystemInfo calls Refresh and returns the cached information on APIs to get attributes and so on. On subsequent calls, you must call Refresh to get the latest copy of the information.
FileSystemInfo.Refresh()
If your application is the one doing the writing, I think you are going to have to "touch" the file by setting the LastWriteTime property yourself between each buffer of data you write. Some pseudocode:
while (bytesWritten < totalBytes)
{
    br.Write(buffer);
    bytesWritten += buffer.Length;
    myFileInfo.LastWriteTime = DateTime.Now; // "touch" the file so readers see a current timestamp
}
I'm not sure how severely this will affect write performance.
Tommy Carlier's answer got me thinking....
A good way to visualise the difference is to separately run the two snippets below (I just used LINQPad) while also running Sysinternals Process Monitor.
while(true)
File.GetLastAccessTime([file path here]);
and
FileInfo bob = new FileInfo(path);
while(true){
string accessed = bob.LastAccessTime.ToString();
}
If you look at Process Monitor while running the first snippet, you will see repeated and constant access attempts to the file from the LINQPad process. The second snippet will do an initial access of the file, for which you will see activity in Process Monitor, and then very little afterwards.
However, if you go and modify the file (I just opened the text file I was monitoring, added a character, and saved), you will see a series of access attempts by the LINQPad process to the file in Process Monitor.
This illustrates the non-cached and cached behaviour of the two different approaches respectively.
Will the non-cached approach wear a hole in the hard drive?!
EDIT
I went away feeling all clever about my testing and then used the caching behaviour of FileInfo in my Windows service (basically to sit in a loop and ask 'has the file changed? has the file changed?...' before doing processing).
While this approach worked on my dev box, it did not work in the production environment, i.e. the process just kept running regardless of whether the file had changed or not. I ended up changing my approach and just used GetLastAccessTime as part of the check. I don't know why it behaved differently on the production server... but I am not too concerned at this point.