Is it possible to start background processes with Process.Start? [duplicate] - c#
I'm creating new processes using the System.Diagnostics.Process class from my application. I want these processes to be killed if/when my application crashes. But if I kill my application from Task Manager, the child processes are not killed. Is there any way to make child processes dependent on the parent process?
From this forum, credit to 'Josh'.
Application.Quit() and Process.Kill() are possible solutions, but have proven to be unreliable. When your main application dies, you are still left with child processes running. What we really want is for the child processes to die as soon as the main process dies.
The solution is to use "job objects" http://msdn.microsoft.com/en-us/library/ms682409(VS.85).aspx.
The idea is to create a "job object" for your main application, and register your child processes with the job object. If the main process dies, the OS will take care of terminating the child processes.
public enum JobObjectInfoType
{
AssociateCompletionPortInformation = 7,
BasicLimitInformation = 2,
BasicUIRestrictions = 4,
EndOfJobTimeInformation = 6,
ExtendedLimitInformation = 9,
SecurityLimitInformation = 5,
GroupInformation = 11
}
[StructLayout(LayoutKind.Sequential)]
public struct SECURITY_ATTRIBUTES
{
public int nLength;
public IntPtr lpSecurityDescriptor;
public int bInheritHandle;
}
[StructLayout(LayoutKind.Sequential)]
struct JOBOBJECT_BASIC_LIMIT_INFORMATION
{
public Int64 PerProcessUserTimeLimit;
public Int64 PerJobUserTimeLimit;
public Int16 LimitFlags;
public UInt32 MinimumWorkingSetSize;
public UInt32 MaximumWorkingSetSize;
public Int16 ActiveProcessLimit;
public Int64 Affinity;
public Int16 PriorityClass;
public Int16 SchedulingClass;
}
[StructLayout(LayoutKind.Sequential)]
struct IO_COUNTERS
{
public UInt64 ReadOperationCount;
public UInt64 WriteOperationCount;
public UInt64 OtherOperationCount;
public UInt64 ReadTransferCount;
public UInt64 WriteTransferCount;
public UInt64 OtherTransferCount;
}
[StructLayout(LayoutKind.Sequential)]
struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION
{
public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation;
public IO_COUNTERS IoInfo;
public UInt32 ProcessMemoryLimit;
public UInt32 JobMemoryLimit;
public UInt32 PeakProcessMemoryUsed;
public UInt32 PeakJobMemoryUsed;
}
public class Job : IDisposable
{
[DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
static extern IntPtr CreateJobObject(object a, string lpName);
[DllImport("kernel32.dll")]
static extern bool SetInformationJobObject(IntPtr hJob, JobObjectInfoType infoType, IntPtr lpJobObjectInfo, uint cbJobObjectInfoLength);
[DllImport("kernel32.dll", SetLastError = true)]
static extern bool AssignProcessToJobObject(IntPtr job, IntPtr process);
private IntPtr m_handle;
private bool m_disposed = false;
public Job()
{
m_handle = CreateJobObject(null, null);
JOBOBJECT_BASIC_LIMIT_INFORMATION info = new JOBOBJECT_BASIC_LIMIT_INFORMATION();
info.LimitFlags = 0x2000;
JOBOBJECT_EXTENDED_LIMIT_INFORMATION extendedInfo = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION();
extendedInfo.BasicLimitInformation = info;
int length = Marshal.SizeOf(typeof(JOBOBJECT_EXTENDED_LIMIT_INFORMATION));
IntPtr extendedInfoPtr = Marshal.AllocHGlobal(length);
Marshal.StructureToPtr(extendedInfo, extendedInfoPtr, false);
if (!SetInformationJobObject(m_handle, JobObjectInfoType.ExtendedLimitInformation, extendedInfoPtr, (uint)length))
throw new Exception(string.Format("Unable to set information. Error: {0}", Marshal.GetLastWin32Error()));
}
#region IDisposable Members
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
#endregion
private void Dispose(bool disposing)
{
if (m_disposed)
return;
if (disposing) {}
Close();
m_disposed = true;
}
public void Close()
{
Win32.CloseHandle(m_handle);
m_handle = IntPtr.Zero;
}
public bool AddProcess(IntPtr handle)
{
return AssignProcessToJobObject(m_handle, handle);
}
}
Looking at the constructor ...
JOBOBJECT_BASIC_LIMIT_INFORMATION info = new JOBOBJECT_BASIC_LIMIT_INFORMATION();
info.LimitFlags = 0x2000;
The key here is to set up the job object properly. In the constructor I'm setting the "limits" to 0x2000, which is the numeric value of JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE.
MSDN defines this flag as:
Causes all processes associated with the job to terminate when the last handle to the job is closed.
Once this class is set up, you just have to register each child process with the job. For example:
[DllImport("user32.dll", SetLastError = true)]
public static extern uint GetWindowThreadProcessId(IntPtr hWnd, out uint lpdwProcessId);
Excel.Application app = new Excel.ApplicationClass();
uint pid = 0;
Win32.GetWindowThreadProcessId(new IntPtr(app.Hwnd), out pid);
job.AddProcess(Process.GetProcessById((int)pid).Handle);
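For a child started directly with Process.Start, registration is simpler. A minimal sketch, assuming the Job class above is in scope and "child.exe" stands in for your own child executable:

using System.Diagnostics;

// Keep the Job alive for the lifetime of the parent; when the parent dies,
// the last job handle is closed and every registered child is terminated.
Job job = new Job();

Process child = Process.Start("child.exe"); // hypothetical child executable
job.AddProcess(child.Handle);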
This answer started with #Matt Howells' excellent answer plus others (see links in the code below). Improvements:
Supports 32-bit and 64-bit.
Fixes some problems in #Matt Howells' answer:
The small memory leak of extendedInfoPtr
The 'Win32' compile error, and
A stack-unbalanced exception I got in the call to CreateJobObject (using Windows 10, Visual Studio 2015, 32-bit).
Names the Job, so if you use SysInternals, for example, you can easily find it.
Has a somewhat simpler API and less code.
Here's how to use this code:
// Get a Process object somehow.
Process process = Process.Start(exePath, args);
// Add the Process to ChildProcessTracker.
ChildProcessTracker.AddProcess(process);
Supporting Windows 7 additionally requires:
A simple app.manifest change, as #adam smith describes.
Registry settings to be added if you are using Visual Studio.
In my case, I didn't need to support Windows 7, so I have a simple check at the top of the static constructor below.
/// <summary>
/// Allows processes to be automatically killed if this parent process unexpectedly quits.
/// This feature requires Windows 8 or greater. On Windows 7, nothing is done.</summary>
/// <remarks>References:
/// https://stackoverflow.com/a/4657392/386091
/// https://stackoverflow.com/a/9164742/386091 </remarks>
public static class ChildProcessTracker
{
/// <summary>
/// Add the process to be tracked. If our current process is killed, the child processes
/// that we are tracking will be automatically killed, too. If the child process terminates
/// first, that's fine, too.</summary>
/// <param name="process"></param>
public static void AddProcess(Process process)
{
if (s_jobHandle != IntPtr.Zero)
{
bool success = AssignProcessToJobObject(s_jobHandle, process.Handle);
if (!success && !process.HasExited)
throw new Win32Exception();
}
}
static ChildProcessTracker()
{
// This feature requires Windows 8 or later. To support Windows 7 requires
// registry settings to be added if you are using Visual Studio plus an
// app.manifest change.
// https://stackoverflow.com/a/4232259/386091
// https://stackoverflow.com/a/9507862/386091
if (Environment.OSVersion.Version < new Version(6, 2))
return;
// The job name is optional (and can be null) but it helps with diagnostics.
// If it's not null, it has to be unique. Use SysInternals' Handle command-line
// utility: handle -a ChildProcessTracker
string jobName = "ChildProcessTracker" + Process.GetCurrentProcess().Id;
s_jobHandle = CreateJobObject(IntPtr.Zero, jobName);
var info = new JOBOBJECT_BASIC_LIMIT_INFORMATION();
// This is the key flag. When our process is killed, Windows will automatically
// close the job handle, and when that happens, we want the child processes to
// be killed, too.
info.LimitFlags = JOBOBJECTLIMIT.JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
var extendedInfo = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION();
extendedInfo.BasicLimitInformation = info;
int length = Marshal.SizeOf(typeof(JOBOBJECT_EXTENDED_LIMIT_INFORMATION));
IntPtr extendedInfoPtr = Marshal.AllocHGlobal(length);
try
{
Marshal.StructureToPtr(extendedInfo, extendedInfoPtr, false);
if (!SetInformationJobObject(s_jobHandle, JobObjectInfoType.ExtendedLimitInformation,
extendedInfoPtr, (uint)length))
{
throw new Win32Exception();
}
}
finally
{
Marshal.FreeHGlobal(extendedInfoPtr);
}
}
[DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
static extern IntPtr CreateJobObject(IntPtr lpJobAttributes, string name);
[DllImport("kernel32.dll")]
static extern bool SetInformationJobObject(IntPtr job, JobObjectInfoType infoType,
IntPtr lpJobObjectInfo, uint cbJobObjectInfoLength);
[DllImport("kernel32.dll", SetLastError = true)]
static extern bool AssignProcessToJobObject(IntPtr job, IntPtr process);
// Windows will automatically close any open job handles when our process terminates.
// This can be verified by using SysInternals' Handle utility. When the job handle
// is closed, the child processes will be killed.
private static readonly IntPtr s_jobHandle;
}
public enum JobObjectInfoType
{
AssociateCompletionPortInformation = 7,
BasicLimitInformation = 2,
BasicUIRestrictions = 4,
EndOfJobTimeInformation = 6,
ExtendedLimitInformation = 9,
SecurityLimitInformation = 5,
GroupInformation = 11
}
[StructLayout(LayoutKind.Sequential)]
public struct JOBOBJECT_BASIC_LIMIT_INFORMATION
{
public Int64 PerProcessUserTimeLimit;
public Int64 PerJobUserTimeLimit;
public JOBOBJECTLIMIT LimitFlags;
public UIntPtr MinimumWorkingSetSize;
public UIntPtr MaximumWorkingSetSize;
public UInt32 ActiveProcessLimit;
public Int64 Affinity;
public UInt32 PriorityClass;
public UInt32 SchedulingClass;
}
[Flags]
public enum JOBOBJECTLIMIT : uint
{
JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE = 0x2000
}
[StructLayout(LayoutKind.Sequential)]
public struct IO_COUNTERS
{
public UInt64 ReadOperationCount;
public UInt64 WriteOperationCount;
public UInt64 OtherOperationCount;
public UInt64 ReadTransferCount;
public UInt64 WriteTransferCount;
public UInt64 OtherTransferCount;
}
[StructLayout(LayoutKind.Sequential)]
public struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION
{
public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation;
public IO_COUNTERS IoInfo;
public UIntPtr ProcessMemoryLimit;
public UIntPtr JobMemoryLimit;
public UIntPtr PeakProcessMemoryUsed;
public UIntPtr PeakJobMemoryUsed;
}
I carefully tested both the 32-bit and 64-bit versions of the structs by programmatically comparing the managed and native versions to each other (the overall size as well as the offsets for each member).
I've tested this code on Windows 7, 8, and 10.
This post is intended as an extension to #Matt Howells' answer, specifically for those who run into problems with using Job Objects under Vista or Win7, especially if you get an access denied error ('5') when calling AssignProcessToJobObject.
tl;dr
To ensure compatibility with Vista and Win7, add the following manifest to the .NET parent process:
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
<v3:trustInfo xmlns:v3="urn:schemas-microsoft-com:asm.v3">
<v3:security>
<v3:requestedPrivileges>
<v3:requestedExecutionLevel level="asInvoker" uiAccess="false" />
</v3:requestedPrivileges>
</v3:security>
</v3:trustInfo>
<compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
<!-- We specify these, in addition to the UAC above, so we avoid Program Compatibility Assistant in Vista and Win7 -->
<!-- We try to avoid PCA so we can use Windows Job Objects -->
<!-- See https://stackoverflow.com/questions/3342941/kill-child-process-when-parent-process-is-killed -->
<application>
<!--The ID below indicates application support for Windows Vista -->
<supportedOS Id="{e2011457-1546-43c5-a5fe-008deee3d3f0}"/>
<!--The ID below indicates application support for Windows 7 -->
<supportedOS Id="{35138b9a-5d96-4fbd-8e2d-a2440225f93a}"/>
</application>
</compatibility>
</assembly>
Note that when you add a new manifest in Visual Studio 2012, it will already contain the above snippet, so you do not need to copy it from here. It will also include a node for Windows 8.
full explanation
Your job association will fail with an access denied error if the process you're starting is already associated with another job. Enter Program Compatibility Assistant, which, starting in Windows Vista, will assign all kinds of processes to its own jobs.
In Vista you can mark your application to be excluded from PCA by simply including an application manifest. Visual Studio seems to do this for .NET apps automatically, so you're fine there.
A simple manifest no longer cuts it in Win7. [1] There, you have to explicitly declare that you're compatible with Win7 using the supportedOS tag in your manifest. [2]
This led me to worry about Windows 8. Will I have to change my manifest once again? Apparently there's a break in the clouds, as Windows 8 now allows a process to belong to multiple jobs. [3] So I haven't tested it yet, but I imagine that this madness will be over now if you simply include a manifest with the supportedOS information.
Tip 1: If you're developing a .NET app with Visual Studio, as I was, here [4] are some nice instructions on how to customize your application manifest.
Tip 2: Be careful with launching your application from Visual Studio. I found that, after adding the appropriate manifest, I still had problems with PCA when launching from Visual Studio, even if I used Start without Debugging. Launching my application from Explorer worked, however. After manually adding devenv for exclusion from PCA using the registry, starting applications that used Job Objects from VS started working as well. [5]
Tip 3: If you ever want to know if PCA is your problem, try launching your application from the command line, or copy the program to a network drive and run it from there. PCA is automatically disabled in those contexts.
[1] http://blogs.msdn.com/b/cjacks/archive/2009/06/18/pca-changes-for-windows-7-how-to-tell-us-you-are-not-an-installer-take-2-because-we-changed-the-rules-on-you.aspx
[2] http://ayende.com/blog/4360/how-to-opt-out-of-program-compatibility-assistant
[3] http://msdn.microsoft.com/en-us/library/windows/desktop/ms681949(v=vs.85).aspx:
"A process can be associated with more than one job in Windows 8"
[4] How can I embed an application manifest into an application using VS2008?
[5] How to stop the Visual Studio debugger starting my process in a job object?
Here's an alternative that may work for some, when you have control of the code the child process runs. The benefit of this approach is that it doesn't require any native Windows calls.
The basic idea is to redirect the child's standard input to a stream whose other end is connected to the parent, and use that stream to detect when the parent has gone away. When you use System.Diagnostics.Process to start the child, it's easy to ensure its standard input is redirected:
Process childProcess = new Process();
childProcess.StartInfo = new ProcessStartInfo("pathToConsoleModeApp.exe");
childProcess.StartInfo.RedirectStandardInput = true;
childProcess.StartInfo.CreateNoWindow = true; // no sense showing an empty black console window which the user can't input into
And then, in the child process, take advantage of the fact that reads from the standard input stream will always return at least 1 byte until the stream is closed, at which point they start returning 0 bytes. An outline of the way I ended up doing this is below; my way also uses a message pump to keep the main thread available for things other than watching standard input, but this general approach could be used without a message pump too (see the minimal sketch after the caveats below).
using System;
using System.IO;
using System.Threading;
using System.Windows.Forms;
static int Main()
{
Application.Run(new MyApplicationContext());
return 0;
}
public class MyApplicationContext : ApplicationContext
{
private SynchronizationContext _mainThreadMessageQueue = null;
private Stream _stdInput;
public MyApplicationContext()
{
_stdInput = Console.OpenStandardInput();
// feel free to use a better way to post to the message loop from here if you know one ;)
System.Windows.Forms.Timer handoffToMessageLoopTimer = new System.Windows.Forms.Timer();
handoffToMessageLoopTimer.Interval = 1;
handoffToMessageLoopTimer.Tick += new EventHandler((obj, eArgs) => { PostMessageLoopInitialization(handoffToMessageLoopTimer); });
handoffToMessageLoopTimer.Start();
}
private void PostMessageLoopInitialization(System.Windows.Forms.Timer t)
{
if (_mainThreadMessageQueue == null)
{
t.Stop();
_mainThreadMessageQueue = SynchronizationContext.Current;
}
// constantly monitor standard input on a background thread that will
// signal the main thread when stuff happens.
BeginMonitoringStdIn(null);
// start up your application's real work here
}
private void BeginMonitoringStdIn(object state)
{
if (SynchronizationContext.Current == _mainThreadMessageQueue)
{
// we're already running on the main thread - proceed.
var buffer = new byte[128];
_stdInput.BeginRead(buffer, 0, buffer.Length, (asyncResult) =>
{
int amtRead = _stdInput.EndRead(asyncResult);
if (amtRead == 0)
{
_mainThreadMessageQueue.Post(new SendOrPostCallback(ApplicationTeardown), null);
}
else
{
BeginMonitoringStdIn(null);
}
}, null);
}
else
{
// not invoked from the main thread - dispatch another call to this method on the main thread and return
_mainThreadMessageQueue.Post(new SendOrPostCallback(BeginMonitoringStdIn), null);
}
}
private void ApplicationTeardown(object state)
{
// tear down your application gracefully here
_stdInput.Close();
this.ExitThread();
}
}
Caveats to this approach:
the actual child .exe that is launched must be a console application so it remains attached to stdin/out/err. As in the above example, I easily adapted my existing application that used a message pump (but didn't show a GUI) by just creating a tiny console project that referenced the existing project, instantiating my application context and calling Application.Run() inside the Main method of the console .exe.
Technically, this merely signals the child process when the parent exits, so it will work whether the parent process exited normally or crashed, but it's still up to the child process to perform its own shutdown. This may or may not be what you want...
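For reference, here is a message-pump-free variant of the same idea, a minimal sketch: the child dedicates a background thread to a blocking read on its redirected standard input and exits once the read returns 0 bytes (stream closed, i.e. the parent is gone). The class and method names are illustrative.

using System;
using System.Threading;

static class ParentExitWatcher
{
    // Call once at child startup; assumes the parent redirected our standard input.
    public static void Start()
    {
        var watcher = new Thread(() =>
        {
            var stdInput = Console.OpenStandardInput();
            var buffer = new byte[1];
            // Read blocks until data arrives and returns 0 only when the stream is closed.
            while (stdInput.Read(buffer, 0, buffer.Length) > 0) { }
            Environment.Exit(0); // parent is gone; shut the child down
        });
        watcher.IsBackground = true;
        watcher.Start();
    }
}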
There is another simple and effective way to finish child processes on program termination: implement a debugger in the parent and attach it to them. When the parent process ends, the debugged child processes are killed by the OS. It can also go the other way, attaching a debugger to the parent from the child (note that only one debugger can be attached to a given process at a time). You can find more info on the subject here.
Here is a utility class that launches a new process and attaches a debugger to it. It has been adapted from this post by Roger Knapp. The only requirement is that both processes share the same bitness; you cannot debug a 32-bit process from a 64-bit process or vice versa.
public class ProcessRunner
{
// see http://csharptest.net/1051/managed-anti-debugging-how-to-prevent-users-from-attaching-a-debugger/
// see https://stackoverflow.com/a/24012744/2982757
public Process ChildProcess { get; set; }
public bool StartProcess(string fileName)
{
var processStartInfo = new ProcessStartInfo(fileName)
{
UseShellExecute = false,
WindowStyle = ProcessWindowStyle.Normal,
ErrorDialog = false
};
this.ChildProcess = Process.Start(processStartInfo);
if (ChildProcess == null)
return false;
new Thread(NullDebugger) {IsBackground = true}.Start(ChildProcess.Id);
return true;
}
private void NullDebugger(object arg)
{
// Attach to the process we provided the thread as an argument
if (DebugActiveProcess((int) arg))
{
var debugEvent = new DEBUG_EVENT {bytes = new byte[1024]};
while (!this.ChildProcess.HasExited)
{
if (WaitForDebugEvent(out debugEvent, 1000))
{
// return DBG_CONTINUE for all events but the exception type
var continueFlag = DBG_CONTINUE;
if (debugEvent.dwDebugEventCode == DebugEventType.EXCEPTION_DEBUG_EVENT)
continueFlag = DBG_EXCEPTION_NOT_HANDLED;
ContinueDebugEvent(debugEvent.dwProcessId, debugEvent.dwThreadId, continueFlag);
}
}
}
else
{
//we were not able to attach the debugger
//do the processes have the same bitness?
//throw ApplicationException("Unable to attach debugger") // Kill child? // Send Event? // Ignore?
}
}
#region "API imports"
private const int DBG_CONTINUE = 0x00010002;
private const int DBG_EXCEPTION_NOT_HANDLED = unchecked((int) 0x80010001);
private enum DebugEventType : int
{
CREATE_PROCESS_DEBUG_EVENT = 3,
//Reports a create-process debugging event. The value of u.CreateProcessInfo specifies a CREATE_PROCESS_DEBUG_INFO structure.
CREATE_THREAD_DEBUG_EVENT = 2,
//Reports a create-thread debugging event. The value of u.CreateThread specifies a CREATE_THREAD_DEBUG_INFO structure.
EXCEPTION_DEBUG_EVENT = 1,
//Reports an exception debugging event. The value of u.Exception specifies an EXCEPTION_DEBUG_INFO structure.
EXIT_PROCESS_DEBUG_EVENT = 5,
//Reports an exit-process debugging event. The value of u.ExitProcess specifies an EXIT_PROCESS_DEBUG_INFO structure.
EXIT_THREAD_DEBUG_EVENT = 4,
//Reports an exit-thread debugging event. The value of u.ExitThread specifies an EXIT_THREAD_DEBUG_INFO structure.
LOAD_DLL_DEBUG_EVENT = 6,
//Reports a load-dynamic-link-library (DLL) debugging event. The value of u.LoadDll specifies a LOAD_DLL_DEBUG_INFO structure.
OUTPUT_DEBUG_STRING_EVENT = 8,
//Reports an output-debugging-string debugging event. The value of u.DebugString specifies an OUTPUT_DEBUG_STRING_INFO structure.
RIP_EVENT = 9,
//Reports a RIP-debugging event (system debugging error). The value of u.RipInfo specifies a RIP_INFO structure.
UNLOAD_DLL_DEBUG_EVENT = 7,
//Reports an unload-DLL debugging event. The value of u.UnloadDll specifies an UNLOAD_DLL_DEBUG_INFO structure.
}
[StructLayout(LayoutKind.Sequential)]
private struct DEBUG_EVENT
{
[MarshalAs(UnmanagedType.I4)] public DebugEventType dwDebugEventCode;
public int dwProcessId;
public int dwThreadId;
[MarshalAs(UnmanagedType.ByValArray, SizeConst = 1024)] public byte[] bytes;
}
[DllImport("Kernel32.dll", SetLastError = true)]
private static extern bool DebugActiveProcess(int dwProcessId);
[DllImport("Kernel32.dll", SetLastError = true)]
private static extern bool WaitForDebugEvent([Out] out DEBUG_EVENT lpDebugEvent, int dwMilliseconds);
[DllImport("Kernel32.dll", SetLastError = true)]
private static extern bool ContinueDebugEvent(int dwProcessId, int dwThreadId, int dwContinueStatus);
[DllImport("Kernel32.dll", SetLastError = true)]
public static extern bool IsDebuggerPresent();
#endregion
}
Usage:
new ProcessRunner().StartProcess("c:\\Windows\\system32\\calc.exe");
One way is to pass the PID of the parent process to the child. The child then periodically polls whether a process with that PID still exists; if not, it simply quits.
You can also use the Process.WaitForExit method in the child to be notified when the parent process ends, but that might not work if the parent is killed from Task Manager.
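A minimal sketch of the polling variant, assuming the parent passes its PID as the first command-line argument (the names and the 1-second interval are illustrative):

using System;
using System.Diagnostics;
using System.Threading;

class ChildProgram
{
    static void Main(string[] args)
    {
        int parentPid = int.Parse(args[0]);

        var poller = new Thread(() =>
        {
            while (true)
            {
                try
                {
                    // Throws ArgumentException once no process with that PID exists.
                    // (Note: PIDs can be reused, so this is a heuristic.)
                    Process.GetProcessById(parentPid);
                }
                catch (ArgumentException)
                {
                    Environment.Exit(0); // parent is gone
                }
                Thread.Sleep(1000);
            }
        });
        poller.IsBackground = true;
        poller.Start();

        // ... the child's real work goes here ...
    }
}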
I was looking for a solution to this problem that did not require unmanaged code. I was also not able to use standard input/output redirection because it was a Windows Forms application.
My solution was to create a named pipe in the parent process and then connect the child process to the same pipe. If the parent process exits then the pipe becomes broken and the child can detect this.
Below is an example using two console applications:
Parent
private const string PipeName = "471450d6-70db-49dc-94af-09d3f3eba529";
public static void Main(string[] args)
{
Console.WriteLine("Main program running");
using (NamedPipeServerStream pipe = new NamedPipeServerStream(PipeName, PipeDirection.Out))
{
Process.Start("child.exe");
Console.WriteLine("Press any key to exit");
Console.ReadKey();
}
}
Child
private const string PipeName = "471450d6-70db-49dc-94af-09d3f3eba529"; // same as parent
public static void Main(string[] args)
{
Console.WriteLine("Child process running");
using (NamedPipeClientStream pipe = new NamedPipeClientStream(".", PipeName, PipeDirection.In))
{
pipe.Connect();
pipe.BeginRead(new byte[1], 0, 1, PipeBrokenCallback, pipe);
Console.WriteLine("Press any key to exit");
Console.ReadKey();
}
}
private static void PipeBrokenCallback(IAsyncResult ar)
{
// the pipe was closed (parent process died), so exit the child process too
try
{
NamedPipeClientStream pipe = (NamedPipeClientStream)ar.AsyncState;
pipe.EndRead(ar);
}
catch (IOException) { }
Environment.Exit(1);
}
Use event handlers to hook a few exit scenarios:
var process = Process.Start("program.exe");
AppDomain.CurrentDomain.DomainUnload += (s, e) => { process.Kill(); process.WaitForExit(); };
AppDomain.CurrentDomain.ProcessExit += (s, e) => { process.Kill(); process.WaitForExit(); };
AppDomain.CurrentDomain.UnhandledException += (s, e) => { process.Kill(); process.WaitForExit(); };
Yet another addition to the abundant richness of solutions proposed so far....
The problem with many of them is that they rely on the parent and child processes shutting down in an orderly manner, which isn't always the case while development is underway. I found that my child process was often orphaned whenever I terminated the parent process in the debugger, which required me to kill the orphaned process(es) with Task Manager in order to rebuild my solution.
The solution: pass the parent process ID to the child on the command line (or, even less invasively, in an environment variable).
In the parent process, the process ID is available as:
Process.GetCurrentProcess().Id;
In the child process:
Process parentProcess = Process.GetProcessById(parentProcessId);
parentProcess.EnableRaisingEvents = true; // required, or the Exited event never fires
parentProcess.Exited += (s, e) =>
{
// clean up what you can.
this.Dispose();
// maybe log an error
....
// And terminate with prejudice!
// (since something has already gone terribly wrong)
Process.GetCurrentProcess().Kill();
};
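The hand-off itself can look like this (a sketch; the executable name and argument parsing are illustrative):

// Parent: pass our own PID to the child on its command line.
Process.Start("child.exe", Process.GetCurrentProcess().Id.ToString());

// Child (inside static void Main(string[] args)): recover the PID before
// wiring up the Exited handler shown above.
int parentProcessId = int.Parse(args[0]);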
I am of two minds as to whether this is acceptable practice in production code. On the one hand, this should never happen. But on the other hand, it may mean the difference between restarting a process, and rebooting a production server. And what should never happen often does.
And it sure is useful while debugging orderly shutdown problems.
Solution that worked for me:
When creating the process, set process.EnableRaisingEvents = true;:
var csc = new Process();
csc.StartInfo.UseShellExecute = false;
csc.StartInfo.CreateNoWindow = true;
csc.StartInfo.FileName = Path.Combine(HLib.path_dataFolder, "csc.exe");
csc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
csc.StartInfo.ErrorDialog = false;
csc.StartInfo.RedirectStandardInput = true;
csc.StartInfo.RedirectStandardOutput = true;
csc.EnableRaisingEvents = true;
csc.Start();
I see two options:
If you know exactly which child processes could be started, and you are sure they are only started from your main process, then you could consider simply searching for them by name and killing them.
Iterate through all processes and kill every process that has your process as its parent (you probably need to kill the child processes first). How to get the parent process ID is explained here.
I've made a child process management library in which the parent process and the child process are monitored via a bidirectional WCF pipe. If either the child process or the parent process terminates, the other is notified.
There is also a debugger helper available which automatically attaches the VS debugger to the started child process.
Project site:
http://www.crawler-lib.net/child-processes
NuGet Packages:
https://www.nuget.org/packages/ChildProcesses
https://www.nuget.org/packages/ChildProcesses.VisualStudioDebug/
Just my 2018 version.
Use it alongside your Main() method.
using System.Management;
using System.Diagnostics;
...
// Called when the Main Window is closed
protected override void OnClosed(EventArgs e)
{
string query = "Select * From Win32_Process Where ParentProcessId = " + Process.GetCurrentProcess().Id;
ManagementObjectSearcher searcher = new ManagementObjectSearcher(query);
ManagementObjectCollection processList = searcher.Get();
foreach (var obj in processList)
{
object data = obj.Properties["processid"].Value;
if (data != null)
{
// retrieve the process; GetProcessById throws if it has already exited
var childId = Convert.ToInt32(data);
try
{
Process.GetProcessById(childId).Kill();
}
catch (ArgumentException) { /* the child already exited on its own */ }
}
}
Environment.Exit(0);
}
It is better to call job.AddProcess after the process has started:
prc.Start();
job.AddProcess(prc.Handle);
If AddProcess is only called right before termination, the child processes are not killed. (Windows 7 SP1)
private void KillProcess(Process proc)
{
var job = new Job();
job.AddProcess(proc.Handle);
job.Close();
}
I had the same problem. I was creating child processes that never got killed if my main app crashed, and I had to kill them manually while debugging. I found that there was no need to make the children depend on the parent. In my Main, I added a try/catch to do a CleanUp() of child processes on exit.
static void Main()
{
Application.EnableVisualStyles();
Application.SetCompatibleTextRenderingDefault(false);
try
{
Application.Run(new frmMonitorSensors());
}
catch(Exception ex)
{
CleanUp();
ErrorLogging.Add(ex.ToString());
}
}
static private void CleanUp()
{
List<string> processesToKill = new List<string>() { "Process1", "Process2" };
foreach (string toKill in processesToKill)
{
Process[] processes = Process.GetProcessesByName(toKill);
foreach (Process p in processes)
{
p.Kill();
}
}
}
If you can catch the situation in which your process tree should be killed: since .NET Core 3.0 (and therefore also in .NET 5 and later) you can use Process.Kill(bool entireProcessTree).
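A minimal sketch (this only helps on code paths that actually run, i.e. a controlled shutdown, not when the parent itself is killed outright; "child.exe" is a placeholder):

using System.Diagnostics;

class ControlledShutdown
{
    static void Main()
    {
        Process child = Process.Start("child.exe"); // hypothetical child

        // ... later, on a deliberate shutdown path:
        child.Kill(entireProcessTree: true); // terminates the child and all of its descendants
    }
}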
Related
Ensure that the process i opened will close if my application will crash [duplicate]
I'm creating new processes using System.Diagnostics.Process class from my application. I want this processes to be killed when/if my application has crashed. But if I kill my application from Task Manager, child processes are not killed. Is there any way to make child processes dependent on parent process?
From this forum, credit to 'Josh'. Application.Quit() and Process.Kill() are possible solutions, but have proven to be unreliable. When your main application dies, you are still left with child processes running. What we really want is for the child processes to die as soon as the main process dies. The solution is to use "job objects" http://msdn.microsoft.com/en-us/library/ms682409(VS.85).aspx. The idea is to create a "job object" for your main application, and register your child processes with the job object. If the main process dies, the OS will take care of terminating the child processes. public enum JobObjectInfoType { AssociateCompletionPortInformation = 7, BasicLimitInformation = 2, BasicUIRestrictions = 4, EndOfJobTimeInformation = 6, ExtendedLimitInformation = 9, SecurityLimitInformation = 5, GroupInformation = 11 } [StructLayout(LayoutKind.Sequential)] public struct SECURITY_ATTRIBUTES { public int nLength; public IntPtr lpSecurityDescriptor; public int bInheritHandle; } [StructLayout(LayoutKind.Sequential)] struct JOBOBJECT_BASIC_LIMIT_INFORMATION { public Int64 PerProcessUserTimeLimit; public Int64 PerJobUserTimeLimit; public Int16 LimitFlags; public UInt32 MinimumWorkingSetSize; public UInt32 MaximumWorkingSetSize; public Int16 ActiveProcessLimit; public Int64 Affinity; public Int16 PriorityClass; public Int16 SchedulingClass; } [StructLayout(LayoutKind.Sequential)] struct IO_COUNTERS { public UInt64 ReadOperationCount; public UInt64 WriteOperationCount; public UInt64 OtherOperationCount; public UInt64 ReadTransferCount; public UInt64 WriteTransferCount; public UInt64 OtherTransferCount; } [StructLayout(LayoutKind.Sequential)] struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION { public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation; public IO_COUNTERS IoInfo; public UInt32 ProcessMemoryLimit; public UInt32 JobMemoryLimit; public UInt32 PeakProcessMemoryUsed; public UInt32 PeakJobMemoryUsed; } public class Job : IDisposable { [DllImport("kernel32.dll", CharSet = CharSet.Unicode)] static extern IntPtr CreateJobObject(object a, string lpName); [DllImport("kernel32.dll")] static extern bool SetInformationJobObject(IntPtr hJob, JobObjectInfoType infoType, IntPtr lpJobObjectInfo, uint cbJobObjectInfoLength); [DllImport("kernel32.dll", SetLastError = true)] static extern bool AssignProcessToJobObject(IntPtr job, IntPtr process); private IntPtr m_handle; private bool m_disposed = false; public Job() { m_handle = CreateJobObject(null, null); JOBOBJECT_BASIC_LIMIT_INFORMATION info = new JOBOBJECT_BASIC_LIMIT_INFORMATION(); info.LimitFlags = 0x2000; JOBOBJECT_EXTENDED_LIMIT_INFORMATION extendedInfo = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION(); extendedInfo.BasicLimitInformation = info; int length = Marshal.SizeOf(typeof(JOBOBJECT_EXTENDED_LIMIT_INFORMATION)); IntPtr extendedInfoPtr = Marshal.AllocHGlobal(length); Marshal.StructureToPtr(extendedInfo, extendedInfoPtr, false); if (!SetInformationJobObject(m_handle, JobObjectInfoType.ExtendedLimitInformation, extendedInfoPtr, (uint)length)) throw new Exception(string.Format("Unable to set information. 
Error: {0}", Marshal.GetLastWin32Error())); } #region IDisposable Members public void Dispose() { Dispose(true); GC.SuppressFinalize(this); } #endregion private void Dispose(bool disposing) { if (m_disposed) return; if (disposing) {} Close(); m_disposed = true; } public void Close() { Win32.CloseHandle(m_handle); m_handle = IntPtr.Zero; } public bool AddProcess(IntPtr handle) { return AssignProcessToJobObject(m_handle, handle); } } Looking at the constructor ... JOBOBJECT_BASIC_LIMIT_INFORMATION info = new JOBOBJECT_BASIC_LIMIT_INFORMATION(); info.LimitFlags = 0x2000; The key here is to setup the job object properly. In the constructor I'm setting the "limits" to 0x2000, which is the numeric value for JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE. MSDN defines this flag as: Causes all processes associated with the job to terminate when the last handle to the job is closed. Once this class is setup...you just have to register each child process with the job. For example: [DllImport("user32.dll", SetLastError = true)] public static extern uint GetWindowThreadProcessId(IntPtr hWnd, out uint lpdwProcessId); Excel.Application app = new Excel.ApplicationClass(); uint pid = 0; Win32.GetWindowThreadProcessId(new IntPtr(app.Hwnd), out pid); job.AddProcess(Process.GetProcessById((int)pid).Handle);
This answer started with #Matt Howells' excellent answer plus others (see links in the code below). Improvements: Supports 32-bit and 64-bit. Fixes some problems in #Matt Howells' answer: The small memory leak of extendedInfoPtr The 'Win32' compile error, and A stack-unbalanced exception I got in the call to CreateJobObject (using Windows 10, Visual Studio 2015, 32-bit). Names the Job, so if you use SysInternals, for example, you can easily find it. Has a somewhat simpler API and less code. Here's how to use this code: // Get a Process object somehow. Process process = Process.Start(exePath, args); // Add the Process to ChildProcessTracker. ChildProcessTracker.AddProcess(process); To support Windows 7 requires: A simple app.manifest change as #adam smith describes. Registry settings to be added if you are using Visual Studio. In my case, I didn't need to support Windows 7, so I have a simple check at the top of the static constructor below. /// <summary> /// Allows processes to be automatically killed if this parent process unexpectedly quits. /// This feature requires Windows 8 or greater. On Windows 7, nothing is done.</summary> /// <remarks>References: /// https://stackoverflow.com/a/4657392/386091 /// https://stackoverflow.com/a/9164742/386091 </remarks> public static class ChildProcessTracker { /// <summary> /// Add the process to be tracked. If our current process is killed, the child processes /// that we are tracking will be automatically killed, too. If the child process terminates /// first, that's fine, too.</summary> /// <param name="process"></param> public static void AddProcess(Process process) { if (s_jobHandle != IntPtr.Zero) { bool success = AssignProcessToJobObject(s_jobHandle, process.Handle); if (!success && !process.HasExited) throw new Win32Exception(); } } static ChildProcessTracker() { // This feature requires Windows 8 or later. To support Windows 7 requires // registry settings to be added if you are using Visual Studio plus an // app.manifest change. // https://stackoverflow.com/a/4232259/386091 // https://stackoverflow.com/a/9507862/386091 if (Environment.OSVersion.Version < new Version(6, 2)) return; // The job name is optional (and can be null) but it helps with diagnostics. // If it's not null, it has to be unique. Use SysInternals' Handle command-line // utility: handle -a ChildProcessTracker string jobName = "ChildProcessTracker" + Process.GetCurrentProcess().Id; s_jobHandle = CreateJobObject(IntPtr.Zero, jobName); var info = new JOBOBJECT_BASIC_LIMIT_INFORMATION(); // This is the key flag. When our process is killed, Windows will automatically // close the job handle, and when that happens, we want the child processes to // be killed, too. 
info.LimitFlags = JOBOBJECTLIMIT.JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE; var extendedInfo = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION(); extendedInfo.BasicLimitInformation = info; int length = Marshal.SizeOf(typeof(JOBOBJECT_EXTENDED_LIMIT_INFORMATION)); IntPtr extendedInfoPtr = Marshal.AllocHGlobal(length); try { Marshal.StructureToPtr(extendedInfo, extendedInfoPtr, false); if (!SetInformationJobObject(s_jobHandle, JobObjectInfoType.ExtendedLimitInformation, extendedInfoPtr, (uint)length)) { throw new Win32Exception(); } } finally { Marshal.FreeHGlobal(extendedInfoPtr); } } [DllImport("kernel32.dll", CharSet = CharSet.Unicode)] static extern IntPtr CreateJobObject(IntPtr lpJobAttributes, string name); [DllImport("kernel32.dll")] static extern bool SetInformationJobObject(IntPtr job, JobObjectInfoType infoType, IntPtr lpJobObjectInfo, uint cbJobObjectInfoLength); [DllImport("kernel32.dll", SetLastError = true)] static extern bool AssignProcessToJobObject(IntPtr job, IntPtr process); // Windows will automatically close any open job handles when our process terminates. // This can be verified by using SysInternals' Handle utility. When the job handle // is closed, the child processes will be killed. private static readonly IntPtr s_jobHandle; } public enum JobObjectInfoType { AssociateCompletionPortInformation = 7, BasicLimitInformation = 2, BasicUIRestrictions = 4, EndOfJobTimeInformation = 6, ExtendedLimitInformation = 9, SecurityLimitInformation = 5, GroupInformation = 11 } [StructLayout(LayoutKind.Sequential)] public struct JOBOBJECT_BASIC_LIMIT_INFORMATION { public Int64 PerProcessUserTimeLimit; public Int64 PerJobUserTimeLimit; public JOBOBJECTLIMIT LimitFlags; public UIntPtr MinimumWorkingSetSize; public UIntPtr MaximumWorkingSetSize; public UInt32 ActiveProcessLimit; public Int64 Affinity; public UInt32 PriorityClass; public UInt32 SchedulingClass; } [Flags] public enum JOBOBJECTLIMIT : uint { JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE = 0x2000 } [StructLayout(LayoutKind.Sequential)] public struct IO_COUNTERS { public UInt64 ReadOperationCount; public UInt64 WriteOperationCount; public UInt64 OtherOperationCount; public UInt64 ReadTransferCount; public UInt64 WriteTransferCount; public UInt64 OtherTransferCount; } [StructLayout(LayoutKind.Sequential)] public struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION { public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation; public IO_COUNTERS IoInfo; public UIntPtr ProcessMemoryLimit; public UIntPtr JobMemoryLimit; public UIntPtr PeakProcessMemoryUsed; public UIntPtr PeakJobMemoryUsed; } I carefully tested both the 32-bit and 64-bit versions of the structs by programmatically comparing the managed and native versions to each other (the overall size as well as the offsets for each member). I've tested this code on Windows 7, 8, and 10.
This post is intended as an extension to #Matt Howells' answer, specifically for those who run into problems with using Job Objects under Vista or Win7, especially if you get an access denied error ('5') when calling AssignProcessToJobObject. tl;dr To ensure compatibility with Vista and Win7, add the following manifest to the .NET parent process: <?xml version="1.0" encoding="utf-8" standalone="yes"?> <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0"> <v3:trustInfo xmlns:v3="urn:schemas-microsoft-com:asm.v3"> <v3:security> <v3:requestedPrivileges> <v3:requestedExecutionLevel level="asInvoker" uiAccess="false" /> </v3:requestedPrivileges> </v3:security> </v3:trustInfo> <compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1"> <!-- We specify these, in addition to the UAC above, so we avoid Program Compatibility Assistant in Vista and Win7 --> <!-- We try to avoid PCA so we can use Windows Job Objects --> <!-- See https://stackoverflow.com/questions/3342941/kill-child-process-when-parent-process-is-killed --> <application> <!--The ID below indicates application support for Windows Vista --> <supportedOS Id="{e2011457-1546-43c5-a5fe-008deee3d3f0}"/> <!--The ID below indicates application support for Windows 7 --> <supportedOS Id="{35138b9a-5d96-4fbd-8e2d-a2440225f93a}"/> </application> </compatibility> </assembly> Note that when you add new manifest in Visual Studio 2012 it will contain the above snippet already so you do not need to copy it from hear. It will also include a node for Windows 8. full explanation Your job association will fail with an access denied error if the process you're starting is already associated with another job. Enter Program Compatibility Assistant, which, starting in Windows Vista, will assign all kinds of processes to its own jobs. In Vista you can mark your application to be excluded from PCA by simply including an application manifest. Visual Studio seems to do this for .NET apps automatically, so you're fine there. A simple manifest no longer cuts it in Win7. [1] There, you have to specifically specify that you're compatible with Win7 with the tag in your manifest. [2] This led me to worry about Windows 8. Will I have to change my manifest once again? Apparently there's a break in the clouds, as Windows 8 now allows a process to belong to multiple jobs. [3] So I haven't tested it yet, but I imagine that this madness will be over now if you simply include a manifest with the supportedOS information. Tip 1: If you're developing a .NET app with Visual Studio, as I was, here [4] are some nice instructions on how to customize your application manifest. Tip 2: Be careful with launching your application from Visual Studio. I found that, after adding the appropriate manifest, I still had problems with PCA when launching from Visual Studio, even if I used Start without Debugging. Launching my application from Explorer worked, however. After manually adding devenv for exclusion from PCA using the registry, starting applications that used Job Objects from VS started working as well. [5] Tip 3: If you ever want to know if PCA is your problem, try launching your application from the command line, or copy the program to a network drive and run it from there. PCA is automatically disabled in those contexts. 
[1] http://blogs.msdn.com/b/cjacks/archive/2009/06/18/pca-changes-for-windows-7-how-to-tell-us-you-are-not-an-installer-take-2-because-we-changed-the-rules-on-you.aspx [2] http://ayende.com/blog/4360/how-to-opt-out-of-program-compatibility-assistant [3] http://msdn.microsoft.com/en-us/library/windows/desktop/ms681949(v=vs.85).aspx: "A process can be associated with more than one job in Windows 8" [4] How can I embed an application manifest into an application using VS2008? [5] How to stop the Visual Studio debugger starting my process in a job object?
Here's an alternative that may work for some when you have control of the code the child process runs. The benefit of this approach is it doesn't require any native Windows calls. The basic idea is to redirect the child's standard input to a stream whose other end is connected to the parent, and use that stream to detect when the parent has gone away. When you use System.Diagnostics.Process to start the child, it's easy to ensure its standard input is redirected: Process childProcess = new Process(); childProcess.StartInfo = new ProcessStartInfo("pathToConsoleModeApp.exe"); childProcess.StartInfo.RedirectStandardInput = true; childProcess.StartInfo.CreateNoWindow = true; // no sense showing an empty black console window which the user can't input into And then, on the child process, take advantage of the fact that Reads from the standard input stream will always return with at least 1 byte until the stream is closed, when they will start returning 0 bytes. An outline of the way I ended up doing this is below; my way also uses a message pump to keep the main thread available for things other than watching standard in, but this general approach could be used without message pumps too. using System; using System.IO; using System.Threading; using System.Windows.Forms; static int Main() { Application.Run(new MyApplicationContext()); return 0; } public class MyApplicationContext : ApplicationContext { private SynchronizationContext _mainThreadMessageQueue = null; private Stream _stdInput; public MyApplicationContext() { _stdInput = Console.OpenStandardInput(); // feel free to use a better way to post to the message loop from here if you know one ;) System.Windows.Forms.Timer handoffToMessageLoopTimer = new System.Windows.Forms.Timer(); handoffToMessageLoopTimer.Interval = 1; handoffToMessageLoopTimer.Tick += new EventHandler((obj, eArgs) => { PostMessageLoopInitialization(handoffToMessageLoopTimer); }); handoffToMessageLoopTimer.Start(); } private void PostMessageLoopInitialization(System.Windows.Forms.Timer t) { if (_mainThreadMessageQueue == null) { t.Stop(); _mainThreadMessageQueue = SynchronizationContext.Current; } // constantly monitor standard input on a background thread that will // signal the main thread when stuff happens. BeginMonitoringStdIn(null); // start up your application's real work here } private void BeginMonitoringStdIn(object state) { if (SynchronizationContext.Current == _mainThreadMessageQueue) { // we're already running on the main thread - proceed. var buffer = new byte[128]; _stdInput.BeginRead(buffer, 0, buffer.Length, (asyncResult) => { int amtRead = _stdInput.EndRead(asyncResult); if (amtRead == 0) { _mainThreadMessageQueue.Post(new SendOrPostCallback(ApplicationTeardown), null); } else { BeginMonitoringStdIn(null); } }, null); } else { // not invoked from the main thread - dispatch another call to this method on the main thread and return _mainThreadMessageQueue.Post(new SendOrPostCallback(BeginMonitoringStdIn), null); } } private void ApplicationTeardown(object state) { // tear down your application gracefully here _stdInput.Close(); this.ExitThread(); } } Caveats to this approach: the actual child .exe that is launched must be a console application so it remains attached to stdin/out/err. 
As in the above example, I easily adapted my existing application that used a message pump (but didn't show a GUI) by just creating a tiny console project that referenced the existing project, instantiating my application context and calling Application.Run() inside the Main method of the console .exe. Technically, this merely signals the child process when the parent exits, so it will work whether the parent process exited normally or crashed, but its still up to the child processes to perform its own shutdown. This may or may not be what you want...
There is another relevant method, easy and effective, to finish child processes on program termination. You can implement and attach a debugger to them from the parent; when the parent process ends, child processes will be killed by the OS. It can go both ways attaching a debugger to the parent from the child (note that you can only attach one debugger at a time). You can find more info on the subject here. Here you have an utility class that launches a new process and attaches a debugger to it. It has been adapted from this post by Roger Knapp. The only requirement is that both processes need to share the same bitness. You cannot debug a 32bit process from a 64bit process or vice versa. public class ProcessRunner { // see http://csharptest.net/1051/managed-anti-debugging-how-to-prevent-users-from-attaching-a-debugger/ // see https://stackoverflow.com/a/24012744/2982757 public Process ChildProcess { get; set; } public bool StartProcess(string fileName) { var processStartInfo = new ProcessStartInfo(fileName) { UseShellExecute = false, WindowStyle = ProcessWindowStyle.Normal, ErrorDialog = false }; this.ChildProcess = Process.Start(processStartInfo); if (ChildProcess == null) return false; new Thread(NullDebugger) {IsBackground = true}.Start(ChildProcess.Id); return true; } private void NullDebugger(object arg) { // Attach to the process we provided the thread as an argument if (DebugActiveProcess((int) arg)) { var debugEvent = new DEBUG_EVENT {bytes = new byte[1024]}; while (!this.ChildProcess.HasExited) { if (WaitForDebugEvent(out debugEvent, 1000)) { // return DBG_CONTINUE for all events but the exception type var continueFlag = DBG_CONTINUE; if (debugEvent.dwDebugEventCode == DebugEventType.EXCEPTION_DEBUG_EVENT) continueFlag = DBG_EXCEPTION_NOT_HANDLED; ContinueDebugEvent(debugEvent.dwProcessId, debugEvent.dwThreadId, continueFlag); } } } else { //we were not able to attach the debugger //do the processes have the same bitness? //throw ApplicationException("Unable to attach debugger") // Kill child? // Send Event? // Ignore? } } #region "API imports" private const int DBG_CONTINUE = 0x00010002; private const int DBG_EXCEPTION_NOT_HANDLED = unchecked((int) 0x80010001); private enum DebugEventType : int { CREATE_PROCESS_DEBUG_EVENT = 3, //Reports a create-process debugging event. The value of u.CreateProcessInfo specifies a CREATE_PROCESS_DEBUG_INFO structure. CREATE_THREAD_DEBUG_EVENT = 2, //Reports a create-thread debugging event. The value of u.CreateThread specifies a CREATE_THREAD_DEBUG_INFO structure. EXCEPTION_DEBUG_EVENT = 1, //Reports an exception debugging event. The value of u.Exception specifies an EXCEPTION_DEBUG_INFO structure. EXIT_PROCESS_DEBUG_EVENT = 5, //Reports an exit-process debugging event. The value of u.ExitProcess specifies an EXIT_PROCESS_DEBUG_INFO structure. EXIT_THREAD_DEBUG_EVENT = 4, //Reports an exit-thread debugging event. The value of u.ExitThread specifies an EXIT_THREAD_DEBUG_INFO structure. LOAD_DLL_DEBUG_EVENT = 6, //Reports a load-dynamic-link-library (DLL) debugging event. The value of u.LoadDll specifies a LOAD_DLL_DEBUG_INFO structure. OUTPUT_DEBUG_STRING_EVENT = 8, //Reports an output-debugging-string debugging event. The value of u.DebugString specifies an OUTPUT_DEBUG_STRING_INFO structure. RIP_EVENT = 9, //Reports a RIP-debugging event (system debugging error). The value of u.RipInfo specifies a RIP_INFO structure. UNLOAD_DLL_DEBUG_EVENT = 7, //Reports an unload-DLL debugging event. 
The value of u.UnloadDll specifies an UNLOAD_DLL_DEBUG_INFO structure. } [StructLayout(LayoutKind.Sequential)] private struct DEBUG_EVENT { [MarshalAs(UnmanagedType.I4)] public DebugEventType dwDebugEventCode; public int dwProcessId; public int dwThreadId; [MarshalAs(UnmanagedType.ByValArray, SizeConst = 1024)] public byte[] bytes; } [DllImport("Kernel32.dll", SetLastError = true)] private static extern bool DebugActiveProcess(int dwProcessId); [DllImport("Kernel32.dll", SetLastError = true)] private static extern bool WaitForDebugEvent([Out] out DEBUG_EVENT lpDebugEvent, int dwMilliseconds); [DllImport("Kernel32.dll", SetLastError = true)] private static extern bool ContinueDebugEvent(int dwProcessId, int dwThreadId, int dwContinueStatus); [DllImport("Kernel32.dll", SetLastError = true)] public static extern bool IsDebuggerPresent(); #endregion } Usage: new ProcessRunner().StartProcess("c:\\Windows\\system32\\calc.exe");
One way is to pass PID of parent process to the child. The child will periodically poll if the process with the specified pid exists or not. If not it will just quit. You can also use Process.WaitForExit method in child method to be notified when the parent process ends but it might not work in case of Task Manager.
I was looking for a solution to this problem that did not require unmanaged code. I was also not able to use standard input/output redirection because it was a Windows Forms application. My solution was to create a named pipe in the parent process and then connect the child process to the same pipe. If the parent process exits then the pipe becomes broken and the child can detect this. Below is an example using two console applications: Parent private const string PipeName = "471450d6-70db-49dc-94af-09d3f3eba529"; public static void Main(string[] args) { Console.WriteLine("Main program running"); using (NamedPipeServerStream pipe = new NamedPipeServerStream(PipeName, PipeDirection.Out)) { Process.Start("child.exe"); Console.WriteLine("Press any key to exit"); Console.ReadKey(); } } Child private const string PipeName = "471450d6-70db-49dc-94af-09d3f3eba529"; // same as parent public static void Main(string[] args) { Console.WriteLine("Child process running"); using (NamedPipeClientStream pipe = new NamedPipeClientStream(".", PipeName, PipeDirection.In)) { pipe.Connect(); pipe.BeginRead(new byte[1], 0, 1, PipeBrokenCallback, pipe); Console.WriteLine("Press any key to exit"); Console.ReadKey(); } } private static void PipeBrokenCallback(IAsyncResult ar) { // the pipe was closed (parent process died), so exit the child process too try { NamedPipeClientStream pipe = (NamedPipeClientStream)ar.AsyncState; pipe.EndRead(ar); } catch (IOException) { } Environment.Exit(1); }
Use event handlers to make hooks on a few exit scenarios: var process = Process.Start("program.exe"); AppDomain.CurrentDomain.DomainUnload += (s, e) => { process.Kill(); process.WaitForExit(); }; AppDomain.CurrentDomain.ProcessExit += (s, e) => { process.Kill(); process.WaitForExit(); }; AppDomain.CurrentDomain.UnhandledException += (s, e) => { process.Kill(); process.WaitForExit(); };
Yet another addition to the abundant richness of solutions proposed so far.... The problem with many of them is that they rely upon the parent and child process to shut down in an orderly manner, which isn't always true when development is underway. I found that my child process was often being orphaned whenever I terminated the parent process in the debugger, which required me to kill the orphaned process(es) with Task Manager in order to rebuild my solution. The solution: Pass the parent process ID in on the commandline (or even less invasive, in the environment variables) of the child process. In the parent process, the process ID is available as: Process.CurrentProcess().Id; In the child process: Process parentProcess = Process.GetProcessById(parentProcessId); parentProcess.Exited += (s, e) => { // clean up what you can. this.Dispose(); // maybe log an error .... // And terminate with prejudice! //(since something has already gone terribly wrong) Process.GetCurrentProcess().Kill(); }; I am of two minds as to whether this is acceptable practice in production code. On the one hand, this should never happen. But on the other hand, it may mean the difference between restarting a process, and rebooting a production server. And what should never happen often does. And it sure is useful while debugging orderly shutdown problems.
Solution that worked for me: When creating process add tag process.EnableRaisingEvents = true;: csc = new Process(); csc.StartInfo.UseShellExecute = false; csc.StartInfo.CreateNoWindow = true; csc.StartInfo.FileName = Path.Combine(HLib.path_dataFolder, "csc.exe"); csc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden; csc.StartInfo.ErrorDialog = false; csc.StartInfo.RedirectStandardInput = true; csc.StartInfo.RedirectStandardOutput = true; csc.EnableRaisingEvents = true; csc.Start();
I see two options: If you know exactly what child process could be started and you are sure they are only started from your main process, then you could consider simply searching for them by name and kill them. Iterate through all processes and kill every process that has your process as a parent (I guess you need to kill the child processes first). Here is explained how you can get the parent process id.
I've made a child process management library where the parent process and the child process are monitored due a bidirectional WCF pipe. If either the child process terminates or the parent process terminates each other is notified. There is also a debugger helper available which automatically attaches the VS debugger to the started child process Project site: http://www.crawler-lib.net/child-processes NuGet Packages: https://www.nuget.org/packages/ChildProcesses https://www.nuget.org/packages/ChildProcesses.VisualStudioDebug/
Just my 2018 version. Use it alongside your Main() method.
using System.Management;
using System.Diagnostics;
...
// Called when the main window is closed
protected override void OnClosed(EventArgs e)
{
    string query = "Select * From Win32_Process Where ParentProcessId = " + Process.GetCurrentProcess().Id;
    ManagementObjectSearcher searcher = new ManagementObjectSearcher(query);
    ManagementObjectCollection processList = searcher.Get();
    foreach (var obj in processList)
    {
        object data = obj.Properties["ProcessId"].Value;
        if (data != null)
        {
            // Retrieve and kill the child process; GetProcessById throws if it has already exited.
            var childId = Convert.ToInt32(data);
            try
            {
                Process.GetProcessById(childId).Kill();
            }
            catch (ArgumentException) { } // child no longer running
        }
    }
    Environment.Exit(0);
}
It is better to call job.AddProcess after the process has been started:
prc.Start();
job.AddProcess(prc.Handle);
If AddProcess is called before the process has started, child processes are not killed. (Windows 7 SP1)
private void KillProcess(Process proc)
{
    var job = new Job();
    job.AddProcess(proc.Handle);
    job.Close();
}
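For context, a rough sketch of the intended call order, assuming the Job class from the accepted answer exposes an AddProcess(IntPtr) method and is created once for the lifetime of the parent:
using System.Diagnostics;

static class JobUsage
{
    // One job for the lifetime of the parent; when the parent dies, the OS kills everything in the job.
    static readonly Job job = new Job();

    static Process StartChild(string fileName)
    {
        var prc = Process.Start(fileName);
        job.AddProcess(prc.Handle); // register only after Start, so prc.Handle is valid
        return prc;
    }
}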
I had the same problem. I was creating child processes that never got killed if my main app crashed, and I had to manually kill the child processes when debugging. I found there was no need to make the children depend on the parent. In my Main, I added a try/catch to do a CleanUp() of child processes on exit.
static void Main()
{
    Application.EnableVisualStyles();
    Application.SetCompatibleTextRenderingDefault(false);
    try
    {
        Application.Run(new frmMonitorSensors());
    }
    catch (Exception ex)
    {
        CleanUp();
        ErrorLogging.Add(ex.ToString());
    }
}
static private void CleanUp()
{
    List<string> processesToKill = new List<string>() { "Process1", "Process2" };
    foreach (string toKill in processesToKill)
    {
        Process[] processes = Process.GetProcessesByName(toKill);
        foreach (Process p in processes)
        {
            p.Kill();
        }
    }
}
If you can catch the situation in which your process tree should be killed, then since .NET Core 3.0 (and therefore also in .NET 5 and later) you can use Process.Kill(bool entireProcessTree).
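A minimal sketch of using that overload (the child executable name is a placeholder):
using System.Diagnostics;

class Program
{
    static void Main()
    {
        var child = Process.Start("child.exe");
        try
        {
            // ... do work while the child runs ...
        }
        finally
        {
            if (!child.HasExited)
                child.Kill(entireProcessTree: true); // kills the child and all of its descendants
        }
    }
}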
MiniDumpWriteDump from managed code throws an AccessViolationException
I have the basic MiniDumpWriteDump interop, copied off the internet, in my C# (3.5) project. Up till now, I have used this code to register on the UnhandledException event, to take a crash dump before the process shuts down. In a particular scenario I am facing now, I have set this function up to be used in another case, to take a diagnostic memory dump of the process. Whenever this function gets called (not from the UnhandledException handler), it throws an AccessViolationException.
Here's what the MiniDump code looks like (removed some redundant parts):
using (var fs = new System.IO.FileStream(fileName, System.IO.FileMode.Create, System.IO.FileAccess.Write, System.IO.FileShare.None))
{
    MiniDumpExceptionInformation exp;
    exp.ThreadId = GetCurrentThreadId();
    exp.ClientPointers = false;
    exp.ExceptionPointers = Marshal.GetExceptionPointers();
    bool bRet = MiniDumpWriteDump(
        GetCurrentProcess(),
        GetCurrentProcessId(),
        fs.SafeFileHandle.DangerousGetHandle(),
        (uint)dumpType,
        ref exp,
        IntPtr.Zero,
        IntPtr.Zero);
    return bRet;
}
Native types are defined like so:
//typedef struct _MINIDUMP_EXCEPTION_INFORMATION {
//    DWORD ThreadId;
//    PEXCEPTION_POINTERS ExceptionPointers;
//    BOOL ClientPointers;
//} MINIDUMP_EXCEPTION_INFORMATION, *PMINIDUMP_EXCEPTION_INFORMATION;
// Pack=4 is important! So it works also for x64!
[StructLayout(LayoutKind.Sequential, Pack = 4)]
struct MiniDumpExceptionInformation
{
    public uint ThreadId;
    public IntPtr ExceptionPointers;
    [MarshalAs(UnmanagedType.Bool)]
    public bool ClientPointers;
}
Marshal.GetExceptionPointers() might return IntPtr.Zero (perhaps when the OS considers the exception handled). If that is the case, you can try to put the call to MiniDumpWriteDump somewhere else. I had the same problem and solved it by putting MiniDumpWriteDump into the AppDomain.CurrentDomain.FirstChanceException event handler.
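A minimal sketch of that registration (FirstChanceException is available from .NET 4.0); WriteDump is assumed to wrap the MiniDumpWriteDump interop shown in the question, and the dump path is a placeholder:
using System;
using System.Runtime.ExceptionServices;

AppDomain.CurrentDomain.FirstChanceException += (sender, e) =>
{
    // Marshal.GetExceptionPointers() is non-zero while the first-chance exception
    // is being dispatched, so the dump captures the exception context.
    WriteDump(@"C:\dumps\firstchance.dmp");
};
In practice you would filter on e.Exception so you don't write a dump for every first-chance exception raised in the process.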
Multiple InfoPath interop automation instances
I am trying to automate multiple parallel instances of Office InfoPath 2010 from a Windows service. I understand that automating Office from a service is not supported; however, it is a requirement of my customer. I can automate other Office applications in parallel, but InfoPath behaves differently. What I have found is that only one instance of the INFOPATH.EXE process is ever created, no matter how many parallel calls to CreateObject("InfoPath.Application") are made. In contrast, multiple instances of WINWORD.EXE can be created via the similar mechanism CreateObject("Word.Application").
To reproduce this issue, a simple console application can be used.
static void Main(string[] args)
{
    // Create two instances of Word in parallel
    ThreadPool.QueueUserWorkItem(Word1);
    ThreadPool.QueueUserWorkItem(Word2);
    System.Threading.Thread.Sleep(5000);
    // Attempt to create two instances of InfoPath in parallel
    ThreadPool.QueueUserWorkItem(InfoPath1);
    ThreadPool.QueueUserWorkItem(InfoPath2);
}
static void Word1(object context)
{
    OfficeInterop.WordTest word = new OfficeInterop.WordTest();
    word.Test();
}
static void Word2(object context)
{
    OfficeInterop.WordTest word = new OfficeInterop.WordTest();
    word.Test();
}
static void InfoPath1(object context)
{
    OfficeInterop.InfoPathTest infoPath = new OfficeInterop.InfoPathTest();
    infoPath.Test();
}
static void InfoPath2(object context)
{
    OfficeInterop.InfoPathTest infoPath = new OfficeInterop.InfoPathTest();
    infoPath.Test();
}
The InfoPathTest and WordTest classes (VB) are in another project.
Public Class InfoPathTest
    Public Sub Test()
        Dim ip As Microsoft.Office.Interop.InfoPath.Application
        ip = CreateObject("InfoPath.Application")
        System.Threading.Thread.Sleep(5000)
        ip.Quit(False)
    End Sub
End Class
Public Class WordTest
    Public Sub Test()
        Dim app As Microsoft.Office.Interop.Word.Application
        app = CreateObject("Word.Application")
        System.Threading.Thread.Sleep(5000)
        app.Quit(False)
    End Sub
End Class
The interop classes simply create the automation objects, sleep, and then quit (although in the case of Word I have run more complex tests). When running the console app, I can see (via Task Manager) two WINWORD.EXE processes created in parallel, but only a single INFOPATH.EXE process. In fact, when the first instance of InfoPathTest calls ip.Quit, the INFOPATH.EXE process terminates. When the second instance of InfoPathTest calls ip.Quit, a DCOM timeout exception is thrown - it appears as though the two instances were sharing the same underlying automation object, and that object no longer exists after the first call to ip.Quit.
At this stage my thought was that only a single INFOPATH.EXE is supported per user login. I expanded the Windows service to start two new processes (a console application called InfoPathTest), each running under a different user account. These new processes would then attempt to automate INFOPATH.EXE.
Here's where it gets interesting: this actually works, but only on some machines, and I cannot figure out why that is the case.
And the service code (with help from AsproLock):
public partial class InfoPathService : ServiceBase
{
    private Thread _mainThread;
    private bool isStopping = false;

    public InfoPathService()
    {
        InitializeComponent();
    }

    protected override void OnStart(string[] args)
    {
        if (_mainThread == null || _mainThread.IsAlive == false)
        {
            _mainThread = new Thread(ProcessController);
            _mainThread.Start();
        }
    }

    protected override void OnStop()
    {
        isStopping = true;
    }

    public void ProcessController()
    {
        while (isStopping == false)
        {
            try
            {
                IntPtr hWinSta = GetProcessWindowStation();
                WindowStationSecurity ws = new WindowStationSecurity(hWinSta, System.Security.AccessControl.AccessControlSections.Access);
                ws.AddAccessRule(new WindowStationAccessRule("user1", WindowStationRights.AllAccess, System.Security.AccessControl.AccessControlType.Allow));
                ws.AddAccessRule(new WindowStationAccessRule("user2", WindowStationRights.AllAccess, System.Security.AccessControl.AccessControlType.Allow));
                ws.AcceptChanges();

                IntPtr hDesk = GetThreadDesktop(GetCurrentThreadId());
                DesktopSecurity ds = new DesktopSecurity(hDesk, System.Security.AccessControl.AccessControlSections.Access);
                ds.AddAccessRule(new DesktopAccessRule("user1", DesktopRights.AllAccess, System.Security.AccessControl.AccessControlType.Allow));
                ds.AddAccessRule(new DesktopAccessRule("user2", DesktopRights.AllAccess, System.Security.AccessControl.AccessControlType.Allow));
                ds.AcceptChanges();

                ThreadPool.QueueUserWorkItem(Process1);
                ThreadPool.QueueUserWorkItem(Process2);
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine(String.Format("{0}: Process Controller Error {1}", System.Threading.Thread.CurrentThread.ManagedThreadId, ex.Message));
            }
            Thread.Sleep(15000);
        }
    }

    private static void Process1(object context)
    {
        SecureString pwd2;
        Process process2 = new Process();
        process2.StartInfo.FileName = @"c:\debug\InfoPathTest.exe";
        process2.StartInfo.UseShellExecute = false;
        process2.StartInfo.LoadUserProfile = true;
        process2.StartInfo.WorkingDirectory = @"C:\debug\";
        process2.StartInfo.Domain = "DEV01";
        pwd2 = new SecureString();
        foreach (char c in "password") { pwd2.AppendChar(c); }
        process2.StartInfo.Password = pwd2;
        process2.StartInfo.UserName = "user1";
        process2.Start();
        process2.WaitForExit();
    }

    private static void Process2(object context)
    {
        SecureString pwd2;
        Process process2 = new Process();
        process2.StartInfo.FileName = @"c:\debug\InfoPathTest.exe";
        process2.StartInfo.UseShellExecute = false;
        process2.StartInfo.LoadUserProfile = true;
        process2.StartInfo.WorkingDirectory = @"C:\debug\";
        process2.StartInfo.Domain = "DEV01";
        pwd2 = new SecureString();
        foreach (char c in "password") { pwd2.AppendChar(c); }
        process2.StartInfo.Password = pwd2;
        process2.StartInfo.UserName = "user2";
        process2.Start();
        process2.WaitForExit();
    }

    [DllImport("user32.dll", SetLastError = true)]
    public static extern IntPtr GetProcessWindowStation();

    [DllImport("user32.dll", SetLastError = true)]
    public static extern IntPtr GetThreadDesktop(int dwThreadId);

    [DllImport("kernel32.dll", SetLastError = true)]
    public static extern int GetCurrentThreadId();
}
The InfoPathTest.exe process simply calls the InfoPathTest.Test() method detailed above. In summary, this works, but only on certain machines. When it fails, the second INFOPATH.EXE process is actually created, but it immediately quits with an exit code of 0. There is nothing in the event logs, and no exceptions in the code. I've looked at many things to try to differentiate between working and non-working machines, but I'm now stuck.
Any pointers appreciated, especially if you have other thoughts on how to automate multiple InfoPath instances in parallel.
I'm guessing you'd get similar behavior if you tried to do the same thing with Outlook, which suggests Microsoft thinks it is a bad idea to run multiple copies. If that is so, I see two options. Option one is to make your InfoPath automation synchronous, running one instance at a time. Option two, and I have NO idea if it would even work, would be to see if you can launch virtual machines to accomplish your InfoPath work. I hope this can at least spark some new train of thought that will lead to success.
I’ve encountered a very similar issue with Outlook. The restriction of allowing only a single instance of the application to be running does not apply per user, but per interactive login session. You may read more about it in Investigating Outlook's Single-Instance Restriction: Outlook was determining whether or not another instance was already running in the interactive login session. […] During Outlook's initialization, it checks to see if a window named "Microsoft Outlook" with class name "mspim_wnd32" exists, and if so, it assumes that another instance is already running. There are ways of hacking around it – there is a tool for launching multiple Outlook instances on the Hammer of God site (scroll down) – but they will probably involve intercepting Win32 calls. As for your code only working on certain machines: That’s probably due to a race condition. If both processes manage to start up fast enough simultaneously, then they won’t detect each other’s window, and assume that they’re the only instance running. However, if the machine is slow, one process would open its window before the other, thereby causing the second process to detect the first process’s window and shut itself down. To reproduce, try introducing a delay of several seconds between launching the first process and the second – this way, only the first process should ever succeed.
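To make the suggested reproduction concrete, here is a sketch of staggering the two launches in the question's console test; the delay length is arbitrary and only needs to be long enough for INFOPATH.EXE to create its window:
// In the question's Main: start the first InfoPath attempt, wait, then start the second,
// so the second process reliably detects the first one's window and shuts itself down.
ThreadPool.QueueUserWorkItem(InfoPath1);
Thread.Sleep(10000); // several seconds' delay
ThreadPool.QueueUserWorkItem(InfoPath2);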
How to terminate child processes when the parent process is terminated in C#
Task: automatically kill all child processes if the parent process terminates. The parent process can be terminated not only in the correct way, but also by being killed in Process Explorer, for example. How can I do that?
A similar question in a C topic advises using Job objects. How can I use them in C# without an external DLL?
I tried to use Job objects, but this code doesn't work properly:
var job = PInvoke.CreateJobObject(null, null);
var jobli = new PInvoke.JOBOBJECT_BASIC_LIMIT_INFORMATION();

jobli.LimitFlags = PInvoke.LimitFlags.JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE
    | PInvoke.LimitFlags.JOB_OBJECT_LIMIT_PRIORITY_CLASS
    | PInvoke.LimitFlags.JOB_OBJECT_LIMIT_JOB_TIME
    | PInvoke.LimitFlags.JOB_OBJECT_LIMIT_DIE_ON_UNHANDLED_EXCEPTION
    | PInvoke.LimitFlags.JOB_OBJECT_LIMIT_JOB_MEMORY;

var res = PInvoke.SetInformationJobObject(job, PInvoke.JOBOBJECTINFOCLASS.JobObjectBasicLimitInformation, jobli, 48);
if (!res)
{
    int b = PInvoke.GetLastError();
    Console.WriteLine("Error " + b);
}

var Prc = Process.Start(...);
PInvoke.AssignProcessToJobObject(job, Prc.Handle);
PInvoke.SetInformationJobObject returns with an error; GetLastError returns error 24. However, PInvoke.AssignProcessToJobObject works and the child process is added to the job (I can see it in Process Explorer). But because SetInformationJobObject doesn't work, the spawned process stays alive when I kill the parent. What is incorrect in this code?
To kill a process tree on Windows, given only the parent process or process id, you'll need to walk the process tree. For that, you'll need a way to get the parent process id for a given process.
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading;
using System.Diagnostics;
using System.Management;

namespace KillProcessTree
{
    public static class MyExtensions
    {
        public static int GetParentProcessId(this Process p)
        {
            int parentId = 0;
            try
            {
                ManagementObject mo = new ManagementObject("win32_process.handle='" + p.Id + "'");
                mo.Get();
                parentId = Convert.ToInt32(mo["ParentProcessId"]);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
                parentId = 0;
            }
            return parentId;
        }
    }
Once you have that, actually killing the tree is not hard.
    class Program
    {
        /// <summary>
        /// Kill the specified process and all child processes
        /// </summary>
        static void Main(string[] args)
        {
            if (args.Length < 1)
            {
                Console.WriteLine("Usage: KillProcessTree <pid>");
                return;
            }
            int pid = int.Parse(args[0]);
            Process root = Process.GetProcessById(pid);
            if (root != null)
            {
                Console.WriteLine("KillProcessTree " + pid);
                var list = new List<Process>();
                GetProcessAndChildren(Process.GetProcesses(), root, list, 1);

                // kill each process
                foreach (Process p in list)
                {
                    try
                    {
                        p.Kill();
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine(ex.ToString());
                    }
                }
            }
            else
            {
                Console.WriteLine("Unknown process id: " + pid);
            }
        }

        /// <summary>
        /// Get the process and its children.
        /// We use postorder (bottom-up) traversal; as good an order as any when killing a process tree.
        /// </summary>
        /// <param name="plist">Array of all processes</param>
        /// <param name="parent">Parent process</param>
        /// <param name="output">Output list</param>
        /// <param name="indent">Indent level</param>
        private static void GetProcessAndChildren(Process[] plist, Process parent, List<Process> output, int indent)
        {
            foreach (Process p in plist)
            {
                if (p.GetParentProcessId() == parent.Id)
                {
                    GetProcessAndChildren(plist, p, output, indent + 1);
                }
            }
            output.Add(parent);
            Console.WriteLine(String.Format("{0," + indent * 4 + "} {1}", parent.Id, parent.MainModule.ModuleName));
        }
    }
} // namespace
I tried the code above and indeed, it does not work, complaining of a bad size. The reason for this is that the structure changes size depending on the host platform; the original code fragment (seen on a dozen websites) assumes a 32-bit application. Switch the structure to this (note the pointer-sized UIntPtr members) and it will work. At least it did for me.
[StructLayout(LayoutKind.Sequential)]
struct JOBOBJECT_BASIC_LIMIT_INFORMATION
{
    public Int64 PerProcessUserTimeLimit;
    public Int64 PerJobUserTimeLimit;
    public Int16 LimitFlags;
    public UIntPtr MinimumWorkingSetSize;
    public UIntPtr MaximumWorkingSetSize;
    public Int16 ActiveProcessLimit;
    public Int64 Affinity;
    public Int16 PriorityClass;
    public Int16 SchedulingClass;
}
You can pass the ProcessId of the parent process as an argument to the child process. The child processes are then responsible for periodically checking whether the parent process is still running (by calling Process.GetProcessById).
Another way to track the existence of the parent process is to use a Mutex synchronization primitive. The parent application initially creates a global mutex with a name known to the children. The children can check from time to time whether the mutex still exists and terminate if it does not. (Once the parent process is closed, the mutex is destroyed by the system automatically, regardless of how the process was closed.)
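A rough sketch of the mutex approach, assuming the parent and child agree on the mutex name; the name and the polling interval here are arbitrary:
using System;
using System.Threading;

static class ParentWatch
{
    const string MutexName = @"Global\MyApp.ParentAlive"; // shared, well-known name

    // Parent: create and hold the mutex for the lifetime of the process.
    public static Mutex Announce() => new Mutex(initiallyOwned: true, name: MutexName);

    // Child: poll; exit once the mutex can no longer be opened.
    public static void WatchParent()
    {
        while (true)
        {
            Thread.Sleep(2000);
            try
            {
                using (Mutex.OpenExisting(MutexName)) { }
            }
            catch (WaitHandleCannotBeOpenedException)
            {
                Environment.Exit(1); // parent is gone
            }
        }
    }
}
Because the child only opens the mutex briefly, the kernel object disappears as soon as the parent's handle is closed, however the parent died.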
Did you pay attention to the error code? Error 24 is ERROR_BAD_LENGTH, which probably means that 48 isn't the right length of the structure. I think it's 44, but you should compute the size (e.g. with Marshal.SizeOf) rather than hard-coding it.
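A small sketch of computing the length instead of hard-coding 48, assuming the JOBOBJECT_BASIC_LIMIT_INFORMATION struct declared in the question's PInvoke class; whether the cast is needed depends on how the wrapper's length parameter is declared:
// Ask the marshaler for the actual size of the struct as declared;
// it differs between 32-bit and 64-bit processes when pointer-sized fields are used.
int length = Marshal.SizeOf(typeof(PInvoke.JOBOBJECT_BASIC_LIMIT_INFORMATION));
var res = PInvoke.SetInformationJobObject(
    job,
    PInvoke.JOBOBJECTINFOCLASS.JobObjectBasicLimitInformation,
    jobli,
    (uint)length);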
Windows does not force child processes to close when a parent process closes. When you select "Kill Tree" in a tool like Task Manager or Process Explorer, the tool actually finds all child processes and kills them one by one.
If you want to ensure that child processes are cleaned up when your application terminates, you can create a ProcessManager class that implements IDisposable, creates the processes, keeps track of their instances, and calls Kill on each one of them on Dispose, e.g.:
public class ProcessManager : IDisposable
{
    List<Process> processes = new List<Process>();

    public Process Start(ProcessStartInfo info)
    {
        var newProcess = Process.Start(info);
        newProcess.EnableRaisingEvents = true;
        processes.Add(newProcess);
        newProcess.Exited += (sender, e) => processes.Remove(newProcess);
        return newProcess;
    }

    ~ProcessManager()
    {
        Dispose(false);
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        foreach (var process in processes)
        {
            try
            {
                if (!process.HasExited)
                    process.Kill();
            }
            catch { }
        }
    }
}
Suspend Process in C#
How do I suspend a whole process (like Process Explorer does when I click Suspend) in C#? I'm starting the process with Process.Start, and on a certain event I want to suspend the process so I can do some investigation on a "snapshot" of it.
Here's my suggestion:
[Flags]
public enum ThreadAccess : int
{
    TERMINATE = (0x0001),
    SUSPEND_RESUME = (0x0002),
    GET_CONTEXT = (0x0008),
    SET_CONTEXT = (0x0010),
    SET_INFORMATION = (0x0020),
    QUERY_INFORMATION = (0x0040),
    SET_THREAD_TOKEN = (0x0080),
    IMPERSONATE = (0x0100),
    DIRECT_IMPERSONATION = (0x0200)
}

[DllImport("kernel32.dll")]
static extern IntPtr OpenThread(ThreadAccess dwDesiredAccess, bool bInheritHandle, uint dwThreadId);
[DllImport("kernel32.dll")]
static extern uint SuspendThread(IntPtr hThread);
[DllImport("kernel32.dll")]
static extern int ResumeThread(IntPtr hThread);
[DllImport("kernel32", CharSet = CharSet.Auto, SetLastError = true)]
static extern bool CloseHandle(IntPtr handle);

private static void SuspendProcess(int pid)
{
    var process = Process.GetProcessById(pid); // throws an exception if the process does not exist
    foreach (ProcessThread pT in process.Threads)
    {
        IntPtr pOpenThread = OpenThread(ThreadAccess.SUSPEND_RESUME, false, (uint)pT.Id);
        if (pOpenThread == IntPtr.Zero)
        {
            continue;
        }
        SuspendThread(pOpenThread);
        CloseHandle(pOpenThread);
    }
}

public static void ResumeProcess(int pid)
{
    var process = Process.GetProcessById(pid);
    if (process.ProcessName == string.Empty)
        return;
    foreach (ProcessThread pT in process.Threads)
    {
        IntPtr pOpenThread = OpenThread(ThreadAccess.SUSPEND_RESUME, false, (uint)pT.Id);
        if (pOpenThread == IntPtr.Zero)
        {
            continue;
        }
        var suspendCount = 0;
        do
        {
            suspendCount = ResumeThread(pOpenThread);
        } while (suspendCount > 0);
        CloseHandle(pOpenThread);
    }
}
Thanks to Magnus. After including the Flags enum, I modified the code a bit to be an extension method in my project. I can now use:
var process = Process.GetProcessById(param.PId);
process.Suspend();
Here is the code for those who might be interested.
public static class ProcessExtension
{
    [DllImport("kernel32.dll")]
    static extern IntPtr OpenThread(ThreadAccess dwDesiredAccess, bool bInheritHandle, uint dwThreadId);
    [DllImport("kernel32.dll")]
    static extern uint SuspendThread(IntPtr hThread);
    [DllImport("kernel32.dll")]
    static extern int ResumeThread(IntPtr hThread);

    public static void Suspend(this Process process)
    {
        foreach (ProcessThread thread in process.Threads)
        {
            var pOpenThread = OpenThread(ThreadAccess.SUSPEND_RESUME, false, (uint)thread.Id);
            if (pOpenThread == IntPtr.Zero)
            {
                break;
            }
            SuspendThread(pOpenThread);
        }
    }

    public static void Resume(this Process process)
    {
        foreach (ProcessThread thread in process.Threads)
        {
            var pOpenThread = OpenThread(ThreadAccess.SUSPEND_RESUME, false, (uint)thread.Id);
            if (pOpenThread == IntPtr.Zero)
            {
                break;
            }
            ResumeThread(pOpenThread);
        }
    }
}
I have a utility which I use to generally suspend/kill/list a process. The full source is on Git.
So really, what the other answers are showing is how to suspend the threads in the process; there is no way to truly suspend the process in one call. A somewhat different solution would be to actually debug the target process which you are starting; see Mike Stall's blog for advice on how to implement this from a managed context. If you implement a debugger, you will be able to scan memory or do whatever other snapshotting you would like.
However, I would like to point out that, technically, there is no way to really do this. Even if you debug-break a target debuggee process, another process on your system may inject a thread and will be given some ability to execute code regardless of the state of the target process (even if, say, it has hit a breakpoint due to an access violation). If you have all threads suspended up to a super-high suspend count, are currently at a breakpoint in the main process thread, and are in any other such presumed-frozen state, it is still possible for the system to inject another thread into that process and execute some instructions. You could also go to the trouble of modifying or replacing all of the entry points the kernel usually calls, and so on, but you have now entered the vicious arms race of MALWARE ;)...
In any case, using the managed interfaces for debugging seems a fair amount easier than P/Invoking a lot of native API calls that will do a poor job of emulating what you probably really want to be doing... using the debug APIs ;)
See this CodeProject article for the Win32 basics: http://www.codeproject.com/KB/threads/pausep.aspx. This sample code makes use of the ToolHelp32 library from the SDK, so I would recommend turning it into an unmanaged C++/CLI library with a simple interface like "SuspendProcess(uint processID)". Process.Start will return you a Process object, from which you can get the process id, and then pass this to your new library based on the above.
Dave
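A sketch of how the calling side might look, assuming a hypothetical C++/CLI wrapper class named PausepWrapper exposing SuspendProcess/ResumeProcess; these names are illustrative and not part of the article:
using System.Diagnostics;

// Hypothetical wrapper around the ToolHelp32-based pause/resume code.
var process = Process.Start("notepad.exe");

PausepWrapper.SuspendProcess((uint)process.Id); // investigate the "snapshot" while suspended
// ... inspect the process here ...
PausepWrapper.ResumeProcess((uint)process.Id);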
[DllImport("ntdll.dll", PreserveSig = false)] public static extern void NtSuspendProcess(IntPtr processHandle); static IntPtr handle; string p = ""; foreach (Process item in Process.GetProcesses()) { if (item.ProcessName == "GammaVPN") { p = item.ProcessName; handle = item.Handle; NtSuspendProcess(handle); } } Console.WriteLine(p); Console.WriteLine("done");