I need to run several instances of an external executable from my app. The average run time for this executable is about 3 minutes.
I want to redirect output from these processes, and update a progress bar in my GUI.
Of course I don't want to wait for them to return before I can continue using my app.
I think I should create a thread for every instance, and update my progress bar when a thread finishes.
Is this the right approach?
Also, can you recommend a good resource / documentation to understand how it works? So far I've only found http://www.dotnetperls.com/threadpool.
edit: these processes are network-based, i.e. the run time may vary a lot depending on the link latency/bandwidth.
Concerning the progress bar, I would like to update it every time a process finishes. Is there a handler for that? Later I will add more detailed updates, based on the processes' output, to increase the progress at each execution step.
edit 2:
Thanks for your inputs. As I may have to run a lot of processes (up to 20) and I don't want to saturate the bandwidth, I'll run 5 in parallel max. Every time a process finishes, I increment the progress counter (for my progress bar) and run another one until they're all completed, using:
Process p = new Process();
p.StartInfo.FileName = pathToApp;
p.EnableRaisingEvents = true;
p.Exited += OnCalibrationProcessExited;
p.Start();
private void OnCalibrationProcessExited(object sender, EventArgs e)
{
    // increment the progress counter, then start the next queued process
    runAnotherOne();
}
Is it correct, or is there a more elegant way to achieve this?
I don't want my app to be blocked during execution, of course.
Is it better to use background workers for this?
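For reference, here is a minimal sketch of the throttling pattern described in edit 2 (the class name, the queue of paths, and the cap of 5 are assumptions; note that Exited is raised on a thread-pool thread, so the shared state is locked and any progress-bar update has to be marshalled back to the UI thread):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class CalibrationRunner
{
    private const int MaxParallel = 5;
    private readonly Queue<string> pendingPaths;
    private readonly object gate = new object();
    private int running;

    public CalibrationRunner(IEnumerable<string> paths)
    {
        pendingPaths = new Queue<string>(paths);
    }

    public void StartBatch()
    {
        lock (gate)
        {
            while (running < MaxParallel && pendingPaths.Count > 0)
                StartOne(pendingPaths.Dequeue());
        }
    }

    private void StartOne(string pathToApp)
    {
        var p = new Process();
        p.StartInfo.FileName = pathToApp;
        p.EnableRaisingEvents = true;
        p.Exited += OnCalibrationProcessExited;
        running++;
        p.Start();
    }

    private void OnCalibrationProcessExited(object sender, EventArgs e)
    {
        lock (gate) { running--; }
        // increment the progress counter here (marshal to the UI thread if needed),
        // then top the pool back up to 5
        StartBatch();
    }
}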
You should be using Process and ProcessStartInfo.
You'll need to set ProcessStartInfo.UseShellExecute to false, ErrorDialog to false, RedirectStandardOutput to true (and possibly RedirectStandardError too).
You'll also need to provide a delegate to the Process object to handle the output generated by the external process through OutputDataReceived (and possibly ErrorDataReceived as well).
There's also an Exited delegate you can set that will be called whenever the process exits.
Example:
ProcessStartInfo processInfo = new ProcessStartInfo("Write500Lines.exe");
processInfo.ErrorDialog = false;
processInfo.UseShellExecute = false;
processInfo.RedirectStandardOutput = true;
processInfo.RedirectStandardError = true;
Process proc = Process.Start(processInfo);
proc.ErrorDataReceived += (sender, errorLine) => { if (errorLine.Data != null) Trace.WriteLine(errorLine.Data); };
proc.OutputDataReceived += (sender, outputLine) => { if (outputLine.Data != null) Trace.WriteLine(outputLine.Data); };
proc.BeginErrorReadLine();
proc.BeginOutputReadLine();
proc.WaitForExit();
Just waiting for each thread to end before updating the progress bar results in nothing happening... then a quick jump... 3 times. You may as well skip the progress bar.
The correct way to do it IMHO would be to calculate the total work done across all 3 processes:
totalwork = time1 + time2 + time3
Now, if you have multiple processors, it will take more like max(time1, time2, time3), but that's OK. It's a representation of work.
Have a shared variable for work-done. Each time a process does some more work, update the progress bar by calculating work-done += my-work-increment. The progress is just work-done/totalwork.
This will give good results regardless of whether the threads run sequentially or in parallel. Since you don't know how things are going to run (you might have a single-processor CPU), this is the best approach.
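A small sketch of that shared-counter idea (totalWork and the per-process increments are assumed values; Interlocked keeps the updates safe when the processes report from different threads):

using System.Threading;

class ProgressTracker
{
    private readonly long totalWork;   // e.g. time1 + time2 + time3, in whatever unit you estimate
    private long workDone;

    public ProgressTracker(long totalWork)
    {
        this.totalWork = totalWork;
    }

    // Each process/thread calls this when it has done some more work;
    // the return value is the overall fraction complete (0.0 to 1.0).
    public double ReportWork(long increment)
    {
        long done = Interlocked.Add(ref workDone, increment);
        return (double)done / totalWork;
    }
}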
Just create several instances of the Process class by calling the constructor, set the properties to redirect the output stream, and then start them.
Your program won't wait for the called process to exit, as long as you don't call the WaitForExit method. No multi-threading is required.
Create a single thread.
In that thread (pseudocode):
Thread begins here
    for each externalApp
        Run the application with redirected output
        Wait for exit
        Update progress bar
    end for
Thread ends here
See http://msdn.microsoft.com/en-us/library/ty0d8k56.aspx for WaitForExit.
Or... do you want to run the external apps in parallel?
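A hedged C# version of that sequential pseudocode (externalApps and progBar are assumed names; the Invoke call marshals the progress-bar update back to the UI thread):

// assumes this runs inside a WinForms form; externalApps is a list of paths
var worker = new Thread(() =>
{
    foreach (string app in externalApps)
    {
        var psi = new ProcessStartInfo(app)
        {
            UseShellExecute = false,
            RedirectStandardOutput = true
        };
        using (var proc = Process.Start(psi))
        {
            string output = proc.StandardOutput.ReadToEnd();   // read the redirected output
            proc.WaitForExit();
        }
        progBar.Invoke((Action)(() => progBar.Increment(1)));  // update the progress bar on the UI thread
    }
});
worker.IsBackground = true;
worker.Start();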
Edit: based on the latest updates in the original post:
If you don't know the actual progress then don't use a regular progress bar. How about an infinite progress bar or a "working" icon?
An infinite progress bar could be a progress bar that fills up and then starts from the beginning until everything is done. A working icon is like the Windows busy cursor (the ever-spinning circle).
How about creating an ObservableCollection of TaskProgressInfo, where TaskProgressInfo is a custom class to which you write your progress?
Bind a WPF ListView to that collection, using a DataTemplate (with target type = TaskProgressInfo) to show a progress bar for each item (task).
Create an array of BackgroundWorkers that launch the external app and monitor it.
Each BackgroundWorker should update its TaskProgressInfo, thus updating the data source of a progress bar.
Upon completion each BackgroundWorker should remove its TaskProgressInfo from the ObservableCollection, thus removing a progress bar from the UI.
Since BackgroundWorker reports progress and completion on the thread that created it (the UI thread), the changes to the ObservableCollection will be made by its creating thread (thread safe).
Behind the scenes, .NET uses the ThreadPool, so some BackgroundWorkers may share threads.
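A rough sketch of those pieces (TaskProgressInfo, the view-model class, and the method names are assumptions; the XAML binding of the ListView and its DataTemplate is omitted):

using System.Collections.ObjectModel;
using System.ComponentModel;
using System.Diagnostics;

public class TaskProgressInfo : INotifyPropertyChanged
{
    private int progress;
    public string Name { get; set; }
    public int Progress
    {
        get { return progress; }
        set
        {
            progress = value;
            var handler = PropertyChanged;
            if (handler != null) handler(this, new PropertyChangedEventArgs("Progress"));
        }
    }
    public event PropertyChangedEventHandler PropertyChanged;
}

public class CalibrationViewModel
{
    // Bind the ListView's ItemsSource to this collection; the DataTemplate
    // binds a ProgressBar's Value to the Progress property.
    public ObservableCollection<TaskProgressInfo> Tasks { get; private set; }

    public CalibrationViewModel()
    {
        Tasks = new ObservableCollection<TaskProgressInfo>();
    }

    // Call this from the UI thread for each external app to launch.
    public void LaunchMonitored(string pathToApp)
    {
        var info = new TaskProgressInfo { Name = pathToApp };
        Tasks.Add(info);

        var worker = new BackgroundWorker { WorkerReportsProgress = true };
        worker.DoWork += (s, e) =>
        {
            using (var proc = Process.Start(pathToApp))
            {
                // parse the external app's output here and call ((BackgroundWorker)s).ReportProgress(percent)
                proc.WaitForExit();
            }
        };
        worker.ProgressChanged += (s, e) => info.Progress = e.ProgressPercentage; // raised on the creating (UI) thread
        worker.RunWorkerCompleted += (s, e) => Tasks.Remove(info);                // raised on the creating (UI) thread
        worker.RunWorkerAsync();
    }
}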
Related
I have an application that runs every 15 minutes. I need to add a new task (maybe a method) which will be called by the application and should run asynchronously, so the application can complete within the 15 minutes. If the new task takes longer than the application runs, it will be aborted and can't complete its work. How can I keep a task running?
There are a few ways you can go about this.
Independent process
The first is, don't run a task, run an independent process that can do the work. The first process will end when it needs to; the worker process will end when it needs to, and there's no communication between them (unless there needs to be for a different reason).
// in main thread...
var process = new Process(); // create the process
...
process.Start();
// process.WaitForExit(); this is commented out because you want main process to end without waiting
// main then ends here
Now the above means creating two separate executables; one for the launcher and one for the worker. That means two separate projects... or not.
What you can do is have both sets of functionality in the same executable, and invoke the functionality you need by separating it with command-line arguments. In your main you could do:
static void Main(string[] args)
{
    if (args.Length == 1 && args[0] == "worker")
        DoWorkerStuff();
    else
    {
        var process = new Process(); // create the process
        ...
        // use process API to call yourself with the arg
        process.Start();
        // process.WaitForExit(); this is commented out because you want main process to end without waiting
    }
    // main then ends here for both the launcher and the worker
}
I like this approach because you have full isolation due to process boundaries, and yet you don't have to compile and maintain separate projects.
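A hedged sketch of that self-launching variant (the "worker" argument is from the description above; the use of the current executable's path and the flag values are assumptions):

using System;
using System.Diagnostics;

class Program
{
    static void Main(string[] args)
    {
        if (args.Length == 1 && args[0] == "worker")
        {
            DoWorkerStuff();
            return;
        }

        var startInfo = new ProcessStartInfo
        {
            // start this same executable again, asking it to run the worker branch
            FileName = Process.GetCurrentProcess().MainModule.FileName,
            Arguments = "worker",
            UseShellExecute = false
        };
        Process.Start(startInfo); // no WaitForExit: the launcher ends without waiting

        // the launcher's Main ends here; the worker copy keeps running
    }

    static void DoWorkerStuff()
    {
        // the long-running work goes here
    }
}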
Wait on the main thread
The second way is for the main thread to wait until the worker task is done. Trying to have tasks outlive the main thread can be problematic. Waiting is easy:
// in main thread...
var task = ... // however you create the task
...
task.Wait(); // or one of the other Wait() overloads
There are, of course, other ways to start background work (e.g. using Thread) and ways for the background work to signal the main thread, but these all require the original thread to wait until the work is done.
I recently found that an obscure bug was caused by Process.Start. The method triggered a pump of the event queue while the process was starting, which in turn made code run that was dependent on the process being started.
The bug seemed to be timing related since it did not happen consistently, and was quite difficult to reproduce. Since we have more places in the code where we use Process.Start, and we might very well use it more in the future, I thought it would be good to get to the bottom of how to trigger it more consistently.
We created a standalone WinForms project with a single button; here is the click event handler:
private void button1_Click(object sender, EventArgs e) {
    this.BeginInvoke(new MethodInvoker(() => { MessageBox.Show("Hello World"); }));
    var process = new System.Diagnostics.Process() { EnableRaisingEvents = true };
    process.Exited += CurrentEditorProcess_Exited;
    // Does pump
    process.StartInfo = new System.Diagnostics.ProcessStartInfo(@"C:\Users\micnil\Pictures\test.png");
    // Does not pump
    //process.StartInfo = new System.Diagnostics.ProcessStartInfo("Notepad.exe");
    process.Start();
    Thread.Sleep(3000);
    Console.WriteLine("Done");
}
Running this code will:
show the message box,
finish opening the test image,
and, after the message box is dismissed, sleep 3 seconds and then log "Done".
If I replace the ProcessStartInfo argument with "Notepad.exe" instead, the code will:
start Notepad,
log "Done",
and then show the message box.
This can be reproduced consistently. The problem is, the program that we are starting is more similar to Notepad: it is our own custom text editor built in WPF. So why does starting our executable sometimes trigger a pump, while I cannot make Notepad do the same?
We found a few others who have been hit by the same problem:
Which blocking operations cause an STA thread to pump COM messages?
No answer to why
Process.Start causes processing of Windows Messages:
Quoting Dave Andersson:
ShellExecuteEx may pump window messages when opening a document in its
associated application, specifically in the case where a DDE
conversation is specified in the file association.
Additionally, ShellExecuteEx may create a separate thread to do the
heavy lifting. Depending on the flags specified in the
SHELLEXECUTEINFO structure, the ShellExecuteEx call may not return
until the separate thread completes its work. In this case,
ShellExecuteEx will pump window messages to prevent windows owned by
the calling thread from appearing hung.
You can either ensure that the variables in question are initialized
prior to calling Process.Start, move the Process.Start call to a
separate thread, or call ShellExecuteEx directly with the
SEE_MASK_ASYNCOK flag set.
"May create a thread" - How do I know when?, "You can either ensure that the variables in question are initialized prior to calling Process.Start" - How?
We solved the problem by setting process.StartInfo.UseShellExecute = false before starting the process. But the question remains: does anyone know how to write a minimal program that would trigger a pump from starting an executable like Notepad?
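For completeness, a minimal sketch of that workaround (pathToEditor is a placeholder for the executable we actually start):

var process = new Process { EnableRaisingEvents = true };
process.Exited += CurrentEditorProcess_Exited;
process.StartInfo = new ProcessStartInfo(pathToEditor)   // pathToEditor is a placeholder
{
    UseShellExecute = false   // start via CreateProcess instead of ShellExecuteEx, so Start does not pump
};
process.Start();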
Update:
I've done some more investigation and have two new findings:
private void button1_Click(object sender, EventArgs e) {
    this.BeginInvoke(new MethodInvoker(() => {
        Log("BeginInvoke");
    }));
    string path = Environment.GetEnvironmentVariable("PROCESS_START_EXE");
    var process = new System.Diagnostics.Process() { EnableRaisingEvents = true };
    process.Exited += CurrentEditorProcess_Exited;
    process.StartInfo = new System.Diagnostics.ProcessStartInfo(path, "params");
    process.Start();
    Thread.Sleep(3000);
    Log("Finished Sleeping");
}
1:
When setting the environment variable PROCESS_START_EXE to "Notepad" the logging becomes:
BeginInvoke
Finished Sleeping
When setting the environment variable PROCESS_START_EXE to "Notepad.exe" (note the ".exe") the logging becomes:
Finished Sleeping
BeginInvoke
This I find strange, but I don't think it is related to the problem we were having. We always specify the exact path and filename of the executable, including ".exe".
2:
The scenario in which I found the bug was that I launched the application through a Windows shortcut with a target similar to this:
C:\Windows\System32\cmd.exe /c "SET PROCESS_START_EXE=<path to custom editor> && START /D ^"<path to application directory>^" App.exe"
It first sets an environment variable with the path to the executable that is supposed to be started, and then launches the winforms application.
If I use this shortcut to launch the application, and use the environment variable to start the process, then the logging always becomes:
BeginInvoke
Finished Sleeping
This does seem like the problem we were having, only I couldn't reproduce it consistently every time. Why does it matter that I am setting the environment variable right before launching the application?
Note that if I use the same shortcut, but do not use the environment variable to start the process, the message loop will not pump.
I have this code here that starts a process, waits 8 seconds, kills it, and then repeats.
for (int i = 0; i < 20; i++)
{
    Process pro = new Process();
    pro.StartInfo.FileName = @"C:\Program Files (x86)\Mozilla Firefox\firefox.exe";
    pro.StartInfo.WindowStyle = ProcessWindowStyle.Minimized;
    pro.Start();
    Thread.Sleep(8000);
    try
    {
        pro.Kill();
        Thread.Sleep(1000);
    }
    catch
    {
        return;
    }
}
As I run the application, either in debug mode or directly from the .exe, it successfully starts and kills the process, but it is frozen. I can't move its window around or click on other buttons.
it is frozen. I can't move its window around or click on other buttons.
That's correct. You said to put the thread to sleep.
People seem to have this strange idea that running code when buttons are pressed happens by magic. It does not happen by magic. It happens because the thread runs code that processes the "a button was clicked" message from the operating system. If you put a thread to sleep then it stops processing those messages, because it is asleep.
Putting a thread to sleep is 99% of the time the completely wrong thing to do, so just don't do it.
The right thing to do in C# 5 is to make your method async and then do an await Task.Delay(whatever). Alternatively, create a timer that ticks after some number of seconds. In the tick handling event, turn the timer off and do your logic there.
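A sketch of that async approach applied to the code above (the delay values and the Firefox path come from the question; this assumes the project targets C# 5 / .NET 4.5 so the handler can be marked async):

private async void button1_Click(object sender, EventArgs e)
{
    for (int i = 0; i < 20; i++)
    {
        var pro = new Process();
        pro.StartInfo.FileName = @"C:\Program Files (x86)\Mozilla Firefox\firefox.exe";
        pro.StartInfo.WindowStyle = ProcessWindowStyle.Minimized;
        pro.Start();

        await Task.Delay(8000);   // yields to the message loop instead of blocking it

        try
        {
            pro.Kill();
            await Task.Delay(1000);
        }
        catch
        {
            return;
        }
    }
}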
Well, my initial guess is that you are doing this all on your UI Thread. Since you make the UI thread sleep, your application will be frozen.
The obvious solution would be doing this in a new thread.
As Servy says, this is not a great idea. You can use a Timer (https://msdn.microsoft.com/en-us/library/system.timers.timer%28v=vs.110%29.aspx) to do the waiting instead of blocking the UI thread.
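A rough sketch using that timer class (the field names and run counting are assumptions; System.Timers.Timer raises Elapsed on a thread-pool thread, which is fine here because nothing touches the UI):

// fields assumed on the form
private System.Timers.Timer killTimer;
private Process pro;
private int remainingRuns = 20;

private void StartNextRun()
{
    if (remainingRuns-- <= 0) return;

    pro = new Process();
    pro.StartInfo.FileName = @"C:\Program Files (x86)\Mozilla Firefox\firefox.exe";
    pro.StartInfo.WindowStyle = ProcessWindowStyle.Minimized;
    pro.Start();

    killTimer = new System.Timers.Timer(8000) { AutoReset = false };
    killTimer.Elapsed += (s, e) =>
    {
        try { pro.Kill(); } catch { return; }
        StartNextRun();   // kick off the next run; the UI thread is never blocked
    };
    killTimer.Start();
}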
The main thread of your application is a loop that constantly peeks messages from the Windows message queue. It's like cooperative multi-threading: if you don't give the UI a chance to process those messages, it won't update automatically. And here it can't, because your program (the main thread) just spawns a process, then sleeps, and then kills it (you don't even need that try/catch block; an unhandled exception on any thread of your application will terminate it). Sleeping within a thread blocks it, and because you sleep on the main (UI) thread, you block the application from peeking messages from the queue.
I have simplified my code for the sake of this question, but basically what happens is this:
private void RunScript(string path)
{
    Process.Start(path);
    lblStatus.Text = "Step 6 Complete";
}
However, I don't want the label's text to be updated until the script finishes running. Is that possible? Or rather, practical / feasible?
Just replace your code with this if you want to wait at most a given amount of time:
Process.Start(path).WaitForExit(milliseconds);
or with this if you want to wait possibly forever (usually, until it finishes):
Process.Start(path).WaitForExit();
You can use Process.Start(path).WaitForExit(); that will ... wait for the process to exit before continuing.
Be careful if this is on your main UI thread, though, as the UI will freeze and become unresponsive whilst it is working. Kick the process off on a BackgroundWorker or similar and update / kick off step 7 when you get a successful completion (or have to handle an error).
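A minimal sketch of that BackgroundWorker approach (lblStatus and path come from the question; everything else is assumed):

private void RunScript(string path)
{
    var worker = new BackgroundWorker();
    worker.DoWork += (s, e) =>
    {
        // runs on a thread-pool thread, so the UI stays responsive
        Process.Start(path).WaitForExit();
    };
    worker.RunWorkerCompleted += (s, e) =>
    {
        // raised back on the UI thread
        lblStatus.Text = e.Error == null ? "Step 6 Complete" : "Step 6 Failed: " + e.Error.Message;
    };
    worker.RunWorkerAsync();
}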
The instance method Process.Start returns true when the process has started; wrap it in a do/while loop and that should be it:
var process = new Process { StartInfo = new ProcessStartInfo(path) };
bool started = process.Start();
http://msdn.microsoft.com/en-us/library/vstudio/e8zac0ca
So in the snippet below I very simply look in a specific folder and copy the images from the source to the destination.
The copy is VERY fast and it works great for the first bunch of folders (maybe 20 or so) which takes a few seconds. But then the progress bar stops moving and I get a spinning mouse cursor. I can look in the destination folder and it is still processing the folders.
When it's done I get the "Process Complete" dialog box, the progress bar is at 100%, and everything ran fine.
Just want to make sure the end user doesn't think it's frozen.
private void readInvoices()
{
    string InvoiceFile = txtInvoiceFile.Text;
    //read in the text file and get all the invoices to copy
    string[] Invoices = File.ReadAllLines(InvoiceFile);
    //set the max val of the progress bar
    progBar.Maximum = Invoices.Length;
    try
    {
        //for every invoice
        foreach (string invoice in Invoices)
        {
            //Set the source and destination directories
            string sourceInvFolder = string.Format(@"{0}\{1}", txtSource.Text, invoice);
            string destInvFolder = string.Format(@"{0}\{1}", txtDest.Text, invoice);
            DirectoryInfo SourceDI = new DirectoryInfo(sourceInvFolder);
            DirectoryInfo DestDI = new DirectoryInfo(destInvFolder);
            //we know we have it in the CSV but does the directory actually exist?
            //if so then let's process
            if (Directory.Exists(SourceDI.FullName) == true)
            {
                //let's copy the files
                CopyAll(SourceDI, DestDI);
                RenameFolder(sourceInvFolder);
            }
            //inc the progress bar
            progBar.Increment(1);
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show("Error: " + ex.Message);
    }
    finally
    {
        MessageBox.Show("Process Complete");
        CleanUp();
    }
}
The UI freezes because it's running in a single thread. A workaround to fix the freezing part is putting this line of code inside your loop.
Application.DoEvents();
This code checks if there are messages waiting to be processed and, if there are, processes them before proceeding to the next loop iteration. You can use a ProgressBar control to let the user see how much has been processed already. If you don't want to stay with the single-threaded method, use a BackgroundWorker to prevent the form from appearing frozen. This is multithreading, which means a separate thread does the processing while you do something else.
One thing to remember though: using the code above makes the whole loop go slower, since it has to do that check on every iteration, which means more work; in return, you get real-time progress reporting. The reason it looks frozen is that the loop hasn't finished yet; you'll have to let it finish before you can do anything else, because it's all running on a single thread.
If you just want to make sure the end user doesn't think it's frozen, you should use multithreading. The most suitable class for this task is BackgroundWorker.
From MSDN:
The BackgroundWorker class allows you to run an operation on a separate, dedicated thread. Time-consuming operations like downloads and database transactions can cause your user interface (UI) to seem as though it has stopped responding while they are running. When you want a responsive UI and you are faced with long delays associated with such operations, the BackgroundWorker class provides a convenient solution.
Try to follow the example provided by MSDN and put the call to your readInvoices() method in the DoWork event handler.
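A rough sketch of that wiring (it assumes readInvoices is refactored to take the worker and report progress instead of touching progBar directly; the other names are from the question):

private BackgroundWorker copyWorker;

private void StartCopy()
{
    copyWorker = new BackgroundWorker { WorkerReportsProgress = true };
    copyWorker.DoWork += (s, e) => readInvoices(copyWorker);                      // runs off the UI thread
    copyWorker.ProgressChanged += (s, e) => progBar.Value = e.ProgressPercentage; // raised on the UI thread
    copyWorker.RunWorkerCompleted += (s, e) =>
    {
        MessageBox.Show(e.Error == null ? "Process Complete" : "Error: " + e.Error.Message);
        CleanUp();
    };
    copyWorker.RunWorkerAsync();
}

// readInvoices would loop over the invoices as before, but instead of calling progBar.Increment(1)
// it would call worker.ReportProgress(percentComplete) after each folder.

The textbox values should be read on the UI thread and passed in (for example via RunWorkerAsync's argument) rather than accessed from inside DoWork.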
Your code is running on the UI thread (at least I assume so as you have a MessageBox in your catch block).
So, it will not necessarily process UI updates.
Have a look into doing the work using the TPL.
http://msdn.microsoft.com/en-us/library/dd460717(v=vs.110).aspx
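For the TPL route, a hedged sketch (it assumes the textbox values are captured up front, a CopyInvoice helper does the CopyAll/RenameFolder work, and the progress-bar updates are marshalled back through the UI's TaskScheduler):

private void StartCopy()
{
    // capture UI values on the UI thread before starting the task
    string invoiceFile = txtInvoiceFile.Text;
    string source = txtSource.Text;
    string dest = txtDest.Text;
    var ui = TaskScheduler.FromCurrentSynchronizationContext();

    Task.Factory.StartNew(() =>
    {
        string[] invoices = File.ReadAllLines(invoiceFile);
        // progBar.Maximum would also need to be set on the UI thread once invoices.Length is known
        foreach (string invoice in invoices)
        {
            CopyInvoice(source, dest, invoice);   // assumed helper doing the CopyAll/RenameFolder work
            Task.Factory.StartNew(() => progBar.Increment(1),
                CancellationToken.None, TaskCreationOptions.None, ui);   // marshal the UI update
        }
    }).ContinueWith(t =>
    {
        MessageBox.Show(t.IsFaulted ? "Error: " + t.Exception.InnerException.Message : "Process Complete");
        CleanUp();
    }, ui);
}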