I'm building a tool that consists of a web service that will run command-line tools at specific times. On average, 15-20 command-line (CL) tools will be running at the same time. Each CL tool runs for roughly 0.5-1 minute. The web service needs to be able to check each CL tool's status every 2-3 seconds or so.
I've gotten some advice about how to do this; named pipes look like the best "technical" solution. However, I'm wondering whether communicating through very small text "status" files would be a simpler, less error-prone way. I prefer whichever solution is the most resource-friendly.
Please advise.
Named pipes would be better. You might run into sharing violations when the two processes try to access the small text file at once.
I did something similar before and here's what I did.
I start the command line process by setting ProcessStartInfo.RedirectStandardOutput to true. Then I start listening to the command line process' output by doing:
process.OutputDataReceived += dataReceivedEventHandler;
process.BeginOutputReadLine();
Then in my dataReceivedEventHandler, I parse DataReceivedEventArgs.Data to see if my console application sent me a "message" that begins with a tag that I created (ex: "[ServerProgress]10").
So basically, the applications are communicating through the console output. Not elegant, but it did the job for me.
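Here's a rough sketch of the whole pattern (the tool name "MyTool.exe" and the exact handling of the [ServerProgress] tag are just my example):

using System;
using System.Diagnostics;

class ConsoleTagDemo
{
    static void Main()
    {
        // "MyTool.exe" is a stand-in for the real command-line process.
        var psi = new ProcessStartInfo("MyTool.exe")
        {
            UseShellExecute = false,        // must be false to redirect output
            RedirectStandardOutput = true,
        };

        using (var process = Process.Start(psi))
        {
            process.OutputDataReceived += (sender, e) =>
            {
                // e.Data is null once the output stream closes.
                if (e.Data != null && e.Data.StartsWith("[ServerProgress]"))
                {
                    string progress = e.Data.Substring("[ServerProgress]".Length);
                    Console.WriteLine("Progress: " + progress + "%");
                }
            };
            process.BeginOutputReadLine();  // begin async, line-by-line delivery
            process.WaitForExit();
        }
    }
}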
Hope this helps.
For your situation - status checks, not a huge footprint - I'd go with named pipes any day. Using memory beats using the HDD here.
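If it helps, a bare-bones sketch of the pipe approach using System.IO.Pipes (the pipe name and status text are placeholders I made up): the CL tool answers each status request on a named pipe, and the web service connects every few seconds to read one line.

using System.IO;
using System.IO.Pipes;

class PipeStatus
{
    // In the CL tool (run on a background thread): answer each poll with one status line.
    public static void ServeStatus()
    {
        while (true)
        {
            using (var server = new NamedPipeServerStream("tool42-status", PipeDirection.Out))
            {
                server.WaitForConnection();           // blocks until the web service polls
                using (var writer = new StreamWriter(server))
                    writer.WriteLine("RUNNING 37%");  // placeholder status line
            }
        }
    }

    // In the web service: called every 2-3 seconds per tool.
    public static string ReadStatus()
    {
        using (var client = new NamedPipeClientStream(".", "tool42-status", PipeDirection.In))
        {
            client.Connect(1000);                     // ms; fail fast if the tool is gone
            using (var reader = new StreamReader(client))
                return reader.ReadLine();
        }
    }
}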
What I'm Trying to Achieve
I'm attempting to build a console game that has multiple console windows displaying inventory, status effects, the current map, and health. Another console would be the main one that gathers input to affect the other consoles. The reason I want to do it this way is so that the other consoles can update their "graphics" (or text) without disturbing the input flow.
What I've Tried So Far
So far, I've attempted to use System.IO's File, FileStream, StreamWriter, and StreamReader to communicate between the consoles via text files. The problem I've run into is that when the main (input) console attempts to write input to a file shared with another console (the "graphics" console), it throws an error because the "graphics" console is trying to read the file at the same time (or vice versa).
I figured that opening the FileStream with FileAccess.Read would do the trick, but I ran into the same issue.
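Simplified, the two opens look something like this (a reconstruction, not my exact code):

using System.IO;

class Repro
{
    static void Main()
    {
        // Input console: opens the file for writing (default share mode is FileShare.Read).
        var writer = new FileStream("state.txt", FileMode.Create, FileAccess.Write);

        // Graphics console: opens the same file for reading. This throws
        // IOException ("The process cannot access the file...") because this
        // open's default FileShare.Read doesn't tolerate the existing write handle.
        var reader = new FileStream("state.txt", FileMode.Open, FileAccess.Read);
    }
}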
I think I could get this to work if I could communicate between the consoles to tell each other when one is done writing to or reading from the file... kind of like a back and forth... "I'm writing to the file... okay, I'm done" "I'm reading the file... okay, I'm done" and the cycle continues...
So, in summary, I suppose, my question is how can I communicate between two consoles using files?
Possible Solutions
I could try learning SQL, but I don't know if I'd end up running into the same issue... so, if I must learn SQL for this project, I suppose, that'd be my last option.
Thank you!
IPC (inter-process communication) is the keyword you're looking for.
There are multiple ways to do IPC, e.g. shared memory, named pipes, or similar. .NET Remoting, for example, has an IpcChannel that uses a named pipe when the destination is on the local PC (and a TcpChannel for remote machines).
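To make the shared-memory option concrete, here's a rough sketch using MemoryMappedFile (the map name and layout are my own invention, not from the question):

using System;
using System.IO.MemoryMappedFiles;
using System.Text;

class SharedStatus
{
    // Keep the map open for the life of the process; a named map
    // disappears once every handle to it is closed.
    static readonly MemoryMappedFile Map =
        MemoryMappedFile.CreateOrOpen("console-game-status", 1024);

    // Writer console: publish the latest status string.
    public static void Write(string status)
    {
        using (var accessor = Map.CreateViewAccessor())
        {
            byte[] bytes = Encoding.UTF8.GetBytes(status);
            accessor.Write(0, bytes.Length);                 // length prefix at offset 0
            accessor.WriteArray(4, bytes, 0, bytes.Length);  // payload after the prefix
        }
    }

    // Reader console: pick up whatever was last written.
    public static string Read()
    {
        using (var accessor = Map.CreateViewAccessor())
        {
            int length = accessor.ReadInt32(0);
            byte[] bytes = new byte[length];
            accessor.ReadArray(4, bytes, 0, length);
            return Encoding.UTF8.GetString(bytes);
        }
    }
}

In practice you'd guard the reads and writes with a named Mutex so one console never sees a half-written update.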
I am looking to transfer a binary file via RS232. I need to do this to 5000 different devices (no joke). I could do them one-by-one through a terminal program but that will take a lot of time.
So, I am writing a C# program that will be able to automate the process. I am looking at using the XMODEM protocol and command-line parameters to start the process. I have been looking for this for a few hours now and so far my results have turned up little. I tried using uCON but that requires some sort of scripting language.
I was wondering if anyone in the community here might know of a solution to transfer a file over RS232 that I can drive from C#. Whether it's a protocol (XMODEM), a program that accepts command-line parameters, or some other custom solution doesn't really matter to me.
OK, so I was able to confirm that the code found on the website (ghielectronics.com/community/codeshare/entry/825) works. The issue was that I did not know how long it would take the computer to transfer the file. I thought that it would be quick. However, after further testing, it actually takes about 30 seconds to 1 minute depending on the file size.
This C# code snippet allows anyone to transfer a binary file over the serial port using the XMODEM protocol. This is done in code, and a terminal is not needed, which fits the requirements of my project.
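For anyone else doing this, the basic shape of a checksum-mode XMODEM sender over System.IO.Ports.SerialPort looks roughly like this (a rough sketch, not the code from that link; the baud rate and port settings are placeholders and depend on your devices):

using System;
using System.IO;
using System.IO.Ports;

class XmodemSender
{
    const byte SOH = 0x01, EOT = 0x04, ACK = 0x06, NAK = 0x15;

    static void Send(string portName, string filePath)
    {
        using (var port = new SerialPort(portName, 115200, Parity.None, 8, StopBits.One))
        {
            port.ReadTimeout = 10000;       // XMODEM receivers poll slowly; be patient
            port.Open();

            // Wait for the receiver to request a transfer with NAK.
            while (port.ReadByte() != NAK) { }

            byte[] data = File.ReadAllBytes(filePath);
            byte blockNumber = 1;
            for (int offset = 0; offset < data.Length; offset += 128)
            {
                // Build one 132-byte frame: SOH, block#, ~block#, 128 data bytes, checksum.
                byte[] frame = new byte[132];
                frame[0] = SOH;
                frame[1] = blockNumber;
                frame[2] = (byte)(255 - blockNumber);
                byte checksum = 0;
                for (int i = 0; i < 128; i++)
                {
                    byte b = offset + i < data.Length ? data[offset + i] : (byte)0x1A; // pad with SUB
                    frame[3 + i] = b;
                    checksum += b;
                }
                frame[131] = checksum;

                do { port.Write(frame, 0, frame.Length); }
                while (port.ReadByte() != ACK);   // resend until the receiver ACKs

                blockNumber++;                    // wraps past 255 by design
            }

            // End of transfer: send EOT until acknowledged.
            do { port.Write(new[] { EOT }, 0, 1); }
            while (port.ReadByte() != ACK);
        }
    }
}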
Is there a way to monitor the state of a console application?
I am trying to build a user interface which shows whether or not a console application is currently running on the server (processing files). If it is running, I would like to show its current state: how many files have been processed, which file is currently being processed, etc.
The only way that I can think of doing this:
Create a text/XML file when the application starts
Update the text file with information about the current state of each object it processes
Delete the text file when the application finishes processing
To me, this doesn't seem like a very good or efficient way to do it. Is there a way to detect whether the ClickOnce application is running, and perhaps some other way to access its "Messages" or log to show the progress?
Note - I am also looking into using NodeJS to do this, but unsure if it has this capability.
First, you should consider writing this as a Windows service instead of a console application.
That said, scraping a log file that your application is writing is a reasonable approach. Just ensure that it never gets too big.
Alternatively, you could look at using custom performance counters. That would open the door to using System Monitor/perfmon as your monitoring tool, so no need to write any client code.
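For illustration, the custom-counter route might look something like this (the category and counter names are made up):

using System.Diagnostics;

class ProgressCounters
{
    const string Category = "File Processor";   // assumed category name

    // Run once (needs admin rights) to register the category.
    public static void EnsureCategory()
    {
        if (PerformanceCounterCategory.Exists(Category)) return;
        var counters = new CounterCreationDataCollection
        {
            new CounterCreationData("Files Processed", "Total files processed",
                PerformanceCounterType.NumberOfItems64),
        };
        PerformanceCounterCategory.Create(Category, "File processing progress",
            PerformanceCounterCategoryType.SingleInstance, counters);
    }

    // In the console app: bump the counter as each file completes.
    static readonly PerformanceCounter FilesProcessed =
        new PerformanceCounter(Category, "Files Processed", readOnly: false);

    public static void OnFileProcessed() => FilesProcessed.Increment();
}

Once the category exists, you can watch the counter live in perfmon without writing any client UI.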
There are at least two ways to achieve that:
Your console application writes logs or state files during its run, so other processes can read those files and see what is going on in that console process (a sketch of this follows the list).
Implement an IPC mechanism. There are different ways to do that; it may help to look at What is the easiest way to do inter process communication in C#?.
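For option 1, a minimal sketch (the path and fields are placeholders): the trick is to write the status file atomically, writing to a temp file and then swapping it in, so the reader never sees a half-written file.

using System.IO;

class StatusFile
{
    // Writer: publish the current state atomically.
    public static void WriteStatus(string path, int processed, string currentFile)
    {
        string temp = path + ".tmp";
        File.WriteAllText(temp, $"processed={processed}\ncurrent={currentFile}\n");
        // Atomic swap; .NET Core 3.0+. On .NET Framework, use File.Replace instead.
        File.Move(temp, path, overwrite: true);
    }

    // Reader (the UI): poll the file at its leisure.
    public static string ReadStatus(string path) =>
        File.Exists(path) ? File.ReadAllText(path) : "not running";
}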
What do you think is the best approach to solve this issue? I have a C# application that receives an XML list of files. Based on this XML list, many files (~10 MB each) get copied from one folder on our local SAN to another folder on the same SAN. But all of those copies pass through the C# Windows app that is doing the copying, which takes 20 minutes per job. Any good ideas on how to dispatch a set of copy instructions to the SAN from C# without having the local box involved in the transfer?
Does the Mono codebase offer a way to SSH into a box and execute commands? This is my only idea so far to greatly reduce the execution time these jobs take.
UPDATE
The SAN is the Clariion NS 480
http://www.tech.proact.co.uk/emc/emc_celerra_ns-480_nas.htm
I assume it runs some flavor of Linux or Unix inside.
Trying to find some better technical specs.
You could run a process on the SAN that executes a specific script every x minutes. Create that script from your C# application.
Usually I would use SSH from the C# application in such a situation to issue the needed copy commands, BUT from the link you posted, none of the protocols listed supports such a mechanism... neither SSH nor TELNET (which I would NOT recommend for security reasons) is listed as a supported protocol... without such a protocol, your only option is to have your local machine involved in the copy process... I highly recommend checking back with the vendor about SSH support...
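If the vendor does add SSH support, issuing the copy commands without moving any data through your box could look roughly like this with the third-party SSH.NET library (Renci.SshNet on NuGet; the host, credentials and commands below are placeholders):

using Renci.SshNet;

class SanCopyDispatcher
{
    static void DispatchCopies(string[] copyCommands)
    {
        // Placeholder host and credentials.
        using (var client = new SshClient("san-host", "admin", "password"))
        {
            client.Connect();
            foreach (string command in copyCommands)
            {
                // e.g. "cp /fs1/source/file1.bin /fs1/dest/file1.bin"
                var result = client.RunCommand(command);
                if (result.ExitStatus != 0)
                    throw new System.Exception($"Copy failed: {result.Error}");
            }
            client.Disconnect();
        }
    }
}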
I have an ASP.NET page that gets a list of game server IP addresses (quickly) and loops through them, running a command-line tool against each to get special game server information. I have to use the command-line tool because I don't know how it gets the information from the machines and I don't want to reinvent the wheel. The looping is the slow part (surprise, surprise). Each run of the command-line tool takes up to a second, so with approximately 60 IP addresses polled on average, the page load can take 30-60 seconds to render the results I need.
My obvious thought was "multithread that thing!" Well, I tried that with thread pools but ended up with a hanging website if more than one person accessed the page at a time. And that was with only 4-5 concurrent calls working through the 60, which brought it down to about a 10-second load time. So not only did it hang with multiple users, it was still too slow. I'd be happy if I could get it under 3 seconds.
I should mention this page is in a shared hosting environment. I had a great solution before, outside of the shared hosting environment, but I had to cut costs and I'm trying to make it work with shared hosting now.
Is there any hope?
You shouldn't really be polling these servers "on demand." It would be better to use ASP.NET to show the list of server information, and some other process - like a windows service, or scheduled task - to poll the servers every couple of minutes to generate that list. To summarize: The service would create an XML file (for example) and ASP.NET would display it to users. This way, the amount of users viewing the page does not affect the amount of times you try to poll the servers.
Update:
You need to ensure the process that pings servers is a singleton. Specifically, a singleton is a class of which only a single instance can exist. In more general terms for your case, you need a global flag that says "I'm currently pinging servers" and another global datetime value that says "the last time I pinged the servers was at hh:mm:ss" - you could use the Application dictionary to store the boolean flag and the datetime. Each time someone loads your page, check the flag to see if it's already pinging the servers. If it is, don't do it. If the flag says OK, then check the current time against the last time you did it. If it's less than 5 minutes, don't do it. All of this should be done in a background thread. This thread should update an XML file in App_Data. All requests to your pages should render this data immediately. A page request should never block. If the file is not there on the first call, then return "ping in progress, try again in 5 minutes." Follow?
Read about the ASP.NET Application state dictionary here:
http://msdn.microsoft.com/en-us/library/ie/ms178594.aspx
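A rough sketch of that gatekeeping logic in Web Forms (the keys, timings and file names are placeholders):

using System;
using System.Threading;
using System.Web;

public class ServerList : System.Web.UI.Page
{
    static readonly TimeSpan MinInterval = TimeSpan.FromMinutes(5);

    protected void Page_Load(object sender, EventArgs e)
    {
        HttpApplicationState app = Application;
        app.Lock();   // serialize access to the shared flag and timestamp
        try
        {
            bool pinging = (bool?)app["IsPinging"] ?? false;
            DateTime last = (DateTime?)app["LastPing"] ?? DateTime.MinValue;

            if (!pinging && DateTime.UtcNow - last > MinInterval)
            {
                app["IsPinging"] = true;
                ThreadPool.QueueUserWorkItem(_ => RefreshServerData(app));
            }
        }
        finally { app.UnLock(); }

        // Always render whatever App_Data/servers.xml holds right now; never block here.
    }

    static void RefreshServerData(HttpApplicationState app)
    {
        try
        {
            // ... run the command-line tool per IP and rewrite App_Data/servers.xml ...
        }
        finally
        {
            app.Lock();
            app["IsPinging"] = false;
            app["LastPing"] = DateTime.UtcNow;
            app.UnLock();
        }
    }
}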
A low-tech solution might be to call a .bat file that makes each of the .exe calls, instead of shelling out to the .exe repeatedly from ASP.NET. That saves the overhead of repeatedly shelling out to the OS.
Each call to the .exe can pipe its results to a text file, which can then be read back all at once when control returns to the ASP.NET app from the batch file.
If the list of IPs changes, the ASP.NET application could regenerate the .bat file before running it.
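A rough sketch of that approach (the tool name "querytool.exe" and the file names are placeholders):

using System.Diagnostics;
using System.IO;

class BatchPoller
{
    static string PollAll(string[] ips)
    {
        // Regenerate the batch file from the current IP list.
        using (var bat = new StreamWriter("poll.bat"))
        {
            bat.WriteLine("@echo off");
            foreach (string ip in ips)
                bat.WriteLine($"querytool.exe {ip} >> results.txt");  // hypothetical tool
        }

        File.Delete("results.txt");   // start fresh, since >> appends

        // One shell for the whole run instead of 60 separate ones.
        var psi = new ProcessStartInfo("cmd.exe", "/c poll.bat")
        {
            UseShellExecute = false,
            CreateNoWindow = true,
        };
        using (var process = Process.Start(psi))
            process.WaitForExit();

        return File.ReadAllText("results.txt");   // read everything back at once
    }
}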