In C#,
I have a device that accepts HTTP requests as remote commands.
I've created a button that sends those requests; it's not perfect, but it works.
However, when the device is disconnected and the response is
destination unreachable
the application freezes until I restart it.
I need a way around this, maybe some timeout that will close the stream after 1 second.
private void httpBTN_Click(object sender, EventArgs e)
{
String URI = "http://192.168.1.118/cgi-bin/aw_cam?cmd=DCB:0&res=0";
WebClient webClient = new WebClient();
Stream stream = webClient.OpenRead(URI);
stream.Close();
}
Long-running operations in GUIs are a problem: as long as the operation has not finished (with a result or a timeout), the event handler does not return. And as long as an event handler does not return, no other code can run, not even the code that updates the GUI or tells Windows "yes, I got the message about a user input". You will need some form of multitasking.
Luckily, networking is one of the most common cases of long-running operations, so it has built-in ways of multitasking (the async functions). However, that might not be the best place to start. For a multitasking beginner I would advise using the BackgroundWorker in Windows Forms, at least until you have a handle on Invoke and race conditions. async/await is the better pattern long-term, but it has a much steeper learning curve.
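For the original question, the quickest fix for the one-second timeout is to switch from WebClient (which exposes no public timeout) to HttpWebRequest, which does. A minimal sketch, still best run from a BackgroundWorker so the UI never blocks; the class and method names here are made up for illustration:

```csharp
using System;
using System.Net;

class CameraCommand
{
    // Sends the command and reports success/failure instead of hanging.
    public static bool TrySend(string uri)
    {
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.Timeout = 1000;           // give up connecting after 1 second
        request.ReadWriteTimeout = 1000;  // also cap reading the response
        try
        {
            using (var response = request.GetResponse())
            {
                return true;              // the device accepted the command
            }
        }
        catch (WebException)
        {
            return false;                 // unreachable, timed out, DNS failure...
        }
    }
}
```

The button handler can then call TrySend from a BackgroundWorker's DoWork event and, at worst, learn after one second that the device is gone instead of freezing the form.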
Related
I'm wondering what the best approach might be for what I'm trying to do. My application has a button that starts a background download operation, and then performs some actions dependent on the downloaded files.
It's like a "The game will begin shortly once necessary data is downloaded" situation, where the user may still use the main form.
private void btnStart_Click(object sender, EventArgs e)
{
//execute some code
Downloader.RunWorkerAsync(files); //this worker reports progress to the main form
while (Downloader.IsBusy)
Application.DoEvents();
//execute some more code
}
I'm aware that doing it that way is not good at all.
I cannot execute the download code synchronously, because the main form needs to remain responsive during the download operation.
I also cannot put the final code into the download completed event, as it is used by many other areas of the program and must remain a "generic" download system.
So, is there a way to do what I want? I do not have any experience with other async methods.
If you use BackgroundWorker you must configure it properly. BW has a RunWorkerCompleted event to which you must subscribe to handle completion of your async work.
I think you should use the asynchronous programming features of the .NET 4.5 framework (async and await).
Refer to async programming.
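To make that concrete, here is a sketch of how the handler could look with async/await, assuming hypothetical url/path arrays in place of the asker's files collection. The key point is that the code after the await is exactly the asker's "execute some more code", so the generic download-completed event can stay untouched:

```csharp
using System;
using System.Net;
using System.Windows.Forms;

public class StartForm : Form
{
    // Hypothetical stand-ins for the file list: each entry is a
    // (download URL, local path) pair.
    private readonly string[] urls  = { };
    private readonly string[] paths = { };

    // async void is the accepted shape for event handlers; the await hands
    // control back to the message loop, so the form stays responsive.
    private async void btnStart_Click(object sender, EventArgs e)
    {
        // execute some code
        using (var client = new WebClient())
        {
            for (int i = 0; i < urls.Length; i++)
                await client.DownloadFileTaskAsync(urls[i], paths[i]);
        }
        // execute some more code - resumes here, on the UI thread,
        // only after every download has finished
    }
}
```

Because the continuation lives at the call site rather than in the completed event, each caller of the download system can attach its own follow-up code.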
First time poster here!
I'm a senior computer science student, and I'm currently developing a GUI that plays a board game (Othello) online using telnet.
The pseudocode is something like this...
click button
update GUI
receive telnet input
update GUI
rinse and repeat!
The problem is, the only way I know how to get the telnet function to run is by putting it inside the Click event handler, but the GUI won't update until the whole function is finished. That means it updates every two moves instead of one. Is there a way to tell C# (which I'm new to) to call a new function immediately after one has finished, specifically on a GUI?
Any input is appreciated.
Thanks
I'm not sure I understood correctly the problem, but the "receive telnet input" line makes me worry a lot.
Are you writing this application in a single thread without using any kind of asynchronous TCP/IP communication?
If the answer is yes, the error is in the architecture you are using.
You need asynchronous TCP/IP communication: for example, another thread running in parallel, asynchronous sockets, or asynchronous streams.
You cannot stop the GUI to wait for the network; that would be a bad architecture.
Try to read this simple but complete article on codeproject: http://www.codeproject.com/KB/IP/socketsincs.aspx
The Windows OS uses a mechanism called the "message pump" to handle windows. Everything is a message that is processed by a single thread (your application's UI thread).
Events are enqueued in the message queue.
If you stop the execution of the main thread for too long, you stop the message queue from being processed, and this will block user input and also painting, since rendering is also a Windows message that can be enqueued.
You'll need to use threads. That way, while one thread is still processing, you can fire off a new thread. I think that's the only way you'll be able to finish processing one task while starting up another at the same time.
Once the task is done processing, you can join the thread back to the main thread.
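Applied to the Othello question above, a minimal sketch might look like this. ReadTelnetMove and ApplyMove are hypothetical placeholders for the real game code: the blocking telnet read happens on a worker thread, and each GUI update is marshalled back with Invoke, so the board repaints after every move instead of every two.

```csharp
using System;
using System.Threading;
using System.Windows.Forms;

public class OthelloForm : Form
{
    private void moveButton_Click(object sender, EventArgs e)
    {
        ApplyMove("d3");                        // 1. update the GUI immediately
        new Thread(() =>
        {
            string reply = ReadTelnetMove();    // 2. blocks this worker thread,
                                                //    not the UI thread
            // 3. marshal the result back to the UI thread to update the board
            Invoke((MethodInvoker)(() => ApplyMove(reply)));
        }) { IsBackground = true }.Start();
    }

    private string ReadTelnetMove() { return "e4"; }   // placeholder for the real read
    private void ApplyMove(string move) { /* redraw the board */ }
}
```

Only the thread that created a control may touch it, which is why the worker hands the reply back through Invoke instead of calling ApplyMove directly.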
I have an HttpListener that, for each request received, launches a WinForm and returns an array of bytes representing an image. This image is taken by the WinForm once it executes some commands parsed from the request's query string. Each WinForm executes inside a different thread.
Right now, after the WinForm has done its job, it comes up with an array of bytes as the response and then its thread dies; if the same user makes a new request, a new thread (with a new WinForm) is created.
I'm wondering if I can keep each thread alive, using an ID for each user (IP address, a GUID, a cookie), so I don't have to recreate the WinForm every time, and the WinForm will also keep its previous state.
Is this possible? Or do I have to move to another direction or design?
You could change the design to a workers/tasks approach, defining the worker pool (thread + Form) and the task pool separately. That done, you can decide whether to create a new thread/form pair to handle a task or reuse an existing one, and you can choose the number of workers to run, limiting the risk of over-consuming the server's resources. But using WinForms on the server side is definitely not the best way, as shf301 pointed out.
I am writing a windows form application in .net using C#.
I am running into a problem that if my program is running when the computer goes into the sleep and/or hibernate state (I am not sure at this time which one, or if both, cause the problem), when the machine wakes up again the program just hangs. The only way to exit out of it is to kill the process from the task manager.
This is, for obvious reasons, not the way I want the program to function. Even if I just shut the program down when it goes into these states, that would be fine, but I am not quite sure how to do this or if there is a more graceful way altogether of handling this.
You need:
using Microsoft.Win32;
And this is the code (remember to subscribe to the event first, e.g. in the form's constructor):
SystemEvents.PowerModeChanged += SystemEvents_PowerModeChanged;

void SystemEvents_PowerModeChanged(object sender, PowerModeChangedEventArgs e)
{
    if(e.Mode == PowerModes.Suspend)
    {
        this.GracefullyHandleSleep();
    }
}
This is what I went with.
Handling those events may be a workaround. But before applying this kind of workaround, I'd try to figure out what the application was doing when the OS went into hibernation.
Does the hang occur even if the application was just idle?
Is the application doing some kind of low-level work (communication with device drivers or external hardware) that should not be interrupted?
Does the application use some kind of network connection?
This article covers listening for those events. You'll have to do something like override WndProc and listen for PBT_APMSUSPEND events.
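A sketch of that WndProc route, using the documented message values (WM_POWERBROADCAST = 0x0218, PBT_APMSUSPEND = 0x0004); the cleanup body is a placeholder:

```csharp
using System.Windows.Forms;

public class PowerAwareForm : Form
{
    private const int WM_POWERBROADCAST = 0x0218;  // power-management message
    private const int PBT_APMSUSPEND    = 0x0004;  // system is about to sleep

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_POWERBROADCAST && m.WParam.ToInt32() == PBT_APMSUSPEND)
        {
            // Close connections / save state before the OS suspends,
            // so the app has nothing left dangling when it wakes up.
        }
        base.WndProc(ref m);   // always pass messages on to the default handler
    }
}
```

This is the lower-level equivalent of SystemEvents.PowerModeChanged; either hook works, the SystemEvents version is just less code.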
I'm working on a web application import program. Currently an admin user can upload a formatted CSV file that my page will parse. I'm running into an execution-duration issue, as each line pertains to a file that has to be saved to Scribd and S3, along with some internal processing.
What would you guys recommend for improving execution time? Since this is an admin only page, I doubt it would get run more than once a week, so my hope is to get it out the door asap.
I've looked some at the Async="true" flag, but I wasn't sure if that was the direction I wanted to go, or if I should look more at a Windows service.
Two options come to mind:
Threads: in your code, set up a collection of threads, have each one process a single file, and then join them. Once all the threads complete, you'll be able to return the page. This will cut your turnaround time, but could still leave something to be desired on page returns.
Queue: have the user submit the CSV file and provide a GUID/hash/whatever ID, where the admin could then go to a "status" page, input their ID, and check the details of their job. This solution provides quick feedback to the user and allows them to keep track of the results without having to wait around.
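A minimal sketch of the first option, with ProcessFile standing in for the real Scribd/S3 upload and internal processing:

```csharp
using System.Collections.Generic;
using System.Threading;

class CsvImport
{
    public static int Processed;   // visible to the status page, for example

    static void ProcessFile(object path)
    {
        // upload to Scribd, copy to S3, do the internal processing...
        Interlocked.Increment(ref Processed);
    }

    public static void ProcessAll(IEnumerable<string> paths)
    {
        var threads = new List<Thread>();
        foreach (var p in paths)
        {
            var t = new Thread(ProcessFile);   // one thread per file
            t.Start(p);
            threads.Add(t);
        }
        foreach (var t in threads)
            t.Join();   // block here until every file is done
    }
}
```

The page handler would call ProcessAll and return once all joins complete; the Queue option instead returns immediately and lets the status page read Processed later.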
A quick and dirty option might be to set Page.Server.ScriptTimeout to a really high value on that page (I think it maxes out at Int32.MaxValue).
It's probably advisable to block the submit button after it's been clicked, and inform the user that they may want to go make a coffee.
I'd suggest using AJAX to have an internal post back occur that would handle the asynchronous processing. You can periodically poll the state, and prevent your master page from having the "processing" wheel constantly churning on the page for the lengthy process.
I have a web page that takes a long time to process a mailing list, so I kick it off in its own thread. When the process is done, a report can be seen from another link on the result page. I have a runnable MailSender class. The ASPX script has a bit in it that looks like this:
// prep the MailSender
MailSender ms = new MailSender(people, Subject, FileName....);
if (SendAsync) {
ThreadStart ts = new ThreadStart(ms.run);
Thread WorkerThread = new Thread(ts);
WorkerThread.Start();
} else {
ms.run();
}
If you want to speed your code up, try to break it into parallelizable pieces if you can, and write a class for each piece. You could then kick off a new thread for each bit and monitor the status somewhere so the user can be informed when to come back for the results. You said that each line of your input generates its own output file; that sounds like a great candidate for multi-threading. It won't speed things up much if you don't have multiple cores available on the server, though.
One problem with this whole scheme is that server restarts or application pool recycling will kill your long-running process. This can be a problem if your threads are going to run for an hour or two.
As external factors are involved in the processing time, you need to consider whether performance improvements on your end would affect actual performance: if most of the time is spent processing each item and sending it to a third party (i.e. Scribd, S3), then improvements on your end might not have a huge effect, and might just add complexity to a simple task.
What I would do is have the ASPX page only do what ASPX does best, i.e. handle the user-interface part only (the upload), so once the upload is complete, as far as the user is concerned their part is done. You could implement a progress indicator using AJAX to make it nicer, but as it's an admin section I wouldn't bother with the niceties.
Then have a simple console application scheduled to fire at specific intervals, or a Windows service watching a directory (depending on how time-critical the updates are). Since the app runs in the background and does not require user interaction, time is not a critical factor (i.e. you don't have a user waiting for a response).
It will appear to the user that things are very snappy (i.e. the time it takes to upload the file), and you keep needless complexity out of your solution.
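A minimal sketch of the directory-watching variant, with the folder path and the ProcessCsv body as placeholders:

```csharp
using System.IO;
using System.Threading;

class ImportWatcher
{
    public static int Handled;   // e.g. exposed to a status page

    // Watch a drop folder and process each new CSV as it arrives.
    public static FileSystemWatcher Start(string folder)
    {
        var watcher = new FileSystemWatcher(folder, "*.csv");
        watcher.Created += (s, e) => ProcessCsv(e.FullPath);
        watcher.EnableRaisingEvents = true;
        return watcher;          // keep a reference so it isn't collected
    }

    static void ProcessCsv(string path)
    {
        // parse the lines, push each file to Scribd/S3, record the results
        Interlocked.Increment(ref Handled);
    }
}
```

The web page's only job is then to drop the uploaded CSV into the watched folder; the service picks it up on its own time.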
I think the simplest solution to what you want is to use asynchronous pages in ASP.NET. Is there any particular reason why you don't want to go that route?
I can think of an alternative, which is to have some background process (like a process triggered by a scheduled task in Windows, or a Windows service) that will look at a queue of waiting jobs (say, from a database table) and process those jobs. This way you will have to upload that CSV somewhere and insert a db record so that the background process will see that CSV and use it when it comes around. But to me it seems like more work, so I'd rather use asynchronous pages :)
Here's a nice tutorial on ASP.NET asynchronous pages