I have a file conversion app whose main job is to read from an input file, process the data, and save the result to an output file.
This process is already organized as an asynchronous task working on two streams, in order to show a live progress bar while the conversion runs.
The issue I have now is that when the app is suspended, the conversion is suspended too, which is a problem since the files involved get quite big and users want to do something else in the meantime.
What can I do to keep this task alive?
Use the Extended Execution API to postpone app suspension:
https://learn.microsoft.com/en-us/windows/uwp/launch-resume/run-minimized-with-extended-execution
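A minimal sketch of how that could look, based on the linked docs; this is a method you would drop into your app code, where ConvertFileAsync stands in for your existing conversion task and the description text is just an example:

using System;
using System.Threading.Tasks;
using Windows.ApplicationModel.ExtendedExecution;

private async Task RunConversionWithExtensionAsync()
{
    using (var session = new ExtendedExecutionSession())
    {
        session.Reason = ExtendedExecutionReason.Unspecified;
        session.Description = "File conversion in progress";
        session.Revoked += (s, e) =>
        {
            // The extension was revoked: persist progress here so the conversion can resume later.
        };

        ExtendedExecutionResult result = await session.RequestExtensionAsync();
        if (result != ExtendedExecutionResult.Allowed)
        {
            // Extension denied: the conversion may still be suspended when the app is minimized.
        }

        await ConvertFileAsync(); // your existing conversion task
    }
}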
I have a web application that returns images to my frontend.
When a request is made for a particular image, the application checks whether the image already exists on disk; if it does, the image is returned.
My problem starts when the image does not exist on disk. In that case, two requests can arrive at the same time for the same missing image, and the problem occurs when the two threads try to create the same file on disk at the same time.
To solve this, I tried creating a Mutex around the creation of the image on disk. But that had a problem: since the server load is enormous due to the large number of simultaneous requests, the server crashes.
I would like to ask for your ideas on how to solve this problem, or what you would do instead.
Thank you.
You could try the following pattern:
Try to read the image (if it succeeds, then you're done).
Try to create the image with a write lock.
Only on a "file in use" exception, wait a small delay (milliseconds).
Go back to step 1 (retry).
Make the delay really small, just a tiny bit larger than the time it should take to create an image.
Implement a retry limit, say a maximum of 3 attempts.
This lets you make use of the already existing (file) locking mechanism.
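A rough C# sketch of that pattern, assuming the image is generated in memory and written in one go (GetOrCreateImage, GenerateImage, and the delay/retry values are placeholders):

using System;
using System.IO;
using System.Threading;

public static class ImageStore
{
    public static byte[] GetOrCreateImage(string path)
    {
        const int maxRetries = 3;    // retry limit
        const int retryDelayMs = 50; // small delay; tune to how long image creation takes

        for (int attempt = 0; attempt < maxRetries; attempt++)
        {
            // Step 1: try to read the image; if that succeeds, we are done.
            try
            {
                if (File.Exists(path))
                    return File.ReadAllBytes(path);
            }
            catch (IOException)
            {
                // The file exists but is still being written by another request: retry below.
            }

            // Step 2: try to create the image with an exclusive write lock.
            try
            {
                using (var stream = new FileStream(path, FileMode.CreateNew,
                                                   FileAccess.Write, FileShare.None))
                {
                    byte[] data = GenerateImage(); // placeholder for your image generation
                    stream.Write(data, 0, data.Length);
                    return data;
                }
            }
            catch (IOException)
            {
                // Another request created or is still creating the file: retry below.
            }

            // Step 3: small delay, then go back to step 1.
            Thread.Sleep(retryDelayMs);
        }

        throw new IOException("Could not read or create image: " + path);
    }

    private static byte[] GenerateImage()
    {
        // Stub: replace with your actual image generation.
        return new byte[0];
    }
}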
You can call the open function with the O_CREAT and O_EXCL flags. The first process's open call will create the file and get exclusive access, and that process can start creating the image. The open calls of subsequent processes will fail because the file already exists, with errno set to EEXIST.
Based on your design, the subsequent processes can either wait for the file to be fully created or return immediately.
fd = open(path, O_CREAT | O_EXCL | O_WRONLY, 0644);
I'm writing a GUI in C#. When the user clicks a button, the GUI calls kitty.exe (which is just like PuTTY.exe, with extra features) to run some commands on Linux. During this time, the log of the KiTTY SSH command shell is written to a log file (using KiTTY's built-in feature). When it is finished, I check the log file for the word 'Succeed' or 'Fail'.
What I do nowadays is wait (using a sleep in C#) for 2 minutes and then check the log file. This works, but I want it to be more efficient, because the command sometimes takes 90 seconds and sometimes 60 seconds, and I don't want to keep waiting long after it has finished. Furthermore, it's not a very elegant solution :o).
My question is how (and whether) I can check the log from the SSH window in real time and search for the keywords (Succeed or Fail), rather than just waiting for it to finish and then checking.
Using '>' in cmd.exe to call kitty.exe and redirect its log to a file does not work, by the way.
Thank you!
If you invoke your process using System.Diagnostics.Process.Start(), you can call the WaitForExit method after starting it. That blocks further execution on the calling thread until the process completes, at which point you can check the log. This may be preferable to waiting an arbitrary amount of time.
If you want to do it in real time rather than waiting until completion, you can get a stream reader via System.Diagnostics.Process.StandardOutput and use it to look for the text of interest.
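A rough sketch of the real-time approach (the kitty.exe path and arguments are placeholders; note this only helps if the program actually writes to its standard output):

using System;
using System.Diagnostics;

class KittyRunner
{
    static void RunAndWatch()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "kitty.exe",   // placeholder path to KiTTY
            Arguments = "...",        // your existing command-line arguments
            UseShellExecute = false,  // required to redirect standard output
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };

        using (Process process = Process.Start(startInfo))
        {
            string line;
            // Read the output line by line as it is produced, instead of sleeping.
            while ((line = process.StandardOutput.ReadLine()) != null)
            {
                if (line.Contains("Succeed") || line.Contains("Fail"))
                {
                    Console.WriteLine("Result: " + line);
                    break;
                }
            }
            process.WaitForExit();
        }
    }
}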
I'm writing a Windows Phone application that needs to download very large mp3 files and save them to isolated storage. I've got all the code for this working and tested it with smaller files, but now, using the actual files and monitoring what the code is doing via the debug output, I've realized that the threads are exiting halfway through downloads, and the files never actually finish downloading.
Is there a reason for this happening, and if so, what can I do to prevent this?
After how long does it time out? If you are using HttpWebRequest to download the file, the default timeout is 100,000 ms (100 seconds). This can be changed as simply as inserting:
request.Timeout = 600000; // 'request' is your HttpWebRequest instance; value in milliseconds (10 minutes here)
Obviously, set your own timeout (in milliseconds!) and attach it to your WebRequest :)
If you're not using HttpWebRequest, let me know what you are using and I'll try my best to help you out :)
WP's internal memory and process management takes care of this. If you spawn a thread from your app that downloads a lot of data in the background, the OS will drop it when those resources (most likely memory) become needed for other processes.
You can do two things, depending on your approach to the download:
Periodically save buffer chunks to IsolatedStorage once the buffer reaches a certain size, thus limiting the thread's memory usage (see the sketch after this list).
Implement the download thread as a BackgroundTask, which should allow "endless" execution.
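A sketch of the first option, assuming the download goes through an HttpWebResponse (the method name and the 64 KB chunk size are arbitrary):

using System.IO;
using System.IO.IsolatedStorage;
using System.Net;

static class DownloadHelper
{
    public static void SaveResponseToIsolatedStorage(HttpWebResponse response, string fileName)
    {
        using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
        using (Stream source = response.GetResponseStream())
        using (IsolatedStorageFileStream target = store.CreateFile(fileName))
        {
            byte[] buffer = new byte[64 * 1024]; // 64 KB chunks, so the whole mp3 never sits in memory
            int bytesRead;
            while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                target.Write(buffer, 0, bytesRead); // write each chunk to storage as it arrives
            }
        }
    }
}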
I have not used multithreading much in ASP.NET. I have a web application that uploads a large temp file to a folder. I would like to take that temp file and do some other things with it after it is uploaded. Can I do this work on another thread without the user still being on the website? Thanks for any help or suggestions.
1. User posts a large file.
2. The temp file is uploaded to the server.
3. After the upload completes, I would like to run another thread/worker that runs without any user interaction but is triggered by the user.
using System.Threading;

void Uploading()
{
    // Upload the file to the server (existing logic)
}

void SubmitClick()
{
    // Start a background thread; it keeps running after the response has been sent.
    Thread thread = new Thread(DoThreadWork); // does the user still need to be logged in?
    thread.Start();
    Response.Redirect("~/Done.aspx"); // send the user to another page (placeholder URL)
}

void DoThreadWork() { /* Do the post-upload work in the background */ }
It's definitely possible; I've used background threads quite a bit in ASP.NET to do some funky stuff. If you have complete control over the server, it might be more elegant to run the background code in a separate application or a Windows service.
It's a better separation of concerns to have your IIS app deal only with responding to web requests; it isn't really geared up for background work.
Also a warning: if a background thread in ASP.NET 2.0 throws an unhandled exception, the default behavior is to recycle the application pool.
More information here: http://blogs.msdn.com/b/tess/archive/2006/04/27/584927.aspx
// 3 downvotes?
Listen, it's not always possible to avoid running something in a background thread. I've hit this in several situations:
I've worked at a company with an unreasonable attitude to software, where we were not allowed to deploy a separate app to handle the background processing. I argued for a Windows service but was overruled and told to implement it in a background thread. Obviously I moved on from that company to a healthier environment, but the reality is that you sometimes have to deal with unreasonable situations.
If you're in a hosted environment, you don't always have the option to offload the work onto a separate process.
The question was whether it is possible. I'm answering that question.
If you want to separate the processing of the file from the website's user interaction, you can make a Windows service that continuously checks whether a new file is ready and then processes it.
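A minimal sketch of that idea using a FileSystemWatcher inside the service (the folder, filter, and class names are placeholders; you could just as well poll on a timer):

using System.IO;
using System.ServiceProcess;

public class UploadProcessingService : ServiceBase
{
    private FileSystemWatcher watcher;

    protected override void OnStart(string[] args)
    {
        // Watch the folder the web app uploads temp files into (placeholder path and filter).
        watcher = new FileSystemWatcher(@"C:\Uploads", "*.tmp");
        watcher.Created += (sender, e) => ProcessFile(e.FullPath);
        watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        watcher.Dispose();
    }

    private void ProcessFile(string path)
    {
        // Do the post-upload processing here, outside the web app's worker process.
    }
}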
You can use the thread pool for that. The sample code is based on the article at the following link: http://msdn.microsoft.com/en-us/library/3dasc8as(v=vs.80).aspx
First, write the method that does the work. The method must take a single argument of type object:
public void DoWork(object threadContext)
{
}
Then, at the place in your code where you want to call the method, do:
...
var threadParam = ... // Put any value you want to be used by the DoWork method
ThreadPool.QueueUserWorkItem(DoWork, threadParam);
The method will be queued until the system has a free thread to handle the work, and it will execute regardless of whether the request has ended or not.
I am using a WebClient to download a media file from my web server and save it to isolated storage.
Clicking a button starts the download-and-save-to-isolated-storage process, but if you click the button again while the file is downloading, it tries to start a second concurrent download and fails because WebClient does not allow concurrent I/O operations.
I want to write a conditional check for whether a download is already in progress, but I'm not sure how I would do this.
Any help would be greatly appreciated.
Can't you just use a boolean to track whether you have already started the download? Either way, it sounds like it would be better to disable the button in the UI after you start a download and enable it again once the download finishes or fails.
Your UI should be consistent with what users are actually able to do at a given time; letting them try something and then having it fail makes for a frustrating user experience.
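A sketch of the boolean-flag-plus-disabled-button approach (MainPage, DownloadButton, and the URL are placeholders for your page, button, and media file):

using System;
using System.Net;
using System.Windows;

public partial class MainPage
{
    private bool isDownloading;
    private readonly WebClient client = new WebClient();

    public MainPage()
    {
        InitializeComponent();
        client.OpenReadCompleted += (sender, e) =>
        {
            // ... save e.Result to isolated storage here ...
            isDownloading = false;
            DownloadButton.IsEnabled = true;  // let the user start another download
        };
    }

    private void DownloadButton_Click(object sender, RoutedEventArgs e)
    {
        if (isDownloading)
            return;                           // a download is already in progress: ignore the click

        isDownloading = true;
        DownloadButton.IsEnabled = false;     // keep the UI consistent with the state
        client.OpenReadAsync(new Uri("http://example.com/file.mp3")); // placeholder URL
    }
}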