In my application, there is a list of images through which the user can step. Image loading is slow, so to improve user experience I would like to preload some images in the background (e.g. those images in the list succeeding the currently selected one).
I've never really used threads in C#, so I am looking for some kind of "best practice" advice on how to implement the following behaviour:
public Image LoadCachedImage(string path)
{
    // check if the cache (being operated in the background)
    // has preloaded the image
    Image result = TryGetFromCache(path);
    if (result == null) { result = LoadSynchronously(path); }

    // somehow get a list of images that should be preloaded,
    // e.g. the successors in the list
    string[] candidates = GetCandidates(path);

    // trigger loading of "candidates" in the background, so they will
    // be in the cache when queried later
    EnqueueForPreloading(candidates);

    return result;
}
I believe a background thread should be monitoring the queue and consecutively processing the elements that are posted through EnqueueForPreloading(). I would like to know how to implement this "main loop" of the background worker thread (or maybe there is a better way to do this?)
If you really need sequential processing of the candidates, you can do one of the following:
Create a message queue data structure that has an AutoResetEvent. The class should spawn a thread that waits on the event and then processes everything in the queue. The class's Add or Enqueue method should add the item to the queue and then set the event. This releases the thread, which processes the items in the queue.
Create a class that starts an STA thread, creates a System.Windows.Forms.Control, and then enters Application.Run(). Every time you want to process an image asynchronously, call Control.BeginInvoke(...) and the STA thread will pick it up in its message queue (a rough sketch of this option follows below).
There are probably other alternatives, but these two would be what I would try.
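Here is a minimal sketch of the second option. The class name StaWorker, the Start/Enqueue methods, and LoadIntoCache are illustrative names, not an established API; error handling and shutdown are omitted:

class StaWorker
{
    private System.Windows.Forms.Control _invoker;
    private readonly System.Threading.ManualResetEvent _ready = new System.Threading.ManualResetEvent(false);

    public void Start()
    {
        var thread = new System.Threading.Thread(() =>
        {
            // BeginInvoke needs a control with a window handle to target.
            _invoker = new System.Windows.Forms.Control();
            _invoker.CreateControl();
            var forceHandle = _invoker.Handle; // make sure the handle exists even though the control is invisible
            _ready.Set();
            System.Windows.Forms.Application.Run(); // pump the message loop forever
        });
        thread.SetApartmentState(System.Threading.ApartmentState.STA);
        thread.IsBackground = true;
        thread.Start();
        _ready.WaitOne(); // wait until the control exists before accepting work
    }

    public void Enqueue(string path)
    {
        // Marshals the work onto the STA thread's message queue;
        // items are processed one at a time, in order.
        _invoker.BeginInvoke(new Action(() => LoadIntoCache(path))); // LoadIntoCache is your own preloading method
    }
}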
If you don't actually need sequential processing, consider using ThreadPool.QueueUserWorkItem(...). If there are free pool threads, it will use them, otherwise it will queue up the items. But you won't be guaranteed order of processing, and several may/will get processed concurrently.
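If order doesn't matter, the thread-pool route is roughly a one-liner per item. In this sketch, PreloadImage stands in for whatever fills your cache, and candidates is the array from the question:

foreach (string candidate in candidates)
    System.Threading.ThreadPool.QueueUserWorkItem(state => PreloadImage((string)state), candidate);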
Here's a (flawed) example of a message queue:
class MyBackgroundQueue<T>
{
    private Queue<T> _queue = new Queue<T>();
    private System.Threading.AutoResetEvent _event = new System.Threading.AutoResetEvent(false);
    private System.Threading.Thread _thread;

    public void Start()
    {
        _thread = new System.Threading.Thread(new System.Threading.ThreadStart(ProcessQueueWorker));
        _thread.Start();
    }

    public class ItemEventArgs : EventArgs
    { public T Item { get; set; } }

    public event EventHandler<ItemEventArgs> ProcessItem;

    private void ProcessQueueWorker()
    {
        while (true)
        {
            _event.WaitOne();
            while (_queue.Count > 0)
                ProcessItem(this, new ItemEventArgs { Item = _queue.Dequeue() });
        }
    }

    public void Enqueue(T item)
    {
        _queue.Enqueue(item);
        _event.Set();
    }
}
One flaw here, of course, is that _queue is not locked, so you'll run into race conditions. But I'll leave it to you to fix that (e.g. use the two-queue swap method sketched below). Also, the while (true) never breaks, but I hope the sample serves your purpose.
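For completeness, here is a rough sketch of that two-queue swap applied to the class above; it replaces Enqueue and ProcessQueueWorker and adds a _syncRoot field (this is just one way to do it):

    private readonly object _syncRoot = new object();

    public void Enqueue(T item)
    {
        lock (_syncRoot) { _queue.Enqueue(item); }
        _event.Set();
    }

    private void ProcessQueueWorker()
    {
        while (true)
        {
            _event.WaitOne();
            Queue<T> batch;
            lock (_syncRoot)
            {
                // Swap the shared queue for an empty one, then process the
                // batch without holding the lock.
                batch = _queue;
                _queue = new Queue<T>();
            }
            while (batch.Count > 0)
                ProcessItem(this, new ItemEventArgs { Item = batch.Dequeue() });
        }
    }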
This is what I call cheat caching. The operating system already caches files for you, but you have to access them first. So what you can do is just load the files but don't save a reference to them.
You can do this without multi-threading per se, and without holding the images in a list. Just create a method delegate and invoke it for each file you want to load in the background.
For example, pre-loading all the jpeg images in a directory.
Action<string> d = (string file) => { System.Drawing.Image.FromFile(file); };
foreach (FileInfo file in dir.GetFiles("*.jpg"))
    d.BeginInvoke(file.FullName, null, null);
BeginInvoke() makes this a multi-threaded approach: that loop will go very fast, but each file will be loaded on a different thread pool thread. Or you could change that up a little and put the loop inside the delegate, e.g.:
public void PreCache(List<string> files)
{
    foreach (string file in files)
        System.Drawing.Image.FromFile(file);
}
Then in your code
Action<List<string>> d = PreCache;
d.BeginInvoke(theList, null, null);
Then all the loading is done on just one worker thread.
I'm implementing some kind of buffering mechanism:
private static readonly ConcurrentQueue<ProductDto> ProductBuffer = new ConcurrentQueue<ProductDto>();
private async void On_ProductReceived(object sender, ProductReceivedArgs e)
{
    ProductBuffer.Enqueue(e.Product);
    if (ProductBuffer.Count >= handlerConfig.ProductBufferSize)
    {
        var products = ProductBuffer.ToList();
        ProductBuffer.Clear();
        await SaveProducts(products);
    }
}
And the question is: should I bother to add some kind of lock to ensure no data is lost (e.g. some other thread adds a product after buffer.ToList() and before buffer.Clear(), hypothetically), or will ConcurrentQueue handle all the dirty work for me?
You can do it like this:
if (ProductBuffer.Count < handlerConfig.ProductBufferSize)
    return;

var productsToSave = new List<ProductDto>();
ProductDto dequeued;
while (ProductBuffer.TryDequeue(out dequeued))
{
    productsToSave.Add(dequeued);
}
SaveProducts(productsToSave);
You never Clear the queue. You just keep taking things out until it's empty. Or you could stop taking things out when productsToSave reaches a certain size, process that list, and then start a new one if you don't want to save too many products at once.
This way it doesn't matter if new items are added to the queue. If they're added while you're reading from the queue, they get read too. If they're added just after you stop reading from the queue, they'll be there and get read the next time the queue gets full and you process it.
The point of a ConcurrentQueue is that you can add to it and read from it from multiple threads, with no need for a lock.
If you were to do this:
productsToSave = ProductBuffer.ToList();
ProductBuffer.Clear();
then you would need the lock (which would defeat the purpose.) Presumably you're using a ConcurrentQueue because multiple threads may be adding items to the queue. If that's the case then it is entirely possible that something could go into the queue in between the execution of those two statements. It wouldn't get added to the list, but it would be deleted by Clear. That item would be lost.
This is how I would implement it; I am assuming you do not need to be notified when the save is finished?
private void On_ProductReceived(object sender, ProductReceivedArgs e)
{
    // Variable to hold potential list of products to save
    List<ProductDto> productsToSave;

    // Lock buffer
    lock (ProductBuffer)
    {
        ProductBuffer.Enqueue(e.Product);

        // If it is under size, return immediately
        if (ProductBuffer.Count < handlerConfig.ProductBufferSize)
            return;

        // Otherwise copy the products, clear the buffer, release the lock.
        productsToSave = ProductBuffer.ToList();
        ProductBuffer.Clear();
    }

    // Save products
    SaveProducts(productsToSave);
}
What if you get one product and then don't receive anything else; won't you want to save it after some timeout?
I would use something like Rx for your use case, especially IObservable<T>.Buffer(count)
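A rough sketch of that Rx idea, assuming the System.Reactive package is referenced and the incoming products are exposed as an IObservable<ProductDto> named productStream (that name, and the 5-second timeout, are illustrative):

using System;
using System.Linq;
using System.Reactive.Linq;

IDisposable subscription = productStream
    // Emit a batch when it reaches the configured size OR after 5 seconds,
    // whichever comes first, so a lone product is not stuck forever.
    .Buffer(TimeSpan.FromSeconds(5), handlerConfig.ProductBufferSize)
    .Where(batch => batch.Count > 0)                    // the time-based buffer can emit empty batches
    .Subscribe(batch => SaveProducts(batch.ToList()));  // fire-and-forget; the returned Task is not awaited here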
So first, here is all of my example code:
class Program
{
    static List<string> queue = new List<string>();
    static System.Threading.Thread queueWorkerThread;

    static void Main(string[] args)
    {
        // Randomly call 'AddItemToQueue' at certain circumstances and user inputs (unpredictable)
    }

    static void AddItemToQueue(string item)
    {
        queue.Add(item);

        // Check if the queue worker thread is active
        if (queueWorkerThread == null || queueWorkerThread.IsAlive == false)
        {
            queueWorkerThread = new System.Threading.Thread(QueueWorker);
            queueWorkerThread.Start();
            Console.WriteLine("Added item to queue and started queue worker!");
        }
        else
        {
            Console.WriteLine("Added item to queue and queue worker is already active!");
        }
    }

    static void QueueWorker()
    {
        do
        {
            string currentItem = queue[0];

            // ...
            // Do things with 'currentItem'
            // ...

            // Remove item from queue and process next one
            queue.RemoveAt(0);
        } while (queue.Count > 0);

        // Reference Point (in my question) <----
    }
}
What I am trying to create in my code is a QueueWorker() method which is always active while there is something in the queue.
Items can be added to the queue via an AddItemToQueue() method, as you can see in the code example.
It basically adds the item to the queue and then checks whether the queue worker is active (i.e. there were other items in the queue previously) or not (i.e. the queue was completely empty previously).
And what I am not fully sure about is this: let's say the queue worker thread is currently at the reference point marked in the code above (it has just left the while loop), and of course the thread's IsAlive property is still set to true at this point.
So what if the AddItemToQueue()-method checked the thread's IsAlive-property at the exact same time?
That would mean the thread would end shortly after, and the new item would just be left in the queue; nothing would happen, because the AddItemToQueue() method didn't realize that the thread was just about to end.
How do I deal with this? (I want to make sure everything works 100%)
(If there are any questions about my question or something is not clear, feel free to ask!)
I'm very new to threading, so I'm not sure if I'm doing this right, but would appreciate some assistance. I have the following code to run when the user clicks the mouse; it basically runs some path-finding code and moves the player.
However, my problem is when I click the mouse again while the thread is running, it causes issues. Is there a way to stop the previous thread and start a new one when this code is reached a second time?
private void checkMouse()
{
    mouseCommand mc = new mouseCommand();
    Thread oThread = new Thread(() => mc.leftClick(Mouse.GetState().X, Mouse.GetState().Y));
    oThread.Start();
}
Perhaps something like this would work for you?
private object lock_object = new object();
private Thread oThread; // Thread has no parameterless constructor, so start with null

private void checkMouse()
{
    lock (lock_object)
    {
        if (oThread == null || oThread.ThreadState != ThreadState.Running)
        {
            mouseCommand mc = new mouseCommand();
            oThread = new Thread(() => mc.leftClick(Mouse.GetState().X, Mouse.GetState().Y));
            oThread.Start();
        }
    }
}
There's a few ways you can do this.
The simplest, and the first you should learn about when learning about threading, is a lock. Have an object that you lock on around this action and any related actions that would also cause problems if they happened together:
private object lockObj = new object();

private void DoLClick()
{
    lock (lockObj)
    {
        mouseCommand mc = new mouseCommand();
        mc.leftClick(Mouse.GetState().X, Mouse.GetState().Y);
    }
}

private void checkMouse()
{
    Thread oThread = new Thread(DoLClick);
    oThread.Start();
}
The benefit is that this keeps threads from stepping on each other's toes.
The downside is the loss of concurrency (all these threads are waiting on each other, instead of doing something) and the risk of deadlock (if thread A has lock 1 and needs lock 2 and thread B has lock 2 and needs lock 1, you're stuck).
It remains the simplest approach. Often even if you're going to need to use another approach, it's well worth starting with some widely defined locks, and then changing to narrower locks (that is, locks that cover less code) or different approaches later.
Another possibility is to have a lock, but instead of using lock(){} to obtain it, you use Monitor.TryEnter() with a time-out (perhaps of zero) and just give up if you don't get it:
private object lockObj = new object();

private void DoLClick()
{
    if (!Monitor.TryEnter(lockObj, 0))
        return; // Just do nothing if we're busy.
    try
    {
        mouseCommand mc = new mouseCommand();
        mc.leftClick(Mouse.GetState().X, Mouse.GetState().Y);
    }
    finally
    {
        Monitor.Exit(lockObj);
    }
}

private void checkMouse()
{
    Thread oThread = new Thread(DoLClick);
    oThread.Start();
}
The downside is that you don't get that second task done. The upside is that you often don't want something done if it's already being done, and you get that for free.
Some other approaches are variants of this, where you have a thread-safe object describing the tasks to do; it could be an integer count of actions that need doing, which you change with Interlocked.Increment() and Interlocked.Decrement(), or a ConcurrentQueue of objects that describe the tasks. Then you could have the thread that failed to get the lock add to that, and the one that did get the lock take over that work when it's finished. Or you could have a dedicated thread that just keeps looking for work to do and waits on an AutoResetEvent whenever it runs out of work; threads giving it work (adding to the queue) set that event to make sure it's not just sitting doing nothing. A sketch of that last idea follows below.
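A rough, illustrative sketch only: the field and method names are made up, and queuing Action delegates is an assumption, not something from the question. EnqueueClick would be called from checkMouse() instead of starting a new thread each time, and ClickWorker runs once on a single long-lived background thread.

private readonly System.Collections.Concurrent.ConcurrentQueue<Action> _pendingClicks =
    new System.Collections.Concurrent.ConcurrentQueue<Action>();
private readonly System.Threading.AutoResetEvent _workAvailable =
    new System.Threading.AutoResetEvent(false);

private void EnqueueClick(int x, int y)
{
    _pendingClicks.Enqueue(() => new mouseCommand().leftClick(x, y));
    _workAvailable.Set();   // wake the worker if it is waiting
}

private void ClickWorker()
{
    while (true)
    {
        _workAvailable.WaitOne();   // sleep until there is work
        Action work;
        while (_pendingClicks.TryDequeue(out work))
            work();
    }
}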
All these possibilities (and more) are worth learning about, and have their place, but the first suggestion with lock is the first to learn.
I am working on an application that searches for files in the provided directory using a BackgroundWorker... the problem is with backgroundWorker1.RunWorkerAsync();
Following is my code; I am trying to give multiple paths for searching for the file I type in the textbox:
private void toolStripTextBox1_KeyDown(object sender, KeyEventArgs e)
{
    if (e.KeyValue == 13)
    {
        foreach (string s in listBox1.Items)
        {
            DirectoryInfo deer = new DirectoryInfo(s);
            toolStripButton2.Visible = true;
            //listView1.Items.Clear();
            if (!backgroundWorker1.IsBusy)
            {
                backgroundWorker1.RunWorkerAsync(deer);
            }
            else
                MessageBox.Show("Can't run the worker twice!");
            // backgroundWorker1.RunWorkerAsync(deer);
        }
    }
    listView1.AutoResizeColumns(ColumnHeaderAutoResizeStyle.HeaderSize);
    listView1.AutoResizeColumns(ColumnHeaderAutoResizeStyle.ColumnContent);
}
and I get the following error:
This BackgroundWorker is currently busy and cannot run multiple tasks concurrently.
Please help me out.
Not sure what you're trying to achieve here.
1) If you wish to run multiple tasks concurrently on different threads (i.e. to process each one of the items in the listBox1.Items), you will have to create separate threads or tasks to do so, and not use the same background worker.
2) If you simply wish to handle the overall processing of these items in the background without affecting (blocking) the UI you will need to use one background worker and pass it the entire collection.
In any case, the current code should not throw the error unless you comment out the MessageBox and uncomment the other backgroundWorker1.RunWorkerAsync(deer);. If you do that, then you're basically trying to start the same worker before it has finished its previous work. If you don't do that, you're basically skipping items in the list, leaving them unprocessed until the worker becomes available again.
A general example of the 1st should look like this:
foreach (string s in listBox1.Items)
{
    DirectoryInfo deer = new DirectoryInfo(s);
    toolStripButton2.Visible = true;
    Task.Run(() => TheDoWorkMethodYouUsed(deer));
}
A general example of the 2nd would be to modify your DoWork method to run over the entire collection, and pass that collection:
if (e.KeyValue == 13)
{
    backgroundWorker1.RunWorkerAsync(listBox1.Items);
}
And in the DoWork method:
foreach (string s in passedList)
{
    DirectoryInfo deer = new DirectoryInfo(s);
    // continue with normal processing of the method
}
Use multiple threads: create multiple threads and wait for all of them to complete (a rough sketch is below), or see http://msdn.microsoft.com/en-us/library/ff649143.aspx
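A hedged sketch of the first suggestion, assuming System.Linq, System.Threading.Tasks, and System.IO are imported; SearchDirectory stands in for whatever your DoWork handler does. Note that Task.WaitAll blocks the calling thread, so on a UI thread you would prefer await Task.WhenAll instead:

var searches = listBox1.Items
    .Cast<string>()
    .Select(s => Task.Run(() => SearchDirectory(new DirectoryInfo(s))))
    .ToArray();

Task.WaitAll(searches);   // blocks the caller until every search has finished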
I am trying to populate a text box with some data, namely the names of several instruments a line at a time.
I have a class that will generate and return a list of instruments, I then iterate through the list and append a new line to the text box after each iteration.
Starting the Thread:
private void buttonListInstruments_Click(object sender, EventArgs e)
{
    if (ins == null)
    {
        ins = new Thread(GetListOfInstruments);
        ins.Start();
    }
    else if (ins != null)
    {
        textBoxLog.AppendText("Instruments still updating..");
    }
}
Delegate to update textbox:
public delegate void UpdateLogWithInstrumentsCallback(List<Instrument> instruments);

private void UpdateInstruments(List<Instrument> instruments)
{
    textBoxLog.AppendText("Listing available Instruments...\n");
    foreach (var value in instruments)
    {
        textBoxLog.AppendText(value.ToString() + "\n");
    }
    textBoxLog.AppendText("End of list. \n");
    ins = null;
}
Invoking the control:
private void GetListOfInstruments()
{
    textBoxLog.Invoke(new UpdateLogWithInstrumentsCallback(this.UpdateInstruments),
        new object[] { midiInstance.GetInstruments() });
}
Note: GetInstruments() returns a List of type Instrument.
I am implementing threads to try to keep the GUI functional whilst the text box updates.
For some reason the other UI controls on the WinForm, such as a separate combo box, remain inactive when pressed until the text box has finished updating.
Am I using threads correctly?
Thanks.
You haven't accomplished anything: the UpdateInstruments() method still runs on the UI thread, just like it did before. Not sure why you see such a long delay; that must be a large number of instruments. You can possibly make it less slow by first appending all of them to a StringBuilder, then appending its ToString() value to the TextBox in one go. That cuts out the fairly expensive Windows call for every line.
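A rough sketch of that idea, reusing the names from the question; this is just one way to arrange it, with the list built on the worker thread and only the final AppendText touching the control:

private void GetListOfInstruments()
{
    // Build the whole text on the worker thread...
    var sb = new System.Text.StringBuilder();
    sb.Append("Listing available Instruments...\n");
    foreach (var value in midiInstance.GetInstruments())
        sb.Append(value.ToString() + "\n");
    sb.Append("End of list. \n");

    // ...then hand it to the UI thread in a single call.
    textBoxLog.BeginInvoke((Action)(() =>
    {
        textBoxLog.AppendText(sb.ToString());
        ins = null;
    }));
}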
I would recommend using a SynchronizationContext in general:
From the UI thread, e.g. initialization:
// make sure a SC is created automatically
Forms.WindowsFormsSynchronizationContext.AutoInstall = true;
// a control needs to exist prior to getting the SC for WinForms
// (any control will do)
var syncControl = new Forms.Control();
syncControl.CreateControl();
SynchronizationContext winformsContext = System.Threading.SynchronizationContext.Current;
Later on, from any thread wishing to post to the above SC:
// later on -- no need to worry about Invoke/BeginInvoke! Whoo!
// Post will run async and will guarantee a post to the UI message queue
// that is, this returns immediately
// it is OKAY to call this from the UI thread or a non-UI thread
winformsContext.Post((state) => { /* ... */ }, someState);
As others have pointed out, either make the UI update action quicker (this is the better method!!!) or separate it into multiple actions posted to the UI queue (if you post into the queue, then other messages in the queue won't be blocked). Here is an example of "chunking" the operations into little bits of time until it's all done -- it assumes UpdateStuff is called after the data is collected and is not necessarily suitable when the collection itself takes noticeable time. This doesn't take "stopping" into account and is sort of messy, as it uses a closure instead of passing the state. Anyway, enjoy.
void UpdateStuff(List<string> _stuff)
{
    var stuff = new Queue<string>(_stuff); // make copy
    SendOrPostCallback fn = null;          // silly, so we can access it in the closure
    fn = (_state) =>
    {
        // this is in the UI thread
        Stopwatch s = new Stopwatch();
        s.Start();
        while (s.ElapsedMilliseconds < 20 && stuff.Count > 0)
        {
            var item = stuff.Dequeue();
            // do stuff with item
        }
        if (stuff.Count > 0)
        {
            // have more stuff; we may have run out of our "time-slice"
            winformsContext.Post(fn, null);
        }
    };
    winformsContext.Post(fn, null);
}
Happy coding.
Change this line:
textBoxLog.Invoke(new UpdateLogWithInstrumentsCallback(this.UpdateInstruments),
new object[] { midiInstance.GetInstruments() });
with this:
textBoxLog.BeginInvoke(new UpdateLogWithInstrumentsCallback(this.UpdateInstruments),
new object[] { midiInstance.GetInstruments() });
You are feeding all the instruments into the textbox at once rather than one by one, in terms of threading. The call to Invoke should be placed inside the loop, not around it.
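A hedged sketch of that suggestion, using the names from the question: each line is posted to the UI queue separately, so other messages can be handled in between.

private void GetListOfInstruments()
{
    var instruments = midiInstance.GetInstruments();   // still done on the worker thread

    textBoxLog.BeginInvoke((Action)(() => textBoxLog.AppendText("Listing available Instruments...\n")));
    foreach (var value in instruments)
    {
        var line = value.ToString() + "\n";            // capture one line per iteration
        textBoxLog.BeginInvoke((Action)(() => textBoxLog.AppendText(line)));
    }
    textBoxLog.BeginInvoke((Action)(() => textBoxLog.AppendText("End of list. \n")));
}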
Nope: you start a thread and then use Invoke, which basically means you are going back to the UI thread to do the work... so your thread does nothing!
You might find that it's more efficient to build a string first and append to the textbox in one chunk, instead of line-by-line. The string concatenation operation could then be done on the helper thread as well.