Equivalent of PostThreadMessage - C#

Does .NET have an equivalent of PostThreadMessage?
We presently use a List (to hold the items), a lock (to protect the list) and an event (to notify the consumer thread that an item has been added to the list) for the same functionality.
Is there a more efficient way to implement this?

There are some concurrent collections in .NET 4.0 (System.Collections.Concurrent) that you could perhaps use instead of rolling your own thread-safe data structure. I'm not sure what your requirements are, and I'm not sure how optimizing your container relates to making it equivalent to PostThreadMessage.
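For instance, here is a minimal sketch of what the list + lock + event pattern could become with BlockingCollection<T> (the string payload and class name are just placeholders, not from the question):

using System;
using System.Collections.Concurrent;
using System.Threading;

class MessageQueueSketch
{
    static void Main()
    {
        // The queue replaces the List + lock; the blocking enumeration replaces the event.
        var queue = new BlockingCollection<string>();

        var consumer = new Thread(() =>
        {
            // Blocks until an item is available or CompleteAdding is called.
            foreach (var item in queue.GetConsumingEnumerable())
                Console.WriteLine("Processing " + item);
        });
        consumer.Start();

        // Producer side: roughly the role PostThreadMessage plays in Win32.
        queue.Add("message 1");
        queue.Add("message 2");
        queue.CompleteAdding();   // tell the consumer no more items will arrive
        consumer.Join();
    }
}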
If you want, you can always use Managed C++ to expose PostThreadMessage to your .NET application. Or you can use PInvoke to call it from your app as well.
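If you go the P/Invoke route, a minimal sketch of the declaration might look like this (the WM_APP-based message id and the NativeMethods class name are only illustrative):

using System;
using System.Runtime.InteropServices;

static class NativeMethods
{
    // PostThreadMessage from user32.dll; returns false if the target thread
    // has no message queue yet.
    [DllImport("user32.dll", SetLastError = true)]
    public static extern bool PostThreadMessage(uint idThread, uint msg,
                                                UIntPtr wParam, IntPtr lParam);

    public const uint WM_APP = 0x8000;   // base for application-defined messages
}

// Example call (thread id obtained elsewhere, e.g. from GetCurrentThreadId):
// NativeMethods.PostThreadMessage(threadId, NativeMethods.WM_APP + 1, UIntPtr.Zero, IntPtr.Zero);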

Doesn't look like there's much room for optimization here, unless you want to design it as single-threaded to avoid the context switch. That may be preferable anyway, from a program-logic point of view.

Is it possible to use a custom memory allocator for LINQ?

Is there a way to use a custom memory allocator for LINQ?
For example when I call:
someCollection.Where(x).SelectMany(y).ToList();
Methods like ToList() or OrderBy() always allocate new collections, so a lot of garbage is generated.
With a custom allocator, I could always reuse the same List, which would be cleared and refilled every time. I am aware that reusing buffers could lead to problems with reentrancy.
The background is, my application is a game and GC means stuttering.
Please don't tell me "Use C++ instead" or "Do not use LINQ", I know that :)
(Although you asked not to be advised against LINQ, I think this answer could help the community.)
LINQ is a facility built on top of the CLR, therefore it uses the CLR allocator, and that cannot be changed.
You can tune it a little bit, for example configuring whether or not the GC cycle should be offloaded to a background thread, but you can't go any further.
The aim of LINQ is to simplify writing code for a certain class of problems, sacrificing the freedom to choose the implementation of every building block (that's why we usually choose LINQ).
However, depending on the scenario, LINQ may not be your best friend, as its design choices may play against yours.
If, after profiling your code, you identify that you have a serious performance problem, you should first try to determine whether you can isolate the bottleneck in some of the LINQ methods and roll your own implementation via extension methods.
Of course this option is only viable when you are the main caller, unless you manage to roll something that is IEnumerable-compliant. You need to be very lucky, because your implementation has to abide by the LINQ rules. In particular, since you are not in control of how the objects are manipulated, you cannot perform the optimizations you would in your own code.
Closures and deferred execution work against you.
Otherwise, what has been suggested by the comments, is the only viable option: avoid using LINQ for that specific task.
The reason for stepping away from LINQ is that it is not the right tool to solve your problem with the performance constraints you require.
Additionally, as stated in the comments, the (ab)use of lambda expressions significantly increases memory pressure, as backing objects are created to implement the closures.
We had performance issues similar to yours, where we had to rewrite certain slow paths. In other (rare) cases, preallocating the lists and loading the results via AddRange helped.
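As a rough sketch of the preallocate-and-refill approach mentioned above (the Enemy type, IsVisible flag and EnemyCuller class are invented for the example; it stands in for something like allEnemies.Where(e => e.IsVisible).ToList(), but without per-call allocations):

using System.Collections.Generic;

class Enemy { public bool IsVisible; }

class EnemyCuller
{
    // Reused every frame; cleared instead of reallocated, so no garbage per call.
    private readonly List<Enemy> visibleEnemies = new List<Enemy>(256);

    public List<Enemy> CollectVisible(List<Enemy> allEnemies)
    {
        visibleEnemies.Clear();
        for (int i = 0; i < allEnemies.Count; i++)   // plain indexed loop: no enumerator or closure allocations
        {
            if (allEnemies[i].IsVisible)
                visibleEnemies.Add(allEnemies[i]);
        }
        return visibleEnemies;
    }
}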

C# Multithreading: Do I have to use locks when only getting objects from a list/dictionary?

I am currently working on a multithreaded C# application.
In my case, I have a list/dictionary, which is assigned and filled in the main-thread while the application is starting up. The list will never be modified again. I only use the list to get objects.
Do I have to use locks?
lock(list) { var test = list[0]; }
or can I access the object directly?
I know, if I access the object in the list, the object has to be thread-safe.
Reading is not a problem, but unexpected behavior can appear if someone else is writing or deleting while you are reading.
If this list is prepared beforehand and never changed afterwards, you can access the objects without locking, which is also faster.
But be aware that you must strictly avoid any modification happening while you read from the collection.
If you also need writing operations, then you need synchronization, for example with ReaderWriterLockSlim, or have a look at the System.Collections.Concurrent namespace.
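A minimal sketch of the ReaderWriterLockSlim option, in case writes do become necessary later (the string key/value types and the class name are just for illustration):

using System.Collections.Generic;
using System.Threading;

class SynchronizedLookup
{
    private readonly Dictionary<string, string> map = new Dictionary<string, string>();
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

    public string Get(string key)
    {
        rwLock.EnterReadLock();            // many readers may hold this at once
        try { return map[key]; }
        finally { rwLock.ExitReadLock(); }
    }

    public void Set(string key, string value)
    {
        rwLock.EnterWriteLock();           // writers get exclusive access
        try { map[key] = value; }
        finally { rwLock.ExitWriteLock(); }
    }
}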
As long as you don't change the content of the list/array, there is no immediate need for locks.
But I would suggest implementing some synchronization (like locks) anyway. Can you be sure that you won't change your application in the coming years in a way that modifies the content at runtime after all?
Locks are used to avoid dirty reads when another thread may be writing. Since you won't change it, the list can be read without locks.
If you really want to catch unexpected changes while debugging (get an error when they happen), you can mark the field readonly (which only prevents reassigning the reference) or expose the data as a ReadOnlyCollection<T>.
As others have mentioned, reading is not a problem. But as you said, you are populating this collection at start-up, and you have not mentioned at what point you start reading, so there could be unexpected behaviour if reads overlap with that initial population. You would have to use thread-safe collections to cover that case; for example, you could use a BlockingCollection for this purpose.
Here is the MSDN article which explains more about thread-safe collections: Link.

"Synchronous" functions in C#?

I am creating a custom state machine, and in order to be deterministic, I have to "synchronise" my transitions. I'm not sure "synchronize" is the right word, but what I want is that when I call a function (through an EventHandler), the system is effectively frozen until I can call another function (also through an EventHandler).
It's kind of hard to explain precisely in English, but I think you know what I mean...
I was thinking about Threading but I'd REALLY like to avoid this...
If you are looking to emulate the effect of the "synchronized" keyword from Java, the best way is probably to wrap the entire method body inside
lock (this)
{
    // method body
}
Not sure if that's what you are looking for, but C# iterator blocks are essentially state machines.
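For illustration, a tiny sketch of how an iterator block steps through a fixed sequence of states one MoveNext() at a time (the state names are invented):

using System.Collections.Generic;

class TrafficLight
{
    // Each yield return is one state; MoveNext() advances exactly one transition,
    // so nothing else runs "inside" the machine between calls.
    public static IEnumerable<string> States()
    {
        while (true)
        {
            yield return "Red";
            yield return "Green";
            yield return "Yellow";
        }
    }
}

// Usage:
// var machine = TrafficLight.States().GetEnumerator();
// machine.MoveNext();                // advances to "Red"
// string current = machine.Current;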
Synchronization is needed when you're in a multi-threaded environment and access to resources by the threads must be synchronized (one at a time). This prevents unpredictable results when threads change resources while other threads are trying to access them. There are many constructs available to you in C# to handle synchronization. It all depends on what your threads are trying to accomplish.
Here is a link from MSDN that shows some simple examples: http://msdn.microsoft.com/en-us/library/ms173179.aspx

Is there a threadsafe and generic IList<T> in c#?

Is List<T> or HashSet<T> or anything else built in threadsafe for addition only?
My question is similar to Threadsafe and generic arraylist? but I'm only looking for safety to cover adding to this list threaded, not removal or reading from it.
System.Collections.Concurrent.BlockingCollection<T>
Link.
In .NET 4.0 you could use BlockingCollection<T>, but that is still designed to be thread-safe for all operations, not just addition.
In general, it's uncommon to design a data structure that guarantees certain operations to be safe for concurrency and others not. If you're concerned about the overhead of accessing a collection for reading, you should do some benchmarking before you go out of your way to look for specialized collections to deal with it.
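For example, a minimal add-only sketch with one of the .NET 4 collections (ConcurrentBag<T> here, picked purely as an illustration):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class AddOnlyExample
{
    static void Main()
    {
        var bag = new ConcurrentBag<int>();

        // Many threads adding concurrently, no explicit locking required.
        Parallel.For(0, 1000, i => bag.Add(i));

        Console.WriteLine(bag.Count);   // 1000
    }
}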

Concurrent Dictionary in C#

I need to implement a concurrent dictionary because .NET does not yet contain concurrent implementations of its collections (.NET 4 will include them). Can I use the "Power Threading Library" from Jeffrey Richter for this, or an existing implementation, or do you have any advice on implementing it?
Thanks ...
I wrote a thread-safe wrapper for the normal Dictionary class that uses Interlocked to protect the internal dictionary. Interlocked is by far the fastest locking mechanism available and will give much better performance than ReaderWriterLockSlim, Monitor or any of the other available locks.
The code was used to implement a Cache class for Fasterflect, which is a library to speed up reflection. As such, we tried a number of different approaches in order to find the fastest possible solution. Interestingly, the new concurrent collections in .NET 4 are noticeably faster than my implementation, although both are pretty darn fast compared to solutions using a less performant locking mechanism. The implementation for .NET 3.5 is located inside a conditional region in the bottom half of the file.
You can use Reflector to view the source code of the concurrent implementation in the .NET 4.0 RC and copy it into your own code. This way you will have the fewest problems when migrating to .NET 4.0.
I wrote a concurrent dictionary myself (prior to .NET 4.0's System.Collections.Concurrent namespace); there's not much to it. You basically just want to make sure certain methods are not getting called at the same time, e.g., Contains and Remove or something like that.
What I did was to use a ReaderWriterLock (in .NET 3.5 and above, you could go with ReaderWriterLockSlim) and call AcquireReaderLock for all "read" operations (like this[TKey], ContainsKey, etc.) and AcquireWriterLock for all "write" operations (like this[TKey] = value, Add, Remove, etc.). Be sure to wrap any calls of this sort in a try/finally block, releasing the lock in the finally.
It's also a good idea to modify the behavior of GetEnumerator slightly: rather than enumerate over the existing collection, make a copy of it and allow enumeration over that. Otherwise you'll face potential deadlocks.
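A condensed sketch of that pattern, assuming ReaderWriterLockSlim and a snapshot-based enumerator (only a few members are shown and the class name is made up):

using System.Collections;
using System.Collections.Generic;
using System.Threading;

class ThreadSafeDictionary<TKey, TValue> : IEnumerable<KeyValuePair<TKey, TValue>>
{
    private readonly Dictionary<TKey, TValue> inner = new Dictionary<TKey, TValue>();
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

    public TValue this[TKey key]
    {
        get
        {
            rwLock.EnterReadLock();
            try { return inner[key]; }
            finally { rwLock.ExitReadLock(); }
        }
        set
        {
            rwLock.EnterWriteLock();
            try { inner[key] = value; }
            finally { rwLock.ExitWriteLock(); }
        }
    }

    public bool ContainsKey(TKey key)
    {
        rwLock.EnterReadLock();
        try { return inner.ContainsKey(key); }
        finally { rwLock.ExitReadLock(); }
    }

    // Enumerate over a snapshot so callers cannot deadlock against writers.
    public IEnumerator<KeyValuePair<TKey, TValue>> GetEnumerator()
    {
        rwLock.EnterReadLock();
        try { return new List<KeyValuePair<TKey, TValue>>(inner).GetEnumerator(); }
        finally { rwLock.ExitReadLock(); }
    }

    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }
}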
Here's a simple implementation that uses sane locking (though Interlocked would likely be faster):
http://www.tech.windowsapplication1.com/content/the-synchronized-dictionarytkey-tvalue
Essentially, just create a Dictionary wrapper/decorator and synchronize access to any read/write actions.
When you switch to .NET 4.0, just replace all of your overloads with delegated calls to the underlying ConcurrentDictionary.
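Once on .NET 4.0, typical usage of the built-in ConcurrentDictionary looks roughly like this (key and value types chosen arbitrarily):

using System;
using System.Collections.Concurrent;

class ConcurrentDictionaryDemo
{
    static void Main()
    {
        var cache = new ConcurrentDictionary<string, int>();

        cache.TryAdd("answer", 42);                              // add if not already present
        int v = cache.GetOrAdd("answer", key => 42);             // read, or compute-and-add atomically
        cache.AddOrUpdate("answer", 1, (key, old) => old + 1);   // atomic update

        Console.WriteLine(v);
    }
}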
